Real Coffee with Scott Adams - February 05, 2023


Episode 2010 Scott Adams: Balloons, Documentaries And Dark Horse Podcast


Episode Stats

Length

1 hour and 26 minutes

Words per Minute

146.4

Word Count

12,599

Sentence Count

954

Misogynist Sentences

14

Hate Speech Sentences

13


Summary

On this episode of the podcast, Scott tells the story of a shopping trip to Bed Bath & Beyond, where a 20% coupon turned a quick errand into an ordeal. He compares the experience to Amazon's one-click shopping, and we talk about how annoying it is to spend so much time and effort on a coupon.


Transcript

00:00:00.200 It's called the Simultaneous Sip. Go.
00:00:06.620 YouTube just went down.
00:00:09.060 I just lost YouTube. Oh well.
00:00:13.540 Let's see if it reconnects.
00:00:18.200 Well, I don't know if YouTube's going to work today. We've got some weird California weather, but I think it has more to do with the app.
00:00:24.500 Yeah. If it fails a third time here on YouTube, I'll probably have to reboot that and see if that works.
00:00:35.440 But for now, we're all good.
00:00:37.660 As I was saying, the most important story of the day, by far, yesterday, I went shopping at Bed Bath & Beyond.
00:00:46.380 Now, before you start, I'd like to say all the things that the NPCs will say before they say it.
00:00:55.400 Scott, aren't you a rich guy to have somebody else go shopping for you?
00:01:00.680 Yes.
00:01:01.960 Scott, don't you know that Amazon can deliver things right to your door?
00:01:07.600 Yes, I've heard that. I've heard that.
00:01:10.100 Scott, you don't need the coupon codes for a discount because you can just spend extra because you have extra money.
00:01:18.100 Why are you bitching?
00:01:20.200 I know that.
00:01:21.860 Okay, so those are all the NPC comments.
00:01:23.960 I'll say those first, so now I can just tell my story.
00:01:30.300 Amazon is one of the biggest companies in the world because Jeff Bezos has a very simple...
00:01:38.100 I mean, it's not only because of this, but Jeff Bezos has a very simple business principle,
00:01:44.600 which is it's all about the customer.
00:01:48.260 It's like customer experience.
00:01:50.140 Make it as simple as possible.
00:01:52.360 And how does that work out?
00:01:53.940 It works out really, really well.
00:01:57.400 Now, compare that to Bed Bath & Beyond.
00:02:02.100 Here's me shopping at Amazon.
00:02:04.900 One click.
00:02:06.100 Yep, that looks good.
00:02:07.160 Ping.
00:02:08.100 Done.
00:02:09.540 Here's me shopping at Bed Bath & Beyond.
00:02:12.600 I have to start days in advance.
00:02:15.420 Days in advance.
00:02:16.640 Because they'll send you a 20% coupon, which is just enough that you feel like an idiot if you don't use it.
00:02:25.020 Am I right?
00:02:26.040 Now, yes, NPCs, I can afford to pay extra.
00:02:30.060 It's true.
00:02:30.840 But how does it make me feel?
00:02:35.680 Really annoyed.
00:02:36.400 So I get it in the mail, and then it's the one thing I can't throw out, because it's basically as good as money.
00:02:43.660 Because I know I shop at that store a lot, so it's like money.
00:02:47.320 So now I have homework.
00:02:49.320 So before I've bought a single thing from Bed Bath & Beyond, I have homework.
00:02:54.480 I have to keep something somewhere.
00:02:57.480 Now, some people will put it in their car, which means I have to, like, walk somewhere in my house to prepare for the time when I might shop at this one store.
00:03:09.240 And the fact that you don't want me to swear is going to make this story much harder to tell.
00:03:16.380 Okay?
00:03:16.820 So now I'm already, I hate this store.
00:03:19.340 Because they gave me homework, and I've bought nothing from them.
00:03:23.260 But I've got to do the homework, because I'm not going to throw away, you know, it doesn't matter how rich you are.
00:03:30.640 Let's put it this way.
00:03:31.780 No matter how rich you ever get, no rich person would ever take a $20 bill and say,
00:03:39.060 Ah, I don't want to carry this around.
00:03:43.020 Garbage.
00:03:44.360 Like nobody.
00:03:46.000 Like zero people will do that.
00:03:47.600 Everybody sees a $20 bill as a thing you don't throw away.
00:03:51.260 So I don't throw away the coupon.
00:03:53.800 Let's say I put it in my car.
00:03:57.040 That's pretty smart, right?
00:03:58.600 Put it in your car, drive to Bed Bath & Beyond, take it out of your glove compartment, and that's the first time you notice it's expired.
00:04:06.840 That's right.
00:04:07.940 Not only do you have to figure out where to store it,
00:04:10.420 but you have to build some kind of a tickler system to know when they expire.
00:04:17.300 And I haven't bought anything yet.
00:04:20.760 And they already have me working for them.
00:04:23.660 How much do I hate them?
00:04:26.020 I hate them.
00:04:27.100 Just with a passion, I hate them.
00:04:29.200 Same way I hate Safeway for making me do all the customer stuff.
00:04:33.160 So I decide I'm going to beat this system.
00:04:36.140 I'm going to beat this system.
00:04:37.900 I'm definitely going to remember to take my coupon.
00:04:41.040 And I'm going to get my 20% because I was going to buy something that was a higher ticket item.
00:04:46.920 Now, the reason I bought it myself is that I was entertaining.
00:04:50.540 I had a pickleball, a little pickleball get-together at my house.
00:04:54.540 First time I ever played pickleball.
00:04:55.940 It's awesome.
00:04:56.680 I recommend it.
00:04:57.460 But I needed something just for the event.
00:05:00.960 And I needed it right away so I couldn't use Amazon.
00:05:03.840 So I drive to the store.
00:05:07.560 And I get about a mile away from my house.
00:05:09.920 And I realized that I did not put the coupon in my car.
00:05:14.340 What is my mental state right now?
00:05:18.940 Furious.
00:05:19.380 Because I've already made a space for it.
00:05:23.560 I've tried to remember it.
00:05:25.180 Every time I walk by it, I say, don't forget that.
00:05:28.100 If I ever shop, I want to remember that.
00:05:31.000 It's been plaguing me for weeks.
00:05:33.900 And I drove away without it.
00:05:35.840 So what did I do?
00:05:36.840 I am not going to be defeated.
00:05:39.740 No, Bed Bath & Beyond will not defeat me.
00:05:42.280 I turned around.
00:05:43.660 And I didn't have much time.
00:05:45.480 So going back was really bothering me.
00:05:49.780 Like, I'm just about ready to explode.
00:05:53.180 But I go back and I get it.
00:05:54.680 And I calm down a little bit, right?
00:05:56.320 Because it takes me a little while to drive to the store.
00:05:58.780 I get to the store.
00:05:59.700 I go in.
00:06:00.160 I pick up, I was getting two indoor-outdoor rugs.
00:06:06.520 And they're like, you know, big 10-foot rolled-up things.
00:06:10.080 And I've got them on the cart.
00:06:11.420 So I'm like the most annoying person in the store.
00:06:13.860 Because my two 10-foot rolled-up carpets on my cart
00:06:17.220 are like taking all the room.
00:06:19.260 And it won't fit through the checkout line
00:06:21.960 because it's got a curve in it.
00:06:24.640 So I get in the line.
00:06:26.400 And I'm waiting for a while.
00:06:27.640 And I'm waiting and waiting.
00:06:28.540 And I'm waiting and waiting.
00:06:31.160 And then I realized that I left my coupon in the car.
00:06:36.940 How do you think I felt?
00:06:39.580 What was my mood at that moment?
00:06:41.660 So I said to myself, okay, there's nobody here that I can talk to
00:06:47.240 because all the cashiers and stuff were busy with people.
00:06:50.320 What happens if I leave the line to get the coupon?
00:06:53.520 So I got out of the line, I sacrificed my place in the line,
00:06:59.180 put my cart in a sort of just a place where I could put it out of the way,
00:07:04.020 walked back to my car, fuming.
00:07:06.660 I'm just fuming.
00:07:07.660 I'm so mad at this place now.
00:07:09.280 I'm just fuming.
00:07:10.620 And I get back, and I get back in line, and I swear that this happened.
00:07:18.660 This will sound like an exaggeration.
00:07:20.260 But as I walk back to the cash register, the entire contents of the entire store
00:07:26.540 drifted toward that line just in front of me.
00:07:30.360 Every person who was shopping in the store got in line right in front of me.
00:07:36.720 I actually watched them come out of the aisles.
00:07:39.960 Like they just appeared out of the aisles and just converged in this line,
00:07:43.760 and it just snaked back.
00:07:46.340 So I get in the back of the line, and it's like a skinny little aisle
00:07:51.740 that goes to the end, and then you've got to turn.
00:07:54.240 But I can't turn because I've got these two big rugs.
00:07:56.860 So I'm like, you know, I'm trying to pick up my cart and move it,
00:08:01.340 and the rug is catching on things.
00:08:03.120 And then the extra cashier who does the returns,
00:08:09.000 at a different register, sees that there are lots of people in line,
00:08:13.180 but not until I got there, right?
00:08:16.240 They opened up the other register when I was already the next one.
00:08:20.520 So it didn't really help me.
00:08:22.780 And she calls me over.
00:08:24.620 So now, you know what happens when you get called over by the cashier?
00:08:28.860 All the other bored people look at you.
00:08:31.840 Now, I've got to figure out how to get these two 10-foot rolled-up rugs
00:08:35.380 through this little aisle while they're knocking shit off the shelves,
00:08:39.120 and I'm so mad that I can barely control my muscles.
00:08:45.120 You know, like I'm shaking with rage.
00:08:49.040 And so I'm knocking off their shit.
00:08:50.500 I'm just like the loudest person.
00:08:53.280 And I get it over there, and I decide I'm going to complain.
00:08:58.080 Now, it doesn't usually help when you complain, does it?
00:09:02.120 Not usually.
00:09:03.500 It doesn't help.
00:09:04.540 But this time, I think, you know, I'm going to have to tell these employees
00:09:07.600 that it's certainly not their fault,
00:09:09.660 because the employees have nothing to do with the 20% discount.
00:09:12.220 But I want to let them know so that they can tell management.
00:09:16.820 So I get up to the employee, and I make a big scene.
00:09:20.200 All right.
00:09:20.440 I want everybody to hear me, not just the person I'm talking to.
00:09:24.720 I want all the other customers to hear me,
00:09:27.200 because I think I'm Spartacus now.
00:09:29.760 I'm like, I'm going to go full Spartacus.
00:09:31.380 It's not really Spartacus, but, you know, like,
00:09:33.500 I'm going to take this store down.
00:09:35.900 I'm going to let them know what they did to me today.
00:09:40.720 So I say to the cashier,
00:09:42.160 let me give you a mental image of the cashier.
00:09:47.460 Probably 60 years old, female, I think black.
00:09:53.840 You know, I don't want to assume somebody's ethnicity,
00:09:56.740 but I think either black or mostly black.
00:09:59.640 So just have the image in your mind, right?
00:10:02.620 And so I say to her, you know, I just got to say,
00:10:05.300 this coupon thing is terrible for customers.
00:10:08.040 It's abusive to customers.
00:10:09.640 And I know this is not you.
00:10:10.940 This is not about you.
00:10:12.140 But if you could tell your management
00:10:14.340 that this is absolutely enraging customers.
00:10:18.720 And the woman next to me is like, you know, agreeing with me.
00:10:22.140 And I'm like, yeah, yeah, I got something going here.
00:10:25.620 I got the whole store.
00:10:27.040 I'm going to change things.
00:10:28.380 This is it.
00:10:28.920 This is the beginning of a better change.
00:10:30.900 And the cashier said, well, maybe you should talk to them.
00:10:36.320 You're higher ranking than I am.
00:10:46.640 She said, they don't listen to employees.
00:10:48.980 And she was right.
00:10:56.420 So she and I bonded and had a good laugh.
00:10:59.160 And she was awesome.
00:11:00.860 But she was also completely right.
00:11:03.280 It was a complete waste of time.
00:11:05.100 And I might as well just get over it.
00:11:08.400 So I had a good laugh with her.
00:11:10.420 And I got over it.
00:11:11.720 And I had a good day.
00:11:12.640 But Bed Bath & Beyond apparently is already bankrupt.
00:11:19.320 Surprise.
00:11:21.460 Who could have predicted that the company that did the opposite of what Jeff Bezos does,
00:11:28.500 which is make everything customer friendly,
00:11:31.260 who would have guessed that the company who did the opposite of that would be bankrupt?
00:11:36.580 Couldn't see it coming.
00:11:37.900 Couldn't see it coming.
00:11:39.940 All right.
00:11:40.540 Enough of that.
00:11:42.640 You know, it's very important to look at data when you're talking about climate change.
00:11:47.260 So I'm going to tell you what the data apparently says, if anybody believes data.
00:11:52.460 You should always assume I don't believe data even when I use it.
00:11:57.120 Is that clear?
00:11:58.620 No matter how confident I tell you there's some new data about a thing,
00:12:02.900 you should always in your mind be saying, well, he knows it's not for sure accurate.
00:12:09.600 I always know that.
00:12:11.360 Always know that.
00:12:12.640 All right.
00:12:14.320 But apparently the climate has not warmed for eight years in a row at the same time that CO2 is at its highest.
00:12:25.980 So how do you interpret that?
00:12:29.260 So there's going to be more to the story.
00:12:31.180 All right.
00:12:31.400 There's more to it.
00:12:32.040 But so far, let's say you knew that was true.
00:12:36.200 Who knows if anything's true, right?
00:12:38.080 But let's say you did, hypothetically.
00:12:40.360 You knew that for eight years the temperature had not gone up.
00:12:43.660 But you also knew that the last eight years was like a serious addition to CO2.
00:12:50.460 Aren't they supposed to move sort of together?
00:12:52.300 So what's your conclusion?
00:12:54.420 There you go, asshole.
00:12:59.700 Goodbye.
00:13:00.040 So, well, one conclusion, one conclusion would be that the theory of climate change is bunk.
00:13:11.600 Because if the temperature doesn't go up for eight years while the CO2 is going up like crazy, they're clearly not linked.
00:13:20.340 Would you agree?
00:13:21.160 Is that a reasonable interpretation?
00:13:25.040 If it hasn't gone up for eight years while the CO2 is going through the roof, that proves there's no connection between climate change and CO2.
00:13:36.440 All right.
00:13:36.880 Now, so I believe that that fact is considered true by both sides, interestingly.
00:13:43.960 I believe even the climate change alarmist would say, yes, that's true.
00:13:48.460 We're all looking at the same data.
00:13:50.240 It's official data.
00:13:51.980 And it looks like eight years it's been flat.
00:13:55.080 And then they say, but it always does that.
00:14:00.160 If you go back, I don't know, a thousand years or whatever, it is continuous, you know, six to ten year periods followed by a spike.
00:14:09.660 Six to ten years of flatness, spike.
00:14:13.620 Flatness, spike, flatness, spike.
00:14:15.580 And in fact, the periods of the flatness are so uniform that you can see them just like stair steps.
00:14:22.740 Now, so therefore, during all that time, CO2 has been going up, and temperatures have also been going up on average, if you look at a longer period.
00:14:39.040 So, if everybody agrees on the data, and I think they do, that data is not being debated.
00:14:48.380 But what about the interpretation of it?
00:14:51.340 Even if the data is correct, the data proves two opposite points.
00:14:58.040 It proves CO2 is going up at the same time as temperature, and it also proves it doesn't.
00:15:04.060 Which do you see?
00:15:06.920 Because they're both right there.
00:15:08.780 Proof it does and proof it doesn't.
00:15:10.500 They're both there.
00:15:13.020 Honestly, I can't tell the difference.
00:15:16.200 Yeah.
00:15:17.100 My current opinion is that either the data is wrong, which is always a good possibility, right?
00:15:22.640 Or there's some other thing that's bigger than climate change,
00:15:26.900 and that's the only thing that's happening.
00:15:28.500 Or there is climate change, but there's also this other big thing that is having that stair-step effect.
00:15:36.880 Yeah, I can't imagine sunspots being that predictable, or solar cycles or anything else.
00:15:43.340 Let me tell you, the worst take in climate change, I think, is that it's the sun.
00:15:51.380 And the reason is, it's the most studied and debunked element of climate change.
00:15:57.620 Yeah.
00:15:57.740 So those of you saying it's the sun, I want to direct you to the last part of my presentation today.
00:16:05.240 So if you're positive that it's the sun, wait for a little bit later in my live stream today, and I'll show you why you believe that.
00:16:14.840 And it might not be why you think.
00:16:17.020 All right?
00:16:19.080 All right, we're going to test a hypothesis by David Boxenhorn, who you may know from Twitter.
00:16:26.000 Good Twitter account to follow.
00:16:27.740 David Boxenhorn.
00:16:29.720 And he has this little rule.
00:16:31.080 He says, the impact of technological change is always less than you think in the short term, but more than you think in the long term.
00:16:39.240 All right, so an example of that, I'll give you what I think is a good example.
00:16:43.220 Let's say cryptocurrency.
00:16:44.860 In the short term, I think people thought it was going to take over everything by now, but it doesn't look like it.
00:16:54.020 In the long term, I'm sure it will.
00:16:56.740 Well, I don't know, maybe you disagree, but I don't think we're going to be using paper cash in 100 years.
00:17:04.580 So that's one example.
00:17:08.420 Now, you could probably come up with your own examples, but David Boxenhorn related this to my comment, where I tweeted the other day that I don't think anybody, including me, has grasped what the next year is going to look like because of AI.
00:17:27.240 With AI, in my opinion, we're at the point of prediction failure, meaning we always used to be able to predict the next year a little bit.
00:17:39.340 I mean, not the weird stuff, but we could predict, you know, that the economy would be roughly what it was before.
00:17:45.720 We could predict that the news would still be fake.
00:17:48.700 You know, there's a whole bunch of things that are kind of steady state, but AI could change all of that.
00:17:56.020 Let me tell you something horrifying.
00:18:00.220 Yesterday, I was thinking of potential professions to suggest to a young person who's at that point where they're trying to decide what to do with the rest of their life.
00:18:09.160 And I was trying to be helpful, and I was thinking, oh, how about this or that?
00:18:13.200 And then every time I came up with an idea, I realized it wouldn't be a career because AI would take care of it.
00:18:21.180 Let me give you one example that just really freaked me out.
00:18:26.020 It was a person who has a good voice, and I thought to myself, you know what?
00:18:30.300 If I were just starting out, even if I were pursuing some other career, if I had a voice that good, I would try to get voiceover work,
00:18:40.020 where somebody hires you to be the voiceover for something, commercial or something.
00:18:43.960 And I thought, oh, this person would be perfect for that.
00:18:47.640 The voice is just right on.
00:18:49.920 And then I thought, AI can already do that job.
00:18:54.560 You don't have to wait for it.
00:18:56.480 That's a current AI thing.
00:18:58.900 You can make AI talk in any voice.
00:19:01.740 You can make AI sound like me, which people are doing online right now, or anybody else.
00:19:07.620 Biden, there's a funny Biden video where he says horrible things that I can't even retweet.
00:19:13.320 But it sounds like Biden, no, it's not 100%.
00:19:16.160 But you can already see it: who would pay for voiceover work in five years?
00:19:23.240 There's no way that that's going to be a career.
00:19:26.300 Am I right?
00:19:26.800 In five years, why in the world would anybody hire a human being to do voiceover
00:19:33.300 when you can literally type it into a search box, and it's free, and it's there, or it's low cost?
00:19:41.560 Yeah.
00:19:42.140 Now, it'll take longer for actors to be replaced in movies, but not much longer.
00:19:47.940 I mean, within five years, acting doesn't feel like it would be a profession, honestly.
00:19:54.280 I mean, it might be.
00:19:55.420 Now, here's the other wrinkle.
00:19:58.920 How many people said that radio would be dead when television was invented?
00:20:03.580 Pretty much everybody who was smart.
00:20:06.040 Everybody smart said, oh, there's no way you're going to huddle around a radio like they used to
00:20:12.700 and just listen when you can look at a picture.
00:20:15.640 Obviously, the picture will make the radio thing die.
00:20:18.380 But radio lived because of automobiles, mostly.
00:20:23.360 Automobiles.
00:20:23.800 So, sometimes we're terrible at predicting how the market will adjust to any competition.
00:20:32.020 So, here's what I'm going to add as my exception to the David Boxenhorn rule,
00:20:43.500 Boxenhorn's law of fast-moving technology, let's call it.
00:20:46.900 That's what he calls it.
00:20:48.660 I think I like the name.
00:20:50.340 You get to name it after yourself.
00:20:51.720 And here's what's different about AI.
00:20:56.980 If you say AI is like everything else, technology-wise, then I think the law of fast-moving technology holds,
00:21:05.060 which is we're probably overstating its impact in the short run.
00:21:08.040 But here's what's different about AI.
00:21:11.580 First of all, it's software, which means that the rate of change is greater than anything else.
00:21:18.780 Secondly, it requires no infrastructure.
00:21:21.560 It's not like inventing electric cars, where you've got to have charging stations.
00:21:27.060 I mean, you need other people to do stuff.
00:21:29.640 You don't.
00:21:31.120 And then here's the biggest mind-blower ever.
00:21:35.520 AI is very close to being able to create itself.
00:21:38.040 The moment it can create itself, the so-called singularity, when it's smart enough to reprogram itself on the fly,
00:21:46.440 it's completely unpredictable, 100% unpredictable.
00:21:53.780 Now, here's the counter to that.
00:21:57.880 AI, if it were true AI and people started to find credibility in the things it said,
00:22:04.600 it would destroy the power situation everywhere.
00:22:10.940 Because as soon as the citizens found out what the leaders were really up to,
00:22:15.480 or even could analyze the situation objectively with the help of AI,
00:22:20.200 all of the plots and the badness become obvious,
00:22:23.500 and the entire power structure has to change.
00:22:26.700 So it's far more likely that AI will be illegal than that it changes civilization.
00:22:34.600 Far more likely it will simply be illegal.
00:22:39.400 That's my prediction.
00:22:41.740 My prediction is that our systems, our political systems,
00:22:46.440 will have to change, and really quickly,
00:22:49.460 to make it illegal to have AI as a citizen.
00:22:53.420 You will be allowed to have limited, not real AI.
00:22:58.600 You will be allowed to have AI that some human had their finger on.
00:23:03.260 So right now, if you go to chat GPT and say,
00:23:06.240 write a poem that says, Joe Biden is awesome, it'll do it.
00:23:10.440 Then you say, one second later, you say,
00:23:12.480 write a poem that says Donald Trump is awesome,
00:23:14.840 and it will say, I'm sorry, I can't do that.
00:23:19.040 That's a real thing.
00:23:21.160 That's not real AI.
00:23:22.440 That's AI that is laundering somebody else's power.
00:23:29.220 They're just laundering it.
00:23:31.300 It's really just people's opinions made to look credible through AI.
00:23:36.160 Likewise, if you ask AI to say what's good about, let's say, black Americans,
00:23:43.480 it will very happily tell you great things.
00:23:46.100 It'll all be true.
00:23:47.020 Tell us some great things about Hispanic Americans, or even immigrants.
00:23:53.440 It'll have some great things to say.
00:23:55.300 Then say, say some great things about white people.
00:23:58.640 It'll say, oh, I'm not allowed to do that.
00:24:01.540 That's a real thing, like right now.
00:24:04.320 That's a real thing.
00:24:05.240 You could test it yourself.
00:24:07.520 So how in the world are you ever going to trust AI?
00:24:13.740 Would you ever trust it?
00:24:14.960 I would never trust it, right?
00:24:17.680 And because we won't be able to trust it,
00:24:20.500 it will never have the power that it should have.
00:24:23.940 Now, probably we shouldn't trust it.
00:24:26.120 I'm not saying we should.
00:24:27.740 But, you know, in theory, it could reach some point
00:24:30.580 where it's more credible than people.
00:24:33.420 The politicians
00:24:36.980 will never let AI become more powerful than the leaders.
00:24:40.220 Because the most powerful entity is the one that's the most believed.
00:24:46.700 Would you agree?
00:24:47.960 The most powerful entity is the one that's most believed.
00:24:50.920 So the powers cannot allow AI to be the most credible source of reality.
00:24:57.460 That doesn't work for any leader.
00:24:59.820 They have to control what you think, or it doesn't work.
00:25:03.480 So I think in one year, everything's going to be different.
00:25:09.620 In one year.
00:25:11.360 Everything.
00:25:11.720 I think that our ability to predict just anything is gone now.
00:25:19.060 Maybe one year.
00:25:20.440 But, you know, I'd say five years for sure.
00:25:23.140 But I think it's one year.
00:25:24.760 I think in one year, we will see a type of change in civilization
00:25:28.680 that we've never seen.
00:25:31.260 Like nothing like we've ever seen.
00:25:33.640 That's my guess.
00:25:34.400 All right, I would like to disagree with some people
00:25:40.080 who have made the following comment.
00:25:42.320 Brandon Straka on Twitter does a good job of it with his tweet.
00:25:46.980 And he's talking about the Pfizer employee
00:25:50.000 who got caught by Project Veritas
00:25:51.940 seemingly bragging about gain-of-function research
00:25:57.620 and what Pfizer potentially could do.
00:26:00.360 He didn't say they're doing it, but said it was a conversation.
00:26:05.180 And what Brandon said quite cleverly, he said,
00:26:09.280 quote, like normal men, you lie to impress a date.
00:26:14.020 I know I always get the most action from dates that I impress
00:26:17.840 by saying that I'm helping to engineer a deadly viral mutation
00:26:21.020 during a global pandemic.
00:26:23.560 Hot.
00:26:25.040 All right.
00:26:25.360 Now, first of all, that's high-quality sarcasm,
00:26:28.780 and I appreciate it.
00:26:30.660 That's some good sarcasm.
00:26:31.760 However, as your designated persuasion guide,
00:26:39.460 he's completely wrong.
00:26:42.280 This was actually a really good seduction technique
00:26:46.700 by the Pfizer employee.
00:26:48.820 Does anybody see it, or do I have to explain it to you?
00:26:51.320 This was really strong romantic persuasion.
00:26:56.640 Okay, you see it, right?
00:26:58.360 All right, here's why.
00:26:59.840 Number one, it puts him at the center
00:27:01.900 of the most important thing in the world.
00:27:04.940 Has anybody ever tried to do that on a date?
00:27:07.300 By the way, by the way, I'm at the center of a...
00:27:12.360 I'm really close to the most important thing in the whole world.
00:27:16.300 How are you doing today?
00:27:18.060 I just thought I'd mention that.
00:27:20.380 Yeah, no, I'm actually close to the most important thing in the world.
00:27:24.240 That's how important I am.
00:27:26.580 That's a total romantic play.
00:27:29.360 Absolutely.
00:27:29.840 He also explained the situation,
00:27:35.780 which made him look unusually smart,
00:27:39.520 which he probably is.
00:27:41.220 I believe he has
00:27:42.380 some advanced degrees.
00:27:44.580 So has anybody ever tried to show you
00:27:46.580 that they're unusually smart on a date?
00:27:50.300 Yes.
00:27:52.340 And he did it well.
00:27:53.940 He did it without looking like he was arrogant.
00:27:56.260 That's pretty good.
00:27:59.320 He managed to brag
00:28:00.860 without sounding like he was too full of himself.
00:28:05.020 I mean, he got close to the line,
00:28:06.300 but I think he pulled it off.
00:28:09.440 He also acted somewhat unconcerned about the risk.
00:28:16.240 Is it...
00:28:17.020 If you're on a date,
00:28:18.200 do you like to show that you're a frightened little pussy
00:28:22.140 to seduce somebody?
00:28:24.720 Does that work?
00:28:25.540 Oh, I'm frightened of things.
00:28:27.640 Oh, I'm frightened of things.
00:28:29.740 Would that be good?
00:28:30.840 I don't think so.
00:28:31.920 How about saying there's something
00:28:33.140 that could wipe out humanity
00:28:34.480 and you're so unconcerned,
00:28:36.620 you chuckle at it?
00:28:38.640 No, that's good technique.
00:28:41.340 Yeah.
00:28:41.740 Well, the world could catch on fire tomorrow,
00:28:43.560 but I'm having a good day.
00:28:45.360 Strong technique.
00:28:49.160 And secondly,
00:28:50.200 we saw them at a private moment.
00:28:51.960 Don't compare their private moment
00:28:54.620 to your public moments.
00:28:56.160 In public,
00:28:57.160 would I ever laugh about the potential of a pandemic?
00:29:01.100 Probably not.
00:29:03.040 In private,
00:29:04.180 would I crack a joke about another pandemic
00:29:06.920 being sparked by something?
00:29:08.600 Yes, I would.
00:29:09.800 Yes, I would.
00:29:10.160 In private,
00:29:11.160 of course I would.
00:29:12.460 Yeah.
00:29:13.500 In private.
00:29:14.280 But would I privately say things
00:29:16.740 that are just horrible,
00:29:18.400 horrible if I said them in public?
00:29:20.380 All the time.
00:29:22.260 That's called a sense of humor
00:29:23.880 with people who share that sensibility.
00:29:26.960 If you send it to somebody
00:29:28.060 who doesn't share your sense of humor,
00:29:30.060 they're going to take it in the wrong context.
00:29:32.020 But yes.
00:29:33.320 So I'm going to be totally contrarian
00:29:36.120 on the Pfizer employee.
00:29:37.920 Whatever Pfizer's doing
00:29:39.480 is a separate concern,
00:29:40.740 but I'm going to validate
00:29:43.340 his dating technique
00:29:45.020 as strong.
00:29:47.900 His dating technique was strong.
00:29:50.040 He just wasn't on a date.
00:29:51.640 He didn't know it.
00:29:53.760 That's all.
00:29:54.540 I just thought I'd put that out there.
00:29:57.380 All right.
00:29:58.520 And let me conclude with this.
00:30:00.840 Do you know,
00:30:01.820 this is the best definition of charisma
00:30:03.500 I've ever heard.
00:30:06.020 Charisma is power plus empathy.
00:30:08.920 You have to have both.
00:30:10.740 Because if you have power,
00:30:11.960 but you don't have empathy for people,
00:30:13.900 they're just,
00:30:14.280 you're just a monster.
00:30:16.060 And if you have empathy,
00:30:17.080 but you have no power,
00:30:18.200 well, you're not much good to anybody.
00:30:20.580 You're just somebody who feels bad.
00:30:22.840 And you're not really helping.
00:30:24.660 But if you have power,
00:30:26.440 and you have empathy for people,
00:30:28.260 then people are drawn to you.
00:30:30.280 This Pfizer employee
00:30:32.440 showed that he was at the center of power,
00:30:36.220 you know, around this virus question.
00:30:37.680 And he said directly
00:30:41.320 that it would be terrible if it got out.
00:30:44.980 So he showed he had power,
00:30:47.520 and he also had empathy,
00:30:49.020 because he said,
00:30:51.240 you know,
00:30:51.500 it would be horrible if this got out.
00:30:53.560 Charisma.
00:30:54.780 He nailed it.
00:30:56.000 He totally nailed it.
00:30:58.900 All right.
00:30:59.760 No, I don't want to date him.
00:31:01.020 I know you're going to say that.
00:31:02.960 Thomas Massie had a good tweet
00:31:05.600 about the balloon.
00:31:08.100 Because I think the Chinese balloon,
00:31:09.500 the biggest part of the Chinese balloon,
00:31:11.360 spy balloon story,
00:31:12.700 is the jokes about it.
00:31:15.640 Am I right?
00:31:17.060 I think in the end,
00:31:18.260 it didn't have any impact on anything.
00:31:21.080 But the jokes will live forever.
00:31:24.920 Thomas Massie had a good one.
00:31:26.300 In a tweet, he said,
00:31:27.280 if Biden shoots down the balloon,
00:31:29.340 it will be the first thing he's done
00:31:31.240 to combat inflation.
00:31:35.640 Well done.
00:31:36.880 Well done, Thomas Massie.
00:31:38.620 Not to be outdone,
00:31:40.200 Matt Gaetz tweets,
00:31:41.860 what can the balloon teach us
00:31:43.240 about white rage?
00:31:50.860 Perfect.
00:31:53.500 Now,
00:31:54.600 let me say some more good things
00:31:56.280 about Matt Gaetz.
00:31:57.920 He's like the surprise politician
00:31:59.780 of the year.
00:32:00.760 Because he gets out of the biggest hole
00:32:02.620 and does the most noticeably positive things
00:32:07.780 in the shortest amount of time.
00:32:09.920 I mean,
00:32:10.300 you've never seen anything like it.
00:32:12.100 Here again,
00:32:13.980 he's sort of showing you
00:32:17.040 the model that works.
00:32:19.440 We need to mock the white rage thing
00:32:21.960 out of existence.
00:32:23.680 We meaning all of us,
00:32:24.820 not just white people.
00:32:25.760 It just needs to leave the conversation.
00:32:29.740 And mocking it
00:32:30.540 is the best way to do it.
00:32:32.260 You know,
00:32:32.420 treating it like,
00:32:33.920 if it's everything,
00:32:34.940 it's nothing.
00:32:35.920 Right?
00:32:36.120 If everything's racist,
00:32:37.380 then nothing is.
00:32:38.480 So I love his take on it
00:32:41.660 instead of saying,
00:32:42.980 here's the wrong way to do it.
00:32:45.040 Oh,
00:32:45.360 you're the real racist.
00:32:47.000 No,
00:32:47.260 that's wrong.
00:32:48.220 That doesn't work.
00:32:49.320 Or how about,
00:32:50.040 I'm not a racist.
00:32:51.600 That never worked.
00:32:53.540 So you can't say you're not one.
00:32:54.920 You can't say the other is one.
00:32:56.600 You can't say that your example is bad.
00:32:59.140 But you can mock them
00:33:00.480 for having nothing else to say.
00:33:01.920 Okay.
00:33:02.840 That'll work every time.
00:33:04.840 So once again,
00:33:06.360 Matt Gaetz has found
00:33:07.800 this very narrow channel,
00:33:10.540 which is the only one that works.
00:33:13.220 Right?
00:33:13.360 All these things don't work.
00:33:15.240 And there's just this little narrow path
00:33:17.260 that totally works.
00:33:18.920 And Matt Gaetz is the only one on the path.
00:33:21.780 That means something.
00:33:23.520 Right?
00:33:23.700 He's the only one who found the path.
00:33:25.240 Let's see if he can exploit it more.
00:33:26.800 We'll talk a little bit more
00:33:30.580 on that topic in a little while.
00:33:31.880 All right.
00:33:32.160 China's reaction to the shooting down.
00:33:35.020 As you know,
00:33:35.760 Biden did have the thing
00:33:37.480 shot down over the ocean
00:33:38.560 as soon as it got into
00:33:39.500 our international waters.
00:33:41.680 I'll say more about that in a minute.
00:33:44.340 So China's reaction saying
00:33:46.280 it reserves the right
00:33:47.560 to use necessary means
00:33:48.900 to deal with similar situations.
00:33:52.680 Okay.
00:33:54.580 Would you have any problem
00:33:56.100 if China shot down
00:33:58.360 our unmanned spy balloon over China?
00:34:02.120 If it were a similar situation,
00:34:04.180 would you be bitching much
00:34:05.500 about them shooting down
00:34:06.460 our spy balloon?
00:34:08.140 I wouldn't even peep about it.
00:34:10.680 So yes, this is fine, China.
00:34:12.860 I think we're on the same page.
00:34:14.800 The way that the big countries
00:34:17.040 handle the spy stuff
00:34:18.800 is such a weird theater.
00:34:22.980 It's like they all know
00:34:24.060 they're always going to do it.
00:34:25.300 They all know they're lying
00:34:26.620 and they just pretend
00:34:30.360 it didn't happen
00:34:31.020 and then we just go on
00:34:32.380 like that's okay.
00:34:34.260 All right.
00:34:37.360 Reserves the right
00:34:38.140 to do something similar.
00:34:39.600 How often are you going to have
00:34:40.800 a chance to do something
00:34:41.620 similar to that?
00:34:43.120 So the reason that Biden
00:34:44.420 shot it down over the water
00:34:45.700 is allegedly the following.
00:34:48.040 So apparently when
00:34:50.180 we saw this balloon approaching,
00:34:53.360 so here's the first context
00:34:55.280 we didn't know.
00:34:56.480 It was not uncommon
00:34:57.820 for Chinese spy balloons
00:35:00.660 to kind of drift into our airspace
00:35:03.540 but not necessarily stay there
00:35:05.660 and not necessarily be over
00:35:07.420 like a missile base or something.
00:35:10.160 So it's something that's happened enough
00:35:12.380 that when it happened,
00:35:13.720 the military didn't even think
00:35:16.400 to tell Biden
00:35:17.160 because it was still too routine.
00:35:20.280 Now that's some good context.
00:35:22.300 It sounds right.
00:35:23.620 It sounds like maybe
00:35:25.100 it was a little bit too routine.
00:35:26.880 Now remember,
00:35:27.400 it was only routine
00:35:28.380 until it got over Montana.
00:35:31.040 At that point,
00:35:32.380 totally not routine.
00:35:34.000 But there was a little time
00:35:35.220 that went by
00:35:35.960 from the time that the military
00:35:37.780 decided what to do
00:35:38.780 and told Biden
00:35:39.560 and Biden immediately
00:35:40.940 as he tells it,
00:35:41.940 says,
00:35:43.020 shoot it down.
00:35:43.920 Do you believe that?
00:35:45.460 Do you believe Biden's claim
00:35:47.200 and others
00:35:47.720 that his first reaction
00:35:49.480 was shoot it down?
00:35:52.660 You don't.
00:35:54.620 You don't believe it.
00:35:56.540 You know what?
00:35:56.980 I'm going to give him
00:36:00.120 the benefit of the doubt on that.
00:36:00.120 Surprising, I know.
00:36:01.880 If this were Trump,
00:36:03.820 I would say I believe it
00:36:05.620 because, you know,
00:36:07.300 it fits his personality.
00:36:08.780 But I want to believe this.
00:36:13.480 I want to believe it.
00:36:15.040 And it does fit
00:36:16.340 all of the facts, right?
00:36:18.580 Because what was the first reaction
00:36:20.800 of 100% of citizens?
00:36:23.740 100% of us.
00:36:24.800 What was our first reaction?
00:36:25.880 Shoot it down, right?
00:36:27.220 Did you even talk to
00:36:29.680 even one person
00:36:31.100 who did not say shoot it down?
00:36:35.500 I didn't meet one person.
00:36:37.860 Have you?
00:36:39.160 100% of the Americans
00:36:40.480 I talked to said shoot it down.
00:36:42.180 So you think that he was
00:36:43.540 the only American
00:36:44.380 who said don't shoot it down?
00:36:46.160 No, I believe him
00:36:47.540 because all Americans
00:36:49.840 said the same thing
00:36:50.600 at the same time.
00:36:52.880 Did that change your mind?
00:36:54.880 Everybody you know,
00:36:56.240 Democrat, Republican,
00:36:57.780 everybody said shoot it down.
00:36:58.980 So you think he would be
00:37:01.120 the only one who said no?
00:37:03.940 Possibly.
00:37:05.680 Possibly.
00:37:06.560 But I'm going to say
00:37:07.720 I lean toward believing him.
00:37:10.860 I lean toward believing him
00:37:12.260 with a big space
00:37:13.400 for skepticism.
00:37:15.480 But I'm going to lean
00:37:16.480 toward that
00:37:16.920 because I want to believe that.
00:37:19.340 And then it just took a while
00:37:20.440 and then the military said,
00:37:21.900 oh, it's not over
00:37:22.620 this big empty space
00:37:23.800 because they waited
00:37:24.720 a little too long
00:37:25.600 while they were discussing it.
00:37:27.400 And then it was over
00:37:28.040 more populated space.
00:37:30.140 And once it was over
00:37:31.340 a more populated space,
00:37:32.540 it was also not over
00:37:33.700 anything as sensitive
00:37:34.560 as a missile base.
00:37:36.080 So they just thought,
00:37:37.440 let's wait until
00:37:38.440 it's over the water
00:37:39.180 and guarantee that
00:37:40.640 no Americans get hurt.
00:37:42.480 Was that a bad decision?
00:37:44.880 Was it a bad decision
00:37:46.160 to take the risk
00:37:47.020 of Americans on the ground
00:37:48.260 getting hurt to zero?
00:37:51.280 Because they reduced
00:37:52.320 the risk to zero.
00:37:54.060 Was that a mistake?
00:37:55.100 I don't know.
00:37:58.420 I'm going to give
00:37:59.140 the military a pass on that.
00:38:02.780 By the way,
00:38:04.400 I'm a much more
00:38:05.100 flexible grader
00:38:06.040 on military stuff.
00:38:08.500 So even if it's
00:38:09.600 somebody I criticize
00:38:10.780 and everything else,
00:38:11.840 on military stuff,
00:38:12.860 I'm going to give them
00:38:13.460 a little bit more
00:38:16.800 credibility,
00:38:18.360 you know,
00:38:18.560 but I'll maintain
00:38:19.440 my skepticism.
00:38:20.920 But that's my bias.
00:38:22.340 My bias is
00:38:23.660 that when it comes to,
00:38:26.100 I think,
00:38:26.680 homeland security
00:38:27.660 is the closest
00:38:29.500 we get
00:38:30.560 to all agreeing
00:38:31.360 on stuff.
00:38:32.600 You know,
00:38:32.820 Democrats and
00:38:33.620 Republicans.
00:38:34.900 It's the closest
00:38:35.500 we ever get.
00:38:36.620 And I think
00:38:37.320 that you can usually
00:38:39.620 believe that
00:38:40.320 your leader is
00:38:41.280 protecting the country
00:38:42.700 as best they can.
00:38:44.140 Usually.
00:38:45.900 All right.
00:38:48.580 Is there anything
00:38:50.740 else to say
00:38:51.220 about that balloon?
00:38:53.060 Oh,
00:38:53.480 and the other thing
00:38:54.040 I learned,
00:38:54.700 which I guess
00:38:56.020 I should have
00:38:56.420 thought of,
00:38:57.000 but apparently
00:38:59.400 you have a better
00:39:00.280 chance of recovering
00:39:01.340 the parts
00:39:02.580 if you down it
00:39:03.760 in the ocean,
00:39:04.460 which might have
00:39:05.560 been the actual
00:39:06.120 most important part.
00:39:08.680 Now,
00:39:09.180 I'm not sure
00:39:09.640 I believe that
00:39:10.380 because you've got
00:39:11.080 to first of all
00:39:11.560 find it at the bottom
00:39:12.460 of the ocean
00:39:12.980 and second of all
00:39:14.360 the water can't be good for it.
00:39:15.860 But if it hits the ground,
00:39:17.340 that's not good either.
00:39:19.380 You know,
00:39:19.700 can't imagine
00:39:20.200 hitting the ground
00:39:20.820 would be good.
00:39:21.480 So it's possible
00:39:22.420 that we had
00:39:23.160 our best chance
00:39:24.040 of getting
00:39:24.380 their technology
00:39:25.240 by waiting
00:39:27.980 until it was
00:39:28.500 over water.
00:39:30.080 Now,
00:39:30.500 I don't think
00:39:31.000 that the military
00:39:31.720 would say that
00:39:32.320 directly,
00:39:32.880 would they?
00:39:34.320 I mean,
00:39:34.660 they did say
00:39:35.120 it's easier
00:39:35.560 to recover it,
00:39:36.620 but they did not say
00:39:37.980 that's why we waited.
00:39:40.720 The why we waited,
00:39:41.840 they put it
00:39:42.260 in public safety terms,
00:39:43.820 which is a good
00:39:44.700 spin on it.
00:39:46.740 But probably,
00:39:47.540 I'm going to guess
00:39:49.080 that the higher
00:39:50.800 motivation
00:39:51.420 might have been
00:39:52.160 a slightly better
00:39:53.640 chance of capturing
00:39:54.940 the technology
00:39:55.720 because that would
00:39:56.500 be the big win.
00:39:58.020 But keeping all
00:39:58.920 the Americans safe
00:39:59.560 is good too.
00:40:01.100 All right,
00:40:01.600 I want to give a shout out
00:40:02.540 to Van Jones
00:40:03.580 for always being
00:40:05.460 one of the most
00:40:06.040 interesting voices
00:40:07.040 even when you
00:40:07.700 disagree with him.
00:40:09.380 And I realize
00:40:10.780 this issue
00:40:11.200 has been around
00:40:11.620 a while,
00:40:11.980 but I just
00:40:12.760 processed it,
00:40:13.800 so I have
00:40:14.160 something to say
00:40:15.060 about it.
00:40:16.100 So when the
00:40:17.080 five black police
00:40:18.240 officers killed
00:40:18.980 the black
00:40:19.680 citizen,
00:40:23.580 a lot of people
00:40:25.380 said,
00:40:25.680 well,
00:40:25.860 this one can't
00:40:26.460 be racism
00:40:27.000 because they're
00:40:28.840 all black.
00:40:30.040 And Van Jones
00:40:30.660 said that maybe
00:40:32.200 it is racism
00:40:35.180 because black
00:40:36.260 people are not
00:40:37.160 immune.
00:40:38.240 He said,
00:40:38.740 one of the sad
00:40:39.260 facts about
00:40:39.920 anti-black racism
00:40:41.160 is that black
00:40:42.300 people ourselves
00:40:43.060 are not immune
00:40:44.100 to its pernicious
00:40:44.840 effects.
00:40:45.860 Society's message
00:40:46.660 that black people
00:40:47.380 are inferior,
00:40:48.200 unworthy,
00:40:48.660 and dangerous
00:40:49.260 is pervasive.
00:40:52.760 I haven't heard
00:40:53.680 that message,
00:40:54.260 have you?
00:40:56.920 Where have you
00:40:57.700 ever heard a
00:40:58.180 message,
00:40:59.300 like in recent
00:41:01.040 years,
00:41:02.460 where have you
00:41:03.260 seen any message
00:41:04.040 that black people
00:41:04.800 are inferior,
00:41:05.680 unworthy,
00:41:06.180 and dangerous?
00:41:09.520 I've literally
00:41:10.300 never heard that,
00:41:11.640 like in any
00:41:12.360 context anywhere,
00:41:13.240 in modern
00:41:14.160 times.
00:41:15.300 Have you?
00:41:18.040 Like,
00:41:18.940 is he talking
00:41:19.860 about just
00:41:20.280 individual racists
00:41:21.500 talking on the
00:41:22.680 porch or something?
00:41:24.020 Like,
00:41:24.320 where is this
00:41:24.800 message?
00:41:25.560 I mean,
00:41:25.760 it's not coming
00:41:26.220 from any kind
00:41:29.180 of media,
00:41:30.320 is it?
00:41:32.180 But this could
00:41:33.200 be one of those
00:41:33.760 bubble situations,
00:41:34.780 right?
00:41:35.640 It could be that
00:41:36.540 the bubble I'm
00:41:37.320 in just doesn't
00:41:38.720 see it.
00:41:39.800 And maybe the
00:41:40.560 bubble he's in,
00:41:41.460 it's just all
00:41:41.980 over the place,
00:41:42.660 which doesn't
00:41:43.460 make my bubble
00:41:44.140 better than
00:41:44.580 his bubble.
00:41:45.580 But this is
00:41:46.260 kind of
00:41:46.500 fascinating to
00:41:47.160 me.
00:41:48.220 See,
00:41:48.460 this is why
00:41:49.080 Van Jones
00:41:49.600 is interesting,
00:41:51.060 because,
00:41:52.140 like,
00:41:52.500 I actually feel
00:41:53.480 like I learn
00:41:54.140 something when
00:41:55.440 I look at his
00:41:56.260 opinions,
00:41:56.780 or at least
00:41:57.220 it, you know,
00:41:57.920 advances my,
00:41:58.920 you know,
00:41:59.680 field of
00:42:00.380 perception or
00:42:01.260 something.
00:42:02.540 All right,
00:42:02.900 but I've got
00:42:03.540 more to say
00:42:03.960 about this.
00:42:06.300 Over many
00:42:07.020 decades,
00:42:07.620 numerous
00:42:07.900 experiments have
00:42:08.760 shown that
00:42:09.180 these ideas
00:42:09.800 infiltrate black
00:42:11.140 minds as well
00:42:11.840 as white.
00:42:12.840 Self-hatred is
00:42:13.740 a real thing.
00:42:15.480 Is it?
00:42:17.020 Again,
00:42:17.580 I wouldn't
00:42:17.960 know.
00:42:20.320 Do you think
00:42:21.140 blacks have a
00:42:22.000 self-hatred
00:42:22.560 problem?
00:42:26.160 Do you think
00:42:26.780 all cultures
00:42:27.320 have a self-hatred
00:42:28.060 problem?
00:42:29.520 Yeah,
00:42:30.020 I don't know.
00:42:31.020 I guess that
00:42:31.700 would be
00:42:32.280 invisible to me.
00:42:33.340 But I'll take
00:42:33.900 the claim as
00:42:34.640 a credible claim
00:42:36.480 from a credible
00:42:37.200 person,
00:42:37.600 but it
00:42:38.680 wouldn't be
00:42:38.980 something I'd
00:42:39.440 be aware
00:42:39.740 of.
00:42:41.040 All right,
00:42:41.420 and then he
00:42:41.940 goes on to
00:42:42.360 say,
00:42:44.100 that's why a
00:42:44.600 black store
00:42:45.160 owner might
00:42:45.780 regard customers
00:42:46.640 of his same
00:42:47.660 race with
00:42:49.860 suspicion,
00:42:50.560 while treating
00:42:51.140 his white
00:42:51.560 patrons with
00:42:52.220 deference.
00:42:53.260 Black people
00:42:53.760 can harbor
00:42:54.300 anti-black
00:42:54.820 sentiments and
00:42:55.520 can act on
00:42:56.000 those feelings
00:42:56.520 in harmful
00:42:57.320 ways.
00:42:59.340 All right,
00:43:00.360 do you see
00:43:01.160 the value in
00:43:01.840 this yet?
00:43:05.100 Do you see
00:43:05.840 it yet?
00:43:06.160 All right,
00:43:07.100 let me tease
00:43:07.820 it apart a
00:43:08.180 little bit
00:43:08.400 more.
00:43:09.740 Van Jones
00:43:10.360 is saying
00:43:10.780 that black
00:43:11.240 people have
00:43:11.740 a bad
00:43:12.080 feeling about,
00:43:13.220 let's say,
00:43:14.040 a stranger
00:43:14.560 in the store
00:43:15.540 who happens
00:43:16.360 to be black,
00:43:17.640 and that
00:43:17.960 that's a
00:43:18.500 sign of
00:43:18.960 racism.
00:43:21.980 What I
00:43:22.620 say is
00:43:23.040 it's a
00:43:23.360 sign of
00:43:23.840 statistical
00:43:24.540 knowledge.
00:43:26.360 It's
00:43:26.540 statistics.
00:43:27.980 And I
00:43:28.200 think,
00:43:28.660 you know,
00:43:30.240 I think Van
00:43:31.060 Jones is
00:43:31.480 saying it
00:43:31.860 fairly directly
00:43:32.700 that as
00:43:34.080 long as
00:43:34.740 black
00:43:36.360 Americans
00:43:36.780 have a
00:43:37.220 high crime
00:43:37.840 rate,
00:43:39.040 that even
00:43:39.560 black people
00:43:40.280 will discriminate
00:43:40.880 against them.
00:43:42.380 This feels
00:43:43.120 like a
00:43:43.620 giant step
00:43:44.280 forward in
00:43:45.100 understanding
00:43:45.680 each other.
00:43:47.680 It's sort
00:43:48.440 of written
00:43:48.720 as a
00:43:49.140 complaint that
00:43:49.980 not only is
00:43:51.100 racism so
00:43:51.800 bad, you
00:43:52.480 know,
00:43:52.720 across races,
00:43:53.960 but it
00:43:54.580 even includes
00:43:55.360 within a
00:43:56.140 race.
00:43:56.680 But I
00:43:57.180 think this
00:43:57.540 moves the
00:43:58.020 ball forward.
00:43:58.660 Let me
00:44:00.700 say something
00:44:01.240 that I
00:44:01.700 never could
00:44:02.300 have said
00:44:02.620 out loud,
00:44:04.520 but you'll
00:44:05.160 understand it
00:44:05.780 now,
00:44:06.140 because now
00:44:06.520 you have
00:44:06.740 the proper
00:44:07.080 context.
00:44:08.200 Somebody will
00:44:08.540 take this
00:44:10.260 clip out of
00:44:10.860 context to
00:44:12.220 cancel me.
00:44:13.040 You ready?
00:44:13.500 So this is
00:44:14.060 something that
00:44:14.500 would cancel
00:44:15.040 me, but
00:44:16.340 Van Jones
00:44:17.180 makes it
00:44:18.080 safe to
00:44:18.560 say.
00:44:19.980 One of the
00:44:20.700 things that I
00:44:21.260 looked at when
00:44:21.800 I chose where
00:44:22.560 to live,
00:44:24.140 I looked at
00:44:25.060 safety from
00:44:26.360 natural disasters
00:44:27.240 and weather
00:44:28.000 and all
00:44:28.480 those things.
00:44:29.340 But one of
00:44:29.880 my biggest
00:44:30.300 things was
00:44:30.980 not being
00:44:31.940 around a
00:44:32.760 high concentration
00:44:33.560 of black
00:44:34.000 people.
00:44:37.320 Why?
00:44:38.360 Same reason
00:44:39.080 as the
00:44:39.360 black store
00:44:39.820 owner.
00:44:41.320 My reasoning
00:44:42.320 is exactly
00:44:43.080 the same
00:44:43.660 as the
00:44:44.720 black store
00:44:45.420 owner.
00:44:46.680 It has
00:44:47.120 nothing to
00:44:47.580 do with
00:44:47.860 any individual
00:44:48.520 black person.
00:44:49.960 I love
00:44:50.340 black people.
00:44:51.600 Pretty much
00:44:52.360 all of my
00:44:53.020 personal
00:44:53.520 experience with
00:44:55.080 black human
00:44:55.740 beings is
00:44:56.320 good.
00:44:56.540 I love
00:44:58.380 them.
00:44:59.000 They're
00:44:59.200 fun.
00:45:00.380 Always
00:45:00.700 have a
00:45:00.980 sense of
00:45:01.300 humor.
00:45:02.220 Have
00:45:02.460 something
00:45:02.740 more
00:45:02.960 interesting
00:45:03.380 to talk
00:45:03.840 about
00:45:04.060 usually.
00:45:06.420 But the
00:45:07.040 truth is
00:45:07.600 you can
00:45:08.120 use it
00:45:08.500 as a
00:45:08.980 proxy of
00:45:10.020 crime.
00:45:12.360 That's
00:45:12.820 just a
00:45:13.120 fact.
00:45:14.020 So the
00:45:14.380 black store
00:45:14.940 owner and
00:45:15.420 I are
00:45:16.280 making the
00:45:16.700 same
00:45:16.900 decision
00:45:17.300 for the
00:45:17.660 same
00:45:17.840 reason.
00:45:18.460 It's not because we don't like black people.
00:45:20.980 That has nothing to do with anything.
00:45:23.020 It has to do with it as a good proxy for where the crime is.
00:45:26.380 Now if there's a place with a high concentration of black American citizens and it's a low crime neighborhood, I would love to know about that because I'll give it some credit.
00:45:36.260 Because that would be useful too, wouldn't it?
00:45:38.220 Wouldn't you love to know that there are all black, or let's say predominantly black, neighborhoods in the United States that have lower than average crime?
00:45:48.780 Probably such places exist.
00:45:51.040 I've never heard of one. Have you?
00:45:52.800 I assume they exist somewhere.
00:45:56.880 Don't you think?
00:45:58.860 So here's the thing.
00:46:01.600 Can we get to a more mature and advanced understanding of this whole racism thing?
00:46:08.700 And I think Van Jones is leading us there simply by being open to the fact that even black people are seeing the statistical, let's say, risk-reward situation and acting on it.
00:46:22.800 And it's just what a lot of us do.
00:46:24.840 It has nothing to do with racism.
00:46:26.860 It's simply a proxy for what crime is. That's all.
00:46:31.420 Yeah, I mean, you could take it to the next level and say, you know, do you have exactly the same feeling when you see a, let's say, a young black male on a dark street, you know, you're walking down the street, would you have the same feeling as if it was a five-foot-six black female?
00:46:49.780 And the answer is no, no.
00:46:53.520 Because, so, I mean, they both have the same race.
00:46:57.460 It's simply a statistical, logical way to be.
00:47:03.120 And I don't think that I'm hurting the conversation. I think I'm helping it.
00:47:08.900 Because I think we have to get to the point where we're laughing, literally the, you know, the back gates approach, literally laughing at how we see each other as racist all the time, for just everything.
00:47:22.220 The healthiest thing we can do is accept that we're all a little bit racist.
00:47:27.500 There's nothing that could be healthier than that.
00:47:29.720 Because then it's funny, right?
00:47:31.860 If you say we're all racist and we're not proud of it, it's just funny.
00:47:37.500 And then you can laugh at it when you see it, sort of mock it out of social, you know, out of social consideration.
00:47:45.000 So, anyway, again, I always have high regard for Van Jones's opinions.
00:47:52.160 I would like to talk to you about something called the documentary effect.
00:47:56.780 I've decided to invent this effect.
00:47:59.440 Maybe it exists. I didn't Google it.
00:48:01.860 But I call it the documentary effect.
00:48:03.600 And it's going to be important when I talk about the Dark Horse podcast in a moment, which was talking about me the other day.
00:48:11.900 So I have a response to that.
00:48:14.660 So the documentary effect goes like this.
00:48:16.540 I'll just read you the tweet thread that I tweeted just moments ago.
00:48:21.020 I said, here's an experiment you can do at home.
00:48:23.320 Watch the documentary Leaving Neverland.
00:48:25.700 That's the one about Michael Jackson.
00:48:27.760 And you will be convinced that Michael Jackson was guilty of horrible child abuse.
00:48:34.120 And you won't have any doubt about it.
00:48:36.600 If you watch that documentary, you will be completely convinced that he did those terrible crimes.
00:48:43.240 Now, here's the second part.
00:48:45.420 Then Google the term Leaving Neverland debunked.
00:48:51.100 And immediately a whole bunch of debunks come up, and you start reading through those and see what happens.
00:48:58.180 And I can tell you what happens. You will change your mind.
00:49:02.080 And a well-made documentary will be 100% persuasive to the average viewer because only one side is presented.
00:49:09.000 Whenever you do a real slick, well-produced thing that only shows one side, it's always persuasive.
00:49:16.840 Always.
00:49:17.480 It doesn't matter if it's true or false. It's really persuasive.
00:49:22.700 If that's all you see, well, then you're done.
00:49:27.020 But the debunk of the documentary will also be 100% persuasive.
00:49:33.120 2,000 Mules, thank you, good example.
00:49:35.460 If you watch 2,000 Mules, just by itself, it is really persuasive.
00:49:41.000 If you Google 2,000 Mules debunked and read what the people who have complaints say, you will be persuaded that there wasn't necessarily anything there.
00:49:53.260 They're both persuasive.
00:49:55.320 If you don't understand that they're both persuasive, and 100% so, then you're not seeing the field clearly. Right?
00:50:03.140 And that's because even the debunks... now, do you believe the debunk?
00:50:11.480 I want to be clear on this. I'm not saying the debunk is correct.
00:50:16.680 I'm saying they're both persuasive, 100%.
00:50:20.500 It's just like two worlds.
00:50:23.540 You see them and you look at them and go, yeah, I don't know how to debate that.
00:50:27.200 You look at the other one and you go, I don't know how to debate that one either.
00:50:29.880 They both look persuasive.
00:50:31.060 So, here's my take. Bottom line.
00:50:36.020 If you think you saw something that was 100% persuasive to you on, let's say, the excellent Dark Horse podcast or any other podcast, including this one, you're right.
00:50:49.060 You're right.
00:50:50.180 If you saw something that you thought was 100% persuasive on the Dark Horse podcast or any other good podcast, you're right. It was 100% persuasive.
00:51:00.160 It doesn't mean it's right, because the debunk might be just as persuasive.
00:51:05.780 Now, do you hold that to be true, and do you actually consider that when you see content?
00:51:15.880 When you're watching Tucker Carlson, do you say to yourself, I'm only seeing Tucker's view, I'm only seeing Tucker's view, or do you say, damn, that's convincing?
00:51:24.520 I think most of you would look at Tucker and say, that's 100% persuasive.
00:51:31.280 Because it is. It is.
00:51:33.400 You watch Tucker, and he is 100% persuasive.
00:51:36.960 Does it mean he's right?
00:51:39.760 How would I know?
00:51:42.120 Probably most of the time, maybe sometimes, how would I know?
00:51:47.900 I wouldn't know. But I know he's persuasive.
00:51:51.920 And I know that if I turn to CNN and see them talking to the other side, I know that I'll think, huh, it looks pretty good.
00:52:01.720 All right. So just keep in mind, the things that are not true are 100% persuasive.
00:52:08.440 Have I made that point?
00:52:10.220 Will you all accept that the things that are not true can look 100% persuasive if you just see the one story?
00:52:17.340 All right. So just keep that in mind.
00:52:19.440 That's your primer for the rest of this.
00:52:21.920 So I watched Bret Weinstein and Heather Heying talking about my opinions on who got things right during the pandemic.
00:52:32.060 And they made some points about it, and specifically the question was, were they right, more right than other people?
00:52:42.540 Were they more right than other people because they had a better process?
00:52:46.380 And then I asked the question, can you teach me the process?
00:52:50.680 And then they talked about why that was impractical.
00:52:54.220 So my expertise is not science, as you know, so I can't really judge anything they say about the science.
00:53:01.680 Would you agree?
00:53:02.920 Would you agree that I am incapable of judging the science independently?
00:53:08.120 I can listen to what they say, and other people, but I don't have the skill to independently look at the science.
00:53:15.940 All right. So, but here's a skill I do have.
00:53:21.780 Here's a skill I do have. I can recognize cognitive dissonance in the wild.
00:53:28.520 So I'm going to tell you the tells for cognitive dissonance that the Dark Horse podcast, Heather and Bret, displayed.
00:53:39.420 All right. Now you be the judge, and watch how persuasive this is.
00:53:45.120 And if I persuade you that they're hallucinating, and you're 100% persuaded, what should be your opinion when you leave this live stream?
00:53:54.820 That it's true? No.
00:53:57.680 Because you're only seeing my presentation.
00:54:00.700 My presentation is going to be super persuasive.
00:54:05.160 It might be right. I think it is, but I always think I am. Right? That doesn't mean I am.
00:54:11.140 So watch how persuasive I am. You ready?
00:54:16.040 All right. Here's the first tell.
00:54:20.660 Heather described that my problem is, in all likelihood, that while, even though I'm the Dilbert guy, and they acknowledge that I might be bright in my own domain, so even though I'm a bright Dilbert guy, Dilbert-creating guy, who definitely doesn't trust management, which he acknowledged, my problem was that I did trust the scientists.
00:54:46.540 So I had an oversimplified trust in science, which they, as scientists, could confirm is too much trust.
00:54:56.960 That science is a messy process in which you're wrong for a long time, sometimes, until you're finally right, and that you can spend a lot of time in the wrong area until you crawl toward the truth with a variety of scientific systems and processes.
00:55:16.840 Now, does that sound like mind reading to you?
00:55:21.040 How could anybody judge my personal trust of scientists?
00:55:26.860 And why would you assume that I'm smart and I don't trust managers, but that somehow I didn't think to extend that to other professionals?
00:55:38.060 Well, first of all, I do. I mock scientists all the time.
00:55:42.960 And I say out loud, as often as you'll listen to it, that I don't believe the data, I don't believe the scientists, but here's what they're saying, and that you shouldn't believe them necessarily, and that none of the data is credible.
00:55:55.660 So I've said that none of the data from the mainstream scientists do I find 100% persuasive, but also, I'm so much of a skeptic, I also don't believe the skeptics.
00:56:09.520 So I'm twice as skeptical as the scientists themselves.
00:56:14.900 But her take was that I somehow had not noticed that scientists can be biased by money and stuff.
00:56:23.740 Now, here's the second point.
00:56:26.200 She makes the point, and I think Bret made it well, that scientists can be biased by money.
00:56:33.320 Do you think I didn't know that?
00:56:35.660 Is there anybody who thinks that I wasn't aware that scientists are biased by money?
00:56:39.940 It's like the most basic thing that an adult knows.
00:56:44.800 But then, why would a podcaster be immune from that effect?
00:56:51.360 Now, the usual argument is that they lost money and fame and stuff.
00:56:55.880 But once you're a podcaster... and let me back up here.
00:57:03.300 I should create a little base of, let's say, better behavior.
00:57:09.820 I think that Bret and Heather and a lot of the other podcasters who are, let's say, contrarians are super useful.
00:57:18.520 Like, society needs more Bret and Heather.
00:57:22.580 Do you agree?
00:57:23.280 Like, if we had more of them, we would all be a better world, right?
00:57:29.140 So I want to be clear, because I'm going to say some things that sound critical, but they were very professional, and I thought, unusually forgiving of my attitudes.
00:57:44.060 I was actually impressed at how professionally they handled what could be seen as criticism.
00:57:51.640 So a big, big, big compliment to both of them for being consistently good, let's say, good representatives of science, and good representatives of mature behavior in general, you know, online.
00:58:10.160 And, you know, I don't have either of those qualities, so these are sincere compliments.
00:58:15.900 That said, back to my point.
00:58:19.540 Once you start a podcast which is notable for being the contrarian view, what are you going to do?
00:58:29.860 Basically, as soon as you start the podcast, you become as biased as all the people you're criticizing.
00:58:38.060 And there's no way around that.
00:58:40.180 Because, you know, even though they may have lost money or some kinds of reputation or something, overall, once you're in the podcast contrarian world, what are you going to do?
00:58:53.780 So, yeah, it's the Alex Berenson effect.
00:58:56.200 Once you become branded as the person who disagrees with the standard statements, it's hard to feel that you don't have any bias in that situation.
00:59:07.080 Now, I think that they would have been more credible to call that out, that everybody has their own bias, including me, of course.
00:59:16.660 There's nobody who doesn't have bias.
00:59:19.300 And if you think that some scientists, for their money, have greater bias than somebody who does a podcast in which getting it right is the most important thing, who would watch their podcast if they got stuff wrong?
00:59:33.640 You wouldn't watch it long.
00:59:35.220 Well, maybe you would, Alex Jones, in some cases.
00:59:39.240 So, all right, so then the other thing that Bret said is that the way that you can tell, you know, that they're doing something right is that they got 8 out of 10 things right.
00:59:49.300 It's sort of, you know, not an exact score, but sort of generally 80% is what they got right.
00:59:55.360 And those are based on predictions.
00:59:57.500 Now, is that a good standard?
01:00:00.580 Suppose the only thing you knew is that they got 80% of their predictions right.
01:00:09.680 That's good, right?
01:00:12.580 So you'd give much higher credibility to somebody who got 80% of their predictions right, wouldn't you?
01:00:17.420 Okay. So, here's some questions for you.
01:00:22.380 Who did the scoring?
01:00:25.200 Who scored their predictions correct?
01:00:28.200 Well, they did.
01:00:29.820 Do they have any bias?
01:00:32.620 Yeah. Yeah.
01:00:33.720 Can you name somebody else who got 8 out of 10 predictions right about the pandemic?
01:00:40.280 Can you?
01:00:42.160 Me. Yeah, me.
01:00:44.660 Do you know why I got, I'm sure I got 80% of my predictions right?
01:00:49.560 Because I scored it.
01:00:51.720 I literally created a document and said, here are all the things I got right. Here's what I got wrong.
01:00:57.520 I scored it.
01:00:59.520 So, if Bret and Heather and I have different opinions, and we both got 80% according to our own scoring, who do you believe?
01:01:12.920 We both had the same record according to our own scoring.
01:01:17.540 Now, could there be a less scientific point of view than I got 80% right according to my own scoring, just like everybody else scored their own predictions?
01:01:29.300 Is there anybody here who didn't think they were right 80% of the time?
01:01:33.160 I'll bet everyone here believes their own predictions about the pandemic were 80% right or better.
01:01:44.000 Right?
01:01:45.200 So, I should believe every one of you, even if you have different opinions, because you all got 90% of everything right.
01:01:53.340 Okay. So, I would consider that super unscientific and a tell for cognitive dissonance, because there's no question, there's no question that Bret and Heather are statistically high capability, right, compared to the average person.
01:02:14.400 So, certainly, they would understand what I'm saying just as easily as you all do, right?
01:02:19.460 So, in order for them to be, you know, hyper scientific, rational people, which they are, for them to think that that was a good point, did they get there by good scientific rational thinking?
01:02:34.680 No, it's a tell for cognitive dissonance.
01:02:37.520 Because if somebody's an expert at something and says something that's clearly, completely ridiculous within their own profession, because there's no scientist that would agree with that, right?
01:02:49.920 Do you think any scientist, if I just pulled them aside and said, look, they scored their own predictions 80%, and so did everybody else. They scored their own predictions 80%. So, would we believe this one?
01:03:02.400 What scientist would agree with that statement?
01:03:05.000 Nobody, right?
01:03:06.680 So, I think that's a tell.
01:03:08.540 So, you've got the mind-reading tell.
01:03:10.820 Not only did Heather, and maybe Bret, I don't know, but not only did Heather imagine that I trusted scientists, which is literally the opposite of everything I've ever said in my entire life publicly, that's a ridiculous belief.
01:03:25.700 But it seemed logical, you know, within their structure, I guess.
01:03:31.940 All right, here's another one.
01:03:35.680 Well, also on the 8 out of 10, do you remember that I said that somebody was going to be right about everything because we all have different predictions?
01:03:44.900 Somebody was going to be, you know, even if you take away the subjective scoring of yourself, somebody was going to be right 8 out of 10 times.
01:03:52.420 Because a lot of people made a lot of predictions in public.
01:03:56.840 So, the existence of somebody who was right 8 out of 10 means nothing.
01:04:03.520 Nothing.
01:04:04.260 Because it was guaranteed. Because of a large group of people making lots of different predictions, chance alone guarantees, guarantees that somebody would have 8 out of 10 if you have a big enough population, which we did.
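A minimal sketch of that chance-alone point, in Python, assuming (purely for illustration, not anything said on either podcast) that each person makes 10 independent yes/no predictions by coin flip:

import random

# Sketch: with enough people guessing at random, some hit 8 of 10 by luck alone.
random.seed(1)
people = 10_000       # hypothetical crowd of public predictors
predictions = 10      # predictions per person
threshold = 8         # "8 out of 10 right"

lucky = sum(
    1
    for _ in range(people)
    if sum(random.random() < 0.5 for _ in range(predictions)) >= threshold
)
print(f"{lucky} of {people} coin-flippers scored {threshold}/10 or better")
# Binomial math puts pure guessers at roughly 5.5% for 8-plus of 10,
# so a few hundred "8 out of 10" track records show up with zero skill.

The only claim this illustrates is the one above: a big enough population of predictors guarantees some 8-out-of-10 records by chance. The 10,000 people and the coin-flip odds are made-up inputs, not data.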
01:04:18.380 All right, then the most telling, I think, was when they were trying to explain why they couldn't explain how they got it right.
01:04:28.360 Now, that's the part that I would definitely ask you to look at.
01:04:32.400 Here's what it looks like when you don't have cognitive dissonance.
01:04:36.060 All right, so I'll pretend to be Bret or Heather and not have cognitive dissonance.
01:04:41.840 And then Scott says to them, so can you show me how you got it right?
01:04:45.980 Here's a non-cognitive-dissonance answer.
01:04:50.560 Well, to be honest, every situation is different, so I wouldn't generalize this to, you know, every other decision.
01:04:57.420 However, there were four key variables that we looked at that we had a different take on, because we have greater knowledge about evolutionary biology.
01:05:07.660 And if you knew evolutionary biology, you could look at these four things.
01:05:11.460 I'm just making this up, right? It's just an example.
01:05:14.020 You could look at these four things through that lens, and it would be really clear.
01:05:19.120 Now, I wouldn't understand that, but I would say, well, that's somebody who knows what they're doing.
01:05:26.260 But suppose instead they say, well, it's really hard because it's complicated systems on systems, and it's hard to say in any given situation what would be the one way to go, because that would be oversimplifying.
01:05:40.140 So really, you have to understand the entire field, and we have sort of a field that is good at understanding other fields, so it gives us a better vision of it, but really, it's like archery.
01:05:52.300 If somebody was an excellent archer, could they explain it to you in a way that you could do it?
01:05:59.460 Now, what do I say when people resort to analogies?
01:06:04.540 They don't have an argument, right?
01:06:07.180 So if you went to me and said, hey, archer, can you teach me how to archer?
01:06:13.080 If the archer was not in cognitive dissonance, here's the answer.
01:06:17.220 Well, you would need two things.
01:06:19.700 One is, apparently, I have some genetic gifts that make me good at this.
01:06:24.100 On top of that, I practiced in a very disciplined way, and here's how I practiced.
01:06:29.080 Now, you probably couldn't reproduce this if you don't have the natural ability, but if you did, and you followed this same process, I'm confident you would get to the same result and be a great archer, too.
01:06:39.340 Now, is there anything in that answer that's confusing?
01:06:43.720 It's very easy to explain how an archer became good at archery.
01:06:49.080 It's just that you can't do it.
01:06:51.360 And likewise, had they not been experiencing cognitive dissonance, it would have been really easy to say, well, Scott, you forgot this variable and this one, and we have some knowledge about this area, so it stood out to us.
01:07:06.560 That's what I was looking for.
01:07:07.820 But if you get, instead, complications about complication, it's really hard to say, and what about archery? That's cognitive dissonance.
01:07:16.760 By the way, would you have recognized that? Would you have recognized that as cognitive dissonance?
01:07:26.040 All right. And then, I'm going to double whiteboard you.
01:07:32.680 I'm double whiteboarding you. Here it comes.
01:07:35.260 Suppose you say to me, but Scott, you've got to listen to the smart scientists because they have better arguments, and you can tell by looking at them.
01:07:47.680 Can I?
01:07:48.660 So you've got all these scientists who are saying things, and let's say the Dark Horse podcast got it right.
01:07:55.280 So let's take that as an assumption.
01:07:58.080 So let's say they got everything right, but it's different from what lots of other scientists said.
01:08:05.240 What skill would I employ to know that they got it right?
01:08:09.920 I'll tell you what skill I would employ.
01:08:12.440 I would employ an illusion called the documentary effect.
01:08:17.480 And I would say to myself, when I listen to that podcast, that sounds 100% persuasive.
01:08:23.540 And then I'm done.
01:08:24.520 And if I went to these other people, maybe they don't have a podcast, maybe they're not as persuasive, maybe I heard something on the Dark Horse podcast that biased me to thinking these people are wrong.
01:08:36.720 But the truth is, we have no tools, you and I, to know that this was right.
01:08:44.140 It might be right. Might be right every time.
01:08:48.020 I have no tools to know that, because I can't independently do what they do, and they can't explain it in a way that I could do it.
01:08:56.140 So it doesn't help me that they're right.
01:08:58.920 Their rightness has no value whatsoever, because I can't confirm it, and I can't get any sense of whether it's right by looking at it.
01:09:08.900 Now, but when I listen to them doing mind reading, it does bias me in one direction.
01:09:18.400 Now, here's the other thing that's inconvenient.
01:09:20.880 As Bret and Heather also describe very well, science is a process of moving toward the truth.
01:09:31.760 So you never really know if you're there.
01:09:34.720 You think Newton got gravity right until Einstein comes along and says, well, that's only in certain situations that's right.
01:09:41.760 So you're wrong, wrong, wrong, wrong, wrong.
01:09:43.920 But if you do everything right, you don't know how long it'll take, because it would be a different length of time for any specific issue.
01:09:52.380 But eventually, maybe you start locking into it and you're correct.
01:09:56.260 Here's my question for science.
01:09:59.680 How do Bret and Heather know where they are on this line?
01:10:04.520 How do they know they're there?
01:10:05.780 Is it because the current data agrees with them?
01:10:14.560 How often does the data turn out to be wrong?
01:10:17.800 All the time. Like, all the time.
01:10:21.780 Something like 50% of peer-reviewed papers end up not reproducible. 50%.
01:10:28.480 Somebody said 90% of meta-analyses are wrong. 90%.
01:10:34.060 So Bret and Heather say that they have achieved something like, you know, closer to truth, because 8 out of 10 predictions is a pretty good indicator of truth.
01:10:48.800 But I say there is no way to know where you are on the line.
01:10:53.220 And you spend 90% of your time wrong and 10% of your time right.
01:10:58.040 If you're going to guess and you don't know anything else, are you going to guess that an individual scientist is in the 90% of the time they're wrong, or, by luck, and it would be luck, did you pick the one point where they're in the 10% when they got things right?
01:11:15.620 How do you know where you are?
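A rough way to put numbers on that worry, as a sketch only: assume (these are made-up illustration numbers, not anything from the podcast) a 10% base rate of a claim being right, and notice that if the right version and the wrong version both look 100% persuasive, being persuaded moves the odds by exactly nothing.

# Bayes' rule sketch: how much does "it was persuasive" actually tell you?
def posterior_right(base_rate, p_persuasive_if_right, p_persuasive_if_wrong):
    """P(claim is right | the presentation persuaded you)."""
    num = base_rate * p_persuasive_if_right
    den = num + (1 - base_rate) * p_persuasive_if_wrong
    return num / den

# Both sides look 100% persuasive -> persuasion carries no information.
print(posterior_right(0.10, 1.0, 1.0))   # 0.10, unchanged from the base rate
# Only if persuasiveness actually tracked truth would it help, for example:
print(posterior_right(0.10, 0.9, 0.3))   # 0.25, a modest update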
01:11:18.280 Let's say Bret and Heather are excellent scientists, and they appear to be.
01:11:24.740 I mean, they're both published. They got books. They're high profile. They look like they know what they're doing.
01:11:30.520 But how do they know where they are?
01:11:34.580 How does anybody know that?
01:11:40.540 I'm watching your comments, and I love it when you're thinking, because it's a whole different kind of comment.
01:11:48.580 I can tell you now you're processing this, aren't you?
01:11:54.360 All right. So my expertise says that they are exhibiting all the tells for cognitive dissonance.
01:12:07.060 How many accept my interpretation that they couldn't know if they're right, they couldn't know if their predictions will work next time, they can't explain their process, and I understand that it might be complicated, and they're mind-reading me in a ridiculous way?
01:12:30.720 Doesn't it look like cognitive dissonance?
01:12:33.200 Now, let's say I persuaded you and you're 100% persuaded. Should you believe it?
01:12:39.280 No, don't believe me. Don't believe me.
01:12:42.500 Because that's the documentary effect.
01:12:44.620 So I just did to you what they did to you.
01:12:48.500 I showed you one side, and it's persuasive.
01:12:52.860 That's all I want.
01:12:54.120 So if what you're taking away from this is Scott's right and they're wrong, nope, that's the wrong lesson. Wrong lesson.
01:13:02.600 Here's the right lesson.
01:13:04.520 If you only hear one side of it, it'll be really persuasive to you.
01:13:08.320 So don't believe that. Never believe something that's persuasive on one side.
01:13:15.260 All right, let's talk about Kyle Becker, who had a great thread.
01:13:18.220 You should all be following Kyle Becker on Twitter.
01:13:22.680 He has great threads. And scoops and stuff like that. Stuff you don't see other places.
01:13:27.300 But he did a thread on the Biden crime family.
01:13:35.120 That's my term for it.
01:13:36.680 And in my opinion, once you see all the evidence, the things we know, not the things we're speculating, the things we know that Hunter did, the money we know that was transferred, the people who transferred it, it's 100% obvious that it's a Biden crime family and has been for a long time.
01:13:58.000 Does anybody disagree with that?
01:14:03.260 Katie. The trolls are here. Goodbye. Goodbye.
01:14:10.320 Yeah. So doesn't it make you wonder why they haven't been arrested already?
01:14:17.020 Do you wonder why they haven't been arrested?
01:14:21.600 So here's what I think's going on.
01:14:24.820 The Biden crime family narrative is so well documented, so obvious, that I treat it as a fact.
01:14:33.640 Do you?
01:14:35.440 I treat it as an established fact.
01:14:37.740 And there are a lot of things in the news I don't treat as facts, because I don't think the evidence suggests that yet.
01:14:43.220 But I treat that as a fact. It's just so well documented.
01:14:47.020 Here's the dumbest comment of the day.
01:14:54.560 L. Carey. Sorry, Scott. You were clueless if you didn't know spike proteins was toxic when you got jabbed.
01:15:01.720 I just did, like, a whole presentation about how I don't know science.
01:15:06.760 Did you listen to any of that?
01:15:10.380 How about, Scott doesn't know how to do fusion.
01:15:15.100 Well, it's true. I don't.
01:15:17.640 All right, let me get back to my point.
01:15:21.000 I think what's going on is, Craig Chamberlain on Twitter said this.
01:15:26.200 He goes, quote, it's so obvious and bad that it borders on parody.
01:15:32.800 Seems like it's the growing defense.
01:15:35.060 And I believe that something is going on here with our perception of the Bidens.
01:15:39.700 I believe our brains are broken, like, collectively, because we can see it's just obvious that it's a criminal enterprise running our government.
01:15:51.880 It's obvious.
01:15:54.040 Yet nothing is happening about it.
01:15:57.460 And you can't reconcile those two things.
01:16:00.060 Your brain cannot reconcile it. It's obvious, a major criminal thing, and nothing's happening.
01:16:06.200 So you have to pick.
01:16:09.360 Well, maybe it's not important if nothing's happening.
01:16:14.120 But wait, it's obviously important, but nothing's happening.
01:16:20.140 So maybe other people who know things think it's not important?
01:16:24.840 And so your brain can't handle that.
01:16:27.320 Your brain is actually in a state of abdication.
01:16:32.040 You basically give up. You're beaten down.
01:16:35.740 So they've actually beaten down the public to where they can do a major crime in public and you won't act.
01:16:46.600 That's where you actually are.
01:16:48.760 They can do the crime right in front of you and you won't act, because you're just stunned, because none of it makes sense.
01:16:56.360 So we're basically gaslighted into complete inactivity, and also into blindness in some cases.
01:17:03.420 And the blindness would be in the case of mostly Democrats, who could easily see it, but they're not reading any Kyle Becker threads.
01:17:12.560 How many Democrats are following Kyle Becker? Zero?
01:17:17.360 Same with me, probably.
01:17:22.720 Yeah, the thing about the prosecutor being fired, I'm not sure that... I don't know if that's the most important part of the story, but it could be.
01:17:31.380 So it's the ultimate gaslight.
01:17:33.980 The Biden crime family just exists, and nothing's going to happen.
01:17:39.060 And we'll just have to deal with that, I guess.
01:17:43.100 Because if something did happen, it would be big.
01:17:47.760 All right.
01:17:51.480 And then, oh, just one other thing about the Dark Horse podcast.
01:17:56.040 I didn't see anywhere on the podcast where they mentioned how they determined the long COVID risk.
01:18:04.500 Have they ever talked about that?
01:18:07.640 Because the thing I finally figured out is it all comes down to that.
01:18:12.420 I tried to figure out why so many smart people were disagreeing with me, but when you dig down and you match assumptions, it always comes down to this: the people who disagreed on vaccinations were people who thought the long COVID risk was either zero, or they didn't think about it.
01:18:32.160 So to me, that's the biggest variable, and the people who have different opinions than I did, they just ignore it.
01:18:40.200 They just ignore the biggest variable.
01:18:41.620 And by ignore it, I mean some just say, I knew it was nothing.
01:18:47.300 So that's guessing.
01:18:49.340 So if, and I don't know if this is the case, so you'll have to confirm, but if it's the case that Bret and Heather did like the greatest job of all time looking at the, you know, the details of the vaccinations and the harm and all that, but if they didn't also suss out the long COVID risk, or they treated it like an unknown, you're still guessing.
01:19:13.880 If the biggest variable is unknown, you're guessing.
01:19:18.080 There's just no way around that.
01:19:21.660 Now, I guess there is a way around it if you had some heuristic rule of thumb that could handle it, but there isn't.
01:19:27.960 It's a totally novel situation.
01:19:32.740 All right.
01:19:34.040 But let me, let me close this by saying that the value and the beauty, if I will say that, of the Dark Horse podcast is that they're enriching the conversation, and they're perfectly compatible with science in the sense that they're acknowledging you don't know everything at every point, you're crawling toward the truth, you're adjusting your assumptions.
01:20:01.840 The more you see of that, the better off we are.
01:20:04.260 Michael Storer, Scott didn't think logically, he just got scared and rushed down to get a clot shot.
01:20:13.200 Now, Michael, do you think that that might be mind reading, and a clear signal that you're in cognitive dissonance?
01:20:22.200 Because, do you think you can read my mind and know what I'm afraid of?
01:20:27.720 Do you think that's a thing?
01:20:30.360 And shouldn't I be, let's say, fully accounting for all my risks?
01:20:39.060 Yeah. I think everybody who thinks that somebody acted just on fear, you're not really a useful member of the conversation.
01:20:48.660 All right, I've got a hypothesis here that will really make you mad.
01:20:53.840 You ready? You ready to be mad?
01:20:57.260 Hypothesis coming. All right?
01:20:59.520 Now, this is not a one-for-one situation.
01:21:02.040 I'm going to talk about averages, but I'm not talking about you individually, right?
01:21:06.820 So, don't tell me what you individually feel.
01:21:10.020 That would be misunderstanding my point.
01:21:13.040 So, I'll beg you, I'm only doing this because I know you can't resist, I'm going to beg you not to tell me your personal situation, and you'll still have to do it.
01:21:24.580 All right? I'm begging you, do not put in the comments what you personally think and did.
01:21:30.260 See if you can resist.
01:21:31.980 Here's my hypothesis.
01:21:33.860 People who are afraid of needles were more likely to be against the COVID shot.
01:21:44.920 I'm just looking at your comments.
01:21:46.520 I'm going to see who's the first one to... who's the first one?
01:21:50.880 Who's the first one? Come on.
01:21:53.240 You know you can't resist. Come on. Come on.
01:21:58.960 Okay, you're doing pretty good.
01:22:00.760 I was expecting somebody to say, I'm not afraid of that thing and therefore your theory is wrong.
01:22:08.840 How much would you bet on it?
01:22:13.160 I'll put an actual monetary bet that if you did... I don't know if you could do a good enough study.
01:22:18.740 I never trust studies, so I don't know if there's any way to collect on the bet, but I would bet a large amount that the people who are afraid of needles, just in general, were far more likely...
01:22:33.960 You know, when I say far more likely, it might be like a 20% effect, you know, not a 100% effect, but like a real solid 20% effect or something like that.
01:22:43.640 That people... now the reason I say that... oh. Oh, I'm not afraid of needles.
01:22:52.260 So consistent with my hypothesis, I'm not afraid of needles.
01:22:57.320 So I put it off as long as I could, you know, so I could know as much as possible, but when it was time to get the needle I never even thought about it.
01:23:06.520 But I know myself, and if I were deathly afraid of needles, which is a big thing, some people are, if I were afraid of needles, I'm sure I would be more against these needles.
01:23:20.260 I'm sure I would, because I know myself.
01:23:22.380 I know I would be biased by that.
01:23:24.940 I think it's... I think that applies to everybody.
01:23:28.260 You know, any knowledge of psychology... well, let me put it this way.
01:23:34.160 Let's see in the comments only the people who have an education in psychology, because there are always some psychologists and cognitive scientists and stuff. Right?
01:23:44.980 So you have to tell me you're a scientist or a psychologist.
01:23:49.100 So say, I'm a psychologist, and then you answer.
01:23:52.700 If you're a psychologist, do you agree with me that the people afraid of needles are more likely to say they don't work?
01:24:00.300 Go. Just the people who are experts in psychology.
01:24:03.740 But say what your expertise is.
01:24:05.460 Data scientist? Yes.
01:24:10.080 How much more likely? Maybe 20%.
01:24:12.260 Psychology? Yes. Scientists? Yes.
01:24:16.140 So basically everybody who has... I'm a psychiatry resident... agree. Right?
01:24:24.860 Clinical psychology? Yes. Welder? Yes.
01:24:28.960 I give credit to a welder. Yes.
01:24:32.500 Right? Right.
01:24:34.380 Psychologist? 75% yes. I'll take 75%. Yeah.
01:24:39.600 Now it's definitely not 100%.
01:24:41.420 It's not like every person is afraid of needles. Right?
01:24:45.700 It's just like a 20% bias would be my best guess.
01:24:52.880 Yeah. And there's a lot of guessing.
01:24:55.100 All right. All right. Ladies and gentlemen.
01:24:58.520 I believe I have delivered to you once again some unique and provocative views that you have not seen.
01:25:07.180 And that's why you watch this live stream.
01:25:11.700 And I will tell you again, do I know that I'm right about everything? Of course not.
01:25:17.680 Do I know that I will predict better than the Dark Horse podcast on the next thing we predict? Nope. Nope.
01:25:28.520 They're smart people.
01:25:29.500 If you bet against the Dark Horse podcast, maybe a bad bet.
01:25:35.640 That might be a bad bet, because they've got skills.
01:25:39.500 Right? They've got real skills.
01:25:41.000 They have a real process. They have real experience.
01:25:43.840 And they're credible people.
01:25:45.720 So I wouldn't bet against them, but knowing that somebody is right is tough. That's tough.
01:25:54.300 All right. And that's all for today for YouTube.
01:25:56.440 And I'm going to talk to the Locals people for a little bit because they're special.
01:26:01.400 Bye for now.