Real Coffee with Scott Adams - August 24, 2021


Episode 1478 Scott Adams: Vaccination Reasoning Viewed Through a Hypnosis Filter. And Coffee.


Episode Stats


Length: 44 minutes
Words per minute: 152.65
Word count: 6,844
Sentence count: 550

Harmful content
Misogyny: 5 sentences flagged
Hate speech: 19 sentences flagged


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In the latest episode of Coffee with Scott Adams, the host talks about the Portland Police Department's decision to let two opposing political groups fight it out on the streets of the city, and why it might be a good idea to let them fight. Plus, an update on the Biden administration's efforts to get the troops out of Afghanistan.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 Bum, bum, bum, bum.
00:00:02.220 Good morning, everybody.
00:00:04.280 And it's time for another rousing edition of Coffee with Scott Adams,
00:00:09.600 the best thing that happens in the universe,
00:00:11.880 except for the sun itself.
00:00:14.460 And if you'd like to make it better,
00:00:16.420 well, all you need is a copper mug or a glass,
00:00:19.160 a tankard chalice or a stein, a canteen drug or a flask,
00:00:22.200 a vessel of any kind.
00:00:24.240 Fill it with your favorite liquid.
00:00:25.160 I like coffee.
00:00:26.760 Join me now for the unparalleled pleasure,
00:00:29.560 the dopamine hit of the day.
00:00:33.280 Somebody says, I simply tolerate Scott.
00:00:36.820 Well, thank you.
00:00:38.300 Some of you like the content,
00:00:40.660 but others are here for the tolerance,
00:00:43.100 and I appreciate it.
00:00:44.640 So join me now for the unparalleled pleasure,
00:00:46.580 the dopamine hit of the day,
00:00:47.520 the thing that makes everything better.
00:00:48.640 It's called the simultaneous sip.
00:00:50.780 Go.
00:00:54.800 Ah.
00:00:56.320 Oh, good.
00:00:58.420 Yeah, that's good stuff.
00:00:59.560 Well, in Portland,
00:01:03.840 the people in charge have decided that instead of using the police,
00:01:08.960 which, as you know, are overfunded in Portland,
00:01:12.280 according to Portland,
00:01:13.980 instead of using their limited police
00:01:15.880 to break up violence between opposing political groups in the city,
00:01:20.900 they decided to stand back and let them fight.
00:01:23.620 Because they were just using things like sprays and clubs and stuff like that.
00:01:30.640 It wasn't especially dangerous until the gunfire.
00:01:33.840 I guess the police got involved when there were shots fired.
00:01:37.220 But until then, they actually just stood back and observed,
00:01:40.820 and they let the two opposing sides fight it out.
00:01:43.960 To which I say, thank you.
00:01:46.820 I have never agreed with Mayor Wheeler more than right this moment.
00:01:53.820 They're letting them fight it out.
00:01:55.440 I don't know what could end it faster.
00:02:00.720 Right?
00:02:01.780 Because it seems to me that if you try to break it up
00:02:04.240 and you keep them from hurting each other too much
00:02:07.080 or too many people getting hurt,
00:02:08.860 they'll just come back.
00:02:10.600 But what happens if you let them fight it out a few times?
00:02:13.440 I think fewer people show up next time.
00:02:18.140 You know, if you've got your head broken,
00:02:20.660 probably you don't go back.
00:02:22.540 I don't know.
00:02:23.620 So we'll try that.
00:02:25.040 And I like it in the A-B testing sense of things.
00:02:28.620 So while common sense tells you,
00:02:30.300 hey, put those police in there
00:02:31.860 and stop that fighting on your streets,
00:02:34.700 well, let's try this.
00:02:36.280 It could work.
00:02:37.900 I'm not saying it will.
00:02:39.720 I don't think you can quite predict
00:02:41.540 that it'll come to a good end if you let them fight.
00:02:45.080 But we haven't tried it.
00:02:47.260 Let's give it a try.
00:02:49.040 If Portland wants to be the test case for a let them fight,
00:02:53.960 I say put some bleachers in there and sell tickets
00:02:56.220 and let's go nuts.
00:02:58.960 Well, I think I predicted to you
00:03:01.880 that Afghanistan has some surprises coming.
00:03:05.220 I didn't know exactly what the surprises would be.
00:03:07.560 But one thing you can be sure of
00:03:09.680 that whenever you've got one of these fog of war
00:03:12.440 or chaotic situations,
00:03:14.340 something is going to come out of this
00:03:16.140 that you say to yourself,
00:03:17.780 what?
00:03:19.460 How did that happen?
00:03:21.040 You know, just something completely
00:03:22.860 out of left field that explains everything.
00:03:25.820 Now, I don't know what that'll be.
00:03:27.580 But there's a weird thing happening now
00:03:29.700 in slow motion that I don't understand.
00:03:33.080 And it goes like this.
00:03:34.200 The Americans are getting out of Afghanistan
00:03:37.620 without violence.
00:03:40.260 Mostly.
00:03:41.340 Right?
00:03:41.660 You're hearing anecdotal stories
00:03:43.000 of somebody who doesn't want to cross a checkpoint.
00:03:46.660 Stories of the Taliban taking away passports,
00:03:50.960 something like that.
00:03:51.920 And it's probably happening
00:03:52.780 because I doubt the Taliban has such command and control
00:03:55.900 that they can control every checkpoint.
00:03:58.200 But overall, am I wrong
00:04:02.300 that the Biden administration
00:04:06.240 seems to think it can get the people out
00:04:09.200 on time by the end of the month
00:04:12.120 and that it's going peacefully?
00:04:16.520 Now, here's the weird thing.
00:04:19.500 You know your unintended consequences
00:04:21.940 and we're all geniuses?
00:04:23.740 Aren't we all geniuses?
00:04:25.400 We can predict exactly what will happen
00:04:28.360 in every international situation
00:04:30.380 because we're so smart.
00:04:34.900 But here's something
00:04:36.340 that might have accidentally happened.
00:04:40.440 The best case scenario.
00:04:44.120 I'm not going to say it's true yet
00:04:46.520 and certainly I think all evidence suggests
00:04:49.200 that the withdrawal was botched.
00:04:52.140 So my first take, if you remember,
00:04:54.480 was hold, hold.
00:04:56.720 We need to find out more information.
00:04:58.400 It looks botched, but wait.
00:05:00.640 There might be something we don't know about
00:05:02.220 because why is it we all see it botched
00:05:04.080 but the people in charge
00:05:05.740 all did the thing that was obviously wrong?
00:05:08.740 There's something to explain that.
00:05:10.360 We don't know yet.
00:05:10.960 So I said, oh, wait.
00:05:12.300 But I waited.
00:05:13.940 I waited for the administration
00:05:15.120 to give us its version of the story
00:05:16.840 and it just sounds botched.
00:05:20.240 Maybe there's more to it,
00:05:21.580 but it just sounds botched so far.
00:05:24.940 And I say that only from the perspective
00:05:27.320 of the administration didn't really put up a fight.
00:05:31.020 They didn't give you an argument
00:05:32.260 that they did it right that made any sense.
00:05:35.200 They're saying they did a good job,
00:05:36.960 but they're not really supporting that argument.
00:05:39.700 So here's the most optimistic thing you could say.
00:05:47.080 And I'm not saying it's true.
00:05:49.000 It's just interesting that it's possible.
00:05:51.320 And in fact, maybe more possible than the alternative.
00:05:54.840 And it goes like this.
00:05:56.560 The way this came down
00:05:58.160 was the least loss of life possible under all scenarios.
00:06:04.140 I think that's where it's heading.
00:06:08.800 That this is accidentally, totally botched,
00:06:12.600 but accidentally gave us the best outcome.
00:06:16.140 Why?
00:06:17.220 Because Afghanistan fell so quickly.
00:06:20.520 What could have been the safest thing
00:06:22.240 to happen in Afghanistan?
00:06:24.480 I hate to say it,
00:06:25.420 but if it's inevitable the Taliban is going to take over,
00:06:28.720 the safest thing is to surrender right away.
00:06:31.600 That's what happened.
00:06:32.200 So we didn't see it coming.
00:06:34.280 We weren't prepared for it.
00:06:36.400 But it was the safest thing.
00:06:38.580 Just complete surrender on day one.
00:06:41.420 Because otherwise you have the fight
00:06:42.980 and you still surrender.
00:06:45.260 I mean, you get to the same place, right?
00:06:48.020 So, and then, but the second part,
00:06:51.100 and the part we're obsessing on,
00:06:52.900 is the withdrawal.
00:06:54.520 You know, that's completely botched, right?
00:06:57.020 Does everybody agree
00:06:58.200 that we're all the experts
00:07:00.140 and we would have gotten the people out
00:07:01.760 that wanted to be evacuated,
00:07:04.140 we would have done that first
00:07:06.240 before we got rid of most of the military.
00:07:08.640 Now, we did put military back in,
00:07:10.700 but they're not getting people, etc.
00:07:12.700 So it's still botched,
00:07:14.680 no doubt about it.
00:07:15.740 But, so far, the Taliban has been playing it strategically very smart.
00:07:23.560 And the strategic smart thing for the Taliban to do
00:07:27.540 would be to give us a deadline that's a little challenging, but possible.
00:07:31.760 And they did that.
00:07:34.680 Apparently, at least according to the Biden administration,
00:07:37.720 that deadline is challenging,
00:07:40.420 but possible.
00:07:42.220 Like, you know, we have a really good chance
00:07:44.280 of getting really close to achieving it.
00:07:47.220 Now, so far, that's pretty reasonable
00:07:49.520 for a group that's known for its unreasonable actions.
00:07:55.240 What happens if...
00:07:58.840 I'll just put this as a hypothetical.
00:08:00.900 It's not a prediction.
00:08:02.580 It's a hypothetical,
00:08:03.740 and things are heading in this direction.
00:08:05.960 What happens if the hostages...
00:08:08.240 Not the hostages.
00:08:09.380 Well, they might be.
00:08:10.380 What happens if the people we want to evacuate,
00:08:12.740 including the Afghan interpreters
00:08:14.580 and the people who we want to get out,
00:08:16.400 what happens if we get them all out?
00:08:18.020 Isn't this going to look like the biggest success of all times?
00:08:25.800 At the same time, it was botched.
00:08:28.500 Believe it or not, they can happen at the same time.
00:08:31.600 You can botch everything and just get lucky.
00:08:35.540 That's what it looks like.
00:08:37.400 It looks like everything was botched.
00:08:40.560 And so far, fingers crossed,
00:08:43.240 you know, there's no reason to think it'll go this way
00:08:45.100 all the way to the end,
00:08:45.880 but fingers crossed,
00:08:48.420 they avoided a civil war by collapsing immediately,
00:08:52.160 and if we get the people out,
00:08:54.660 which looks like it's happening,
00:08:56.820 it's the best thing that could have happened.
00:08:59.980 Am I wrong about that?
00:09:02.140 I mean, if you're just counting bodies,
00:09:05.080 how could we have done better than this,
00:09:06.840 assuming that we do get the evacuees out?
00:09:10.060 Now, that's a big, big assumption, right?
00:09:11.840 But so far, so far it's happening.
00:09:14.740 Now, you might be saying it's delusional.
00:09:18.620 It's not delusional, because I'm putting it in a statistical sense, right?
00:09:24.360 It could happen.
00:09:25.720 It's just unlikely.
00:09:27.320 I mean, if you had to predict,
00:09:29.240 wouldn't you predict that the Taliban will not let everybody out,
00:09:32.920 keep a bunch of hostages?
00:09:34.820 Most obvious thing, right?
00:09:36.100 And I feel like, inevitably, they're going to keep a few.
00:09:40.340 But let's say, I'll just give you a number.
00:09:42.780 Let's say they keep 20, I don't know, 20 Americans that didn't get out.
00:09:49.040 But we get everybody else out.
00:09:52.160 So suppose we lose 20 souls if they happen to be American.
00:09:56.380 What did we avoid?
00:09:57.580 Probably 100,000 deaths, or 50,000, if the civil war happened.
00:10:05.720 So Biden may have accidentally traded 20 American lives.
00:10:11.040 This is just a speculation, not based on data.
00:10:14.280 He could have accidentally traded 20 lives to save 50,000 Afghans.
00:10:18.820 Is that immoral?
00:10:22.280 Well, it's not America first.
00:10:24.860 But I don't know if it's immoral.
00:10:27.500 You know, you could argue that one.
00:10:29.320 All right.
00:10:31.900 Rasmussen poll asked,
00:10:33.680 if Taliban takes American hostages,
00:10:35.480 should the U.S. use military to rescue them?
00:10:37.560 And 78% say yes.
00:10:40.320 Is that good news or bad news?
00:10:42.700 It's good news.
00:10:44.320 Because the Taliban needs to know
00:10:46.560 that we're going to get our hostages out
00:10:49.700 or there's going to be big trouble.
00:10:53.000 Hostages slash evacuees.
00:10:55.940 And having 78% of the public say,
00:10:58.760 yeah, send the military in,
00:11:00.920 while about the same amount of the public says
00:11:03.880 get out of Afghanistan,
00:11:06.020 that is exactly the right situation
00:11:09.140 for the Taliban to say,
00:11:11.960 you know, if we don't let everybody out,
00:11:15.080 we're really effed.
00:11:17.000 Right?
00:11:17.860 So this is the best poll I've ever seen in my life
00:11:20.920 because it says the American public
00:11:23.040 are completely on the same side.
00:11:25.560 I mean, 78% is as good as you can get
00:11:27.500 in the American public.
00:11:29.520 And then Rasmussen poll also asked,
00:11:32.840 is the Biden administration doing enough
00:11:35.140 to evacuate Americans?
00:11:37.060 And 59% said no.
00:11:38.600 And I think this is under the category of,
00:11:42.440 how could you do enough?
00:11:43.440 You know, really, how could you do enough?
00:11:48.980 You know, it's always going to look like
00:11:50.760 it wasn't soon enough, didn't do enough.
00:11:53.960 But it does look like it's not enough.
00:11:55.600 All right.
00:11:57.460 Here's the question that I asked on Twitter.
00:12:01.380 And this is a,
00:12:04.120 this topic will be more about persuasion
00:12:07.440 than about whether you should get a vaccination.
00:12:10.820 Can we agree that I don't care if you get vaccinated?
00:12:13.960 Everybody?
00:12:14.540 Everybody?
00:12:15.500 I don't care if you get vaccinated.
00:12:16.800 So anything I say next
00:12:18.880 is going to be about vaccination persuasion,
00:12:22.500 but it's not intended to work on you.
00:12:25.220 Really, it's not a trick.
00:12:26.940 Right?
00:12:27.360 Let me say it again.
00:12:28.660 This is not some kind of like
00:12:29.820 backward psychology hypnosis trick.
00:12:32.320 I really don't want to persuade you on this.
00:12:34.920 I really, really don't.
00:12:36.200 It's totally unethical
00:12:37.380 because it's a, you know,
00:12:39.520 it's a health decision.
00:12:40.480 You've got to make your own decision on that.
00:12:42.340 I'll persuade you on politics.
00:12:45.080 I'd be happy to do that.
00:12:46.820 I'll persuade you on how to live your life better,
00:12:49.220 how to have systems instead of goals.
00:12:50.780 I'll persuade you on all kinds of stuff,
00:12:53.160 but not your health decisions.
00:12:55.680 Not that.
00:12:57.120 That's on you.
00:12:58.100 Totally.
00:12:59.180 But let's talk about the persuasion game
00:13:01.460 because it's happening,
00:13:02.380 whether I participate or not.
00:13:04.740 And Omar Khatib alerted me to,
00:13:07.640 there's a series of tweets from Advertising Insider
00:13:10.740 in which they asked some advertising professionals
00:13:14.200 what the governments could do
00:13:17.340 to persuade more people to get the vaccination.
00:13:20.760 Now, did the advertising executives
00:13:23.840 say much the same thing I did,
00:13:26.720 which is, hold on,
00:13:28.640 even if we could,
00:13:30.480 even if we could persuade people
00:13:33.000 to get the vaccinations,
00:13:34.200 it would be unethical.
00:13:35.120 So let me back out of this conversation right away
00:13:38.940 because I'm a marketing professional
00:13:41.100 and I'm not going to cross that ethical barrier.
00:13:44.720 Well, that didn't happen.
00:13:46.500 Instead, they gave you pretty good specific suggestions
00:13:49.100 for doing the most unethical thing in the world,
00:13:53.980 persuading people to get a health outcome,
00:13:56.660 a particular treatment.
00:13:58.300 So here are some of their persuasion points
00:14:01.600 and I'm going to evaluate them
00:14:03.520 so you can get a little lesson on persuasion
00:14:05.660 at the same time.
00:14:07.080 Number one,
00:14:08.260 are marketing executives good at persuasion?
00:14:12.240 Anybody?
00:14:13.880 Anybody?
00:14:15.180 Who works at a big company
00:14:16.640 where there are professional marketing people?
00:14:19.700 Are those professional marketing people
00:14:21.700 trained in persuasion, let's say, the way I am?
00:14:25.140 I'm a hypnotist, et cetera.
00:14:26.980 Do you think they have the same kind of persuasion training
00:14:30.140 as this cartoonist?
00:14:33.400 No, not even close.
00:14:35.600 But they do have a lot of tools
00:14:37.440 so they can do polls
00:14:39.160 and they can do focus groups
00:14:41.120 and they can A-B test things.
00:14:43.500 So they can use brute force
00:14:45.400 to get to a good outcome
00:14:47.300 because you can just randomly try stuff,
00:14:49.820 semi-randomly,
00:14:51.040 and then just test it and see if it works
00:14:52.940 and then do more of the stuff that works.
00:14:54.520 So that's marketing.
00:14:55.200 But here's what marketers are not good at.
00:14:59.680 The first try.
00:15:01.420 Because almost nobody is, right?
00:15:03.240 The first thing you try
00:15:04.600 isn't necessarily going to work.
00:15:06.900 So here are some of their first things to try.
00:15:11.140 From, let's see,
00:15:12.900 this is from,
00:15:13.740 these are different marketing executives.
00:15:15.980 This is from Jim Lesser.
00:15:17.900 He suggests using humor
00:15:19.060 to disarm those who oppose the vaccine.
00:15:21.340 Nope.
00:15:24.160 Nope.
00:15:25.220 You can use humor.
00:15:27.220 It'll maybe make a good commercial
00:15:29.060 for selling your soap
00:15:30.360 or your beer.
00:15:32.400 But humor isn't going to change
00:15:34.660 your health care decisions.
00:15:36.800 I'm sorry.
00:15:37.880 It just isn't.
00:15:39.620 Now,
00:15:41.460 maybe he's right.
00:15:43.000 That's why you test it, right?
00:15:44.120 So what is the value of my opinion
00:15:47.140 on the initial input,
00:15:49.280 the thing that you could test?
00:15:51.020 Well,
00:15:52.160 I think my opinion is pretty good,
00:15:54.020 better than the average,
00:15:54.960 because I study this field.
00:15:57.020 But I could be wrong.
00:15:58.680 Maybe you test this
00:15:59.620 and it's exactly the right thing.
00:16:01.200 But I doubt it.
00:16:02.660 My instinct tells me
00:16:03.700 that humor
00:16:04.180 is not where you go on this.
00:16:07.040 Definitely not.
00:16:07.820 In fact,
00:16:09.340 I would go with fear.
00:16:11.380 The opposite of humor.
00:16:13.060 Humor is,
00:16:13.720 ha, ha, ha,
00:16:14.140 I'm relaxed,
00:16:14.820 I'm having a good time.
00:16:16.160 That's not the head
00:16:17.040 you want to put people in.
00:16:18.500 If you want somebody
00:16:19.180 to do something as radical
00:16:20.540 as putting a needle in their arm,
00:16:22.460 you've got to scare the shit out of them.
00:16:24.740 Right?
00:16:25.580 That'll work.
00:16:26.900 You've got to scare them.
00:16:28.540 I don't think anything else will work.
00:16:30.160 Let's go on with these other ideas.
00:16:32.960 He said that,
00:16:33.900 the same guy said that
00:16:35.140 the stickiness of puns
00:16:36.920 can be effective.
00:16:41.680 I don't know about that.
00:16:43.560 Now,
00:16:44.240 the stickiness of puns
00:16:45.800 in terms of wordplay,
00:16:47.440 that does have
00:16:48.640 some scientific support.
00:16:50.840 If the glove doesn't fit,
00:16:52.180 you must acquit.
00:16:53.820 It's sort of wordplay.
00:16:55.220 It's a rhyme.
00:16:55.940 The rhyme does actually
00:16:56.940 create some persuasion.
00:16:59.680 So science supports that.
00:17:02.560 But that's not enough
00:17:03.700 to get somebody vaccinated.
00:17:05.160 All right.
00:17:06.920 Another guy said,
00:17:09.460 a marketing professional said,
00:17:10.680 let's turn it from
00:17:11.660 you've got to get protected
00:17:12.800 to let's effing
00:17:14.660 track this thing down
00:17:16.440 and kill it.
00:17:18.300 So Mike Lee,
00:17:19.860 I think,
00:17:20.140 said that.
00:17:21.940 Well,
00:17:22.380 I'm not sure about the attribution.
00:17:23.980 But the idea is that
00:17:24.740 you've got to get tough with it.
00:17:26.640 Talk tough.
00:17:27.220 And then people say,
00:17:28.100 yeah,
00:17:28.720 let's go to war
00:17:29.660 against that virus.
00:17:31.140 Terrible idea.
00:17:32.520 Terrible idea.
00:17:34.680 I mean,
00:17:35.740 I don't even know
00:17:36.920 what to say about it.
00:17:37.720 It's just a terrible idea.
00:17:39.900 All right.
00:17:40.240 So forget that one.
00:17:43.140 Another said that
00:17:44.440 those who originally
00:17:46.180 cast doubt
00:17:47.040 on the vaccine
00:17:47.800 should play the Trump card
00:17:49.280 and give him credit
00:17:50.880 for starting
00:17:51.380 the vaccine program
00:17:52.320 and directing
00:17:52.960 the resources.
00:17:53.640 Possibly the worst
00:17:55.420 of all the ideas.
00:17:56.980 If you bring Trump
00:17:58.460 into the conversation
00:17:59.700 of vaccinations,
00:18:01.420 do people even try
00:18:02.580 to make a medical decision?
00:18:05.000 No.
00:18:05.940 As soon as you add Trump
00:18:07.280 to the conversation,
00:18:08.500 it's a political conversation.
00:18:10.500 It doesn't matter
00:18:11.440 what the data is.
00:18:13.400 You throw Trump
00:18:14.200 into any conversation
00:18:15.280 about health care
00:18:15.980 and it's just
00:18:16.920 a Trump conversation.
00:18:18.320 So this is
00:18:19.020 a horrible idea.
00:18:21.320 Like really,
00:18:22.200 really,
00:18:22.540 really bad
00:18:23.860 from this
00:18:25.340 marketing professional.
00:18:26.640 Don't throw Trump
00:18:27.440 into the conversation.
00:18:28.820 That just makes
00:18:29.440 everything worse.
00:18:31.140 All right.
00:18:31.640 Here's another one.
00:18:33.780 An effective campaign
00:18:34.940 would have to find
00:18:35.640 people who are
00:18:36.320 anti-vaccine
00:18:37.460 and convert them.
00:18:41.680 So you'd have to find
00:18:42.440 the anti-vaccine people
00:18:43.960 and get them
00:18:44.660 to be pro-vaccine
00:18:45.760 and then help sell it.
00:18:48.360 Would that convince you?
00:18:50.320 If you saw somebody
00:18:51.300 who is anti-vaccine,
00:18:52.740 let's say you are,
00:18:54.900 and they were that
00:18:55.980 anti-vaccine
00:18:57.280 just like you,
00:18:58.620 but unlike you,
00:18:59.600 they had changed
00:19:00.540 to be pro-vaccine,
00:19:01.580 would that change
00:19:02.120 your mind?
00:19:03.580 No.
00:19:04.980 No.
00:19:06.200 No, it wouldn't.
00:19:08.100 Not even for a second.
00:19:09.680 The first thing you say
00:19:10.680 is, well,
00:19:11.140 they bought off another one.
00:19:12.820 There's another sheep.
00:19:14.580 Looks like another sheep
00:19:15.640 went to slaughter.
00:19:17.080 No, it would be
00:19:17.780 the least persuasive thing
00:19:19.060 in the world
00:19:19.460 to see what other people
00:19:20.500 are doing.
00:19:21.620 Now,
00:19:22.820 suppose
00:19:23.540 all you did
00:19:25.040 was show that
00:19:25.800 smart, good-looking people
00:19:27.160 were getting vaccinated
00:19:28.140 and the people
00:19:30.100 who were not getting
00:19:30.820 vaccinated were
00:19:31.700 unhealthy-looking slobs
00:19:33.400 who make lots of
00:19:34.120 bad decisions in general.
00:19:36.220 Well, that might work.
00:19:37.660 That might work.
00:19:39.040 Not a specific person
00:19:40.460 because we discount
00:19:42.320 celebrities' opinions
00:19:43.560 completely.
00:19:44.080 But what if it was
00:19:45.320 a montage
00:19:46.980 of smart,
00:19:49.200 beautiful-looking people
00:19:50.160 getting vaccinated
00:19:50.980 and, you know,
00:19:52.700 obese,
00:19:53.680 bad decision-makers
00:19:55.260 not getting vaccinated?
00:19:58.340 That might work
00:19:59.240 because people are tribal
00:20:00.600 and they'd say,
00:20:01.280 oh, I want to be
00:20:02.020 in that tribe,
00:20:02.620 not that tribe.
00:20:04.000 So, I mean,
00:20:05.200 it would move
00:20:06.220 some people
00:20:06.760 in either direction,
00:20:08.000 but I think your net
00:20:08.740 would be good.
00:20:09.600 And when I say that,
00:20:10.480 again, remember,
00:20:11.200 you'd have to test it.
00:20:12.360 I'm not smart enough,
00:20:14.460 nor is anybody,
00:20:15.960 to know what will work
00:20:17.220 without testing it.
00:20:18.540 Persuasion doesn't
00:20:19.240 really work that way.
00:20:20.180 Even the hypnotist,
00:20:21.520 even the hypnotist
00:20:22.880 who's hypnotizing you
00:20:23.880 in real time,
00:20:25.100 just the two of you
00:20:25.900 in a room,
00:20:27.320 that hypnotist
00:20:27.980 is still A-B testing
00:20:29.080 because you're trying stuff,
00:20:30.800 you're looking at the reaction,
00:20:31.960 try something else,
00:20:32.740 look at the reaction.
00:20:33.880 So without the testing,
00:20:35.120 you don't really have
00:20:36.160 a system for persuasion.
00:20:40.700 It'd have to be
00:20:41.600 part of the team.
00:20:42.740 All right,
00:20:42.960 here's some more.
00:20:45.640 You shouldn't assume
00:20:46.640 everyone is motivated
00:20:47.660 by politics.
00:20:49.060 Okay,
00:20:49.380 that's real helpful
00:20:50.180 because,
00:20:53.240 well,
00:20:53.660 how does that
00:20:54.340 persuade you?
00:20:55.400 I guess the point is
00:20:56.460 to make it non-political,
00:20:57.860 but we've tried that.
00:20:58.640 That didn't seem
00:20:59.320 to make any difference.
00:21:02.420 Someone who was pregnant
00:21:03.940 and also a marketing professional
00:21:05.540 said that she was persuaded
00:21:07.560 by someone,
00:21:08.920 by hearing from people
00:21:10.020 who are genuinely
00:21:10.960 concerned for you.
00:21:13.140 Interesting.
00:21:14.180 So her suggestion was
00:21:15.720 that she was persuaded
00:21:16.720 by hearing from people
00:21:18.420 who were genuinely
00:21:19.300 concerned for her.
00:21:21.260 Now we're getting closer.
00:21:23.660 Now that's some real persuasion,
00:21:27.160 at least good thinking,
00:21:29.340 because we are persuaded
00:21:31.180 by people who genuinely
00:21:32.380 care for us.
00:21:33.960 Why is it that you don't think
00:21:35.680 the pharmaceutical companies
00:21:37.060 can be trusted?
00:21:38.540 Because they don't care for you.
00:21:40.340 Why is it that you don't think
00:21:41.500 the government
00:21:41.960 could be trusted?
00:21:43.720 They don't care for you.
00:21:45.060 Why do you not believe
00:21:46.140 the pundit you see
00:21:47.740 on the news?
00:21:49.060 Don't care for you.
00:21:50.500 In fact,
00:21:51.100 it might be the same pundit
00:21:52.260 who wants to hunt you down.
00:21:54.160 It might be,
00:21:54.960 in a lot of cases.
00:21:57.360 So no,
00:21:58.660 most of the people
00:21:59.600 telling you
00:22:00.200 to get the vaccination
00:22:01.400 are people
00:22:02.120 who don't care for you.
00:22:04.160 Literally don't care
00:22:05.080 about you one bit.
00:22:05.900 They don't even know you.
00:22:07.220 You're a stranger.
00:22:08.880 But how persuasive
00:22:09.820 would be somebody
00:22:10.540 who actually does
00:22:11.540 care for you?
00:22:13.120 Now,
00:22:13.640 they may not be experts
00:22:14.820 on vaccinations,
00:22:17.120 but I'll bet
00:22:17.940 it is persuasive.
00:22:19.560 So I don't know
00:22:20.360 how you could ramp that up
00:22:21.760 to get lots of concerned people
00:22:23.660 talking to vaccinated people,
00:22:25.180 and I'm not suggesting
00:22:26.260 you should.
00:22:27.740 I'm just saying
00:22:28.440 that at least this is
00:22:29.420 genuine persuasion
00:22:30.580 and smartness.
00:22:31.960 It looks like something
00:22:33.000 you could test.
00:22:34.620 All right,
00:22:34.780 here's another one.
00:22:35.380 You should treat
00:22:38.620 the campaign
00:22:39.340 for the vaccination
00:22:40.160 persuasion
00:22:40.840 with the kind
00:22:41.820 of budget
00:22:42.220 and tactics
00:22:43.000 you'd see
00:22:43.480 a political party
00:22:44.400 use to get out
00:22:45.340 the vote.
00:22:47.540 I think they did,
00:22:49.280 didn't they?
00:22:50.260 Wouldn't you say
00:22:50.840 that the government
00:22:51.480 did exactly that?
00:22:53.020 Used the same kind
00:22:53.960 of mass persuasion
00:22:55.100 as get out the vote?
00:22:56.480 I feel like that's
00:22:57.440 sort of a non-answer.
00:22:58.720 That's a very
00:22:59.200 marketing-meeting answer.
00:23:02.100 All right,
00:23:02.480 here's my take.
00:23:03.480 So I'm going to add
00:23:04.320 my own persuasion
00:23:06.200 suggestions.
00:23:07.140 Again,
00:23:08.840 I'm not trying
00:23:09.780 to persuade you.
00:23:10.600 I'm teaching you
00:23:11.200 how to do persuasion.
00:23:13.000 Okay?
00:23:13.620 Can we all deal
00:23:14.580 with that distinction?
00:23:15.940 Not trying to persuade you.
00:23:17.800 That would be unethical.
00:23:19.220 But I'll tell you
00:23:19.760 how persuasion works.
00:23:21.540 I think you need
00:23:22.400 a fake because.
00:23:24.140 In other words,
00:23:24.760 I think there are
00:23:25.180 a bunch of people
00:23:25.860 who are close
00:23:26.760 to being persuaded
00:23:27.720 and would like to.
00:23:28.760 But they're still
00:23:30.180 stuck in their
00:23:30.840 old opinion.
00:23:32.220 So they need
00:23:32.720 a fake because.
00:23:34.440 Something that
00:23:35.200 they can say,
00:23:35.980 oh,
00:23:36.960 something changed.
00:23:38.800 So now my
00:23:39.420 old opinion
00:23:39.960 was right,
00:23:40.980 but now my
00:23:41.860 new opinion
00:23:42.320 is right too
00:23:43.080 because something
00:23:44.520 changed.
00:23:45.460 So you need
00:23:46.020 that fake because
00:23:47.100 the something
00:23:47.720 changed.
00:23:48.240 The FDA approval
00:23:49.060 might be that.
00:23:50.220 So I think
00:23:50.600 some people
00:23:51.040 are going to say
00:23:51.580 even though the polls
00:23:52.500 suggest it's not
00:23:53.460 going to make
00:23:53.780 much difference,
00:23:54.700 I think it will.
00:23:55.580 I think the polls
00:23:57.240 are misleading.
00:23:58.740 I think that
00:23:59.300 the FDA approval
00:24:00.220 is even going
00:24:00.900 to make the
00:24:01.460 Moderna vaccination
00:24:03.360 more popular
00:24:04.260 even though it
00:24:05.360 doesn't have
00:24:05.860 FDA approval.
00:24:06.800 People are going
00:24:07.320 to kind of expect
00:24:08.000 that it will get it
00:24:08.920 because the Pfizer
00:24:09.900 got it.
00:24:10.900 So there's some
00:24:11.780 crossover persuasion
00:24:13.320 that the Pfizer
00:24:14.120 thing will make
00:24:14.840 the Moderna thing
00:24:15.560 look safer
00:24:16.180 even though they're
00:24:16.780 different things.
00:24:17.940 It's just how
00:24:18.460 your brain works.
00:24:19.980 So the fake because.
00:24:21.560 I would look
00:24:22.700 for other fake
00:24:23.440 becauses,
00:24:24.240 not just the FDA
00:24:25.260 approval,
00:24:25.960 but how about
00:24:27.080 we've waited
00:24:27.940 long enough
00:24:28.600 that the odds
00:24:30.500 of problems
00:24:32.100 are way down
00:24:33.620 because you
00:24:34.080 wouldn't have
00:24:34.380 seen most of
00:24:34.920 them pop up.
00:24:36.940 That would be
00:24:37.500 a fake because,
00:24:38.220 but there could
00:24:38.600 be others too.
00:24:40.000 As I said,
00:24:40.820 you need fear
00:24:41.520 and visual persuasion.
00:24:43.180 Anything less
00:24:43.880 is barely trying.
00:24:45.360 So you need to
00:24:46.120 show people
00:24:46.860 dying of COVID
00:24:47.980 like in the
00:24:48.740 worst possible
00:24:49.360 ways and saying,
00:24:50.820 I should have
00:24:51.480 gotten the
00:24:51.820 vaccination
00:24:52.220 and then
00:24:53.260 dying.
00:24:54.700 And there
00:24:55.000 are plenty
00:24:55.300 of anecdotes
00:24:55.840 like that.
00:24:57.140 CNN is
00:24:57.720 promoting that
00:24:59.100 kind of
00:24:59.640 persuasion
00:25:00.400 with their
00:25:01.000 story choices
00:25:02.240 and I think
00:25:04.260 it works.
00:25:05.180 I mean,
00:25:05.400 you could argue
00:25:06.160 that CNN
00:25:06.640 shouldn't be
00:25:07.240 in the persuasion
00:25:07.980 game,
00:25:09.040 but unfortunately
00:25:09.760 they are.
00:25:10.860 And I would
00:25:11.160 say that those
00:25:11.780 anecdotal,
00:25:13.000 terrible visuals
00:25:14.380 where you can
00:25:15.120 sort of put
00:25:15.580 yourself in the
00:25:16.180 bed and you
00:25:16.560 say,
00:25:16.760 oh,
00:25:16.980 I can see
00:25:17.400 myself being
00:25:18.200 that guy.
00:25:19.440 Don't be
00:25:19.840 that guy.
00:25:20.300 Show people
00:25:25.020 that you
00:25:25.500 like getting
00:25:26.420 vaccinated.
00:25:27.400 That's similar
00:25:27.960 to showing
00:25:28.440 the two
00:25:30.080 groups,
00:25:30.720 one is
00:25:31.080 beautiful and
00:25:31.680 one is
00:25:31.960 not.
00:25:33.180 I would
00:25:33.860 also like
00:25:34.280 to see an
00:25:34.680 expert explaining
00:25:35.720 risk management
00:25:36.760 to people.
00:25:38.860 You've never
00:25:39.600 seen that,
00:25:40.580 except me,
00:25:41.500 and I'm
00:25:42.060 far from an
00:25:42.640 expert.
00:25:43.620 If you
00:25:44.560 can't show
00:25:45.040 me,
00:25:45.760 let's say,
00:25:46.280 Nate Silver,
00:25:47.580 I like to
00:25:48.060 use him
00:25:48.460 because a
00:25:49.720 lot of
00:25:49.960 you disagree
00:25:50.560 with some
00:25:51.000 of his
00:25:51.280 opinions,
00:25:52.080 so he's
00:25:52.940 sort of
00:25:53.220 perfect for
00:25:53.700 this.
00:25:54.360 You can
00:25:54.780 disagree with
00:25:55.380 his opinions,
00:25:56.700 but not
00:25:57.220 his rationale,
00:25:59.160 meaning that
00:25:59.760 his thinking
00:26:01.160 process is
00:26:02.140 generally pretty
00:26:03.200 close to
00:26:03.680 flawless,
00:26:04.940 because it's
00:26:05.320 his field.
00:26:06.540 He knows
00:26:06.900 how to think
00:26:07.440 statistically.
00:26:08.260 He's good
00:26:08.660 at it.
00:26:09.600 Wouldn't
00:26:09.820 you like to
00:26:10.280 see somebody
00:26:11.020 who is just
00:26:11.760 flat out good
00:26:13.200 at it,
00:26:14.200 maybe a few
00:26:14.840 of them,
00:26:15.200 so you've
00:26:15.440 got a few
00:26:15.740 different opinions,
00:26:16.760 explaining to
00:26:17.540 you the
00:26:17.860 risk benefit,
00:26:18.660 thinking through
00:26:20.040 all of the
00:26:20.600 risks we
00:26:21.060 know about
00:26:21.540 and all
00:26:22.400 the ones
00:26:22.680 we don't,
00:26:23.720 putting some
00:26:24.220 kind of
00:26:24.680 statistics on
00:26:25.580 them and
00:26:25.780 just walking
00:26:26.340 you through
00:26:26.700 it.
00:26:26.920 Say,
00:26:27.140 okay,
00:26:27.800 here's the
00:26:28.260 risk of
00:26:28.580 getting
00:26:28.800 vaccinated.
00:26:29.760 Boom,
00:26:30.060 boom,
00:26:30.220 boom,
00:26:30.380 boom,
00:26:30.500 boom,
00:26:30.540 boom.
00:26:31.140 All these
00:26:31.560 potential risks.
00:26:32.880 Here's what
00:26:33.220 we know about
00:26:33.800 them.
00:26:34.800 Here's the
00:26:35.240 risk of not
00:26:35.740 getting vaccinated.
00:26:36.560 Here's what
00:26:36.860 we know about
00:26:37.400 it.
00:26:37.800 Put some
00:26:38.140 numbers on it.
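[Editor's note: the risk walk-through just described is, at bottom, simple expected-value arithmetic. A minimal sketch follows; every number in it is a made-up placeholder for illustration, not a real statistic about COVID or any vaccine.]

```python
# A minimal sketch of the risk-comparison walk-through described above.
# Every number here is a hypothetical placeholder, NOT a real statistic.

p_infection = 0.10              # assumed chance of catching the virus
p_severe_if_infected = 0.02     # assumed chance of a severe outcome if infected
p_vaccine_side_effect = 0.0001  # assumed chance of a serious vaccine side effect
vaccine_efficacy = 0.90         # assume the vaccine cuts severe outcomes by 90%

# Risk of a bad outcome without vaccination
risk_unvaccinated = p_infection * p_severe_if_infected

# Risk with vaccination: side-effect risk plus the reduced disease risk
risk_vaccinated = p_vaccine_side_effect + risk_unvaccinated * (1 - vaccine_efficacy)

print(f"risk if unvaccinated: {risk_unvaccinated:.4%}")
print(f"risk if vaccinated:   {risk_vaccinated:.4%}")
```

The point of the exercise is the structure, not these particular numbers: put an estimate on every risk on both sides, including the uncertain ones, and compare the totals.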
00:26:39.540 I think that
00:26:40.320 would be at
00:26:41.120 least a fake
00:26:41.780 because for
00:26:42.480 some people.
00:26:43.220 Some people
00:26:43.680 would say,
00:26:44.000 you know,
00:26:44.620 nobody had ever
00:26:45.280 explained it that
00:26:46.000 well.
00:26:47.220 Right?
00:26:48.320 All right,
00:26:48.920 let me,
00:26:50.940 okay,
00:26:51.420 here's the
00:26:52.300 persuasion that
00:26:53.040 would work.
00:26:53.840 You ready for
00:26:54.320 this?
00:26:54.980 Again,
00:26:55.400 I'm not
00:26:55.740 suggesting it.
00:26:56.960 This is an
00:26:57.440 example of
00:26:58.000 what would
00:26:58.320 work.
00:26:59.380 You take
00:26:59.780 Nate Silver
00:27:00.360 and you
00:27:01.800 say,
00:27:02.200 sit down
00:27:02.760 for a
00:27:03.580 couple
00:27:03.760 hours
00:27:04.280 with
00:27:05.320 Mike
00:27:06.480 Rowe.
00:27:07.620 You all
00:27:08.140 know Mike
00:27:08.580 Rowe.
00:27:09.360 If you're
00:27:09.740 not American,
00:27:10.460 you might
00:27:10.680 not.
00:27:11.680 Mike
00:27:11.960 Rowe is
00:27:12.300 a famous
00:27:13.420 personality
00:27:14.300 who is
00:27:15.140 sort of
00:27:15.420 an every
00:27:15.840 man.
00:27:16.860 He does
00:27:17.420 what's
00:27:17.720 called
00:27:17.920 dirty
00:27:18.240 jobs.
00:27:18.840 He did
00:27:19.040 a show
00:27:19.320 where he
00:27:19.740 would do
00:27:20.100 these awful
00:27:20.860 jobs where
00:27:21.460 you literally
00:27:21.940 get physically
00:27:22.600 dirty,
00:27:23.180 doing gross
00:27:24.080 stuff.
00:27:25.160 So he's
00:27:26.000 famous for
00:27:26.500 being like
00:27:26.940 a level
00:27:28.680 headed,
00:27:29.740 rational
00:27:30.320 person who
00:27:31.580 just can
00:27:32.860 see how
00:27:33.220 to get
00:27:33.480 stuff done.
00:27:34.800 That's
00:27:35.120 sort of
00:27:35.340 his brand.
00:27:36.580 So you
00:27:36.860 take Nate
00:27:37.360 Silver and
00:27:38.920 you spend
00:27:39.260 two hours
00:27:39.780 with Mike
00:27:40.180 Rowe teaching
00:27:40.800 him how to
00:27:41.900 look at the
00:27:42.480 statistics and
00:27:43.940 then you have
00:27:44.360 Mike Rowe
00:27:45.120 explain the
00:27:46.960 statistics to
00:27:47.880 the public.
00:27:48.920 Why?
00:27:49.660 Because if Nate
00:27:50.200 Silver does
00:27:50.700 it, you're
00:27:50.960 not going to
00:27:51.360 understand it.
00:27:53.480 He's almost
00:27:54.300 too good
00:27:54.840 because his
00:27:55.980 explanation would
00:27:56.720 have enough
00:27:57.320 nuance in it
00:27:58.100 to be accurate
00:27:58.900 but maybe a
00:28:00.320 little confusing
00:28:00.920 because you can't
00:28:01.520 follow the
00:28:01.900 nuance of
00:28:02.420 statistics.
00:28:03.440 But you take
00:28:03.920 that stuff and
00:28:05.180 you package it
00:28:05.860 up with Mike
00:28:06.440 Rowe, a
00:28:07.840 really, really
00:28:08.640 credible voice,
00:28:09.660 especially to the
00:28:10.360 right, who has a
00:28:11.480 lot of
00:28:11.780 resistance.
00:28:14.140 Mike Rowe
00:28:14.740 could sell the
00:28:15.560 shit out of
00:28:16.160 this.
00:28:17.160 He really
00:28:17.460 could.
00:28:18.360 Now, people
00:28:19.060 are saying, is
00:28:19.540 he a scientist?
00:28:20.460 Is he a
00:28:20.880 statistician?
00:28:21.580 No.
00:28:22.360 No.
00:28:23.040 He's you.
00:28:24.760 That's why it
00:28:25.420 works.
00:28:26.400 Do you know
00:28:26.740 who would be
00:28:27.160 the most
00:28:27.740 persuasive person
00:28:29.000 for you?
00:28:29.920 The person who
00:28:30.720 would convince
00:28:31.300 you specifically
00:28:32.480 the best
00:28:33.740 would be you.
00:28:35.160 If you could
00:28:36.340 make a digital
00:28:37.180 version of
00:28:37.760 yourself and
00:28:39.540 give it a script
00:28:40.380 written by
00:28:40.840 somebody who
00:28:41.280 knows what
00:28:41.620 they're talking
00:28:42.000 about, and
00:28:42.900 then that
00:28:43.220 digital version
00:28:43.900 talks you into
00:28:44.780 getting vaccinated,
00:28:46.160 it would be the
00:28:46.840 most persuasive
00:28:47.840 thing that could
00:28:48.400 happen.
00:28:49.040 You couldn't
00:28:49.420 beat that.
00:28:51.160 Mike Rowe is
00:28:53.360 you.
00:28:54.520 That's sort of
00:28:55.220 his brand.
00:28:56.940 I hate to
00:28:57.900 characterize other
00:28:58.720 people because he
00:28:59.340 might not want to
00:29:00.860 characterize himself
00:29:02.140 that way, which is
00:29:02.880 unfair.
00:29:03.920 But in my view,
00:29:06.340 the thing that makes
00:29:06.940 him popular is you
00:29:08.100 say, yeah, I think
00:29:08.940 just like that.
00:29:09.700 Mike Rowe is
00:29:11.760 saying the words
00:29:12.740 coming out of his
00:29:13.340 mouth are the
00:29:14.300 ones I'm thinking,
00:29:15.300 but he's saying it
00:29:15.980 better than I'm
00:29:16.540 thinking it.
00:29:17.400 That's the ultimate.
00:29:18.720 He says what you're
00:29:19.660 thinking, but he
00:29:21.140 says it better than
00:29:21.960 you're thinking it.
00:29:23.700 Somebody says, why
00:29:24.520 a white person?
00:29:25.180 That's a good
00:29:25.560 question.
00:29:26.660 If you could take
00:29:28.160 the same concept and
00:29:30.260 replace Mike Rowe with,
00:29:31.880 I'm going to say
00:29:32.280 Charles Barkley, just
00:29:33.640 to pick a name,
00:29:34.240 right?
00:29:34.700 I picked Charles
00:29:35.500 Barkley because I
00:29:36.880 don't know if
00:29:37.200 anybody's been more
00:29:37.940 popular with
00:29:38.740 everybody.
00:29:40.180 If you don't
00:29:40.860 follow basketball,
00:29:42.080 that's an unfamiliar
00:29:43.000 reference.
00:29:44.020 But Charles Barkley
00:29:44.800 is black, but his
00:29:47.820 sort of approach to
00:29:49.440 race is so
00:29:51.000 commonsensical that
00:29:53.020 it just appeals to
00:29:53.860 left and right in a
00:29:55.280 weird kind of way.
00:29:56.700 So, yeah, you take
00:29:58.360 a Charles Barkley, who
00:29:59.420 everybody likes, and
00:30:00.760 he's famous for being
00:30:01.960 a plain talker,
00:30:03.840 common sense kind of
00:30:04.880 guy.
00:30:05.140 Yeah, that's
00:30:06.820 actually, that might
00:30:08.460 even be an upgrade on
00:30:09.480 my idea.
00:30:10.160 I think Mike Rowe would
00:30:10.860 be great.
00:30:12.080 But yeah, a Charles
00:30:12.920 Barkley, absolutely.
00:30:14.160 He could do that.
00:30:15.740 Somebody says Steve
00:30:16.520 Harvey.
00:30:17.800 Maybe.
00:30:19.000 I don't know.
00:30:19.400 I think Charles
00:30:20.480 Barkley is more
00:30:21.580 relatable, I feel
00:30:24.440 like.
00:30:25.060 More of the, you
00:30:26.640 know, the every man
00:30:27.480 talks like you do
00:30:28.480 kind of thing.
00:30:28.940 And here's another
00:30:33.140 idea.
00:30:34.220 It seems to me that
00:30:35.300 the anti-vaxxers
00:30:36.620 like to call the
00:30:38.200 people who are
00:30:38.600 taking the
00:30:39.040 vaccination sheep.
00:30:41.100 What would Trump
00:30:42.200 do if he were the
00:30:43.040 recipient of a thing
00:30:44.040 like that?
00:30:45.060 He'd take the gun
00:30:45.880 out of their hand and
00:30:46.560 he'd turn it around.
00:30:48.180 So, if, here's a
00:30:50.540 little persuasion
00:30:52.820 tip.
00:30:54.240 The things that
00:30:54.940 people call you are
00:30:57.320 things that they
00:30:58.000 personally think are
00:30:59.080 persuasive.
00:31:00.760 So, if the
00:31:01.680 anti-vaxxers are
00:31:02.820 calling you sheep
00:31:03.840 for getting
00:31:04.760 vaccinated, what
00:31:05.600 would be the most
00:31:06.400 piercing thing you
00:31:08.840 could call them?
00:31:11.020 Sheep.
00:31:11.780 They've given you
00:31:12.600 the answer.
00:31:13.800 You don't have to
00:31:14.480 wonder what's the
00:31:15.400 worst thing that you
00:31:16.240 could say about
00:31:17.080 them, because they
00:31:18.080 told you.
00:31:19.080 It's what they're
00:31:19.660 calling you.
00:31:21.040 Sheep.
00:31:21.980 So, if you could
00:31:22.620 find a way, and
00:31:23.700 again, this is
00:31:24.240 persuasion.
00:31:24.980 I'm not saying you
00:31:25.560 should do this.
00:31:26.260 It's just how it
00:31:26.880 works.
00:31:27.220 If you could
00:31:28.340 find a way to
00:31:28.980 label the people
00:31:29.540 not getting
00:31:30.300 vaccinated as
00:31:32.460 sheep, in all
00:31:34.660 likelihood, that
00:31:35.380 would really hurt,
00:31:36.420 because it's the
00:31:37.600 word they use when
00:31:38.660 they're insulting
00:31:39.260 other people.
00:31:40.160 If you could make
00:31:41.060 that stick to the
00:31:41.980 person using the
00:31:42.880 word, it's going
00:31:44.700 to hurt a little
00:31:45.280 extra.
00:31:46.640 So, that would be
00:31:47.260 an approach.
00:31:48.520 Also, the
00:31:49.360 anti-vaxxers tend
00:31:50.420 to be conspiracy
00:31:51.640 theorists or the
00:31:53.780 only people who
00:31:54.420 are right.
00:31:54.780 Two possibilities,
00:31:57.320 right?
00:31:58.320 Either they are
00:31:59.440 subject to
00:32:00.320 believing conspiracies,
00:32:02.140 or they're right,
00:32:04.000 and we'll all find
00:32:04.740 out later.
00:32:06.000 But, in terms of
00:32:08.280 persuasion, if you
00:32:10.320 came up with a
00:32:11.080 conspiracy theory that
00:32:12.300 worked in the other
00:32:13.040 direction, hypothetically,
00:32:16.220 it could persuade
00:32:16.900 people.
00:32:17.720 So, in other words,
00:32:18.340 you would need a
00:32:18.940 counter-conspiracy
00:32:20.360 theory.
00:32:20.740 A conspiracy theory
00:32:22.460 that says, for
00:32:23.100 example, that China
00:32:24.720 is the one telling
00:32:25.840 you that the
00:32:26.420 vaccinations are
00:32:27.200 dangerous.
00:32:28.580 Because they
00:32:29.240 probably are.
00:32:30.960 I don't know if
00:32:31.920 that's true, but I
00:32:34.120 would guess that
00:32:35.620 China and Russia are
00:32:36.700 messing with our
00:32:37.520 communications about
00:32:38.500 the vaccines, just
00:32:39.460 like they do with
00:32:40.240 our social structure
00:32:42.740 and our politics.
00:32:44.400 Right?
00:32:44.520 So, it seems to
00:32:47.320 me that you could
00:32:53.340 do a counter-conspiracy
00:32:55.020 theory.
00:32:55.960 All right.
00:32:57.980 When I asked
00:32:58.760 people to describe
00:33:00.320 to me how vaccines
00:33:01.760 make things worse, I
00:33:03.260 got a ton of
00:33:04.520 cognitive dissonance.
00:33:06.320 I'm going to give
00:33:06.900 you the hypnotist's
00:33:08.140 filter on how to
00:33:09.860 see the world.
00:33:11.080 So, the question I
00:33:11.940 asked is, how could
00:33:12.800 it be that so many
00:33:14.120 people think that
00:34:14.980 vaccinations make
00:33:16.600 the variants worse?
00:33:19.200 So, that's a very
00:33:21.140 popular thought and
00:33:22.680 may be true.
00:33:23.580 It might be true that
00:33:25.000 the vaccinations make
00:33:26.340 variants worse.
00:33:28.120 So, I asked the
00:33:28.980 following question.
00:33:29.800 Can you describe the
00:33:30.820 mechanism for how
00:33:32.160 that could possibly
00:33:32.920 happen?
00:33:33.700 Because everybody's
00:33:34.460 sure of it.
00:33:35.880 I mean, it seems like
00:33:36.880 the entire public is
00:33:37.900 sure that's true.
00:33:39.240 But I said, well,
00:33:40.000 describe how exactly
00:33:41.440 that works.
00:33:42.700 And what happened
00:33:47.300 was, and this is
00:33:49.200 the hypnotist's
00:33:49.880 filter.
00:33:51.040 So, my filter on
00:33:52.080 this is probably
00:33:52.960 different than yours.
00:33:54.060 Your filter is
00:33:54.960 probably something
00:33:56.660 like this.
00:33:58.160 Some people are
00:33:58.980 right and some
00:34:00.620 people are wrong.
00:34:02.500 Some people are
00:34:03.220 well-informed.
00:34:04.220 They did their own
00:34:04.860 research.
00:34:06.000 Some people are
00:34:06.480 not.
00:34:07.680 And to you, I
00:34:08.280 would think that
00:34:08.840 that's all there is
00:34:09.860 to this vaccination
00:34:10.840 makes variants worse
00:34:11.820 question.
00:34:12.740 Somebody's right,
00:34:13.540 somebody's wrong.
00:34:14.700 The hypnotist's
00:34:15.460 filter is different.
00:34:17.280 The hypnotist's
00:34:17.880 filter says this.
00:34:19.800 There were a lot of
00:34:20.640 people who publicly
00:34:22.160 and to their friends
00:34:23.200 have said, I
00:34:24.280 understand this issue.
00:34:26.020 It's basic evolution.
00:34:28.080 If you put
00:34:28.620 evolutionary pressure
00:34:29.660 on the normal
00:34:31.400 virus, it will give
00:34:32.960 an advantage to
00:34:33.900 the variant.
00:34:35.280 Boom, Scott, I
00:34:36.500 just explained it.
00:34:38.000 Evolution 101.
00:34:39.680 Just apply it to
00:34:40.520 this situation and
00:34:41.520 you're done.
00:34:42.560 Anybody can
00:34:43.160 understand this.
00:34:44.380 Evolution is
00:34:45.200 survival of the
00:34:45.960 fittest.
00:34:46.800 The fittest variant
00:34:48.000 will be the one
00:34:48.740 that can get
00:34:49.160 through the
00:34:49.660 vaccination, right?
00:34:52.680 Pretty logical.
00:34:54.640 Okay, here's my
00:34:55.640 view of the world.
00:34:56.880 My view of the
00:34:57.640 world is that when
00:34:58.280 people realize they
00:34:59.180 couldn't explain how
00:35:01.340 a vaccination makes
00:35:02.500 a variant worse,
00:35:04.340 that they spun into
00:35:05.560 cognitive dissonance.
00:35:07.240 And when you look at
00:35:08.020 my tweet and you see
00:35:09.120 the comments, you'll
00:35:10.380 see that it's all
00:35:10.960 word salad.
00:35:12.740 But it'll be word
00:35:14.620 salad you think
00:35:15.360 makes sense.
00:35:16.880 Because the person
00:35:17.660 who wrote it thinks
00:35:18.420 it makes sense.
00:35:19.500 Let me give you
00:35:20.060 some.
00:35:20.400 I know you're
00:35:20.920 skeptical.
00:35:21.920 I'm going to read
00:35:22.480 you some of the
00:35:23.120 explanations of
00:35:24.220 people who are
00:35:24.740 very smart, by the
00:35:25.780 way.
00:35:26.380 So everybody who
00:35:27.180 gave these
00:35:28.100 explanations of how
00:35:29.280 evolution would
00:35:30.780 cause the variants
00:35:32.520 to be dominant,
00:35:33.660 they're smart
00:35:34.240 people.
00:34:34.580 But they're
00:35:37.060 showing all the
00:35:37.640 tells of cognitive
00:35:38.520 dissonance.
00:35:39.640 I'll give you some
00:35:40.120 examples.
00:35:41.020 Well, so far, let's
00:35:42.240 start with this
00:35:42.820 whiteboard.
00:35:44.080 Let's say there are
00:35:45.040 two situations.
00:35:46.560 There's a person with
00:35:47.800 no vaccination and a
00:35:48.960 person who's
00:35:49.400 vaccinated.
00:35:50.520 Let's say the
00:35:51.160 non-vaccinated person
00:35:53.180 gets the regular
00:35:54.680 alpha virus, but it
00:35:58.780 mutates inside them,
00:36:00.500 and the mutation is a
00:36:02.000 new variant, and then
00:36:03.060 it spreads.
00:36:03.580 Everybody agrees
00:36:05.060 that can happen,
00:36:06.620 right?
00:36:07.420 That it can happen,
00:36:08.780 that you can get the
00:36:09.600 regular virus, but it
00:36:11.460 mutates inside the
00:36:12.560 person, and then what
00:36:14.020 comes out the other
00:36:14.820 end is a new variant,
00:36:16.140 and then that variant
00:36:16.940 could spread.
00:36:17.840 That's if you don't
00:36:18.600 have a vaccination.
00:36:20.360 But what if you do
00:36:21.220 have a vaccination?
00:36:23.000 Well, in that case,
00:36:24.120 the alpha virus goes
00:36:26.300 in, it mutates, and
00:36:29.060 then a new variant
00:36:29.760 comes out, and it
00:36:31.300 spreads.
00:36:31.680 What was the
00:36:33.560 difference between the
00:36:34.360 vaccinated person and
00:36:35.460 the no vaccinated
00:36:36.160 person?
00:36:38.160 Nothing.
00:36:39.580 Because both of them
00:36:40.660 have the virus in them,
00:36:42.480 both of them can spread
00:36:43.420 it.
00:36:45.040 There's no difference.
00:36:46.840 So now, everybody who
00:36:47.900 is positive there's a
00:36:48.800 difference has to
00:36:50.240 explain this, and that
00:36:53.280 triggers cognitive
00:36:54.160 dissonance.
00:36:55.220 Now, let me say as
00:36:56.340 clearly as possible, I
00:36:58.700 don't know if
00:36:59.340 vaccinations cause more
00:37:01.620 variants.
00:37:02.920 I don't know.
00:37:04.080 I mean, I legitimately
00:37:04.820 don't know, and I'm not
00:37:06.460 sure I even am biased in
00:37:08.480 one direction.
00:37:09.700 All I do know is that
00:37:11.480 when people try to
00:37:12.220 explain it, they're
00:37:13.380 spinning into cognitive
00:37:14.840 dissonance.
00:37:15.620 That I do know.
00:37:16.800 So I can say that with
00:37:17.680 great confidence.
00:37:18.680 It's something I study.
00:37:20.160 So let me give you some
00:37:20.840 examples.
00:37:21.200 Again, these are smart
00:37:24.140 people with smart
00:37:25.000 explanations.
00:37:29.280 A vaccine which targets
00:37:30.920 a specific part of a
00:37:32.200 virus will mean that
00:37:33.440 small mutations in the
00:37:34.620 virus can survive, while
00:37:36.440 the targeted virus
00:37:37.480 cannot.
00:37:39.080 Since viruses are
00:37:40.260 mutating all the time,
00:37:41.460 eventually one of these
00:37:42.360 mutants, aka variants,
00:37:44.640 will become dominant.
00:37:46.460 How'd that happen?
00:37:48.300 How did one of them
00:37:49.220 become dominant?
00:37:51.200 Just by being more
00:37:52.420 viral, right?
00:37:53.880 In my example, the
00:37:55.420 vaxxed or the
00:37:56.140 unvaccinated person,
00:37:58.080 which of these cases
00:37:59.440 is there going to be
00:38:00.900 more variants?
00:38:02.380 Well, I would say that
00:38:03.300 the vaccinated person
00:38:04.460 has less virus to begin
00:38:06.140 with and less chances
00:38:07.220 spreading it.
00:38:08.780 So I would think there
00:38:09.880 would be fewer variants
00:38:10.820 if you got vaccinated.
00:38:13.180 I don't know that
00:38:14.000 that's true.
00:38:14.860 I'm just saying that's
00:38:15.660 where the logic takes
00:38:16.540 it with the limited
00:38:17.980 information I have.
00:38:19.640 So here's my question.
00:38:21.200 What would make the
00:38:22.700 variant survive?
00:38:25.720 And I'm hearing things
00:38:26.820 like, well, the old
00:38:28.420 virus and the new
00:38:29.600 variant are fighting it
00:38:31.220 out, and one is
00:38:33.000 compelled.
00:38:35.460 Nothing compels the
00:38:36.420 virus.
00:38:37.880 If you have two viruses
00:38:39.240 in you, they both would
00:38:40.620 spread.
00:38:41.560 If you had the delta,
00:38:42.580 it spreads.
00:38:43.020 If you have the other
00:38:43.600 one, it spreads.
00:38:44.140 So I can see why there
00:38:45.440 would be more...
00:38:48.140 Well, actually, there's
00:38:49.680 just no mechanism
00:38:50.500 described here.
00:38:51.860 Now, then other people
00:38:54.780 use analogies.
00:38:57.640 But the analogies fall
00:38:58.740 apart.
00:38:59.060 I'll give you one
00:38:59.560 analogy that was used.
00:39:00.980 If you're trying to
00:39:01.680 breed small dogs, let's
00:39:03.720 say you kill all the big
00:39:04.700 dogs, and then you breed
00:39:05.980 the small ones, and then
00:39:07.580 you do the same round
00:39:08.640 again.
00:39:09.200 Whatever the largest
00:39:10.580 puppies are, you kill
00:39:11.740 all of them, and then
00:39:13.360 only small puppies grow
00:39:14.820 up, and then they have
00:39:15.620 babies, and they're
00:39:16.280 small, right?
00:39:17.080 So that's the analogy.
00:39:18.900 So the virus would be
00:39:20.320 like that.
00:39:22.620 No, it wouldn't, because
00:39:24.260 the virus doesn't kill
00:39:25.240 all puppies.
00:39:26.520 The virus doesn't kill
00:39:28.060 all anything.
00:39:29.320 It just reduces it.
00:39:31.060 So the analogy would be
00:39:32.900 if you were trying to
00:39:33.860 breed small puppies, you
00:39:36.200 take all puppies and
00:39:37.200 kill 10% of them.
00:39:40.140 How does that help?
00:39:41.740 So the analogy just
00:39:43.220 falls apart, because
00:39:44.760 there's always some
00:39:45.700 difference in the
00:39:46.400 analogy from the
00:39:47.360 original, and it's a
00:39:48.020 big difference.
00:39:49.520 So there's that.
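[Editor's note: the objection to the dog-breeding analogy can be made concrete with a toy simulation, using entirely made-up parameters. Culling every large individual each generation, as the analogy assumes, drives size down; culling a flat 10% at random, closer to what the transcript says a virus actually does, applies no consistent pressure.]

```python
# Toy simulation contrasting the breeding analogy's assumption (kill ALL
# large individuals) with a flat 10% random cull. All parameters are
# arbitrary illustration values.
import random

random.seed(42)

def breed(sizes):
    """Offspring inherit a random parent's size plus small noise."""
    return [random.choice(sizes) + random.gauss(0, 0.5) for _ in range(200)]

def run(generations, cull):
    sizes = [random.gauss(10, 2) for _ in range(200)]  # starting population
    for _ in range(generations):
        sizes = breed(cull(sizes))
    return sum(sizes) / len(sizes)

def kill_all_large(sizes):
    """Truncation selection: remove everyone above the median size."""
    cutoff = sorted(sizes)[len(sizes) // 2]
    return [s for s in sizes if s <= cutoff]

def kill_random_tenth(sizes):
    """Flat 10% random cull: no consistent size pressure."""
    return [s for s in sizes if random.random() > 0.10]

selected_mean = run(20, kill_all_large)
random_mean = run(20, kill_random_tenth)
print(f"mean size after truncation selection: {selected_mean:.2f}")
print(f"mean size after random 10% cull:      {random_mean:.2f}")
```

Under these assumptions the truncation-selected line shrinks sharply while the randomly culled line stays near its starting mean, which is the gap in the analogy the transcript is pointing at.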
00:39:50.340 All right, so some of
00:39:51.060 you in the comments are
00:39:51.760 saying survival of the
00:39:53.740 fittest.
00:39:54.780 How many of you in the
00:39:56.020 comments think that
00:39:57.640 survival of the fittest
00:39:59.060 can explain why the big
00:40:01.580 variants get out past the
00:40:03.800 vaccines, and the weaker
00:40:06.060 ones don't?
00:40:06.740 Is it survival of the
00:40:07.700 fittest?
00:40:08.380 Because that's the thing,
00:40:09.460 right?
00:40:09.640 You've all studied
00:40:10.280 evolution.
00:40:11.640 Survival of the fittest.
00:40:13.940 How many of you know
00:40:14.900 that survival of the
00:40:15.780 fittest is not science,
00:40:18.260 and that it's not
00:40:19.060 evolution, and that
00:40:20.680 survival of the fittest
00:40:21.720 doesn't exist?
00:40:23.400 How many of you know
00:40:24.100 that?
00:40:25.000 How many of you know
00:40:25.940 that the theory of
00:40:26.700 evolution does not
00:40:27.880 include survival of the
00:40:29.880 fittest?
00:40:30.700 That's completely
00:40:31.540 debunked.
00:40:33.000 How many of you knew
00:40:33.940 that?
00:40:36.900 Somebody was about to
00:40:38.260 say that.
00:40:39.100 And do you know who
00:40:39.760 debunked it?
00:40:41.480 Stephen Jay Gould.
00:40:43.500 I think probably the
00:40:45.120 top evolutionary
00:40:46.040 biologist in the
00:40:47.120 country.
00:40:47.880 I think he's passed
00:40:48.580 away now.
00:40:49.540 But it's debunked by
00:42:51.660 the top
00:42:52.820 evolution scientist,
00:40:54.860 and agreed by all of
00:40:56.480 science.
00:40:58.740 There's nobody in the
00:41:00.160 evolution field who
00:41:02.660 thinks survival of the
00:41:04.860 fittest is a thing.
00:41:06.600 Did you know that?
00:41:08.580 Because so many of you
00:41:09.460 said, well, it's
00:41:09.980 survival of the fittest.
00:41:11.200 You just apply it to
00:41:12.100 the virus.
00:41:13.200 But it doesn't exist
00:41:14.140 anywhere.
00:41:15.140 It's not a thing.
00:41:17.380 Most people think it
00:41:18.280 is.
00:41:18.480 It's sort of like people
00:41:19.740 believe that Trump said,
00:41:21.020 you know, the Nazis were
00:41:22.620 fine people.
00:41:23.800 People think that Trump
00:41:24.920 said drink bleach.
00:41:26.280 It just didn't happen.
00:41:27.920 None of those things
00:41:28.600 happened.
00:41:29.260 And there's no such
00:41:30.700 thing as survival of the
00:41:32.620 fittest.
00:41:33.000 It doesn't exist.
00:41:35.380 Do you know what does
00:41:36.060 exist instead?
00:41:38.100 Survival of things that
00:41:39.100 didn't die.
00:41:40.580 It's a big difference.
00:41:42.460 Right?
00:41:42.900 So if you had a species
00:41:44.980 that was just perfectly
00:41:46.560 well-suited, but then
00:41:47.780 let's say a tsunami
00:41:52.440 kills everybody, was that
00:41:55.740 group unfit?
00:41:57.340 No, they just had bad
00:41:58.340 luck.
00:41:58.620 There was a tsunami.
00:41:59.820 It got them all.
00:42:00.420 So there's luck, and
00:42:03.680 there's survival of things
00:42:04.660 that survived, but that's
00:42:05.740 it.
00:42:06.460 There's just survival of
00:42:07.520 things that survived.
00:42:08.760 So in the world of
00:42:10.080 survival of things that
00:42:11.340 survive, which is the
00:42:12.740 whole explanation, just
00:42:14.160 some things survive.
00:42:15.460 They get lucky.
00:42:16.400 That's it.
00:42:18.080 No, it's not semantics.
00:42:20.020 It's not even close to
00:42:21.380 just being semantics.
00:42:22.480 If you don't see the
00:42:23.160 difference between
00:42:24.360 survival of the fittest
00:42:25.840 and survival of the
00:42:27.660 random, you're missing a
00:42:29.720 really big point.
00:42:31.020 I'm saying that the way
00:42:32.060 we evolve is survival of
00:42:33.780 the random.
00:42:34.900 It's not the fittest.
00:42:37.320 Sometimes it is, but it's
00:42:39.340 coincidence.
00:42:41.340 All right.
00:42:43.340 So it's not so much a
00:42:44.480 natural selection.
00:42:45.640 It's just chance would be a
00:42:47.380 better way to say it.
00:42:48.180 It's just chance.
00:42:48.740 So I don't see how chance
00:42:50.900 can be part of the
00:42:52.620 explanation.
00:42:54.600 But we are warned by
00:42:57.080 Ian Martises that there's
00:42:59.600 a bigger risk called
00:43:01.660 original antigenic sin.
00:43:03.560 How many of you have
00:43:04.160 heard of that?
00:43:05.920 Raise your hands if you've
00:43:07.240 ever heard of original
00:43:08.800 antigenic sin.
00:43:10.640 I'm going to try to
00:43:11.640 explain it, but forgive me
00:43:13.820 because I don't understand
00:43:15.120 this field well enough.
00:43:16.340 So I'll give you the dumb
00:43:17.640 person's, you know, the
00:43:18.640 idiot's definition.
00:43:20.580 It goes like this.
00:43:21.720 If your immune system has
00:43:23.040 been trained against a
00:43:24.820 particular attacker, let's
00:43:26.560 say a virus, you have a
00:43:30.020 kind of immunity memory.
00:43:33.100 And that immunity memory
00:43:34.480 could work against you as
00:43:37.360 well as for you.
00:43:38.200 And the way it could work
00:43:38.920 against you is that when a
00:43:40.780 new virus comes in, a
00:43:41.980 variant, let's say, your
00:43:43.800 immune system says, oh, I
00:43:44.960 know exactly what to do.
00:43:46.420 And it ramps up to fight
00:43:47.620 the virus, but it's a
00:43:49.140 different virus.
00:43:50.160 It's just one that's like
00:43:51.260 it.
00:43:52.080 Now your immune system is
00:43:53.360 all ramped up to fight the
00:43:55.140 wrong thing.
00:43:56.520 It's actually working
00:43:57.540 overtime on the wrong
00:43:58.460 thing because it said, I
00:43:59.640 think that virus is a lot
00:44:00.820 like that other one.
00:44:01.780 So it goes to work on the
00:44:03.040 wrong virus.
00:44:04.380 And then the right one just
00:44:05.640 has a, you know, clear
00:44:06.860 channel.
00:44:07.920 Now, apparently that's a
00:44:10.040 real thing which has
00:44:10.880 happened in the past in
00:44:11.860 different situations.
00:44:13.200 So just know that that's
00:44:15.040 out there.
00:44:16.300 All right.
00:44:17.520 I don't know what the
00:44:18.540 size of that risk is
00:44:19.700 actually.
00:44:22.720 And that is what I
00:44:24.240 wanted to talk about
00:44:25.060 today.
00:44:26.620 I'm going to run and do
00:44:27.560 something else.
00:44:28.840 And I hope you don't
00:44:30.920 hate the persuasion
00:44:32.600 lessons in the context of
00:44:34.020 these boring topics like
00:44:35.300 vaccinations and masks and
00:44:37.700 stuff like that.
00:44:38.360 I think the
00:44:39.220 persuasion stuff is
00:44:40.100 interesting, and if you
00:44:43.200 don't, let me know.
00:44:44.300 But I think that you
00:44:45.020 learn something when you
00:44:45.900 see the persuasion
00:44:46.960 element separate from the
00:44:48.560 science.
00:44:49.260 All right.
00:44:49.520 That's it for now.