Real Coffee with Scott Adams - September 20, 2021


Episode 1505 Scott Adams: Today I Will Trigger Massive Cognitive Dissonance in My Audience. You Should Not Watch.


Episode Stats


Length: 50 minutes

Words per minute: 144.04633

Word count: 7,239

Sentence count: 616

Harmful content

Misogyny: 1 sentence flagged

Hate speech: 8 sentences flagged
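The words-per-minute figure above is presumably just the word count divided by the unrounded episode duration; a minimal sketch of that arithmetic, assuming the listed "50 minutes" is a rounded value:

```python
# Reproduce the stats arithmetic using the values reported above.
word_count = 7239
words_per_minute = 144.04633  # as reported

# Back out the unrounded duration; the page lists it rounded to 50 minutes.
duration_min = word_count / words_per_minute
print(round(duration_min))     # 50
print(round(duration_min, 2))  # 50.25
```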


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

This is the most provocative and annoying live stream you'll ever see. If you don't want to have that happen to you, just turn it off. It's going to make a lot of people mad, like, really mad.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
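The flagged-sentence counts in the stats above come from running each transcript sentence through a classifier and counting positive labels. A minimal sketch of that counting step, with a stand-in `classify` function (the real page presumably calls a Hugging Face text-classification pipeline loaded with one of the models named here):

```python
def count_flagged(sentences, classify, positive_label="hate", threshold=0.5):
    """Count sentences whose label is the positive class at or above threshold.

    `classify` is a stand-in for a real model call, e.g. a Hugging Face
    text-classification pipeline using one of the models named above.
    """
    flagged = 0
    for sentence in sentences:
        label, score = classify(sentence)
        if label == positive_label and score >= threshold:
            flagged += 1
    return flagged

# Stub classifier for illustration: flags any sentence containing "bad".
def stub_classify(sentence):
    return ("hate", 0.9) if "bad" in sentence else ("nothate", 0.9)

print(count_flagged(["a bad take", "good morning"], stub_classify))  # 1
```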
00:00:00.000 Good morning, everybody.
00:00:04.160 Welcome to the most provocative and annoying live stream you'll ever see for the rest of
00:00:11.920 your life.
00:00:13.200 Today, unlike most of my live streams, where I try to entertain you and make you happy,
00:00:21.220 today will make you very unhappy.
00:00:23.920 In fact, so unhappy, I recommend you turn it off right now.
00:00:27.700 This will be the, I don't know, maybe I've done this before, but quite literally, it's
00:00:35.220 going to make a lot of you really mad, like really, really mad.
00:00:39.240 If you don't want to have that happen to you, just turn it off, because you don't need that
00:00:43.420 kind of trouble in your life, probably.
00:00:46.500 Here's what's going to happen.
00:00:49.120 I'm going to take apart some of the arguments about COVID.
00:00:54.100 We'll do that at the end, so I'm going to do the general stuff first.
00:00:56.840 If you want to hang in here for that.
00:00:58.680 But when we talk about the COVID stuff, I'm only going to be talking about decision making.
00:01:05.220 I don't care.
00:01:06.560 Let me be as clear as possible about this.
00:01:09.780 I love you all, and I don't care if you live or die.
00:01:15.860 Can you accept that?
00:01:17.580 I love you all, and I don't care if you live or die.
00:01:21.060 As long as you're pursuing your life the way you want to pursue it.
00:01:25.600 If you want to do some extreme sports and you die, I love you, but I also don't care,
00:01:33.380 because you did what you wanted to do.
00:01:34.860 You died the way you wanted to die.
00:01:36.900 And I'm all for that, completely.
00:01:39.020 So, if you can't accept the fact that I love you, but I'm totally okay with you killing yourself any way you want to, really.
00:01:48.460 That's just your business.
00:01:49.920 So, I'm not going to try to shill any vaccinations or anything like that.
00:01:53.800 And let me even go the extra distance.
00:01:59.340 I am so much into your personal freedom that I will risk my life for you to have that right.
00:02:08.040 Because in some small sense, you know, we all take a little extra risk if you take a little extra risk, with the virus anyway.
00:02:15.380 Because if you get sick, maybe you get other people sick, etc.
00:02:19.100 But, I'm not going to stop you, because your freedom might cost me a little bit, or be dangerous to me.
00:02:28.240 I let you drive a car, and yet you might take a drink and drive your car into my car.
00:02:34.360 But that doesn't stop me from letting you drive, or trying to stop you from driving, as if I could.
00:02:40.400 I don't care that you do dangerous sports, even though I know that raises my healthcare costs.
00:02:46.720 You know, if you hurt yourself, and it raises the expense of healthcare for everybody in the country.
00:02:52.760 I'll take that.
00:02:53.680 I'll take that risk for your personal freedom.
00:02:56.620 So, I'm very, very strongly in favor of your personal freedom, even when it affects me or has a risk that I accept.
00:03:06.000 Because I figure you do the same for me, right?
00:03:08.620 I take risks.
00:03:10.360 You pay for my risks, too.
00:03:12.620 Part of the deal.
00:03:13.220 All right, let's talk about some other stuff.
00:03:17.760 Rasmussen asked people if they think the economy will be getting stronger or weaker in the coming year.
00:03:25.380 29%.
00:03:25.820 Thank you, Hannah.
00:03:29.540 We will do the sip in a moment.
00:03:31.400 I have not forgotten.
00:03:34.480 29% said the economy will be stronger, but 47% said they expect it to be weaker next year.
00:03:40.880 47% of the country thinks the economy will be weaker next year.
00:03:47.940 Remember what I told you about economics?
00:03:50.440 One of the few things I know something about.
00:03:53.100 If anybody's new to me, I have a degree in economics and an MBA.
00:03:57.820 Okay, so one of the things that economists know is that economics is an optimism machine.
00:04:07.240 Optimism, specifically your expectation that next year will be okay, is what drives economics.
00:04:13.880 Because you won't invest in anything if you think it won't pay off.
00:04:18.780 So you have to have optimism in order to invest.
00:04:21.420 And you have to have investment in order to have an economy.
00:04:24.560 So optimism drives everything.
00:04:25.920 If I make a deal with you, any kind of a business deal, it's based on optimism that you will pay me back or you'll do your part of the deal.
00:04:36.880 Without optimism, everything falls apart.
00:04:39.920 So I've argued that Trump brought the optimism in a way few people could.
00:04:46.560 He was a salesperson for the economy, for the United States, America first.
00:04:51.300 And I feel as if Trump was unambiguously the better optimism president.
00:04:57.400 He said it directly.
00:04:58.940 I'm going to be the cheerleader for the country and for the economy.
00:05:02.100 And then he went out and did it.
00:05:04.140 And even people who didn't like Trump still thought the economy was going to do well.
00:05:11.320 So I don't think you can ignore this.
00:05:14.720 That the president is pretty directly a cause of what you think about the economy.
00:05:21.480 Now suppose Trump had been president, but we still had all this debt.
00:05:27.140 You know, debts, it's something you've got to worry about.
00:05:30.160 It's not exactly like personal debt.
00:05:32.140 We don't quite understand national debt.
00:05:35.480 Economists, I think, would agree that they were completely wrong about how much debt we could take on.
00:05:42.500 Completely wrong.
00:05:43.360 I mean, like, maybe by a factor of 10, they've been wrong.
00:05:49.720 Or were they wrong?
00:05:50.980 Maybe we can't handle the debt.
00:05:52.520 Maybe we'll be surprised.
00:05:54.160 But certainly, you know, national debt doesn't operate the way personal debt does.
00:05:58.360 With personal debt, you kind of know what's too much.
00:06:00.620 You know, at least within a narrow range.
00:06:04.300 But with national debt, we don't know.
00:06:06.860 You know, as long as you have nuclear weapons and people like your currency better than other currency,
00:06:10.960 I don't know, maybe you can just inflate away most of your troubles and nobody notices.
00:06:18.940 Somebody's suggesting that we all pray for rain.
00:06:22.280 Worth a shot.
00:06:23.680 Pray for rain for the whole country.
00:06:26.740 So how about this Woodward book?
00:06:28.500 So the book Peril that we're all talking about, I learned yesterday that in that book,
00:06:35.180 Woodward treats the fine people hoax as though it's a historical fact.
00:06:39.440 Not a fact that it's a hoax, Woodward treats the fine people hoax like it's not a hoax, like it actually happened.
00:06:50.420 And he reports it as one of the reasons that Biden was so interested in being president.
00:06:55.060 Now, there are many sketchy things in this and other Woodward books.
00:07:00.580 Things which are quotes that people say, I didn't really say that, etc.
00:07:03.960 But here's the thing.
00:07:10.000 I'm starting to wonder if Watergate really happened.
00:07:13.920 Right?
00:07:15.340 Because everything we knew about Watergate came from Woodward and Bernstein, at least originally.
00:07:22.280 And they don't have any credibility.
00:07:25.100 I mean, I suppose they did when they did the Watergate stuff.
00:07:30.900 But at the moment, we know that Woodward, at least, is not a credible character.
00:07:35.960 He's not even close.
00:07:37.940 I mean, if you believe the fine people hoax and you put it in a book, in 2021 it's in a book,
00:07:43.880 that's embarrassing, really.
00:07:46.160 It doesn't seem like there's any interest in the truth.
00:07:52.740 And I'm semi-joking about doubting Watergate.
00:07:57.140 But here's what I do think.
00:07:58.560 I wonder if the details of Watergate are even close.
00:08:02.120 Probably the big picture is right.
00:08:04.740 Well, I bet the details are all wrong.
00:08:07.400 If these guys were the only source of those details.
00:08:10.800 I suppose a lot of it could be independently checked.
00:08:13.000 All right, we've got a little mystery here with excess mortality.
00:08:18.480 Who knows if the data is right, but it comes from ourworldindata.org.
00:08:23.660 And Adam Dopamine was tweeting about this.
00:08:26.600 Apparently the numbers showed that 2020 was a much higher than normal excess mortality.
00:08:32.620 Completely understandable.
00:08:34.380 Because we had a massive pandemic.
00:08:37.200 So the 2020 numbers make complete sense.
00:08:40.360 Way higher than the baseline.
00:08:41.640 We had a pandemic.
00:08:43.460 Perfectly explained.
00:08:45.800 But 2021 is below average.
00:08:49.100 Can you explain that?
00:08:51.120 How do you explain that we're below average mortality?
00:08:57.920 How do you explain?
00:08:59.720 Well, let me give you an explanation that I think makes sense.
00:09:04.340 That the people who are going to die this year died last year.
00:09:08.740 Right?
00:09:09.000 The pandemic took out all the people closest to death.
00:09:14.620 And if you take out all the people who are closest to death, the obvious outcome of that should be a lower than normal death rate for the next three to five years.
00:09:26.000 Probably three.
00:09:27.440 Wouldn't you say?
00:09:28.100 Yeah.
00:09:28.340 The vulnerable died early.
00:09:29.360 So I think this number actually makes perfect sense.
00:09:32.760 Doesn't mean it's right.
00:09:34.240 Could be there's a lag.
00:09:36.020 Could be the data's wrong.
00:09:37.100 Any of those are a possibility.
00:09:38.720 But it makes sense.
00:09:40.420 Based on what we know, it's a perfectly sensible outcome, even if it turns out later to be wrong.
00:09:46.000 There's some weird things happening with my printer here.
00:09:58.840 Oh, here we go.
00:10:01.000 All right.
00:10:02.860 We're going to talk about COVID stuff now.
00:10:06.640 So those of you who don't like this conversation, I would invite you to leave because you don't need to have any bad feelings about me.
00:10:13.980 Just come back when I'm not talking about this.
00:10:16.920 Now, I did hear your...
00:10:18.220 I have heard all of your complaints.
00:10:21.440 Yeah, we will do this simultaneous sip before I do this.
00:10:24.200 That's a good idea.
00:10:25.480 Thank you on locals for reminding me.
00:10:29.380 Actually, why don't we do that now?
00:10:31.700 You know, if you'd like to enjoy this morning to the maximum potential, what you need is a simultaneous sip.
00:11:38.540 And all you need for that is a cup or a mug or a glass, a tank or a chalice, a canteen jug or a flask, a vessel of any kind.
00:10:44.900 Fill it with your favorite liquid.
00:10:46.340 I like coffee.
00:10:48.220 And join me now, commercial free, for the unparalleled pleasure, the dopamine here of the day, the thing that makes everything better.
00:10:57.840 It's called the simultaneous sip.
00:10:59.760 It happens now.
00:11:00.760 Go.
00:11:00.900 Go.
00:11:05.600 Ah, yeah.
00:11:07.780 Yeah, so good.
00:11:08.920 All right, we're going to talk about COVID stuff, but as you know, my beat is not health.
00:11:16.940 My beat is not health.
00:11:19.040 So none of this is about your health choices, okay?
00:11:21.760 It's only about how you think about it.
00:11:24.080 The only thing I care about is how you think about it.
00:11:28.240 Don't care if you die, as long as you're living your life the way you want it.
00:11:32.940 All right.
00:11:34.560 So here's the first question.
00:11:39.160 If hearing that some percentage of vaccinated people are still getting COVID makes you conclude that vaccinations don't work, I've got one question for you.
00:11:50.760 How does confirmation feel when you're in it?
00:11:53.780 Now, here's the thing you need to know about confirmation bias.
00:11:56.500 The people who have it don't know it.
00:12:01.160 But the people who don't have it, because for whatever reason they're uninterested in the question usually, they can see it.
00:12:08.140 It's like really obvious to the observer if you're being irrational.
00:12:11.920 It's just not obvious to you.
00:12:14.200 So I'm telling you that if you have this opinion, you're experiencing confirmation bias.
00:12:20.580 I'm a third party, and I'm not terribly interested one way or the other.
00:12:24.380 Because like I said, I don't care if you live or die.
00:12:26.720 I don't care if you get vaccinated or not.
00:12:29.400 Moreover, I'm not even so sure I made the right decision.
00:12:33.780 Hear this really clearly.
00:12:36.060 I got vaccinated.
00:12:38.100 I made that decision.
00:12:39.940 Am I sure I made the right decision?
00:12:42.480 Nope.
00:12:43.760 Nope.
00:12:44.860 If I were sure I made the right decision, then you could say to me, hmm, hmm, might be confirmation bias.
00:12:51.160 But I'm telling you I'm not sure.
00:12:55.280 That's almost a guaranteed get-out-of-jail card.
00:12:58.760 If you can say unambiguously, I don't know.
00:13:01.660 I could be wrong.
00:13:02.920 It could be that getting vaccinated was a big mistake.
00:13:05.940 Maybe.
00:13:07.160 I mean, I don't think so.
00:13:08.340 I'm playing the odds.
00:13:09.780 I think the odds are strongly in my favor.
00:13:12.840 But I don't know.
00:13:14.380 And the people who are skeptical are not crazy.
00:13:17.280 Are they?
00:13:17.860 Are they?
00:13:19.880 Is anybody who's skeptical crazy?
00:13:22.340 We do have a vaccination that, you know, we don't know, you know, if anything could pop up in the long term.
00:13:31.260 Who knows?
00:13:32.560 I don't think so.
00:13:34.080 I think the risk was worth it.
00:13:36.020 But I don't know.
00:13:37.620 Okay?
00:13:37.840 So here's your first tip.
00:13:40.220 Remember, I'm not telling you anything about vaccinating or not vaccinating.
00:13:44.240 I'm just talking about the psychology of it.
00:13:46.740 If you can say out loud, I made a choice, but I really don't know if it's the right one.
00:13:52.560 You might not have confirmation bias.
00:13:56.380 That would be one way to tell.
00:13:58.580 But if you say to yourself, I didn't get the vaccination, then damn it, I'm right.
00:14:02.880 I'm really right.
00:14:05.120 Hmm.
00:14:05.660 You might.
00:14:06.540 You might be right.
00:14:07.860 That's one possibility.
00:14:09.480 But you might have confirmation bias.
00:14:11.540 The person who says, I don't know if I'm right or not, almost certainly doesn't.
00:14:15.020 Because the conditions for it don't exist.
00:14:20.760 But here's the problem.
00:14:22.900 What if, and I'll just put this the way I put it to my critic.
00:14:28.080 It looks like the page didn't print here or something in my notes.
00:14:37.060 If you think that the vaccinations don't work because you need a booster, does anybody
00:14:43.680 think that?
00:14:44.160 Is there anybody here who would say to me, the vaccinations don't work because you need
00:14:50.360 boosters?
00:14:51.880 Or I'll modify that.
00:14:53.680 They don't work as promised because you need boosters.
00:14:58.260 Does anybody think that?
00:15:00.020 Who has that opinion?
00:15:02.760 That vaccinations don't work, or at least don't work as promised because of boosters?
00:15:08.640 Well, do you know that tetanus shots are vaccinations and they require boosters?
00:15:16.260 Did you know that?
00:15:16.840 There are other shots which require boosters.
00:15:21.160 It's not the only one.
00:15:22.260 Now, if you get a tetanus shot, you might need it only every 10 years or some say even 30.
00:15:30.340 Is the tetanus shot just as good in, let's say, the eighth year as it was the first year?
00:15:35.820 What do you think?
00:15:36.960 Is that tetanus shot just as good in year nine as it was in year one?
00:15:43.140 Well, apparently not, unless booster shots are 100% until the day they're not.
00:15:50.900 I don't think that's the case.
00:15:52.560 Seems like they probably just wear off slowly and after 10 years, it's smart to get another
00:15:57.420 one.
00:15:57.620 But could you say that somebody who has the tetanus shot could still get tetanus?
00:16:05.680 Well, yes, right?
00:16:07.440 Because otherwise there wouldn't be boosters, would there?
00:16:09.980 If you couldn't get tetanus while you also had a tetanus vaccination, you would never need
00:16:17.880 a booster.
00:16:19.400 Somebody fact-checked me on that.
00:16:20.900 I mean, it sounds logical, but with this medical stuff, there could be some non-obvious
00:16:25.160 explanation for things.
00:16:26.240 All right?
00:16:28.480 So, I would say the fact that the vaccinations do not work perfectly, and for as long as
00:16:35.820 you'd like them to, is not uncommon, and therefore not a question of whether they work or don't
00:16:42.500 work.
00:16:43.820 All right?
00:16:44.420 Now, if you disagree with what I just said, probably you're having some kind of confirmation
00:16:48.860 bias problem.
00:16:51.100 Right?
00:16:52.760 And some of you said, but wait, Scott.
00:16:55.440 Ten years for a booster, that's way different than six months.
00:17:00.700 Yeah, that's way different.
00:17:02.440 Not in a way that matters.
00:17:05.400 It doesn't matter to the argument.
00:17:07.700 I mean, it matters to your life, of course.
00:17:10.360 But it doesn't matter to the argument.
00:17:11.440 The argument is, can it be called a vaccination if it doesn't completely stop the thing it's
00:17:17.180 vaccinating?
00:17:17.860 And the answer is yes.
00:17:19.620 That is the standard.
00:17:21.100 The standard is you can call it a vaccination even if it doesn't completely stop the thing
00:17:26.120 that you're trying to stop.
00:17:27.120 Does every person who gets a vaccination not get, I don't know, chicken pox?
00:17:34.420 Are there any doctors on here who can answer that?
00:17:37.080 Do you think that everybody who got the chicken pox vaccination, do you think none of them got
00:17:43.360 chicken pox?
00:17:45.180 I'll bet some did.
00:17:46.760 I'll bet they did.
00:17:47.520 So the fact that a vaccination is not 100% doesn't mean it's not a vaccination.
00:17:53.760 It just means that it's a little less than it could be.
00:17:56.900 It's a little, it's leaky, right?
00:17:59.060 It's a leaky vaccination.
00:18:00.920 Now, did they lie to us?
00:18:03.620 Here's the next question.
00:18:05.040 Did the experts lie to us?
00:18:08.600 I'd say no.
00:18:09.740 Because they probably were optimistic and might have actually thought it would be closer
00:18:17.960 to 100% than it is.
00:18:21.040 Now, do you think that they knew that you'd need a booster?
00:18:25.980 Let me ask you this.
00:18:28.320 How many trials of the vaccinations were there?
00:18:33.020 Like Moderna, how many trials did it do?
00:18:36.160 And Pfizer, how many trials did it do?
00:18:38.180 Because if they only did, I don't know, one or two, however many each of them did, how
00:18:44.660 many doses, different doses did they test?
00:18:49.460 Probably one or two, right?
00:18:52.120 And how did they know what to test?
00:18:54.200 Like, how did they know what dose to test?
00:18:57.220 They didn't.
00:18:58.420 They didn't.
00:18:59.560 They couldn't have.
00:19:00.960 Educated guess.
00:19:02.820 It was an educated guess.
00:19:04.020 So what are the odds that their educated guess hit exactly the right amount of vaccination
00:19:09.700 you need?
00:19:11.600 Very low.
00:19:13.100 Very low.
00:19:14.020 Common sense tells you that they were guessing a little bit, a little bit, and that there
00:19:19.020 might be some adjustments needed.
00:19:21.140 So if they guessed on the low side, meaning they didn't want to kill you with the vaccination
00:19:25.880 itself, so they made it low enough that, you know, maybe it gave you some side effects,
00:19:30.140 but didn't kill too many people.
00:19:32.840 Yes, I know the vaccinations have side effects.
00:19:35.520 Yes, I know that every vaccination kills people, and that more of them, apparently, with this
00:19:39.260 vaccination, perhaps, than others.
00:19:42.980 But there's no such thing as a vaccination that they can guess accurately the right amount,
00:19:49.960 test it once, and then they just got lucky.
00:19:52.940 They got the right amount.
00:19:53.740 So the fact that they have to, I'm seeing the VAERS database being shown to me about
00:20:03.360 all the alleged adverse events.
00:20:07.920 Most of you know the VAERS database should not be used, right?
00:20:12.580 Do any of you think the VAERS database gives you useful information about the adverse side
00:20:20.260 effects?
00:20:21.300 It does not.
00:20:22.480 It's not for that.
00:20:24.740 It's self-reported.
00:20:27.160 So what it is, is to raise flags, such that you would more quickly notice if there was
00:20:34.680 something to look into.
00:20:36.480 So the VAERS database tells you there's something to look into.
00:20:39.920 It doesn't tell you there's a problem.
00:20:41.320 It's not meant for that.
00:20:42.440 It's not designed for that.
00:20:44.760 Right?
00:20:44.880 The reason you have controlled experiments is because self-reported stuff is so wildly inaccurate
00:20:51.960 that's not useful for scientific decisions.
00:20:55.600 Yeah.
00:20:56.180 So would we all agree that the VAERS database has raised important questions which we should
00:21:01.260 really look into?
00:21:02.880 Everybody agree with that?
00:21:04.660 The VAERS database, where people self-report the side effects, it absolutely raises a flag
00:21:12.220 that we should look into.
00:21:13.680 There's nobody who disagrees with that, right?
00:21:16.980 But do you also agree that it's not intended to give you the answer?
00:21:22.500 It's just supposed to raise a flag so you look into it.
00:21:25.880 We're all on the same page on the VAERS database.
00:21:28.940 And if you weren't, you are now.
00:21:31.080 Because if you thought that it was telling you something, it isn't.
00:21:33.740 It's just raising a question, basically.
00:21:35.960 All right.
00:21:40.800 Isn't it kind of too late?
00:21:42.420 Well, that's an important question, too.
00:21:46.120 So here are some cognitive dissonance that got triggered when I started asking these questions
00:21:51.560 on Twitter today.
00:21:52.520 And I'll tell you how to recognize cognitive dissonance in case it happened to you.
00:21:57.780 Here are some of the comments from people who are almost certainly experiencing cognitive
00:22:03.780 dissonance.
00:22:04.680 You ready?
00:22:04.960 A tweet that just says, I had a bad take.
00:22:11.120 It's a bad take.
00:22:12.140 No other reason.
00:22:12.820 It's just a bad take.
00:22:13.960 When you see an attack on the person without any reference to the reason, that's almost
00:22:20.360 always cognitive dissonance.
00:22:22.100 Because people will include a reason if they have one.
00:22:26.660 And if they don't have one, it's cognitive dissonance.
00:22:29.360 So why would you say I had a bad take instead of saying, for example, you're underestimating
00:22:37.300 something, something?
00:22:42.280 Hey, Norwood?
00:22:44.320 Norwood says I have two states and they're both boring.
00:22:47.000 And yet, he's decided to stay here.
00:22:50.000 So we're going to hide you from the channel because you're an asshole.
00:22:52.400 Also, very dumb because you do things that you don't like repeatedly.
00:22:58.560 Like listening to me.
00:23:00.380 Very dumb.
00:23:01.260 If you don't like it, probably stop doing it.
00:23:04.680 Probably.
00:23:04.900 All right.
00:23:08.180 Another comment I got today.
00:23:10.740 Somebody said, you should have thought this through before sending.
00:23:13.960 In other words, before tweeting.
00:23:16.520 Do you think that this person couldn't have put any reference to what was wrong with my
00:23:21.160 thinking in a tweet?
00:23:22.320 Wouldn't fit?
00:23:23.080 Not enough words?
00:23:25.200 No, easily you could put what's wrong with it.
00:23:27.920 You underestimated this.
00:23:30.160 You have a wrong fact.
00:23:31.580 Your logic connecting this to this doesn't work.
00:23:35.940 Lots of things you could have said.
00:23:38.980 Another one I got today was, that's not how it works.
00:23:42.960 And then a reference to my Dilbert cartoon career.
00:23:46.280 That's not how it works.
00:23:48.520 Or, that's not how any of this works.
00:23:52.100 The Dow is down 600.
00:23:53.920 Jeez.
00:23:54.300 So, these are the arguments that are pretty much tip-offs for cognitive dissonance.
00:24:06.560 Here's an economist's tip.
00:24:09.320 How to compare things.
00:24:10.600 One of the things that economists learn is how to accurately compare the right stuff.
00:24:14.580 If you're comparing the wrong things, you're always going to get the wrong decision.
00:24:19.420 Right?
00:24:19.580 You have to compare the right things.
00:24:21.560 That's just basic.
00:24:22.300 Here's what people are doing wrong in their comparisons.
00:24:27.820 If you're comparing the COVID vaccination to your original belief of how good it would be,
00:24:36.220 you haven't done anything good or useful.
00:24:40.160 Right?
00:24:40.300 And most of you are doing that, right?
00:24:41.780 A lot of the anti-vax people, the vaccination skeptics, are saying,
00:24:47.080 Scott, Scott, Scott.
00:24:47.960 But, if you compare what they led us to believe about the effectiveness of the vaccination and its safety,
00:24:55.500 if you compare that to what we got, whoa, we didn't get what they promised,
00:25:01.820 so therefore it's all a scam and a shill and it'll kill you.
00:25:07.080 Does that make sense?
00:25:09.060 No.
00:25:09.500 No.
00:25:09.980 That's the wrong comparison.
00:25:12.460 Because the thing that you imagined it would be never existed.
00:25:18.140 You get that?
00:25:19.600 The thing you imagined it would be, 100% effective or whatever you imagine, that never existed.
00:25:25.580 That was in your mind.
00:25:27.200 It wasn't in the real world.
00:25:28.280 In the real world, here's the comparison that makes sense.
00:25:33.220 Getting vaccinated versus not getting vaccinated.
00:25:38.080 That's the comparison that matters.
00:25:39.940 Because your decision is vaccinate or not vaccinate.
00:25:42.620 That's all that matters.
00:25:44.080 Vaccinate or not vaccinate.
00:25:45.760 If you're still comparing how you felt about it in the beginning to how it turned out,
00:25:51.980 a little less than you hoped, closer to 80% to 90% effectiveness against hospitalization and death,
00:25:59.780 maybe 80% effective in keeping you from getting it in the first place,
00:26:03.660 but not 100% and not 99%.
00:26:06.620 We hope for that.
00:26:08.280 So don't compare it to your expectations.
00:26:11.260 That's nothing.
00:26:12.840 That's nothing.
00:26:14.000 I saw people doing that today and it's just purely irrational.
00:26:17.460 Your expectations are always disappointing.
00:26:21.980 Compared to reality.
00:26:28.640 So there was a troll this morning who learned that the fine people hoax was a hoax for the first time.
00:26:35.420 And that was kind of fun, watching somebody fall into that trap.
00:26:41.780 And here's how you know that you've triggered somebody into cognitive dissonance.
00:26:45.300 Now, for anybody who's new to me, live streaming, I'm a trained hypnotist.
00:26:53.060 And so I know how to trigger people into cognitive dissonance.
00:26:56.800 I can do it intentionally.
00:26:58.440 And I can do it almost at will.
00:27:00.940 Most of you do it too, but less directly and less intentionally.
00:27:06.040 Anybody can trigger somebody into cognitive dissonance.
00:27:08.460 But when a Twitter user named WhoRadiation found out today that the fine people thing was a hoax,
00:27:18.780 here's one of the tells.
00:27:20.460 Word salad.
00:27:22.200 Word salad.
00:27:23.440 So they will say something when they're experiencing cognitive dissonance,
00:27:27.340 but they're the only person who thinks it makes sense.
00:27:30.800 Everybody else reading it would say,
00:27:33.040 All right, here's another example of a cognitive dissonance right here from El Diablo Locopoco.
00:27:40.760 So here's the classic one.
00:27:42.720 Scott, Scott, Scott.
00:27:43.800 It sounds like you are suffering cognitive dissonance
00:27:47.000 and are rationalizing your decision to be vaccinated.
00:27:50.700 Now, what's wrong with that comment, and why do I say that is cognitive dissonance?
00:27:56.960 I'll read it again and see if you can feel the sound with me.
00:28:01.620 Scott, Scott, Scott.
00:28:03.200 It sounds like you are suffering cognitive dissonance
00:28:05.700 and are rationalizing your decision to be vaccinated.
00:28:10.540 Did I not just tell you that I could be totally wrong about getting vaccinated?
00:28:16.100 Why do I need to rationalize it?
00:28:18.680 I told you what I looked at.
00:28:20.700 I told you what the odds were.
00:28:22.520 If the facts were wrong that I used, I'll say,
00:28:25.080 Whoa, damn, those facts were wrong.
00:28:27.000 So I got the wrong decision.
00:28:29.780 What would be my trigger for cognitive dissonance?
00:28:32.320 You have to look for the trigger.
00:28:34.320 I don't have a trigger.
00:28:36.300 There's nothing that can cause me to have cognitive dissonance
00:28:39.440 because I'm not committed to a side.
00:28:42.180 You have to be committed to an opinion before you can be triggered.
00:28:45.480 I'm as non-committed as you could possibly be.
00:28:48.140 Now, I'm also allowing that I could have made the right decision
00:28:53.620 and had the wrong outcome.
00:28:56.400 Will you allow me that?
00:28:58.200 Let's say I looked at the odds, and the odds were, hypothetically,
00:29:02.740 you don't have to agree with this,
00:29:04.300 but let's say I looked at the odds and I thought the odds were
00:29:06.520 that I should get vaccinated.
00:29:09.420 And then maybe I'm one of the people who has a bad outcome
00:29:11.920 from the vaccination itself.
00:29:13.240 Did I make a wrong decision?
00:29:16.180 I'd say no.
00:29:17.480 Because if you make a decision based on the odds,
00:29:20.000 it's the right decision.
00:29:21.440 It's just a bad outcome.
00:29:23.600 Now, given that I make that distinction,
00:29:26.580 that I can make a good decision
00:29:28.180 and it still be perfectly reasonable to have a bad outcome,
00:29:31.880 I don't have anything that can trigger me into cognitive dissonance.
00:29:35.620 Because everything that can happen is within my model.
00:29:39.500 Right?
00:29:40.300 That's the best way to say it.
00:29:41.280 Everything that's possible fits within my model of the world.
00:29:46.340 Because my model of the world says,
00:29:48.060 well, I followed the odds as best I understood them.
00:29:51.320 Data was probably wrong.
00:29:53.320 Maybe even some of my thinking was wrong.
00:29:56.580 But, you know, I had to make a decision.
00:29:58.460 So if it turns out I'm completely wrong about everything,
00:30:01.300 it's still in my model.
00:30:02.980 It's still consistent with everything I've said.
00:30:04.860 Oh, I could be wrong on those facts.
00:30:06.680 I could have made the wrong decision.
00:30:08.020 But if your model says that what you did is the only thing that's possible,
00:30:13.600 as soon as something outside of your model happens,
00:30:17.060 you're triggered into cognitive dissonance.
00:30:19.600 So you've got to make sure that your model of reality
00:30:22.520 incorporates all the possibilities.
00:30:25.620 That's the only way you can keep yourself out of cognitive dissonance.
00:30:30.660 Did you get that?
00:30:31.160 So now, how many of you are asking yourself,
00:30:34.380 hey, I think I trapped myself.
00:30:38.300 I created a trap for myself.
00:30:40.360 By creating a world view that doesn't allow the other one in.
00:30:45.500 Did you do that?
00:30:46.920 Did you create a world view that just doesn't let the other one in?
00:30:51.820 Because if you did, you're setting yourself up.
00:30:55.760 You're setting yourself up.
00:30:57.360 Are comments off?
00:30:58.380 No, comments are on.
00:30:59.180 Well, comments are on.
00:31:01.940 I am reading them.
00:31:03.640 I don't comment on all the comments, but I am reading them.
00:31:10.400 Somebody says,
00:31:11.100 find the clip of Norm Macdonald saying how he wants his funeral to be.
00:31:15.000 You know, I actually had some instructions in my estate
00:31:18.040 and my will at one point for having a funny funeral,
00:31:21.980 but I took those out.
00:31:22.720 Would you make the same decision today?
00:31:26.040 Yeah, I would.
00:31:26.560 Because I don't think the data changed enough.
00:31:32.640 And remember, I'm in a higher risk category.
00:31:35.460 So any decision I make about vaccinations doesn't apply to any of you.
00:31:39.260 You get that, right?
00:31:40.800 Do you get that my decision about vaccination should not apply to any of you?
00:31:46.180 Like, you shouldn't be influenced by it at all.
00:31:48.020 Because my, you know, I'm in a high risk category.
00:31:50.260 Right?
00:31:52.240 And my high risk is different than your high risk.
00:31:54.820 It's different than everybody else's.
00:31:56.340 So you definitely should not be influenced by my decision.
00:31:59.520 That would be irrational.
00:32:00.960 You don't have the same variables.
00:32:06.520 Here's a comment from Mr. Janice Esquire.
00:32:14.260 A lawyer, I assume.
00:32:17.560 First name Hugh.
00:32:20.720 Last name Janice.
00:32:23.040 He's a lawyer.
00:32:25.800 How does that sound if you say his first name and his last name together?
00:32:29.780 First name Hugh.
00:32:31.800 Last name Janice.
00:32:34.700 Hugh Janice.
00:32:37.220 Hugh Janice Esquire.
00:32:39.520 So Hugh Janice says to me, he goes,
00:32:45.380 that Scott is having another one of his vax spells
00:32:48.060 where he thinks if people don't vax, they will die.
00:32:53.320 Does that sound like me?
00:32:55.100 Does this sound like something that happens to me on a regular basis?
00:32:59.040 That if I get into this vax spell
00:33:01.940 where I think that if people don't vax, they will die?
00:33:05.820 Nope.
00:33:06.260 I think you're all going to die.
00:33:09.520 Me too.
00:33:10.700 I think you'll die if you get vaxed.
00:33:12.880 I think you'll die if you don't get vaxed.
00:33:15.700 I think you're all going to die.
00:33:18.080 And do I care if you die soon?
00:33:20.240 Nope.
00:33:20.960 As long as you're happy.
00:33:22.160 As long as you lived the life you wanted to live.
00:33:24.880 I'm good with that.
00:33:26.240 So a complete hallucination of my opinion
00:33:28.760 by Hugh Janice Esquire.
00:33:31.580 And he says that I'm a vax shill.
00:33:38.040 Do you know any vax shills
00:33:39.960 who advise you to do whatever you want and die?
00:33:43.620 I would be the worst vax shill in the world
00:33:48.180 if I told you, I don't care if you get it.
00:33:50.900 You can die.
00:33:52.840 Any way you want.
00:33:54.460 Any way you want.
00:33:56.040 As long as you allow me the same.
00:33:58.520 Let's talk about Hunter Pepper,
00:34:00.860 the 19-year-old council member
00:34:02.380 who vowed to, quote,
00:34:04.540 fight to the end against a mask mandate
00:34:06.840 in Decatur, Decatur.
00:34:08.860 I guess Decatur, Alabama,
00:34:10.980 revealed he's been hospitalized
00:34:12.480 with shallow breathing.
00:34:14.680 And, of course, the photo of Hunter Pepper.
00:34:17.640 And we do not do fat shaming.
00:34:19.760 No fat shaming on this channel.
00:34:22.440 And we don't do addiction
00:34:23.700 or alcoholic shaming either.
00:34:26.060 They're all in the same category, in my opinion.
00:34:29.120 Which is, we think we have free will,
00:34:31.120 but addiction is kind of a different thing.
00:34:34.140 And I do think people can be addicted to food
00:34:36.320 and addicted to all kinds of stuff.
00:34:37.820 So I don't make fun of people who have addictions
00:34:40.180 because I don't believe in free will
00:34:42.160 the way other people conceive of it.
00:34:44.760 And I would appreciate it
00:34:45.740 if you did not make fun of people
00:34:47.960 who are overweight just for being overweight.
00:34:50.800 Because they just have an addiction
00:34:52.960 you don't have.
00:34:54.600 If you're lucky.
00:34:58.560 But how in the hell...
00:35:00.340 I mean, I just have to say this
00:35:01.560 every time they do that.
00:35:02.340 How in the hell do you not mention his weight?
00:35:04.980 You know, he's clearly quite overweight.
00:35:07.820 And every time they show this picture
00:35:10.660 I feel like it's doing the opposite
00:35:12.420 of what they want.
00:35:13.680 I feel like they're doing these...
00:35:20.180 these anecdotal stories
00:35:23.120 about the person
00:35:24.140 who thought the vaccination was bad
00:35:28.100 and then got hospitalized.
00:35:29.260 But it's all a story of overweight people.
00:35:33.000 They're all overweight stories.
00:35:35.700 And if we don't treat them that way
00:35:37.060 I don't know how we get through this
00:35:41.320 any smarter.
00:35:42.660 Like if we're trying to get through this
00:35:44.240 alive and smarter
00:35:45.540 we're not doing the right thing for that.
00:35:47.440 So
00:35:49.220 I got a tweet from
00:35:52.880 D.O. Genius.
00:35:56.680 So this is somebody who puts
00:35:58.020 the word genius
00:35:59.160 right in their Twitter handle.
00:36:02.220 May I make a recommendation?
00:36:07.280 Brief recommendation.
00:36:08.780 If you're going to brand yourself
00:36:10.860 with the word genius
00:36:12.100 it does
00:36:14.360 sort of give you
00:36:16.100 the responsibility
00:36:17.020 to
00:36:17.520 up your game a little bit.
00:36:19.940 You know?
00:36:20.420 To act a little genius-y.
00:36:23.100 Because people are going to be watching
00:36:24.580 if you call yourself a genius.
00:36:27.240 So let me explain
00:36:28.340 what this genius
00:36:29.120 tweeted today.
00:36:30.880 He said to me
00:36:31.780 if learning that
00:36:32.540 80% of people
00:36:33.640 dying from COVID
00:36:34.640 in the UK
00:36:35.320 are fully vaccinated
00:36:36.600 according to the
00:36:37.860 public health
00:36:38.500 blah blah blah
00:36:39.020 makes you conclude
00:36:40.780 that vaccinations
00:36:41.580 work
00:36:42.980 how does
00:36:45.120 confirmation bias
00:36:46.260 make you feel, Scott?
00:36:48.320 So here he's calling me out
00:36:49.940 for my own
00:36:50.420 confirmation bias
00:36:51.320 by saying that
00:36:52.860 80% of people
00:36:54.040 dying in Great Britain
00:36:55.240 or UK
00:36:56.160 are fully vaccinated.
00:37:00.060 So how does that work, Scott?
00:37:02.760 How does that work?
00:37:04.360 Well, let me ask you
00:37:05.540 some questions
00:37:06.360 in response.
00:37:08.880 D.O. Genius.
00:37:10.780 I'll answer your question
00:37:12.880 after you answer
00:37:13.760 a couple of mine.
00:37:15.860 Number one.
00:37:17.500 Why do bank robbers
00:37:19.240 rob banks?
00:37:22.740 Huh?
00:37:23.320 Huh?
00:37:23.820 Why don't they rob,
00:37:25.260 let's say,
00:37:25.900 trash dumps?
00:37:28.420 Why?
00:37:29.280 Why do they go
00:37:30.400 to banks?
00:37:32.000 Could it be?
00:37:33.760 That's where the money is.
00:37:35.660 Yeah, that's where
00:37:36.220 the money is.
00:37:37.900 Willie Sutton
00:37:38.600 allegedly said that
00:37:39.660 but probably didn't.
00:37:41.620 Yes.
00:37:42.580 Here's another one.
00:37:46.680 Were you aware
00:37:48.120 that
00:37:49.820 almost 98%
00:37:53.060 of all the people
00:37:54.420 who die in China
00:37:55.380 are Chinese?
00:37:57.580 Did you know that?
00:37:59.220 Why?
00:38:00.120 Why is it
00:38:01.040 that almost everybody
00:38:02.260 who dies in China
00:38:03.320 is Chinese?
00:38:04.200 Like, what causes that?
00:38:07.880 Is it some kind
00:38:08.960 of weird disease?
00:38:10.940 I'm all confused.
00:38:12.920 And so would
00:38:13.860 D.O. Genius
00:38:14.840 be confused.
00:38:17.100 Now, let's take
00:38:18.020 the UK
00:38:20.880 where most people
00:38:22.360 are vaccinated.
00:38:23.900 Let's see.
00:38:24.920 In a world
00:38:25.520 where most people
00:38:26.620 are vaccinated,
00:38:27.520 what kind
00:38:31.220 of people
00:38:31.540 would be dying?
00:38:34.360 Could it be
00:38:35.120 the people
00:38:35.660 who are vaccinated?
00:38:36.520 Because that's
00:38:37.140 almost all there is.
00:38:38.880 If
00:38:39.240 all you have
00:38:41.140 is people
00:38:41.540 who are vaccinated,
00:38:43.160 who the hell
00:38:43.720 else is going
00:38:44.260 to be dying?
00:38:46.320 Right?
00:38:46.480 Right?
00:38:46.540 My expectations
00:38:54.760 were there
00:38:55.320 were 0% chance
00:38:56.380 it would be
00:38:56.880 completely safe.
00:38:58.120 Yeah, well,
00:38:59.020 0% chance
00:39:00.000 it's completely safe.
00:39:04.240 So here's the thing.
00:39:05.960 If your country
00:39:07.140 is entirely vaccinated,
00:39:08.860 then only vaccinated
00:39:10.100 people will die
00:39:10.840 of COVID.
00:39:11.620 Because we do know
00:39:12.340 that people get it
00:39:13.140 and they die
00:39:13.680 even if they're vaccinated.
00:39:14.660 So don't call
00:39:16.860 yourself genius
00:39:17.680 if you don't
00:39:19.380 understand why
00:39:20.180 most of the people
00:39:20.920 in China
00:39:21.540 who die
00:39:22.460 are Chinese.
00:39:23.820 They're Chinese.
00:39:25.120 Because that's
00:39:25.740 who lives there.
00:39:29.520 Let me take it
00:39:30.660 to the next extreme
00:39:31.500 in case Mr. Genius
00:39:33.380 wasn't following that.
00:39:35.100 I'll make it
00:39:35.660 a little simple.
00:39:36.820 If 100%
00:39:37.880 of the people
00:39:38.400 in a country,
00:39:39.180 whatever country
00:39:39.900 it is,
00:39:40.800 are vaccinated,
00:39:41.980 100% are vaccinated,
00:39:43.320 what percentage
00:39:46.400 of the people
00:39:46.840 who die
00:39:47.260 in that country
00:39:47.880 would be vaccinated?
00:39:52.520 Work on that
00:39:53.340 and get back to me.
00:39:54.640 If 100%
00:39:55.540 are vaccinated,
00:39:56.580 what percentage
00:39:57.580 of the people
00:39:58.020 dying in that country
00:39:58.860 would also be vaccinated?
00:40:01.020 I know it's a tough one,
00:40:02.420 but see if you can
00:40:03.060 get that right.
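The base-rate argument above can be checked with a few lines of arithmetic. A sketch, where the vaccination rate and the two risk numbers are hypothetical placeholders, not UK figures:

```python
# Base-rate sketch: in a highly vaccinated population, most deaths can be
# among the vaccinated even if the vaccine strongly reduces individual risk.
# vax_rate, risk_unvax, and risk_vax are made-up illustration values.

def vaccinated_share_of_deaths(vax_rate: float,
                               risk_unvax: float,
                               risk_vax: float) -> float:
    """Fraction of all deaths that occur among vaccinated people."""
    deaths_vax = vax_rate * risk_vax
    deaths_unvax = (1 - vax_rate) * risk_unvax
    return deaths_vax / (deaths_vax + deaths_unvax)

# 90% vaccinated, vaccine cuts death risk 10x:
share = vaccinated_share_of_deaths(0.90, risk_unvax=0.01, risk_vax=0.001)
print(round(share, 2))  # ~0.47: nearly half of deaths are vaccinated people

# 100% vaccinated: every death is a vaccinated death, by definition.
print(vaccinated_share_of_deaths(1.0, 0.01, 0.001))  # 1.0
```

This is the "why do bank robbers rob banks" point in numbers: the vaccinated share of deaths tracks who is in the population, so it says nothing by itself about whether the vaccine works.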
00:40:05.200 All right.
00:40:05.560 That is my lesson
00:40:12.480 on cognitive dissonance
00:40:14.040 and confirmation bias.
00:40:16.340 Now,
00:40:17.620 I think I touched
00:40:20.300 all my incredibly
00:40:21.440 important points.
00:40:22.740 Yes, I did.
00:40:24.020 Yes, I did.
00:40:25.660 So,
00:40:27.660 let me look
00:40:30.140 at your comments.
00:40:31.460 Scott,
00:40:31.800 if you get COVID,
00:40:32.740 will you get Regeneron?
00:40:34.320 Yes, I will.
00:40:35.000 And do you know
00:40:37.100 why I'll get Regeneron?
00:40:38.940 Will it be
00:40:39.840 because Regeneron
00:40:41.000 was tested
00:40:43.360 in long-term trials?
00:40:46.800 Did anybody
00:40:48.160 do a long-term
00:40:49.360 test of Regeneron
00:40:50.760 as a treatment
00:40:52.280 for COVID?
00:40:53.820 What?
00:40:54.920 What?
00:40:55.360 There's no 10-year
00:40:56.160 study of Regeneron
00:40:57.500 for COVID patients?
00:40:59.420 Why would you
00:41:00.300 put that in your body?
00:41:02.140 It seems a little
00:41:02.960 experimental to me,
00:41:04.360 doesn't it?
00:41:05.440 So,
00:41:05.940 you would put
00:41:06.320 Regeneron
00:41:06.940 in your body
00:41:07.600 just because
00:41:08.540 all the experts
00:41:09.220 say it works,
00:41:10.840 even knowing
00:41:11.580 there are no
00:41:13.660 long-term
00:41:14.620 10-year trials
00:41:15.860 of Regeneron
00:41:16.720 for COVID.
00:41:18.740 You would put
00:41:19.220 that in your body?
00:41:20.640 I would.
00:41:22.220 Yeah,
00:41:22.580 I probably would.
00:41:24.440 But,
00:41:25.000 all of your options
00:41:27.360 are untested stuff.
00:41:29.200 Does everybody
00:41:29.760 get that?
00:42:30.220 Every one of your
00:41:31.880 options
00:41:32.280 is untested
00:41:34.100 for the long-term.
00:41:36.200 For the short-term,
00:41:37.280 you actually know
00:41:37.960 your vaccination
00:41:38.900 risk pretty well
00:41:40.040 because we have
00:41:41.840 millions and millions
00:41:43.060 of short-term
00:41:43.960 experiences.
00:41:45.440 It's the long-term
00:41:46.440 you're worried about,
00:41:47.200 right?
00:41:47.780 You're worried about
00:41:48.640 being infertile
00:41:52.160 or whatever.
00:41:52.600 Yeah,
00:41:56.460 same thing
00:41:56.980 with ivermectin,
00:41:58.160 same thing.
00:41:58.820 Now,
00:41:59.020 we do know
00:41:59.420 that a lot
00:41:59.720 of these
00:42:00.040 have been
00:42:00.640 safe
00:42:02.100 for other
00:42:04.720 conditions
00:42:05.320 for years.
00:42:06.880 But,
00:42:07.420 who exactly
00:42:07.940 was tracking
00:42:08.720 the long-term
00:42:10.180 effects of
00:42:10.880 ivermectin?
00:42:11.520 Is that a study?
00:42:14.540 Did somebody
00:42:15.000 ever do
00:42:15.340 a 10-year
00:42:15.900 study of
00:42:16.460 ivermectin
00:42:17.080 for the
00:42:18.900 other problems,
00:42:19.640 not for COVID?
00:42:21.640 I doubt it,
00:42:22.900 right?
00:42:23.820 Do you think
00:42:24.280 somebody did
00:42:24.760 a 10-year
00:42:25.340 study
00:42:25.720 to follow up
00:42:27.400 and see
00:42:27.680 if ivermectin
00:42:28.380 was hurting
00:42:28.920 anybody
00:42:29.360 10 years
00:42:30.560 later?
00:42:31.340 No,
00:42:32.300 no.
00:42:32.900 Didn't do it
00:42:33.380 for the
00:42:33.620 vaccinations.
00:42:34.920 Didn't do it
00:42:35.460 for anything
00:42:35.780 you're going
00:42:36.100 to put
00:42:36.280 in your
00:42:36.480 body.
00:42:39.540 Somebody says
00:42:40.240 maybe.
00:42:40.600 I don't think
00:42:40.940 so.
00:42:42.380 Merck is
00:42:42.820 coming out
00:42:43.320 with
00:42:44.000 therapeutic
00:42:45.040 to replace
00:42:46.940 ivermectin.
00:42:48.180 I'll bet.
00:42:50.940 We know
00:42:51.680 ivermectin
00:42:52.300 is not
00:42:52.700 100%
00:42:53.280 fatal
00:42:53.620 after 10
00:42:54.280 years.
00:42:54.540 Well,
00:42:54.840 we know
00:42:55.060 it's not
00:42:55.340 100%
00:42:55.800 fatal.
00:42:56.060 Yeah.
00:42:59.520 Yes,
00:43:00.100 ivermectin
00:43:00.640 has been
00:43:00.940 used for
00:43:01.420 10-plus
00:43:01.960 years in
00:43:02.360 other
00:43:02.560 countries.
00:43:03.040 Why does
00:43:03.400 that matter?
00:43:04.720 They weren't
00:43:05.180 studying it,
00:43:06.540 were they?
00:43:08.400 I don't think
00:43:09.620 any country
00:43:10.160 studied
00:43:10.720 how do
00:43:12.220 all the
00:43:12.560 people look
00:43:13.040 10 years
00:43:13.520 after taking
00:43:14.120 ivermectin?
00:43:15.980 Now,
00:43:16.580 what do you
00:43:17.100 think is
00:43:17.560 the period
00:43:18.200 beyond which
00:43:19.020 a vaccination
00:43:19.600 is almost
00:43:21.460 certainly telling
00:43:22.260 you everything
00:43:22.740 that you need
00:43:23.260 to know
00:43:23.560 about side
00:43:24.060 effects?
00:43:24.400 How long
00:43:25.820 do you think
00:43:26.140 that period
00:43:26.540 is?
00:43:27.540 In the
00:43:28.120 comments,
00:43:28.700 let's see
00:43:29.080 how well
00:43:30.300 informed you
00:43:31.100 are.
00:43:31.840 How many
00:43:32.360 days,
00:43:33.100 weeks,
00:43:33.480 months,
00:43:33.740 or years
00:43:34.220 after taking
00:43:35.840 a vaccination?
00:43:37.060 We'll say
00:43:37.420 vaccinations
00:43:38.000 in general.
00:43:39.980 How long
00:43:40.720 do you
00:43:40.960 wait before
00:43:41.780 you really,
00:43:42.640 really know
00:43:43.080 you've got
00:43:43.500 just about
00:43:44.100 all the
00:43:44.400 side effects?
00:43:46.660 Somebody
00:43:47.100 says two
00:43:47.560 weeks.
00:43:49.320 I'm looking
00:43:49.760 at all
00:43:50.100 your,
00:43:51.340 I see
00:43:51.860 one month,
00:43:53.180 one month,
00:43:53.920 20 years,
00:43:55.740 36 months,
00:43:56.760 15 minutes,
00:43:58.460 48 hours.
00:44:00.760 Why don't
00:44:01.560 you all know
00:44:02.100 the answer
00:44:02.500 to this?
00:44:05.460 This is
00:44:06.160 one of
00:44:06.440 those,
00:44:06.960 oh my
00:44:09.180 God,
00:44:09.740 this is
00:44:10.120 one of
00:44:10.360 those facts
00:44:10.940 that should
00:44:11.420 be the
00:44:12.620 most important
00:44:13.400 fact in
00:44:14.080 your decision
00:44:14.620 about getting
00:44:15.080 vaccinated.
00:44:16.360 If I had
00:44:17.120 to say
00:44:17.360 there was
00:44:17.640 one fact
00:44:18.580 that you
00:44:19.780 should know
00:44:20.400 to make
00:44:21.520 a decision
00:44:22.020 about
00:44:22.320 vaccination,
00:44:23.220 it would
00:44:23.860 be this
00:44:24.380 and none
00:44:24.720 of you
00:44:24.940 know it.
00:44:26.200 None of
00:44:26.620 you know it.
00:44:27.720 Not one
00:44:28.160 of you is
00:44:28.540 right.
00:44:30.420 None?
00:44:33.600 I'm
00:44:34.060 watching all
00:44:34.500 your answers.
00:44:35.020 I'm seeing
00:44:35.260 two years,
00:44:36.140 a lifetime,
00:44:37.080 15 minutes.
00:44:38.860 None of
00:44:39.260 you are even
00:44:39.680 close.
00:44:40.840 It's the
00:44:41.220 number one
00:44:41.640 most important
00:44:42.180 variable.
00:44:43.540 The number
00:44:44.320 one most
00:44:45.000 important variable
00:44:45.800 in your
00:44:46.400 personal decision
00:44:47.380 about whether
00:44:48.020 to get a
00:44:48.400 vaccination.
00:44:50.240 And not
00:44:50.660 a fucking
00:44:51.240 one of you
00:44:51.740 knows the
00:44:52.200 answer to
00:44:52.620 it.
00:44:53.660 This is
00:44:54.080 public
00:44:54.360 information.
00:44:55.900 I've seen
00:44:56.520 it a number
00:44:57.000 of times.
00:44:58.480 The answer
00:44:59.140 is two
00:44:59.460 months.
00:45:01.100 Two months.
00:45:03.300 After two
00:45:04.060 months,
00:45:05.140 the odds
00:45:05.780 of anything
00:45:06.320 bad happening
00:45:07.140 to you in
00:45:07.520 the long
00:45:07.860 term become
00:45:08.980 really small.
00:45:10.260 They don't
00:45:10.660 disappear,
00:45:11.980 but really
00:45:12.540 small.
00:45:13.480 And that
00:45:13.880 first 15
00:45:14.600 minutes tells
00:45:15.220 you a whole
00:45:15.600 lot, right?
00:45:16.540 That's why
00:45:16.860 they make
00:45:17.220 you wait.
00:45:18.020 But two
00:45:19.360 months,
00:45:20.400 typically for
00:45:22.000 all other
00:45:22.580 vaccinations,
00:45:23.360 that's where
00:45:23.660 problems appear.
00:45:26.660 Now,
00:45:27.820 I'm seeing
00:45:28.820 Robert laughing.
00:45:30.220 Now, I'm
00:45:30.620 not saying
00:45:31.040 that everything
00:45:33.320 has to fit
00:45:34.020 into that
00:45:34.460 model.
00:45:35.220 I'm saying
00:45:35.740 that if you
00:45:36.260 didn't know
00:45:36.840 that and you
00:45:37.720 made your
00:45:38.080 decision on
00:45:38.660 vaccinations,
00:45:39.460 that was a
00:45:39.800 very irrational
00:45:40.500 decision.
00:45:41.680 If you are
00:45:42.440 not aware
00:45:43.180 that two
00:45:44.720 months gets
00:45:45.400 you pretty
00:45:46.480 much everything
00:45:47.100 you need
00:45:47.500 to know
00:45:47.900 about side
00:45:48.820 effects,
00:45:49.480 not all
00:45:50.140 of it,
00:45:50.620 not 100%,
00:45:51.600 but close.
00:45:55.600 Oh, somebody
00:45:56.060 said two
00:45:56.460 months there.
00:45:57.000 Good for
00:45:57.280 you.
00:45:59.240 All right,
00:46:00.080 so I'm
00:46:01.960 kind of
00:46:02.220 blown away
00:46:02.660 by this.
00:46:03.400 I really
00:46:03.740 am.
00:46:04.100 I'm kind
00:46:04.680 of blown
00:46:05.000 away that
00:46:05.980 the most
00:46:06.480 important
00:46:06.920 variable,
00:46:07.540 nobody
00:46:07.800 knows,
00:46:08.360 it was
00:46:08.640 like one
00:46:09.060 person who
00:46:09.520 knew,
00:46:10.400 and that's
00:46:10.740 it.
00:46:10.900 I would
00:46:14.180 say that
00:46:14.540 was the
00:46:14.860 thing that
00:46:15.160 made my
00:46:15.520 decision.
00:46:17.140 I would
00:46:17.640 say I
00:46:18.000 based my
00:46:18.620 decision on
00:46:19.320 the thing
00:46:19.640 that almost
00:46:20.120 nobody knew
00:46:20.820 here, but
00:46:21.480 was public
00:46:21.980 information,
00:46:23.160 that after
00:46:23.620 two months
00:46:24.100 you know.
00:46:25.160 Remember I
00:46:25.740 told you in
00:46:26.120 the beginning,
00:46:26.700 people ask me
00:46:27.320 from day one,
00:46:28.120 am I going to
00:46:28.480 get the
00:46:28.780 vaccination?
00:46:29.900 Do you
00:46:30.100 remember what
00:46:30.460 I said?
00:46:31.860 Yes, but
00:46:32.520 not in the
00:46:32.840 first two
00:46:33.260 months.
00:46:35.140 Right?
00:46:36.420 And I
00:46:36.760 waited six
00:46:38.780 months?
00:46:39.500 I think I
00:46:39.920 waited six
00:46:40.420 months.
00:46:41.560 Or five
00:46:42.480 or something.
00:46:43.280 Before getting
00:46:43.860 vaccinated.
00:46:44.780 What was my
00:46:45.460 risk after
00:46:47.000 waiting five
00:46:47.800 months for
00:46:48.440 other people's
00:46:49.160 experience to
00:46:49.920 tell me if
00:46:50.360 it was
00:46:50.520 dangerous or
00:46:51.000 not?
00:46:51.620 Pretty low.
00:46:53.420 Pretty low.
00:46:54.580 I mean, I
00:46:54.900 don't know how
00:46:55.220 to calculate
00:46:55.700 it.
00:46:55.980 There's no
00:46:56.260 way to do
00:46:56.620 that.
00:46:57.240 But low.
00:46:58.780 Now, that's
00:46:59.580 low based on
00:47:00.420 living life and
00:47:03.020 seeing that
00:47:03.580 usually it's
00:47:04.580 two months or
00:47:05.180 so, tells you
00:47:05.820 everything you
00:47:06.260 need to know.
00:47:06.880 It doesn't
00:47:07.220 mean it's
00:47:07.520 true with
00:47:07.880 these.
00:47:08.720 I mean,
00:47:09.360 mRNA stuff might
00:47:10.460 be different.
00:47:12.080 But, oh,
00:47:15.420 and the
00:47:15.620 fact that
00:47:16.000 I'm fit and
00:47:17.760 healthy, but
00:47:18.280 I'm also
00:47:18.720 64 and I
00:47:19.760 have asthma.
00:47:20.900 So I'm on
00:47:21.560 the short
00:47:21.960 list of
00:47:22.400 people who
00:47:23.040 die from
00:47:24.180 COVID.
00:47:25.840 Right?
00:47:26.200 If you have
00:47:26.520 asthma, you're
00:47:27.180 in bad shape.
00:47:28.460 I'm pretty sure
00:47:29.160 the people who
00:47:29.680 died from it
00:47:30.300 had asthma and
00:47:31.160 were fat, if
00:47:32.380 we're being
00:47:32.700 honest.
00:47:34.100 Probably.
00:47:34.540 So, yeah.
00:47:41.760 Thalidomide.
00:47:44.300 Thalidomide would
00:47:44.980 be the
00:47:45.580 counter example
00:47:46.560 of something that
00:47:47.760 took a while for
00:47:48.660 people to realize.
00:47:50.580 But, correct me
00:47:52.160 if I'm wrong, I
00:47:53.760 believe that
00:47:54.460 current medical
00:47:55.840 processes, I
00:47:56.700 need a fact
00:47:57.160 check on this.
00:47:58.680 This is an
00:47:59.300 important one.
00:48:00.040 So if you could
00:48:00.960 really give me a
00:48:01.620 fact check on
00:48:02.140 this.
00:48:02.720 My understanding
00:48:03.480 would be that
00:48:04.280 we would not
00:48:04.940 make the
00:48:05.420 thalidomide mistake
00:48:07.100 today.
00:48:08.340 That we have
00:48:09.140 safeguards that
00:48:10.000 would have caught
00:48:10.500 that early.
00:48:11.700 In other words,
00:48:12.200 we wouldn't have
00:48:12.740 had to wait to
00:48:14.160 find out it was
00:48:14.740 bad.
00:48:15.600 Give me a
00:48:16.020 fact check on
00:48:16.560 that.
00:48:17.200 I think we
00:48:17.880 would have caught
00:48:18.840 that one.
00:48:19.780 If it happened
00:48:20.380 today, I think
00:48:21.000 we would have
00:48:21.360 caught it right
00:48:21.840 away.
00:48:23.200 But maybe not.
00:48:26.900 Yeah.
00:48:27.640 And let's say
00:48:28.520 you only had one
00:48:29.300 example in which
00:48:31.860 the vaccine did
00:48:32.940 have some bad
00:48:33.460 outcomes and
00:48:34.020 they were pretty
00:48:34.340 bad later.
00:48:39.780 Still, that
00:48:40.320 would give you
00:48:40.680 lots of
00:48:41.100 vaccinations that
00:48:41.940 didn't have that
00:48:42.560 problem.
00:48:43.540 So you'd still be
00:48:44.160 looking, well,
00:48:44.920 one did, but
00:48:45.720 20 did not.
00:48:47.060 One in 20,
00:48:47.840 5% chance.
00:48:53.440 Smoking a lot
00:48:54.100 of weed is
00:48:54.580 protecting your
00:48:55.160 lungs, you
00:48:55.560 say?
00:48:56.300 That might
00:48:56.900 actually be
00:48:57.420 true.
00:48:57.620 We don't
00:48:57.900 know.
00:48:59.460 Yeah.
00:49:01.240 And then, yeah,
00:49:02.220 I think there is
00:49:02.840 a separate
00:49:03.180 question about
00:49:03.820 pregnant women.
00:49:04.780 That's a whole
00:49:05.240 different calculation.
00:49:06.720 But, you know,
00:49:07.080 pregnant women
00:49:07.620 don't want to
00:49:08.020 have COVID
00:49:08.460 either, so.
00:49:12.240 Well, I think
00:49:12.940 locals' comments
00:49:14.040 are quite responsive,
00:49:15.080 whoever said
00:49:15.520 they're not,
00:49:16.580 because they're
00:49:17.340 in real time.
00:49:19.840 It's not a
00:49:20.640 vaccine.
00:49:21.800 It is a
00:49:22.400 vaccine.
00:49:23.160 Tetanus is a
00:49:23.780 vaccine, and you
00:49:24.600 need a booster
00:49:25.100 for tetanus, too.
00:49:25.940 All right.
00:49:31.820 40 booster,
00:49:33.000 what?
00:49:34.140 All right, I
00:49:34.700 got to run, and
00:49:35.500 I'll talk to you
00:49:36.260 tomorrow.
00:49:38.780 How many of
00:49:39.380 you...
00:49:41.940 I'm seeing
00:49:43.200 people suggesting
00:49:44.040 different drugs
00:49:45.700 or things that
00:49:46.420 may have had
00:49:47.180 long-term
00:49:47.840 consequences.
00:49:51.040 I'd love to
00:49:51.660 see a
00:49:52.460 summary of
00:49:53.160 that, a
00:49:54.000 summary of
00:49:54.840 any
00:49:55.820 meds or
00:49:56.380 vaccinations
00:49:56.920 that did
00:49:57.500 have long-term
00:49:58.320 consequences
00:49:59.340 in relation
00:50:00.300 to all the
00:50:00.920 meds that
00:50:01.300 have ever
00:50:01.600 been approved?
00:50:02.840 Is it
00:50:03.200 1%?
00:50:04.280 I don't
00:50:04.560 know.
00:50:05.600 5%?
00:50:06.600 I have no
00:50:06.960 idea.
00:50:08.560 All right.
00:50:11.340 I will
00:50:11.900 talk to you
00:50:12.500 tomorrow.
00:50:14.720 And I
00:50:14.820 see you...