Real Coffee with Scott Adams - September 28, 2021


Episode 1513 Scott Adams: Today's Show Will Be Mindbendingly Awesome


Episode Stats


Length: 56 minutes
Words per minute: 149.0
Word count: 8,471
Sentence count: 630

Harmful content:
Misogyny: 8 sentences flagged
Hate speech: 15 sentences flagged


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode, we talk about the Supreme Court and why it's not as popular as it used to be, and what it really means to be a good judge. And we discuss the dopamine hit of the day: The Simultaneous Sip.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 Point checklist, 13 of them checked, one of them not checked.
00:00:06.720 Guess which one I didn't check.
00:00:08.780 That would be the one where I turn around and take my notes off of the printer, which
00:00:13.320 don't exist.
00:00:15.020 So it looks like you're going to be watching me print my notes.
00:00:19.300 I don't know what could be more fun than that, really.
00:00:22.440 But it's going to be a great show today, I promise you.
00:00:25.280 By the way, I'm working on some kind of a drum sting to open up the show.
00:00:37.620 And so you might hear that.
00:00:40.760 You're probably familiar with the Tucker Carlson opening.
00:00:44.000 He's got a little drum thing.
00:00:46.800 And I'm trying to do something like that, like five seconds of a drum, some kind of a
00:00:53.860 drum thing.
00:00:55.280 But we'll see.
00:00:56.520 All right, so Boo the Cat is back.
00:00:59.780 And I've got 10 days of feeding her through a tube and medicating her.
00:01:06.480 Let me tell you how complicated it is to be my age and have a sick cat and a full-time job.
00:01:15.320 Let me just give you a sense of the complexity.
00:01:20.080 Now, you've probably had enough medical problems yourself that you know it's just a gigantic
00:01:27.160 problem to get anything solved.
00:01:30.100 There's just so many decisions and medications.
00:01:31.980 But I've got a cat with five different medications with five different schedules, and I'm counting
00:01:37.320 the food as a medication because it has to be injected.
00:01:40.860 And then you add my own, right?
00:01:44.160 If you're a certain age, you've probably accumulated a number of just ordinary medications.
00:01:49.620 I've got one for acid reflux, one for blood pressure.
00:01:56.140 And so I've got something like 13 medications with different schedules between the cat and
00:02:03.060 me that I have to juggle.
00:02:05.340 13 medications on different schedules every day.
00:02:11.400 Then I have to work two full-time jobs.
00:02:15.400 It's pretty ugly over here.
00:02:16.860 I haven't slept a lot.
00:02:17.800 Anyway, but you know what would be great?
00:02:20.120 I think you do.
00:02:22.280 What would be great?
00:02:24.360 The simultaneous sip.
00:02:26.080 Yeah.
00:02:27.020 And you're here, and you're ready for it.
00:02:28.880 And it's going to be great.
00:02:30.260 Probably one of the best ones ever.
00:02:32.220 I mean, I don't want to build it up too much, but I'm feeling it.
00:02:35.020 Are you feeling it?
00:02:36.440 Yeah, this one's going to be good.
00:02:37.860 And all you need is a cup or mug or a glass, a tank or chalice or stein, a canteen jug or flask,
00:02:41.440 a vessel of any kind.
00:02:43.380 Fill it with your favorite liquid.
00:02:46.760 I like coffee.
00:02:48.580 And join me now for the unparalleled pleasure.
00:02:51.900 The dopamine hit of the day, the thing that makes everything better.
00:02:55.840 And it really does.
00:02:58.160 If you haven't tested this yet, you've got to do a little A-B testing.
00:03:01.540 See how you feel with it.
00:03:02.460 See how you feel without the simultaneous sip.
00:03:04.460 Oh, you'll be surprised.
00:03:06.300 Here it comes.
00:03:07.320 You ready?
00:03:08.040 Go.
00:03:12.560 Yeah.
00:03:13.780 Oh, so good.
00:03:16.800 I pity the people who have not taken the simultaneous sip, because their lives are impoverished
00:03:22.760 in many ways.
00:03:23.980 Not just monetarily, but impoverished, I say.
00:03:27.560 Well, Rasmussen has a poll that says the Supreme Court is not too popular these days.
00:03:35.120 Only 10% think the Supreme Court is doing an excellent job.
00:03:38.760 23% say good.
00:03:40.020 So you only have 33% approval for the Supreme Court.
00:03:43.880 Now, if you had to look for one, let's say, identifier or signal that the country might
00:03:53.740 be in a little bit of trouble, I would look at the popularity of the Supreme Court.
00:03:58.660 Now, but I would add this.
00:04:06.440 I don't know if doing a good job is the right question for the Supreme Court, because the
00:04:12.940 Supreme Court is unique in that their entire purpose is to make decisions that you know
00:04:19.360 most of the country is going to hate, or a lot of the country is going to hate, not
00:04:22.780 most.
00:04:23.820 So maybe the question should have been, if anybody's listening to this from Rasmussen,
00:04:32.040 I'd love to see this question put in terms of credibility.
00:04:37.460 Credibility.
00:04:37.940 Now, that's a little bit different than coming up with the right answer according to you.
00:04:43.140 If you're willing to trust the Supreme Court, even if you don't like their decisions, then
00:04:49.420 you're in good shape.
00:04:50.820 Do you think we're there?
00:04:51.900 I feel like we are.
00:04:54.280 I feel like they're credible still, even though biased, right?
00:05:00.100 Because I don't think that they would do something that's, like, ridiculously bad.
00:05:03.700 They would do things you don't like, but a third of the country thinks is awesome.
00:05:08.480 You know?
00:05:08.960 I mean, that's not crazy.
00:05:10.620 So I feel like their credibility would be higher than their approval, because the approval
00:05:14.660 is really about, do you agree with their decisions?
00:05:16.460 Do you follow, on Twitter, Balaji Srinivasan?
00:05:21.900 What?
00:05:22.960 You don't?
00:05:24.360 Well, you should.
00:05:25.780 So Balaji's on my very short list of people that everybody should be following, because
00:05:31.180 there aren't that many independent thinkers in the world, and there are even fewer independent
00:05:35.500 thinkers who come up with ideas that you haven't thought of yourself.
00:05:38.360 It's kind of rare.
00:05:41.320 But Balaji does consistently.
00:05:43.740 So follow him.
00:05:44.740 I think his Twitter is just at Balaji, B as in boy, A-L-A-J-I.
00:05:52.500 And he tweeted this morning, employees should start demanding a 90-day cool-down period in
00:05:59.160 their contracts, such that they can't be precipitously fired due to passing social media
00:06:04.600 storms.
00:06:07.260 Sometimes it's a real issue.
00:06:08.680 If so, it'll still be real in 90 days.
00:06:11.640 Let cooler heads prevail.
00:06:13.640 What do you think of that?
00:06:15.460 What do you think of that idea?
00:06:16.660 It's pretty good, isn't it?
00:06:19.800 You know, if you buy into the idea that employees should be organized, at least in some important
00:06:27.140 ways, why isn't the union demanding this?
00:06:31.360 This feels like a just basic, right down the middle, union requirement.
00:06:39.160 So unions get on this.
00:06:40.480 I don't know if unions just maybe didn't think of it, but this seems like basic, really basic
00:06:46.840 employee protection, wouldn't you say?
00:06:49.320 Just really basic.
00:06:51.220 Because I don't think this is, the thing that I like about this idea is that the moment you
00:06:57.000 hear it, you wonder why it's not already being done.
00:07:01.360 You know, as soon as you hear it, you're like, uh, really?
00:07:04.820 This is the first time we've even talked about this?
00:07:06.840 This is obvious, once you hear it.
00:07:09.740 So I'll just put that out there.
00:07:11.380 Unions, maybe you could do something about that.
00:07:13.720 90-day cool-down period is a good idea.
00:07:17.220 You might remember that after Trump lost the election, I was predicting that you would
00:07:25.700 see people on the left hunting Republicans.
00:07:29.160 Do you remember what happened to me when I said that?
00:07:33.220 People said, oh my God, Scott, you are way out in crazy town left field.
00:07:40.900 And I'll tell you one thing that's not going to happen.
00:07:43.900 Nobody's going to be hunting Republicans, that's for sure.
00:07:47.440 Well, a story yesterday is breaking news.
00:07:51.600 An Antifa member, Benjamin Varela, allegedly, well, not allegedly, but he was charged with
00:07:58.660 allegedly shooting anti-mandate protester.
00:08:02.220 An anti-mandate protester.
00:08:05.060 Was the anti-mandate protester probably Republican?
00:08:08.860 Or at least, would this Antifa member believe that this person was probably Republican?
00:08:15.640 Yeah.
00:08:16.720 So is this a clear example of somebody on the left literally hunting a Republican?
00:08:24.240 Looking for somebody to shoot, and then shooting them because of their point of view.
00:08:29.140 It looks like that happened, allegedly.
00:08:31.100 We'll find out if it's real.
00:08:32.280 All right, the Taliban, big surprise.
00:08:37.960 They're not going to be allowing women to go to a Taliban, or to the Kabul University.
00:08:46.800 And I assume this would apply to other universities, or maybe it's the only one.
00:08:50.120 I don't know.
00:08:50.900 Are there a lot of universities in Afghanistan?
00:08:54.480 Maybe just the one?
00:08:55.960 I don't know.
00:08:56.400 But the Taliban says that women will not be allowed until they can Islamic it up.
00:09:04.060 So they will be allowed later, they say, but not until they can make the environment somehow
00:09:09.660 more compatible with Islam, which they're not.
00:09:12.280 But in the meantime, they have a good solution.
00:09:14.800 And I think you'll appreciate this.
00:09:17.760 It's sort of like, well, I'll tell you what it's like after I tell you what it is.
00:09:22.560 They're going to use male lecturers for the women, so women will be able to attend classes
00:09:29.700 in some cases.
00:09:31.280 But there aren't enough female lecturers, so they're going to hire men, but since it would
00:09:37.460 be apparently un-Islamic, according to the Taliban, to have the men teaching the girls
00:09:44.600 directly, the men will stand behind a curtain.
00:09:48.000 So the man will be there in person, but behind a curtain.
00:09:52.840 And I thought to myself, I don't feel like I'm nearly as inclusive enough in my live stream
00:09:59.420 here as I could be, because I realized how the women in Afghanistan would not be able
00:10:05.720 to watch me.
00:10:07.840 Because same situation, right?
00:10:10.540 You know, man.
00:10:12.020 I don't know if they could.
00:10:13.040 So I wanted to give you an example of what I call Taliban Zoom.
00:10:18.320 So this is, you know, you know what Zoom is.
00:10:20.780 So the Taliban is going to do a version of remote learning, except it's a little bit of
00:10:28.320 a simpler model instead of the technology and stuff.
00:10:31.440 Sort of this.
00:10:34.800 Hey.
00:10:36.020 Hey.
00:10:36.540 Welcome to Taliban Zoom class.
00:10:39.120 I'm your instructor, Scott Adams.
00:10:41.080 And you can't see me, but trust me, I am totally behind this curtain.
00:10:47.140 I'm not sitting at home on a computer.
00:10:49.360 I'm behind this curtain, giving you a Zoom class.
00:10:52.520 So Taliban ladies, pay attention.
00:10:56.700 So I think that would work pretty well.
00:11:03.880 Taliban Zoom class.
00:11:05.800 Scene.
00:11:06.280 All right.
00:11:08.620 Here's one of my weirdest predictions that looks like it might come true.
00:11:15.260 By the way, how many of you know the inside joke of the plaid blanket?
00:11:22.000 If you know the inside joke, don't tell anybody.
00:11:26.240 Don't put it in the comments.
00:11:29.320 Yeah.
00:11:29.580 I mean, you can refer to it, but just don't give away the reveal.
00:11:33.060 All right.
00:11:35.320 It turns out that Bitcoin miners are looking to nuclear power plants to power their Bitcoin mining.
00:11:43.800 Now, those of you who don't follow cryptocurrency, here's the quick lesson.
00:11:48.180 In order to create a new Bitcoin, which is created through a process of brute force computing,
00:11:58.040 where it follows an algorithm, a formula, if you will, and only once every, you know, who knows how long,
00:12:05.360 it depends on your computing power, you can discover a series of, I don't know,
00:12:11.180 let's say a series of bits that Bitcoin recognizes as a coin.
00:12:14.860 Is that crude enough, an explanation?
00:12:18.780 So, in other words, you can kind of discover Bitcoins hidden in the math.
00:12:25.100 All right.
00:12:25.320 I'm giving you the real idiots version of this, you know, so the crypto people are going crazy right now.
00:12:30.740 No, that's not quite accurate.
00:12:32.140 But just for the, you know, the every person explanation, Bitcoins are hidden in math,
00:12:38.700 and in order to tease them out and own them, you have to do something called mining,
00:12:44.220 which is running a powerful computer or network of computers for long periods of time,
00:12:50.020 and the more Bitcoins are found, the harder it is to find the next one.
00:12:53.880 So every Bitcoin that you find creates a higher challenge for the next available one.
00:12:59.740 So you've got to get more and more computing power.
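The brute-force "mining" described above can be sketched as a toy proof-of-work loop. This is an illustration only, not Bitcoin's actual protocol: real mining double-SHA-256-hashes an 80-byte block header against a network-adjusted difficulty target, whereas this sketch just looks for leading zero hex digits.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Brute-force a nonce whose SHA-256 digest starts with
    `difficulty` zero hex digits -- a toy stand-in for proof-of-work.
    Each extra zero digit multiplies the expected work by 16, which
    mirrors how each found coin raises the challenge for the next."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce  # "discovered" a valid coin
        nonce += 1

# Bumping difficulty from 3 to 4 makes this roughly 16x slower,
# which is why miners keep needing more computing power (and energy).
print(mine("toy block", 4))
```

Raising `difficulty` is the knob that turns this from a laptop exercise into something that makes data centers court nuclear power plants.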
00:13:02.420 And it takes so much computing power to find a Bitcoin now that it's a drag on climate.
00:13:10.400 The climate is actually at risk if you accept that humans are causing climate change.
00:13:16.300 So there's so much energy that they need that they're talking to,
00:13:18.860 the Bitcoin miners are talking to nuclear power plants to use their excess nuclear power.
00:13:24.340 Because I guess even nuclear power plants will generate a little bit more than they need.
00:13:29.200 Makes sense, right?
00:13:30.220 They don't want to have only just enough.
00:13:32.520 So power plants are going to have more power than they need on a regular basis.
00:13:37.500 But of course, it'd be easy to turn off the Bitcoin part if you ever got in trouble, right?
00:13:41.620 So it's kind of a perfect marriage.
00:13:44.000 You put the Bitcoin mine close enough to take advantage of the nuclear energy capacity,
00:13:50.820 and suddenly you've got free Bitcoins.
00:13:54.360 Or not free, but, you know, way cheaper.
00:13:57.740 So I ask this question.
00:14:00.900 Can Bitcoin ever become big enough, in terms of its economic potential,
00:14:07.600 to pay for nuclear power?
00:14:11.360 In other words, is there anybody right now who's putting on the drawing board
00:14:15.700 a combination nuclear power plant, maybe Gen 3 or Gen 4,
00:14:22.200 right next to, or at least close enough, to a Bitcoin mine?
00:14:27.540 Is anybody looking to build both of those at the same time?
00:14:30.540 Because wouldn't the Bitcoin mine actually pay for the whole power plant?
00:14:35.620 Or would it?
00:14:37.520 So this is my question.
00:14:39.300 If you were to do a project that had, let's say, I don't know,
00:14:42.260 10 to 20 years to build a proper nuclear plant,
00:14:45.700 if you knew it was going to take you, let's say you got it down to 15 years,
00:14:49.480 you were doing great, which I think is a stretch.
00:14:52.440 But if you get it down to 15 years,
00:14:54.620 could you make a 15-year economic prediction
00:14:58.580 that if you had relatively free energy,
00:15:03.100 you could pay for the entire nuclear plant just with Bitcoins?
00:15:08.040 I'm thinking not, because I think that 15 years makes it so hard to get a new Bitcoin
00:15:15.560 that even a nuclear power plant wouldn't be enough to get you a new one.
00:15:20.180 But maybe you only need one, right?
00:15:23.640 Maybe Bitcoin's worth $20 billion in 15 years.
00:15:28.060 Maybe one Bitcoin is worth $20 billion.
00:15:30.960 Right?
00:15:31.480 Is there anybody who knows enough about this area to tell me that that's crazy?
00:15:37.080 Is it crazy that one Bitcoin could be worth $15 billion 20 years from now?
00:15:43.560 I think that's possible, right?
00:15:45.760 Because we're only using fractions of them anyway.
00:15:48.980 Anyway, I'll just put that out there as an interesting thing that might be happening.
00:15:52.920 Russell Brand made news today by being not crazy.
00:16:04.280 What?
00:16:05.580 Did you know that you could make news, national news?
00:16:08.880 He's trending all over Twitter.
00:16:12.840 Eric, I see that comment.
00:16:15.720 He's trending all over Twitter for simply being aware of the news.
00:16:21.140 Apparently, he is aware of the news that Hillary Clinton was the real person behind the Russia conspiracy.
00:16:27.780 Not Trump colluding with Russia, but rather Hillary Clinton was actually the architect of the Russia conspiracy collusion thing.
00:16:38.420 And Russell Brand did a show with Glenn Greenwald, who's, I would say, the most important voice on this topic lately.
00:16:47.280 And the big news is that Russell Brand actually accurately reported a story that the left is largely blind to, but the right largely knows.
00:16:58.820 Now, here's the question.
00:17:01.460 We know that Russell Brand identifies with the left.
00:17:05.740 He would call himself a liberal, I guess.
00:17:08.420 And why is it so unusual that he can simply see a story that's in the news?
00:17:15.060 He's not making stuff up.
00:17:18.700 He's simply objectively looking at the news, and he actually can see it, and he can talk about it.
00:17:25.900 How many people on the left could do that?
00:17:28.580 How many people could hear that news, that Hillary Clinton was always the one behind the Russia collusion stuff,
00:17:34.540 and just report it straight?
00:17:37.480 Just the facts.
00:17:39.200 Almost nobody.
00:17:40.740 Almost nobody.
00:17:41.620 And the reason is cognitive dissonance.
00:17:44.420 Because if you are so committed to a side, it's just hard to change.
00:17:47.540 You'll find some weird rationalization why it really was Trump talking to Putin after all,
00:17:54.820 even though there's no evidence of that whatsoever.
00:17:57.100 All right?
00:17:57.660 So, yeah, Bill Maher is another one who's awake to these things.
00:18:03.320 Now, here's my question.
00:18:04.200 What makes a Bill Maher or a Russell Brand capable of avoiding cognitive dissonance or confirmation bias in this case?
00:18:16.520 What is it that makes them able to do that?
00:18:18.760 What have they done or what do they have that allows them to be immune?
00:18:24.200 They've actually got immunity to cognitive dissonance.
00:18:27.680 Well, I don't know, but I'll give you a few, some speculation.
00:18:31.040 I mean, some of them might be genetic, you know, their brain is just built a different way.
00:18:35.400 That could be.
00:18:36.580 But I would suggest the following.
00:18:39.100 Number one, think of these three people who all have the same quality.
00:18:43.720 They seem to be able to actually just objectively look at stuff on left or right.
00:18:47.940 Bill Maher, Russell Brand, and I'll throw Glenn Greenwald in there.
00:18:53.600 What quality do they all share?
00:18:57.000 I want to see if you can get this.
00:18:58.320 What quality do they share?
00:19:00.540 Contrarian habits.
00:19:01.560 That's a good one, yeah.
00:19:02.620 So they do have enough of a history of contrarianness that they can be consistent being contrarians.
00:19:10.640 So that's actually a really good answer.
00:19:12.820 It wasn't what I was looking for, but that's maybe better than my answer.
00:19:16.700 Comedy?
00:19:17.380 Well, Glenn Greenwald, sort of.
00:19:19.280 You know, indirectly, maybe.
00:19:21.540 Fired from the mainstream?
00:19:24.220 Well, okay.
00:19:25.880 Yeah, fired from the mainstream.
00:19:27.320 Was Russell Brand ever fired from the mainstream?
00:19:30.080 I don't know about that one.
00:19:31.420 Craving attention.
00:19:32.780 Not bad.
00:19:34.200 These are actually really good hypotheses.
00:19:36.420 Somebody says craving attention.
00:19:38.800 Yeah, yeah.
00:19:39.700 I mean, but would they get it that way?
00:19:42.200 It's a...
00:19:42.920 I mean, they could get attention other ways.
00:19:45.720 They're fact-based, but why?
00:19:47.140 Why can they be fact-based, and why can they be honest, but other people are in cognitive dissonance?
00:19:52.260 I'll tell you, I'll tell you the shrooms.
00:19:58.540 They all took shrooms.
00:20:00.040 Oh, you magnificent bastard.
00:20:02.520 Somebody says that they all took mushrooms.
00:20:05.740 I'll bet that's true.
00:20:08.320 I'll bet that is true.
00:20:10.120 I mean, if I had to guess from, you know, Bill Maher and Russell Brand, I'd say, you know, if I had to guess, probably more yes than no.
00:20:19.500 Glenn Greenwald?
00:20:20.940 I don't know.
00:20:22.380 I don't know.
00:20:22.940 That would be an interesting...
00:20:23.640 I'd love to ask him that question.
00:20:25.560 I don't know.
00:20:26.780 But fearless is very close to it.
00:20:30.920 So Tony is saying fearless.
00:20:32.320 Here's the answer I was looking for.
00:20:34.960 Immune from embarrassment.
00:20:38.980 Immune from embarrassment.
00:20:41.760 Now, we can't know what's in their head, right?
00:20:43.660 So that's a little bit mind-reading.
00:20:44.980 So we have to do it observationally and say, does that look right?
00:20:49.060 Because we're just guessing what they're thinking.
00:20:51.180 But if you look at Bill Maher, Glenn Greenwald, and Russell Brand, I would speculate, I don't know this for sure, that they're unusually free from worrying about being embarrassed.
00:21:07.440 It has something to do with the jobs they've chosen, right?
00:21:10.100 If you're not afraid of being embarrassed, you can say anything.
00:21:13.660 You're free.
00:21:15.060 So you don't have to worry about covering up for that thing you used to say to make it all sound like your ego is intact.
00:21:21.460 And you were always smart, even if you were dumb.
00:21:24.200 They also have chosen jobs in which proving that they have been wrong is actually an asset.
00:21:31.220 Same with me.
00:21:32.040 I've chosen a job, if you can call whatever this is, a job, sort of.
00:21:37.620 I've chosen one in which if I am completely wrong about something in public, like really seriously wrong, that's content.
00:21:47.800 I would love that.
00:21:49.380 I would love to find out how wrong I am about something I've always thought was true.
00:21:54.340 Because that, to me, is exhilarating.
00:21:56.100 So am I likely to suffer cognitive dissonance when I'm exhilarated to find out I'm wrong?
00:22:03.700 No.
00:22:04.500 It's immunity.
00:22:05.820 So I have the same immunity.
00:22:08.260 I'm speculating here, right?
00:22:09.940 So bear with me that I'm trying to teach you a concept that may not apply to any of these three individuals.
00:22:17.000 Yeah, I can't read their minds.
00:22:17.940 But I think that the things that give you immunity to cognitive dissonance is you have to learn to be excited when you're wrong in public.
00:22:27.800 Excited instead of embarrassed.
00:22:30.340 You have to be a little bit fearless.
00:22:33.800 And it helps that you've been battered before.
00:22:37.340 It helps that you've survived a number of shames and embarrassments, as I have.
00:22:41.780 So I do think that there is a formula for being free of cognitive dissonance.
00:22:48.580 And whether these people did it intentionally or it's just how things turned out,
00:22:53.740 you have this small group of people who literally doesn't seem to be affected as much by the things that are affecting other people.
00:23:02.240 Now, here's another one I'm going to add to this.
00:23:05.380 Joe Rogan has a video.
00:23:06.780 And I couldn't tell from the video.
00:23:09.440 This is on the Internet today.
00:23:11.040 If it was a trailer for a movie or was it just a video meme, I wasn't sure what it is.
00:23:19.560 But the nature of it is Joe Rogan talking on his show, and they're taking clips from it,
00:23:25.620 in which he's talking about how freedom is the basic operating system that makes everything work in the United States.
00:23:32.960 And as soon as you start taking freedom away, then everything falls apart, like the thing that makes us great.
00:23:39.720 Now, what makes the video strong is the way he does it.
00:23:43.760 His presentation is really impressive.
00:23:46.460 And here's my question for you.
00:23:53.300 Ready?
00:23:53.840 Here's a fun question of the day.
00:23:55.260 Does it feel to you that there's some kind of a 1776 kind of thing forming?
00:24:04.700 You know what I'm talking about?
00:24:06.540 Remember the founders of the country?
00:24:08.760 So you had your Washingtons and your Jeffersons and your Hamiltons and your Franklins and stuff.
00:24:15.360 Who are those people today?
00:24:18.980 It's Joe Rogan, right?
00:24:20.080 Like, if you had to map, who are the, like, the founders who need to reset the United States?
00:24:32.840 Because it feels like we need a tuning, doesn't it?
00:24:35.980 Like we're in a tune.
00:24:37.020 We're like an instrument.
00:24:38.380 We're like an instrument that was really in tune for decades.
00:24:42.940 But now it's out of tune.
00:24:44.380 Something's wrong.
00:24:45.460 And it needs to get fixed, retuned.
00:24:50.080 Which, can you map today's, let's say, the independent pundits?
00:24:56.980 And I'm going to take Alex Jones off the list.
00:25:00.760 Because as awesome as Alex Jones is in many ways, as an entertainer, et cetera,
00:25:05.840 I don't feel like he quite fits this model that I'm talking about.
00:25:09.960 He's a little too provocative.
00:25:11.740 So, I mean, he's a special case.
00:25:14.080 But could you map the current pundits, the voices that you hear, the independent voices,
00:25:20.440 could you map them to the founders?
00:25:22.600 Like, who would be George Washington?
00:25:26.580 It might be Joe Rogan.
00:25:29.440 It might be Joe Rogan.
00:25:31.380 It might be George Washington.
00:25:33.920 Who is Benjamin Franklin?
00:25:38.980 Who is Benjamin Franklin?
00:25:44.500 Who is Jefferson?
00:25:48.120 Who is Hamilton?
00:25:50.700 You could almost see that there's a thing forming.
00:25:55.780 It's like the founders are reforming.
00:25:59.960 Now, I have a last name which has some historical relevance to the revolution.
00:26:09.520 We're related, but I'm not a descendant.
00:26:12.280 I think I'm a distant cousin or something.
00:26:15.060 But isn't it interesting that there's something forming that seems to be the right counterbalance
00:26:22.660 to whatever looks like is the tuning problem with the country?
00:26:30.800 And it's time to go to the whiteboard.
00:26:33.000 Now, those of you who are watching on YouTube right now, you didn't know there was going to be a whiteboard, did you?
00:26:42.320 Yeah, yeah, you would have been twice as excited if I told you that up front.
00:26:45.820 But there's going to be a whiteboard.
00:26:47.680 People and locals already knew it because they get a preview before you do.
00:26:52.000 But here is my hypothesis.
00:26:54.660 There is a gigantic, wide path open for a candidate who wants to end the division in the country
00:27:07.340 and to bring us to, let's say, a new level of 1776-like freedom.
00:27:15.580 And I'm going to suggest that there exists, completely by coincidence,
00:27:23.440 this is completely a coincidence,
00:27:24.700 that most of our divisive topics have a middle ground that both sides would agree to.
00:27:34.980 That sounds crazy, doesn't it?
00:27:37.140 If I told you that given the huge disparity and, like, the divisions in this country,
00:27:42.940 you'd say to yourself, my God, we're separated and it's getting worse.
00:27:46.740 Remember one of the things I taught you?
00:27:51.080 That sometimes you can't tell the difference between being on the edge of disaster
00:27:54.760 and on the edge of the golden age?
00:27:57.880 They feel the same, right?
00:27:59.780 Darkest before the dawn situation, right?
00:28:03.320 We have blundered into a situation, just blundered,
00:28:07.080 into a situation in which a middle-of-the-road candidate could satisfy everybody.
00:28:13.500 And I'm going to make that case.
00:28:14.840 Do you think so?
00:28:15.360 A middle-of-the-road candidate, and I don't know who it would be.
00:28:18.100 I mean, I'm not talking about a Trump, obviously.
00:28:20.760 I'm not talking about Bernie Sanders, obviously.
00:28:23.100 But somebody could emerge.
00:28:26.080 And let me tell you what that would look like, all right?
00:28:29.320 Take the election.
00:28:31.840 On the left, you've got people who say the election was fine, stop complaining.
00:28:36.040 And on the right, you've got people who say it was a fraud.
00:28:40.540 But it's also history.
00:28:42.080 So how could a candidate who wants to bring the country together come up with a plan that makes the left and the right happy?
00:28:50.600 It looks like this.
00:28:52.540 Hey, there's one thing we can agree on.
00:28:55.180 The one thing we can agree on is that we didn't agree about the last election.
00:29:00.320 Right?
00:29:00.580 We can all agree that we didn't agree about the last one.
00:29:04.900 So let's fix the next one.
00:29:07.040 And I'll make it my main thing to have election reform and set up rules that guarantee the states have to do it right.
00:29:16.440 They can still do things differently, but they have to meet a certain standard of auditability.
00:29:24.460 Right?
00:29:24.840 Now, would you vote for that candidate?
00:29:27.160 The one who says, you know, whether you think that 2020 was fair or unfair, it's also over.
00:29:34.200 It's also done.
00:29:35.980 So let's fix it for next time.
00:29:38.240 You would immediately like that candidate.
00:29:40.700 Right?
00:29:41.960 Because they found the middle ground.
00:29:43.760 It's exactly what you wanted.
00:29:44.940 Let's do another one.
00:29:47.120 How about the climate?
00:29:48.560 You've got people on the right say, hey, it's a hoax, or it's no problem at all.
00:29:52.540 People on the left, it's a crisis.
00:29:54.420 How could you possibly integrate it's a crisis with it's not a problem?
00:30:00.580 Nuclear energy.
00:30:02.120 Because it's the same solution whether it's a hoax or not.
00:30:05.560 You need the energy.
00:30:07.020 You want clean energy and clean air.
00:30:09.420 So you're going to do nuclear energy as hard as you can, no matter what.
00:30:13.380 You don't even have to decide if climate is real.
00:30:16.680 Just find the middle path and say, look, I'll make both of you happy.
00:30:19.880 We'll just go balls to the wall on nuclear.
00:30:22.280 The left likes it.
00:30:23.320 Biden likes it.
00:30:24.400 The right likes it.
00:30:25.500 Why are we even having a conversation?
00:30:27.820 We have to do the same thing whether climate change is what you think it is or not.
00:30:33.840 It's exactly the same path.
00:30:35.940 Boom.
00:30:36.160 Let's do another one.
00:30:38.500 Infrastructure bill.
00:30:39.740 The public just wants an infrastructure bill, according to polls.
00:30:44.120 So this one's easy.
00:30:45.560 Just give the public what they ask for.
00:30:47.720 Be in favor of infrastructure only, not the big one that's going to change society overall.
00:30:54.740 And satisfy the people who think it's just a giant power grab by just saying, oh, it's just infrastructure.
00:31:01.260 Now, would you be able to get that passed through Congress?
00:31:05.440 I don't know.
00:31:06.320 But if you ran for president on trying, you would look pretty attractive.
00:31:11.120 Because the public is on the same page.
00:31:13.480 The public says, can you just give me a bill that's on this topic?
00:31:18.160 And then we'll vote on it.
00:31:19.240 And then we'll do the other topic.
00:31:20.700 The public's already there.
00:31:22.040 You don't have to convince anybody.
00:31:23.240 Just take the view that the public already has.
00:31:27.020 Just adopt it.
00:31:27.780 How about schools?
00:31:30.980 We've got the right who likes their homeschooling and having their freedom to teach their children the way they want.
00:31:36.580 Without all this social justice stuff.
00:31:40.700 But then on the left, you've got the people who believe in systemic racism.
00:31:45.400 That's what that says down there.
00:31:47.080 Systemic racism.
00:31:48.540 So how do you integrate them?
00:31:50.100 Simple.
00:31:50.920 They all agree that the teachers' unions are the problem.
00:31:53.980 So just go after the teachers' unions.
00:31:55.500 The left and the right have different issues, but they both have the same solution.
00:32:03.660 Neuter the teachers' unions.
00:32:05.760 Same solution.
00:32:06.960 Two different worldviews.
00:32:08.500 But same solution.
00:32:09.920 Easy to unify.
00:32:11.440 Like, moronically easy to unify the country.
00:32:15.020 In fact, I think it's so easy to unify the country that we're blind to it.
00:32:20.420 That we're just not that...
00:32:21.820 We're not really that far on different pages.
00:32:23.760 We have different views of what's going on, but weirdly, the solutions are the same.
00:32:29.340 No matter what you think is going on.
00:32:32.080 So, here's some more.
00:32:40.640 You're worried about vaccine mandates?
00:32:42.600 We're not going to talk about it.
00:32:44.180 I'm not going to talk about it.
00:32:45.340 I know you've warned me you're sick of vaccines.
00:32:48.880 I'm just saying it's an issue.
00:32:50.440 How could you bring the left and the right together on vaccine mandates?
00:32:56.440 Here's how you do it.
00:32:57.880 Leave it to the insurance companies.
00:33:00.500 That's it.
00:33:02.040 Just take the government out of it and just leave it to insurance.
00:33:06.360 Do you know what would happen?
00:33:08.100 Same thing that happens with everything.
00:33:09.900 Same thing that happens with every topic.
00:33:14.920 Insurance companies will decide how much risk you can take.
00:33:19.720 Or they'll charge you for it.
00:33:21.980 That's it.
00:33:22.820 The obvious thing that would happen is insurance companies would charge a different premium for vaccinated versus unvaccinated people.
00:33:30.280 Now, they probably should do something for people who have natural immunity as well.
00:33:35.560 But I'm pretty sure that the government could just walk away from this question.
00:33:40.780 And the insurance industry would just say, well, you don't have to get vaccinated.
00:33:45.800 It's your body.
00:33:47.400 But you do have to pay more.
00:33:50.060 Now, do you think that that's right?
00:33:52.320 Doesn't matter.
00:33:53.380 They're going to do it anyway.
00:33:54.920 Insurance companies are just going to do the math the way they want to do it.
00:33:58.540 Your opinion doesn't matter.
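The pricing logic described here can be sketched as a toy expected-loss calculation. This is a hypothetical illustration only: the hospitalization probabilities, claim size, and loading factor below are invented for the example, not real COVID or insurance figures.

```python
# Hypothetical actuarial sketch: premium = expected claim cost times a loading
# factor for overhead and profit. All numbers are made-up illustration values.
def premium(p_hospitalization: float, avg_claim: float, loading: float = 1.25) -> float:
    """Expected annual claim cost, marked up by a loading factor."""
    return p_hospitalization * avg_claim * loading

vaccinated = premium(p_hospitalization=0.002, avg_claim=40_000)    # 100.0
unvaccinated = premium(p_hospitalization=0.010, avg_claim=40_000)  # 500.0
print(f"surcharge: ${unvaccinated - vaccinated:,.2f}")  # surcharge: $400.00
```

The point of the sketch is only that once an insurer assigns different risk probabilities to two groups, a premium difference falls out of the arithmetic automatically, with no government mandate involved.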
00:34:00.680 How about my opinion that I should not have to pay more for car insurance just because I'm a male?
00:34:06.760 Because I happen to be a very safe driver.
00:34:09.160 So it's very unfair for me, right?
00:34:11.360 But I don't bitch about it too much because the insurance company did their math.
00:34:15.620 They have to provide insurance in a way that they can make money.
00:34:18.840 That's the only way they can do it.
00:34:20.680 So I kind of live with it.
00:34:22.220 So get the government out of the business and let the free market decide what we do.
00:34:28.580 The free market works every time, doesn't it?
00:34:31.960 Well, that's an exaggeration.
00:34:33.680 But it works often.
00:34:35.620 So how about another one?
00:34:37.240 Health care.
00:34:38.560 The left would like universal health care.
00:34:40.920 The right would rather just have competition and do what you, you know, you're on your own.
00:34:45.800 I feel like there's definitely a middle ground in which we say our objective is to get everybody insurance.
00:34:52.680 We don't know how to do it in a way that makes everybody happy.
00:34:56.200 So I think there should be a poor person's plan.
00:34:59.580 I've been saying this for years.
00:35:00.960 The government needs a poor person's plan.
00:35:03.700 What would that look like?
00:35:05.480 Mostly it would look like removing regulations.
00:35:09.540 Now you like it, don't you?
00:35:11.520 If I said to you we're going to insure all the poor people and you're going to pay for it,
00:35:15.820 you're like, oh, I don't know.
00:35:17.540 I like poor people having insurance, but I don't want to pay for it.
00:35:20.400 But if I said to you that poor people would have insurance just by removing regulations,
00:35:28.940 let me give you an example.
00:35:31.040 The regulation against telehealth over the phone across borders.
00:35:37.740 Just get rid of that.
00:35:39.180 Then suddenly you have all kinds of competition for doctors because they can do it over the phone.
00:35:44.920 Now there's still a physical manipulation part.
00:35:47.480 Somebody has to give you the shot or put on the Band-Aid or set the bone or whatever it is.
00:35:52.660 So you still need people in person.
00:35:54.980 So you make one other regulatory change.
00:35:59.080 The regulatory change is that maybe nurse practitioners or nurses can do a lot of the physical stuff
00:36:05.560 that's maybe a little bit more than they did before.
00:36:08.340 Maybe while the doctor's also still on the phone.
00:36:11.400 I mean, you can work it out.
00:36:12.300 But it seems to me that if you made use of, let's say, excess capacity,
00:36:18.040 which is, let's say, getting an MRI at midnight,
00:36:21.540 do you think that the demand for MRIs at midnight is the same as it is during the day?
00:36:27.720 Probably not.
00:36:28.720 So if you're a poorer person, maybe you've got to get the 2 a.m. MRI.
00:36:34.240 And you pay way, way less.
00:36:35.840 So the point is, a middle-of-the-road candidate has all the space in the world
00:36:42.440 to create solutions that the left and the right go, hey, that's not bad.
00:36:48.100 Now, I don't know that there's any candidate who could pull this off.
00:36:50.920 I don't think that this stuff is compatible with the left or the right.
00:36:54.180 I feel like a Republican could pull it off better.
00:36:57.800 Am I wrong?
00:36:58.560 I feel like this is a little bit more rustic.
00:37:05.400 What are you talking about?
00:37:06.780 It says, Scott, please stop.
00:37:10.820 Stop what?
00:37:15.080 You've got to do better than please stop, really.
00:37:19.140 You really have to do better than that.
00:37:20.700 Just up your game a little bit.
00:37:21.820 Just give me a little taste of what you don't like about it.
00:37:26.340 Just anything.
00:37:26.820 All right.
00:37:28.560 So that is that.
00:37:34.000 California dismissed 124,000 marijuana convictions,
00:37:37.740 or they will after this latest batch.
00:37:40.480 124,000 Californians had their lives ruined
00:37:44.700 because they smoked some marijuana and got caught.
00:37:47.500 They probably were small dealers, too.
00:37:49.640 But this is good news.
00:37:53.520 124,000 people just got their life back.
00:37:57.400 That's really big.
00:37:59.480 So California does some things right.
00:38:06.180 CNN had a report that says that misinformation
00:38:11.220 gets six times more clicks, at least on Facebook,
00:38:15.660 than real information.
00:38:16.900 So misinformation gets six times more clicks.
00:38:19.980 Big problem, huh?
00:38:22.240 And then they went further and said this.
00:38:25.360 68% of far-right posts are misinformation.
00:38:28.220 And 36% of misinformation is on the left.
00:38:33.400 So two-thirds of misinformation comes from the far right.
00:38:38.780 Do you believe that?
00:38:40.500 Two-thirds of misinformation comes from the far right,
00:38:45.680 according to CNN, according to some study about Facebook.
00:38:49.160 How does that...
00:38:51.160 Now, try to...
00:38:53.320 Imagine that's true.
00:38:55.000 Assume it's true.
00:38:55.880 68% of the far right posts are misinformation.
00:38:59.220 How does that jibe with what Bill Maher was just recently talking about,
00:39:05.180 that people on the right are far better informed about the risks of COVID?
00:39:11.140 How does that...
00:39:13.980 How can it be true that 68% of the far right posts are misinformation,
00:39:19.100 but the far right is far more informed, better informed?
00:39:22.860 Explain those two facts.
00:39:24.240 Somebody says probably the opposite.
00:39:28.840 Okay, maybe the data's just wrong.
00:39:30.340 That's possible.
00:39:33.560 But how could this both be true?
00:39:37.200 Do you know how it could both be true?
00:39:40.800 The conservatives don't believe everything they click.
00:39:44.780 Right?
00:39:46.140 So if you're talking about what they clicked,
00:39:49.500 the conservatives are clicking like crazy on false information.
00:39:54.240 But apparently, they're not believing it.
00:39:56.880 Because in the end, they have better information
00:39:58.920 after looking at six times...
00:40:00.740 Well, after looking at way more misinformation,
00:40:04.340 they still have a clearer idea of what's happening.
00:40:08.580 Does that mean that they're filtering it better?
00:40:11.060 Or...
00:40:11.740 Or...
00:40:13.620 Is it that it's only a small number of conservatives
00:40:17.020 who are doing all of the clicking?
00:40:19.120 It could be that there's like a small active group of far right people
00:40:22.480 who are just doing tons of clicking,
00:40:24.240 and it doesn't really affect the average that much.
00:40:28.600 That's probably what's going on.
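The hypothesis here is a standard skewed-distribution effect: a small minority of heavy clickers can dominate aggregate click counts without most people in the group consuming much misinformation at all. A minimal sketch with invented numbers:

```python
# Toy illustration of the "small heavy-clicking group" hypothesis.
# Every number below is invented for the example.
users = 100
heavy = 5                      # users who click misinformation constantly
misinfo_clicks_heavy = 200     # misinformation clicks per heavy user
misinfo_clicks_rest = 2        # misinformation clicks per typical user

total_misinfo = heavy * misinfo_clicks_heavy + (users - heavy) * misinfo_clicks_rest
share_from_heavy = heavy * misinfo_clicks_heavy / total_misinfo
print(f"{share_from_heavy:.0%} of misinformation clicks come from {heavy}% of users")
```

In this toy setup, 5% of the users generate roughly 84% of the misinformation clicks, which is how a group could rack up huge click totals on false posts while the typical member stays well informed.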
00:40:30.780 So a very misleading report.
00:40:33.300 Surprise from CNN.
00:40:37.340 I saw a tweet from Daniel Buck on Twitter.
00:40:40.920 And I don't know if these numbers are right,
00:40:44.060 but...
00:40:44.700 So I guess that's question number one.
00:40:46.280 Are these numbers right?
00:41:47.220 It says that the average yearly cost for homeschooling
00:40:51.920 is $700 to $1,800 per student.
00:40:56.120 Now, obviously, that doesn't cost the...
00:40:58.460 You know, count the time of the parents, et cetera.
00:41:00.700 But average public school cost $10,000 to $15,000 per student.
00:41:07.840 So, you know, like 10 times as much for a public school.
00:41:14.980 So he says, imagine what would happen if you used that money for homeschooling.
00:41:18.800 Now, I don't think it's quite that clean, right?
00:41:21.020 You know, that comparison's kind of ugly.
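For what it's worth, the "10 times as much" figure does match the midpoints of the two quoted ranges. A quick check, using only the numbers cited from the tweet:

```python
# Midpoint check of the cost ranges quoted on the show
# ($700-$1,800 homeschool vs. $10,000-$15,000 public, per student per year).
homeschool_low, homeschool_high = 700, 1_800
public_low, public_high = 10_000, 15_000

homeschool_mid = (homeschool_low + homeschool_high) / 2   # 1,250
public_mid = (public_low + public_high) / 2               # 12,500

ratio = public_mid / homeschool_mid
print(f"midpoint ratio: {ratio:.1f}x")  # midpoint ratio: 10.0x
```

This only verifies the arithmetic on the quoted ranges; it says nothing about whether the underlying figures are right, which is the open question raised above.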
00:41:23.680 But I would add this to that equation.
00:41:27.200 100% of the mental and emotional problems of children
00:41:30.700 come from their classmates.
00:41:35.280 100% of the emotional and mental problems of kids,
00:41:40.760 and they're fairly extreme,
00:41:43.220 the mental problems of kids today,
00:41:45.560 come from their classmates.
00:41:49.020 Their classmates.
00:41:51.060 Do you know any teenagers?
00:41:53.680 Ask them what's bothering them.
00:41:55.720 It's their classmates.
00:41:57.220 You know, sometimes a teacher, but that's like 1%.
00:41:59.460 It's like almost entirely bullies and bad behavior.
00:42:06.060 Now, the homeschool kids don't get much of that, do they?
00:42:09.960 They don't get the continuous bullying and criticism
00:42:12.280 and, you know, attacks on your self-esteem and everything else.
00:42:16.220 And let me ask you this.
00:42:18.460 If I said you're going to go to an environment,
00:42:20.920 you have to go there.
00:42:21.740 There's no choice.
00:42:22.640 You have no freedom.
00:42:23.320 You have to go to this environment
00:42:24.440 where 20% of the people will be just awful.
00:42:28.260 They'll just be bullies.
00:42:29.700 They'll be destroying you.
00:42:31.640 You're going to have PTSD when it's done.
00:42:34.240 Would you go to that environment
00:42:35.780 if you knew that 20% of the people
00:42:38.660 were going to just wreck you?
00:42:41.100 No.
00:42:41.700 You would never do that voluntarily,
00:42:43.140 but that's what school is.
00:42:44.460 School is a guarantee that 20% of your classmates
00:42:47.360 are just monsters.
00:42:48.740 They're just literally just monsters.
00:42:53.080 And every kid is being destroyed by these, you know, 20%,
00:42:56.580 it might be more, it could be 50%, of monsters.
00:43:00.360 Every class.
00:43:01.380 Every kid in every class is being destroyed by bullies.
00:43:05.840 Because once you have social media,
00:43:08.400 it's like a weapon of mass destruction.
00:43:10.500 You know, you can bully people in a massive, pervasive way.
00:43:13.960 So I would say that the whole idea
00:43:16.860 of sending people to school with their peers
00:43:19.760 is broken because social media broke it.
00:43:23.000 Let me say that again.
00:43:25.960 Social media made public school a nightmare.
00:43:32.040 And it probably needs to be just eliminated.
00:43:35.440 You probably need to get rid of public schools
00:43:37.060 or get rid of social media,
00:43:38.380 but that's not going to happen.
00:43:39.860 So almost certainly,
00:43:40.920 we have to get kids out of that environment.
00:43:43.380 Because social media plus public school
00:43:46.020 equals mental destruction of kids.
00:43:49.820 Really serious mental destruction.
00:43:51.640 I'm not talking about annoyances.
00:43:54.540 I'm not talking about,
00:43:55.800 oh, little Angela is unhappy today
00:43:59.760 because somebody said something bad.
00:44:01.120 I'm talking about people being destroyed.
00:44:04.240 I mean, your lives just destroyed.
00:44:07.400 Just going to school.
00:44:08.780 And it's because of social media plus school.
00:44:10.780 They just can't be paired.
00:44:12.300 You can't put a weapon of mass destruction
00:44:15.800 in a child's hand.
00:44:17.360 And that's what we did with social media.
00:44:18.840 It's a weapon of mass destruction
00:44:21.160 usually applied one person at a time.
00:44:25.420 But we basically are arming children
00:44:27.800 with the most dangerous mental weapons
00:44:31.060 you could ever have.
00:44:32.000 And we're like, oh, okay.
00:44:33.300 Let's arm these children and walk away.
00:44:34.980 This should be fine.
00:44:37.420 All right.
00:44:37.820 I'm going to ask a question
00:44:42.140 which some of you will turn off this feed.
00:44:47.140 But if you stick with me for a moment,
00:44:51.860 I think you're going to find it
00:44:53.120 more interesting than you thought.
00:44:56.000 So I don't like to do your standard
00:44:58.060 get vaccinated, don't get vaccinated.
00:45:01.040 You can make up your own decision.
00:45:02.680 But there are questions
00:45:04.480 about the statistics of it
00:45:07.280 that are unanswered.
00:45:09.520 And here's the one that's bothering me.
00:45:12.620 And the reason it bothers me
00:45:13.900 is that it's my opinion.
00:45:15.960 So watch what I do now.
00:45:17.840 All right.
00:45:18.040 So I talked about people
00:45:19.020 who can be free of
00:45:21.400 or at least immune
00:45:22.120 to cognitive dissonance.
00:45:24.000 This is how you do it.
00:45:25.900 By questioning your own opinion.
00:45:28.800 All right.
00:45:28.940 If you can question your own opinion,
00:45:30.600 both privately and in public,
00:45:32.720 you're a little bit more immune
00:45:34.060 to cognitive dissonance.
00:45:35.440 Because then if something happens
00:45:36.660 and it shows your opinion is wrong,
00:45:38.880 then you say,
00:45:41.680 well, I told you it might be wrong.
00:45:43.660 So you don't have any trigger
00:45:45.020 for cognitive dissonance
00:45:46.100 because you've allowed
00:45:46.860 that you could be wrong.
00:45:48.640 So here's something
00:45:50.340 that I thought was true
00:45:51.620 that I'm rethinking.
00:45:52.960 I think I might be wrong.
00:45:54.580 And I need you to help me here.
00:45:55.940 Okay?
00:45:56.680 So I'm going to ask you a question.
00:45:57.960 Well, let me start with a primer.
00:46:01.800 I believe that your opinion
00:46:03.100 on vaccinations
00:46:03.940 and my opinion on vaccinations
00:46:05.580 are both guesses.
00:46:09.020 We believe that we have looked
00:46:11.040 at statistics and facts
00:46:12.660 and we've looked at the odds
00:46:14.620 and we've looked at our own
00:46:15.580 comorbidities
00:46:16.340 and we made our decisions.
00:46:17.560 Do you think that's what happened?
00:46:19.300 Because I don't.
00:46:20.380 I think we both guessed.
00:46:22.340 And here's specifically
00:46:23.380 what we guessed on.
00:46:24.420 The risk of a vaccine
00:46:27.780 after the first few months.
00:46:30.200 Now, historically,
00:46:31.340 if you had a vaccination
00:46:33.480 that was safe
00:46:35.860 for the first few months,
00:46:37.460 in other words,
00:46:38.200 we couldn't find
00:46:39.160 a massive unidentified side effect
00:46:42.620 for the first few months,
00:46:44.760 you were pretty good.
00:46:46.340 You know, the odds
00:46:47.060 of ever finding one
00:46:48.120 then become almost zero.
00:46:51.380 Historically, historically,
00:46:53.520 with different vaccinations
00:46:55.880 with different technology,
00:46:59.720 we knew that if you waited
00:47:01.700 X months
00:47:02.340 and you didn't see anything,
00:47:03.360 you were fine.
00:47:04.580 Why does that apply
00:47:05.800 to a new vaccine?
00:47:07.560 Why can I use the experience
00:47:09.760 with completely different vaccinations
00:47:11.720 to make a decision
00:47:13.900 about this one
00:47:14.740 that's brand new
00:47:15.540 and hasn't been around for years?
00:47:19.160 Now, suppose it's true
00:47:20.820 that every...
00:47:21.520 let's just...
00:47:23.160 I don't know if this is true,
00:47:24.240 but let's say it's true.
00:47:25.640 That every vaccination
00:47:26.540 for the last 30 years,
00:47:28.940 we had the experience
00:47:30.840 that if it didn't give you problems
00:47:32.020 in the first few months,
00:47:33.260 you were safe,
00:47:35.100 statistically speaking.
00:47:37.340 But why would that be true
00:47:38.440 of every new vaccine,
00:47:39.880 especially if it's
00:47:40.640 a new technology?
00:47:42.300 Why is it that our experience
00:47:43.920 with other vaccines,
00:47:45.480 which are other vaccines,
00:47:48.120 why does that tell us
00:47:49.940 what's going to happen
00:47:50.520 with this one?
00:47:51.860 Just because they all
00:47:53.040 went a certain way before,
00:47:55.500 but this isn't those.
00:47:57.980 Is it?
00:47:59.180 You know,
00:47:59.620 saying that your next vaccine
00:48:01.380 will operate like the last one,
00:48:03.460 knowing it's a completely
00:48:04.360 different technology,
00:48:06.120 isn't that exactly like
00:48:07.500 saying that Trump
00:48:09.260 would be just like
00:48:10.020 other presidents?
00:48:10.580 Every president we elect
00:48:13.720 is sort of the same.
00:48:15.960 Every time we get a president,
00:48:17.440 they move to the middle.
00:48:19.360 They're sort of the same
00:48:20.160 as every other president.
00:48:23.260 But then Trump comes along.
00:48:25.740 Didn't see that coming,
00:48:26.820 did you?
00:48:28.060 Right?
00:48:28.680 Why is it that I believe
00:48:30.780 that,
00:48:31.560 and this is the part
00:48:33.360 I'm questioning,
00:48:33.760 why is it that I believe
00:48:35.280 that the history
00:48:37.000 of other vaccines
00:48:38.360 tells me I'm safe
00:48:40.460 with this one?
00:48:41.900 How is that logical?
00:48:43.420 That's my opinion.
00:48:44.940 I'm telling you
00:48:45.780 my opinion
00:48:46.380 doesn't make sense
00:48:47.140 to me.
00:48:49.100 Right?
00:48:50.760 Tell me I'm wrong.
00:48:52.280 That's my opinion
00:48:53.380 and it doesn't even
00:48:54.180 make sense to me.
00:48:55.540 Because there's
00:48:56.240 no connection
00:48:56.820 between those
00:48:57.540 other vaccinations
00:48:58.460 and the one I got.
00:49:01.520 Right?
00:49:02.900 So I would love
00:49:03.820 to hear a question
00:49:05.500 for Dr. Drew,
00:49:06.860 question for any
00:49:07.880 of the doctors
00:49:08.480 who have more insight
00:49:09.660 on this.
00:49:10.780 What logical connection
00:49:12.480 can I make
00:49:13.340 between this new vaccine
00:49:15.240 and others
00:49:16.160 and why would I expect
00:49:18.220 this experience
00:49:19.080 to be the same as that?
00:49:21.220 Now, here's the other thing
00:49:22.580 you don't know.
00:49:23.720 Long haul.
00:49:25.080 So you don't know
00:49:25.900 what the long haul risk is.
00:49:28.460 And you don't know
00:49:29.220 what the long haul risk is
00:49:30.580 if you can call it that
00:49:31.780 from the vaccination itself.
00:49:34.140 And yet you and I
00:49:35.060 having no information
00:49:37.220 whatsoever
00:49:37.820 about the main risks
00:49:39.640 have come to these
00:49:41.180 like solid decisions
00:49:42.380 about what to do.
00:49:43.960 If you're not questioning
00:49:45.280 your decision
00:49:46.080 you should be.
00:49:50.480 Well, let me ask you
00:49:51.380 this question.
00:49:52.260 Whichever way you went
00:49:53.440 how many of you
00:49:55.940 are 100% sure
00:49:57.140 you made the right choice?
00:49:58.460 No matter which way
00:49:59.500 you went.
00:49:59.920 Vaccinated or unvaccinated.
00:50:01.380 In the comments
00:50:02.100 how many of you
00:50:03.460 are 100% sure?
00:50:06.580 I'm looking at the comments.
00:50:08.380 All right.
00:50:08.680 I'm saying mostly no.
00:50:10.080 But there are a few
00:50:10.840 who are 100% sure.
00:50:13.480 90% seems like
00:50:14.980 you know
00:50:16.220 that's where you should be
00:50:17.820 I think.
00:50:19.420 Yeah.
00:50:20.740 A lot of 100%ers.
00:50:22.800 More than you think.
00:50:24.680 Maybe
00:50:25.060 I don't know.
00:50:25.700 It's not scientific
00:50:26.480 or anything.
00:50:26.900 But I'm seeing a lot
00:50:27.740 of 100%ers.
00:50:29.400 I don't think
00:50:30.160 that's a rational opinion.
00:50:33.360 So I think that
00:50:34.080 whether you're pro
00:50:35.360 or anti-vaccination
00:50:36.420 the 100%ers
00:50:37.740 are the ones
00:50:38.220 that have the wrong opinion.
00:50:40.000 Now
00:50:40.240 let me adjust that
00:50:42.860 a little bit.
00:50:44.020 If you have some
00:50:44.680 strange comorbidities
00:50:46.000 or you're 8 years old
00:50:47.460 maybe you're closer
00:50:49.560 to 100%.
00:50:50.320 But for people
00:50:51.520 in the middle
00:50:52.260 now let's take
00:50:54.020 me specifically.
00:50:56.420 If I were to decide
00:50:57.700 to get vaccinated
00:50:58.360 or not
00:50:58.880 I would have
00:51:01.380 to calculate my odds.
00:51:02.540 How do I calculate
00:51:03.300 my odds?
00:51:05.220 Can't do it.
00:51:06.380 Right?
00:51:07.120 I can't calculate
00:51:08.060 my odds.
00:51:09.020 I can calculate
00:51:09.780 the odds of somebody
00:51:10.620 in my age group
00:51:11.280 but that's not me.
00:51:13.340 Do my odds
00:51:14.220 apply to anybody
00:51:14.940 in my age group?
00:51:16.600 How many people
00:51:17.380 who are 64
00:51:18.420 are as fit
00:51:20.740 as I am
00:51:21.260 and have my
00:51:22.000 body mass index?
00:51:24.700 Not a lot.
00:51:26.000 So what is the risk
00:51:26.800 for people
00:51:27.200 with my body mass index
00:51:28.580 who are also 64?
00:51:30.600 Who knows?
00:51:32.280 I also have
00:51:33.080 a comorbidity
00:51:33.860 asthma.
00:51:35.540 Asthma is on the list
00:51:36.720 of the bad comorbidities.
00:51:38.900 So therefore
00:51:39.400 I'd say
00:51:39.820 oh okay
00:51:40.240 I'm a risky situation.
00:51:42.700 But
00:51:42.960 one of the medications
00:51:45.020 that seems to have
00:51:46.700 a high effectiveness
00:51:48.380 for COVID
00:51:49.140 is one that I take
00:51:50.500 for asthma.
00:51:52.060 The budesonide
00:51:53.300 or whatever the hell
00:51:53.960 it is.
00:51:54.320 I forget the name of it.
00:51:55.520 But there's an asthma
00:51:56.860 med that seems
00:51:57.680 to help against COVID.
00:51:59.660 So if I have asthma
00:52:00.640 but I'm also doing
00:52:01.700 those medications
00:52:02.800 that may or may not
00:52:03.880 help against COVID
00:52:04.700 and I'm thin
00:52:06.100 but I'm old
00:52:06.760 what is my risk?
00:52:10.060 No idea.
00:52:11.500 No idea.
00:52:12.480 So if you're looking
00:52:13.600 at your own personal risk
00:52:15.040 and say okay
00:52:15.540 in my specific case
00:52:16.940 I've got this kind of risk
00:52:18.760 and then you know
00:52:19.860 there's this much
00:52:20.600 long haul risk
00:52:21.920 and there's this much
00:52:22.660 long term vaccination risk
00:52:24.200 these are all unknowns.
00:52:27.400 You don't know
00:52:28.420 your personal risk.
00:52:29.900 You might think you do
00:52:30.720 but you don't.
00:52:31.660 You don't know
00:52:32.380 your DNA
00:52:32.900 and how that affects
00:52:34.200 things etc.
00:52:36.720 You don't know
00:52:37.600 your ACE2 inhibitors.
00:52:39.760 You don't know
00:52:40.240 that stuff.
00:52:40.720 So to imagine
00:52:44.080 that we're making
00:52:45.280 rational decisions
00:52:46.380 is not
00:52:47.300 I don't feel
00:52:48.480 like we are.
00:52:50.100 So here's why
00:52:50.960 and let me tell you
00:52:52.240 where this all started.
00:52:53.460 I was accused
00:52:54.420 of having contempt
00:52:55.620 for my own viewers
00:52:57.820 and I'll bet
00:52:59.760 that's not
00:53:00.320 I'll bet that wasn't
00:53:01.240 just one person
00:53:01.980 who thought that.
00:53:02.700 How many of you
00:53:03.500 think that?
00:53:04.400 How many of you
00:53:05.240 think that I've shown
00:53:06.740 contempt
00:53:07.440 for my audience?
00:53:09.240 I'm just looking
00:53:12.580 at your comments
00:53:13.160 for a moment
00:53:13.660 because I think
00:53:15.020 there's good
00:53:15.420 I think some of you
00:53:16.480 are going to say yes
00:53:17.160 sometimes
00:53:18.600 no
00:53:20.140 mostly no
00:53:20.860 but seen a few yeses
00:53:23.160 and here's what
00:53:24.240 I was thinking
00:53:24.680 I can't have contempt
00:53:27.120 for your guess
00:53:28.440 when I know
00:53:29.880 mine is a guess.
00:53:32.480 Let me say that again
00:53:33.280 it wouldn't be rational
00:53:35.220 and I don't know
00:53:36.220 how I would generate
00:53:37.280 contempt
00:53:38.380 for an opinion
00:53:40.020 which is just a guess
00:53:41.400 when my opinion
00:53:42.980 is just a guess.
00:53:44.520 Why is my guess
00:53:45.360 superior to your guess?
00:53:48.540 And by the way
00:53:49.540 my guess is just
00:53:50.380 about me
00:53:51.040 my guess isn't
00:53:52.420 even about you
00:53:53.140 I don't have an opinion
00:53:53.920 about whether you
00:53:54.760 should get vaccinated
00:53:55.440 I don't have an opinion
00:53:56.160 about me.
00:53:57.520 So
00:53:57.700 I would say
00:53:59.440 that we should all
00:54:00.140 take a little
00:54:00.800 helping of humility.
00:54:04.620 If you think
00:54:05.580 you can calculate
00:54:06.440 your personal odds
00:54:07.700 you can't
00:54:08.480 and if you think
00:54:09.700 you can calculate
00:54:10.320 the odds of the vaccine
00:54:11.660 being a danger
00:54:12.300 you can't
00:54:12.920 and if you think
00:54:14.180 you can calculate
00:54:15.000 the long haul
00:54:16.320 COVID risk
00:54:17.340 you can't
00:54:18.000 you can't
00:54:19.440 so we're all guessing
00:54:21.760 and you
00:54:22.840 you know
00:54:23.320 I see Jay
00:54:24.200 you're calling
00:54:24.600 an educated guess
00:54:26.080 is it?
00:54:27.500 I mean
00:54:29.020 is it?
00:54:30.660 Because how educated
00:54:31.580 are you
00:54:32.040 about the long
00:54:32.960 long term risk
00:54:34.200 of this vaccination?
00:54:36.020 You couldn't be
00:54:36.700 because nobody
00:54:37.420 knows that.
00:54:40.060 All right
00:54:40.780 so definitely
00:54:42.200 I can tell you
00:54:43.600 with certainty
00:54:44.180 that whatever
00:54:45.240 my internal process is
00:54:46.860 it doesn't feel
00:54:47.420 anything like contempt
00:54:48.500 and let me say
00:54:49.560 even more clearly
00:54:50.940 there is no way
00:54:52.920 I could ever
00:54:53.600 generate
00:54:54.340 a feeling
00:54:55.480 like contempt
00:54:56.760 for subscribers
00:54:58.880 on locals
00:54:59.640 people who are
00:55:00.960 literally paying
00:55:02.120 their money
00:55:02.840 for my content
00:55:04.880 there's no way
00:55:06.600 under any scenario
00:55:07.820 I'm going to feel
00:55:08.400 contempt
00:55:08.940 about that group
00:55:09.860 of people
00:55:10.360 yeah
00:55:11.320 how would you
00:55:12.080 even generate
00:55:12.640 that feeling
00:55:13.200 like I know
00:55:14.660 what contempt
00:55:15.100 feels like
00:55:15.720 I don't know
00:55:16.720 even how you
00:55:17.260 would possibly
00:55:18.060 have that feeling
00:55:18.740 no matter what
00:55:19.320 they said
00:55:19.740 right
00:55:20.700 if somebody's
00:55:21.280 paying
00:55:21.620 paying for
00:55:22.580 my association
00:55:23.440 like I'm
00:55:24.520 automatically
00:55:25.000 on your team
00:55:25.660 you know
00:55:26.640 I'm good
00:55:27.520 with you
00:55:28.000 all right
00:55:29.480 but then
00:55:32.780 you felt
00:55:33.080 pressure
00:55:33.400 to justify
00:55:34.200 your decision
00:55:35.080 explain my
00:55:36.400 decision
00:55:36.840 did I
00:55:37.580 yeah
00:55:38.320 I think
00:55:38.900 that's a fair
00:55:39.360 comment
00:55:39.720 it sounds
00:55:40.200 like
00:55:40.500 I tried
00:55:41.680 to justify
00:55:42.380 my decisions
00:55:43.060 by explaining
00:55:43.840 it
00:55:44.140 but I think
00:55:45.000 in the end
00:55:45.720 the cleanest
00:55:47.120 way to
00:55:47.560 express
00:55:48.500 what's happening
00:55:50.180 is that
00:55:50.760 we're all
00:55:52.660 guessing
00:55:52.940 all right
00:55:54.480 let me tell
00:55:54.860 you what
00:55:55.120 you missed
00:55:55.460 on locals
00:55:56.220 if you
00:55:56.700 are not
00:55:57.020 a local
00:55:57.440 subscriber
00:55:58.080 here are
00:55:59.040 the micro
00:55:59.540 lessons
00:55:59.880 that I've
00:56:00.320 added
00:56:00.620 a micro
00:56:01.180 lesson
00:56:01.480 just put
00:56:02.160 this on
00:56:02.520 there
00:56:02.640 on how
00:56:02.940 to pay
00:56:03.180 attention
00:56:03.560 how to
00:56:04.520 focus
00:56:04.940 if you're
00:56:05.280 having
00:56:05.560 focus
00:56:06.040 problems
00:56:06.480 how to
00:56:08.200 make
00:56:08.380 yourself
00:56:08.660 happier
00:56:09.060 how to
00:56:09.520 teach
00:56:09.920 with
00:56:10.100 motivation
00:56:10.600 how to
00:56:11.220 use
00:56:11.580 funny
00:56:11.960 words
00:56:12.480 how to
00:56:12.840 find a
00:56:13.800 mentor
00:56:14.100 how to
00:56:14.760 wake up
00:56:15.800 on time
00:56:16.200 how to
00:56:16.460 give
00:56:16.640 criticism
00:56:17.180 the power
00:56:17.640 of praise
00:56:18.020 there are
00:56:18.440 about 150
00:56:19.220 of them
00:56:20.400 about 150
00:56:21.560 micro lessons
00:56:22.280 they're all
00:56:22.760 two to five
00:56:23.840 minutes
00:56:24.200 each one
00:56:25.140 would give
00:56:25.500 you a new
00:56:26.000 skill
00:56:26.360 for five
00:56:28.060 to seven
00:56:28.480 dollars a
00:56:28.920 month
00:56:29.080 depending on
00:56:29.600 whether you
00:56:29.880 have the
00:56:30.100 annual
00:56:30.360 subscription
00:56:30.860 and so
00:56:31.760 what I
00:56:32.040 try to
00:56:32.400 do
00:56:32.640 is give
00:56:33.840 you more
00:56:34.280 than seven
00:56:34.840 dollars worth
00:56:35.460 of value
00:56:35.860 every month
00:56:36.520 and ideally
00:56:37.600 thousands of
00:56:38.460 dollars worth
00:56:38.940 of value
00:56:39.300 in terms of
00:56:40.280 what it does
00:56:40.680 for your
00:56:40.920 life
00:56:41.180 and that
00:56:42.320 is my
00:56:42.860 show for
00:56:43.220 today
00:56:43.600 best one
00:56:44.640 ever
00:56:44.880 I think
00:56:45.580 so
00:56:45.800 I think
00:56:46.840 you'd
00:56:47.060 agree
00:56:47.260 just keep
00:56:48.240 getting better
00:56:48.740 and I will
00:56:49.720 talk to you
00:56:50.100 tomorrow
00:56:50.420 I gotta go
00:56:51.240 do some
00:56:51.580 stuff