Real Coffee with Scott Adams - August 11, 2021


Episode 1465 Scott Adams: Today I Put the Hypnosis Filter on the News So You Can Understand it For the First Time


Episode Stats

Length: 48 minutes

Words per Minute: 146.99907

Word Count: 7,066

Sentence Count: 534

Misogynist Sentences: 2

Hate Speech Sentences: 20
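The words-per-minute figure above is just word count divided by duration in minutes. A minimal sketch of that calculation (assuming the stat was produced this way; the function name is illustrative, not from any tool named in this document):

```python
def words_per_minute(word_count: int, duration_seconds: float) -> float:
    """Transcript word count divided by audio duration in minutes."""
    return word_count / (duration_seconds / 60.0)

# Episode figures: 7,066 words over a nominal 48-minute runtime.
print(round(words_per_minute(7066, 48 * 60), 1))  # 147.2
```

A flat 48 minutes yields 147.2 WPM, so the 146.99907 shown above was presumably computed from an exact runtime a few seconds over 48 minutes (7,066 / 146.99907 ≈ 48.07 minutes).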


Summary

It's the first day of school, and it's time to wear masks in school. But what if you don't have to wear them in school? What if you can just take them off at the same time? Is that possible?


Transcript

00:00:00.000 Hey everybody, it's time for Coffee with Scott Adams, the best part of the day.
00:00:07.280 I don't know if you've noticed this, but if you start your day off right, everything seems
00:00:11.820 a little bit better.
00:00:13.100 Well, I'm going to be relentlessly positive today.
00:00:17.000 It's all optimism and good news and golden age and all that stuff.
00:00:21.460 But I'm going to put the hypnosis filter on the news so you can see it the way a hypnotist
00:00:26.580 sees it.
00:00:27.120 But first, but first, how about the simultaneous sip and all you need is a cup or a mug or
00:00:32.860 a glass, a tank or a chalice or a canteen jug or a flask or a vessel of any kind.
00:00:37.020 Fill it with your favorite liquid.
00:00:39.060 I like coffee.
00:00:40.460 And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that
00:00:45.620 makes everything better.
00:00:46.960 Yeah, it's called the simultaneous sip and it happens now.
00:00:50.460 Go.
00:00:50.700 Go.
00:00:50.760 And if this wasn't enough, and I think it was, but if it wasn't, there'll be a whiteboard
00:01:02.880 later.
00:01:03.540 Yeah.
00:01:04.920 Yeah.
00:01:05.560 Talk about having it all.
00:01:08.000 They say you can't have it all, but we're going to challenge that today.
00:01:12.240 You're going to have it all.
00:01:13.320 Well, all right, for some reason all of my show notes just disappeared.
00:01:21.760 Huh.
00:01:24.340 That's a first.
00:01:26.460 I had them actually open and was looking at them.
00:01:28.840 Well, let's look at trash.
00:01:32.900 Well, it's a tragedy, folks.
00:01:37.560 My show notes have just disappeared.
00:01:40.460 But as luck would have it, I posted them on the Locals platform.
00:01:44.940 This is going to sound like a clever commercial, but I swear I didn't plan this.
00:01:49.840 If you're a member of the Locals community, I've started posting my show notes before the
00:01:56.080 show.
00:01:56.360 And there they are.
00:02:01.400 Okay.
00:02:02.280 So I'll just read them off the Locals.
00:02:04.940 So I told you I was going to give you the hypnosis filter on the news.
00:02:08.100 So it's basically the persuasion filter.
00:02:10.520 I had a look at the irrational parts of the news.
00:02:13.480 We'll get to all that.
00:02:14.640 But today in California, at least where I live, is the first day of school.
00:02:20.180 First day of school with masks.
00:02:23.260 And that means that today is the first day of mass government-sponsored child abuse.
00:02:29.820 Well, at least according to a lot of us.
00:02:32.720 We'll talk about the statistics of it.
00:02:36.160 But I'm wondering if children wearing masks in school is one TikTok meme away from going
00:02:44.400 away.
00:02:45.540 And let me give you an analogy.
00:02:47.720 When I was in high school a million years ago, we had something called Senior Ditch Day.
00:02:55.920 And it was a day that the seniors would all collectively ditch classes and go to the lake
00:03:00.460 and drink beer illegally and basically just play hooky from school.
00:03:07.740 Now, if three or four of the seniors had decided to do that, they'd probably get in trouble.
00:03:13.740 Well, they'd be absent.
00:03:15.960 Somebody would punish them.
00:03:17.300 Maybe their parents would be mad.
00:03:19.440 But because they all do it, it's just a tradition.
00:03:24.320 So the school just says, ah, crap, what are you going to do?
00:03:27.980 And they just deal with it.
00:03:29.380 So the teachers just have a day off.
00:03:31.580 And the kids all go to the lake.
00:03:34.420 I recall vividly being thrown in that lake on a very cold day.
00:03:41.180 Very cold day.
00:03:43.000 So it wasn't a good day for me if you get thrown in a frozen lake.
00:03:48.580 But it was an accident, actually.
00:03:51.540 But I did end up in the frozen lake.
00:03:54.260 And it seems to me that all it would take is one TikTok meme to tell everybody to take
00:04:03.060 off their masks, I'm talking about students now, to take off their masks at the same time.
00:04:09.420 That is probably all you need.
00:04:11.340 Now, I'm not predicting that'll happen.
00:04:13.080 But I'd like to point out how thin the difference is between everybody wearing masks in school
00:04:21.180 and nobody wearing masks in school.
00:04:24.640 It's probably right on the edge; it would take just a little bit to completely change the situation.
00:04:30.480 All it would take is one meme that says do it at the same time.
00:04:36.120 And if the kids bought into it, that's all it would take.
00:04:39.540 So if Senior Ditch Day works, could it be scaled up?
00:04:44.100 Here's some indication of good news to come.
00:04:48.020 I've told you many times that we'll never be able to tax and spend our way to everybody
00:04:54.760 having everything they need.
00:04:55.960 We're going to have to lower the cost of a high-quality life.
00:05:01.100 You know, your housing, your food, etc.
00:05:03.500 And there's something happening here that's really exciting in that world.
00:05:08.320 Do you know what ADUs are?
00:05:09.940 It's probably a term that is local.
00:05:13.800 But it refers to, I forget what ADU stands for, something like, it's like an extra unit of
00:05:21.120 living space, basically a building that you can put into your backyard that would be a
00:05:25.940 stand-alone home for, you know, an in-law or somebody who just needed a little care.
00:05:32.380 And there's one called Boxable.
00:05:34.780 The company is called Boxable.
00:05:36.680 And it's a folding home.
00:05:39.320 They actually ship it, fold it up, and then on site the walls just fold out and the furniture
00:05:45.120 is already there, or at least the built-in stuff is already there.
00:05:47.860 They claim they can give you 375 square feet of space, which would be, you
00:05:56.220 know, a pretty tiny apartment, but it would have a kitchen, bathroom, bedroom, little living
00:06:00.620 area.
00:06:01.900 Now, if it starts at 50K, just for the unfolding box itself, and you figure that's the starting
00:06:08.220 price, so the one you want is going to be more than that, then you add your land, your
00:06:12.920 taxes, your permits, your plumbing, your sewer, and all that stuff.
00:06:16.080 So, I don't know, could you get the cost of a decent home down to $100,000, and then
00:06:25.080 everything changes?
00:06:26.820 I think you could, and I think we're heading in that way.
00:06:29.300 But it looks like the ADU industry is what's going to lead the way, because they're going
00:06:34.360 to do all the innovations for how to make the cheapest home, and I think that will just
00:06:39.400 cross over to regular homes.
00:06:40.860 All right, here's a warning to you.
00:06:45.500 Beware of your own certainty.
00:06:47.980 Do you ever wonder, let's say you wanted to know, are you hallucinating your opinion,
00:06:54.820 or do you have a rational opinion?
00:06:57.860 Because wouldn't you like to know that?
00:07:00.040 You probably know that the only person who can't tell they're crazy is the crazy person,
00:07:05.820 right?
00:07:06.440 The person who doesn't know they're hallucinating is the one who's doing it, but everybody else
00:07:11.800 can see it.
00:07:12.660 You know, if you run into somebody who's hallucinating, you can tell.
00:07:16.200 It's easy, but they can't tell.
00:07:18.820 So, wouldn't you like a little rule that you could use, just an objective little rule, to
00:07:24.540 know if you're hallucinating?
00:07:27.000 All right?
00:07:27.900 Well, here's one.
00:07:28.780 Now, this is not one of those 100% rules.
00:07:32.380 It's just a really good indication that you're hallucinating, and it goes like this.
00:07:38.080 You have certainty about the uncertain.
00:07:42.080 That's it.
00:07:43.340 Do you have certainty about something that really can't be known?
00:07:47.320 For example, are you certain, just totally dead certain, that getting the vaccination is
00:07:53.840 better than not?
00:07:55.200 Or, the other way, are you dead certain that the reverse is true?
00:08:01.020 If you're dead certain about either of those things or anything else, you're probably hallucinating.
00:08:07.560 In other words, you could be reasonably sure that your opinion is not moored to any facts
00:08:13.780 or rational anything.
00:08:15.720 It could be right.
00:08:17.120 I'm not saying you're wrong.
00:08:18.180 I'm saying that you might have arrived at your decision through an irrational process.
00:08:22.600 And certainty is really a tell for that.
00:08:26.600 If you see somebody saying, well, you know, I'm like 80% sure that vaccinations are a good
00:08:32.760 idea.
00:08:33.520 Is that person hallucinating?
00:08:36.780 No.
00:08:37.620 No.
00:08:37.900 They could be right, and they could be wrong.
00:08:40.720 The 80% could be way off.
00:08:42.660 Or it could be right, and they made the wrong choice.
00:08:46.620 But they're at least not hallucinating.
00:08:48.180 If you're talking in terms of probability, you're probably looking at data and doing the
00:08:54.240 best you can with rational thought.
00:08:56.260 We're not good at it.
00:08:57.560 But at least you're trying.
00:08:59.000 If you have certainty, that's a tell.
00:09:02.840 It's a gigantic tell.
00:09:05.260 All right.
00:09:07.460 That's your first tip, how to know you're hallucinating.
00:09:09.840 Is it my imagination, or has Trump been a little bit too quiet recently?
00:09:17.000 Right?
00:09:19.060 Because he's fairly consistent about staying in the news.
00:09:23.020 But correct me if I'm wrong.
00:09:25.280 Hasn't Trump gone quiet?
00:09:28.140 And if he has, is something coming?
00:09:32.120 Is there something coming?
00:09:33.820 Maybe he's working on legal issues because of his taxes or whatever.
00:09:40.380 I don't know.
00:09:40.900 So he might be working on something.
00:09:44.000 But still, wouldn't he make sure that he issued some provocative statements to get in the news
00:09:50.140 again?
00:09:51.460 Well, no, I don't think it's a storm coming, as somebody just said.
00:09:55.340 But there's something coming, right?
00:10:00.040 You feel it, don't you?
00:10:01.940 Because he's too quiet.
00:10:04.260 So that's one of the things that I like to look at, is the dog that's not barking.
00:10:10.280 And at the moment, Trump is not barking.
00:10:12.320 And you've got to ask why.
00:10:14.260 Either something big is coming, or it's a mystery.
00:10:19.940 All right.
00:10:22.020 Here's some China persuasion.
00:10:26.600 Why is it that we still do business, we, the business community, in the United States
00:10:32.460 and other countries, why do we still do business with China when we know that there are clear
00:10:38.700 business risks and also horrible things going on?
00:10:43.640 Let's say the Uyghurs being put into camps.
00:10:46.900 Let's say the allegations that the Falun Gong people are being used for unwilling donations
00:10:56.660 of body parts, you know, transplants and stuff.
00:11:00.300 I mean, they're just horrible allegations against China.
00:11:05.220 And we still do business there.
00:11:07.720 We still do business.
00:11:08.980 Why is that?
00:11:09.840 All right.
00:11:10.040 Here's the hypnotist filter on it.
00:11:12.740 The reason is it doesn't affect us.
00:11:14.400 Have you once had your day affected by the Uyghurs being in camps?
00:11:22.780 Well, if you have any kind of an empathy for human beings, you certainly had some empathy
00:11:28.420 for it.
00:11:28.880 You felt bad.
00:11:29.520 You thought you should do something about it.
00:11:31.520 You know, maybe you doubled down on your intentions to not buy from China or something.
00:11:35.980 But mostly, mostly it didn't affect you, right?
00:11:39.880 Maybe a little bit in your head, but it didn't affect your day, right?
00:11:44.000 So it doesn't matter how good an argument is, how moral it is, how ethical it is, what's
00:11:53.060 right and what's wrong.
00:11:54.880 We kind of care what affects us today, right?
00:11:58.020 That just matters more.
00:11:59.460 Because we're not really all about the morality and the principle when it comes down to it.
00:12:05.140 It's not very predictive.
00:12:07.260 But I'll tell you what is predictive.
00:12:08.820 If China gets, let's say we find out that the Wuhan lab was absolutely the source of
00:12:18.360 a, in any way, engineered virus.
00:12:21.920 And that China is, let's say, held accountable for letting it happen and for maybe covering
00:12:28.240 it up.
00:12:29.120 So those things both seem reasonably likely to happen.
00:12:32.140 What happens then to China as a place to do business?
00:12:38.500 Because let's say you're Nike.
00:12:41.820 You're Nike and you say, gosh, I really want to keep making my sneakers there because I'm
00:12:46.660 getting good quality.
00:12:47.820 I'm getting the good price.
00:12:49.380 I don't want to have to make a change.
00:12:50.920 It would be really expensive.
00:12:52.340 We've got business relationships.
00:12:54.260 I really, really, really, really, really want to just keep making my sneakers in China.
00:12:58.440 Well, they can get away with that as long as their customers say some form of, you know,
00:13:06.300 I hate what China's doing, but it doesn't seem to affect me.
00:13:10.300 But what happens when it does?
00:13:12.300 What happens when the thing you think about China is the Wuhan lab causing the pandemic?
00:13:17.700 Because the pandemic affects you.
00:13:21.340 All right.
00:13:21.520 If you're wearing a mask anywhere, it's because of China.
00:13:25.760 Or it'll feel like that if the attribution goes to the Wuhan lab.
00:13:33.060 And I feel like this will be a turning point for China that it seems likely that we're going
00:13:38.360 to attribute the problem to the lab.
00:13:40.720 And it seems likely that the story, the narrative will include China covering it up.
00:13:46.760 The two worst things that you could combine.
00:13:49.020 At that point, it's personal.
00:13:52.980 You see the difference?
00:13:53.900 And so this is the hypnotist's filter on the news.
00:13:58.640 As long as what China is doing that's bad is just a concept, yeah, they stole some intellectual
00:14:04.640 property from somebody I don't know, the company I don't care about.
00:14:11.580 I don't know.
00:14:12.640 They locked up some people for their religious views, the Uyghurs.
00:14:16.760 Well, but I still went to work and my day was exactly the same.
00:14:24.360 Yeah.
00:14:24.600 Even fentanyl.
00:14:26.020 You know, I'm activated because fentanyl touched my family.
00:14:30.740 But what if it hadn't?
00:14:32.680 What if I never knew anybody who had a fentanyl overdose?
00:14:35.600 I wouldn't feel any personal connection to China whatsoever.
00:14:38.120 But it's personal for me, and I think that it's going to get personal for a lot of Americans
00:14:43.900 when China becomes the cause of the pandemic, more so than you already think.
00:14:51.220 You know, I think there's another level for that.
00:14:53.200 So I think China has got...
00:14:54.860 China has enormous problems, and I don't even know if they know how big they are.
00:14:59.420 Because if you're in China, can you really read the mood of the United States?
00:15:04.620 Because their fate sort of depends on our mood, doesn't it?
00:15:09.740 I mean, literally?
00:15:11.080 Literally, our mood in the United States will determine the future of China.
00:15:15.920 And our mood is on the border right now between,
00:15:20.840 eh, it's not my problem.
00:15:23.000 Yeah, I don't like China.
00:15:23.920 Not my problem.
00:15:24.900 And you did this to me, China.
00:15:27.820 That's a big difference.
00:15:29.800 China's in trouble.
00:15:31.380 Let me give you a little inspirational lesson here.
00:15:35.020 I'm going to plop this right in the middle of my news talk.
00:15:37.900 Three experiences on learning and motivation.
00:15:40.680 I'm going to give you three just quick little anecdotes,
00:15:44.320 and then tie them together.
00:15:46.880 Years ago, when I was first trying to get syndicated as a cartoonist,
00:15:51.540 which is the big break for a cartoonist,
00:15:53.240 I sent my samples to a number of cartoon syndication companies.
00:15:57.200 They're the ones who make a deal with a cartoonist, which is your big break,
00:16:01.520 and then they sell it to newspapers and license it and stuff if you're lucky.
00:16:05.040 So I submitted my stuff to the several syndicates that existed at the time.
00:16:10.720 Most of them just rejected me, you know, with just a form letter.
00:16:15.660 But one of them rejected me by telling me that my writing might have some merit,
00:16:20.280 but I should find somebody else to do the drawing for me.
00:16:26.340 Just what you want to hear when you're trying to become a professional cartoonist
00:16:30.200 is one of the most knowledgeable people in the industry,
00:16:33.660 an editor at a top syndication company,
00:16:36.660 telling you that maybe you should look into having somebody else do the hard part,
00:16:41.420 the drawing.
00:16:41.880 So that wasn't so good for my cartooning ego.
00:16:49.560 But I took all of my materials and I put them in a closet and I said to myself,
00:16:53.820 well, I tried.
00:16:55.260 And I felt good that I tried, but of course it was an abject failure.
00:17:00.100 A few months later I get a call from Sarah Gillespie, an editor at United Media,
00:17:04.020 the biggest, I think they were the biggest, cartoon syndication company.
00:17:07.880 And I thought I'd gotten all of my rejections,
00:17:11.360 but somehow I'd missed that I hadn't heard from them.
00:17:14.000 And she offered me a contract right over the phone to become a syndicated cartoonist,
00:17:19.820 which turned out to be my big break that made Dilbert successful.
00:17:24.560 And toward the end of the conversation, you know,
00:17:26.700 I'm still reeling from the fact that I got this offer at all.
00:17:30.220 I said to her, but I realized that my drawing style is not up to any professional level.
00:17:36.020 And I said, I'd be willing, you know, if you think this would make the product better,
00:17:40.540 I'd be willing to work with an artist to do the actual drawing for me
00:17:44.380 and maybe I could just do the writing.
00:17:46.780 And Sarah Gillespie said to me,
00:17:48.840 what's wrong with your art?
00:17:51.760 It's fine just the way it is.
00:17:55.100 So she Wizard of Oz'd me.
00:17:58.000 She frickin' Wizard of Oz'd me.
00:17:59.980 Do you know what happened to my drawing quality within 24 hours of being told
00:18:06.280 that I was a professional cartoonist
00:18:08.260 and that my current drawing style was already professional class,
00:18:14.840 which I didn't believe to be even close to true?
00:18:18.820 The moment a professional at the top of the industry told me that my drawing was good enough,
00:18:25.440 the drawing style improved about 30% in a day and then kept improving.
00:18:34.520 So at this point, you know, I'm certainly not an artist with a capital A,
00:18:38.540 but certainly my comic is well drawn and executed, you know, just through practice.
00:18:44.960 So that's your first lesson,
00:18:47.160 that somebody literally changed my performance in one day by about 30%,
00:18:51.860 I would figure, simply by telling me that I was worthy.
00:18:56.440 That's it.
00:18:57.220 Now hold that story for a moment.
00:18:58.540 Here comes another one.
00:19:00.420 I took the Dale Carnegie course.
00:19:02.160 You've heard this story where you learn to do public speaking
00:19:05.240 and also interacting with strangers, you know, small talk and stuff.
00:19:08.680 And you learn a bunch of things about how to be comfortable
00:19:10.820 in these uncomfortable social situations, especially giving a speech.
00:19:15.600 And the technique that Dale Carnegie uses, I've told this story a bunch of times,
00:19:21.300 is they only give you compliments.
00:19:24.240 That's it.
00:19:25.360 That's the whole technique.
00:19:27.120 They don't tell you what you did wrong.
00:19:29.220 Imagine trying to learn a skill and nobody ever tells you what you're doing wrong.
00:19:34.360 Not once did they say, stop jingling the change in your pocket.
00:19:38.960 Not once did they say, make eye contact.
00:19:41.400 Not once did the instructor tell anybody that anything was imperfect.
00:19:48.180 Not once.
00:19:49.200 The whole class.
00:19:51.020 How many people in that class went from basket cases,
00:19:54.840 just couldn't talk in front of people, literally couldn't even make a word.
00:19:58.540 There were people who couldn't form words in front of a crowd.
00:20:03.640 Like they were just, ah, um, ah, just frozen.
00:20:08.680 And with no specific help, no specific criticism, and only encouragement,
00:20:16.700 telling you what you did right, even if it was a small thing,
00:20:20.540 how many of the, I don't know, there were probably 25 people in the class that I took,
00:20:24.860 how many of them do you think became really good speakers at the end of, I don't know,
00:20:29.840 10 weeks or something?
00:20:31.460 100%.
00:20:31.900 I've never seen anything like it.
00:20:35.080 100%.
00:20:35.600 Every single person there could stand up on a moment's notice and give a speech to a large crowd of people
00:20:43.300 without dropping a bead of sweat.
00:20:47.920 Third story.
00:20:50.320 Some of you know I've been trying to learn drums.
00:20:54.360 I started with an online teacher who was very good at getting me the basics.
00:20:59.700 But ultimately, I wasn't quite making progress, and I thought, oh, I'll try to be an autodidact,
00:21:07.180 teach myself, look at a lot of YouTube videos and see how far I can get.
00:21:12.200 Didn't work at all.
00:21:13.700 All right?
00:21:14.180 There are a lot of things that I can teach myself.
00:21:17.000 I'm mostly self-taught in most of the things I do.
00:21:20.140 But I couldn't get there.
00:21:21.840 The drumming was beyond my ability to self-teach.
00:21:24.960 So I now have a new drum teacher, and he's come over, I don't know, half a dozen times or more.
00:21:33.060 And we do half an hour, and my drumming is way better, just with a few lessons, just way better.
00:21:41.720 And I realized, I didn't realize this until yesterday, that I've had all these lessons with this instructor.
00:21:49.020 He's never once told me I was doing anything wrong.
00:21:54.100 Not once.
00:21:55.820 He never once told me I did anything wrong.
00:21:58.760 In all those lessons.
00:22:00.320 The only thing he does is he tells me I did great, and I must have been practicing during the week.
00:22:06.200 Which I wasn't.
00:22:08.300 He's actually complimenting me for things I didn't do.
00:22:11.640 Like practicing.
00:22:13.500 You know, I mean, I practice a little bit, not enough to make a difference.
00:22:15.860 But how much better is my drumming with nothing but positive reinforcement?
00:22:22.820 Way better.
00:22:24.340 Way better.
00:22:25.620 It's like, you know, suddenly, it just, you know, the curve just went straight up.
00:22:32.220 And it has nothing to do with anything except the fact that he told me I was good, and I didn't believe it.
00:22:38.720 Because I'm not.
00:22:40.220 But the fact that he tells me that makes me interested.
00:22:43.640 And then I'm excited.
00:22:44.440 And then I can learn it.
00:22:46.320 So suddenly the ability to learn just sort of turns on.
00:22:50.660 Now, this particular drum instructor also is a personal trainer.
00:22:56.280 You know, piano instructor.
00:22:58.100 He instructs Christina in piano as well.
00:23:01.380 And his technique is basically to make people successful.
00:23:05.760 Here's the fourth story.
00:23:06.700 I had a neighbor who was a tennis pro.
00:23:09.780 And I would ask him, how do you teach little kids to play tennis when every time they swing the racket, they're not going to hit the ball back?
00:23:18.700 So how do you go from, I never hit a ball back, to being a tennis player?
00:23:24.180 Like, how could you get past the mental part of, well, it failed a billion times in a row.
00:23:29.480 This is fun.
00:23:30.080 And he told me that you only teach them success.
00:23:33.840 So if he has a little kid who basically can barely hold a tennis racket, he has them walk right up to the net, and then he tosses a ball to them from two feet away and has them go, whoop, and just swing the racket in the general direction.
00:23:48.780 The ball hits some part of the racket and ends up on his side of the court, and then he praises them.
00:23:55.580 And that's all he does.
00:24:04.820 He'll stand there for an hour, just going, whoop, you know, a two-foot toss, and the kid swings, and it goes over, it hits the ground, and the kid's happy.
00:24:04.820 All he teaches is success.
00:24:07.340 And then they get interested.
00:24:08.960 And as they get older, their ability to hold a racket improves, and they just get into it entirely through being interested.
00:24:17.180 So that's your micro lesson of the day.
00:24:21.260 It's sort of similar to the things I put on the Locals platform.
00:24:24.840 I've got a number of them queued up that I'm going to be adding there soon, if you like that kind of stuff.
00:24:31.440 Rasmussen has a poll asking people, has Biden kept his campaign promises?
00:24:37.040 More, less, or about the same as most presidents?
00:24:40.620 30% said he's done better than most presidents.
00:24:44.320 41% said he's done less well in terms of keeping his promises.
00:24:50.560 25% about the same.
00:24:53.460 25%.
00:24:54.100 But here's the interesting part.
00:25:01.940 In every poll, every political poll, and even a lot of scientific-related polls,
00:25:08.700 we expect this gigantic difference between conservatives and liberals.
00:25:13.060 And sure enough, 63% of conservatives think Biden has done less well than other presidents at keeping his promises.
00:25:22.700 And yet a fairly similar share of liberals think he has kept them.
00:25:28.000 So the people who like Biden the least, the conservatives, think he's not keeping his promises.
00:25:35.860 Isn't that good?
00:25:39.140 Wait a minute.
00:25:40.220 If the people who don't like Biden believe he's not keeping his promises, then they should be kind of happy.
00:25:47.080 But wait, 58% of liberals think he is keeping his promises better than usual.
00:25:52.420 So they're happy too.
00:25:53.700 How weird is it that both the conservatives and the liberals got what they wanted, as far as they can tell?
00:26:02.180 The conservatives think they got what they wanted because he's not doing his promises.
00:26:06.820 And the liberals think they did get what they wanted because he's keeping his promises.
00:26:12.480 They're both happy.
00:26:13.480 You know, something like 60%-ish of both groups kind of getting what they want.
00:26:21.340 I mean, indirectly, not completely.
00:26:24.640 So there's some good news.
00:26:27.700 See, I told you I was going to put the optimistic filter on everything.
00:26:31.860 That's what that was.
00:26:33.580 Yes, we're going to talk about Governor Cuomo resigning.
00:26:36.820 Why do you think Cuomo had to resign with these allegations, but Trump did not and went on to win the presidency?
00:26:45.780 In the comments, let's put the hypnosis filter on it yourself.
00:26:51.040 Tell me what was different.
00:26:53.140 Why did Trump survive allegations and Cuomo did not?
00:27:00.300 I see somebody saying the brand.
00:27:03.060 Close.
00:27:04.160 And maybe that's correct.
00:27:06.820 In office when it happened?
00:27:09.600 Okay.
00:27:10.740 I don't know if that makes a difference.
00:27:13.720 Damn it.
00:27:14.380 I just lost my Locals feed.
00:27:16.020 28-minute cutoff.
00:27:18.160 But they're used to it now.
00:27:20.520 So nursing home deaths?
00:27:22.640 Oh, that's...
00:27:23.240 Yeah.
00:27:24.060 The nursing home deaths might have been the real reason, huh?
00:27:26.880 And people just used whatever they could.
00:27:36.820 So it could be that it's a fake "because," right?
00:27:36.820 Hello, locals.
00:27:37.680 You're back.
00:27:40.740 So, yeah, I'll give you the answer.
00:27:42.460 You didn't miss the answer.
00:27:45.080 Here's the answer.
00:27:46.660 Trump inoculated you.
00:27:49.240 He inoculated you.
00:27:51.840 Trump...
00:27:52.600 Trump gave you the vaccination, oddly enough.
00:27:56.980 You were vaccinated against accusations about Trump.
00:28:01.320 Because what did Trump tell you directly?
00:28:04.440 My favorite thing that Trump has ever said,
00:28:07.760 at any time, president or pre-president or ex-president,
00:28:11.040 my favorite thing he ever said was,
00:28:13.460 in public,
00:28:14.980 I'm no angel.
00:28:18.040 His actual words, right?
00:28:20.360 I'm no angel.
00:28:21.040 And was there anybody who,
00:28:24.200 when he entered,
00:28:25.140 when he entered the race,
00:28:26.480 is there anybody who thought there wouldn't be
00:28:28.400 allegations of sexual something?
00:28:32.080 Nobody thought that.
00:28:33.780 Everybody who lives in the real world said,
00:28:35.840 well, wait for these stories.
00:28:37.580 We're going to see some good stories here.
00:28:39.760 And sure enough,
00:28:41.000 the stories were just as good as we thought they would be.
00:28:44.020 You got your porn star?
00:28:46.060 Check.
00:28:46.920 You got your Playboy Playmate of the Year?
00:28:49.900 Check.
00:28:50.340 You got your allegations of something horrible
00:28:54.140 happening in a dressing room?
00:28:56.200 Well, I didn't see that one coming exactly,
00:28:57.900 but something like that was going to come.
00:29:00.280 Check.
00:29:01.280 You got it all.
00:29:03.380 Now, you could argue that,
00:29:06.080 and I think I would argue this,
00:29:08.120 that the allegations against Trump
00:29:10.040 felt less credible.
00:29:13.940 All right?
00:29:14.400 Now, I'm not going to say that none of it's true,
00:29:16.520 because I wasn't there.
00:29:17.360 And I do agree with the kind of general statement
00:29:21.260 that you should take people seriously
00:29:22.980 when they make allegations.
00:29:24.640 But they didn't feel as true
00:29:27.140 as the Cuomo ones.
00:29:29.640 Now, maybe that's my bias.
00:29:31.420 I don't know.
00:29:32.000 Maybe you'd have a different view.
00:29:33.300 Maybe Democrats and Republicans
00:29:36.620 have different views on that.
00:29:37.900 But it didn't seem as credible.
00:29:40.720 They felt a little manufactured,
00:29:43.020 even if they were true.
00:29:44.260 They didn't feel as true.
00:29:46.400 So I would say that the biggest thing
00:29:49.300 that Trump did right is he inoculated you,
00:29:51.900 and you knew that his brand was consistent
00:29:54.580 with being kind of that guy,
00:29:57.480 like it or not.
00:29:58.480 Whereas Cuomo was sort of the lefty,
00:30:03.880 support all women,
00:30:05.700 he shouldn't have been that guy.
00:30:07.480 So they had to get rid of their own guy
00:30:09.600 for being too much like Trump.
00:30:15.120 All right, so here are the things
00:30:17.540 that won't get you fired at CNN.
00:30:19.280 Of course, the big story that we want to talk about
00:30:22.100 is Chris Cuomo interviewing his brother
00:30:25.380 and, you know, being basically a cheerleader
00:30:28.480 instead of asking him hard questions
00:30:30.680 and, you know, it's inappropriate
00:30:32.980 and, you know, Chris Cuomo was advising his brother
00:30:37.100 so it's extra creepy, blah, blah, blah.
00:30:40.060 Here's my take on that.
00:30:42.280 I don't care.
00:30:44.640 Why do you care about that?
00:30:47.060 Why do you care that Chris Cuomo
00:30:49.720 is the brother of Andrew Cuomo
00:30:52.200 and that Chris Cuomo treated him extra, extra nice on TV?
00:30:57.740 Why do you care about that?
00:30:59.600 Was there anybody there who thought
00:31:01.320 that a brother was not going to talk to his brother
00:31:06.540 or give advice?
00:31:08.580 Nobody.
00:31:10.060 Nobody.
00:31:10.800 There's not one person in the world
00:31:12.100 who thought that was reasonable,
00:31:14.000 that a brother doesn't talk to another brother
00:31:15.820 under any conditions.
00:31:18.060 I don't care who it is.
00:31:19.460 I don't care if it's part of the legal system.
00:31:23.320 You know, I don't care if they've got conflict of interest.
00:31:26.920 Under no circumstance
00:31:28.780 do you tell me that there's something wrong
00:31:32.340 with a brother talking to a brother.
00:31:34.220 I'm sorry.
00:31:35.420 That's an absolute.
00:31:37.220 There's some things that are just absolutes.
00:31:40.280 That's a fucking absolute.
00:31:41.900 Sorry.
00:31:42.840 I'm trying to curse less.
00:31:44.240 That's just an absolute.
00:31:45.320 So if you're getting worked up about a brother
00:31:49.040 talking to a brother,
00:31:50.820 I mean, check your thinking there.
00:31:53.800 It's not a standard you want.
00:31:55.860 I mean, you don't want that applied to you.
00:31:57.920 So I can't get mad about that.
00:32:00.180 Nor can I get mad that, you know,
00:32:03.600 it was, you know,
00:32:05.300 such nice treatment of his brother.
00:32:07.700 Because what do you expect?
00:32:09.660 Secondly, I found it entertaining.
00:32:11.600 CNN is an entertainment network
00:32:15.920 as much as a news network.
00:32:17.540 Maybe more entertainment, you could argue.
00:32:19.680 I personally enjoyed watching the brothers talk
00:32:23.380 because it was like an extra layer
00:32:25.000 of something interesting.
00:32:26.340 But as long as you know it's an opinion program,
00:32:29.540 who cares?
00:32:32.220 If it were the news segment,
00:32:38.040 I would say that's a big problem.
00:32:39.240 But if something is clearly an opinion show
00:32:42.420 and somebody in an opinion show
00:32:44.500 is talking to their brother,
00:32:47.380 is there anybody who doesn't understand
00:32:49.280 that that's supposed to be biased?
00:32:52.140 I don't know.
00:32:52.700 I just don't have any problem
00:32:53.940 with Chris Cuomo on any of this.
00:32:57.720 So, I mean, it'd be fun if I did, right?
00:33:00.660 You'd probably enjoy it more.
00:33:02.020 It'd be more entertaining.
00:33:02.980 But really?
00:33:04.300 Now, let me ask you this.
00:33:05.600 In brainwashing news.
00:33:11.660 Brainwashing news.
00:33:14.880 How is it that we have been convinced
00:33:17.240 that some of the most important news in the country
00:33:19.780 has to do with what Tucker Carlson says or does
00:33:23.320 and Chris Cuomo says or does,
00:33:26.340 who are basically competitors, right?
00:33:28.740 The two competing networks.
00:33:30.420 Are the time slots the same?
00:33:32.700 Somebody give me a fact check.
00:33:34.100 Is Chris Cuomo's show in the same
00:33:37.320 time slot as Tucker Carlson's?
00:33:39.660 I think it is, right?
00:33:41.720 Somebody tell me for sure.
00:33:48.020 Looks like I got some crackling audio over on...
00:33:53.100 over on Locals.
00:33:56.740 I'll play with my cables a little bit
00:33:58.280 and see if anything changes.
00:34:01.340 You'll tell me.
00:34:02.160 All right.
00:34:04.720 So, if you think that it's natural
00:34:07.040 that we're all talking about Chris Cuomo
00:34:09.320 and Tucker Carlson
00:34:10.500 when they are the two opinion people
00:34:13.360 on the two, I would say,
00:34:14.540 the most news-making entities right now.
00:34:17.600 If Fox or CNN says something's news,
00:34:20.440 then it's news, I guess.
00:34:23.140 You've been brainwashed.
00:34:26.240 You've been brainwashed.
00:34:27.440 Because these are not important things.
00:34:31.960 What Chris Cuomo says about his brother,
00:34:34.460 totally unimportant.
00:34:35.780 What Tucker Carlson says is very entertaining
00:34:38.380 and often, you know,
00:34:39.860 some of the bravest stuff that's on TV.
00:34:43.000 But I don't know that you should treat that
00:34:45.940 as one of your top priorities,
00:34:47.740 but it feels like it
00:34:48.860 because they're in the news.
00:34:50.400 So, just consider that
00:34:51.960 what you consider to be important
00:34:53.920 and worthy of your brain cycles
00:34:56.760 has been assigned to you.
00:34:59.540 The thought that Tucker Carlson
00:35:02.740 and Chris Cuomo,
00:35:04.420 you know, that they're the story,
00:35:06.480 as awesome as they both are,
00:35:07.840 I think they're both great at their jobs individually,
00:35:12.240 you've been brainwashed.
00:35:14.120 They're not important,
00:35:15.500 but great.
00:35:16.180 They're great at their jobs.
00:35:17.060 I'd like to give you my impression
00:35:22.140 of talking about the infrastructure bill,
00:35:25.520 which has gone through
00:35:26.560 some of the first procedural hurdles.
00:35:29.720 I'm going to try to do this
00:35:31.160 while staying awake
00:35:32.360 to the end of the sentence.
00:35:34.720 All right?
00:35:35.000 So, I haven't tried this before in public,
00:35:37.520 so this could be embarrassing,
00:35:39.500 but I'm going to read a sentence
00:35:40.980 about the infrastructure bill.
00:35:43.220 I'm going to try to remain awake
00:35:46.060 and conscious
00:35:47.320 from the beginning of the sentence
00:35:49.580 to the end of the sentence,
00:35:50.800 and I'm going to see if I can do that.
00:35:57.060 Ready.
00:35:58.820 All right.
00:36:00.260 A 15-hour vote-a-rama,
00:36:02.960 which is a procedural move,
00:36:05.100 to pass the framework
00:36:07.900 of the reconciliation
00:36:09.800 package
00:36:14.680 for the infrastructure...
00:36:18.100 Were we talking about
00:36:29.340 the infrastructure bill
00:36:30.320 or something?
00:36:31.940 Something about money or something?
00:36:33.700 I don't know.
00:36:34.560 I don't even remember
00:36:35.560 what we were talking about.
00:36:37.440 So, that's what I have to say
00:36:38.920 about the infrastructure bill.
00:36:39.960 Now, I would put myself
00:36:44.140 in perhaps the top...
00:36:48.600 Let's do a fact check on this.
00:36:50.340 Let's see if you agree
00:36:51.340 or disagree with this
00:36:52.300 following assumption.
00:36:53.880 I think I'm in the top 5%
00:36:56.020 of people who follow the news.
00:36:59.560 I'm not sure that's the same
00:37:00.860 as being well-informed,
00:37:02.080 but let's say I'm in the top 5%
00:37:03.620 of people who follow
00:37:04.780 most of the big news stories.
00:37:06.640 I have no idea
00:37:08.640 what's in the infrastructure bill.
00:37:10.860 Do you?
00:37:12.200 I mean, I know the headline stuff.
00:37:14.780 Oh, a bunch of socialism stuff
00:37:16.620 got in there,
00:37:17.460 and why are we calling
00:37:18.780 this infrastructure?
00:37:22.020 And there's some stuff
00:37:23.300 that's in there,
00:37:24.020 but then it's taken out,
00:37:25.140 and then the Republicans
00:37:26.240 took it out,
00:37:27.040 but then Nancy Pelosi
00:37:29.540 might put it in
00:37:30.440 with reconciliation,
00:37:31.660 and what the hell
00:37:32.220 is reconciliation anyway?
00:37:33.500 Does anybody understand
00:37:37.260 what's in the infrastructure bill?
00:37:39.380 Do you know why
00:37:40.260 the public should oppose
00:37:41.700 the infrastructure bill?
00:37:43.740 Because they can't
00:37:44.580 explain it to us.
00:37:47.500 Basically, you should oppose
00:37:48.660 anything that nobody
00:37:49.620 will explain to you
00:37:50.480 in a way that you can
00:37:51.740 understand it
00:37:52.560 in some easy way.
00:37:54.000 Now, I have seen,
00:37:54.760 I think in the New York Times.
00:37:56.680 I've seen some graphs
00:37:57.740 that show what's in
00:37:58.520 and what's out,
00:37:59.100 and I've looked at those graphs,
00:38:00.300 and I didn't really understand them.
00:38:01.420 Somehow, they made it
00:38:03.200 too complicated.
00:38:04.660 I feel as if there's
00:38:06.020 some way to tell this story
00:38:07.620 that you'd understand,
00:38:10.160 but it hasn't happened.
00:38:11.920 Yeah, it's like Obamacare.
00:38:13.340 Once something reaches
00:38:14.380 a level of complication,
00:38:17.100 and the public is essentially
00:38:18.960 taken out of the process,
00:38:21.620 because we just don't know
00:38:22.540 what's in there,
00:38:23.220 what should be in there,
00:38:24.140 how the process works,
00:38:25.240 anything.
00:38:27.060 So I think the public
00:38:28.040 can't help on this one.
00:38:28.840 That's part of the reason
00:38:31.140 that I think,
00:38:32.220 whatever optimism you have
00:38:34.280 about this getting passed,
00:38:36.380 I don't know.
00:38:37.740 I don't know.
00:38:38.740 We'll see.
00:38:42.840 So here's something
00:38:44.760 that really makes you question
00:38:46.680 the simulation
00:38:47.440 or reality itself.
00:38:49.360 Why is it that the idea
00:38:51.920 that you follow the money
00:38:53.260 to understand
00:38:54.500 what's really happening
00:38:55.420 in the world,
00:38:56.420 why does it keep working?
00:38:58.840 Especially for prediction.
00:39:00.920 Even when it doesn't look like
00:39:02.720 it should be the reason
00:39:03.640 for anything,
00:39:04.780 it keeps working.
00:39:06.560 So this is what I told you
00:39:07.600 about prediction.
00:39:09.020 Even if you think
00:39:09.940 the main reasons
00:39:11.420 for why something will
00:39:12.660 or will not happen
00:39:13.620 in the future
00:39:14.160 have nothing to do
00:39:15.540 with money,
00:39:17.020 watch how often
00:39:18.060 the money
00:39:19.580 coincidentally
00:39:20.720 also predicts.
00:39:24.140 Let me give you
00:39:24.940 some examples.
00:39:25.520 What were the odds
00:39:29.220 that we would need
00:39:30.480 booster shots
00:39:31.400 for the vaccinations,
00:39:33.140 which would be
00:39:34.620 wildly profitable
00:39:35.860 for the people
00:39:37.460 who make booster shots
00:39:39.020 and the vaccinations?
00:39:41.600 Now,
00:39:42.220 I don't think
00:39:43.000 the reason
00:39:43.720 that we need
00:39:45.000 booster shots
00:39:45.760 is because
00:39:46.700 it was all
00:39:47.240 a plan
00:39:47.960 to make them
00:39:48.600 ineffective
00:39:49.140 so they could
00:39:49.700 make more money
00:39:50.300 with booster shots.
00:39:51.260 I don't think
00:39:51.820 that's the reason.
00:39:52.420 But why is it
00:39:54.480 that if you predicted
00:39:55.660 using money
00:39:56.740 you would have
00:39:57.840 gotten the right answer?
00:40:00.140 Right?
00:40:01.700 Let's take another one,
00:40:02.840 the therapeutics.
00:40:04.400 We know that
00:40:05.420 if the therapeutics
00:40:06.520 were excellent
00:40:07.400 and they all worked,
00:40:09.420 that the vaccination makers
00:40:11.260 would probably
00:40:11.840 make less money
00:40:12.580 and the pharma
00:40:14.940 in general
00:40:15.400 would probably
00:40:15.800 make less money
00:40:16.560 if it turned out
00:40:18.700 that any of the
00:40:19.780 cheap generics
00:40:20.840 or whatever worked.
00:40:22.420 So it turns out
00:40:23.500 that we're told
00:40:24.640 that the therapeutics
00:40:25.620 don't work well enough
00:40:26.840 to make vaccinations
00:40:28.760 unnecessary.
00:40:30.540 Now,
00:40:31.620 is the reason
00:40:32.720 they told us that
00:40:33.660 because of money?
00:40:36.140 I don't think so.
00:40:38.260 I think the reason
00:40:39.100 is that the therapeutics
00:40:40.640 didn't pass
00:40:41.420 enough scientific
00:40:43.260 rigor,
00:40:44.980 I think.
00:40:46.660 But
00:40:47.220 why did money
00:40:48.760 predict exactly
00:40:49.760 what would happen?
00:40:51.180 Again,
00:40:52.420 is it a coincidence?
00:40:54.900 Watch how many times
00:40:56.200 you're sure
00:40:56.840 that the reason
00:40:57.480 is something else
00:40:58.480 and you're really
00:40:59.800 sure of it.
00:41:01.500 But yet,
00:41:02.280 money predicted it.
00:41:04.800 Coincidence?
00:41:07.820 I don't know.
00:41:08.620 It's something to watch.
00:41:10.700 How about
00:41:11.100 Afghanistan?
00:41:14.180 Given what we were
00:41:15.080 spending there
00:41:15.840 and given that
00:41:17.100 we weren't really,
00:41:18.640 didn't seem like
00:41:19.220 we were making
00:41:19.720 any major progress
00:41:21.060 in a permanent solution,
00:41:23.040 could you have
00:41:23.660 predicted that we
00:41:24.460 would eventually
00:41:25.000 move out of Afghanistan
00:41:26.200 because it just
00:41:26.920 cost too much?
00:41:28.820 Yes.
00:41:29.680 Yes,
00:41:30.000 money would predict
00:41:31.020 that you couldn't
00:41:32.300 stay there forever.
00:41:33.040 And
00:41:34.680 what about
00:41:35.780 the future
00:41:36.480 of China?
00:41:39.060 Well,
00:41:39.680 I think that
00:41:40.220 given the
00:41:40.660 reputational problem
00:41:41.740 and the likely,
00:41:42.960 you know,
00:41:43.280 the Wuhan lab
00:41:44.100 connection
00:41:44.600 that's likely
00:41:45.200 to be
00:41:45.760 demonstrated
00:41:46.600 with some
00:41:48.300 greater level
00:41:49.580 of certainty
00:41:50.040 in the future,
00:41:51.360 I feel like
00:41:52.320 China is doomed.
00:41:53.960 If you were
00:41:54.660 to follow
00:41:55.060 the money,
00:41:56.160 the money says
00:41:57.080 it's going to
00:41:57.520 flow out of China
00:41:58.560 because
00:41:59.660 they'll just
00:42:00.660 be too toxic.
00:42:01.420 So I'm going
00:42:02.720 to predict,
00:42:03.480 as I have,
00:42:04.440 that China's
00:42:04.960 got some
00:42:05.320 big problems,
00:42:06.720 like so big
00:42:07.540 they don't really
00:42:08.500 have any sense
00:42:09.640 of how big
00:42:10.080 these problems
00:42:10.620 are going to be.
00:42:11.800 That's what
00:42:12.280 the money says.
00:42:13.980 All right,
00:42:14.300 here's some good news.
00:42:15.260 I just saw
00:42:15.800 a video on
00:42:16.820 Mashable,
00:42:17.580 which is a good
00:42:18.280 thing to follow
00:42:18.880 on Twitter,
00:42:19.860 Mashable.
00:42:21.500 There's a
00:42:22.140 pepper harvesting
00:42:22.840 robot.
00:42:23.800 Pepper being
00:42:24.440 the plant
00:42:25.800 that makes
00:42:26.260 a big pepper.
00:42:27.540 And they showed
00:42:28.040 the robot,
00:42:28.680 it could identify
00:42:29.400 a pepper
00:42:29.860 and it would
00:42:31.180 know it was
00:42:31.660 ripe and
00:42:32.060 would pick
00:42:32.420 it off the
00:42:32.800 vine.
00:42:33.960 So here's
00:42:34.700 what's coming.
00:42:37.120 Farming is
00:42:37.800 pretty expensive.
00:42:39.400 And there are
00:42:39.980 a bunch of
00:42:40.400 risks and
00:42:41.900 expenses,
00:42:42.740 especially if
00:42:43.760 you're doing
00:42:44.020 it outdoors.
00:42:45.160 So if you're
00:42:45.600 doing it
00:42:46.420 outdoors,
00:42:46.960 you've got to
00:42:47.200 worry about
00:42:47.560 weeds,
00:42:48.300 bug sprays,
00:42:49.600 weather damage,
00:42:50.640 flooding and
00:42:51.420 hurricanes and
00:42:52.140 every other
00:42:52.980 thing,
00:42:54.380 lack of water.
00:42:56.380 You've got to
00:42:56.780 worry about
00:42:57.140 labor costs,
00:42:58.060 and then you've
00:42:58.340 got to ship
00:42:59.740 it from your
00:43:00.400 land to
00:43:01.100 wherever the
00:43:01.760 city is.
00:43:02.900 So you've got
00:43:03.240 all these
00:43:03.600 costs.
00:43:05.200 Most of them
00:43:05.900 could go away
00:43:06.720 if indoor
00:43:07.980 farming became
00:43:09.060 more modular.
00:43:10.660 By modular,
00:43:11.600 I mean that
00:43:12.080 somebody makes,
00:43:14.060 like the
00:43:15.020 Boxabl example,
00:43:16.860 a little indoor
00:43:17.640 farm that can
00:43:19.240 connect side to
00:43:21.220 side with as
00:43:21.940 many other
00:43:22.420 little indoor
00:43:23.020 farms as you
00:43:23.680 want.
00:43:23.980 You can make it
00:43:24.360 as long as
00:43:25.340 you want.
00:43:25.700 And once
00:43:28.180 they get
00:43:28.520 the making
00:43:29.560 of the
00:43:29.940 actual farm,
00:43:31.480 which is
00:43:31.880 basically just
00:43:32.580 a glass
00:43:33.200 container with
00:43:34.100 some utilities,
00:43:36.980 once you
00:43:38.500 could make
00:43:38.860 those cheaply
00:43:39.680 and in
00:43:40.080 mass units,
00:43:41.240 and then
00:43:41.460 once you've
00:43:41.940 got...
00:43:43.700 Oh,
00:43:46.220 shit.
00:43:47.680 All right.
00:43:48.820 Problem here.
00:43:50.640 I've got to
00:43:51.480 take care of it
00:43:52.180 in a minute.
00:43:52.480 All right.
00:43:57.340 So I think
00:43:58.340 the cost of
00:43:59.320 farming may
00:44:00.040 go way down
00:44:00.840 with...
00:44:05.480 Okay.
00:44:07.700 Somebody needs
00:44:08.560 to know that
00:44:09.140 I'm live
00:44:09.560 streaming right
00:44:10.920 now.
00:44:17.520 Sorry.
00:44:18.980 See, my
00:44:19.360 personal life
00:44:19.980 intrudes
00:44:20.880 sometimes into
00:44:21.560 my professional
00:44:22.200 until...
00:44:26.200 Okay.
00:44:27.920 Looks like
00:44:28.660 we'll be
00:44:29.020 wrapping up
00:44:29.500 here pretty
00:44:29.840 quickly.
00:44:30.660 All right.
00:44:30.900 Let's just
00:44:31.760 finish a few
00:44:32.280 things.
00:44:34.320 In
00:44:34.800 brainwashing
00:44:35.420 news,
00:44:36.000 we're hearing
00:44:36.760 that the
00:44:37.140 Florida
00:44:37.420 health care
00:44:37.820 workers are
00:44:38.340 exhausted
00:44:38.760 and angry.
00:44:40.380 And here's
00:44:42.620 the thing.
00:44:43.500 Is that
00:44:43.900 good persuasion?
00:44:45.220 The people
00:44:45.660 who want you
00:44:46.120 to get
00:44:46.400 vaccinated
00:44:46.880 are doing
00:44:47.300 stories about
00:44:48.140 health care
00:44:49.220 workers who
00:44:49.840 are exhausted
00:44:50.340 and angry.
00:44:51.020 Will that
00:44:51.400 convince you
00:44:54.260 that you
00:44:55.560 better get
00:44:55.980 vaccinated
00:44:56.500 because the
00:44:57.080 health care
00:44:57.480 workers are
00:44:58.040 overworked?
00:44:59.460 Nope.
00:45:00.360 Nope.
00:45:01.160 Do you know
00:45:01.580 what doesn't
00:45:02.380 matter to
00:45:03.120 anybody?
00:45:04.400 That your
00:45:05.100 job is hard.
00:45:07.620 There's nothing
00:45:08.500 that matters
00:45:09.060 less to other
00:45:10.760 people than
00:45:12.280 your job is
00:45:13.080 really hard.
00:45:13.700 You worked
00:45:14.060 hard today.
00:45:15.000 You have
00:45:15.500 stress.
00:45:16.640 You might have
00:45:17.280 to quit.
00:45:18.340 Very, very
00:45:18.900 hard, long
00:45:20.540 hours.
00:45:21.760 Nobody cares.
00:45:23.260 Because you
00:45:23.700 know who
00:45:24.200 else has a
00:45:24.700 hard job?
00:45:26.020 You do.
00:45:28.480 Everybody
00:45:29.000 thinks a
00:45:29.480 job is hard.
00:45:30.740 So the
00:45:31.060 persuasion
00:45:31.580 we're seeing,
00:45:32.440 which is
00:45:32.820 mostly limited
00:45:33.780 to the
00:45:34.240 health care workers,
00:45:35.720 nobody cares.
00:45:37.080 You do care
00:45:37.720 if you don't
00:45:38.160 get to go
00:45:38.640 in.
00:45:39.220 So here's
00:45:39.580 what they
00:45:39.880 should do.
00:45:40.820 Just like
00:45:41.440 the Vietnam
00:45:42.040 War was
00:45:42.700 ended, I
00:45:43.300 think, because
00:45:44.200 of all the
00:45:44.680 news coverage,
00:45:45.700 they gave
00:45:45.960 you lots of
00:45:46.400 video and
00:45:46.960 scary stuff
00:45:47.640 and blood.
00:45:48.760 I think
00:45:49.280 the news
00:45:49.840 people need
00:45:50.660 to park
00:45:51.140 outside the
00:45:52.020 emergency
00:45:53.100 rooms and
00:45:54.440 show the
00:45:54.900 ambulances
00:45:55.500 backed up
00:45:56.200 and every
00:45:57.000 day just
00:45:57.580 say, well,
00:45:58.140 here's your
00:45:58.520 local hospital.
00:45:59.920 Ten ambulances
00:46:00.700 backed up.
00:46:01.480 If you came
00:46:02.000 here, you'd
00:46:02.720 have to wait
00:46:03.140 five hours.
00:46:04.620 And you'd
00:46:05.120 probably be
00:46:05.500 transferred
00:46:05.820 somewhere else.
00:46:07.320 That would
00:46:07.960 get your
00:46:08.280 attention, right?
00:46:09.880 Because you
00:46:10.300 would put
00:46:10.600 yourself in
00:46:11.120 the ambulance.
00:46:12.880 You'd say,
00:46:13.300 oh, shoot,
00:46:14.940 if I need
00:46:15.460 to go there,
00:46:16.220 that's my
00:46:16.680 hospital.
00:46:17.640 So I
00:46:18.100 think that
00:46:18.440 would convince
00:46:18.900 you.
00:46:19.200 But you
00:46:19.440 don't care
00:46:19.800 that the
00:46:20.160 doctors are
00:46:20.960 overworked.
00:46:22.880 We do
00:46:23.300 care.
00:46:23.860 I mean, we
00:46:25.020 have empathy
00:46:25.560 for human
00:46:26.140 beings and
00:46:26.760 stuff.
00:46:27.460 But there's
00:46:28.240 nothing less
00:46:28.820 persuasive than
00:46:29.620 that somebody
00:46:30.020 else's job is
00:46:30.740 hard.
00:46:31.320 So stop
00:46:31.780 that.
00:46:33.700 All right.
00:46:37.400 We do have
00:46:38.220 a tale of two
00:46:38.920 pandemics here.
00:46:40.020 People think
00:46:40.560 that there's
00:46:41.160 one happening.
00:46:42.080 And I
00:46:43.600 thought the
00:46:43.940 most interesting
00:46:44.540 thing is I
00:46:45.140 asked them
00:46:45.540 on unscientific
00:46:46.260 polls.
00:46:47.540 And 18%
00:46:48.300 of the people
00:46:48.740 who follow
00:46:49.220 me and
00:46:49.720 answered the
00:46:50.180 poll, so
00:46:50.840 it's very
00:46:51.180 unscientific,
00:46:52.420 said that
00:46:53.060 they knew
00:46:53.900 people who
00:46:54.340 had died
00:46:54.680 personally of
00:46:56.380 COVID.
00:46:57.300 18% of the
00:46:58.400 people who
00:46:58.760 follow me
00:46:59.460 know people
00:47:00.940 who died
00:47:01.560 of COVID.
00:47:03.800 That's a lot.
00:47:05.540 All right.
00:47:05.700 But here's
00:47:07.360 the interesting
00:47:07.800 part.
00:47:08.280 I asked the
00:47:08.740 people, another
00:47:09.620 poll, just the
00:47:10.960 people who are
00:47:11.740 not vaccinated.
00:47:13.900 How many
00:47:14.300 people who are
00:47:14.880 not vaccinated
00:47:15.880 know somebody
00:47:16.840 personally who
00:47:17.560 has died?
00:47:18.820 Half as many.
00:47:20.900 Closer to
00:47:21.600 9%, 10%.
00:47:22.460 So there seems
00:47:24.200 to be, and
00:47:24.700 this is deeply
00:47:25.360 unscientific
00:47:26.140 polling, but
00:47:26.960 my curiosity
00:47:28.360 starts with
00:47:29.020 this.
00:47:29.900 Is it that
00:47:30.920 knowing somebody
00:47:31.800 who died of
00:47:32.520 COVID makes
00:47:33.120 it real to
00:47:33.620 you?
00:47:34.740 Because that's
00:47:35.240 persuasion.
00:47:36.620 So the
00:47:36.980 hypnotist says
00:47:37.880 you need to
00:47:38.980 know somebody
00:47:39.520 who died or
00:47:41.300 it's not
00:47:42.600 going to
00:47:42.760 work.
00:47:43.400 You could
00:47:43.880 digitally age
00:47:44.880 somebody and
00:47:45.380 put them in
00:47:45.700 the hospital and
00:47:46.480 that would
00:47:46.740 convince them
00:47:47.260 too.
00:47:47.980 Take
00:47:48.260 somebody's
00:47:48.640 actual photo
00:47:49.220 and say,
00:47:49.560 well, here's
00:47:50.040 you in the
00:47:50.360 hospital gasping
00:47:51.540 for air.
00:47:54.780 Looks like
00:47:55.540 I'm going to
00:47:55.900 need to go
00:47:56.360 and go
00:47:58.600 in right
00:47:59.940 here.
00:48:01.520 First day of
00:48:02.260 school, you
00:48:02.920 can imagine.
00:48:03.680 Got to go.