Real Coffee with Scott Adams - January 30, 2023


Episode 2004 Scott Adams: Alternative WEF, Dogbert Takes On Canada, Biden Documents Scandal & More


Episode Stats

Length

1 hour and 12 minutes

Words per Minute

145.1

Word Count

10,475

Sentence Count

766

Misogynist Sentences

20

Hate Speech Sentences

13


Summary

The good news is that the drought is over in California. The bad news is that it means you won't be able to go outdoors in the summer. Also, CNN's ratings have reached a new nine-year low.


Transcript

00:00:00.320 Good morning, everybody, and welcome to the highlight of civilization, the best thing
00:00:05.380 that'll ever happen to you in your whole damn life.
00:00:07.560 And if you'd like to increase your enjoyment level, and now it's hard to imagine you could
00:00:13.680 even do it at this point, we're reaching such lofty levels.
00:00:17.720 Well, there's one thing you can try, one last thing you can try to increase your enjoyment,
00:00:23.600 and all you need is a cup or mug or a glass, a tank or chalice or stein, a canteen jug
00:00:28.100 or flask, a vessel of any kind, fill it with your favorite liquid.
00:00:32.460 I like coffee.
00:00:34.340 And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that
00:00:38.080 makes everything better.
00:00:39.300 It's called the simultaneous sip, and it happens now.
00:00:43.400 Go.
00:00:48.100 Ah.
00:00:50.200 Let me ask you a question, fine people.
00:00:52.840 Do you believe you can tell the difference between good news and bad news with your common
00:01:02.620 sense and your judgment and your instincts and your heuristics?
00:01:10.000 Well, sometimes it's not so clear.
00:01:13.240 Sometimes we can't tell the difference between good news and bad news.
00:01:17.020 For example, California is suffering a drought for a number of years.
00:01:23.860 What would be good news if you were suffering a drought?
00:01:29.420 Rain.
00:01:31.140 Not just rain, but a lot of rain.
00:01:34.700 A lot of rain.
00:01:36.000 In fact, California got so much rain that it made a dent in the drought.
00:01:43.180 So the news today is the dangerous side to this.
00:01:51.080 Oh yeah, there's a dangerous side to not having a drought.
00:01:55.040 It's that the plants and trees will grow so quickly with all that water that when they
00:02:00.220 inevitably dry out in the summer, it's going to be a forest fire catastrophe.
00:02:08.020 Right.
00:02:09.260 So the good news is the drought's over.
00:02:11.640 The bad news is we'll go up in flames and you won't be able to go outdoors in the summer.
00:02:18.260 So is there anything that Tinder can't ruin?
00:02:22.900 I'll just let that one sit there for a while.
00:02:25.420 Is there anything that Tinder can't ruin?
00:02:29.500 You could just fill in the rest yourself at home.
00:02:31.580 Well, I saw a review, I think it was on CNN, that said there's a new show on that's apparently
00:02:40.960 a banger.
00:02:41.840 It's a banger.
00:02:43.080 I think that's like a British word for good.
00:02:46.160 And I said to myself, wow, are you kidding me?
00:02:49.400 There's something worth watching on television?
00:02:51.320 But before I read the review, before I read the review, I realized what was going on.
00:03:00.120 What does it mean in 2023 when there's a movie or a TV series that's excellent?
00:03:08.520 What's that mean?
00:03:09.840 It's really, really good.
00:03:11.220 Yeah, it was basically Brokeback Mountain.
00:03:16.460 Yeah.
00:03:16.960 So it had to be either Holocaust or Brokeback Mountain, had to be LGBTQ, had to be a trans
00:03:23.560 in there somewhere.
00:03:25.720 Right?
00:03:26.560 So I'm reading the review.
00:03:27.720 It's like, and the love story between these two men.
00:03:30.800 Oh, okay, I get it.
00:03:32.020 Best thing that's ever been on television.
00:03:33.700 Because it's a love story between two men.
00:03:36.780 And that's all you need to know.
00:03:38.140 Best thing on television ever since the beginning of time.
00:03:40.680 Probably win a lot of awards.
00:03:46.000 CNN's ratings have reached a new nine-year low.
00:03:51.880 And this is why we call Greg Gutfeld the Thanos of news.
00:03:57.920 He's destroyed half of his competition already.
00:04:00.820 But that's only part of the story.
00:04:03.700 Apparently, CNN notched its, I'm sorry, a Megyn Kelly tweet.
00:04:13.600 I saw her talking about it.
00:04:14.720 She said it got its lowest ratings in nine years, but it's also the lowest rating across
00:04:19.700 every category of the day.
00:04:22.040 So every hour of the day, a new low.
00:04:25.880 Now, here's the funny thing.
00:04:27.000 On Twitter, you can read the comments of the ex-CNN viewers and how unhappy they are.
00:04:33.700 Do you know why they're so unhappy?
00:04:36.260 Why are the CNN viewers so unhappy with CNN?
00:04:41.620 It started telling real news.
00:04:43.840 Not completely.
00:04:45.940 They still have a good pocket of bias over there that comes out pretty obviously.
00:04:51.840 But they made a concerted, and I think legitimate, move toward actual news, where they didn't just
00:05:00.400 lie about conservatives.
00:05:01.360 And then the people are like, what?
00:05:06.820 What?
00:05:07.860 Yeah.
00:05:08.540 So it turns out you can't make money telling people the truth or attempting to tell them the truth.
00:05:15.260 You could argue whether they get it right.
00:05:16.740 But attempting to tell people a calm, objective truth is terrible for your ratings.
00:05:25.780 Do you know how I know that?
00:05:27.960 Does anybody know how I know that attempting to tell the truth, even if you don't do it right,
00:05:34.440 but attempting to tell the truth, you know that's terrible for ratings, right?
00:05:38.460 It's certainly bad for mine.
00:05:42.580 Now, luckily, I have, you know, a death wish.
00:05:46.660 You know, that's not literal.
00:05:48.840 I don't have a literal death wish.
00:05:51.220 And by the way, I'm not planning to commit suicide.
00:05:53.940 So let me put that out there.
00:05:57.040 I am not planning to commit suicide.
00:05:59.580 Feeling good.
00:06:00.740 Feeling great, actually.
00:06:02.440 Now that I'm off my blood pressure meds, they were definitely screwing with my mind.
00:06:06.320 But I want you to know, I'm not planning to kill myself.
00:06:09.820 Because I do think there's a reasonably good chance I'll get murdered in the next two years.
00:06:14.720 I mean, reasonable chance meaning 5%.
00:06:17.820 Maybe 5%.
00:06:19.500 But that's, you know, more than you want to worry about.
00:06:21.740 Well, you probably don't have a 5% chance of getting murdered.
00:06:25.940 But I have a 5% chance of getting murdered for political reasons.
00:06:29.840 Because I moved the needle.
00:06:33.100 So, anyway.
00:06:37.200 What was I talking about?
00:06:40.240 I completely lost the thread.
00:06:43.500 I don't know.
00:06:43.840 Was it interesting?
00:06:45.200 Was it anything good?
00:06:47.420 Oh, about the news, yeah.
00:06:48.700 So, CNN is losing viewers because they're attempting to tell them the truth.
00:06:55.360 I attempt to tell the truth on my live streams and tweeting.
00:06:59.640 But, of course, everybody just says, well, that's your truth, you big old stupid liar.
00:07:03.900 You gullible idiot.
00:07:05.420 We'll talk more about how dumb I am later.
00:07:09.000 All right.
00:07:09.440 Here's a story that, I swear to God, you really can't tell the difference between parody and reality.
00:07:16.860 If I didn't tell you, if I had not told you in advance that this is real, the following story, would you have believed it?
00:07:25.380 All right.
00:07:25.820 If you didn't know it was real, would you have believed this if you just heard it in the wild?
00:07:29.940 I'm not even sure I believe it, but I saw it today.
00:07:32.160 That the book, 1984, George Orwell's book, famous classic, the estate has approved a request to have it rewritten from a woman's point of view.
00:07:54.720 Finally.
00:07:55.320 Finally.
00:07:55.440 Finally.
00:07:59.520 Yeah.
00:08:00.240 Finally.
00:08:02.460 Finally.
00:08:03.520 I'll tell you, I tried to read that thing.
00:08:06.440 You know I tried to read 1984.
00:08:08.980 I was familiar with it, of course.
00:08:10.820 But I wanted to see if it held up.
00:08:12.840 And I tried to read that thing, and I was like, oh, my God.
00:08:15.780 This book is entirely from a man's point of view.
00:08:19.500 I couldn't even finish it.
00:08:21.460 There were no LGBTQ themes.
00:08:23.920 The Holocaust wasn't even mentioned.
00:08:26.700 Not once.
00:08:30.180 Maybe it was.
00:08:31.460 Maybe it was.
00:08:32.280 Actually, I don't remember.
00:08:35.780 But, you know, the biggest problem was it wasn't from a woman's point of view.
00:08:41.360 And if you could get that fixed, then that book might have some legs.
00:08:45.560 People might want to start talking about that.
00:08:47.860 Well, well, I swear to God, I had something on my, you know, why is it that, have you ever
00:08:59.560 just tried to open an iPad and just leave it there so you could go back to it?
00:09:04.820 And it'll, like, just, like, change its page, and it'll turn off, and it'll come up, and it'll
00:09:10.780 ask for software updates, and, like, why can't it just sit there?
00:09:15.300 Just sit there and do what the fuck?
00:09:17.600 Sorry.
00:09:18.800 There will be no cursing.
00:09:20.940 There will be no cursing.
00:09:22.140 But I'd like to call your attention to today's Dilbert comic, which I just tweeted, and you
00:09:29.880 may recognize this story.
00:09:31.640 It's Dogbert talking to Dilbert.
00:09:34.420 See if you recognize this story from the news.
00:09:37.500 Dogbert's reading his phone, and Dogbert says, the College of Psychologists of Ontario
00:09:42.600 says they will pull my license unless I surrender to a re-education camp.
00:09:47.160 Dilbert says, you don't live in Canada, and you don't have a license to practice.
00:09:53.800 And then Dilbert says, they are oddly aggressive for Canadians.
00:09:57.400 And Dogbert takes a sip of his coffee and says, sounds like a mental disorder.
00:10:03.640 Do you recognize that story?
00:10:06.220 Yeah, the College of Psychologists of Ontario has asked Dr. Jordan Peterson to come in to
00:10:13.940 be re-educated.
00:10:17.160 The best thing that ever happened to the Canadian psychology industry is Jordan Peterson.
00:10:29.080 He actually made their entire profession look worthwhile for a change.
00:10:35.160 He took this thing that didn't have the greatest reputation in the world, you know, the practice
00:10:39.920 of psychology, and turned it into something useful for millions of people around the world.
00:10:45.320 So what do they got to do?
00:10:47.980 Well, you better re-educate that motherfucker, because, sorry, I just couldn't do it.
00:10:53.360 I just couldn't get through it without swearing.
00:10:57.000 But I think you agree that one belonged.
00:10:59.880 That one belonged there.
00:11:02.200 So I had already previewed to Dr. Peterson that I was going to run this comic.
00:11:09.240 He asked for the original.
00:11:12.300 Do you think I said yes?
00:11:15.540 It's a trick question.
00:11:17.700 It's a trick question.
00:11:19.420 No, it's a trick question.
00:11:20.660 Come on.
00:11:21.200 Come on, it's me.
00:11:22.320 You know it's always a trick question.
00:11:24.780 There's no original.
00:11:26.640 There's no such thing as originals.
00:11:28.760 They're all created digitally.
00:11:29.980 But I will send up a nice copy, because I just wanted him to know, I'm in his corner
00:11:38.380 solidly in his fight against the Ontario, or the College of Psychologists of Ontario.
00:11:45.140 And if you are not aware of this, I'm just guessing.
00:11:52.520 I'm just going to take a guess.
00:11:53.700 There's probably a better than 50% chance that the members of the College of Psychologists
00:12:00.040 of Ontario have been sent a few copies of this comic this morning.
00:12:05.180 Probably.
00:12:06.440 I'll bet they got a few copies sent to them.
00:12:08.960 I'll bet when they opened their email, they were, oh, not 400 copies of this damn thing
00:12:13.700 being sent to me.
00:12:14.860 Why do people keep sending me this comic?
00:12:16.640 Why are they mocking us for this totally reasonable effort of ours to re-educate the only person
00:12:25.200 who has made them look good in their entire history?
00:12:31.320 So, anyway.
00:12:33.000 We'll see if the power of Dilbert has any hold.
00:12:36.200 By the way, this will become the subject of a lawsuit.
00:12:39.940 I think, isn't Dr. Peterson taking him to court?
00:12:44.720 Some kind of lawsuit about this?
00:12:46.640 I don't know the details.
00:12:48.640 But I will tell you that the Dilbert comic has been used in lawsuits before, in legal cases.
00:12:55.480 And it's used in the sense of what should be common knowledge.
00:13:00.160 If it appears in a Dilbert comic, that's evidence that the public should know about this.
00:13:07.300 That it's a commonly understood thing.
00:13:09.440 Otherwise, it doesn't become a comic.
00:13:11.340 You know, I wouldn't do a comic about something that, you know, isn't real.
00:13:14.760 Well, I guess I would, but you know the difference.
00:13:18.900 So, it could make a difference.
00:13:22.700 It could make the difference.
00:13:23.820 Because being in a Dilbert comic establishes that it's ridiculous.
00:13:28.520 You get that, right?
00:13:30.220 Like, it wouldn't work as a joke.
00:13:32.840 It wouldn't work unless it was ridiculous on its own.
00:13:35.560 You can't take something that makes sense and make a joke about it.
00:13:38.900 By the way, did you know that?
00:13:40.700 Did you know you can't make a joke about something that's just perfectly reasonable?
00:13:44.300 For example, my company decided to cut expenses because we were losing money.
00:13:55.480 What can I do with that?
00:13:57.720 Nothing.
00:13:59.140 There's no joke you can make about somebody who does the thing you're supposed to do.
00:14:02.980 So, the fact that it's in a Dilbert comic and we can all laugh at it and recognize it as funny.
00:14:08.900 And I didn't change it much.
00:14:10.640 Just that Dogbert had to be re-educated.
00:14:13.340 The fact that they would re-educate anybody is so ridiculous
00:14:16.700 that it just stands on its own as, okay, that's just stupid.
00:14:23.820 All right, enough about that.
00:14:24.980 Do you think that there's a phone addiction that we have
00:14:31.620 and that the reason we check our phones so much is that the phones are really, really good?
00:14:37.480 Is that why we check them?
00:14:38.940 Because they're so addictive and they're good?
00:14:41.800 That's what we all think, right?
00:14:43.280 And it feels that way.
00:14:44.620 When I use my phone, it feels addictive.
00:14:47.260 So, it's perfectly reasonable to assume that's what's going on.
00:14:50.260 Let me throw out another possibility.
00:14:52.100 Have you ever been with somebody you really enjoyed being with
00:14:56.560 and then you noticed you hadn't looked at your phone in hours?
00:15:01.720 Has that ever happened to you?
00:15:03.880 If you're with people or even just one person
00:15:06.540 and you're really happy with what you're doing,
00:15:09.460 you don't even think about using your phone.
00:15:12.660 So, here's the provocative new theory.
00:15:16.600 Phones are not awesome.
00:15:18.500 They're not addictive.
00:15:20.540 People got worse.
00:15:22.100 People started sucking so badly
00:15:26.280 that spending time with them is just painful.
00:15:29.640 But my phone never disappoints me.
00:15:32.580 When was the last time you looked at your phone
00:15:34.100 and it did not give you a hit?
00:15:37.200 It's like my best dealer ever.
00:15:39.780 I'm like, I need a hit of some chemicals here.
00:15:43.720 Scroll, scroll, scroll.
00:15:44.900 Ah, ha, ha, ha.
00:15:45.620 Good.
00:15:46.060 Okay.
00:15:46.980 Scroll, scroll, scroll.
00:15:48.020 Ah, ha, ha, ha.
00:15:49.280 All right.
00:15:49.860 I'm happy now.
00:15:50.560 I've said this a lot,
00:15:54.640 but I think that our changing preferences
00:15:57.020 have made us all terrible to each other.
00:15:59.980 In the early days, nobody had much of an interest
00:16:04.380 outside of what they were doing, you know, locally.
00:16:07.900 So, you know, at the end of the day,
00:16:09.620 how are your crops?
00:16:11.640 Oh, my crops are good.
00:16:12.740 How are your crops?
00:16:13.680 I mean, we basically had common references
00:16:15.420 and, you know, people enjoyed each other
00:16:18.580 because that's what they had.
00:16:20.160 And now I think people are so picky about what they like
00:16:26.260 and don't like, you put any two together
00:16:28.760 and there's so much they disagree on
00:16:31.040 because there's so much they think about
00:16:33.400 and there's so many different ways to look at everything
00:16:36.120 that we end up being like two porcupines who can't hug.
00:16:40.540 You know, it's like, although I guess porcupines do reproduce.
00:16:44.540 You ever wonder how porcupines reproduce?
00:16:49.680 Let me turn it into a joke.
00:16:51.560 It's already a joke.
00:16:52.480 I'm stealing this joke.
00:16:54.280 How do...
00:16:54.760 Yeah, all right, you've already heard the joke.
00:16:57.000 How do porcupines reproduce?
00:16:59.200 Very carefully.
00:17:00.460 Very carefully.
00:17:02.260 All right.
00:17:04.440 Just putting that out there.
00:17:06.000 Were you aware that the biggest cause of depression
00:17:08.920 and suicide among the youth is lack of sleep?
00:17:14.540 Did you know that?
00:17:16.360 I heard this as a data,
00:17:18.640 that if you look at lack of sleep,
00:17:21.320 it tracks directly to depression and suicide.
00:17:24.720 Not too surprising, right?
00:17:27.680 So lack of sleep is the number one cause
00:17:30.760 of suicide and depression.
00:17:34.160 What causes lack of sleep in 2023?
00:17:37.280 Only the phone and, you know, screens.
00:17:40.440 Just screens.
00:17:41.080 So, I'm going to go all the way here.
00:17:47.240 I think social media should be illegal for minors.
00:17:51.600 American social media and TikTok and everything else.
00:17:56.200 Now, TikTok should be banned in America.
00:17:58.600 But the other ones, you know, they're American companies
00:18:00.740 and I think they should stay for adults.
00:18:03.120 But I think that minors should not have access
00:18:04.980 to social media after 9 p.m.
00:18:08.840 What do you say?
00:18:10.020 After 9 p.m., no access to social media.
00:18:13.660 But my first choice is to ban it completely for young people.
00:18:17.240 I don't think young people should have access
00:18:18.720 to any social media.
00:18:21.200 It's not like we don't know that it's dangerous.
00:18:25.060 Let me ask you this.
00:18:26.140 Let's say Big Pharma came up with an idea.
00:18:31.860 And the idea was they were going to have you look at screens every night
00:18:36.420 and you could do it as long as you want.
00:18:38.400 You could stay up all night if you want.
00:18:40.360 And then they do a study.
00:18:41.440 This is Big Pharma, right?
00:18:42.580 Just a hypothetical.
00:18:43.920 And they do the randomized control trial.
00:18:46.280 And they find out that the people who use their screens at night
00:18:48.980 lose sleep and end up being depressed and suicidal.
00:18:52.300 The people who do not use the screens have just normal lives
00:18:57.160 and they're better.
00:18:59.140 So would the government approve that product from Big Pharma?
00:19:04.900 If social media had been tested in a randomized control trial
00:19:08.760 before anybody had access to it,
00:19:11.220 and of course you can't do it,
00:19:13.080 would it be illegal?
00:19:15.000 I don't think so.
00:19:17.080 No.
00:19:17.540 I think the health impact, especially on children,
00:19:20.720 I think the health impact of social media
00:19:23.180 would have made it banned if it had been a drug.
00:19:27.360 And the reason that we don't call social media a drug
00:19:30.080 is because we're stupid bucket people.
00:19:34.940 Humans are stupid bucket people.
00:19:37.280 Everything has to be in a bucket.
00:19:39.120 What are you, conservative?
00:19:40.520 Are you conservative or a liberal?
00:19:42.820 Let me get my bucket.
00:19:44.480 Oh, you're in between?
00:19:45.700 No, you can't be in between.
00:19:47.720 Nobody's in between.
00:19:49.560 Get in your bucket.
00:19:51.520 Right?
00:19:51.960 So as soon as you say that drugs are a pill or an injection,
00:19:58.400 then you say, okay, that's drugs.
00:20:01.180 We'll treat drugs this way.
00:20:02.980 We've got all these rules for what a drug is.
00:20:05.580 And then somebody invents social media,
00:20:07.460 which is clearly a drug.
00:20:10.400 Right?
00:20:10.820 I won't even listen to an argument on this point.
00:20:13.960 Social media is a drug
00:20:15.280 that's just administered in a different way.
00:20:18.020 Instead of a needle or a pill that just goes in through your eyes as images.
00:20:24.280 But it's absolutely a drug.
00:20:26.360 And the fact that we don't treat it the way we treat the things that we test and do randomized control trials
00:20:33.480 has nothing to do with whether we should or whether it makes sense.
00:20:38.320 It has nothing to do with the logic, the reasoning, the priorities.
00:20:42.160 Nothing to do with that.
00:20:43.360 It has only to do with the fact that we reflexively put things in buckets
00:20:47.680 and our medical bucket just doesn't have screens in it.
00:20:52.740 And that's it.
00:20:53.900 We put screens in the entertainment bucket
00:20:55.840 when it should have been in the medical bucket.
00:21:00.800 There's not one person arguing the point.
00:21:03.480 There's not a single person here who will argue that point.
00:21:06.460 It's a digital narcotic.
00:21:09.180 Right?
00:21:09.420 You tell me.
00:21:12.180 If we had considered social media a drug,
00:21:15.780 it would not pass government approval.
00:21:19.480 It wouldn't even come close.
00:21:21.440 Not even close.
00:21:26.060 I've heard more people...
00:21:27.420 I heard Bill Maher say the other day
00:21:29.660 that he can't watch a movie anymore
00:21:31.860 because they're too long and boring
00:21:33.940 compared to the quick hit of social media.
00:21:35.800 I don't know if he was talking about other people
00:21:38.500 or about himself, but he made the point.
00:21:41.040 Now, who was the first person you heard say that?
00:21:44.560 I think I'm the first person you've ever heard say that,
00:21:47.600 that movies are dead.
00:21:50.060 You'll never be able to watch a movie again
00:21:51.880 unless you're with somebody
00:21:53.520 and it's more about being with the person.
00:21:56.180 Yeah.
00:21:57.360 Yeah.
00:21:58.280 Yeah.
00:21:59.960 So I was ahead of the curve on that, I guess.
00:22:02.360 So U.S. News,
00:22:07.620 the periodical, the publication, U.S. News,
00:22:11.440 every year they do a ranking of colleges and schools.
00:22:15.360 I guess they do graduate schools
00:22:16.860 and medical schools and stuff like that.
00:22:19.560 And they're having trouble ranking medical schools
00:22:22.600 because some of the medical schools
00:22:25.400 are resisting being ranked on their usual standards.
00:22:30.740 And they say that they won't even contribute information
00:22:34.780 to the rankings
00:22:35.580 unless their diversity and equity successes
00:22:40.000 are included in their ranking.
00:22:43.160 What do you think of that?
00:22:45.500 Would you like colleges to be ranked in part
00:22:49.660 the quality of the school,
00:22:52.080 the quality of the school,
00:22:53.220 not just the fairness,
00:22:54.760 but the quality of the education?
00:22:56.860 That's what's being ranked.
00:22:57.700 Not fairness,
00:22:59.620 but the quality of the education.
00:23:01.660 And they're saying that the quality of the education
00:23:03.780 is better
00:23:04.580 when you've got more diversity
00:23:06.660 and equity and inclusion.
00:23:09.680 What do you think of that?
00:23:11.640 Do you think you can get a better education
00:23:13.180 if there's more diversity?
00:23:20.620 You're all racist.
00:23:22.560 You totally get a better education
00:23:24.360 if there's more diversity.
00:23:25.400 You're all frickin' racists.
00:23:28.320 And I mean that literally.
00:23:31.020 I'm sorry.
00:23:32.020 I just have to say that to your screened faces.
00:23:35.340 No.
00:23:36.160 If the only variable that's different,
00:23:38.440 if the only variable that's different is diversity,
00:23:41.760 it's all positive.
00:23:46.320 Do you disagree?
00:23:47.440 Because the diversity does give you more access
00:23:51.800 to more points of view.
00:23:54.140 So if everybody's equally qualified,
00:23:56.820 that's a benefit.
00:23:59.900 Really, you don't think that's a benefit?
00:24:02.480 If everybody's the same qualifications,
00:24:04.500 you don't think diversity gives you more points of view,
00:24:07.120 more access,
00:24:08.100 more windows to look through,
00:24:10.100 you know,
00:24:10.320 more understanding,
00:24:12.820 better bedside manner, perhaps,
00:24:15.240 because you understand people better.
00:24:17.860 No, it's unambiguously good.
00:24:20.940 All right.
00:24:21.480 Well, you could argue that point.
00:24:23.120 But here's my point.
00:24:24.940 If the way they get there
00:24:26.340 is sacrificing merit,
00:24:29.020 then does anybody think that's good?
00:24:32.500 If you were to deprioritize merit
00:24:36.740 to get more equity and more diversity,
00:24:40.300 is that still good?
00:24:43.200 Well, you know,
00:24:44.340 there's a crossover point, right?
00:24:46.660 So you're getting something from diversity,
00:24:48.980 and I'll argue that to the death,
00:24:52.020 simply because I've been in those environments.
00:24:55.660 Whenever I'm in an environment
00:24:56.760 where there's more diversity,
00:24:58.220 I'm absolutely picking up more understanding.
00:25:01.280 There's just no way to argue that.
00:25:03.740 You can't argue that.
00:25:06.240 But if you give up merit,
00:25:08.800 here's what I think we need to do.
00:25:11.700 I think I would like to know
00:25:13.040 if my doctor got one of those equity degrees
00:25:18.080 or one of those merit degrees.
00:25:20.620 Don't you think we should make a difference?
00:25:22.900 Wouldn't you like to know?
00:25:24.260 Oh, doctor, I see you have a degree.
00:25:26.340 Did you get one of the equity types
00:25:27.820 or one of the merit types?
00:25:30.860 Now, this will never happen, of course.
00:25:33.500 But there are so many things
00:25:34.740 that could be solved by truth and labeling, right?
00:25:38.460 If the only thing I'm asking for
00:25:40.040 is truth and labeling,
00:25:42.160 why would that be a problem?
00:25:43.560 Let the free market decide.
00:25:46.340 If they think that somebody
00:25:47.640 who went to a sort of equity-priority school
00:25:50.920 got more out of the equity and the diversity,
00:25:54.640 because there is a lot to get there,
00:25:56.580 then they gave up on maybe a little bit,
00:25:59.200 you know, in the rankings or the academics.
00:26:01.880 Yeah, maybe it's not that big a deal.
00:26:03.700 You know, maybe it's a small tweak
00:26:05.340 to the merit to get a lot more diversity.
00:26:10.220 You can make that argument.
00:26:13.420 But I would certainly like, as a consumer,
00:26:15.520 to know the difference.
00:26:17.840 I want to know,
00:26:19.220 was equity and diversity a big deal
00:26:21.780 at the school you went to?
00:26:23.800 And did that have anything to do
00:26:25.340 with the fact that you made it through?
00:26:27.420 Or was it merit only?
00:26:30.880 So today I have a doctor who is a woman,
00:26:35.520 and she's brown of some type.
00:26:39.500 I've never asked.
00:26:41.200 All right, so she's a person of color,
00:26:43.120 some kind of brown, whatever,
00:26:45.480 and she's a woman.
00:26:48.500 I have never once thought that she was there
00:26:52.460 for any reason other than merit.
00:26:55.220 Never even occurred to me.
00:26:57.260 Never once in my mind did I say,
00:27:00.120 ooh, I wonder if she only got through medical school
00:27:03.020 because of, you know, her...
00:27:05.620 Never once.
00:27:06.860 Never thought.
00:27:07.360 And she's a great doctor.
00:27:08.340 I like her a lot.
00:27:10.660 But what happens in five years
00:27:12.600 if the doctor's coming out
00:27:14.180 or coming out of, you know,
00:27:15.620 equity, priority college?
00:27:18.080 I'd kind of want to know.
00:27:20.080 I mean, I might not make a different decision.
00:27:22.500 I might make the same decision.
00:27:24.460 I'd kind of like to know.
00:27:25.660 A little truth in labeling would be nice.
00:27:27.200 Speaking of Dr. Jordan Peterson,
00:27:33.760 he is creating an alternative
00:27:35.840 World Economic Forum.
00:27:39.300 And I guess it would be
00:27:40.640 pro-energy that's cheap
00:27:44.640 so that poor people can have access to the energy
00:27:47.080 and pro-families and monogamy.
00:27:51.380 Now, here's my big problem
00:27:53.060 with Dr. Peterson's philosophies of life.
00:27:58.640 He's big on family and monogamy.
00:28:04.340 And I absolutely agree with him
00:28:08.800 that that is the best system
00:28:11.160 for a stable world.
00:28:14.160 But only maybe for 25% of people,
00:28:16.800 and that's the problem.
00:28:18.140 In the old days,
00:28:19.020 maybe it was better for a higher percentage.
00:28:21.280 I don't think,
00:28:22.880 I really don't think
00:28:23.720 more than 25% of the public
00:28:25.380 is going to make a marriage work.
00:28:27.940 For the rest,
00:28:28.540 it's probably a bad idea.
00:28:30.020 And I don't know what to do about that.
00:28:32.040 It could be that his approach is the best,
00:28:34.460 which is if you don't push as hard as possible
00:28:36.920 on the thing you know works,
00:28:39.460 you're in trouble.
00:28:41.180 Yeah, I get that point.
00:28:42.360 That makes sense.
00:28:43.080 And I do agree
00:28:43.820 that if you get the right two people
00:28:46.140 and you put them in a monogamous marriage,
00:28:48.460 it is the best.
00:28:49.080 I think that that's just sort of obviously true.
00:28:53.560 You know, you can see it in a lot of ways.
00:28:56.940 But let me ask you this question.
00:28:59.560 I'm going to put on my skeptic hat for a moment.
00:29:03.280 Do we all agree that single moms
00:29:06.100 produce children who have the most criminal records?
00:29:11.520 Would you agree?
00:29:13.240 That correlation seems to be really clean.
00:29:16.380 You don't agree?
00:29:19.920 I'm not saying it's causation.
00:29:21.580 I'm saying the correlation.
00:29:23.880 Would it be easier if I just say correlation?
00:29:27.380 Because the causation is what I'm going to question next.
00:29:30.440 All right, so here's my question.
00:29:33.480 Is it also true of college-educated single moms?
00:29:37.740 In other words, if you looked at only the single moms
00:29:41.120 who went to Ivy League colleges
00:29:42.920 and have high incomes,
00:29:45.720 they have high incomes,
00:29:48.200 are their children also more likely to be criminals
00:29:50.340 compared to the average?
00:29:57.940 You think yes?
00:29:58.640 I would make a really, really large bet
00:30:02.760 that they have a low criminal record.
00:30:06.520 I would make a really, really big bet on that.
00:30:09.640 You really think it's going to be the same?
00:30:11.920 So to me, that looks like brainwashing.
00:30:16.380 But I'll allow that you could be right,
00:30:18.360 since I don't have the data.
00:30:19.680 I will allow that you could be right.
00:30:21.820 On the surface, it looks like brainwashing.
00:30:23.740 Because if you think that in all situations
00:30:26.500 simply having one parent
00:30:29.100 is worse than all situations with two,
00:30:32.680 that's not really rational thinking.
00:30:35.940 But if you believe there's a strong correlation
00:30:38.060 when you look at the whole thing,
00:30:39.860 probably yes.
00:30:41.180 I think it's been studied a lot.
00:30:43.180 But here's the thing.
00:30:45.000 There is a lot more going on
00:30:46.900 with those single mothers
00:30:47.880 than just the fact that they're single mothers.
00:30:51.380 Whatever it is that made a man
00:30:53.220 not want to be with them.
00:30:55.680 Do I need to finish the sentence?
00:30:59.120 Right.
00:30:59.800 Now, when I talk about the Ivy League woman
00:31:03.380 who chooses to be a single mother,
00:31:06.340 that's a choice.
00:31:09.540 That's somebody who says,
00:31:10.640 oh, I can handle this.
00:31:11.940 I think I'll choose to be a single mother.
00:31:14.780 Do you think that's the same
00:31:15.980 as somebody who couldn't get a man
00:31:17.800 to live with her even if she tried?
00:31:19.680 There's no way those are the same.
00:31:24.120 There's no way.
00:31:26.260 And what about the man who left?
00:31:29.580 If that woman had stayed with the man who left,
00:31:33.480 they would be happy?
00:31:35.060 And the child would do well?
00:31:37.280 Because there's got to be something
00:31:38.500 about the man who leaves
00:31:39.860 and doesn't take responsibility
00:31:41.880 that suggests that if he had stayed,
00:31:44.460 it would have been pretty bad.
00:31:45.480 Because he doesn't take responsibility
00:31:48.640 and obviously was not in love with the woman.
00:31:53.560 So I think our thinking about single parenting
00:31:56.980 is completely mixed up
00:32:01.260 with too many other variables.
00:32:03.540 And if the only thing you're seeing
00:32:04.920 is the single part,
00:32:06.660 you're missing the biggest part of the picture.
00:32:09.340 I think.
00:32:10.060 So I think we're completely misled
00:32:13.140 about what's going on there.
00:32:14.940 I think there's a certain type of person
00:32:16.700 who's more likely to be a single mom
00:32:19.560 and that that person
00:32:21.160 who's more likely to be a single mom
00:32:22.980 is unlikely to have an Ivy League degree.
00:32:28.200 Right?
00:32:28.740 They are likely
00:32:29.700 to have a bunch of other characteristics
00:32:33.460 in common with other single women.
00:32:36.120 And I think it's those other characteristics
00:32:39.380 that are probably driving things.
00:32:41.720 Now, that said,
00:32:43.520 the best way to raise a kid,
00:32:45.620 I'm sure,
00:32:47.180 is a mother and a father.
00:32:49.740 I'm not sure the gender thing
00:32:51.060 is so important.
00:32:52.620 But, you know,
00:32:53.400 a two-parent situation
00:32:54.440 because two is better than one.
00:32:56.880 Having two people take care of you
00:32:58.500 that are adults
00:32:59.180 and love you,
00:33:00.840 well, it's got to be better than one
00:33:02.140 if they're both functional.
00:33:03.840 But I think we need to look
00:33:07.300 at that a little bit deeper.
00:33:08.480 It's too simplistic,
00:33:09.920 you know,
00:33:10.480 married versus unmarried.
00:33:15.560 Rasmussen has a poll.
00:33:17.700 It says 48% of the,
00:33:19.460 these are usually likely voters they poll,
00:33:21.860 48% say Biden's handling
00:33:23.780 of classified documents
00:33:24.860 is a, quote,
00:33:25.840 major scandal.
00:33:29.760 What,
00:33:30.940 I'll just give you a little quiz.
00:33:32.960 Now, for those who are in YouTube,
00:33:34.740 if you're new to me,
00:33:36.200 I've developed
00:33:37.180 the smartest
00:33:38.380 livestream audience
00:33:40.740 in the world.
00:33:42.720 And I'm going to prove it again.
00:33:44.600 Not only do they know
00:33:45.860 based on what's happening,
00:33:47.720 and they're smart about that,
00:33:49.400 they can actually
00:33:50.340 see into the future
00:33:52.440 and see data that
00:33:53.580 hasn't even been presented to them.
00:33:56.980 So I'm going to ask this question
00:33:58.540 and watch how cleverly
00:33:59.940 they get the right answer.
00:34:00.780 They've never seen
00:34:02.300 this information before.
00:34:03.560 They've never seen this before.
00:34:05.060 Watch this.
00:34:07.480 What percentage do you think
00:34:08.800 after the Rasmussen polled them,
00:34:10.980 what percentage of American voters,
00:34:13.140 likely voters,
00:34:14.060 do you think
00:34:14.560 say that Biden's
00:34:16.360 classified document situation
00:34:18.740 is no scandal at all?
00:34:20.760 No scandal at all.
00:34:22.960 You did it again.
00:34:25.120 Amazing.
00:34:26.640 It's 24.
00:34:28.540 It's 24.
00:34:29.420 But, you know,
00:34:30.040 those of you who guessed 25,
00:34:31.880 that was really good.
00:34:34.400 You know,
00:34:34.760 you just keep impressing me.
00:34:36.460 Keep impressing me.
00:34:37.420 I don't know how you do it.
00:34:40.240 But 60% of voters
00:34:42.600 believe it's likely
00:34:43.700 the information
00:34:44.300 from the classified documents
00:34:45.640 was used by Hunter Biden
00:34:47.640 in his foreign business deals.
00:34:49.200 How in the world
00:34:53.560 can Joe Biden
00:34:55.220 win the presidency?
00:34:57.900 How in the world?
00:34:59.360 60%
00:35:00.140 believe that he
00:35:01.740 and his son,
00:35:02.600 and I'm going to add that,
00:35:04.060 you know,
00:35:04.800 they must think
00:35:05.460 they're working as a team.
00:35:07.300 60% of voters
00:35:08.700 think that they're
00:35:09.260 selling classified information
00:35:10.720 or it's likely
00:35:11.920 that they're selling
00:35:13.120 classified information
00:35:14.100 to foreign
00:35:14.660 entities.
00:35:16.200 Isn't it over?
00:35:23.260 How does Trump
00:35:24.200 not become president?
00:35:26.520 You know,
00:35:26.980 you're seeing a lot
00:35:27.840 of Trump criticism.
00:35:29.340 A lot of it comes from me,
00:35:30.660 and I think it's deserved,
00:35:31.740 actually.
00:35:32.960 But
00:35:33.320 he still has to run
00:35:35.420 against someone,
00:35:36.620 and I think
00:35:38.960 it's going to be Biden.
00:35:40.980 It looks like
00:35:41.780 it's going to be Biden.
00:35:42.560 How does Trump lose?
00:35:47.940 Like how?
00:35:48.920 The only way this goes
00:35:50.700 is a rigged election.
00:35:53.780 Am I wrong?
00:35:55.500 I don't see anything else
00:35:57.180 that can happen
00:35:57.760 except a massively
00:35:58.740 rigged election.
00:35:59.940 Because there's no way
00:36:00.720 that the powers that be
00:36:02.180 are going to let
00:36:02.680 Trump win again.
00:36:04.440 I mean,
00:36:04.780 without at least
00:36:05.380 trying really hard
00:36:06.280 to influence it.
00:36:09.460 Now,
00:36:10.000 no matter what you say
00:36:10.840 about past elections,
00:36:12.100 I'll just let
00:36:13.180 past elections
00:36:14.000 go by as a
00:36:14.980 separate topic.
00:36:17.440 No matter what's
00:36:18.420 happened in the past,
00:36:19.540 they're going to have
00:36:20.200 to rig it this time.
00:36:21.760 They have to.
00:36:23.360 And you know what?
00:36:24.740 If the Republicans
00:36:25.540 lose a rigged election,
00:36:26.840 you know what I say?
00:36:28.980 Totally deserve it.
00:36:31.300 And I will welcome
00:36:32.680 our new Democrat president,
00:36:34.960 and I'll congratulate
00:36:35.780 him on the win.
00:36:37.060 Because if the Republicans
00:36:38.580 can't figure out
00:36:39.440 how to cheat better
00:36:40.320 or stop the cheating,
00:36:41.300 they're not very capable.
00:36:44.000 They're not very capable.
00:36:45.900 And they don't
00:36:46.580 deserve to win.
00:36:47.860 And I see no action
00:36:49.100 from the Republicans
00:36:49.920 to increase the transparency
00:36:52.340 of the elections
00:36:53.180 so that you know
00:36:54.920 if it was cheating.
00:36:56.700 At this point,
00:36:57.800 the Republicans
00:36:58.740 deserve to lose.
00:37:01.320 They're doing everything
00:37:02.360 in their power to lose.
00:37:03.740 The Republicans,
00:37:04.580 you know,
00:37:05.020 basically it doesn't even
00:37:06.540 look like they're trying,
00:37:07.840 honestly.
00:37:08.260 So there's a video
00:37:14.640 going around
00:37:15.260 that there was a presentation
00:37:16.900 at the World Economic Forum,
00:37:18.540 and everything sounds scarier
00:37:19.800 when it's at the World Economic Forum.
00:37:23.280 And this was extra scary.
00:37:25.160 So there is now technology
00:37:26.520 where you could put on
00:37:27.520 some kind of a wearable device
00:37:28.960 or a earbud
00:37:30.240 that would monitor
00:37:31.580 your brainwaves,
00:37:32.680 and the thinking
00:37:34.560 is that employers
00:37:35.500 will be able to know
00:37:37.240 if you're thinking
00:37:38.120 about work,
00:37:39.820 if you're actually working
00:37:41.540 and thinking about work,
00:37:43.600 or if your mind
00:37:44.640 has wandered
00:37:45.240 to your personal thoughts.
00:37:47.280 And they'll be able
00:37:47.900 to, like,
00:37:48.580 play back your brainwaves
00:37:49.860 after the fact
00:37:50.720 and say,
00:37:51.560 all right,
00:37:52.060 well,
00:37:52.260 at about 3 o'clock,
00:37:54.180 at 3 o'clock,
00:37:55.100 you were mostly
00:37:55.820 just daydreaming
00:37:56.740 about sex stuff,
00:37:58.600 I guess.
00:37:58.940 Now add to that
00:38:03.740 that your employer
00:38:04.580 can check
00:38:05.200 your actual actions online,
00:38:07.600 they can check
00:38:08.360 your keystrokes,
00:38:09.780 and they might even know
00:38:10.640 where you are physically.
00:38:12.180 They might track
00:38:12.840 your location.
00:38:14.200 So your employer
00:38:14.920 will know
00:38:15.360 what you're thinking
00:38:16.160 and what you're doing
00:38:17.060 every minute of the day.
00:38:19.480 Roughly speaking,
00:38:20.320 they'll know.
00:38:22.760 So the thinking is
00:38:24.120 that we'll become
00:38:24.700 this mass dystopia
00:38:27.180 with people
00:38:29.020 just hooked
00:38:29.780 into the matrix
00:38:30.440 having to work
00:38:31.140 all the time
00:38:31.780 and all their happiness
00:38:32.580 and quality of life
00:38:34.460 will be destroyed
00:38:35.280 and it's all
00:38:35.900 the World Economic
00:38:36.680 Forum's problem
00:38:37.640 because they let
00:38:38.840 somebody give
00:38:39.320 a presentation
00:38:40.000 in which the person
00:38:40.860 who presented it
00:38:41.520 said,
00:38:42.180 don't assume
00:38:43.320 it'll be all bad.
00:38:47.560 It's pretty hard
00:38:48.420 to hear this
00:38:49.100 and then imagine
00:38:50.600 it could be good
00:38:51.420 in any way.
00:38:54.000 But,
00:38:54.620 let me give you
00:38:55.780 some context.
00:38:57.180 When I got out
00:38:58.260 of college,
00:38:59.020 I was looking
00:38:59.720 for a job
00:39:00.200 at a big bank.
00:39:01.860 I went to work
00:39:02.400 at what was
00:39:03.400 Crocker National Bank
00:39:04.600 at the time
00:39:05.140 before Wells Fargo
00:39:06.900 bought them.
00:39:08.440 And it was when
00:39:09.740 ATMs were just
00:39:11.240 being rolled out.
00:39:13.200 So ATMs were
00:39:14.120 sort of a new thing.
00:39:15.260 Not every bank
00:39:15.900 had one yet,
00:39:16.680 but Crocker
00:39:17.480 was ahead of the curve.
00:39:19.220 Do you know
00:39:19.480 what the biggest
00:39:20.220 debate was
00:39:22.820 about ATMs?
00:39:24.300 Do you know
00:39:24.540 what the biggest
00:39:25.020 controversy was?
00:40:27.180 This technology
00:39:29.820 will be out of control.
00:39:31.560 It's going to steal
00:39:32.480 my money.
00:39:33.640 And then when I complain,
00:39:34.720 nobody will listen
00:39:35.460 to me because
00:39:36.460 it's me against
00:39:37.120 the computer.
00:39:38.760 Right?
00:39:39.020 And all the old
00:39:39.580 people said,
00:39:40.340 you're going to
00:39:41.180 lose your humanity
00:39:42.100 because you don't
00:39:42.760 get to talk
00:39:43.260 to the teller.
00:39:44.640 You won't be able
00:39:45.540 to complain,
00:39:46.520 blah, blah.
00:39:46.960 Now,
00:39:47.520 did any of those
00:39:48.500 things happen?
00:39:50.100 A little bit,
00:39:51.520 but not really.
00:39:52.780 Not really.
00:39:54.000 Yeah,
00:39:54.180 I mean,
00:39:54.360 they happened,
00:39:55.260 but not to the point
00:39:56.260 where it made
00:39:56.700 any difference
00:39:57.220 to the growth
00:39:57.760 of ATMs.
00:39:59.160 Right?
00:39:59.780 By the way,
00:40:00.420 the rule in banks
00:40:01.860 at the time
00:40:03.480 was if you complained
00:40:04.460 about the ATM
00:40:05.160 stealing your money,
00:40:06.960 if they didn't see it
00:40:08.520 on video,
00:40:10.320 like if they didn't
00:40:11.000 have enough information,
00:40:12.780 they gave you
00:40:13.340 your money.
00:40:13.780 If they couldn't
00:40:15.340 prove it one way
00:40:16.120 or the other,
00:40:16.860 they actually just
00:40:17.660 gave you your money.
00:40:18.960 You know,
00:40:19.220 because usually
00:40:19.740 it was like 100 bucks,
00:40:20.820 right?
00:40:21.460 Somebody would say,
00:40:22.100 hey,
00:40:22.280 the ATM took my $100.
00:40:25.200 And if they couldn't tell,
00:40:27.580 they just gave you
00:40:28.200 the $100.
00:40:29.280 Because there was
00:40:29.640 nothing else they could do.
00:40:30.580 Otherwise,
00:40:31.160 they'd have to,
00:40:32.040 they'd have to just
00:40:33.000 get rid of ATMs.
00:40:34.140 Now,
00:40:34.380 eventually,
00:40:35.500 the ATM,
00:40:36.560 you know,
00:40:37.000 keeps track of what it gives,
00:40:38.320 but it also has a video.
00:40:40.240 And the video would,
00:40:41.840 you know,
00:40:42.060 pretty much prove
00:40:42.920 if it stole something
00:40:44.280 from you.
00:40:45.640 So,
00:40:46.120 whenever you see
00:40:46.720 a new technology
00:40:47.560 that looks like
00:40:48.220 it's going to end
00:40:48.980 all our quality of life,
00:40:51.000 we've been through
00:40:52.300 this cycle a lot.
00:40:54.920 Now,
00:40:55.460 I will agree
00:40:56.080 that this looks different.
00:40:58.760 AI doesn't look
00:40:59.620 like anything else.
00:41:01.220 So,
00:41:01.520 anything that
00:41:02.120 was in the past
00:41:03.400 that looked like
00:41:04.480 a pattern
00:41:04.920 probably will be
00:41:06.220 violated by AI.
00:41:08.020 But it's good to know
00:41:09.280 that we've been here before.
00:41:12.440 We're always worrying
00:41:13.580 about the next technology
00:41:14.860 will be the one
00:41:15.500 that ends us all.
00:41:16.360 And I have now
00:41:18.560 an out-of-pocket
00:41:21.160 prediction about AI.
00:41:24.440 You know how AI
00:41:25.760 was going to take
00:41:26.380 all of our
00:41:26.940 physical jobs first
00:41:28.920 because you could
00:41:29.660 build robots.
00:41:31.180 So,
00:41:31.420 the robots will take
00:41:32.260 the dangerous
00:41:33.240 physical jobs.
00:41:34.700 And then,
00:41:35.600 later,
00:41:36.220 the creative jobs,
00:41:37.720 you know,
00:41:38.440 those might
00:41:39.060 stay intact
00:41:40.780 because it would
00:41:41.540 take a long time
00:41:42.240 for AI to get there.
00:41:43.460 It's going to be
00:41:45.100 the opposite.
00:41:47.820 I think AI
00:41:48.820 will take
00:41:49.260 all the smart jobs,
00:41:51.960 coding,
00:41:54.040 psychology,
00:41:55.500 you know,
00:41:55.760 your therapist
00:41:56.320 will be AI.
00:42:00.200 Art,
00:42:01.500 you know,
00:42:01.700 all art,
00:42:02.360 humor,
00:42:03.700 entertainment
00:42:04.080 is all going
00:42:05.560 to be AI.
00:42:06.980 Do you know
00:42:07.320 what people will do?
00:42:09.940 Manual labor.
00:42:10.860 I think humans
00:42:13.080 will be moved
00:42:14.560 entirely to manual labor.
00:42:17.180 Do you know why?
00:42:18.700 Because it will be
00:42:19.100 cheaper than robots
00:42:19.980 for a long time.
00:42:21.860 That's all.
00:42:23.280 They'll just be
00:42:23.840 cheaper than robots.
00:42:25.140 I think the economics
00:42:26.320 will be that people
00:42:27.160 will be cheap
00:42:27.820 and robots
00:42:28.360 will be expensive.
00:42:29.540 So,
00:42:30.060 you use your
00:42:30.480 expensive robots
00:42:31.460 to be lawyers
00:42:32.260 and doctors.
00:42:33.860 And by the way,
00:42:34.860 AI has already
00:42:35.620 passed the bar
00:42:36.400 and AI has passed
00:42:38.280 what is the medical
00:42:40.680 licensing?
00:42:42.260 It's already
00:42:42.940 done that.
00:42:45.400 It's already
00:42:46.220 more educated
00:42:46.900 than you are
00:42:47.700 because it's got
00:42:49.140 two degrees,
00:42:50.240 you know,
00:42:50.380 even if you have one.
00:42:52.240 So,
00:42:52.740 yeah,
00:42:53.140 humans are going
00:42:53.760 to have to do
00:42:54.180 stuff like
00:42:54.680 crawl under
00:42:55.720 floors
00:42:58.740 to get to plumbing.
00:43:01.060 Humans are going
00:43:01.680 to be figuring out,
00:43:02.780 oh, wait,
00:43:03.840 you know,
00:43:04.080 just because the wires
00:43:04.860 don't touch,
00:43:05.600 there's also maybe
00:43:06.260 rats or there's
00:43:07.120 a human element.
00:43:08.520 Yeah,
00:43:09.000 people will just
00:43:09.680 be doing manual
00:43:10.480 labor.
00:43:12.320 Now,
00:43:12.780 here's the
00:43:13.080 interesting part.
00:43:15.200 Do you know
00:43:15.760 what makes
00:43:16.100 humans happy?
00:43:20.400 Doing manual
00:43:21.140 labor.
00:43:22.460 It turns out
00:43:23.280 it makes us
00:43:23.700 very happy.
00:43:25.200 Yeah,
00:43:25.680 as long as we're
00:43:26.380 not working
00:43:26.840 ourselves to death,
00:43:27.940 right?
00:43:28.200 You don't want
00:43:28.580 to be in a slave
00:43:29.220 camp,
00:43:29.640 you don't want
00:43:30.000 to be in,
00:43:30.980 you know,
00:43:31.640 the gulag.
00:43:32.720 But if you're
00:43:33.460 just keeping busy,
00:43:35.520 you know,
00:43:35.780 you're just moving,
00:43:36.700 you're doing
00:43:37.060 a thing,
00:43:37.660 you complete
00:43:38.140 a task,
00:43:39.100 you get a
00:43:39.460 dopamine for
00:43:40.100 completing a
00:43:40.700 task,
00:43:41.460 you're moving,
00:43:42.220 you're completing
00:43:42.700 tasks,
00:43:43.720 it's actually
00:43:44.260 really good for
00:43:44.960 us.
00:43:45.880 So we probably
00:43:46.780 will become
00:43:47.540 the manual
00:43:50.000 laborers and
00:43:50.800 the AI will do
00:43:51.700 all the thinking
00:43:52.200 for us.
00:43:53.640 That's my
00:43:54.280 prediction.
00:43:56.560 So,
00:43:57.180 you know,
00:43:58.560 what's really
00:43:59.060 puzzling,
00:44:01.040 even at this
00:44:01.660 late stage,
00:44:02.660 is we still
00:44:03.600 don't know why
00:44:04.200 Sweden did so
00:44:05.040 well in the
00:44:05.820 pandemic.
00:44:06.240 It's still
00:44:08.020 a mystery.
00:44:09.540 Now,
00:44:09.740 there are a
00:44:09.920 few things we
00:44:10.400 do know about
00:44:10.960 Sweden.
00:44:11.460 I found out
00:44:12.260 and confirmed
00:44:13.040 this this
00:44:13.540 morning,
00:44:14.180 that I saw
00:44:15.280 a list of
00:44:15.780 the fattest
00:44:16.560 countries and
00:44:17.360 the thinnest
00:44:17.800 countries.
00:44:18.620 America is
00:44:19.160 one of the
00:44:19.560 fattest
00:44:19.980 countries in
00:44:20.600 the world,
00:44:21.580 top three,
00:44:22.200 I think,
00:44:22.840 and Sweden
00:44:23.520 is one of
00:44:23.960 the thinnest,
00:44:25.380 one of the
00:44:25.980 thinnest.
00:44:27.960 So I can't
00:44:28.940 understand the
00:44:29.560 whole pandemic
00:44:31.220 situation.
00:44:31.820 Oh,
00:44:32.360 then also,
00:44:33.060 I found out
00:44:34.720 today that
00:44:35.200 Sweden is one
00:44:35.960 of the
00:44:36.180 youngest
00:44:36.480 countries.
00:44:38.140 It's one
00:44:38.360 of the
00:44:38.500 youngest.
00:44:39.460 America is
00:44:39.860 one of the
00:44:40.160 oldest.
00:44:41.240 I didn't
00:44:41.560 realize that.
00:44:42.840 So America
00:44:43.220 is old and
00:44:44.200 fat, and
00:44:44.940 it's one of
00:44:45.300 the oldest
00:44:45.820 and one of
00:44:46.320 the fattest
00:44:46.780 countries.
00:44:48.520 And Sweden
00:44:49.540 is unusually
00:44:51.240 young and
00:44:51.800 unusually thin.
00:44:54.280 But you
00:44:54.820 know,
00:44:54.960 the thing is,
00:44:55.320 I can't
00:44:55.700 figure out
00:44:56.200 how they
00:44:57.380 did so well
00:44:58.180 during the
00:44:58.800 pandemic.
00:44:59.300 That's still
00:44:59.600 sort of a
00:45:00.060 mystery.
00:45:00.640 Oh,
00:45:00.760 another thing
00:45:01.180 I found out
00:45:01.600 about Sweden
00:45:02.080 is that because
00:45:03.200 of where
00:45:03.760 they're located
00:45:04.440 on the
00:45:04.900 planet,
00:45:05.960 they routinely
00:45:07.520 supplement with
00:45:08.540 vitamin D.
00:45:10.020 I think they
00:45:10.680 take like fish
00:45:11.320 oil or something
00:45:11.980 that's nasty.
00:45:13.800 But they
00:45:15.900 have good
00:45:16.540 vitamin D.
00:45:18.120 They're thin
00:45:18.860 and they're
00:45:19.320 younger than
00:45:19.880 other countries
00:45:20.640 that did
00:45:21.020 poorly.
00:45:21.560 But if I
00:45:22.760 had to guess
00:45:23.520 why they did
00:45:24.240 well,
00:45:25.500 probably
00:45:26.080 ivermectin.
00:45:28.080 Probably
00:45:28.640 ivermectin,
00:45:29.320 I think.
00:45:30.060 Could have
00:45:31.080 been hydroxychloroquine.
00:45:33.400 Or it could
00:45:34.140 be because
00:45:34.580 they didn't
00:45:35.420 use masks
00:45:36.120 as much.
00:45:37.900 Could be
00:45:38.500 that.
00:45:41.020 But,
00:45:42.000 I don't
00:45:42.260 know,
00:45:42.460 I don't
00:45:42.700 understand how
00:45:43.360 just because
00:45:43.920 they're thinner
00:45:44.620 and younger
00:45:45.960 and they have
00:45:46.660 good vitamin D,
00:45:47.640 I don't see
00:45:48.060 how they do
00:45:48.740 well in the
00:45:49.360 pandemic.
00:45:49.840 Unless it
00:45:50.260 was ivermectin
00:45:50.960 or hydroxychloroquine
00:45:52.180 were part of
00:45:53.460 the story.
00:45:54.360 All right.
00:45:54.500 There are
00:45:55.920 two new
00:45:56.580 studies
00:45:57.100 about the
00:45:59.340 vaccinations
00:45:59.880 and about
00:46:00.420 COVID
00:46:00.780 and both
00:46:01.820 of them
00:46:02.100 are total
00:46:02.740 BS.
00:46:04.920 You want to
00:46:05.620 hear this
00:46:06.020 BS?
00:46:06.420 I'll tell
00:46:08.600 you,
00:46:08.980 I think
00:46:09.940 we can
00:46:10.220 agree that
00:46:10.760 all the
00:46:11.140 data about
00:46:11.980 the pandemic
00:46:12.520 is sketchy.
00:46:14.160 It doesn't
00:46:14.500 matter where
00:46:14.840 it comes
00:46:15.160 from,
00:46:15.420 it's all
00:46:15.680 sketchy.
00:46:16.760 But this
00:46:17.380 is the
00:46:17.920 craziest
00:46:18.280 thing you've
00:46:18.920 heard yet.
00:46:20.200 Are you ready
00:46:20.520 for this?
00:46:20.900 Now, of
00:46:22.060 course,
00:46:22.380 these are
00:46:23.080 preprints
00:46:23.700 and who
00:46:24.860 knows if
00:46:25.380 they could
00:46:25.600 be repeated
00:46:26.120 or anything.
00:46:26.900 There are
00:46:27.440 two recent
00:46:27.920 studies,
00:46:28.460 one out of
00:46:28.840 Denmark,
00:46:29.440 one out of
00:46:29.800 somewhere else,
00:46:30.860 that said
00:46:31.340 that if
00:46:31.700 you're over
00:46:32.020 50,
00:46:33.920 you actually
00:46:35.480 did better
00:46:36.240 if you got
00:46:38.200 the vaccination
00:46:38.880 than if you
00:46:39.540 didn't.
00:46:40.880 Now,
00:46:41.900 here's how
00:46:42.700 we analyze
00:46:43.500 that.
00:46:45.200 We'll start
00:46:45.860 with what we
00:46:46.400 know for
00:46:46.840 sure,
00:46:47.260 okay?
00:46:47.640 And then
00:46:47.920 this is how
00:46:49.040 you reason.
00:46:49.460 You start
00:46:49.780 with what
00:46:50.080 you know,
00:46:50.900 and then
00:46:51.500 you reason
00:46:51.940 toward the
00:46:52.400 things that
00:46:52.900 you're trying
00:46:53.180 to figure
00:46:53.460 out.
00:46:54.200 What we
00:46:54.660 know is
00:46:56.000 that I
00:46:56.320 got everything
00:46:57.020 wrong about
00:46:57.940 my decisions
00:46:58.620 about the
00:46:59.140 pandemic.
00:47:00.380 So you
00:47:00.680 start with
00:47:01.080 that as
00:47:01.380 your fact,
00:47:02.160 and then
00:47:02.940 we can
00:47:03.280 reason
00:47:03.560 backwards
00:47:04.000 to conclude
00:47:05.780 that both
00:47:06.240 of these
00:47:06.540 studies are
00:47:07.060 BS.
00:47:09.180 Because if
00:47:09.800 I know I
00:47:10.280 got the
00:47:10.620 wrong answer,
00:47:12.200 it can't
00:47:13.360 be true that
00:47:13.840 these studies
00:47:14.360 are true,
00:47:14.880 because that
00:47:15.240 would sort
00:47:16.000 of suggest
00:47:16.460 I made
00:47:17.120 a good
00:47:17.360 decision.
00:47:18.180 And since
00:47:18.540 we know
00:47:18.840 that's not
00:47:19.240 the case,
00:47:19.760 we start
00:47:20.800 with what
00:47:21.120 we know,
00:47:21.600 take the
00:47:21.960 L,
00:47:22.360 I take
00:47:22.740 the L,
00:47:23.840 L,
00:47:24.680 taking the
00:47:25.160 L,
00:47:26.140 clotting and
00:47:26.680 coping.
00:47:27.400 So we
00:47:27.720 know that's
00:47:28.160 true,
00:47:28.840 so then we
00:47:29.400 can reason
00:47:30.220 backwards that
00:47:31.300 the studies
00:47:32.120 are BS.
00:47:32.620 So I
00:47:34.020 reject them
00:47:34.720 completely
00:47:35.420 for the
00:47:37.020 BS that
00:47:37.520 they are.
00:47:40.400 By the
00:47:40.920 way,
00:47:41.200 how many
00:47:41.560 of you
00:47:41.820 understand
00:47:42.560 that I'm
00:47:44.380 accepting that
00:47:45.200 I live in a
00:47:45.800 simulation,
00:47:47.220 and that
00:47:47.700 your reality
00:47:48.460 can actually
00:47:49.260 be real,
00:47:50.760 as real as
00:47:51.340 anything,
00:47:51.820 because we're a
00:47:52.300 simulation,
00:47:53.220 and mine can
00:47:54.080 be real and
00:47:54.720 opposite,
00:47:55.940 and there's
00:47:56.640 no conflict.
00:47:58.160 There's no
00:47:58.820 conflict if I
00:47:59.660 live in a
00:48:00.040 world in
00:48:00.480 which everything's
00:48:01.220 different than
00:48:01.720 everything for
00:48:02.520 you,
00:48:03.120 as long as
00:48:03.980 it doesn't
00:48:04.340 conflict,
00:48:05.700 as long as
00:48:06.400 we can both
00:48:06.880 reproduce,
00:48:08.160 those two
00:48:08.900 worlds can
00:48:09.440 live as
00:48:10.000 completely
00:48:10.460 true,
00:48:11.540 as true as
00:48:12.100 anything else.
00:48:13.040 No less
00:48:13.520 true,
00:48:13.840 no more
00:48:14.140 true.
00:48:19.680 Have you
00:48:20.340 yet watched
00:48:21.000 Brett and
00:48:21.540 Heather?
00:48:22.260 Well,
00:48:23.440 here's something
00:48:24.060 I learned
00:48:24.580 that other
00:48:26.160 people can do
00:48:26.800 that I can't.
00:48:29.080 Suppose,
00:48:30.920 and let's
00:48:31.360 take as our
00:48:32.600 starting points,
00:48:33.360 again, we'll
00:48:33.780 use the same
00:48:34.260 technique.
00:48:35.100 Since we
00:48:35.500 know that
00:48:36.260 Heather and
00:48:38.440 Brett were
00:48:40.280 correct about
00:48:41.340 the pandemic,
00:48:43.300 we'll start
00:48:44.100 with the fact
00:48:44.560 that they're
00:48:44.880 correct,
00:48:45.920 then how
00:48:47.740 did you know?
00:48:49.680 How did you
00:48:50.340 know they were
00:48:50.780 correct?
00:48:51.760 Because I'm
00:48:52.340 going to accept
00:48:52.880 as a fact that
00:48:53.720 they were
00:48:53.980 correct,
00:48:54.880 because they
00:48:55.560 have skills.
00:48:57.480 Now, they have
00:48:58.000 skills that I
00:48:58.820 don't have,
00:48:59.300 so I can't
00:49:00.440 do what they
00:49:00.900 do, right?
00:49:02.160 I could not
00:49:02.740 look at data
00:49:03.560 and science
00:49:04.440 at the level
00:49:05.640 that they could
00:49:06.120 look at,
00:49:06.560 and I wouldn't
00:49:06.920 understand it.
00:49:08.060 But apparently
00:49:08.740 they can.
00:49:09.700 But my question
00:49:10.540 is, how did
00:49:11.340 you know that
00:49:11.980 they could,
00:49:13.920 and all the
00:49:14.500 people who
00:49:14.880 don't have
00:49:15.300 podcasts could
00:49:16.240 not?
00:49:17.440 Like, what was
00:49:18.060 it that said
00:49:18.620 to you,
00:49:19.260 the people who
00:49:20.940 are good at
00:49:21.780 this, but
00:49:23.260 also have a
00:49:23.980 podcast,
00:49:25.300 have one point
00:49:27.040 of view,
00:49:27.480 and the
00:49:27.760 people who
00:49:28.280 I
00:49:29.020 thought were
00:49:29.520 good at it,
00:49:30.120 but for
00:49:30.480 whatever reason
00:49:31.000 don't have a
00:49:31.620 podcast,
00:49:32.560 are all getting
00:49:33.020 the wrong
00:49:33.380 answers.
00:49:34.340 So, was it
00:49:35.340 the fact that
00:49:35.900 they have a
00:49:36.480 podcast that
00:49:37.840 made you think
00:49:38.640 that they're so
00:49:39.800 good you could
00:49:40.360 just take their
00:49:40.900 point of view?
00:49:42.320 Because I
00:49:42.880 looked at them
00:49:43.340 and I thought
00:49:43.700 to myself,
00:49:44.900 huh, I'm not
00:49:46.000 sure the podcast
00:49:46.760 part is actually
00:49:48.760 telling me as
00:49:49.400 much as it
00:49:49.880 should.
00:49:50.760 In fact,
00:49:51.380 when I
00:49:51.640 analyzed
00:49:55.020 big pharma,
00:49:57.560 I thought,
00:49:58.180 hey, they're
00:49:58.500 making money
00:49:59.000 on these
00:49:59.380 drugs.
00:50:00.640 Are they
00:50:01.280 completely
00:50:01.740 unbiased?
00:50:02.400 And I said,
00:50:02.860 no.
00:50:03.780 If they're
00:50:04.140 making money
00:50:04.860 on a
00:50:06.840 particular point
00:50:07.520 of view,
00:50:07.880 you can't
00:50:08.300 trust the
00:50:08.680 point of
00:50:08.960 view.
00:50:09.540 Am I
00:50:09.880 right?
00:50:10.960 And then I
00:50:11.940 unwisely
00:50:12.600 thought,
00:50:12.980 well, hey,
00:50:14.140 a lot of
00:50:14.600 these people
00:50:15.120 who have,
00:50:15.720 let's say,
00:50:16.500 different views
00:50:17.320 have podcasts
00:50:18.980 and books,
00:50:19.940 and it
00:50:20.500 appears that
00:50:20.940 they're
00:50:21.120 monetizing
00:50:21.740 their point
00:50:22.200 of view,
00:50:23.340 much like
00:50:24.080 the people
00:50:24.440 they are
00:50:24.640 criticizing.
00:50:26.040 Now,
00:50:26.980 you,
00:50:27.800 most of
00:50:28.720 you,
00:50:29.480 could look
00:50:29.920 at that
00:50:30.200 situation
00:50:30.680 and say,
00:50:31.280 okay,
00:50:31.520 the podcast
00:50:32.120 part doesn't
00:50:32.820 count,
00:50:33.580 they're just
00:50:34.160 really good
00:50:34.620 at looking
00:50:34.940 at stuff,
00:50:35.420 and you
00:50:35.740 can tell
00:50:36.100 they're good
00:50:36.440 at looking
00:50:36.820 at stuff,
00:50:39.100 but how
00:50:39.560 did you
00:50:39.780 do that?
00:50:40.980 Because that's
00:50:41.580 the part I
00:50:41.980 can't do.
00:50:43.260 To me,
00:50:43.900 I just saw
00:50:44.420 people had
00:50:44.920 different opinions
00:50:45.580 and I
00:50:45.940 couldn't
00:50:47.540 adjudicate
00:50:48.620 any of them
00:50:49.220 because I
00:50:50.000 don't have
00:50:50.300 the skills
00:50:50.680 they had.
00:50:51.160 If I
00:50:51.800 had the
00:50:52.040 skills they
00:50:52.480 had,
00:50:52.820 I wouldn't
00:50:53.100 need to
00:50:53.400 look at
00:50:53.660 their opinion.
00:50:55.280 Right?
00:50:56.340 If I
00:50:56.960 could judge
00:50:57.580 whether
00:50:57.860 Brett and
00:50:58.420 Heather
00:50:59.580 were correct,
00:51:01.840 if I
00:51:02.480 could judge
00:51:02.960 that,
00:51:04.300 it would
00:51:04.960 mean I
00:51:05.340 had their
00:51:05.740 skills.
00:51:07.400 Wouldn't
00:51:07.860 it?
00:51:08.880 I would
00:51:09.460 have to have
00:51:10.020 at least
00:51:10.440 equal or
00:51:11.000 better skills
00:51:11.700 to judge
00:51:12.780 whether they
00:51:13.280 did it.
00:51:13.700 Because remember,
00:51:14.240 your dog
00:51:14.740 can't judge
00:51:15.600 how you
00:51:17.260 do at
00:51:17.560 work.
00:50:17.900 Because your
00:51:19.480 dog doesn't
00:51:19.980 understand your
00:51:20.700 work, so
00:51:22.060 it can't
00:51:22.440 judge it.
00:51:23.720 Right?
00:51:24.200 So I'm like
00:51:24.740 the dog.
00:51:25.920 I'm like a
00:51:26.700 dog watching
00:51:28.140 Brett and
00:51:29.040 Heather go to
00:51:29.540 work.
00:51:30.460 And the
00:51:31.000 only thing I
00:51:31.520 say is,
00:51:32.720 roof,
00:51:34.080 roof.
00:51:35.760 I'm hungry.
00:51:37.020 I need to
00:51:37.540 go out.
00:51:38.600 That's all I
00:51:39.420 can add to
00:51:39.820 the conversation
00:51:40.440 because I
00:51:40.900 don't have
00:51:41.240 their skills.
00:51:42.100 But you
00:51:43.180 do.
00:51:45.080 Many of
00:51:45.660 you have
00:51:46.760 some kind
00:51:47.480 of heuristic
00:51:49.860 or rule of
00:51:50.460 thumb that
00:51:50.940 you're withholding
00:51:51.680 from me
00:51:52.380 because nobody
00:51:53.360 will explain
00:51:54.020 it to me
00:51:54.380 and I feel
00:51:54.800 like you're
00:51:55.780 doing it
00:51:56.120 intentionally.
00:51:57.300 You know
00:51:57.940 something I
00:51:58.480 don't know
00:51:59.040 and you're
00:52:00.160 all not
00:52:00.740 telling me.
00:52:03.480 You do.
00:52:04.820 How do
00:52:05.200 you know
00:52:05.680 that they
00:52:06.180 get the
00:52:06.900 right answer?
00:52:08.020 Like,
00:52:08.280 how could
00:52:08.620 you judge
00:52:09.000 that?
00:52:09.320 I'm just
00:52:09.860 like the
00:52:10.200 dog barking
00:52:10.860 at a
00:52:12.040 computer.
00:52:13.380 I'm like
00:52:13.720 barking at
00:52:14.200 the television.
00:52:14.840 Roof,
00:52:15.180 roof.
00:52:15.800 How do
00:52:16.320 they put
00:52:16.620 those pictures
00:52:17.160 on the
00:52:17.480 television?
00:52:18.400 Roof,
00:52:18.880 roof,
00:52:19.340 roof.
00:52:20.080 That's all
00:52:20.480 I have.
00:52:21.480 But I'm
00:52:21.980 completely aware
00:52:22.780 that I'm
00:52:23.240 just the
00:52:23.580 dog barking
00:52:24.140 at the
00:52:24.440 television.
00:52:25.720 How did
00:52:26.400 you know
00:52:26.940 that they
00:52:27.340 were right
00:52:27.940 and that
00:52:29.040 so many
00:52:30.140 other people
00:52:30.620 were wrong?
00:52:32.120 Now,
00:52:32.400 I heard you
00:52:32.740 say it's
00:52:33.020 because of
00:52:33.380 money,
00:52:33.840 but even
00:52:34.480 that doesn't
00:52:34.960 work for
00:52:35.340 me because
00:52:35.680 they're all
00:52:35.960 making money
00:52:36.460 in their
00:52:36.720 different ways.
00:52:38.540 How'd
00:52:39.020 you do
00:52:39.260 it?
00:52:41.040 No,
00:52:41.540 seriously,
00:52:41.900 how did
00:52:42.240 you do
00:52:42.460 it?
00:52:46.960 Evolutionary
00:52:48.480 biology?
00:52:49.220 That's not
00:52:49.760 an answer.
00:52:51.280 All right,
00:52:51.780 here's the
00:52:52.100 other thing
00:52:52.480 that I
00:52:52.760 can't do.
00:52:54.420 And I've
00:52:54.760 told you
00:52:55.020 this before.
00:52:55.500 When I
00:52:55.720 watched the
00:52:56.140 documentary
00:52:56.680 about Michael
00:52:58.320 Jackson and
00:52:59.060 the allegations
00:52:59.680 against him
00:53:00.420 and children,
00:53:02.160 it was 100%
00:53:03.760 convincing.
00:53:05.280 Boy,
00:53:05.840 was it
00:53:06.160 convincing.
00:53:06.500 So you
00:53:13.240 believe
00:53:13.580 Brett and
00:53:14.000 Heather
00:53:14.220 because they
00:53:14.860 admitted when
00:53:15.680 they were
00:53:16.260 wrong and
00:53:17.740 that increased
00:53:18.520 their credibility.
00:53:20.280 Okay,
00:53:20.580 now we're
00:53:20.900 talking.
00:53:21.680 Let's work
00:53:22.080 with that.
00:53:23.140 So would
00:53:23.520 you say
00:53:23.860 that people
00:53:24.440 who get
00:53:25.900 things wrong
00:53:26.780 and then
00:53:29.680 correct them
00:53:29.680 are more
00:53:30.320 credible than
00:53:31.260 people who
00:53:31.880 are not
00:53:33.480 telling you
00:53:33.880 they got
00:53:34.180 anything
00:53:34.440 wrong?
00:53:34.780 Is that
00:53:36.280 true?
00:53:37.080 And do
00:53:37.460 you think
00:53:37.680 scientists
00:53:38.280 don't correct
00:53:38.960 themselves when
00:53:39.640 the data
00:53:40.040 changes or
00:53:41.240 they find out
00:53:41.780 there's something
00:53:42.180 wrong?
00:53:42.900 Do the
00:53:43.400 scientists who
00:53:45.180 actually work in
00:53:45.860 the field of
00:53:46.320 science, do
00:53:47.980 they get new
00:53:48.560 data and
00:53:49.720 then they just
00:53:50.320 say, that data
00:53:50.980 doesn't agree with
00:53:51.760 my opinion, I'll
00:53:52.500 throw it out?
00:53:53.740 Is that how
00:53:54.260 other scientists
00:53:54.920 work?
00:53:55.740 Or is it
00:53:56.240 possible that
00:53:57.860 you saw
00:53:58.500 Brett and
00:53:59.120 Heather changing
00:53:59.740 their minds
00:54:00.400 only because you
00:54:01.800 were watching
00:54:02.320 them and that's
00:54:04.500 how all
00:54:04.940 scientists act?
00:54:07.080 That the
00:54:07.860 reason they're
00:54:08.320 scientists is
00:54:09.620 because they're
00:54:10.160 going to respect
00:54:10.640 the data and
00:54:11.540 the process and
00:54:12.860 that when
00:54:13.180 something changes
00:54:13.960 they change
00:54:14.480 with it.
00:54:15.780 Now I
00:54:16.400 thought that
00:54:16.960 regular
00:54:17.420 scientists were
00:54:18.160 also doing
00:54:18.900 that behind
00:54:19.400 closed doors and
00:54:20.960 I just didn't
00:54:21.500 see it.
00:54:22.520 Whereas if you
00:54:23.360 have a podcast
00:54:24.020 you do it in
00:54:25.600 public and
00:54:26.600 then people see
00:54:27.260 it and they
00:54:28.740 say, well
00:54:29.080 that's credible
00:54:29.720 because I see
00:54:31.160 them doing the
00:54:31.940 most normal
00:54:32.580 thing that any
00:54:33.280 scientist does
00:54:34.140 but they're
00:54:35.120 doing it in
00:54:35.580 public where
00:54:37.500 every other
00:54:38.120 scientist in
00:54:38.720 the world who
00:54:39.120 does exactly
00:54:39.780 the same thing
00:54:40.500 changes their
00:54:41.420 mind when the
00:54:41.960 data changes.
00:54:43.140 They do it
00:54:43.760 behind closed
00:54:44.360 doors so you
00:54:44.940 can't really
00:54:45.260 trust them.
00:54:46.820 So being on
00:54:47.280 the podcast is
00:54:48.040 what gives them
00:54:48.500 credibility.
00:54:50.840 Now that I
00:54:51.540 hadn't thought
00:54:51.940 of.
00:54:53.300 So just doing
00:54:53.940 it in public
00:54:54.660 makes you trust
00:54:56.160 them because you
00:54:56.840 can see them
00:54:57.420 change their
00:54:57.900 mind but the
00:54:59.400 people who are
00:55:00.040 also scientists
00:55:00.920 who do that
00:55:02.040 every single
00:55:02.700 day because
00:55:03.240 that's exactly
00:55:04.020 what they
00:55:04.360 signed up to
00:55:04.940 do, change
00:55:06.060 their minds.
00:55:07.960 They signed
00:55:08.740 up to change
00:55:09.240 their minds, to
00:55:10.360 find out new
00:55:10.900 things and then
00:55:12.100 adopt those
00:55:12.800 views.
00:55:14.380 Do you think
00:55:14.780 they're not
00:55:15.060 doing that behind
00:55:15.700 these closed
00:55:16.180 doors?
00:55:17.700 That really
00:55:18.260 they just keep
00:55:19.100 their same
00:55:19.480 opinions even
00:55:20.120 when the data
00:55:20.640 changes?
00:55:21.480 They just call
00:55:22.020 themselves scientists
00:55:22.780 but they're
00:55:23.220 really just
00:55:23.660 actors, maybe
00:55:25.080 crisis actors.
00:55:25.920 All right, so
00:55:29.480 that's one way.
00:55:30.260 So one way is
00:55:31.120 that they change
00:55:31.720 their minds in
00:55:32.300 public.
00:55:33.260 What are some
00:55:33.800 other ways you
00:55:34.400 know they're
00:55:34.740 right?
00:55:43.160 And the other
00:55:44.380 people followed
00:55:45.140 the money.
00:55:46.740 So the fact
00:55:47.420 that Brett
00:55:49.520 and Heather had
00:55:51.280 one sort of
00:55:52.600 business model,
00:55:54.060 do you think
00:55:54.520 that if Brett
00:55:55.640 and Heather
00:55:55.980 had decided
00:55:56.640 that new
00:55:57.860 information showed
00:55:58.780 that the
00:55:59.240 vaccination was
00:56:00.060 the best thing
00:56:00.620 that ever
00:56:00.960 happened to
00:56:01.400 the world,
00:56:02.660 do you think
00:56:03.120 that they
00:56:03.440 would adopt
00:56:04.040 that view?
00:56:05.700 Which would
00:56:06.360 be reputationally
00:56:08.400 and financially
00:56:09.100 devastating to
00:56:10.260 them.
00:56:12.820 So you trust
00:56:14.920 them to the
00:56:15.640 point where
00:56:17.200 they would not
00:56:17.800 follow the
00:56:18.280 money.
00:56:18.820 They would be
00:56:19.220 the only ones.
00:56:21.160 So you believe
00:56:22.100 that you found
00:56:22.940 the only two
00:56:23.640 humans who don't
00:56:24.280 follow the
00:56:24.700 money.
00:56:25.640 You do?
00:56:26.440 Okay.
00:56:27.400 Well that is
00:56:27.880 a very good
00:56:28.500 vote for those
00:56:29.820 two.
00:56:31.100 See I don't
00:56:31.620 have that skill.
00:56:33.580 I'm still like
00:56:34.660 the dog barking
00:56:35.360 at the TV.
00:56:36.560 Follow the
00:56:36.940 money.
00:56:37.300 Roof, roof,
00:56:38.140 roof.
00:56:38.740 It's just
00:56:39.120 different money
00:56:39.660 following a
00:56:40.100 different path.
00:56:41.480 To me it's
00:56:42.000 everybody following
00:56:42.660 the money.
00:56:44.400 Now if you
00:56:46.280 thought that
00:56:47.240 Brett and
00:56:49.800 Heather are
00:56:50.720 especially
00:56:51.300 immune from
00:56:52.140 following the
00:56:52.720 money,
00:56:52.940 what about
00:56:53.780 me?
00:56:55.020 Because you've
00:56:55.820 seen me take
00:56:56.540 the view that
00:56:57.120 makes me the
00:56:57.840 least money.
00:56:58.920 And you know
00:56:59.340 it.
00:57:00.040 You know it.
00:57:01.780 You know it.
00:57:03.200 You watch it
00:57:03.980 every day.
00:57:04.840 You see me
00:57:05.660 pissing off my
00:57:06.460 audience every
00:57:07.220 day intentionally.
00:57:08.600 Do you think I do
00:57:09.220 it to make money?
00:57:11.020 I don't know any
00:57:11.860 business model where
00:57:12.720 that works.
00:57:13.840 I'm basically being
00:57:14.800 CNN right now.
00:57:16.200 I'm being CNN.
00:57:17.000 I'm trying to be
00:57:17.840 balanced and it's
00:57:18.700 just killing me
00:57:19.760 financially.
00:57:21.800 It's horrible.
00:57:24.020 But that's what
00:57:25.860 integrity looks like.
00:57:28.220 In case you've
00:57:28.800 never seen it.
00:57:30.300 In case you've
00:57:30.900 never seen it,
00:57:31.480 that's what it
00:57:31.860 looks like.
00:57:34.540 Somebody says
00:57:35.260 it's my ego.
00:57:36.780 Yeah, my ego is
00:57:37.580 the reason that
00:57:38.380 I'm saying things
00:57:39.860 that make people
00:57:40.380 hate me.
00:57:41.460 Because of my
00:57:42.140 ego.
00:57:45.080 How do you even
00:57:45.920 connect all those
00:57:46.640 dots?
00:57:47.000 That's a lot of
00:57:47.940 dots to connect.
00:57:50.640 Dignity.
00:57:53.540 All right.
00:57:54.940 Well, that's
00:57:55.640 enough on that.
00:57:58.780 The funniest
00:58:00.640 people are the
00:58:01.340 people who tell
00:58:02.980 me that they're
00:58:03.780 going to stop
00:58:04.280 listening to me
00:58:05.120 until I stop
00:58:07.480 talking about
00:58:08.360 the interpretation
00:58:10.560 of data,
00:58:11.200 basically.
00:58:13.240 And I think,
00:58:14.480 do you really
00:58:14.900 think I'm not
00:58:15.380 going to block
00:58:15.840 you for that?
00:58:16.400 I block
00:58:18.080 everybody who
00:58:18.780 says, I
00:58:20.040 liked it when
00:58:20.620 you did
00:58:20.900 this, but
00:58:21.420 you did too
00:58:22.400 much of this
00:58:22.880 pandemic stuff.
00:58:23.840 I just block
00:58:24.560 all of them.
00:58:25.740 So they won't
00:58:26.420 have to deal
00:58:26.860 with it again.
00:58:28.380 Which I think
00:58:29.100 is fair.
00:58:30.680 That's a fair
00:58:31.440 deal.
00:58:32.140 Because the
00:58:32.740 thing is that
00:58:33.500 if I only do
00:58:36.180 what the
00:58:36.880 complainers tell
00:58:37.720 me to do,
00:58:38.660 this would be
00:58:39.320 the shittiest
00:58:39.900 experience you
00:58:40.620 ever had.
00:58:41.040 Like, oh,
00:58:43.160 these three
00:58:43.740 people told
00:58:44.360 me I must
00:58:45.060 do these
00:58:45.500 topics, but
00:58:46.560 never do
00:58:46.980 these other
00:58:47.400 topics.
00:58:48.120 That's not
00:58:48.700 the show
00:58:49.000 you want to
00:58:49.360 see.
00:58:50.800 Do a
00:58:51.360 Spaces
00:58:51.700 debate.
00:58:54.000 I've got a
00:58:54.660 prediction.
00:58:56.200 Nobody's going
00:58:56.720 to want to
00:58:57.060 debate me on
00:58:57.600 this.
00:59:01.520 Nobody's going
00:59:02.040 to want to
00:59:02.420 debate me on
00:59:02.960 this.
00:59:06.420 Yeah.
00:59:06.860 Yeah, it's
00:59:09.580 never going
00:59:09.960 to happen.
00:59:10.960 By the way,
00:59:11.640 what do you
00:59:12.260 think would
00:59:12.660 happen if
00:59:14.200 Brett and
00:59:14.820 I, Brett
00:59:15.240 Weinstein,
00:59:15.980 what would
00:59:16.340 you think if
00:59:16.840 we were to
00:59:17.900 compare notes
00:59:18.660 and if we
00:59:20.460 were to say,
00:59:21.060 all right, what
00:59:21.340 do you think
00:59:21.700 is true and
00:59:23.120 what do I
00:59:23.440 think is true?
00:59:24.100 How different
00:59:24.720 would it be?
00:59:26.560 How different
00:59:27.180 is my current
00:59:28.660 opinion from
00:59:30.660 Brett Weinstein's
00:59:31.700 current opinion
00:59:32.680 about everything
00:59:33.520 on the pandemic?
00:59:36.860 You think
00:59:39.920 it would be
00:59:40.120 90% different?
00:59:42.700 I think it
00:59:43.360 would be the
00:59:43.680 same.
00:59:45.880 It would be
00:59:46.460 the same.
00:59:49.640 You don't
00:59:50.240 think it
00:59:50.520 would be the
00:59:50.800 same?
00:59:53.600 There
00:59:54.040 wouldn't be
00:59:54.440 one thing
00:59:55.000 we disagree
00:59:55.520 on.
00:59:56.720 But you
00:59:57.320 don't realize
00:59:57.840 that, do
00:59:58.200 you?
01:00:00.160 What would
01:00:00.740 happen if I
01:00:01.320 debated Alex
01:00:02.120 Berenson?
01:00:04.340 What percentage
01:00:05.460 would we
01:00:06.100 disagree on?
00:59:06.860 Here's what
01:00:09.960 I think.
01:00:11.800 I think it
01:00:12.520 would go
01:00:12.760 like this.
01:00:14.620 I've got
01:00:15.280 this information
01:00:16.060 that says
01:00:16.680 these vaccinations
01:00:17.700 are dangerous.
01:00:19.340 And I would
01:00:20.060 say, well, I
01:00:20.760 don't believe
01:00:21.580 any pandemic
01:00:23.640 data.
01:00:24.800 Do you?
01:00:26.420 And then he
01:00:27.140 would say,
01:00:28.040 well, yeah, I
01:00:28.440 believe this
01:00:28.920 data.
01:00:30.220 And I would
01:00:30.940 say, you
01:00:33.180 live in a
01:00:33.620 world in
01:00:34.060 2023 where
01:00:35.060 anybody's
01:00:35.800 data is
01:00:36.700 credible on
01:00:37.920 the pandemic?
01:00:39.420 And in
01:00:40.020 about five
01:00:40.600 minutes, he
01:00:41.280 would agree
01:00:41.660 with me that
01:00:43.040 you can't
01:00:43.440 really trust
01:00:43.920 any of the
01:00:44.460 data.
01:00:46.560 Right?
01:00:47.540 He might
01:00:48.020 start that
01:00:48.520 way.
01:00:48.860 I mean, this
01:00:49.840 was speculation
01:00:50.580 and mind
01:00:51.060 reading, so it's
01:00:51.740 not really
01:00:52.080 fair.
01:00:52.800 But I'm
01:00:53.480 almost positive
01:00:54.560 that if I
01:00:55.840 talk to any
01:00:56.500 of the people
01:00:56.940 that you think
01:00:57.540 are opposite
01:00:58.140 of my
01:00:58.480 opinion,
01:00:59.840 that I
01:01:00.680 have the
01:01:00.960 same opinion.
01:01:02.480 Does that
01:01:03.060 blow your
01:01:03.400 mind?
01:01:03.620 Is
01:01:05.200 anybody's
01:01:05.560 mind
01:01:05.800 blown?
01:01:07.840 I don't
01:01:08.440 think we'd
01:01:08.760 have any
01:01:09.040 difference.
01:01:10.360 Everything
01:01:10.840 that you
01:01:11.200 imagined about
01:01:11.900 my different
01:01:12.400 opinion is
01:01:12.940 completely a
01:01:13.860 hallucination,
01:01:14.700 and I've
01:01:15.320 always known
01:01:15.760 that.
01:01:16.780 You know I've
01:01:17.360 always known
01:01:17.760 that, right?
01:01:18.760 That we
01:01:19.360 didn't disagree.
01:01:20.940 The people
01:01:21.560 who think they
01:01:22.240 disagree with
01:01:22.860 me are basing
01:01:23.600 it completely
01:01:24.680 on rumors
01:01:26.000 or bad
01:01:26.740 information or
01:01:27.380 whatever.
01:01:27.560 Yeah, I'm
01:01:32.840 in 14 to
01:01:33.520 16 movies at
01:01:34.380 the same
01:01:34.660 time.
01:01:36.380 But it
01:01:37.020 says, I
01:01:37.480 don't believe
01:01:38.000 you.
01:01:39.600 Well, let's
01:01:41.120 test it, right?
01:01:42.160 Let's test it.
01:01:43.400 Those of you
01:01:44.000 who think that
01:01:44.680 I should have
01:01:45.440 Brett correct
01:01:47.580 me, tell me
01:01:49.020 one statement
01:01:49.780 you believe he
01:01:50.680 thinks is true
01:01:51.480 that I think
01:01:52.660 is not true.
01:01:53.600 Go.
01:01:54.440 Tell me one
01:01:55.100 statement, just a
01:01:55.920 clean statement,
01:01:56.580 something that
01:01:58.060 Brett Weinstein
01:01:59.200 believes is
01:01:59.800 true that I
01:02:01.300 don't believe
01:02:01.900 is true.
01:02:02.700 Go.
01:02:04.900 Yeah, there's
01:02:05.520 nothing.
01:02:07.140 There's
01:02:07.580 nothing.
01:02:12.140 Ivermectin?
01:02:13.120 You think we
01:02:13.920 have a different
01:02:14.380 opinion on
01:02:14.900 ivermectin?
01:02:17.880 I don't know
01:02:18.600 what that would
01:02:18.980 be.
01:02:20.260 Here's what I
01:02:21.020 think his
01:02:21.460 opinion is, and
01:02:22.140 you tell me if
01:02:23.500 I'm wrong.
01:03:23.880 The ivermectin
01:02:26.480 story is hard
01:02:29.140 to sort out
01:02:29.840 because the
01:02:30.580 studies may
01:02:31.960 not have
01:02:32.300 focused exactly
01:02:33.140 in the right
01:02:33.540 place.
01:02:34.560 Some seem to
01:02:35.400 indicate in a
01:02:36.340 meta-analysis that
01:02:37.360 it's helpful.
01:02:39.440 So far, so
01:02:40.200 far, agree?
01:02:41.460 So far, agree
01:02:42.520 that the studies
01:02:43.560 that show it
01:02:44.720 doesn't work are
01:02:46.140 sketchy.
01:02:47.700 The studies that
01:02:48.780 show it does
01:02:49.520 work are based
01:02:50.900 on a meta-analysis,
01:02:51.960 and then I
01:02:53.260 would say this
01:02:53.840 to Brett, but
01:02:55.180 you know, a
01:02:55.540 meta-analysis
01:02:56.200 introduces
01:02:56.780 subjectivity because
01:02:59.120 the person who
01:02:59.940 does it gets to
01:03:01.040 decide what's in
01:03:01.900 and out of the
01:03:02.440 study, and
01:03:03.380 also, if one
01:03:05.260 big study is
01:03:06.500 bigger than the
01:03:07.060 others, it's
01:03:07.560 really not a
01:03:08.320 meta-analysis,
01:03:09.100 it's one study
01:03:09.780 because it biases
01:03:10.780 it by so much.
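A minimal worked sketch of that weighting point, with made-up numbers purely for illustration (this is not from the episode): in a standard fixed-effect, inverse-variance meta-analysis, each study is weighted by one over its squared standard error, so a single very large, low-variance study can carry nearly all of the weight.

def pooled_estimate(studies):
    # studies: list of (effect, standard_error) pairs; hypothetical numbers only
    weights = [1.0 / se ** 2 for _, se in studies]   # inverse-variance weights
    total = sum(weights)
    effect = sum(w * e for w, (e, _) in zip(weights, studies)) / total
    shares = [w / total for w in weights]            # each study's share of the pooled result
    return effect, shares

# Four small studies hinting at a benefit, one huge study showing none.
studies = [(-0.30, 0.20), (-0.25, 0.25), (-0.35, 0.30), (-0.20, 0.25), (0.00, 0.03)]
effect, shares = pooled_estimate(studies)
print(round(effect, 3), [round(s, 2) for s in shares])
# -0.016 [0.02, 0.01, 0.01, 0.01, 0.94]

With these invented inputs, the one big study carries about 94 percent of the weight, so the "meta-analysis" is effectively just that study's answer, which is the kind of bias being described here.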
01:03:11.760 If I said that
01:03:12.540 to Brett, would
01:03:13.680 he say, no, I
01:03:14.360 disagree with
01:03:14.980 you.
01:03:15.760 A meta-analysis
01:03:17.020 in which one
01:03:17.940 study is so big
01:03:19.920 that it biases the
01:03:20.820 whole meta-analysis,
01:03:21.640 is still really
01:03:22.700 good.
01:03:23.760 He wouldn't say
01:03:24.280 that.
01:03:25.640 Do you know
01:03:26.100 why he wouldn't
01:03:26.640 say that?
01:03:27.900 Because he's
01:03:28.540 smart.
01:03:29.580 Nobody would
01:03:30.020 say that who was smart.
01:03:32.600 And then, do
01:03:35.420 you think I
01:03:35.840 would disagree
01:03:36.380 with him when
01:03:37.040 he said that
01:03:37.580 the studies that
01:03:38.920 say ivermectin
01:03:39.780 doesn't work are
01:03:41.120 sketchy?
01:03:41.900 Do you think I
01:03:42.660 would disagree?
01:03:43.340 No, I think
01:03:43.880 all the data is
01:03:44.680 sketchy, with
01:03:45.760 no exceptions.
01:03:47.320 So we would
01:03:48.120 agree that all
01:03:48.840 the ivermectin
01:03:49.480 data is
01:03:50.380 imperfect.
01:03:53.600 Then we would
01:03:54.400 move to point
01:03:55.060 two.
01:03:55.940 If it's
01:03:56.420 imperfect, should
01:03:58.140 you have the
01:03:58.560 right to try
01:03:59.200 it, knowing
01:03:59.860 that the
01:04:00.360 risks are
01:04:00.960 low?
01:04:01.640 What would
01:04:02.200 I say?
01:04:03.500 What would
01:04:03.840 I say
01:04:04.100 to that?
01:04:06.020 Would I
01:04:06.640 disagree?
01:04:08.280 I'd say, yes,
01:04:09.220 of course.
01:04:10.400 Low risk,
01:04:12.060 potential upside,
01:04:13.760 plenty of
01:04:14.400 anecdotal reports,
01:04:15.600 and then
01:04:17.220 Brett would
01:04:18.160 agree with
01:04:18.620 me that
01:04:18.960 anecdotal
01:04:19.520 reports are
01:04:20.080 not
01:04:20.340 confirmation
01:04:21.380 and that
01:04:23.240 the meta
01:04:24.080 analysis is
01:04:24.920 not
01:04:25.140 confirmation
01:04:25.880 and the
01:04:27.260 RCTs that
01:04:28.300 say it
01:04:28.640 doesn't work
01:04:29.240 are not
01:04:30.320 super
01:04:31.260 reliable.
01:04:32.900 So it
01:04:33.500 just comes
01:04:33.880 down to,
01:04:34.440 well, I
01:04:34.740 don't know
01:04:34.940 if it
01:04:35.140 works and
01:04:35.520 I don't
01:04:35.720 know if
01:04:35.960 it does
01:04:36.380 or doesn't,
01:04:37.340 but it
01:04:38.320 should be
01:04:38.620 legal for
01:04:39.100 me to
01:04:39.340 try it.
01:04:40.560 So where
01:04:40.920 would we
01:04:41.260 disagree?
01:04:42.400 Where would
01:04:42.860 we disagree?
01:04:44.460 See, I
01:04:45.260 have a
01:04:45.560 theory that
01:04:47.320 above a
01:04:47.880 certain level
01:04:48.400 of intelligence
01:04:49.060 everybody agrees.
01:04:50.720 It just
01:04:51.380 looks like
01:04:51.840 we don't.
01:04:55.080 I don't
01:04:55.680 think there's
01:04:56.040 ever been a
01:04:56.540 disagreement.
01:04:57.880 Is it
01:04:58.340 blowing your
01:04:58.800 mind?
01:05:00.420 Try
01:05:00.680 something else.
01:05:01.940 That was the
01:05:02.360 best you
01:05:02.680 had.
01:05:03.360 So the
01:05:03.680 best you
01:05:04.080 had was
01:05:04.520 something where
01:05:04.980 we clearly
01:05:05.500 would agree.
01:05:07.320 Try
01:05:07.560 something else.
01:05:08.580 Something else
01:05:09.240 you think that
01:05:09.800 Brett believes
01:05:10.540 is, oh,
01:05:14.080 Brett might
01:05:14.520 say the
01:05:14.940 adverse events
01:05:15.820 justify, all
01:05:16.940 right, let's
01:05:17.180 talk about
01:05:17.540 that.
01:05:18.860 Do you
01:05:19.160 think that
01:05:19.680 Brett believes
01:05:20.720 that the
01:05:21.560 vaccinations are
01:05:22.480 a bad idea
01:05:23.120 for younger
01:05:24.120 people?
01:05:26.180 Yes or no?
01:05:28.940 Would Brett
01:05:29.660 say that
01:05:30.340 vaccinations are
01:05:31.620 a bad idea
01:05:32.300 for younger
01:05:32.800 people?
01:05:33.500 I think he'd
01:05:34.020 say yes.
01:05:34.740 What do you
01:05:35.100 think I'd
01:05:35.440 say?
01:05:36.520 Based on
01:05:37.160 current
01:05:37.440 information,
01:05:38.380 what would
01:05:38.740 I say?
01:05:39.060 I'd
01:05:40.580 agree.
01:05:41.840 I would
01:05:42.400 agree.
01:05:43.100 What did
01:05:43.640 I say
01:05:44.000 during the
01:05:44.600 pandemic,
01:05:45.380 before we
01:05:46.020 had a
01:05:47.300 little bit
01:05:47.600 better
01:05:47.820 information?
01:05:49.020 What did
01:05:49.400 I say?
01:05:49.900 Do you
01:05:50.360 remember me
01:05:50.840 saying, yeah,
01:05:51.380 let's
01:05:51.600 vaccinate
01:05:52.040 those kids?
01:05:53.940 Did you
01:05:54.240 ever hear
01:05:54.500 me say
01:05:54.800 that?
01:05:55.940 Have I
01:05:56.300 ever suggested
01:05:56.960 children should
01:05:57.620 be vaccinated?
01:05:59.480 Nope.
01:06:00.360 Nope.
01:06:01.440 No.
01:06:02.180 So where
01:06:02.620 would we
01:06:02.960 disagree?
01:06:04.540 Now, here's
01:06:05.720 the second
01:06:06.060 question.
01:05:06.440 People over
01:06:08.320 50, let's
01:06:09.560 say over
01:06:09.880 65, wherever
01:06:10.920 we want to
01:06:11.300 do it, would
01:06:13.500 Brett say
01:06:14.660 that for
01:06:16.300 people over,
01:06:17.180 let's say
01:06:17.460 65, would
01:06:19.200 he say that
01:06:19.660 people over
01:06:20.160 65 did
01:06:22.680 not receive a
01:06:23.460 benefit from
01:06:24.000 the shot?
01:06:25.360 What would
01:06:25.940 he say?
01:06:30.640 I think he
01:06:31.460 would say,
01:06:32.020 I don't know,
01:06:33.540 I think he
01:06:34.060 would say the
01:06:34.820 data shows
01:06:35.680 that they
01:06:36.460 had higher
01:06:37.160 survivability,
01:06:38.380 but what we
01:06:39.720 don't know
01:06:40.320 is any
01:06:42.140 long-term
01:06:42.620 consequences.
01:06:44.640 Yes or
01:06:45.360 no?
01:06:46.960 Do you
01:06:47.660 think he
01:06:47.940 would say
01:06:48.180 that?
01:06:49.400 The data
01:06:50.000 shows, the
01:06:51.180 data is, you
01:06:51.960 know, sketchy,
01:06:53.200 but the data
01:06:54.020 does show
01:06:54.760 that older
01:06:55.520 people got a
01:06:56.200 benefit, on
01:06:57.460 average, but
01:07:00.480 that there
01:07:00.800 might be long-term
01:07:01.620 consequences that
01:07:02.700 could be
01:07:03.120 catastrophic.
01:07:04.500 Would I
01:07:04.920 disagree with
01:07:05.580 that?
01:07:06.320 Would I
01:07:06.680 disagree with
01:07:07.220 that?
01:07:08.340 I think that's
01:07:09.320 the opinion, but
01:07:09.800 I don't really
01:07:10.180 know.
01:07:11.400 No, I would
01:07:11.920 agree with
01:07:12.260 that.
01:07:13.920 I have agreed
01:07:14.800 with that as
01:07:15.380 clearly as
01:07:16.020 possible.
01:07:17.460 So where
01:07:18.060 would we
01:07:18.380 disagree?
01:07:21.720 Is your
01:07:22.340 mind blown
01:07:22.860 yet?
01:07:24.060 Does it
01:07:24.620 blow your
01:07:25.000 mind that
01:07:25.640 there's no
01:07:26.040 disagreement
01:07:26.500 between me
01:07:27.400 and my
01:07:27.840 biggest critic?
01:07:29.020 He's not my
01:07:29.600 biggest critic,
01:07:30.300 but, you
01:07:30.560 know, just
01:07:30.820 for the
01:07:31.100 purpose of
01:07:31.520 this
01:07:31.700 conversation.
01:07:33.120 Is anybody's
01:07:35.100 mind blown
01:07:35.580 right now
01:07:36.040 at all?
01:07:39.280 Somebody
01:07:39.720 says, do
01:07:40.100 you believe
01:07:40.380 that data?
01:07:41.760 No matter
01:07:42.180 how many
01:07:42.880 times... Can
01:07:43.700 I get
01:07:44.020 permission to
01:07:44.580 swear?
01:07:49.980 Permission
01:07:50.420 to swear?
01:07:52.440 All right,
01:07:52.780 too many
01:07:53.040 no's.
01:07:53.700 All right,
01:07:53.940 there are
01:07:54.080 too many
01:07:54.320 no's, I'm
01:07:54.740 not going
01:07:55.280 to do it.
01:07:57.480 I'll do
01:07:58.060 this without
01:07:58.540 swearing.
01:08:00.440 Let me
01:08:01.000 answer all
01:08:01.640 of your
01:08:01.860 questions.
01:08:03.400 There's a
01:08:03.860 new study
01:08:04.340 out, Scott.
01:08:04.940 Do you
01:08:05.220 believe it?
01:08:06.180 No.
01:08:07.200 There's a
01:08:07.660 new study
01:08:08.080 out that
01:08:08.400 says the
01:09:09.040 opposite,
01:09:09.840 Scott.
01:08:09.840 Do you
01:08:10.120 believe it?
01:08:11.460 No.
01:08:12.880 There's a
01:08:13.360 new study
01:08:13.740 that says
01:08:14.100 you were
01:08:14.420 right about
01:08:14.840 everything,
01:08:15.360 Scott.
01:08:15.680 Do you
01:08:15.900 believe it?
01:08:18.040 What do
01:08:18.500 you think
01:08:18.680 I'd say?
01:08:20.000 No.
01:08:21.580 There's a
01:08:22.060 study that
01:08:22.480 disagrees with
01:08:23.120 everything you
01:08:23.540 said, Scott.
01:08:24.000 Do you
01:08:24.180 believe it?
01:08:25.380 What do
01:08:25.820 you think
01:08:25.980 my answer
01:08:26.340 is?
01:08:27.760 No.
01:08:28.160 Try to
01:08:31.860 pick up
01:08:32.140 the
01:08:32.280 pattern.
01:08:33.540 It's
01:08:33.720 all no.
01:08:35.160 It's
01:08:35.460 all no,
01:08:36.380 top to
01:08:36.840 bottom.
01:08:37.860 No
01:08:38.120 exceptions.
01:08:39.180 If it has
01:08:39.660 anything to
01:08:40.300 do with
01:08:40.540 the pandemic,
01:08:41.660 it's all
01:08:42.640 no.
01:08:43.740 No.
01:08:45.100 So you
01:08:45.400 don't ever
01:08:45.820 have to ask
01:08:46.320 me again.
01:08:47.320 If I tell
01:08:48.120 you there's
01:08:48.520 a new
01:08:48.760 study that
01:08:49.320 says wearing
01:08:50.820 masks made
01:08:52.240 your penis
01:08:52.700 grow, do
01:08:54.440 you have to
01:08:54.860 ask me if
01:08:55.340 I believe
01:08:55.760 it?
01:08:57.900 No.
01:08:59.040 No.
01:09:00.500 If there's
01:09:01.080 a new
01:09:01.320 study that
01:09:01.820 says the
01:09:02.480 vaccinations
01:09:03.100 actually made
01:09:03.880 you smarter,
01:09:05.680 do you
01:09:06.840 have to ask
01:09:07.320 me if I
01:09:07.720 believe it?
01:09:09.280 No.
01:09:11.100 No.
01:09:11.740 You can
01:09:12.000 skip that
01:09:13.060 step.
01:09:13.760 You can
01:09:14.400 skip that
01:09:14.860 step every
01:09:15.640 time.
01:09:16.780 I'm not
01:09:17.480 going to
01:09:17.700 change my
01:09:18.160 mind on
01:09:18.580 this.
01:09:19.300 I'm just
01:09:19.940 not.
01:09:20.700 The
01:09:21.100 information
01:09:21.560 is not
01:09:22.540 credible
01:09:23.300 at all.
01:09:24.860 And
01:09:27.800 basically
01:09:28.620 this is
01:09:29.060 the same
01:09:29.460 approach
01:09:30.040 I've
01:09:30.260 taken on
01:09:30.660 climate
01:09:30.980 change.
01:09:32.640 I think
01:09:33.360 it's
01:09:33.560 hilarious
01:09:33.960 that the
01:09:35.800 world was
01:09:36.340 convinced we
01:09:37.000 can measure
01:09:37.440 the temperature
01:09:38.040 of the
01:09:38.380 world.
01:09:41.500 I mean
01:09:42.120 I don't
01:09:42.480 even need
01:09:42.800 to say
01:09:43.100 anything
01:09:43.340 about that.
01:09:44.820 It's
01:09:45.340 hilarious
01:09:45.820 that anybody
01:09:46.640 ever believed
01:09:47.520 that and
01:09:48.640 that we're
01:09:48.940 measuring it
01:09:49.500 every year
01:09:50.000 and we
01:09:50.620 can tell
01:09:50.920 it's
01:09:51.080 moving a
01:09:51.560 little
01:09:51.700 bit.
01:09:53.500 That's the
01:09:54.120 most
01:09:54.360 absurd
01:09:54.980 thing
01:09:55.840 anybody
01:09:56.220 ever
01:09:56.480 believed.
01:09:57.960 Now, I don't even believe that,
01:09:59.380 and that's basically accepted as
01:10:02.160 pretty basic science:
01:10:04.180 okay, we can't measure everything, but
01:10:07.360 as long as we're consistent with
01:10:09.280 the ones that we are measuring.
01:10:10.900 It's not
01:10:12.640 good enough.
01:10:13.440 Not good
01:10:13.820 enough.
01:10:18.020 Has
01:10:18.460 health care
01:10:18.940 declined
01:10:19.540 since
01:10:19.940 Obamacare?
01:10:20.660 Well, health
01:10:21.660 care has
01:10:21.980 declined.
01:10:22.440 I don't
01:10:23.240 know if
01:10:23.460 it's
01:10:23.620 because
01:10:23.940 of
01:10:24.160 Obamacare.
01:10:27.460 By the
01:10:28.020 way I've
01:10:28.440 heard some
01:10:28.780 of the
01:10:28.980 most horrific
01:10:29.540 health
01:10:29.920 care
01:10:30.140 stories
01:10:30.660 recently.
01:10:34.200 It's
01:10:34.680 crazy.
01:10:37.080 Yeah.
01:10:38.380 I mean
01:10:38.620 I can't
01:10:40.080 tell you the
01:10:40.480 stories I've
01:10:40.980 heard but
01:10:41.420 there's one
01:10:43.080 I heard
01:10:43.360 recently
01:10:43.800 I wish
01:10:44.920 I could
01:10:45.180 tell you
01:10:45.580 but it
01:10:47.360 was a
01:10:47.720 failure of
01:10:48.460 a health
01:10:48.980 care process
01:10:49.780 at a
01:10:50.220 level which
01:10:50.920 I didn't
01:10:52.900 even think
01:10:53.280 was possible.
01:10:55.120 I mean,
01:10:55.500 it was an
01:10:56.200 incredible
01:10:57.560 failure.
01:10:58.900 A failure
01:10:59.280 like,
01:11:00.500 oh my
01:11:02.080 freaking
01:11:02.840 god,
01:11:03.380 failure.
01:11:04.700 Yeah.
01:11:05.140 And it
01:11:05.380 happens to
01:11:05.760 be my
01:11:06.080 health
01:11:06.320 care
01:11:06.520 provider
01:11:06.960 that
01:11:07.300 did it.
01:11:09.000 So it
01:11:09.360 turns out
01:11:09.900 that there's
01:11:11.340 something called
01:11:11.900 a patient
01:11:13.820 advocate.
01:11:15.020 Have you
01:11:15.160 ever heard
01:11:15.420 of that?
01:11:17.100 At least
01:11:17.480 at Kaiser,
01:11:17.940 there's a
01:11:19.140 patient
01:11:19.420 advocate
01:11:19.920 and
01:11:21.240 apparently
01:11:21.660 you can
01:11:22.800 make things
01:11:23.400 go from
01:11:23.960 nothing
01:11:24.280 happening
01:11:24.680 to
01:11:24.900 something
01:11:25.200 happening
01:11:25.640 by asking
01:11:26.800 to talk
01:11:27.260 to that
01:11:27.580 person.
01:11:29.000 Because
01:11:29.500 things are
01:11:29.920 so bad
01:11:30.540 that you
01:11:31.680 can't even
01:11:32.100 work it out
01:11:32.680 with the
01:11:33.000 people you're
01:11:33.440 working with.
01:11:34.520 You just
01:11:34.780 have to...
01:11:35.140 Well, that's
01:11:40.780 pretty good.
01:11:50.580 Like an
01:11:50.580 ombudsman.
01:11:51.500 I don't
01:11:51.800 know if
01:11:52.000 it's like
01:11:52.280 an ombudsman.
01:11:53.520 I think
01:11:53.900 it might
01:11:54.140 be more
01:11:55.100 than that.
01:11:56.680 All right,
01:11:57.180 that's all
01:11:57.460 I got for
01:11:57.760 now.
01:11:58.660 YouTube,
01:11:59.100 I'll talk
01:11:59.920 to you
01:12:00.140 tomorrow
01:12:00.440 and maybe
01:12:01.920 I'll be
01:12:02.200 right about
01:12:02.600 something
01:12:02.920 tomorrow
01:12:03.380 but not
01:12:04.320 today.
01:12:05.140 Bye for
01:12:05.500 now.
01:12:06.580 Best
01:12:06.860 live stream
01:12:07.320 you've
01:12:07.580 ever seen.