Real Coffee with Scott Adams - October 10, 2021


Episode 1525 Scott Adams: Can AI Spot Fake News For You? And What if it Could? Lots More.


Episode Stats

Length

55 minutes

Words per Minute

140.7

Word Count

7,772

Sentence Count

648

Misogynist Sentences

4

Hate Speech Sentences

10


Summary

In this episode of Coffee with Scott Adams, host Scott Adams talks about how leaf blowers quietly drain work-from-home productivity, whether anyone is giving Madonna honest advice, how TikTok keeps narcissists occupied, why Democrats' trust in the media rose during the peak fake-news years, and whether AI could spot fake news for you. Plus, his thoughts on Trump's return and a possible Pompeo primary run.


Transcript

00:00:00.000 Well, good morning, ladies and gentlemen, and everything in between.
00:00:06.740 It's time for Coffee with Scott Adams, and I think you know by now it's the best part of the living experience.
00:00:15.440 In fact, a lot of zombies, a lot of dead people, ghosts, they also enjoy this program, so it's not just for the living.
00:00:24.520 Don't get all, you know, like you're special or something.
00:00:28.360 You are special, but so are the dead.
00:00:31.900 And if you'd like to take this to the next level, and I know you're that kind of people, that's who is attracted to Coffee with Scott Adams.
00:00:40.540 Edgy people.
00:00:41.500 People who are willing to take it to the limit.
00:00:44.500 And it's time.
00:00:45.560 And all you need is a cup or a mug or a glass, a tank or a chalice, a canteen jug, or a flask, a vessel of any kind, fill it with your favorite liquid.
00:00:54.460 I like coffee.
00:00:55.040 And join me now for the unparalleled pleasure.
00:01:00.300 The dopamine here of the day, the thing that makes everything better.
00:01:04.440 It's called the simultaneous sip.
00:01:06.780 Yeah, have you heard of it?
00:01:07.920 It's great.
00:01:09.040 And it's going to happen now, if you're ready.
00:01:11.420 Are you ready?
00:01:12.040 Are you ready?
00:01:13.520 Go.
00:01:13.920 Well, I'm wearing the swaddling blanket of comfort today.
00:01:29.260 Why?
00:01:30.320 I'll tell you why, in a moment, at the proper time.
00:01:35.200 Don't rush me.
00:01:37.160 We're going to do that at the proper time.
00:01:38.760 Issue number one, most important issue, possibly in the entire world, leaf blowers.
00:01:48.160 Now, here's the thing about leaf blowers.
00:01:52.060 You don't really think that's a big problem, right?
00:01:55.760 It's minor annoyance.
00:01:57.420 Nobody likes to hear a leaf blower.
00:01:58.900 But I'm just going to tell you from my own experience, I lose an entire work day every week to leaf blowers.
00:02:07.580 One entire day of creativity and productivity every week for my whole adult life to leaf blowers.
00:02:16.860 Am I the only one?
00:02:18.480 Anybody else find you actually can't work when a leaf blower is outside?
00:02:26.340 I actually can't work.
00:02:27.920 I just have to get up and do something else.
00:02:30.760 Take a shower or something.
00:02:32.640 Yeah.
00:02:33.420 And the funny thing is, I guess they're being banned in some places, but probably not banned because of productivity.
00:02:41.700 Here's what's different.
00:02:42.860 Nobody's ever liked leaf blowers, but usually it's because you're sleeping or you wanted to take a nap or you wanted to just not have any noise by your house.
00:02:52.560 But what happens when everybody's working at home?
00:02:55.680 If everybody's working at home, the leaf blower is going to take 20% of your productivity.
00:03:03.640 It basically kills the day, especially if you have a neighbor.
00:03:07.720 So you get yours, and then the neighbor starts in the afternoon, and you've got a full day of leaf blower.
00:03:16.600 I think it's actually a gigantic problem that looks like a small one.
00:03:22.440 Because each individual who's bothered by it just feels like an annoyance, and maybe you don't complain to anybody.
00:03:28.860 So you've got millions and millions of people who are losing a day of work, and nobody knows that it's happening to other people.
00:03:36.540 I have a feeling leaf blowers are taking something like, I don't know, 2% to 5% off the GDP.
00:03:46.900 I mean, it could be that big, because you've got a lot of people working at home now, especially.
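[Editor's note: the back-of-the-envelope arithmetic behind that estimate can be sketched as below. The one-lost-day figure comes from the episode; the share of people working at home is an illustrative assumption, not measured data.]

```python
# Sketch of the leaf-blower productivity claim from the episode.
# Inputs below are illustrative assumptions, not measured data.

workdays_per_week = 5
days_lost_per_week = 1  # the "one entire work day lost per week" claim

# One day out of five is the 20% individual productivity loss mentioned.
productivity_loss = days_lost_per_week / workdays_per_week

# Assumed share of the workforce working from home (hypothetical figure).
share_working_at_home = 0.25

# If only at-home workers lose that share of their output, the
# economy-wide drag is roughly the product of the two fractions.
gdp_drag = productivity_loss * share_working_at_home

print(f"Individual loss: {productivity_loss:.0%}")  # 20%
print(f"Economy-wide drag: {gdp_drag:.0%}")         # 5%
```

Under those assumed numbers the drag lands at the top of the 2% to 5% range mentioned; a smaller work-from-home share or fewer lost days scales the estimate down proportionally.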
00:03:52.680 Well, Madonna is promoting some kind of documentary or something.
00:03:57.580 And if you've seen her, you know that she's bringing sexy back.
00:04:01.660 Madonna is bringing sexy back.
00:04:03.920 To senior citizens, mostly.
00:04:08.140 And I asked myself, does there need to be an intervention?
00:04:17.680 Because the very thing that made Madonna amazing is that she didn't care what you thought, and she was going to do it anyway.
00:04:27.260 You know, her single-minded, I don't know, drive and ambition and all that are really what made her special, what made her succeed.
00:04:38.760 But you also have to know when to quit, right?
00:04:42.720 You have to know when to hold them and know when to fold them.
00:04:45.160 And I feel like Madonna has, you know, she probably has lots of room left on her career, but I'm not sure that playing the sexy senior citizen was exactly the right way to play this.
00:04:59.460 You know, I don't know who's advising Madonna, but I feel that the advice goes like this these days.
00:05:08.620 You know, Madonna, if you're not too busy, I've been watching your latest promotions, and I just have a little productive, positive note, really not a criticism at all.
00:05:26.200 I'm just saying that some people, not me, not me, but some people, feel as though maybe because of your age and the weird way you look, I don't mean weird, I mean non-standard, I mean better than most people, but different, different.
00:05:41.760 Some people say weird.
00:05:42.580 I don't say it's weird, not me, but some people are saying that, and they're cruel and awful.
00:05:47.460 But I'm just wondering, just putting the idea out there that maybe you should, I'm saying just maybe rethink a little bit.
00:05:55.480 You're, I'm fired?
00:06:00.760 And scene.
00:06:02.700 I don't feel as though Madonna has good advice.
00:06:07.400 You know what I mean?
00:06:08.980 I don't feel like she has good advice.
00:06:12.220 Clearly there's nobody giving her honest opinion about what's going on here.
00:06:16.300 Like there's nobody in her circle at all who can just say, you know, like I'm the one person you'll listen to, and maybe play this a little differently.
00:06:29.220 And I think what bothers me about it is I'm such a big fan.
00:06:33.080 I'm a huge fan of Madonna.
00:06:34.660 Always have been.
00:06:35.680 And I think that she does have a second act, if you'd call it that, the second act.
00:06:41.680 Yeah, well, yeah, Cher, maybe Cher went the same way.
00:06:44.480 I think Cher did it better.
00:06:46.300 But I feel like she could be as substantial as ever.
00:06:52.280 Just play it differently.
00:06:53.780 Maybe play a little more to her substance, if you know what I mean.
00:06:59.220 Well, you know, we've all been making fun of TikTok and how it's destroying the world.
00:07:06.000 But I feel like there's a positive to it.
00:07:09.760 It turns out that one of the things that TikTok does, and you could substitute Instagram and Snapchat for the same conversation.
00:07:19.340 One of the things that TikTok does is it keeps all the narcissists busy.
00:07:24.740 Now, you think I'm going to make a joke, but watch this.
00:07:31.320 I'm not joking at all.
00:07:32.440 If you've studied narcissism, and by the way, I have a history of saying, first of all, it doesn't exist, and then becoming a complete convert.
00:07:41.600 It wasn't exactly a change of opinion, but there was a change of understanding that there's more than one thing called narcissism.
00:07:49.580 So the ones I was saying don't exist still don't exist.
00:07:53.960 So I was never wrong about what I was talking about.
00:07:57.200 What I didn't know, and this is an error on my part, is that the word narcissist is used in different senses.
00:08:04.080 And there's a grandiose narcissist, and there's a vulnerable narcissist, and there's some other kinds, I guess.
00:08:11.900 But one of the characteristics of narcissists is that they damage other people.
00:08:19.060 They damage other people.
00:08:20.760 The more time you spend with a narcissist, the more chance you're going to get damaged, because they do that.
00:08:26.240 That's sort of built into that.
00:08:27.820 This is the vulnerable narcissist, not the grandiose.
00:08:30.460 And to the degree that TikTok absorbs all of their time, so that the narcissist who would be out destroying other people ends up glued to a screen, interacting with other narcissists, to a large extent.
00:08:51.400 And I'm wondering if it actually is helping in some way, because it takes them out of the conversation.
00:08:58.500 You know, every moment that a vulnerable narcissist is on a device is a moment that they're not bothering you in person.
00:09:12.360 There's a big sick out at the, where is it, the FAA?
00:09:18.280 I'm not sure where that is.
00:09:21.840 So I'm not so sure that this is a bad thing.
00:09:24.660 Maybe we've identified and isolated all the most dangerous people, and we'll have them just talking to each other.
00:09:31.020 And the other great thing is that, if you understand narcissists...
00:09:38.660 Somebody paid $50 to criticize me.
00:09:46.380 So I'm going to read the criticism in full.
00:09:50.340 He paid $49.99 just to have this comment be prominent.
00:09:56.300 It says, simulation theory could only ever occur to a white, upper-class boomer living in a McMansion.
00:10:02.500 But please, tell us all about the vax, and tell us again about that one time you didn't get the promotion because a black existed.
00:10:11.240 Does that sound like anything?
00:10:14.840 So let me do some fact-checking.
00:10:16.740 I don't live in a McMansion.
00:10:19.400 I live in an actual mansion.
00:10:23.140 So at least get that part.
00:10:30.300 Anyway, I guess that's a definitional thing.
00:10:33.220 Please tell us all about the vax.
00:10:35.400 Have I ever done that?
00:10:37.520 Have I ever told you to get a vaccination?
00:10:39.940 No.
00:10:40.840 This is a person that's hallucinating.
00:10:42.940 It's an actual hallucination.
00:10:44.120 They imagined me telling them to get a vaccination, I guess.
00:10:49.280 And tell us again about that one time you didn't get the promotion because a black existed.
00:10:55.240 It was two careers, not one time.
00:11:00.700 It was two careers in which I invested a tremendous number of years of my time.
00:11:07.160 And really it was more of three, it wasn't two.
00:11:11.280 Because I lost two corporate jobs for being white and male.
00:11:15.740 And just for your information, my bosses told me that directly.
00:11:20.320 So I'm not reading between the lines.
00:11:22.180 They told me, they called me into their offices with an actual meeting.
00:11:27.240 A specific meeting just to tell me that I would never be promoted as far as I could predict.
00:11:33.720 Because I was white and male and they needed to get some more diversity.
00:11:40.040 So do you think I'm imagining it?
00:11:42.560 Do you think I'm making up that conversation?
00:11:46.040 Seriously.
00:11:46.480 Whoever asked the question for $49, do you think I just made that up?
00:11:52.120 I've been saying it for years and I worked with hundreds of people.
00:11:57.700 Hundreds and hundreds of people knew me if you take the years I worked at both corporations.
00:12:03.580 Hundreds.
00:12:04.840 You don't think any one of those hundreds of people by now would have come forth and said,
00:12:10.340 that never happened.
00:12:12.180 There was nothing like that happening in these companies during those days.
00:12:15.320 Do you know why nobody has come forward to say that that didn't happen?
00:12:19.520 Because they all experienced it.
00:12:21.800 It was universal.
00:12:23.820 It wasn't even rare.
00:12:25.820 It was the very texture of life in those days.
00:12:30.140 In San Francisco, anyway.
00:12:31.520 Where I was.
00:12:32.860 Now, I also lost a TV career for being a white guy.
00:12:39.760 On UPN, the Dilbert cartoon was running,
00:12:42.640 and had a successful first season.
00:12:45.320 And it was renewed.
00:12:47.240 But it was renewed in a season that UPN decided to have,
00:12:51.180 to primarily cater to a black audience.
00:12:55.340 Which was a pretty good marketing idea, I thought.
00:12:58.820 Because it seemed like it was an underserved population.
00:13:02.420 And they weren't doing so well in general.
00:13:04.340 So I thought, oh, we'll target a specific audience and that'll be good.
00:13:07.620 Hey, wow.
00:13:19.140 Okay.
00:13:20.360 That was the weirdest text message I ever got while I'm live streaming.
00:13:23.780 So, you paid $49 for the answer.
00:13:31.240 There's your answer.
00:13:32.720 It's 100% real that white people of my generation were discriminated against.
00:13:38.240 And directly.
00:13:39.680 We were told to our faces we couldn't be promoted.
00:13:42.900 Did you know that?
00:13:44.520 And by the way, that's why I identify as black.
00:13:47.100 Number one, because I can.
00:13:51.180 I can identify as black.
00:13:53.260 Because those are the rules, right?
00:13:54.380 You can identify with whatever you feel the most connection to.
00:13:58.960 And because I've been massively racially discriminated in my life.
00:14:04.840 Massively.
00:14:05.340 I mean, those are really big examples.
00:14:08.420 Two careers and a TV show.
00:14:10.940 That's a lot of discrimination.
00:14:13.200 And it's not in my mind.
00:14:14.560 These are very direct, obvious discriminations.
00:14:18.160 And so, when I talk about, you know, maybe there is a way we could think of reparations.
00:14:25.320 When I talk about improving education for the black community.
00:14:29.140 When I talk about all the things Trump did for, you know, enterprise zones.
00:14:33.200 And other things that were good for the black community.
00:14:36.160 I actually mean it.
00:14:37.580 Because I do have, legitimately, a connection to that community.
00:14:42.600 Now, a million differences, obviously.
00:14:45.820 But in the most critical part.
00:14:50.060 I want to swear really badly, but I'm not going to do it.
00:14:53.360 The most critical part of life.
00:14:55.920 Can you get a job?
00:14:58.560 We had the same experience.
00:15:00.180 We meaning anybody who was discriminated against, especially the black population.
00:15:05.980 Same experience.
00:15:07.720 Couldn't get a job, or keep a career, or get promoted, in my case, for your race in America.
00:15:15.300 So, yeah, that's a real connection.
00:15:17.960 And I'm really mad about it.
00:15:20.720 Because nobody should have to experience that.
00:15:23.560 Now, at the same time, I also think society had to do something about diversity.
00:15:28.320 Probably had to do something.
00:15:29.260 It was just bad for me.
00:15:31.200 Happened to be bad for me, personally.
00:15:33.140 All right.
00:15:33.500 Enough about that.
00:15:37.400 Thanks for the question, anyway.
00:15:39.640 And by the way, if you're imagining that I'm promoting that you should get vaccinated, that's never happened.
00:15:45.060 So, just check your assumptions.
00:15:47.320 That's never happened.
00:15:50.560 I'm wondering if there will ever be a free minds movement.
00:15:55.440 You know, the way there are body hackers.
00:15:59.720 You know, there are a lot of people, let's say, health hackers or whatever, who are sort of organized, semi-organized, and they're testing all different things for their physical health.
00:16:09.740 You know, trying to, I don't know, fast, and try to take this supplement and that.
00:16:15.180 And I'm wondering if there will be a similar thing for people trying to avoid brainwashing and persuasion.
00:16:24.340 Will there ever be a movement of free minds?
00:16:28.060 People just trying to figure out techniques, little hacks, to keep them from being brainwashed?
00:16:33.680 I think maybe so.
00:16:38.420 There might actually be, like, an organized movement at some point of people who are using the technology that they've learned.
00:16:46.320 Technology, technique, I guess I'd say.
00:16:48.440 Maybe technique is better.
00:16:50.600 For finding places that they have bias on their own and figuring out how it got there and figuring out how to get rid of it.
00:17:00.680 Maybe.
00:17:01.040 Yeah, we might have free brains.
00:17:05.860 Speaking of that, there was a Gallup poll talking about trusting the news, and it turns out that the Democrats, during the Trump years, their trust in the major media went way up.
00:17:21.760 Substantially higher.
00:17:23.300 More trust in the media during the Trump administration.
00:17:26.500 Meanwhile, the independents and Republicans had less trust.
00:17:31.320 Republicans dropped to a new low of 11% trust in the media.
00:17:36.000 And independents were in between, you know, 31%.
00:17:39.220 Now, just think about this.
00:17:41.420 What was happening at the exact same years that Democrats were getting a far greater trust in the media than they ever had before?
00:17:52.280 What was happening?
00:17:54.140 Well, the fake Russia collusion hoax was mostly the news.
00:17:59.320 Fake news.
00:17:59.920 The fine people hoax happened then, the drinking bleach hoax, and lots of other hoaxes, from small ones to everything else.
00:18:08.200 So during a period of unambiguously massive fake news, the kind maybe we've never seen so much in one time.
00:18:15.240 During the most obvious, grotesque, overwhelming fake news period, Democrats substantially improved their trust in the media.
00:18:29.920 It's almost hard to believe the rate at which they were being lied to.
00:18:35.600 And their trust increased substantially.
00:18:39.300 At the same time, the Republicans decreased.
00:18:42.560 What happened?
00:18:44.840 What happened?
00:18:47.420 Is it just they were hearing what they wanted to hear?
00:18:50.560 Eh, that's probably a lot of it, right?
00:18:52.200 A lot of the fake news went their way.
00:18:55.400 As long as the fake news went their way, they were good with it.
00:19:00.120 Love this fake news.
00:19:01.660 This is agreeing with me like crazy.
00:19:04.000 Hey, I thought Trump was crazy, and now the news says Trump is crazy.
00:19:08.820 Feeling pretty good about myself, being all right like that and everything.
00:19:13.840 But here's another factor.
00:19:16.720 What is it that makes people believe the news?
00:19:19.620 And what is it that makes something viral and makes it the news?
00:19:24.440 What is the quality of a story that makes it really, really big news, the one you really pay attention to?
00:19:33.260 What one quality of that story makes it the thing you can't look away from?
00:19:39.660 It's not true.
00:19:40.640 That's the primary quality that makes you watch the news, that it's not true.
00:19:47.840 If it were true, it wouldn't be interesting.
00:19:51.200 Sorry.
00:19:52.660 Reality just doesn't serve up an interesting story every day that fits into this little box.
00:19:58.580 But lies do.
00:19:59.460 You can always fill the space with a lie.
00:20:01.880 So there is a real thing, an observed effect, that the less likely it is to be true, the more attention it's going to get.
00:20:13.840 Because it's more provocative that way.
00:20:17.500 So Democrats were just basically gassing themselves with their own fake news and loving every minute of it, apparently.
00:20:26.460 Do you want to hear the most disturbing story of the day?
00:20:33.800 One that you won't even know what to say.
00:20:37.460 This will be so disturbing that you'll be at a loss for words.
00:20:44.000 So that's my challenge.
00:20:45.080 Eric, I will take that advice.
00:20:59.600 Most disturbing story.
00:21:00.920 Dr. Eli David on Twitter.
00:21:04.040 He tweets this.
00:21:04.960 I'll just read you his tweet so that I don't get anything wrong, right?
00:21:09.220 He says this week, so he works in AI, I guess, and writes about it.
00:21:14.580 He said, this week I interviewed our bright AI researcher working for one of the biggest pharmaceutical companies.
00:21:20.880 And man, do I wish I knew the name of the company.
00:21:23.340 And you will too in a moment.
00:21:25.480 So it's an AI researcher who worked for a big pharma company.
00:21:29.300 I asked him why he wants to leave his current position, so he told me the story.
00:21:34.340 His team had developed an accurate AI model predicting whether one of their expensive drugs would be effective for a patient or not.
00:21:42.060 You know, as you know, not every drug works with every patient the same way.
00:21:47.320 You could have a genetic component, maybe some other components.
00:21:50.900 I'm not sure what the AI took into account.
00:21:53.000 But apparently they got really accurate results, and they could tell you in advance whether you should take this drug.
00:21:58.900 Now, this is great, right?
00:22:00.860 How great is this?
00:22:01.780 If you knew in advance a drug wouldn't work for you, you wouldn't waste months taking a drug that wasn't helping.
00:22:08.620 And then you would have maybe had a greater chance of getting onto a drug that does help or something that helps.
00:22:14.080 So huge development.
00:22:16.640 I mean, wow.
00:22:18.360 What a step forward in our ability to give people the right medicine.
00:22:22.520 So an AI model that can predict whether an expensive drug will work.
00:22:26.920 That's big stuff.
00:22:28.520 This should have been like a world headline, right?
00:22:30.580 I mean, it's so important.
00:22:32.560 So how'd it go?
00:22:34.500 Well, not the way you hope.
00:22:36.720 Not the way you hope.
00:22:39.080 So you develop this model, and then the implications are dramatic.
00:22:42.500 Instead of wasting a whole year taking the pill to find out it doesn't work,
00:22:46.340 AI predicts in advance the efficacy for each patient.
00:22:51.360 So how did the company react?
00:22:53.000 This is still in the tweet from Dr. Eli David.
00:22:55.620 They asked him to delete the project and never mention it to anyone.
00:23:06.000 The AI model would strip them of huge profits generated by millions who take the drug and find out it's not effective for them.
00:23:15.080 Better let them suffer and pay the company instead.
00:23:24.700 What do you even say about this?
00:23:27.580 Well, the first question you should ask is, is it true?
00:23:31.520 Right?
00:23:32.140 I just told you that fake news is a big problem.
00:23:35.280 You already knew that.
00:23:36.860 Is it even true?
00:23:37.660 Is it?
00:23:40.120 What, um, there is one problem with the story, right?
00:23:44.960 A little bit too on the nose.
00:23:48.960 Isn't it?
00:23:50.460 Because isn't it exactly what you expect?
00:23:54.380 Right?
00:23:55.860 So it's a little too on the nose.
00:23:58.780 What's the second part of the story that should raise a red flag?
00:24:03.620 Second part of the story that raises a red flag.
00:24:06.400 Let's see if anybody gets it.
00:24:09.260 What would raise a red flag in this story?
00:24:15.000 That he wouldn't tell anyone?
00:24:16.360 Well, he did tell someone, but he quit to do it, allegedly.
00:24:21.860 AI model that works?
00:24:23.360 Yeah.
00:24:24.880 That is a wonderfully cynical but accurate comment.
00:24:29.720 Somebody said that the strangest part of the story that should tip you off as fake
00:24:33.100 is that somebody made an AI model that works.
00:24:37.200 That's not a bad comment right there.
00:24:40.320 Yeah.
00:24:40.860 I'm going to endorse that one.
00:24:42.980 Yeah.
00:24:43.200 Somebody made an AI model that works?
00:24:45.680 Really?
00:24:47.600 I mean, I'm sure it's happened.
00:24:50.160 But, you know, right there, you're going to, I don't know.
00:24:52.600 It's kind of hard.
00:24:53.480 Are you missing, there you go.
00:25:00.520 You're almost there.
00:25:02.400 Somebody said rogue doctor.
00:25:04.340 You know, whenever there's the rogue doctor, you know, you have to, you should put the odds
00:25:09.000 of them being right at very low.
00:25:10.460 But it's not a rogue doctor.
00:25:11.860 It's an AI person.
00:25:12.780 It's a little bit different.
00:25:15.180 Didn't name the pharma.
00:25:16.220 Well, probably did name the pharma company to the writer.
00:25:24.060 Anonymous.
00:25:25.040 Okay.
00:25:26.060 Okay.
00:25:26.540 We don't know if he's anonymous, by the way.
00:25:31.000 Yeah.
00:25:31.440 We don't know that.
00:25:32.360 The tweet doesn't name him, but I don't know if he will always be anonymous.
00:25:36.480 But that's, that is exactly the right thing you should be looking at.
00:25:40.000 Thank you.
00:25:40.880 Thank you.
00:25:41.380 We got the right answer here from Bear Wires.
00:25:45.160 Disgruntled employee.
00:25:48.040 Do you believe disgruntled employees?
00:25:51.820 Ever?
00:25:52.260 Do you know how often you should take what a disgruntled employee says as just fact?
00:26:00.080 You know, there's no context left out.
00:26:02.160 Boom.
00:26:03.340 That's a fact.
00:26:04.880 How about never?
00:26:06.640 Never would be a good time to take a disgruntled employee's opinion.
00:26:10.880 How about never?
00:26:12.460 How about not one time ever?
00:26:14.940 Now, it doesn't mean they're always wrong.
00:26:16.720 I'm just saying that in terms of credibility, it's kind of zero.
00:26:21.560 Right?
00:26:22.180 Again, it doesn't mean they're wrong.
00:26:25.040 Right or wrong is a separate question from whether it's the kind of thing you should believe
00:26:30.440 when you hear it.
00:26:31.700 I'm saying it's the kind of thing you shouldn't believe when you hear it.
00:26:34.700 It has that nature.
00:26:37.160 Disgruntled employee.
00:26:38.660 Not named.
00:26:39.580 Company isn't named.
00:26:41.380 An AI model that works.
00:26:43.540 A little too on the nose.
00:26:45.760 Now, if it's real, and by the way, I don't want to cast any aspersions on Dr. Eli David.
00:26:55.580 I have no reason to think that he would have the story wrong or anything like that.
00:27:00.420 So I'm sure that the person he talked to is a real person that he really talked to,
00:27:04.720 who really worked for a big pharma.
00:27:06.100 I'm sure he did the basic work.
00:27:08.700 So I think the basic questions are probably true.
00:27:10.940 And he doesn't work for a big media company that maybe is bought off in some way, as far as I know.
00:27:17.980 I mean, I guess you can never know, huh?
00:27:21.140 But I suppose that's true.
00:27:24.780 And how about this?
00:27:26.980 Let's take this and extend it.
00:27:29.640 Do you think it would ever be possible to write an AI,
00:27:32.840 you know, create an AI that could spot fake news for you?
00:27:39.900 What do you think?
00:27:41.380 Could you ever write an AI program that would help you spot fake news?
00:27:48.860 Fairly accurately, maybe not every time.
00:27:53.560 Yes.
00:27:55.420 The answer is absolutely.
00:27:57.480 Just wouldn't do work every time.
00:27:59.880 How do I know that an AI could do that?
00:28:02.840 Because we just did it.
00:28:05.080 We just did it right here.
00:28:07.000 We just came up with several objective standards that an AI could just remind you of.
00:28:15.580 It's an anonymous source.
00:28:18.000 It's a disgruntled employee.
00:28:20.700 It's an unusual claim that AI works.
00:28:25.480 Yeah, I'm exaggerating a little bit there.
00:28:27.180 But, you know, if this AI worked as well as reported,
00:28:31.900 it would be quite a surprise to me that somebody got that done.
00:28:37.300 And then there's the question of whether it's too on the nose.
00:28:41.420 Could an AI identify a story that was too on the nose?
00:28:46.820 Not directly, but it could do it by a poll of humans.
00:28:53.160 It could ask humans, hey, does this story match pretty much your suspicions?
00:28:59.880 Or does it not match the suspicions you already had?
00:29:03.680 And if the AI finds out that a whole bunch of people go, oh, yeah, that's exactly what I expected,
00:29:09.820 then the AI can say, oh, I have now demonstrated it's a little bit too on the nose.
00:29:14.600 It's a little bit too exactly what you thought was going to happen, isn't it?
00:29:18.500 It could also look at the sources.
00:29:20.100 The AI could say, okay, Fox News says it's true or false, and CNN says it's true or false.
00:29:28.700 And unless both of them agree on the facts, it's probably fake news.
00:29:35.440 Just an objective standard.
00:29:37.180 If the left and the right don't agree on the basic facts, don't assume it's true.
00:29:43.080 Because the stuff that is fact, like a hurricane really hit, it's the same on every news.
00:29:49.200 The ones that you know are true are the same on every news.
00:29:52.280 It's only the ones that aren't true where one news item will report it differently.
00:29:58.720 All right, so wait for that.
00:30:00.280 AI could spot fake news.
00:30:03.860 I believe it could also spot fake news by the wording of the stories and also the outlets that carry it.
00:30:11.040 Because it could identify which outlets have had the most fake news.
00:30:14.760 There might be some subjectivity about seeding that information, but you could watch it over time.
00:30:22.180 I'll bet it could also determine fake news from the wording of the people who talk about it.
00:30:27.660 Don't you think?
00:30:29.400 Because the way people write about stuff is almost a fingerprint, right?
00:30:34.300 When people are writing fake news stories, I'll bet you there's a signature in there somewhere.
00:30:38.740 A pattern, a way they talk about things that's different.
00:30:43.600 I'll bet.
00:30:44.420 And I'll bet AI could find it.
00:30:46.880 Well, the monster is back under the bed.
00:30:49.040 You thought there was no monster under the bed?
00:30:51.640 Oh, there's a monster under the bed.
00:30:53.760 But only if you're an anti-Trumper.
00:30:56.480 Because the monster is...
00:30:58.400 Trump's back!
00:31:00.780 Well, he's been back for a while, but it's increasing.
00:31:02.980 He did a rally in Iowa.
00:31:05.180 He's not announcing, but he's sort of walking us up to the line of announcing.
00:31:13.280 We need frame checkers.
00:31:15.380 Oh, that's a good way to say it.
00:31:17.520 More than fact checkers, we need to see that the story is framed correctly.
00:31:22.040 Somebody says Tim Pool is doing something like that?
00:31:24.720 I don't know about that effort, but I've been hearing about it in the comments.
00:31:28.200 So Trump's back, and the news is happy and horrified, because they've got something to talk about.
00:31:36.300 But they're horrified, I tell you, they're horrified.
00:31:38.980 And I guess Trump is doing this trick again, where he's putting the blacks for Trump,
00:31:44.180 who are wearing the T-shirts that say blacks for Trump, behind him on the stage.
00:31:49.920 I feel like it's time to get rid of that.
00:31:54.340 Now, when he was running the first time, and he did that, I said,
00:31:59.800 well, at least it's visual, and people can see some diversity back there,
00:32:04.520 and that probably works in his favor.
00:32:06.960 But obviously it looked a little artificial, right?
00:32:10.140 Nobody thought that they just spontaneously showed up and got good seats, right?
00:32:14.580 Even though you knew it was artificial, because it was visual, it probably still worked.
00:32:22.500 That was my take.
00:32:23.700 It was obviously political and fake.
00:32:26.280 But a lot of things that politicians do when they run is political and fake.
00:32:30.720 So, I mean, that's baked in.
00:32:32.240 So it was political and it was fake, but maybe it worked, just because it was visual.
00:32:37.360 Yeah, whether they were paid or not, it probably still was a good look.
00:32:41.940 But not anymore.
00:32:42.900 I mean, I feel like it's just so obviously, I don't know.
00:32:49.440 It just doesn't look even slightly genuine.
00:32:52.440 And I feel like it's just maybe working against them right now.
00:32:57.540 Pandering, or, I don't know, a little too on the nose.
00:33:03.760 I don't know.
00:33:05.920 It's a little cringy now.
00:33:07.420 There was a trending hashtag that said, Civil War is coming.
00:33:14.700 I think this was probably a Trump supporter or something.
00:33:17.620 It was one person that started trending on social media.
00:33:22.280 Let me give you a prediction.
00:33:24.260 Civil War is not coming.
00:33:27.120 Civil War is not coming.
00:33:28.660 We're not even close.
00:33:29.480 There's nothing like that coming.
00:33:32.800 Yeah, lots of protesting and complaining and, you know, we might be even more divided.
00:33:37.920 But no, there's no Civil War coming.
00:33:40.440 Do you know why?
00:33:42.180 You don't want it?
00:33:43.900 You don't need another reason.
00:33:45.940 The reason that there's no Civil War coming is that not enough people want it.
00:33:49.420 If we were anywhere near a lot of people wanting it, well, then maybe.
00:33:54.820 But no, we're nowhere near it.
00:33:56.620 Don't worry about it.
00:34:00.780 Mike Pompeo is making noise as if he's going to primary Trump, run against him.
00:34:05.860 What do you think about that?
00:34:07.640 I have two thoughts.
00:34:10.500 Thought number one.
00:34:12.960 I feel like the Republicans should run a primary.
00:34:16.260 What do you think?
00:34:19.880 I think given just the age of Trump alone and, you know, the controversy which he's created,
00:34:27.200 I think he needs to be primaried for the good of Trump as well as the nation.
00:34:33.320 Because I think Trump needs to beat the field again to really have that legitimacy that he had the first time.
00:34:42.860 Now, I think he could do it.
00:34:44.060 I think he would beat the field again.
00:34:46.960 But, I don't know, I think maybe the system requires it this time.
00:34:55.020 You know, if he were a less controversial character and younger, because age, I think, has to be factored in,
00:35:01.140 then I'd say, no, he's, you know, he's earned his place at the top.
00:35:06.860 And I think he's earned his place at the top.
00:35:08.760 But just for the benefit of the voters, I'd like to have him stress-tested a little bit.
00:35:15.020 You know, make sure he can still deliver.
00:35:17.500 If he can, he can.
00:35:19.400 Right?
00:35:19.720 If he can, he can.
00:35:20.620 And the indication is, it looks like he can.
00:35:24.340 So, I would suspect he'd win a primary.
00:35:27.060 But I feel like the system requires a little push.
00:35:31.640 You know, you need a little competition in the system.
00:35:34.960 Pompeo, I don't have a strong opinion about him one way or another.
00:35:39.660 I think Afghanistan will hang over him.
00:35:42.020 But, I don't know, he's smart.
00:35:47.380 He can communicate well.
00:35:49.940 I just don't think he brings a sizzle.
00:35:52.280 He's going to need a little more sizzle.
00:35:57.180 So, Jim Acosta is doing this opinion-y thing on CNN now.
00:36:03.240 And here's the thing that you have to be bothered by.
00:36:06.820 When you watch Fox News, the person reading the news is obviously a news person.
00:36:14.760 And the people doing the opinion stuff are obviously opinion people.
00:36:18.220 And you really don't get them confused, right?
00:36:20.220 Nobody thinks Hannity is a hard news guy.
00:36:22.440 He's an opinion guy.
00:36:23.840 Tucker Carlson, same thing.
00:36:27.240 But CNN likes to have that a little bit more gray.
00:36:32.800 So, here's Jim Acosta, famous for covering the White House under Trump.
00:36:36.820 So, you think of him as the news guy.
00:36:39.540 But now he's gone full opinion, at least on this show.
00:36:43.440 That feels dangerous to me.
00:36:46.100 I don't think you should, just as a production note,
00:36:50.500 if you're in the news business,
00:36:51.900 I think your news people need to be walled off from your opinion people a little bit.
00:36:56.440 Having your news guy do opinion,
00:37:00.260 I feel like people are going to take that as news a little bit more than they would if it were someone else.
00:37:07.900 So, just something to watch for.
00:37:10.300 Not a big problem, but something to keep an eye on.
00:37:14.500 And he says things like this.
00:37:15.880 He's talking to his guest.
00:37:16.860 He goes, I guess it was Yang.
00:37:19.500 And he said, Tucker Carlson, I mean, let's just say he's a bad person.
00:37:25.200 And he goes on claiming that he spouts off white nationalist talking points.
00:37:30.160 Which I don't think he does, by the way.
00:37:32.180 I think that's an unfair statement.
00:37:34.260 He does say things that are compatible with what other people say, people who are controversial,
00:37:40.940 people you think should be condemned, and I think should be condemned.
00:37:46.780 But that's just the fact that we humans have a lot in common with each other.
00:37:52.100 Yeah, everybody's going to have something that's in common with a group they don't like.
00:37:57.320 Don't you think that would apply to everybody?
00:37:59.860 Do you think that Biden doesn't say some things that some bad people agree with?
00:38:06.300 I'm sure he says things that bad people agree with.
00:38:09.480 Does that make Biden a bad person?
00:38:12.340 It doesn't really work that way.
00:38:14.140 Just the fact that bad people agree with you on some part of what you say,
00:38:18.040 that doesn't accrue to you.
00:38:20.540 It is not your fault that people you think are bad agree with some small part of what you say.
00:38:27.540 It doesn't work that way.
00:38:30.440 All right.
00:38:31.580 The American Medical Association has a note about ivermectin.
00:38:38.020 And it's informing doctors how to respond if their clients, their patients, ask them for ivermectin.
00:38:48.360 And it goes on to say that ivermectin is not proven.
00:38:52.660 And that as an unproven drug, you know, maybe don't recommend it.
00:38:59.440 But instead recommend this.
00:39:02.380 Here's the weird part.
00:39:03.640 So the AMA says we don't have evidence that ivermectin works.
00:39:08.600 So therefore, they're not recommending that it be prescribed.
00:39:11.880 But they are saying that if your patient wants it,
00:39:15.760 you should refer them to one of the ongoing ivermectin trials.
00:39:20.120 Why am I confused now?
00:39:33.460 Weren't we told that ivermectin definitely doesn't work by the medical community?
00:39:39.540 I thought we knew that.
00:39:42.300 Right?
00:39:43.460 Am I wrong?
00:39:44.400 Well, isn't the established medical community...
00:39:48.320 I know you might have different opinions, and there are rogue doctors, et cetera.
00:39:52.140 But isn't the established view that ivermectin has been proven to not work,
00:39:58.720 as opposed to, you know, we don't know.
00:40:01.880 Isn't that the view?
00:40:03.220 And the AMA is saying, well, if your patient wants to do it,
00:40:06.780 refer them to an ivermectin trial.
00:40:08.600 What doctor who believes that we know it doesn't work
00:40:15.280 should tell his patient to take it anyway for the trial
00:40:19.700 if he knows it doesn't work?
00:40:23.500 Remember, it's not wondering if it works.
00:40:26.760 Don't they know it doesn't work?
00:40:28.540 Isn't that the official belief?
00:40:31.100 If the official belief is that we know it doesn't work,
00:40:36.040 why are there even trials?
00:40:37.360 Who's running these trials?
00:40:41.020 Don't you think the people who are funding the trials
00:40:43.520 should have said by now,
00:40:45.640 oh, darn, we already found out it doesn't work.
00:40:49.360 Let's cut the expense of the trial right now and wind this up.
00:40:52.960 There's no reason to see the trial through to a result,
00:41:00.140 because all the other trials have shown it doesn't work.
00:41:02.620 There's no reason to finish.
00:41:03.880 This is another example where we're clearly being lied to.
00:41:11.040 We, the public, are clearly being lied to,
00:41:14.220 because if these trials are happening,
00:41:17.600 it does mean that people with real money,
00:41:20.740 and a lot of it, because it's expensive,
00:41:23.040 and people who are real professionals in this field,
00:41:26.040 because who else is going to do the study?
00:41:27.500 It's going to be professionals, right?
00:41:31.140 If they're not sure, why do you have to be sure?
00:41:36.800 Right?
00:41:37.960 If medical science itself isn't sure,
00:41:41.000 why do you have to be sure?
00:41:44.280 That's a little bit inconsistent, right?
00:41:48.460 Now, to be clear, as far as I know,
00:41:52.880 the benefit of ivermectin has not been demonstrated,
00:41:57.100 as far as I know.
00:41:58.720 I'm not the expert, but I do follow the news.
00:42:01.900 As far as I know, it has not been demonstrated.
00:42:04.460 Definitely some, there are indications.
00:42:08.520 You know, there's plenty of things that tell you maybe.
00:42:12.080 I suppose that's why they're doing the trials.
00:42:15.680 And, of course, given that the risk of it is low,
00:42:20.740 it appears that we have been more brainwashed
00:42:23.640 than informed on this topic.
00:42:26.500 Hey, here's something that I didn't know until today.
00:42:29.880 I'm here to inform you.
00:42:32.020 We all know that the vaccinations could wear off, correct?
00:42:38.340 We all know that the vaccinations could wear off.
00:42:45.280 And that antibodies can diminish over time, right?
00:42:52.060 So I think whether you got the,
00:42:54.900 you were infected or you got vaccinated,
00:42:57.020 I think your antibodies decrease over time.
00:43:00.260 But is that the problem that you think it is?
00:43:04.260 If you hear that your antibodies are decreasing over time,
00:43:08.340 what does that tell you about your risk?
00:43:11.540 It tells you you have a higher risk, right?
00:43:14.680 But it's not quite as clean as that.
00:43:17.980 It turns out there are two kinds of resistance, let me call it.
00:43:21.840 I'm going to use non-medical terms here
00:43:24.020 so that it's just simpler.
00:43:25.820 There are two ways that having some kind of prior antibodies help you.
00:43:33.300 Number one is keeping it from getting into your body in the first place
00:43:37.100 and taking hold.
00:43:38.340 So there's sort of an initial defense that keeps you from getting it in the first place.
00:43:43.600 And that will decrease over time.
00:43:45.800 Which means that your initial defense will let it in.
00:43:50.100 You will more likely, after the immunity goes down,
00:43:54.400 you will more likely catch it.
00:43:57.080 But here's the part I didn't know.
00:43:59.160 There is a second type of immunity
00:44:01.320 which doesn't decrease the same way.
00:44:04.140 So the first kind, which is you're going to get it,
00:44:08.040 could decrease to the point where you get it.
00:44:11.100 But the other kind has much more lasting permanence.
00:44:16.500 And the other kind is the kind that keeps you from dying.
00:44:19.080 So the good kind, we got, because it's keeping you from dying.
00:44:26.300 The other kind, where it's keeping you from getting it in the first place,
00:44:29.700 isn't working.
00:44:30.960 Which is...
00:44:32.340 good news or bad news?
00:44:35.020 Is that good news or bad news?
00:44:37.120 Suppose you took a vaccination,
00:44:38.420 and your ability to catch it went down,
00:44:42.080 but your ability to prevent serious illness was still solid.
00:44:47.080 Is that good or bad?
00:44:50.740 I think it's good.
00:44:53.760 Good-ish.
00:44:55.480 Because the end point is we all get vaccinated
00:44:57.740 and we also get the infection.
00:45:00.160 We just get to the end point faster.
00:45:02.360 Now, you don't want it to rage, you know,
00:45:04.940 out of control and crash your hospitals and stuff like that.
00:45:08.420 So you want your good news to, you know, be moderate good news.
00:45:11.940 You don't want great good news
00:45:13.300 because things would happen too quickly.
00:45:15.040 So there's a rate of change that's important.
00:45:17.760 But did we luck into this?
00:45:22.140 Imagine if you could have designed this on paper.
00:45:27.140 Would you have said, let's vaccinate,
00:45:29.160 and then people are just good.
00:45:30.600 They're just vaccinated.
00:45:31.540 They won't get it.
00:45:32.420 Or would you have chosen to vaccinate people
00:45:36.020 in a way that they would still get infected
00:45:37.720 so that they could get the natural immunity,
00:45:39.700 but it won't kill them.
00:45:41.740 I'm not so sure this wasn't perfect.
00:45:47.200 Are you?
00:45:48.960 Can you tell me that you know this is worse?
00:45:52.680 Because I can't tell.
00:45:55.660 So somebody in the comments says it's a bad analysis.
00:45:57.360 Do better.
00:45:58.300 Do better.
00:45:59.020 Raise your game.
00:46:00.680 I'm not saying I'm right.
00:46:02.120 I'm asking the question.
00:46:04.260 But your criticism needs to be better than that.
00:46:06.560 Just give me a reason, some kind of a,
00:46:08.900 any kind of an indication.
00:46:14.600 It's all good until they're forced.
00:46:16.280 Yeah, that's a separate question.
00:46:18.760 Yeah.
00:46:19.340 We can't tell if it's good news or bad news.
00:46:21.500 Isn't that weird?
00:46:22.020 It's one of the most gigantic factors
00:46:24.860 in our environment right now.
00:46:27.760 And we don't even know if it's good news or bad news.
00:46:30.500 The vaccine is a scam, bro.
00:46:37.480 Okay.
00:46:39.820 Well, that's what I learned today.
00:46:42.000 I don't know if I got that right,
00:46:44.220 but look how much smarter you are.
00:46:45.960 And let's talk teachers' unions,
00:46:53.700 how we can break them and solve systemic racism.
00:46:57.880 Well, I'll tell you,
00:46:59.220 if somebody wanted to run against Trump,
00:47:03.180 a Republican,
00:47:04.820 and they ran on trying to bring in black voters
00:47:07.840 to solve systemic racism via the school system,
00:47:13.180 they could win, I think.
00:47:15.960 I think a Republican could beat Trump in the primaries
00:47:19.600 by saying,
00:47:21.120 I'm going to intentionally try to find
00:47:23.540 where Republicans and the black population
00:47:25.700 have the same interest.
00:47:28.740 Mandates is one.
00:47:30.660 Mandates.
00:47:31.600 And the same interest in reducing systemic racism
00:47:34.840 via the school system.
00:47:36.840 Same.
00:47:37.800 Just run on that.
00:47:39.080 Just run on the stuff you agree on,
00:47:41.100 and you would beat Trump in the primaries.
00:47:43.580 You should read Brave New World.
00:47:46.360 It's a dystopia created
00:47:47.360 by trying to make everyone safe and happy.
00:47:52.020 Yeah, I mean,
00:47:52.960 if his writing was better,
00:47:54.140 I'd read it.
00:47:55.880 Or if it wasn't so sad,
00:47:57.320 I'd read it, I guess.
00:48:03.620 Yeah, I don't read depressing books.
00:48:05.360 That is correct.
00:48:06.040 How did I get the black nail?
00:48:11.860 This was a tragic...
00:48:13.460 This one's a little black, too.
00:48:15.200 A tragic accident involving a window.
00:48:18.220 No, I didn't close my fingers in a window.
00:48:22.680 It was more...
00:48:23.620 It was more indirect than that,
00:48:25.580 but let's just say a window was involved
00:48:27.360 and keep it there.
00:48:28.240 Well, I think Trump is running against mandates,
00:48:34.260 but he's not full-throated on it.
00:48:40.180 All right.
00:48:41.740 Yes, let's go, Brandon.
00:48:46.120 I tried to read 1984,
00:48:47.780 but it was too sad and terrible,
00:48:50.280 and I bailed out.
00:48:54.660 Jill says that she's sure
00:48:56.500 ivermectin does work.
00:48:58.860 Jill, you know you don't know that, right?
00:49:01.000 Because it can't be known.
00:49:02.720 We don't have data that says it works.
00:49:04.860 If you know some anecdotes
00:49:06.180 of somebody who got better in two days,
00:49:08.100 that's most of the people who get COVID.
00:49:10.780 Most people just get better right away.
00:49:13.440 So there should be lots of stories
00:49:14.960 of people who ate a cookie,
00:49:16.980 and within 24 hours,
00:49:18.520 their COVID had subsided.
00:49:21.160 How many people,
00:49:22.900 let's say, stubbed their toe,
00:49:26.180 and within 24 hours,
00:49:27.480 their COVID was totally under control?
00:49:31.320 A lot.
00:49:32.640 How many people took hydroxychloroquine,
00:49:35.340 and within 24 hours,
00:49:36.640 their COVID was all fine?
00:49:38.340 Lots.
00:49:39.180 Millions, probably.
00:49:40.700 How many people ate a peanut,
00:49:43.180 and within 24 hours,
00:49:44.500 their COVID was all better?
00:49:46.700 Millions.
00:49:47.260 How many people had a bowl of rice,
00:49:50.640 and within 24 hours,
00:49:52.180 their COVID was all cleared up?
00:49:54.180 A billion?
00:49:55.880 I don't know,
00:49:56.380 100 million, maybe?
00:49:58.320 So I don't care how many people you know
00:50:00.800 who took a drug,
00:50:01.880 and it cleared up within 24 hours.
00:50:03.700 It means exactly zero.
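The base-rate point here can be sketched in a few lines of Python. All numbers are invented for illustration: if roughly 90% of mild cases resolve on their own within days, then any arbitrary "treatment" will accumulate plenty of recovery anecdotes, which is why such anecdotes carry no evidential weight on their own.

```python
import random

random.seed(0)

# Assumed spontaneous short-term recovery rate for mild cases
# (an invented number, purely for illustration).
RECOVERY_RATE = 0.9

def recovery_anecdotes(n_people: int) -> int:
    """Count how many of n_people took some 'treatment' and then
    recovered anyway, regardless of what the treatment was."""
    return sum(random.random() < RECOVERY_RATE for _ in range(n_people))

# Every "treatment" produces roughly the same pile of success stories.
for treatment in ["ivermectin", "a cookie", "a bowl of rice"]:
    print(f"{treatment}: {recovery_anecdotes(10_000)} recovery anecdotes")
```

Each line prints on the order of 9,000 "successes" out of 10,000, whatever the "treatment" was; only a controlled comparison against the untreated recovery rate can distinguish a real effect from the background.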
00:50:06.800 The exception would be
00:50:10.080 the monoclonals.
00:50:12.860 I think the monoclonal antibodies,
00:50:15.140 I've heard people just feel it.
00:50:17.940 You can actually feel yourself getting better
00:50:20.100 while you're taking it.
00:50:22.160 So I think that's a different case.
00:50:27.920 Yeah, how many people took ivermectin,
00:50:30.100 and their problems did not clear up
00:50:32.120 in 24 hours?
00:50:34.440 We don't know.
00:50:36.960 Yeah, and none of those in vitro studies
00:50:38.840 means anything at all.
00:50:42.360 Somebody says it worked for Joe Rogan.
00:50:48.240 Can I see in the comments,
00:50:50.100 how many people think this is a true statement?
00:50:54.340 Ivermectin worked for Joe Rogan.
00:50:56.120 How many think that's a true statement?
00:50:58.900 Well, not one that you can confirm is true.
00:51:02.400 Yeah, unknown is the correct answer.
00:51:04.480 But
00:51:04.620 just look at the comments.
00:51:11.600 Yeah, no.
00:51:14.000 We don't know
00:51:15.100 which of the many things
00:51:17.160 that Joe Rogan did
00:51:18.460 made a difference.
00:51:21.340 We don't know if any of it
00:51:22.560 made a difference
00:51:23.280 on one person.
00:51:25.820 I mean, look at Joe Rogan.
00:51:26.900 Doesn't he look healthy?
00:51:28.560 He looks like he could beat COVID
00:51:30.380 without drugs.
00:51:32.440 Now, it would be silly to try
00:51:34.660 under a current condition.
00:51:36.920 But if you could pick anybody in the world
00:51:38.660 who could beat COVID
00:51:40.580 without drugs,
00:51:42.860 I mean, he'd be on the top of my list.
00:51:45.480 I mean, I suppose he could be younger.
00:51:47.820 But, I mean, he's about as healthy
00:51:49.060 as you can possibly get.
00:51:50.300 Yeah, so anybody who thinks
00:51:55.080 that ivermectin cured Joe Rogan
00:51:57.260 or anybody else,
00:51:58.740 I can't rule it out,
00:52:00.580 but it would be the worst take
00:52:02.480 of all time
00:52:03.200 because there's no evidence
00:52:04.900 that worked,
00:52:05.520 and there's lots of evidence
00:52:06.700 that the other stuff does work,
00:52:08.120 like the monoclonals, for example.
00:52:09.500 Somebody says,
00:52:14.120 I have some ivermectin
00:52:15.080 and we'll definitely take it
00:52:16.220 if infected.
00:52:17.820 Personal choice.
00:52:19.480 Personal choice.
00:52:20.640 I would neither talk you
00:52:21.720 into that or out of that.
00:52:23.580 I would just say,
00:52:24.440 I don't have any evidence
00:52:25.340 that it would help you,
00:52:26.600 but I don't have any evidence
00:52:27.740 that it'll hurt you.
00:52:29.480 Personal choice.
00:52:30.280 Less than 1% of those
00:52:37.780 that catch COVID
00:52:38.620 will be hospitalized.
00:52:40.000 That is a shitty data point.
00:52:44.480 Here's the data
00:52:45.840 you should never give anybody
00:52:47.300 if you want to be credible.
00:52:49.220 Only 1% of people
00:52:50.420 are getting hospitalized.
00:52:52.440 Now, or less than 1%.
00:52:53.700 The number is probably right,
00:52:57.000 but there's a pretty big difference
00:52:59.700 between a 12-year-old
00:53:01.700 and an 80-year-old.
00:53:03.500 And if you average
00:53:04.800 the 80-year-old's risk
00:53:07.680 of hospitalization
00:53:08.620 with a 12-year-old's risk
00:53:10.060 of hospitalization,
00:53:11.420 and you get a number
00:53:12.260 that's under 1%,
00:53:13.500 you have misled people.
00:53:17.240 So don't do that.
00:53:19.000 That under 1%
00:53:19.840 is a propaganda number.
00:53:21.720 It's not a useful number.
00:53:23.360 Anybody who tells you
00:53:24.500 the odds of hospitalization
00:53:26.140 are under 1%,
00:53:27.420 they're part of the propaganda
00:53:29.520 process and not part
00:53:30.740 of the information process.
00:53:32.420 It's true,
00:53:33.500 but you've averaged
00:53:34.320 things together
00:53:35.020 that only an idiot
00:53:35.840 would average together.
00:53:37.720 Like, what's the average age
00:53:39.320 of a frog plus an elephant?
00:53:41.460 And then we'll come up
00:53:42.200 with the average age
00:53:42.980 of creatures.
00:53:44.480 It's not a thing.
00:53:45.940 It's not useful.
00:53:47.860 You don't average things
00:53:48.960 like that
00:53:49.820 and think you learned something.
00:53:51.980 So don't give me
00:53:53.080 the average of people
00:53:53.960 who go to the hospital.
00:53:55.020 That's a misleading
00:53:56.280 propaganda number.
00:53:57.460 If you want to tell me
00:53:59.840 that kids have
00:54:00.500 almost no chance,
00:54:02.240 true.
00:54:03.420 And if you want to tell me
00:54:04.380 that people over 80
00:54:05.380 better watch out,
00:54:08.080 true.
00:54:08.980 But if you average
00:54:09.880 those two together,
00:54:10.940 you're not part
00:54:11.520 of the real conversation.
00:54:14.260 You're just propaganda.
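The frog-and-elephant point is just weighted averaging. A minimal sketch with invented numbers, not real COVID data: pooling a large low-risk group with a small high-risk group can yield a blended hospitalization rate under 1% even when the high-risk group's rate is 15%.

```python
# Invented illustrative numbers, not real COVID data.
groups = {
    "young":   {"cases": 950_000, "hospitalized": 475},    # 0.05%
    "elderly": {"cases": 50_000,  "hospitalized": 7_500},  # 15%
}

total_cases = sum(g["cases"] for g in groups.values())
total_hosp  = sum(g["hospitalized"] for g in groups.values())
pooled_rate = total_hosp / total_cases  # 0.7975%, i.e. "under 1%"

for name, g in groups.items():
    print(f"{name}: {g['hospitalized'] / g['cases']:.2%} hospitalized")
print(f"pooled: {pooled_rate:.2%} hospitalized")
```

The pooled figure is arithmetically true and practically useless: it describes neither group. Quoting it without the stratified rates is what turns a statistic into propaganda rather than information.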
00:54:17.820 All right.
00:54:18.460 kids with obesity,
00:54:23.000 maybe.
00:54:24.000 But I'm not even sure
00:54:25.000 kids with obesity.
00:54:26.060 Well, yeah,
00:54:26.560 I guess anecdotally
00:54:28.060 there's enough
00:54:29.040 to worry about there.
00:54:32.340 Yeah, 1% is about
00:54:33.720 4 million extra people
00:54:35.080 getting hospitalized.
00:54:36.940 4 million people.
00:54:39.720 All right.
00:54:41.420 That's all for now
00:54:42.300 and I will talk to you
00:54:43.360 tomorrow.
00:54:48.740 Bye.