Real Coffee with Scott Adams - July 31, 2022


Episode 1821 Scott Adams: Fox Shuns Trump, Depopulation Rumors And More Fun


Episode Stats

Length

1 hour and 5 minutes

Words per Minute

145.7

Word Count

9,560

Sentence Count

661

Misogynist Sentences

43

Hate Speech Sentences

32


Summary

I talk about an Israeli food tech startup that can 3D print meat, Jon Stewart falling for fake news about the burn pit bill, Fox News reportedly shunning Trump, and the rumor that the elites want to depopulate the world.


Transcript

00:00:00.000 Shall we?
00:00:01.700 Now, you will be amazed at the following thing that happens.
00:00:05.980 Watch me remember the simultaneous sip introduction.
00:00:12.200 All you need is a cup or mug or glass, a tank or chalice or stein, a canteen, jug or flask,
00:00:18.200 a vessel of any kind. Fill it with your favorite liquid.
00:00:20.540 I like coffee.
00:00:23.000 And join me now for the dopamine hit of the day,
00:00:27.720 the thing that makes everything better.
00:00:30.000 It's called the simultaneous sip.
00:00:33.520 Go.
00:00:40.740 All right, well, that was close.
00:00:43.600 So let's talk about all the cool things that are happening.
00:00:47.040 Here's something that makes me happy.
00:00:49.300 There's an Israeli food tech startup called SavorEat
00:00:53.700 that's making 3D-printed pork patties, turkey, and burgers.
00:01:00.000 YouTube just keeps going on and off, so there's no point in continuing with that.
00:01:06.820 Hey, YouTube, for some reason, YouTube just keeps turning on and off,
00:01:10.740 whereas the locals platform...
00:01:13.780 Let's see if I can even say this long enough.
00:01:17.420 It's so unstable, it gives me like 10-second shots.
00:01:21.060 I mean, I'm going to shut down YouTube in a moment, but I wanted to tell them.
00:01:24.900 Yeah, yeah, it's done.
00:01:27.620 All right, Scott doesn't get paid today.
00:01:32.620 That's what that means, by the way.
00:01:34.580 That means that the monetization for today's show is off.
00:01:40.300 Not that that matters a lot, but yes, you did pay, that's true.
00:01:43.560 And you are now my only audience.
00:01:49.600 That's right.
00:01:53.120 I still have locals, and that's all I need.
00:01:56.780 Because if YouTube had good technology, they'd be listening to me now.
00:02:02.500 It's what you've been waiting for, finally.
00:02:06.000 All right.
00:02:07.020 So anyway, this Israeli startup,
00:02:11.020 they can 3D print food.
00:02:13.560 And they can make meat.
00:02:16.600 Now, don't you think they can make something that tastes like vegetables?
00:02:20.960 Probably.
00:02:22.140 Probably.
00:02:23.400 And apparently it's, you know, who knows how healthy it is
00:02:26.540 and how many chemical additives are involved in it.
00:02:30.260 I don't know.
00:02:31.060 So I'm not going to tell you it's healthy or unhealthy.
00:02:34.380 But I feel like this is the solution to our food problems.
00:02:39.580 And what I mean is food cost.
00:02:42.140 Could you imagine, could you imagine a 3D printer in your kitchen that's your only kitchen?
00:02:50.200 You know, maybe you've got a sink and garbage and stuff.
00:02:52.640 But basically, could you imagine only having a 3D printer?
00:02:55.980 It'd be like a replicator.
00:02:57.120 Yeah, basically, it's a redneck replicator.
00:03:01.680 But, you know, the thing I complained about was how many times you have to pack and unpack food.
00:03:07.600 You know, how many times does it change hands, food, any kind of food, before it gets on the shelf?
00:03:13.480 It goes through all kinds of processing and people and distributors and wholesalers, farmers.
00:03:20.540 And then it gets to the grocery store.
00:03:22.980 And then, you know, somebody puts it on the shelf.
00:03:25.840 I put it in my cart, my cart on the thing, the thing in the car, the car into the counter, the counter into the...
00:03:31.480 Yeah, basically, it's like probably 12 to 15 times it changes hands, or it goes from one place to another to stage it.
00:03:41.460 Wouldn't you say?
00:03:43.000 Don't you think food probably changes states, meaning it's in a place or it's being shipped or it's being handled,
00:03:50.180 probably 15 times before it gets to you?
00:03:53.520 But imagine if the only thing you had to receive was the fuel for your 3D printer.
00:04:00.980 And, you know, once a week somebody comes with a bunch of, you know, basically, you know, ink, except it's for your food,
00:04:08.040 and you just print all your food.
00:04:10.160 How awesome would that be?
00:04:11.860 Very awesome.
00:04:14.600 All right.
00:04:15.980 Jon Stewart apparently fell victim to some fake news.
00:04:20.180 Are you watching that?
00:04:21.000 So Jon Stewart's been promoting the burn pit bill compensation thing.
00:04:29.200 So the veterans who apparently were injured by being around these burn pits,
00:04:36.360 Jon Stewart's trying to get them, you know, health care funding and stuff.
00:04:41.120 And I guess that bill was defeated by the GOP.
00:04:43.600 And Jon Stewart said,
00:04:45.680 Ah, you guys voted for this exact bill before, you guys being the Republicans.
00:04:51.000 But now you say, no.
00:04:53.260 But the real story, the real story is it was a fake bill.
00:04:58.340 It had poison pills and then other spending that had nothing to do with it.
00:05:02.580 So it was a bill designed to either screw Republicans or fail.
00:05:07.280 It only had two possible outcomes.
00:05:09.760 Neither of those outcomes is the one that Jon Stewart would want,
00:05:12.780 where there was a clean bill and people passed it.
00:05:16.800 Nothing like that happened.
00:05:17.780 And what happened was the Democrats made sure that it couldn't be passed,
00:05:21.840 or if it did pass, it would be bad for the country,
00:05:24.420 by throwing in some things that, you know,
00:05:26.960 effectively did not go through a normal government process
00:05:30.040 because they went through the back door.
00:05:33.540 So Jon Stewart falling for fake news.
00:05:36.280 And you would think you would have noticed that the news is fake.
00:05:44.360 When I heard that the Republicans turned down the burn pit bill,
00:05:48.300 or in other words,
00:05:50.840 when I heard that the GOP was acting anti-veteran,
00:05:56.800 what was my first thought?
00:05:59.060 Let's see.
00:05:59.780 I'm looking at the news,
00:06:01.740 and the news is telling me that Republicans
00:06:05.260 are not supportive of veterans.
00:06:10.040 What are the possible explanations for why this news is what it is?
00:06:14.780 One, Republicans have completely changed who they are overnight.
00:06:20.160 Two, it's yet another example of Democrats playing that trick
00:06:24.620 where they put a poison pill into a legitimate bill
00:06:28.460 so that you can't vote on it.
00:06:30.840 Like you can't possibly approve it.
00:06:32.360 It's just so ugly by the time you vote on it.
00:06:36.320 So I'm going to say that Jon Stewart
00:06:39.260 is at least one level of awareness below me.
00:06:43.940 And probably you.
00:06:44.940 How many of you have the same experience?
00:06:48.980 When you saw the news
00:06:50.060 that the Republicans had turned down
00:06:53.000 some kind of thing that was good for veterans,
00:06:55.180 how many of you knew it was fake news
00:06:58.100 as soon as you saw it?
00:06:59.640 Yeah, raise your hand.
00:07:00.400 All of you, right?
00:07:01.760 Pretty much everybody who watches this live stream
00:07:03.900 knew that was fake news from the jump.
00:07:06.460 And it's always fake news in the same way,
00:07:08.680 that they threw a poison pill in there,
00:07:11.180 meaning they added some funding for unrelated things
00:07:14.060 that Republicans don't like.
00:07:17.800 So do you suppose that Jon Stewart
00:07:20.800 is not operating on a level of awareness
00:07:23.900 where he understands that this is fake news?
00:07:31.080 You're right, he is.
00:07:34.960 I don't know.
00:07:36.040 It makes me wonder if Jon Stewart
00:07:37.420 was actually fooled by the news
00:07:39.220 or if he's pretending to be fooled
00:07:41.260 because, I don't know, to make a point or something.
00:07:44.200 I'm not sure.
00:07:45.880 Well, let me ask you this.
00:07:47.420 Would you say, collectively,
00:07:49.420 that Joe Biden has been successful
00:07:52.220 with major legislation?
00:07:55.800 So here's a test to you
00:07:57.420 to see if you fell for fake news.
00:07:59.960 Let's see if you fell for fake news.
00:08:02.520 Is Joe Biden, has he been successful
00:08:04.580 in large, let's say, major legislation?
00:08:07.900 Well, here's what CNN's take was
00:08:12.700 in one opinion piece anyway.
00:08:16.320 President Joe Biden has scored successes
00:08:18.260 on issues such as infrastructure
00:08:20.360 and gun safety.
00:08:22.360 So, you know, you don't have to like those bills,
00:08:24.620 but just in terms of are they major
00:08:26.240 and are they big issues, yes.
00:08:29.240 And did he get some legislation passed
00:08:31.040 that he liked?
00:08:32.320 Apparently, yes.
00:08:33.540 So infrastructure and gun safety.
00:08:35.560 Now, this is the Democrat version, right?
00:08:38.760 And the CHIPS bill.
00:08:40.400 The CHIPS bill,
00:08:42.060 I think it moves our chip manufacturing
00:08:44.860 back to the United States
00:08:45.980 or promotes that.
00:08:47.960 So that's good.
00:08:49.300 So those are three pretty big things.
00:08:51.160 And then he's got this
00:08:52.140 major climate bill
00:08:55.120 that looks like it's got a good chance
00:08:56.740 of passing in the Senate as well.
00:08:58.040 Well, that would be pretty,
00:08:59.720 wouldn't that be pretty major?
00:09:03.140 For one term,
00:09:04.400 that would be four fairly substantial bills.
00:09:10.240 I don't know.
00:09:10.740 I think they can make a compelling case
00:09:14.100 that he's been successful,
00:09:16.180 at least successful in terms
00:09:17.740 of the Democratic initiatives.
00:09:21.060 I'm not sure any of those bills are useful.
00:09:23.740 I have no idea if it's a waste of money or not.
00:09:26.060 But at least he got what he said he would get.
00:09:28.040 Ah, CHIPS is a fail because it doesn't help.
00:09:32.440 Yeah.
00:09:33.100 Most of these things have a title of the bill
00:09:36.140 that is completely misleading
00:09:37.900 to what the bill is.
00:09:39.580 So if you look at what the bill says it is,
00:09:42.600 what we really need is some kind of truth.
00:09:46.180 Truth in labeling, don't you think?
00:09:49.460 We need a truth in labeling, don't we?
00:09:52.300 Yeah.
00:09:53.040 Because these bills are all labeled
00:09:54.520 the opposite of what they are.
00:09:55.560 And the Congress makes every other product
00:09:59.620 put its ingredients on the label.
00:10:02.620 Oh, you see where I'm going?
00:10:05.660 Congress should be forced
00:10:07.060 to put its ingredients on the label.
00:10:09.680 In other words, what's in the bill?
00:10:12.360 Including, you know how if you look
00:10:14.180 at the ingredients on a package,
00:10:16.180 it will tell you the good stuff that's in it,
00:10:18.480 protein, vitamin D, and stuff.
00:10:20.560 But it also tells you how much sugar and fat, right?
00:10:25.780 So it tells you what's good,
00:10:28.680 and then it also, and trans fat.
00:10:30.420 So it also tells you what will be bad for you.
00:10:32.640 It's right there in the package.
00:10:34.720 Why can't Congress,
00:10:37.300 and let me ask Thomas Massey,
00:10:39.780 or, you know, I don't know, Tom Cotton,
00:10:43.740 or somebody who's, you know, not worthless?
00:10:48.940 How many people in Congress?
00:10:50.380 Matt Gaetz.
00:10:51.500 That's a perfect Matt Gaetz topic, isn't it?
00:10:55.480 Rand Paul.
00:10:57.840 So those are the ones I think would be capable.
00:11:01.960 Right?
00:11:02.300 Matt Gaetz could do it,
00:11:03.320 could do it because he's got nothing to lose,
00:11:05.100 and it would be hilarious.
00:11:06.680 Am I right?
00:11:08.280 Matt Gaetz could do a truth in labeling proposal
00:11:12.120 because he's got nothing to lose.
00:11:14.300 It would be awesome.
00:11:16.880 You know, he's the most dangerous politician
00:11:18.820 in the country right now.
00:11:19.860 You know that, right?
00:11:21.660 Matt Gaetz, by far, is the most dangerous
00:11:24.080 in a good or bad way,
00:11:26.280 depending on your point of view.
00:11:27.560 He's the most dangerous
00:11:28.600 because he can do anything now.
00:11:31.340 He's been moved to the category
00:11:32.940 of people who are completely free.
00:11:34.720 There's no constraints on that guy anymore.
00:11:38.820 He has nothing to lose.
00:11:40.700 So if you want somebody to tell the truth
00:11:43.340 in front of the country,
00:11:44.600 you would hire him
00:11:45.620 because it looks like he's willing to do that,
00:11:48.500 and literally nobody else is doing it.
00:11:51.820 Yeah, what happened with Matt Gaetz?
00:11:53.300 Do you remember all that alleged legal trouble
00:11:55.880 that Matt Gaetz was going to be in?
00:11:58.260 Where's that?
00:12:00.240 Do you remember that accuser?
00:12:03.120 Never heard from her, did you?
00:12:05.340 So where is this witness?
00:12:08.420 And obviously they would have,
00:12:09.940 it would be easy to get digital evidence
00:12:12.440 of some crime, right?
00:12:13.900 So anything that Matt Gaetz did on email
00:12:16.460 or whatever, they probably have that already.
00:12:19.280 I don't see any, do you see any charges?
00:12:22.620 Do you?
00:12:24.320 Was the Matt Gaetz thing entirely bullshit?
00:12:27.340 Because it looks like it.
00:12:29.120 It just disappeared.
00:12:31.040 How do things just disappear?
00:12:32.520 I mean, it looks to me that that was all fake.
00:12:38.880 Anyway, I'd like to see maybe a Thomas Massie
00:12:41.820 or Rand Paul or Matt Gaetz
00:12:44.340 or somebody who's got some, you know, gonads.
00:12:49.240 Say Lauren Boebert.
00:12:50.860 She has more balls than most of the people in Congress.
00:12:54.540 And I don't care who it is,
00:12:55.820 but somebody just needs to do a,
00:12:58.180 somebody needs to do some kind of a
00:13:02.000 truth in labeling thing for the legislation.
00:13:04.360 All right.
00:13:06.700 What is your belief about our supply chain and our ports?
00:13:13.580 Yeah, those are all good names.
00:13:15.660 What about our supply chain and our ports?
00:13:19.340 Do you think they're getting better or worse?
00:13:21.720 I guess they got better in May and June,
00:13:24.080 and I told you that problem might be solved.
00:13:26.100 But apparently they're, they're backed up again.
00:13:30.660 Yeah, the report, the reporting disappeared,
00:13:32.920 but the ports are backed up.
00:13:34.520 But yeah, it's worse.
00:13:36.540 It's worse on some ports.
00:13:38.740 There are some ports that don't seem to be terribly impacted.
00:13:43.400 Ports owned by China?
00:13:45.440 That's not true, is it?
00:13:47.680 Is that true?
00:13:48.860 That doesn't feel true,
00:13:50.640 that the ports that are having trouble
00:13:52.020 have some Chinese ownership.
00:13:53.580 I don't, I don't think that's true, is it?
00:13:55.620 That doesn't sound right.
00:13:57.560 All right.
00:14:00.580 Here's what I wonder.
00:14:01.880 Is it because demand is so high?
00:14:04.740 Is our economy cooking so hard
00:14:06.880 that we just can't deliver all the goods?
00:14:10.420 It's incompetence, you think.
00:14:14.560 Yeah.
00:14:16.700 Oh, parts from China are delayed.
00:14:18.920 Well, but that wouldn't affect
00:14:20.100 the unloading of the ships.
00:14:21.800 So I guess we've got some question
00:14:23.160 about the reporting on the shipping supply chain.
00:14:26.360 Here's the other thing I don't know.
00:14:28.860 Don't you think that by now,
00:14:31.280 and this is just speculation,
00:14:33.480 and it's part of the Adams Law
00:14:35.320 of slow-moving disasters,
00:14:37.360 if you knew there was going to be
00:14:38.780 a long-term supply chain backlog,
00:14:41.900 what ways would you adjust?
00:14:44.560 I feel like the ways you would adjust
00:14:46.480 is you would unload
00:14:47.400 the most important ships first.
00:14:49.920 Am I right?
00:14:51.360 Because normally I would imagine,
00:14:53.240 this is just speculation,
00:14:55.120 I would imagine the ships get unloaded
00:14:57.200 first come, first serve.
00:14:59.300 Does that make sense?
00:15:01.060 Probably.
00:15:01.640 If everything's fast,
00:15:02.740 you just do first come, first serve.
00:15:04.600 But if you've got an emergency,
00:15:06.240 and there's a backlog,
00:15:07.900 I feel like the most important ones
00:15:09.680 probably go to the front of the line.
00:15:10.940 What do you think?
00:15:13.660 You know, like,
00:15:14.160 if something has microchips in it,
00:15:15.960 don't you think that goes
00:15:16.660 to the front of the line?
00:15:18.500 Perishables, microchips.
00:15:20.460 Yeah, so it could be that
00:15:21.860 we're not noticing the impact,
00:15:24.400 because if you buy some consumer good,
00:15:26.960 it takes two months instead of a week.
00:15:30.040 But, you know,
00:15:30.860 it's some little consumer good.
00:15:32.340 It's a toaster.
00:15:33.820 You know, if you buy a toaster,
00:15:35.140 maybe it takes a month to get it.
00:15:36.500 But if you buy food,
00:15:38.180 it seems to be there.
00:15:40.520 Right?
00:15:41.200 So it could be that
00:15:42.260 they're doing the important stuff,
00:15:43.940 so we don't really notice
00:15:45.060 there's much difference.
00:15:46.060 And I don't see any reporting on that, do you?
00:15:48.200 Why am I the first person
00:15:50.700 who is floating the notion
00:15:52.780 that they may have changed
00:15:54.500 their priorities at the port
00:15:55.660 and completely lessened
00:15:57.440 the impact on the public?
00:15:59.000 Or they didn't,
00:16:00.520 which would be also
00:16:01.560 a gigantic story, right?
00:16:03.420 If the ports did nothing different,
00:16:05.340 and all they did is
00:16:07.220 suffer the problem,
00:16:09.420 well, I'd want to know about that.
00:16:11.400 Wouldn't you?
00:16:12.160 Wouldn't you want to know
00:16:12.880 that they were doing nothing
00:16:13.800 to help the problem?
00:16:15.760 So, I don't know.
00:16:16.640 The total lack of reporting on this
00:16:18.300 is kind of interesting.
00:16:20.500 All right, now here's
00:16:21.320 a really interesting situation
00:16:23.840 brewing with Trump and Murdoch.
00:16:27.520 So my understanding is
00:16:29.040 that various Murdoch-owned entities,
00:16:32.160 they would include
00:16:34.080 the Wall Street Journal
00:16:36.420 and the, is it,
00:16:38.220 which is it,
00:16:38.820 New York Post or Daily News?
00:16:40.100 Is it the New York Post?
00:16:41.780 Which one?
00:16:42.180 Yeah, New York Post.
00:16:43.620 All right, and Fox News.
00:16:46.400 So the New York Post
00:16:47.460 and the Wall Street Journal
00:16:48.220 have both,
00:16:49.440 at least reportedly,
00:16:51.940 turned anti-Trump.
00:16:54.100 Can you confirm that?
00:16:55.420 That's true, right?
00:16:56.260 And the reporting is,
00:16:58.440 and I don't know
00:16:58.860 how reliable this is, right?
00:17:00.540 I wouldn't assume
00:17:01.440 this is totally reliable
00:17:02.540 because it requires
00:17:03.500 a little bit of mind reading.
00:17:05.080 But it sounds like
00:17:06.240 the Murdoch family,
00:17:07.500 father and, you know,
00:17:09.520 Lackland, probably the CEO,
00:17:11.540 seem to be anti-Trump.
00:17:13.280 And the latest reporting
00:17:14.980 is that Fox News
00:17:16.080 is shunning Trump.
00:17:19.040 Now, the opinion people
00:17:20.780 are still talking about him,
00:17:22.300 but the network
00:17:23.700 is not having him on,
00:17:24.840 and I think it didn't cover
00:17:26.060 his last speech,
00:17:27.480 which I don't think
00:17:28.380 is a big deal
00:17:28.980 because he's not in office.
00:17:31.720 But do you think that's true?
00:17:33.060 Do you think Fox News
00:17:33.980 is going to shun Trump?
00:17:37.440 So, yes or no?
00:17:40.800 I feel like it's going to be a mix.
00:17:42.840 You know, one of the interesting
00:17:43.620 things about Murdoch
00:17:44.900 is he does allow
00:17:47.940 dissenting voices
00:17:49.020 on his platforms.
00:17:51.100 Like, you know,
00:17:51.800 I'm not talking about
00:17:53.560 slightly dissenting.
00:17:56.360 Murdoch does allow
00:17:57.400 a lot of voices
00:17:58.020 on his network.
00:17:59.780 Now, you could argue
00:18:00.840 maybe more,
00:18:01.800 maybe he's got his thumb
00:18:03.280 on stuff,
00:18:04.160 but I've always noticed that.
00:18:05.740 Have you?
00:18:07.380 Yeah.
00:18:08.000 Now, full disclosure,
00:18:10.600 you know,
00:18:11.380 I've published books
00:18:12.300 under an entity
00:18:13.240 at one point
00:18:13.960 that was Murdoch,
00:18:15.480 and, yeah,
00:18:17.980 I've sort of been
00:18:18.760 associated with that
00:18:20.100 whole Murdoch world.
00:18:22.940 And my impression was
00:18:26.760 they never tried
00:18:27.320 to censor me,
00:18:28.520 at least in the publishing wing.
00:18:33.540 Fentanyl and the distribution.
00:18:35.000 I don't know what that means.
00:18:36.780 Anyway,
00:18:37.660 what happens if
00:18:38.760 Murdoch decides
00:18:39.940 that Trump will not be president
00:18:41.400 while Trump is deciding
00:18:43.120 he's running
00:18:43.760 and going to be president?
00:18:45.760 Take a vote.
00:18:46.880 Do you think that Murdoch
00:18:47.960 is strong enough
00:18:48.940 to prevent Trump
00:18:50.780 from being president
00:18:51.680 if, let's say,
00:18:53.180 he had the votes?
00:18:55.800 Oh, you don't think so.
00:18:57.280 So most of you
00:18:57.960 are saying
00:18:58.380 we have some yeses,
00:19:00.800 but it looks like
00:19:01.580 the majority of you,
00:19:02.740 maybe at least 75%,
00:19:04.340 are saying
00:19:05.660 that you think
00:19:07.640 Trump would have
00:19:08.300 more power
00:19:09.380 than Murdoch?
00:19:11.840 Hard to say.
00:19:13.120 I do think that
00:19:14.340 taking Trump
00:19:15.740 off the major platforms
00:19:17.360 would have an impact.
00:19:20.500 You think he transcends media,
00:19:22.600 but you still have to see him.
00:19:24.840 Right?
00:19:24.980 He's off of Twitter.
00:19:26.100 If he's off of Twitter
00:19:27.420 and off of Fox News,
00:19:29.420 or even reduced
00:19:30.560 in Fox News
00:19:31.300 to like 50%,
00:19:32.380 how can he win?
00:19:36.220 Yeah,
00:19:36.760 because most of the country
00:19:37.580 is not on Twitter.
00:19:39.380 You know,
00:19:40.280 Twitter is smallish
00:19:41.520 if you look at
00:19:42.200 the whole country.
00:19:45.760 YouTube?
00:19:46.580 You know,
00:19:46.880 YouTube?
00:19:47.480 I don't know.
00:19:50.340 Well,
00:19:50.840 keep an eye on that.
00:19:52.680 I do think
00:19:53.800 Murdoch
00:19:54.340 could keep Trump
00:19:55.160 out of office.
00:19:57.720 So,
00:19:58.840 you know,
00:19:59.440 I don't think
00:19:59.880 it's binary.
00:20:01.200 I don't think,
00:20:01.720 you know,
00:20:01.920 you could say
00:20:02.320 he definitely would
00:20:03.180 or definitely wouldn't.
00:20:04.280 But I think
00:20:04.780 I think
00:20:05.820 if Murdoch
00:20:06.440 is absolutely
00:20:07.460 set on Trump
00:20:09.540 not being president,
00:20:10.460 I think he can prevent it.
00:20:12.840 And what country
00:20:13.920 does Murdoch,
00:20:15.680 what is he a citizen of?
00:20:18.200 Oh, yeah.
00:20:19.840 Australia.
00:20:21.980 Australia, right?
00:20:24.640 So,
00:20:25.560 correct me
00:20:25.940 if I'm wrong,
00:20:27.440 but an Australian
00:20:28.480 will decide
00:20:29.220 who's president.
00:20:29.840 Am I wrong?
00:20:32.540 It looks like
00:20:33.360 that's the case.
00:20:34.400 Because I don't,
00:20:35.260 it seems to me
00:20:35.920 that a Republican
00:20:36.680 can only win
00:20:37.680 if the Republican
00:20:39.100 machine
00:20:39.500 is strongly
00:20:40.280 behind that candidate.
00:20:42.360 I would think.
00:20:43.920 And I don't see
00:20:44.560 that the Republican
00:20:45.220 machine would operate
00:20:46.300 unless
00:20:46.740 Murdoch
00:20:47.780 is fully on board.
00:20:51.440 Yeah.
00:20:52.100 So I think
00:20:52.640 we have another case
00:20:53.520 where a foreign entity
00:20:54.640 is completely
00:20:55.440 controlling
00:20:56.020 our outcomes.
00:20:57.460 There you go.
00:21:02.060 Why have we
00:21:02.800 never talked
00:21:03.300 about that?
00:21:04.820 You never hear
00:21:06.780 anybody talk
00:21:07.420 about
00:21:07.800 an Australian
00:21:09.700 billionaire
00:21:10.380 controlling America.
00:21:12.560 Because it
00:21:13.280 kind of
00:21:13.900 happened,
00:21:15.300 didn't it?
00:21:16.820 Well,
00:21:17.540 we'll see.
00:21:19.660 Here's a
00:21:20.420 typical CNN
00:21:21.400 story.
00:21:21.980 This is from
00:21:22.380 yesterday.
00:21:22.960 I forgot to do it
00:21:23.640 yesterday.
00:21:24.880 So here's the
00:21:25.700 headline.
00:21:26.880 And then I'll
00:21:27.440 tell you what
00:21:27.900 the story is.
00:21:28.480 So the headline
00:21:28.920 says,
00:21:29.940 an entire
00:21:30.420 North Carolina
00:21:31.320 police department
00:21:32.220 resigned after
00:21:33.800 a black woman
00:21:34.760 town manager
00:21:35.600 was hired.
00:21:36.940 And it gets
00:21:37.740 worse.
00:21:38.980 All of the
00:21:39.520 people who
00:21:39.980 resigned were
00:21:41.100 white.
00:21:43.740 So the
00:21:44.300 entire,
00:21:44.960 there were only
00:21:45.480 eight of them.
00:21:46.040 It was a small
00:21:46.520 town.
00:21:47.120 But the entire
00:21:47.820 North Carolina
00:21:48.480 police department
00:21:49.300 resigned after
00:21:50.220 a black woman
00:21:51.040 town manager
00:21:52.400 was hired to
00:21:53.140 be their
00:21:53.460 boss,
00:21:54.280 basically.
00:21:54.600 So that's
00:21:56.720 pretty bad.
00:21:58.520 Totally real
00:21:59.280 news?
00:22:00.940 Oh, let's
00:22:01.460 dig into the
00:22:02.280 details also on
00:22:03.360 CNN.
00:22:04.280 So I'm not
00:22:04.820 going to any
00:22:05.400 other source.
00:22:06.060 I'm looking at
00:22:06.700 their title.
00:22:07.700 And now I'll
00:22:08.080 look at the
00:22:08.500 details.
00:22:12.060 The evidence
00:22:13.180 is that the
00:22:14.200 new boss,
00:22:15.840 who happened
00:22:16.820 to be a
00:22:17.160 black woman,
00:22:18.500 made them do a
00:22:19.900 whole bunch of
00:22:20.440 extra work
00:22:21.240 compared to what
00:22:22.380 they used to do,
00:22:23.080 and they
00:22:24.560 thought it
00:22:24.840 was more
00:22:25.460 than they
00:22:25.780 could handle
00:22:26.240 and they
00:22:26.760 all quit.
00:22:28.140 Now the
00:22:28.780 reason they
00:22:29.280 quit was
00:22:29.860 explicitly
00:22:30.620 specific things
00:22:32.100 that she
00:22:32.420 was doing,
00:22:33.420 which even
00:22:34.220 you could look
00:22:34.800 at from the
00:22:35.240 outside.
00:22:35.620 Now there
00:22:35.800 was a
00:22:36.080 CRT.
00:22:37.300 She was
00:22:37.920 just making
00:22:38.400 them do
00:22:38.920 way more
00:22:39.940 than their
00:22:40.480 normal job,
00:22:41.280 basically,
00:22:42.160 or historically
00:22:43.060 more than
00:22:43.560 their normal
00:22:44.040 job.
00:22:44.820 So it
00:22:45.460 looked like
00:22:45.800 it was just
00:22:46.180 a normal
00:22:47.540 work-related
00:22:48.700 thing.
00:22:49.700 But then
00:22:50.440 CNN throws
00:22:51.340 in this:
00:22:51.860 Studies show
00:22:53.060 people in
00:22:53.580 organizations
00:22:54.580 often think,
00:22:56.100 now let's
00:22:56.640 see if you
00:22:57.020 agree with
00:22:57.440 this,
00:22:58.080 racist
00:22:58.640 opinion.
00:23:00.060 Studies show
00:23:00.680 people in
00:23:01.060 organizations
00:23:01.540 often think
00:23:02.220 black women
00:23:03.120 are more
00:23:04.040 likely to
00:23:04.680 have angry
00:23:05.320 personalities,
00:23:06.960 with studies
00:23:07.460 also suggesting
00:23:08.360 that this
00:23:08.840 negative
00:23:09.240 perception is
00:23:10.560 a unique
00:23:11.160 occurrence for
00:23:12.000 black women.
00:23:13.300 So in other
00:23:13.700 words,
00:23:14.020 black men
00:23:14.640 do not have
00:23:15.560 this reputation.
00:23:16.920 It's black
00:23:17.500 women.
00:23:17.860 So that's
00:23:20.700 what they
00:23:21.800 studied.
00:23:22.200 So they
00:23:22.440 studied if
00:23:23.080 people in
00:23:23.920 corporations
00:23:24.440 had a
00:23:24.860 negative
00:23:25.180 impression
00:23:25.820 of the
00:23:27.080 anger
00:23:27.500 level of
00:23:28.800 black
00:23:29.120 women.
00:23:29.940 And they
00:23:30.260 threw that
00:23:30.660 in the
00:23:30.940 story.
00:23:32.180 Now,
00:23:33.300 so the
00:23:33.760 title suggests
00:23:35.340 there's
00:23:35.620 racism,
00:23:36.580 and then
00:23:37.460 the mention
00:23:37.900 of the
00:23:38.280 study is
00:23:40.000 basically
00:23:40.500 talking about
00:23:41.500 racism,
00:23:42.380 except that
00:23:43.260 the content
00:23:43.800 of the
00:23:44.220 story doesn't
00:23:45.760 involve any
00:23:46.360 racism at
00:23:47.020 all.
00:23:48.300 It's a
00:23:48.680 story about
00:23:49.180 racism wrapped
00:23:50.740 around a
00:23:51.180 story that
00:23:51.540 doesn't have
00:23:51.940 any racism
00:23:52.700 in it.
00:23:53.620 It's literally
00:23:54.220 just some
00:23:54.720 people quit
00:23:55.300 because their
00:23:55.800 boss sucked.
00:23:57.460 That's the
00:23:58.060 only thing we
00:23:58.560 know.
00:24:01.980 Look at that.
00:24:03.940 Oh, look at
00:24:04.560 it.
00:24:05.180 Would you like
00:24:05.860 me to change
00:24:06.420 the sky?
00:24:09.020 Possibly,
00:24:10.060 possibly I have
00:24:11.100 superpowers?
00:24:11.880 Watch this.
00:24:12.780 You see the
00:24:13.240 sun behind me?
00:24:14.420 Watch this.
00:24:15.860 Expand.
00:24:17.680 Expand.
00:24:20.660 Yeah, I'm
00:24:21.460 making the
00:24:21.880 sun bigger.
00:24:22.900 Does it feel
00:24:23.440 warmer where
00:24:24.040 you are?
00:24:25.380 A lot of
00:24:25.940 people don't
00:24:26.300 think I
00:24:26.600 can do
00:24:26.880 this.
00:24:27.780 And now,
00:24:29.700 sun, back
00:24:31.220 to your
00:24:31.460 normal size.
00:24:33.600 There we
00:24:34.140 go.
00:24:34.800 There we
00:24:35.100 go.
00:24:36.100 I can do
00:24:36.720 that again.
00:24:37.400 Sun, grow.
00:24:40.240 Now, back
00:24:41.220 to your
00:24:41.440 normal size.
00:24:42.300 Yeah.
00:24:42.900 Let's see
00:24:46.820 Don Lemon do
00:24:47.500 that.
00:24:48.960 Cannot do
00:24:49.640 it.
00:24:50.300 All right, so
00:24:50.900 here was the
00:24:51.720 question that the
00:24:52.340 Harvard Business
00:24:52.940 Review didn't
00:24:53.760 ask.
00:24:54.640 The question
00:24:55.140 they did
00:24:55.780 ask is,
00:24:57.400 do corporate
00:24:58.080 people think
00:24:58.960 black women in
00:25:00.520 particular are
00:25:02.420 angry or have
00:25:03.660 angry personalities?
00:25:04.560 And people said
00:25:05.240 yes.
00:25:05.520 You know
00:25:07.580 what they
00:25:07.800 didn't ask?
00:25:09.220 Is there a
00:25:09.800 question missing?
00:25:12.000 It seems like
00:25:13.140 they may have
00:25:14.840 been doing a
00:25:16.480 little thinking
00:25:19.340 past the
00:25:19.840 sale.
00:25:21.820 Is there a
00:25:22.700 missing study?
00:25:24.840 What would
00:25:25.300 be the
00:25:25.660 missing study
00:25:26.700 that would
00:25:27.140 fill in the
00:25:27.940 blanks?
00:25:29.360 Would the
00:25:29.780 missing study
00:25:30.440 be maybe
00:25:32.420 studying all
00:25:33.160 the different
00:25:33.560 kinds of
00:25:34.080 people and
00:25:34.760 asking them
00:25:35.380 directly
00:25:35.900 questions that
00:25:37.360 would get to
00:25:37.820 whether they
00:25:38.300 are angry?
00:25:40.320 I'm just
00:25:40.760 going to throw
00:25:41.060 that out
00:25:41.420 there.
00:25:42.020 Suppose you
00:25:42.500 just surveyed
00:25:43.380 everybody and
00:25:44.220 you developed
00:25:45.100 some questions
00:25:45.700 to determine
00:25:46.340 if each
00:25:46.760 group was
00:25:47.240 happy or
00:25:48.720 angry based
00:25:50.300 on their
00:25:50.620 own input.
00:25:51.600 So not based
00:25:52.100 on anybody
00:25:52.460 else's input
00:25:53.160 because that
00:25:53.640 would be
00:25:53.880 racist, but
00:25:55.660 based on their
00:25:56.960 own opinion
00:25:57.480 of themselves.
00:25:58.920 What do you
00:25:59.540 think you would
00:25:59.940 find?
00:26:01.060 Would you
00:26:01.540 find that any
00:26:02.500 group had more
00:26:03.180 anger or angry
00:26:04.520 personalities?
00:26:05.380 Well, I
00:26:06.240 don't know.
00:26:07.440 I don't know.
00:26:08.760 And let me
00:26:09.380 ask you this.
00:26:10.120 If there were
00:26:10.680 any group in
00:26:11.560 the United
00:26:11.920 States that
00:26:13.600 had a good
00:26:14.140 reason to
00:26:14.720 have some
00:26:15.640 anger, can
00:26:17.860 you think of
00:26:18.340 a group that
00:26:20.300 has more
00:26:20.820 legitimate reasons
00:26:22.220 to be pissed
00:26:22.860 off?
00:26:24.320 They do.
00:26:26.000 Now, you
00:26:26.780 might argue
00:26:27.360 that everybody
00:26:28.540 should feel the
00:26:29.720 same and be
00:26:30.480 treated the same,
00:26:31.040 that's an opinion, but
00:26:31.900 if you, I
00:26:33.900 would imagine
00:26:34.460 if you were
00:26:35.360 a black
00:26:36.280 woman, you
00:26:36.780 have a few
00:26:37.180 things to
00:26:37.660 complain about.
00:26:39.700 Overall, I
00:26:40.620 mean, not
00:26:40.980 every person,
00:26:41.900 we're not
00:26:42.260 making any,
00:26:42.900 I'm not
00:26:43.480 making any
00:26:44.040 blanket
00:26:44.400 statements about
00:26:45.720 people.
00:26:46.380 I'm just
00:26:46.740 saying that if
00:26:47.640 you were to
00:26:47.940 study averages,
00:26:49.180 and that's
00:26:49.700 what this
00:26:50.100 Harvard Business
00:26:50.860 Review did,
00:26:51.540 it studied the
00:26:52.160 average opinion
00:26:52.940 of the
00:26:54.080 corporation.
00:26:55.000 So we're
00:26:55.220 always talking
00:26:55.640 about the
00:26:55.940 average, right?
00:26:56.540 It's not
00:26:56.840 about any
00:26:57.260 individual.
00:26:57.700 Let me ask
00:27:00.900 you, a bunch
00:27:01.560 of racists,
00:27:02.400 if you've
00:27:03.720 ever done
00:27:04.180 telephone support
00:27:05.400 calls, where
00:27:06.560 you call in to
00:27:07.300 get tech support
00:27:08.180 or customer
00:27:08.740 service, you've
00:27:09.920 probably had the
00:27:10.380 experience of
00:27:10.880 getting a variety
00:27:11.560 of people
00:27:12.020 answering the
00:27:13.340 phone.
00:27:14.940 Do you have
00:27:15.660 the racist
00:27:16.500 opinion that
00:27:18.060 black women
00:27:18.820 who do
00:27:19.380 customer service
00:27:20.300 have angry
00:27:21.480 personalities?
00:27:22.720 Go.
00:27:23.440 How racist
00:27:24.180 are you?
00:27:27.620 Lots of
00:27:27.620 no's, some
00:27:28.380 yes's, mostly
00:27:29.780 no's, yes's,
00:27:31.880 no.
00:27:32.320 Now remember,
00:27:32.940 we're not
00:27:33.240 talking about
00:27:33.740 each person.
00:27:34.780 Obviously, each
00:27:35.460 person is
00:27:36.060 unique.
00:27:36.820 We all agree
00:27:37.440 on that.
00:27:38.560 But do you
00:27:39.180 see any
00:27:39.480 patterns?
00:27:42.400 No?
00:27:43.020 All right,
00:27:43.420 interesting.
00:27:44.040 So you're all
00:27:44.460 over the place.
00:27:48.060 I don't know.
00:27:50.020 Here's what I
00:27:50.720 think.
00:27:50.960 I think
00:27:53.960 that, and
00:27:57.260 I've said
00:27:57.560 this before,
00:27:59.220 everybody uses
00:28:01.100 the power
00:28:01.620 that's given
00:28:02.080 to them.
00:28:03.520 That's a
00:28:04.100 general statement
00:28:04.900 you'd probably
00:28:05.380 agree with,
00:28:05.840 right?
00:28:06.120 If somebody
00:28:06.740 has a lot
00:28:07.140 of money,
00:28:07.980 that's a
00:28:08.680 form of
00:28:09.100 power, and
00:28:09.500 they'd
00:28:09.700 probably use
00:28:10.200 it in
00:28:11.280 ways that
00:28:11.800 increase
00:28:12.240 their
00:28:12.400 power.
00:28:16.380 And if
00:28:18.140 people have,
00:28:18.740 let's say,
00:28:19.080 physical
00:28:19.660 power, like
00:28:21.000 they're big
00:28:21.380 intimidating
00:28:21.920 people, usually
00:28:22.820 male, but
00:28:23.840 let's say
00:28:24.060 you're like a
00:28:24.520 big intimidating
00:28:25.180 person like
00:28:25.960 Larry Ellison
00:28:27.180 or, you
00:28:28.400 know, something
00:28:28.680 like that.
00:28:29.060 I've heard
00:28:29.560 that some
00:28:30.640 large males
00:28:31.660 are just
00:28:32.700 physically
00:28:33.160 intimidating,
00:28:34.620 and so
00:28:34.920 that's sort
00:28:35.240 of a power
00:28:35.700 that they
00:28:36.080 can wield
00:28:36.700 over people.
00:28:38.400 So, generally
00:28:39.120 speaking, when
00:28:40.880 people have
00:28:41.580 power, they
00:28:42.080 use it,
00:28:42.460 wouldn't you
00:28:42.740 say?
00:28:43.460 Is it fair
00:28:44.120 to say that
00:28:44.620 if you have
00:28:45.160 a power,
00:28:46.300 you use it,
00:28:47.400 on average?
00:28:49.080 So, one
00:28:50.440 power that
00:28:50.920 women have
00:28:51.520 that men
00:28:51.900 don't have
00:28:52.480 is that
00:28:53.620 they can
00:28:54.200 go hard
00:28:55.080 at men
00:28:55.640 without
00:28:56.660 worrying
00:28:57.340 about
00:28:57.700 consequences.
00:29:00.200 Now, of
00:29:00.520 course, there's
00:29:00.960 exceptions and
00:29:01.860 abusive men and
00:29:02.920 stuff, so
00:29:03.400 certainly there
00:29:04.280 are women who
00:29:04.740 can't complain
00:29:05.340 without getting
00:29:06.240 beaten up.
00:29:07.940 So, we're not
00:29:08.300 talking about
00:29:08.720 that.
00:29:09.440 But, generally,
00:29:10.260 in your average
00:29:11.040 marriage, the
00:29:13.340 woman can be
00:29:14.360 a total bitch
00:29:15.820 and get away
00:29:16.980 with it.
00:29:17.400 Because men
00:29:18.060 will be like,
00:29:18.620 ugh, I don't
00:29:19.200 want to deal
00:29:19.600 with a divorce
00:29:20.880 and losing my
00:29:21.680 children.
00:29:22.860 Yes, she's a
00:29:23.460 total bitch.
00:29:24.180 There's nothing
00:29:24.780 I can do
00:29:25.200 about it.
00:29:25.780 There's absolutely
00:29:26.380 nothing I can do
00:29:27.000 about it.
00:29:27.720 Then the woman
00:29:28.260 says, well, I
00:29:29.020 could be a
00:29:29.400 total bitch, and
00:29:30.880 nobody's going to
00:29:31.740 change any of my
00:29:32.580 situation.
00:29:33.860 I'm mad, so I'll
00:29:35.140 be a total bitch.
00:29:36.700 So, generally
00:29:37.560 speaking, women
00:29:38.460 have a power that
00:29:39.800 men don't have,
00:29:40.640 which is a
00:29:41.420 certain kind of
00:29:42.400 complaining.
00:29:44.060 You know, men
00:29:44.380 can complain too,
00:29:45.360 but it's not
00:29:46.720 going to make
00:29:47.020 any difference.
00:29:49.100 When women
00:29:49.820 complain, it's
00:29:50.400 more likely to
00:29:51.080 have some
00:29:51.380 effect, right?
00:29:52.360 Especially in
00:29:52.960 relationships.
00:29:55.620 Now, add on
00:29:58.480 top of that,
00:30:01.800 you know, any
00:30:02.500 cultural impact,
00:30:04.240 and it seems to
00:30:05.160 me that some
00:30:05.840 groups would
00:30:07.200 just have more
00:30:07.820 power to
00:30:08.640 complain.
00:30:09.140 And everybody
00:30:11.380 uses their
00:30:11.960 power.
00:30:13.400 So, you
00:30:14.420 could make a
00:30:14.960 case, and
00:30:15.440 again, this is
00:30:15.940 just, you
00:30:16.400 know, racist
00:30:16.920 speculation
00:30:17.680 probably, but
00:30:19.080 wouldn't you
00:30:19.600 imagine that
00:30:20.280 black women
00:30:20.800 have the most
00:30:21.400 power to
00:30:22.280 complain?
00:30:24.320 Let me just
00:30:25.000 put that out
00:30:25.480 there.
00:30:26.140 Would a black
00:30:26.820 woman have the
00:30:27.480 most power in
00:30:29.260 society to
00:30:30.840 complain?
00:30:32.100 I would say
00:30:32.880 yes, because
00:30:34.400 they would be
00:30:34.960 considered among
00:30:35.900 the most
00:30:36.460 discriminated
00:30:37.820 against group.
00:30:39.140 You know, you
00:30:40.220 got your
00:30:40.680 black, you
00:30:41.240 got your
00:30:41.520 woman, you
00:30:42.020 put them
00:30:42.280 together, you
00:30:43.980 know, you
00:30:44.560 have a reason
00:30:45.280 to have some,
00:30:48.220 you know, maybe
00:30:48.920 attitude.
00:30:50.440 So, if
00:30:51.900 somebody has a
00:30:52.540 really good
00:30:52.920 reason for
00:30:54.020 being the way
00:30:54.480 they are, it
00:30:55.280 shouldn't be a
00:30:55.700 surprise that
00:30:56.420 they're that
00:30:56.740 way.
00:30:57.040 Everybody uses
00:30:57.620 the power
00:30:57.980 they have.
00:30:58.920 If I had
00:31:00.520 the power of
00:31:02.360 being able to
00:31:03.060 complain about
00:31:03.920 my victimhood
00:31:04.760 and it got me
00:31:05.600 stuff, I'd
00:31:06.720 probably do it,
00:31:07.300 if that's
00:31:09.260 just the way
00:31:09.720 I could get
00:31:10.220 stuff.
00:31:11.100 Now, it
00:31:11.480 turns out that
00:31:12.080 it wouldn't
00:31:12.420 work for me.
00:31:13.800 You know, as
00:31:14.240 a white guy, I
00:31:15.160 can complain all
00:31:15.840 day about
00:31:16.420 racism against
00:31:18.180 me.
00:31:18.820 Well, it
00:31:19.280 doesn't work.
00:31:21.240 So, let me
00:31:22.200 say that again.
00:31:22.800 The only reason
00:31:24.000 that I'm not an
00:31:25.040 angry personality
00:31:26.260 and I'm not
00:31:27.740 complaining and
00:31:28.500 bitching and
00:31:29.100 abusing people
00:31:29.860 every time I
00:31:30.380 have a
00:31:30.960 conversation is
00:31:31.960 because I can't
00:31:32.460 get away with
00:31:32.980 it.
00:31:33.800 That's it.
00:31:34.400 If I could
00:31:35.400 get away with
00:31:36.100 being an
00:31:36.480 asshole, I
00:31:36.980 probably would
00:31:38.140 drift in that
00:31:39.340 direction.
00:31:41.200 I mean, most
00:31:41.940 people would.
00:31:43.080 You drift in the
00:31:43.920 direction of your
00:31:44.620 own power.
00:31:46.160 So, if you
00:31:47.280 happen to be the
00:31:48.020 group that can
00:31:49.080 complain the
00:31:49.760 most and
00:31:51.520 other people
00:31:52.040 say, yeah, I
00:31:52.920 can see why you
00:31:53.600 would complain
00:31:54.080 the most, well,
00:31:56.060 you can understand
00:31:57.840 how that situation
00:31:58.620 might evolve.
00:32:00.340 But, again, if
00:32:01.680 you're dealing
00:32:02.160 with anecdotal
00:32:02.960 evidence, there's
00:32:03.640 no evidence
00:32:04.480 that black
00:32:05.000 women are more
00:32:05.640 angry personalities,
00:32:06.800 in my opinion.
00:32:07.680 There's only
00:32:08.180 evidence that
00:32:08.820 people believe
00:32:09.480 it.
00:32:11.100 Let me say
00:32:11.540 that again.
00:32:12.520 There's no
00:32:12.920 evidence that
00:32:13.640 black women are
00:32:14.920 angry personalities
00:32:15.900 that I'm aware
00:32:17.380 of.
00:32:17.640 I haven't seen
00:32:18.100 a study that
00:32:18.760 says that.
00:32:19.480 The study said
00:32:20.140 that people
00:32:20.540 think it's
00:32:21.140 true, which
00:32:22.260 is completely
00:32:22.760 different.
00:32:25.440 All right,
00:32:28.640 here's my
00:32:29.020 favorite story.
00:32:29.860 There were two
00:32:30.260 big game hunters,
00:32:31.660 Lawrence and
00:32:32.100 Bianca Rudolph,
00:32:32.760 and they've
00:32:33.540 been shooting
00:32:34.100 big animals in
00:32:34.940 Africa for
00:32:35.440 years.
00:32:36.460 They were both
00:32:36.860 in their 60s,
00:32:37.540 I think.
00:32:38.580 But in
00:32:40.780 2016, they
00:32:41.940 were going to
00:32:43.060 try to get a
00:32:43.540 leopard.
00:32:43.960 I guess the
00:32:44.440 wife wanted to
00:32:45.320 shoot a leopard.
00:32:46.820 And she was
00:32:47.380 out hunting all
00:32:47.980 day and did not
00:32:48.680 shoot any leopards.
00:32:49.880 She shot a
00:32:50.720 number of other
00:32:51.240 animals, but no
00:32:52.480 leopard.
00:32:53.560 But then when
00:32:54.260 they got back to
00:32:54.820 the cabin, when
00:32:56.040 the husband was in
00:32:56.820 a different room,
00:32:57.480 he says, there
00:32:59.060 was a shotgun
00:33:00.600 went off and it
00:33:01.540 killed the wife.
00:33:03.440 So she was
00:33:04.180 found dead.
00:33:04.680 Now the
00:33:04.860 husband, Lawrence
00:33:06.240 Rudolph, he's
00:33:07.780 charged with
00:33:08.300 murder, but he
00:33:10.280 says that the
00:33:11.420 gun went off by
00:33:12.140 accident.
00:33:13.140 Now, I haven't
00:33:14.460 looked into the
00:33:15.020 details of this
00:33:15.600 story, but here's
00:33:16.580 how I'd have
00:33:17.480 handled it.
00:33:18.800 First of all, the
00:33:19.640 top suspects would
00:33:20.640 be the leopards
00:33:21.400 themselves.
00:33:23.200 So there were
00:33:25.080 no witnesses, and
00:33:26.340 we know that the
00:33:27.040 leopards had a
00:33:27.880 motive.
00:33:29.440 So if I were the
00:33:30.380 leopards, I would
00:33:31.000 have tried to send
00:33:31.900 his squad of
00:33:32.780 leopards in to
00:33:34.320 shoot this woman
00:33:35.160 and maybe save
00:33:36.580 the other leopards.
00:33:39.580 If I were the
00:33:40.360 investigators, once I
00:33:42.460 came upon the
00:33:43.100 crime scene, probably
00:33:45.460 what they did is the
00:33:46.260 way they usually
00:33:46.940 handle it, you
00:33:47.600 know, they clean
00:33:48.260 it up, you know,
00:33:49.160 take pictures, do
00:33:50.300 what they can.
00:33:51.080 But I would have
00:33:51.920 waited a week.
00:33:52.600 just to see if
00:33:56.300 the husband mounted
00:33:57.080 her head on the
00:33:57.720 wall, because that
00:33:59.020 would be evidence
00:34:00.380 of intent.
00:34:03.260 That's just me.
00:34:04.660 But, you know, if
00:34:05.560 you found her head
00:34:06.120 on the wall a week
00:34:06.820 later, I'd say, I
00:34:07.680 feel like I know
00:34:08.760 what happened now.
00:34:10.040 That explains a lot.
00:34:12.300 I would like to add
00:34:13.240 on top of that,
00:34:14.140 normally I do not
00:34:15.500 talk about tragic
00:34:16.900 situations that
00:34:17.920 happen to
00:34:18.280 individuals.
00:34:19.240 But if your
00:34:20.580 hobby is shooting
00:34:21.720 mammals,
00:34:22.600 and you get
00:34:23.480 your own head
00:34:24.000 blown off, I
00:34:25.180 don't really care.
00:34:26.720 I don't really
00:34:27.300 care at all.
00:34:28.760 Oh, yeah, if the
00:34:29.780 shotgun blow was in
00:34:30.760 the head, then we
00:34:31.740 could eliminate the
00:34:32.780 husband, because he
00:34:34.300 wouldn't have a
00:34:34.740 trophy.
00:34:36.820 So that would be
00:34:37.560 evidence, too.
00:34:38.800 So let me be as
00:34:39.660 clear as possible.
00:34:40.520 If you shoot large
00:34:41.660 mammals for fun, I
00:34:43.240 don't fucking care
00:34:43.840 what happens to you.
00:34:45.080 I don't care at
00:34:45.840 all.
00:34:46.400 I don't care if you
00:34:47.000 got shot in the
00:34:47.640 back of the head.
00:34:48.480 I don't care if you
00:34:49.600 suffered and died.
00:34:50.580 I don't care if the
00:34:51.280 elephants,
00:34:52.100 stomp on you.
00:34:53.560 I don't care if the
00:34:54.160 leopards eat you.
00:34:55.700 If that's your
00:34:56.500 hobby, it's your
00:34:58.940 own fucking
00:34:59.360 problem, right?
00:35:00.660 I don't care.
00:35:01.540 I have no sympathy
00:35:02.500 whatsoever for
00:35:04.200 Bianca Rudolph.
00:35:05.440 Fuck her.
00:35:06.020 She can go to hell
00:35:06.800 and rot.
00:35:08.600 I don't normally say
00:35:09.740 that about people who
00:35:10.480 have tragic accidents,
00:35:11.720 but especially if they
00:35:13.940 might have been
00:35:14.360 murdered, I usually
00:35:15.880 don't condemn the
00:35:17.700 victim, but fuck
00:35:19.100 this victim.
00:35:20.520 Fuck her to hell.
00:35:22.080 She can rot.
00:35:24.320 All right.
00:35:24.800 That's just my
00:35:25.420 opinion.
00:35:27.780 How many of you
00:35:28.600 think that, I
00:35:31.280 don't know,
00:35:31.520 Democrats and
00:35:32.180 liberals want to
00:35:32.860 depopulate the
00:35:33.640 world?
00:35:39.240 You think the
00:35:40.440 Democrats want to
00:35:41.560 depopulate the
00:35:42.360 world?
00:35:42.540 Do you think Bill
00:35:44.000 Gates wants to
00:35:44.740 depopulate the
00:35:45.500 world?
00:35:45.680 Here's what I
00:35:48.240 think everybody
00:35:48.900 gets wrong.
00:35:51.580 There's nobody
00:35:52.380 smart enough to
00:35:53.340 be playing at
00:35:53.940 that level, at
00:35:55.840 the Bill Gates
00:35:56.440 level.
00:35:57.540 Let's say, imagine
00:35:58.600 the people who
00:35:59.340 are at that level
00:36:00.080 you think are
00:36:00.680 trying to depopulate
00:36:01.680 the world.
00:36:02.940 So in your
00:36:03.520 opinion, you
00:36:04.160 might put Bill
00:36:04.740 Gates there,
00:36:06.080 you know, Klaus,
00:36:07.040 whatever his name
00:36:07.640 is, Schwab,
00:36:08.980 right?
00:36:09.540 So all these
00:36:10.580 people, but these
00:36:11.280 are the well-informed
00:36:12.920 elites, wouldn't
00:36:14.540 you say?
00:36:15.260 Soros.
00:36:16.120 So you would
00:36:17.040 put them in
00:36:17.480 that category of
00:36:18.400 people that you
00:36:18.960 believe want to
00:36:20.840 depopulate the
00:36:21.620 world, correct?
00:36:23.940 Now, how do you
00:36:24.740 explain that those
00:36:25.580 billionaires think
00:36:26.700 that we're better
00:36:27.340 off with less
00:36:28.780 population, but
00:36:30.460 Elon Musk, who
00:36:31.680 is also very smart
00:36:32.620 and a billionaire,
00:36:33.780 says that it's
00:36:34.500 essential to have
00:36:35.460 a growing population.
00:36:36.820 It's not just
00:36:37.640 optional, it's
00:36:38.180 essential.
00:36:39.360 You need a
00:36:40.080 growing population
00:36:40.840 or else you're
00:36:41.280 really in trouble.
00:36:43.180 So how could
00:36:44.020 these other
00:36:44.520 billionaires, like
00:36:46.380 Klaus Schwab
00:36:46.380 and Gates, how
00:36:47.320 could they be so
00:36:48.020 wrong if Musk
00:36:51.000 is right or
00:36:51.520 vice versa?
00:36:53.620 Let me tell you
00:36:54.520 what I'm pretty
00:36:55.220 sure is true.
00:36:57.120 So without
00:36:58.000 knowing, I'm
00:36:59.960 going to make the
00:37:00.580 following assertion,
00:37:01.500 that Musk and
00:37:03.180 Gates agree on
00:37:04.220 population.
00:37:06.560 And here's what I
00:37:07.520 mean by that.
00:37:08.180 I believe that
00:37:08.820 everybody smart
00:37:09.680 thinks that
00:37:10.440 population has to
00:37:11.460 grow to have a
00:37:13.140 healthy world.
00:37:14.600 Let me say it
00:37:15.140 again.
00:37:16.200 I believe that
00:37:17.120 everybody above a
00:37:18.380 certain level of
00:37:19.460 awareness and
00:37:20.180 intelligence,
00:37:21.040 everybody, everybody,
00:37:22.640 100%, above a
00:37:24.080 certain level of
00:37:24.740 awareness,
00:37:25.640 understands that we
00:37:27.240 need a growing
00:37:27.820 population.
00:37:28.440 everybody below
00:37:30.420 that level of
00:37:31.140 awareness could
00:37:32.260 be, you know,
00:37:33.040 any opinion.
00:37:34.640 But above a
00:37:35.460 certain level of
00:37:36.520 knowledge and
00:37:37.460 education, everybody
00:37:38.940 knows it has to
00:37:39.720 grow.
00:37:40.940 And Bill Gates is
00:37:42.000 above that level, as
00:37:43.360 is Klaus Schwab, as
00:37:45.180 are all the elites.
00:37:46.480 There are no elites
00:37:47.600 who think the world
00:37:48.320 needs to have less
00:37:48.980 population.
00:37:50.980 You can't find any
00:37:52.320 who say that.
00:37:53.520 None.
00:37:53.840 You will find lots
00:37:56.480 of people who think
00:37:57.200 population growth
00:37:58.700 needs... Bill Maher
00:38:01.640 is not somebody I
00:38:02.520 would consider well
00:38:03.300 informed on this
00:38:04.280 topic.
00:38:05.720 Bill Maher
00:38:06.420 continuously makes
00:38:08.140 economic assumptions
00:38:10.500 that I think are
00:38:11.860 under-informed.
00:38:13.360 I think that's his
00:38:14.120 weakest spot.
00:38:16.660 Now, I have a very
00:38:17.780 positive opinion of
00:38:18.700 Bill Maher as a
00:38:20.480 public personality
00:38:22.040 and person who
00:38:22.780 talks about the
00:38:23.440 news, and in my
00:38:24.860 opinion, with the
00:38:25.980 exception of Trump
00:38:26.760 commentary, he's
00:38:28.420 clearly biased.
00:38:29.260 That's a little bit
00:38:29.740 personal.
00:38:31.200 But generally
00:38:31.840 speaking, he's
00:38:33.820 telling you his
00:38:34.320 actual opinion.
00:38:35.100 It's not bullshit.
00:38:36.200 But I don't think he
00:38:37.120 has enough
00:38:38.060 understanding of
00:38:38.860 economics to be
00:38:40.820 good in that realm.
00:38:42.480 So when he has an
00:38:43.200 economic opinion,
00:38:44.240 it's generally not
00:38:46.260 one I would look to.
00:38:47.580 Now, if you put
00:38:48.340 Bill Maher in the
00:38:49.080 room with either
00:38:49.780 Elon Musk or
00:38:51.580 Bill Gates, do you
00:38:53.380 think that Bill
00:38:54.860 Maher is
00:38:55.380 operating at the
00:38:56.060 same level of
00:38:56.840 awareness and
00:38:57.840 understanding of the
00:38:58.800 world?
00:38:59.500 I don't.
00:39:00.700 I don't think it's
00:39:01.280 even close.
00:39:02.980 If you did an
00:39:03.960 economics test of
00:39:06.060 Bill Maher versus
00:39:07.220 Bill Gates, let's
00:39:08.540 say it's just a
00:39:09.280 standardized test
00:39:10.640 understanding economic
00:39:12.040 principles, Bill
00:39:13.760 Gates would get
00:39:14.360 every question
00:39:15.040 right.
00:39:15.680 Like every question.
00:39:17.620 He would understand
00:39:18.620 everything about
00:39:19.420 economics, at least
00:39:20.340 that would be put on
00:39:21.160 the test.
00:39:22.520 Bill Maher would
00:39:23.960 get a, he'd
00:39:25.720 probably get a
00:39:26.120 75, 75% right.
00:39:28.980 But that 25% he
00:39:30.240 missed is a big
00:39:30.940 deal.
00:39:32.220 You know, and
00:39:32.580 that's the part I
00:39:33.180 seem to notice when
00:39:34.200 he's got a gap in
00:39:35.440 his understanding.
00:39:36.500 So yes, there are
00:39:37.540 some people at a
00:39:38.420 lower level of
00:39:39.300 understanding of the
00:39:39.960 world who think that
00:39:41.000 we might be better
00:39:41.680 off with fewer
00:39:42.620 people.
00:39:44.160 But what, the
00:39:45.620 thing that you're
00:39:46.220 conflating that
00:39:47.920 makes you think that
00:39:48.820 the smart people
00:39:49.760 who understand
00:39:50.900 economics are also
00:39:52.560 agreeing with Bill
00:39:53.360 Maher, here's
00:39:54.460 what's confusing
00:39:55.140 you.
00:39:56.320 They do want to
00:39:57.560 control the
00:39:58.020 population.
00:39:59.680 They do.
00:40:00.860 So Bill Gates
00:40:01.480 definitely wants
00:40:03.060 individuals to have
00:40:04.220 control over their
00:40:05.100 own reproduction,
00:40:07.060 which might have
00:40:07.980 the effect of
00:40:08.600 lessening it in
00:40:09.440 some places.
00:40:10.640 So they want
00:40:11.420 people to control
00:40:12.140 it, but there's
00:40:13.400 no way that Bill
00:40:14.580 Gates wants the
00:40:15.300 population of the
00:40:16.040 world to shrink.
00:40:17.500 Would anybody
00:40:17.920 like to make a bet
00:40:18.680 on that?
00:40:20.360 Would anybody
00:40:21.040 like to make a bet
00:40:21.860 that you will never
00:40:22.700 see a Bill Gates
00:40:23.680 quote and that
00:40:24.500 none exists, there's
00:40:25.760 no Bill Gates
00:40:26.420 quote, about
00:40:27.480 reducing the
00:40:28.460 population of the
00:40:29.860 whole world?
00:40:34.040 So you'll take
00:40:34.880 that bet?
00:40:36.320 Now remember,
00:40:38.340 the bet is that
00:40:39.360 you believe he
00:40:42.500 wants fewer
00:40:43.380 people.
00:40:43.800 I'm saying he
00:40:44.760 wants the growth
00:40:45.980 rate to be more
00:40:47.900 managed.
00:40:48.680 So he wants
00:40:51.400 more, but in a
00:40:52.540 more manageable
00:40:53.320 way and, you
00:40:54.980 know, in a way that
00:40:56.100 makes sense for
00:40:56.660 the country.
00:40:59.680 Yeah.
00:41:01.320 All right.
00:41:03.900 Yes, and Bill
00:41:04.900 Maher doesn't
00:41:05.480 understand how
00:41:06.180 technology will
00:41:07.080 improve resource
00:41:08.040 management, and
00:41:08.920 Musk does.
00:41:09.760 That is exactly
00:41:10.600 correct.
00:41:11.660 Exactly.
00:41:12.180 Yeah, I know it
00:41:18.620 causes a lot of
00:41:19.220 trouble when I
00:41:19.840 defend Bill Gates,
00:41:20.980 but keep in mind
00:41:21.940 that I don't
00:41:23.040 defend Bill Gates
00:41:24.040 for a love of
00:41:25.340 Bill Gates.
00:41:26.820 Whatever he's done
00:41:27.640 in his personal or
00:41:28.640 professional life,
00:41:29.640 you know, that's up
00:41:30.140 to him to explain.
00:41:31.020 That's not up to
00:41:31.700 me.
00:41:32.540 What I try to work
00:41:33.920 on is how we
00:41:34.840 think about it.
00:41:35.980 So I'm not
00:41:36.600 defending Bill Gates,
00:41:37.480 I'm defending
00:41:38.100 accurate thinking.
00:41:42.180 And I believe
00:41:42.900 that accurate
00:41:43.440 thinking says
00:41:44.200 there's no
00:41:44.600 evidence that
00:41:45.220 anybody as
00:41:45.960 smart as him
00:41:46.760 would ever be
00:41:48.060 on the side of
00:41:48.680 population reduction.
00:41:50.700 So let me
00:41:51.120 generalize it.
00:41:51.880 I'm going to
00:41:52.220 generalize it from
00:41:53.180 the Bill Gates
00:41:53.940 because he just
00:41:54.640 he confuses the
00:41:56.060 question because
00:41:56.580 you have so many
00:41:57.060 opinions about him.
00:41:58.520 Nobody who
00:41:59.680 understands economics
00:42:00.820 and technology,
00:42:02.180 let's put it that
00:42:03.000 way, nobody has a
00:42:04.320 good understanding
00:42:04.960 of both economics
00:42:06.840 and technology
00:42:07.740 believes that the
00:42:09.320 population should be
00:42:10.260 reduced.
00:42:12.080 Find me one
00:42:12.880 example and I'll
00:42:13.600 change my mind.
00:42:14.960 It's got to be somebody prominent that you could clearly say, okay, this person does understand economics and does understand business, does understand people, you know, does understand all that.
00:42:27.060 Hitler.
00:42:29.040 Well, Hitler's not... if Hitler's the best example you have...
00:42:34.340 Well, let me correct you. Hitler did not believe in population reduction.
00:42:40.220 He believed in reducing some kinds of people and increasing the kind of people he was.
00:42:45.200 So even Hitler
00:42:45.860 isn't a good
00:42:46.700 example.
00:42:49.160 And even,
00:42:50.300 and China is an
00:42:52.660 example of
00:42:53.460 controlled
00:42:54.440 population.
00:42:57.700 China was trying to avoid runaway population growth, but they didn't want it to go negative too hard.
00:43:06.740 They wanted to
00:43:07.400 slow it until
00:43:08.140 they could handle
00:43:09.800 it, which is
00:43:10.940 what they did.
00:43:11.640 So now they've
00:43:12.300 increased the
00:43:13.200 number of kids
00:43:13.640 you can have.
00:43:15.180 So, yeah,
00:43:16.780 and by China
00:43:18.080 reducing their
00:43:19.260 growth rate,
00:43:21.120 they may have
00:43:21.780 killed themselves
00:43:22.460 in the long run
00:43:23.160 because they've
00:43:23.580 got this
00:43:23.880 demographic bomb.
00:43:25.940 Now, do you
00:43:27.140 understand why
00:43:27.720 you have to have
00:43:28.380 more people?
00:43:30.100 Because you can't
00:43:31.020 have a growing
00:43:31.560 number of old
00:43:32.220 people who don't
00:43:32.860 work and a shrinking
00:43:34.320 number of new
00:43:35.040 people who do.
00:43:36.840 That leads you
00:43:37.920 to a bad
00:43:39.000 situation.
00:43:42.600 Yeah, yeah,
00:43:43.460 Japan has that
00:43:44.080 problem too.
00:43:44.960 So we don't
00:43:45.520 want the problem
00:43:46.060 of Japan and
00:43:47.120 of China, but
00:43:48.740 the United States
00:43:49.400 never had an
00:43:50.060 overpopulation
00:43:50.840 problem, so we
00:43:51.600 didn't have to
00:43:52.000 deal with that.
00:43:55.060 Yeah, I do
00:43:55.740 think more old
00:43:56.340 people will work.
00:43:57.020 That is part of
00:43:57.480 the solution,
00:43:58.200 definitely.
00:43:59.700 All right, do
00:44:01.020 you think China
00:44:01.540 will shoot down
00:44:02.140 Pelosi's plane?
00:44:05.340 Gordon Chang
00:44:06.260 thinks it's
00:44:08.180 possible, or he
00:44:09.240 said that.
00:44:10.840 It's not
00:44:11.480 possible.
00:44:12.880 Yeah.
00:44:14.620 I'm the biggest
00:44:15.540 China hawk
00:44:16.700 probably that you've
00:44:18.300 ever seen, and
00:44:19.580 even I don't think
00:44:20.320 they'll shoot down
00:44:20.920 Nancy Pelosi's
00:44:21.760 jet.
00:44:26.820 Taiwan on,
00:44:27.760 yeah.
00:44:27.980 Yeah.
00:44:31.540 All right.
00:44:32.920 Yeah, I don't think that'll happen.
00:44:34.160 I'm not going
00:44:34.420 to worry about
00:44:34.760 it.
00:44:37.940 All right, that
00:44:38.880 is the slow
00:44:40.060 news day news.
00:44:43.100 Do you have
00:44:43.500 any interesting
00:44:44.100 questions?
00:44:46.140 Does it look
00:44:46.760 like Africa?
00:44:47.840 Yeah, so I'm
00:44:48.440 intentionally
00:44:48.880 obscuring my
00:44:50.040 face like a
00:44:50.940 witness protection.
00:44:55.220 Have I ever
00:45:02.300 talked to
00:45:02.740 Penn Jillette
00:45:03.380 about Trump?
00:45:07.240 I did get to meet and talk to Penn Jillette.
00:45:10.000 I don't
00:45:10.180 think we
00:45:10.500 ever talked
00:45:10.860 about Trump.
00:45:15.660 Jim
00:45:16.080 encounter?
00:45:17.020 I did not
00:45:17.600 see your post
00:45:18.200 about a
00:45:18.560 Jim
00:45:18.740 encounter.
00:45:19.220 Russell
00:45:24.880 Brand.
00:45:25.340 Oh, I did
00:45:26.160 give Russell
00:45:27.060 Brand's
00:45:27.500 producer some
00:45:28.400 dates for
00:45:30.160 scheduling, but
00:45:30.780 I haven't heard
00:45:31.240 back, or at
00:45:32.340 least I have to
00:45:32.940 check back.
00:45:35.240 Oh, I
00:45:35.720 forgot about
00:45:36.200 Biden getting
00:45:37.380 COVID a second
00:45:38.140 time.
00:45:38.640 If you're
00:45:39.620 keeping score,
00:45:41.720 Biden has
00:45:43.260 four vaccinations
00:45:44.360 and two
00:45:45.180 COVID infections.
00:45:46.960 So the score is vaccinations four, COVID two, but
00:45:52.140 I think
00:45:52.500 COVID can
00:45:53.100 come from
00:45:53.520 behind and
00:45:54.300 maybe soon
00:45:55.900 we'll have
00:45:56.220 more infections
00:45:56.940 than there
00:45:57.440 are vaccinations.
00:46:01.500 Now, how
00:46:02.120 many of you
00:46:02.780 accept the
00:46:04.280 following logic?
00:46:06.420 That if the
00:46:07.200 vaccinations and
00:46:08.260 YouTube, if
00:46:09.460 you're watching,
00:46:10.000 don't cancel
00:46:10.500 me over this,
00:46:11.280 wait until I
00:46:12.000 get to the
00:46:12.360 end and
00:46:13.140 you'll be
00:46:13.420 happy.
00:46:14.520 How many
00:46:15.140 people think
00:46:15.620 the following
00:46:16.140 logic holds?
00:46:18.140 And it
00:46:18.320 goes like
00:46:18.700 this.
00:46:19.620 If the
00:46:20.320 vaccinations
00:46:20.900 don't stop
00:46:22.220 you from
00:46:22.560 getting it
00:46:23.140 and they
00:46:24.060 don't stop
00:46:24.560 you from
00:46:24.860 getting it
00:46:25.280 a second
00:46:25.640 time, and
00:46:28.780 it doesn't
00:46:29.140 stop you
00:46:29.540 from transmitting
00:46:30.280 it, then
00:46:31.780 it doesn't
00:46:32.240 work.
00:46:33.660 Yes or
00:46:34.280 no?
00:46:35.180 If it
00:46:35.780 doesn't stop
00:46:36.380 you from
00:46:36.700 getting it,
00:46:37.220 it doesn't
00:46:37.480 stop you from
00:46:38.080 transmitting
00:46:38.560 it, it
00:46:42.160 doesn't work.
00:46:45.300 Somebody
00:46:45.780 says not as
00:46:46.500 a vaccine,
00:46:47.420 okay, fair
00:46:48.000 enough.
00:46:50.220 Define
00:46:50.700 work, okay.
00:46:53.280 You know,
00:46:53.820 really this
00:46:54.840 group is, I
00:46:56.300 think, the
00:46:56.700 most sophisticated
00:46:57.580 audience watching
00:46:59.380 the news,
00:47:00.020 honestly, because
00:47:01.140 I thought there
00:47:02.100 would be a few
00:47:02.740 more of you
00:47:03.300 agreeing with
00:47:03.900 that, and it
00:47:04.260 makes no sense,
00:47:05.040 right?
00:47:06.060 If you know
00:47:06.880 that it's
00:47:07.540 reducing the
00:47:08.700 severity of
00:47:09.260 infection, then
00:47:10.980 you would have
00:47:11.380 to make an
00:47:11.800 argument that
00:47:12.540 the side
00:47:14.040 effects from
00:47:14.540 the vaccination
00:47:15.100 are worse
00:47:16.040 than however
00:47:17.640 much it
00:47:18.100 saves you.
00:47:18.720 Now, I
00:47:19.000 don't know if
00:47:19.280 that's true.
00:47:20.600 Now, if you
00:47:21.000 said to me, I
00:47:21.560 don't believe
00:47:22.140 the studies, and
00:47:23.840 I don't believe
00:47:24.540 that you're
00:47:25.980 reducing more problems than
00:47:27.660 you're causing
00:47:28.240 with any side
00:47:28.940 effects, I
00:47:30.760 don't know that
00:47:31.160 we could believe
00:47:31.600 that.
00:47:34.360 Oh, they
00:47:34.900 think, did I
00:47:35.620 ask the question
00:47:36.200 wrong, so it
00:47:36.760 looks like they're
00:47:37.280 agreeing to the
00:47:37.840 wrong thing?
00:47:39.160 Maybe.
00:47:39.560 Well, the
00:47:43.740 discussion that
00:47:44.460 matters is
00:47:45.200 long COVID, and
00:47:47.360 whether the
00:47:48.020 vaccination
00:47:49.020 anecdotal
00:47:50.140 information is
00:47:50.900 telling you
00:47:51.280 anything.
00:47:52.280 All right, here's
00:47:52.680 a question I
00:47:53.200 asked just
00:47:53.700 before I got
00:47:54.240 on, and
00:47:54.920 let's see how
00:47:55.320 the, I did a
00:47:56.160 little poll.
00:47:57.820 Let's see how
00:47:58.520 I did, we'll
00:47:59.520 see how you do
00:48:00.220 on this poll.
00:48:02.440 So I did this
00:48:03.300 just before I got
00:48:04.200 on here, so I
00:48:05.000 didn't see any
00:48:05.700 answers yet.
00:48:08.380 And the
00:48:08.900 answers are,
00:48:09.560 so I said
00:48:12.220 this. You get to interpret this: what does it mean when there is a mountain of anecdotal evidence on Twitter that the shots, the vaccination shots, allegedly caused dozens of different medical problems?
00:48:27.080 Now, here's the
00:48:27.640 important part, not
00:48:29.100 medical problems, but
00:48:30.960 dozens of
00:48:31.780 different ones.
00:48:33.560 So if you
00:48:34.020 were a doctor, how
00:48:34.860 would you process
00:48:36.740 that?
00:48:37.680 It's anecdotal, so
00:48:38.920 it's not a
00:48:39.420 study, but there
00:48:41.480 seemed to be a
00:48:42.040 lot of them, on
00:48:42.900 Twitter anyway.
00:48:44.660 But of course
00:48:45.340 Twitter is
00:48:45.900 everybody in the
00:48:47.300 world, so there
00:48:48.660 are a lot of
00:48:49.060 people that they
00:48:49.740 could notice.
00:48:51.640 Do you think
00:48:52.760 they're seeing
00:48:53.140 that there are
00:48:53.740 dozens of
00:48:54.640 different medical
00:48:55.220 problems that
00:48:56.860 people are
00:48:57.260 associating with
00:48:58.220 this shot, whether
00:48:59.420 it's proven or
00:49:00.080 not, people are
00:49:00.800 saying that?
00:49:02.300 Is it a red
00:49:03.060 flag for danger,
00:49:04.520 that there are so
00:49:05.060 many reports?
00:49:05.660 Would you say
00:49:06.780 it's near proof
00:49:08.000 of danger?
00:49:09.320 You've seen so
00:49:10.000 many of them that
00:49:10.680 in your mind it's
00:49:11.480 nearly proof, not
00:49:13.440 just a red flag.
00:49:14.720 Or do you think
00:49:15.600 it's more evidence
00:49:16.260 of a probable
00:49:17.160 mass hysteria?
00:49:18.920 Or do you think
00:49:19.900 that it's literally
00:49:20.740 nothing?
00:49:22.520 The fact that
00:49:23.700 there are lots of
00:49:24.320 anecdotal reports,
00:49:26.280 one of the answers
00:49:27.340 could be that it
00:49:28.060 literally means
00:49:28.520 nothing, because
00:49:29.560 anecdotal is just
00:49:30.920 anecdotal.
00:49:31.960 Which of those
00:49:32.840 would you say?
00:49:33.380 It's a red flag
00:49:35.100 for danger, it's
00:49:35.920 near proof of
00:49:36.540 danger, probable
00:49:37.340 mass hysteria, or
00:49:38.900 literally nothing.
00:49:40.600 Go.
00:49:42.380 Danger, noise,
00:49:43.820 red flag, nothing,
00:49:45.860 nothing, nothing.
00:49:49.460 Only one person
00:49:50.620 said, only a few
00:49:51.840 of you are saying
00:49:52.420 hysteria, okay?
00:49:55.700 So most of you
00:49:56.780 are saying either
00:49:57.420 nothing or a red
00:49:58.360 flag.
00:50:00.080 You know, I realize my question sucks, because something could be nothing and a red flag at the same time, can't it?
00:50:07.140 Right?
00:50:08.120 It can be nothing
00:50:09.120 and also a red
00:50:10.620 flag.
00:50:11.840 So I'm going to
00:50:12.760 add together the
00:50:13.540 red flag for
00:50:14.200 danger and the
00:50:15.180 literally nothings.
00:50:16.760 And that would
00:50:17.160 be 78% of the
00:50:19.360 people who answered
00:50:19.940 it.
00:50:21.620 And I think it's
00:50:23.340 correct to put the
00:50:24.160 red flag and the
00:50:25.640 literally nothing
00:50:26.200 together.
00:50:27.440 Because you've
00:50:28.040 literally proven
00:50:28.840 nothing, but it
00:50:30.520 could still be a
00:50:31.100 red flag of
00:50:31.660 something to look
00:50:32.240 for, right?
00:50:34.240 Only 10% said
00:50:35.720 probable mass
00:50:36.540 hysteria, and
00:50:37.360 that is my
00:50:38.120 vote.
00:50:39.540 My vote is
00:50:40.460 probable mass
00:50:41.440 hysteria.
00:50:44.020 Do you know
00:50:44.880 why a mass
00:50:45.640 hysteria works?
00:50:48.120 Because only 10%
00:50:49.740 of the people
00:50:50.320 recognize it.
00:50:54.160 This is exactly
00:50:55.660 what it looks
00:50:56.120 like.
00:50:56.380 Here's what should have been the tip-off that mass hysteria is the probable explanation.
00:51:04.680 There are too
00:51:05.260 many medical
00:51:05.920 problems being
00:51:06.800 attributed to the
00:51:08.020 vaccinations.
00:51:09.620 If you told me
00:51:10.720 there are three
00:51:11.380 medical problems
00:51:12.280 that we're seeing
00:51:13.020 all over the
00:51:13.680 place, it's the
00:51:14.840 same three, then
00:51:16.400 I'd say, whoa,
00:51:17.280 that's a red flag.
00:51:18.320 Not proof.
00:51:19.380 Not near proof.
00:51:20.160 Just a red flag.
00:51:22.520 Look into it.
00:51:24.300 But if you tell me
00:51:25.740 that there are
00:51:26.260 dozens and dozens
00:51:27.220 of different medical
00:51:28.160 problems after the
00:51:29.180 shots, immediately
00:51:30.620 my mind says,
00:51:32.000 that's more,
00:51:33.980 that's closer to the
00:51:35.020 arc of a mass
00:51:35.840 hysteria.
00:51:36.420 But it could be
00:51:39.120 complicated.
00:51:40.400 Because it could be
00:51:41.420 a red flag plus
00:51:43.520 a mass hysteria.
00:51:45.120 It could be both.
00:51:46.960 Because within the
00:51:48.160 dozens and dozens
00:51:49.080 of medical problems
00:51:50.440 people are reporting,
00:51:51.780 there are some that
00:51:52.860 are dominant, right?
00:51:54.180 The myocarditis,
00:51:55.780 although the
00:51:56.840 myocarditis usually
00:51:57.900 clears up.
00:51:59.380 But let's say
00:52:00.060 fertility.
00:52:01.780 You know, some
00:52:02.340 impact on that.
00:52:03.780 There probably is a
00:52:04.860 top three that you
00:52:05.760 really should worry
00:52:06.400 about in terms of
00:52:07.980 a red flag.
00:52:08.860 Not in terms of
00:52:09.700 proof.
00:52:10.620 We don't have that.
00:52:12.020 But to me, this looks
00:52:12.960 like at least
00:52:13.840 half mass hysteria.
00:52:18.560 So whether or not
00:52:19.400 the vaccinations,
00:52:20.520 so let me summarize
00:52:21.220 this as cleanly as
00:52:22.340 possible, because
00:52:23.380 it's easy to get
00:52:24.280 mischaracterized when
00:52:25.420 you do this kind of
00:52:26.520 topic.
00:52:27.880 My opinion is it's probably a mix. All these anecdotal reports are probably a mix of real red flags, meaning you really ought to take it seriously, but most of it is mass hysteria and easily identifiable as such.
00:52:46.860 Easily identifiable.
00:52:48.600 In my opinion, the
00:52:49.800 mass hysteria part is
00:52:50.640 obvious, right?
00:52:52.460 Which does not mean
00:52:53.420 you're wrong.
00:52:55.000 Can you handle that
00:52:56.000 nuance?
00:52:56.940 If I tell you that
00:52:57.980 most of what you're
00:52:58.820 seeing in the
00:52:59.360 anecdotal reports are
00:53:00.580 just mass hysteria,
00:53:01.640 can you accept that
00:53:03.080 that could be true?
00:53:04.120 At the same time,
00:53:05.560 at the same time,
00:53:06.740 there might be
00:53:08.040 enough of these
00:53:08.600 reports that are
00:53:09.320 true that you
00:53:11.220 need to take it
00:53:11.800 seriously.
00:53:13.840 But I figure
00:53:15.620 probably there are
00:53:16.300 only a few
00:53:16.900 conditions that
00:53:18.260 look credible,
00:53:19.320 and probably there
00:53:20.460 are a whole bunch
00:53:20.880 of other things
00:53:21.560 that are just
00:53:22.040 coincidence.
00:53:23.220 And the
00:53:23.820 coincidence part is
00:53:24.960 the mass hysteria
00:53:25.760 part.
00:53:27.340 How do you
00:53:28.000 identify mass
00:53:28.740 hysteria?
00:53:29.540 The clue is how
00:53:31.500 many different
00:53:32.340 explanations or
00:53:33.720 problems there
00:53:35.020 are.
00:53:39.040 Yeah, I'm
00:53:40.040 intentionally keeping
00:53:40.880 the camera in a
00:53:44.400 terrible mode,
00:53:45.180 because I've got
00:53:45.620 this injury on my
00:53:46.820 lip that looks like
00:53:48.200 monkeypox, but
00:53:49.040 isn't.
00:53:50.060 No, it's not a
00:53:50.760 herpes sore.
00:53:51.500 I know you're
00:53:51.920 going to say that.
00:53:53.320 I literally burned
00:53:54.900 myself very badly
00:53:55.780 on my lip.
00:53:58.740 Is Robert
00:54:00.240 Malone correct?
00:54:02.160 I doubt it.
00:54:04.340 If I had to
00:54:05.140 place a bet on
00:54:06.500 Dr. Malone being
00:54:08.380 correct on, let's
00:54:09.380 say, most of his
00:54:10.940 biggest claims, I'd
00:54:12.480 bet against him, and
00:54:13.400 I'd feel pretty
00:54:14.000 confident about that.
00:54:15.400 If you said bet
00:54:16.540 against everything
00:54:17.700 that Malone says
00:54:19.460 that's different from
00:54:20.280 what the mainstream
00:54:20.820 says, I wouldn't
00:54:21.400 take that bet.
00:54:23.140 I would agree with
00:54:25.080 you.
00:54:25.820 It's almost
00:54:26.480 certainly true that
00:54:28.020 some of the
00:54:28.500 things Malone is
00:54:29.280 saying are
00:54:30.480 valid, but I
00:54:32.640 think he goes
00:54:33.180 too far.
00:54:36.980 Call him rogue
00:54:38.040 to discredit
00:54:38.740 him.
00:54:39.220 Yeah, I do
00:54:39.660 call the rogue
00:54:40.700 doctors rogue to
00:54:41.980 discredit them, but
00:54:43.800 what I'm doing is
00:54:44.480 discrediting the
00:54:46.560 category for your
00:54:49.280 benefit.
00:54:50.220 So if I take a
00:54:51.440 specific doctor and
00:54:53.000 say that they're
00:54:54.960 presenting themselves
00:54:55.760 as a rogue doctor, I
00:54:57.620 do mean, I do
00:54:58.860 mean intentionally
00:54:59.660 for you to reduce
00:55:01.280 your belief in
00:55:02.020 them.
00:55:02.720 That's the point.
00:55:04.420 Doesn't mean they're
00:55:05.160 wrong.
00:55:06.240 If you take that as
00:55:07.460 I believe that my
00:55:08.760 medical information is
00:55:10.320 better than theirs and
00:55:11.080 you should treat them
00:55:11.760 as wrong, that's not
00:55:12.700 what I'm saying.
00:55:13.760 I'm saying that if
00:55:14.580 you're not a doctor and
00:55:16.040 you can't check it
00:55:16.740 yourself, and the
00:55:18.040 only thing you know is
00:55:19.100 that there's a rogue
00:55:19.960 doctor who differs from
00:55:21.800 the mainstream, you should probably bet on the mainstream.
00:55:25.560 The smart money says
00:55:26.480 the mainstream is
00:55:27.140 right, not the rogue
00:55:27.920 doctor.
00:55:28.800 You only think the
00:55:30.160 rogue doctors are
00:55:31.360 credible because you
00:55:32.960 only hear about the
00:55:33.960 ones that got it
00:55:34.660 right.
00:55:37.880 Follow the money,
00:55:38.740 Scott.
00:55:39.160 Doctors have little
00:55:39.880 or nothing to gain.
00:55:40.880 Really?
00:55:42.760 Really?
00:55:44.160 You don't think Dr.
00:55:45.360 Malone got some
00:55:46.020 speaking invitations?
00:55:49.960 You don't think he
00:55:51.060 monetized that?
00:55:51.800 You know, the other
00:55:54.720 thing you have to look
00:55:55.460 at is that Malone
00:55:57.720 presents himself as a
00:55:59.020 narcissist, and I'm
00:56:01.840 not going to say
00:56:02.300 that's bad because I
00:56:03.800 am one, and there are
00:56:05.060 two types.
00:56:06.280 One type of narcissist
00:56:07.400 wants to just destroy
00:56:09.040 everything, you know,
00:56:10.020 destroy their social
00:56:11.960 situation.
00:56:13.200 The other kind just
00:56:13.980 likes to get credit
00:56:14.920 for doing good stuff.
00:56:17.520 In my opinion, Malone
00:56:20.380 looks like a narcissist.
00:56:22.400 Meaning that his
00:56:23.360 payoff is having
00:56:24.780 people believe he was
00:56:25.900 right all along, and
00:56:27.420 that that would be
00:56:28.020 worth any economic
00:56:29.320 degradation that he
00:56:32.440 experienced.
00:56:33.920 So if he's
00:56:34.880 financially secure,
00:56:36.920 then his payoff would
00:56:38.100 be what they call the
00:56:39.280 fuel.
00:56:40.220 People say, you're
00:56:41.160 right, Malone, you
00:56:42.840 were right all along.
00:56:44.300 Now, if it turns out
00:56:45.500 later that he's right,
00:56:46.640 he'll make a fortune.
00:56:48.260 Do you get that?
00:56:49.220 If you say, well, he
00:56:51.860 sacrificed so much to
00:56:54.180 be a rogue, I say, no,
00:56:56.440 he invested so much to
00:56:58.120 be a rogue.
00:56:59.100 If his investment pays
00:57:01.320 off, and it turns out
00:57:02.440 he's right, he's going to
00:57:04.760 have like a speaking
00:57:05.580 career, books.
00:57:07.900 He will make, I would
00:57:09.640 say the value of Dr.
00:57:10.860 Malone being right in the
00:57:12.440 end, if it turns out that
00:57:13.500 way, would probably be
00:57:15.320 $20 million.
00:57:17.400 If I had to monetize
00:57:18.680 the value of him being
00:57:20.600 right in the end, it
00:57:21.760 would be about $20
00:57:22.500 million.
00:57:23.320 Because he'd sell a
00:57:24.180 book, and he'd go on the speaking circuit, and
00:57:27.340 he probably, I don't
00:57:28.360 know if he practices, but
00:57:29.560 if he practices medicine,
00:57:31.300 all the smart people would
00:57:32.340 say, I'm going to go to
00:57:33.140 you, because you've got
00:57:34.120 everything right.
00:57:36.560 Does that make him
00:57:37.400 wrong?
00:57:38.200 No, I don't know if he's
00:57:38.960 right or wrong.
00:57:40.420 How would I know?
00:57:41.180 So I can't give you an
00:57:42.680 opinion of whether he's
00:57:43.340 right or wrong.
00:57:44.140 I can tell you what it
00:57:45.000 looks like.
00:57:46.600 He's in a category that's
00:57:48.220 usually wrong.
00:57:49.540 It doesn't mean he's
00:57:50.560 wrong, but he's in a
00:57:51.280 category that's usually
00:57:52.160 wrong.
00:57:53.140 And he acts like he
00:57:55.620 presents himself as a
00:57:56.940 narcissist.
00:57:58.920 Again, that's subjective,
00:58:00.740 but I've seen enough of
00:58:02.100 him, and it takes one to
00:58:03.260 know one, right?
00:58:04.740 Like, nobody's more
00:58:05.620 narcissistic than I am.
00:58:08.360 But I'm also
00:58:09.140 transparent.
00:58:10.740 I'm transparent in the
00:58:11.940 sense that if I don't
00:58:13.460 do something that's
00:58:14.240 worthwhile to you, I
00:58:16.680 don't want any credit.
00:58:18.300 But if I do, and I do
00:58:20.180 something that helps, yeah,
00:58:21.520 I'd like you to mention
00:58:22.180 it, because that would
00:58:23.040 feel good to me, and
00:58:23.840 maybe I'll do some more
00:58:24.620 stuff for you.
00:58:25.780 It just works.
00:58:26.660 It's just a virtuous
00:58:27.340 cycle.
00:58:28.340 I give you something, you
00:58:29.480 give me something that I
00:58:30.480 want.
00:58:31.340 I want different stuff than
00:58:32.620 you want.
00:58:33.120 That's a perfect situation.
00:58:34.400 If we wanted the same
00:58:35.320 stuff, it might be a
00:58:36.280 problem.
00:58:37.300 But if I want credit,
00:58:39.080 and you want some
00:58:40.820 benefit that I can help
00:58:41.800 you get, well, we both
00:58:42.960 win.
00:58:44.160 That's cool.
00:58:45.200 So I don't care that he
00:58:46.800 might, in my opinion, have
00:58:49.220 that personality trait that I
00:58:50.520 have.
00:58:51.300 That part's fine.
00:58:52.040 How is that not mind reading?
00:58:57.280 That is statistical assumptions
00:59:01.540 short of mind reading.
00:59:03.600 Mind reading would say, I know
00:59:05.620 what he's thinking.
00:59:07.220 I don't say that.
00:59:08.880 I say, to me, he looks like
00:59:11.300 me.
00:59:12.980 So if I were to say, well, if
00:59:14.820 he looks like me, he's probably
00:59:15.900 like me, that's my speculation.
00:59:17.440 So I always put anything that
00:59:20.160 is somebody else's thoughts into
00:59:23.180 the speculative realm.
00:59:25.880 Using rogue is persuasion.
00:59:28.040 Yes, I am intentionally persuading
00:59:30.060 you to think that he's less
00:59:31.340 credible than he presents
00:59:33.040 himself.
00:59:34.540 I'm not trying not to do that.
00:59:37.600 I'm doing that very intentionally,
00:59:39.000 very openly.
00:59:40.420 I'm trying to encourage you to
00:59:43.100 put less credibility on anybody
00:59:45.220 who differs from the mainstream
00:59:47.520 in a way you want to hear.
00:59:49.840 Maybe that's the thing I should
00:59:51.060 have said.
00:59:52.080 All right, here's the better way
00:59:53.300 to say it.
00:59:53.600 I finally figured out how to
00:59:54.680 say this.
00:59:56.740 This took me forever.
00:59:58.380 I finally figured out how to
00:59:59.680 communicate this.
01:00:01.440 Don't believe people who are a
01:00:04.480 little too close to what you
01:00:05.740 want to be true.
01:00:08.300 Those are the least credible
01:00:09.900 people.
01:00:10.360 If they disagree with the mainstream, and it's exactly what you want to hear, it's right on the nose, the odds of that being true are low.
01:00:22.620 Are low.
01:00:26.700 It doesn't mean he's not right,
01:00:28.500 but the odds are low.
01:00:31.000 Here Scott describes himself
01:00:32.600 being a real, yeah, that's
01:00:34.740 exactly true.
01:00:35.840 So somebody says that I'm
01:00:37.800 describing myself.
01:00:38.840 Let's see if that's true.
01:00:41.020 I disagree with the
01:00:42.220 mainstream, and I tell you
01:00:44.480 things often that are exactly
01:00:45.980 what you want to hear.
01:00:47.440 Yeah, that's true.
01:00:49.180 That's true.
01:00:50.420 So what I do in public is I
01:00:54.560 make predictions and statements,
01:00:57.200 and then you get to test them.
01:00:59.180 So you'll get to see if it's
01:01:00.540 true.
01:01:01.160 Like anything that I tell you,
01:01:03.280 usually you just get to see for
01:01:04.700 yourself, right?
01:01:06.000 So if I told you that Biden would
01:01:08.320 be problematic as a president,
01:01:10.900 well, you get to see it
01:01:11.780 yourself.
01:01:12.660 Was I right or was I wrong?
01:01:14.440 And I keep score.
01:01:17.200 You know, I literally gave
01:01:18.800 myself a scorecard after the
01:01:20.720 pandemic.
01:01:21.760 Here's what I said.
01:01:23.040 Here's how it turned out.
01:01:24.180 And I included the things I got
01:01:25.720 wrong.
01:01:26.940 Now, if Malone does that, let's
01:01:29.260 say five years from now, we've
01:01:31.280 got a better idea, everything
01:01:33.640 else.
01:01:34.220 If Malone does that and says,
01:01:35.820 here's the stuff I said.
01:01:36.740 I got a couple of things wrong,
01:01:38.840 but here are the things I got
01:01:39.740 right.
01:01:41.060 Instantly, he would go right to
01:01:42.880 the top of my credibility list.
01:01:45.460 Right?
01:01:48.260 He would go right to the top of my
01:01:50.120 credibility list if he kept track
01:01:51.940 of his own statements and how he
01:01:53.680 got them right or wrong.
01:01:55.440 And then, you know, gave you an
01:01:56.760 honest accounting at the end.
01:01:58.440 Then I would lift him up a little
01:02:02.060 bit.
01:02:05.060 Malone is clearly right on myocarditis
01:02:07.420 as the CDC data is showing the
01:02:09.680 trend now.
01:02:11.500 He might be right in that.
01:02:12.880 But the myocarditis, as I understand
01:02:14.840 it, clears up.
01:02:18.740 And that...
01:02:22.060 Yeah.
01:02:25.220 So the myocarditis is not so much a
01:02:27.480 question of whether you get it,
01:02:28.780 because I think we've done that for a
01:02:29.960 long time.
01:02:30.980 It's a question of how bad it is in the long run, and whether that makes a difference to you compared to any protection you'd get from the shot.
01:02:42.460 Cardiac cells can't repair.
01:02:44.240 I don't think it's a cardiac cell
01:02:46.060 problem.
01:02:51.400 But COVID also causes myocarditis,
01:02:54.000 somebody says, right?
00:52:54.720 I'm afraid of long COVID, but myocarditis, no biggie.
01:03:08.040 Isn't myocarditis part of long COVID?
01:03:11.740 I'd put that in the same category, right?
01:03:14.940 If you get some kind of condition,
01:03:17.600 there are cardiac stem cells.
01:03:21.900 So I think the doctor is telling us that
01:03:24.080 the heart can repair itself.
01:03:29.140 What is the definition of long COVID?
01:03:30.960 Yeah, that's a good question.
01:03:32.920 In my opinion, it's anything that
01:03:34.820 continues to bother you after you had
01:03:37.320 COVID that's related to the COVID.
01:03:39.880 So I would think that if you had myocarditis
01:03:41.800 and it lasted, I don't even know if that
01:03:43.940 could happen.
01:03:44.340 Can myocarditis just continue?
01:03:48.920 I don't know.
01:03:51.800 But anything that bothers you after,
01:03:53.660 I would call long COVID,
01:03:55.300 whether it's temporary or permanent.
01:04:00.380 All right.
01:04:02.880 Do folks in upstate New York hunt deer?
01:04:05.600 Yes, they do.
01:04:09.060 Yes, they do.
01:04:10.500 Yeah, my parents were hunters.
01:04:12.480 They both were deer hunters.
01:04:14.340 You know, I do have a different opinion
01:04:20.320 of people who eat the meat.
01:04:22.520 And, you know, in upstate New York,
01:04:25.400 you could make an argument for conservation.
01:04:30.160 I mean, it's kind of a crazy argument.
01:04:32.060 You have to shoot the deer to protect the deer.
01:04:34.200 But there really would be too many deer.
01:04:36.940 That's no joke.
01:04:38.340 If the deer were not somehow population controlled,
01:04:43.340 and I'm not saying that's the best way to do it,
01:04:45.600 but there would be too many.
01:04:49.380 Yeah, either that or you'd have to introduce wolves,
01:04:51.300 and then you're still killing deer.
01:04:53.780 Yeah, there's no right answer.
01:04:55.560 There's no right answer.
01:04:56.840 But at least if you're eating the venison,
01:04:59.180 and we did.
01:04:59.900 I ate a lot of venison.
01:05:00.800 But if you eat the venison,
01:05:03.020 it's at least not as ethically disgusting.
01:05:14.940 Yeah.
01:05:17.900 Reintroducing wolves to Yellowstone fixed it.
01:05:20.560 Deer will destroy a place.
01:05:21.980 Yeah.
01:05:22.380 The deer will eat your crops.
01:05:23.800 All right.
01:05:28.900 I think that's all for now.
01:05:30.780 I believe we've done our job.
01:05:32.900 And I believe that I will talk to you tomorrow.