Real Coffee with Scott Adams - October 26, 2021


Episode 1542 Scott Adams: Today I Will Argue the Opposite of my Actual Opinions on the Pandemic. And More Fun


Episode Stats

Length: 58 minutes

Words per Minute: 152.50444

Word Count: 8,927

Sentence Count: 651

Misogynist Sentences: 7

Hate Speech Sentences: 21
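
For what it's worth, the figures above are internally consistent if the words-per-minute number is just the word count divided by the unrounded runtime. A minimal sketch of that arithmetic in Python (the exact relationship is an assumption; only the listed figures come from the page):

    # Listed episode stats (from the block above).
    word_count = 8927
    wpm = 152.50444  # words per minute, as listed

    # Implied runtime: word count divided by words per minute.
    duration_minutes = word_count / wpm  # ~58.54, shown above truncated to "58 minutes"

    # Reverse check: recomputing WPM from the implied runtime recovers the listed value.
    assert abs(word_count / duration_minutes - wpm) < 1e-9
    print(round(duration_minutes, 2))  # 58.54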


Summary

Dave Chappelle jokes that he's the only person who isn't safe at work at Netflix; Scott extends the point, arguing that at a Fortune 500 company Chappelle would out-perform an equally qualified white man, and that an equally qualified trans employee would likely beat both for a promotion.


Transcript

00:00:00.880 Bum, bum, bum, bum, bum.
00:00:03.480 Well, good morning, everybody.
00:00:09.280 What an amazing, amazing start of the day.
00:00:13.300 It turns out that you woke up today thinking,
00:00:16.820 well, today might be a good day, it might be a bad day,
00:00:18.960 but surprise, it's an amazing day,
00:00:21.320 because you made it here to the simultaneous sip.
00:00:24.540 And some of you are prepared,
00:00:26.240 some of you are scurrying now to grab your cup and your beverage.
00:00:30.180 But while you're doing that, let me give you the introduction,
00:00:33.300 the thing that makes everything better.
00:00:36.000 It's called the simultaneous sip.
00:00:37.620 A cup or a mug or a glass, that's all you need.
00:00:39.880 A tank or chalice or stein, a canteen jug or flask,
00:00:42.540 a vessel of any kind, fill it with your favorite liquid.
00:00:44.540 I like coffee.
00:00:45.800 And join me now for the unparalleled pleasure,
00:00:49.660 the dopamine hit of the day,
00:00:51.300 the thing that makes everything better,
00:00:53.940 the thing that connects us across space and time.
00:00:57.020 It's called the simultaneous sip,
00:00:58.820 and it's going to take your antibodies to a whole, whole new level.
00:01:03.400 Watch this.
00:01:04.540 Go.
00:01:09.560 Ah, antibodies.
00:01:11.340 Well, I know you don't like anecdotal evidence.
00:01:17.380 You like data.
00:01:18.520 You like hard data.
00:01:20.440 But let me give you some hard data.
00:01:23.920 Nobody has ever died of COVID
00:01:27.020 while taking the simultaneous sip.
00:01:30.580 You can fact check me on that.
00:01:31.820 Zero people out of 7.8 billion people on this earth,
00:01:36.440 not a single one of them has died of COVID
00:01:40.540 while doing the simultaneous sip.
00:01:43.740 None.
00:01:44.560 Not a single one.
00:01:46.120 Do you think that's coincidence or causation?
00:01:49.640 Clearly causation.
00:01:51.260 Clearly.
00:01:51.940 Because we know how to analyze things.
00:01:53.460 Well, in the news, Elon Musk,
00:01:56.460 his personal net worth
00:01:57.840 is now more than ExxonMobil.
00:02:02.000 His personal net worth.
00:02:04.220 No, not the value of Tesla.
00:02:06.780 Tesla is apparently already
00:02:09.920 much more valuable than ExxonMobil.
00:02:12.640 But Elon Musk himself,
00:02:15.480 he's worth $289 billion.
00:02:20.080 And by the way, at his current age,
00:02:22.940 he should be the first trillionaire.
00:02:26.220 Elon Musk will be our first trillionaire.
00:02:28.400 Pretty sure.
00:02:29.380 I mean, the odds are really good.
00:02:31.980 And he surpassed the value of ExxonMobil,
00:02:35.500 which is only $272 billion.
00:02:37.620 Did you know that ExxonMobil was only $272 billion?
00:02:43.340 I mean, I realize a billion is a lot of money,
00:02:45.560 but I thought they'd be bigger.
00:02:47.660 Actually, I was a little surprised.
00:02:49.420 I guess I should have known.
00:02:51.400 Well, Dave Chappelle addressed the issue
00:02:54.800 with the trans community
00:02:55.960 and the dust-up he's having with his Netflix special.
00:03:01.380 And he had this quite clever thing to say.
00:03:04.640 He said that the trans employees
00:03:09.060 and the LGBTQ employees at Netflix,
00:03:12.980 they say what they want is a safe working environment.
00:03:16.540 A safe working environment.
00:03:18.760 Don't we all?
00:03:19.800 Don't we all?
00:03:20.680 And don't we want that for everyone?
00:03:22.160 Of course.
00:03:22.780 We all want a safe working environment.
00:03:24.680 That's a good thing to ask for.
00:03:26.700 And we certainly want it for the trans community.
00:03:29.600 But as Dave Chappelle cleverly points out,
00:03:32.860 he's the only one who can't safely go to work at Netflix.
00:03:37.560 Now, he doesn't exactly work for Netflix anymore.
00:03:41.000 At least he's not doing any more specials that we know of.
00:03:43.920 But literally, he's the only person who isn't safe at work.
00:03:49.260 That's a pretty good comment.
00:03:50.580 But I'd like to add on to it.
00:03:55.080 Because I think it was a good social commentary.
00:03:58.680 But if I could extend it a little bit.
00:04:02.280 Dave Chappelle might, in fact, not be terribly safe if he worked at Netflix.
00:04:07.780 Because of the anger with the trans and LGBTQ community, I guess.
00:04:12.160 But if I, as a white man, worked for Netflix,
00:04:17.880 but let's not make it about Netflix.
00:04:19.620 Let's just make it generic.
00:04:21.520 A Fortune 500 company.
00:04:24.120 Who would do better at the Fortune 500 company?
00:04:28.520 Let's say equal amount of talent and experience.
00:04:32.100 Dave Chappelle or me.
00:04:34.700 Which one of us would do better in their career?
00:04:39.220 Equal talent and equal experience.
00:04:41.160 Not even close.
00:04:45.100 Right.
00:04:46.080 And who would do better than Dave Chappelle?
00:04:49.540 Who might do even better than Dave Chappelle at a Fortune 500 company?
00:04:54.620 Maybe a trans employee.
00:04:56.640 Because there might be more pressure.
00:04:59.920 Because it's a smaller community.
00:05:01.940 And it's harder to find candidates just because of the smaller pool.
00:05:05.880 I would guess that if a highly qualified black employee
00:05:10.240 was up for a promotion against a highly, let's say, equally qualified trans employee
00:05:15.980 in a Fortune 500 company, who wins?
00:05:18.740 Same qualifications.
00:05:21.580 I think trans wins.
00:05:23.520 And I think Dave's right about that, Dave Chappelle.
00:05:26.620 You know, he didn't say that directly,
00:05:28.000 but I feel that's the implication.
00:05:30.860 So, we have this weird situation where the only people who can get promoted
00:05:37.180 and, you know, have a safe experience are the ones complaining.
00:05:40.720 This is not to say they don't have valid complaints, okay?
00:05:46.580 So, make sure you hear this correctly.
00:05:50.600 Everybody's got valid complaints.
00:05:53.040 The trans people have tons of valid complaints.
00:05:56.360 They're valid, right?
00:05:58.100 You might not agree with them, but they're valid complaints.
00:06:00.880 You might disagree about what to do about it.
00:06:04.140 But we all have valid complaints.
00:06:05.620 And so, let's remember that, you know,
00:06:09.780 you're not alone if somebody's treating you poorly, right?
00:06:14.440 If there was one thing that we should all be able to band together
00:06:17.880 and agree on, is that we're all victims of assholes.
00:06:22.860 Am I right?
00:06:24.320 Like, if you could...
00:06:25.760 If you had...
00:06:26.440 Let's say you were an omnipotent being
00:06:30.220 and you could fix the earth and the racial strife.
00:06:35.080 Let's say you had all the power
00:06:37.500 that you could somehow do whatever it took
00:06:39.220 to fix racial division.
00:06:41.800 How would you do it?
00:06:43.220 Would you create a melting pot,
00:06:45.920 the United States model, and throw everybody together?
00:06:48.760 Or would you segregate people and say,
00:06:50.920 hey, one way to get rid of all this division
00:06:53.420 is just divide people.
00:06:55.520 You know, let them live with each other.
00:06:59.000 I don't know.
00:06:59.700 I don't think either of those would work.
00:07:01.440 Here's what I would do.
00:07:03.120 You ready?
00:07:03.720 Here's my solution to racial division.
00:07:08.140 Separate the assholes in all the groups.
00:07:11.340 Just put all the assholes together.
00:07:13.320 So you take all the black people who are also assholes
00:07:15.960 and you separate them out from the group of black people
00:07:19.420 who are awesome.
00:07:21.000 Then you take all the Irish people who are assholes
00:07:23.960 and you separate them out.
00:07:26.500 And you keep the Irish people who are not assholes.
00:07:28.560 And you put them together with the black Americans
00:07:31.720 who are not assholes.
00:07:33.360 How do they do?
00:07:35.580 Really well.
00:07:37.060 Really well, right?
00:07:38.180 They wouldn't have a problem at all.
00:07:40.420 Nothing to do with racism.
00:07:42.840 Everything to do with who's an asshole
00:07:44.560 and who's not an asshole.
00:07:46.260 Right?
00:07:46.340 So we're completely framing the problem wrong.
00:07:50.720 If you took the assholes out of the category of white people,
00:07:56.240 granted, it's a pretty big percentage,
00:07:59.420 but if you could take the assholes out of the average,
00:08:03.000 would white people look so bad?
00:08:04.500 No, no.
00:08:07.420 We'd look pretty good.
00:08:09.200 You take the assholes out.
00:08:11.340 And likewise,
00:08:12.800 do you think you could get along with somebody
00:08:15.020 from whatever community,
00:08:16.520 just pick a community,
00:08:17.560 LGBTQ,
00:08:19.260 you know,
00:08:19.660 women,
00:08:20.280 men,
00:08:21.340 black,
00:08:22.100 white,
00:08:22.700 Asian,
00:08:23.040 American,
00:08:23.560 whatever,
00:08:24.020 whatever you want.
00:08:25.180 Pick any group
00:08:25.860 and then remove the assholes from the group.
00:08:27.740 How do you like them?
00:08:29.340 They're great.
00:08:30.580 They're great.
00:08:31.940 You just remove the assholes.
00:08:34.300 Now,
00:08:34.780 the fact that we've been fooled into thinking
00:08:36.960 we have a race problem,
00:08:39.560 think about it.
00:08:40.340 Think about any other ethnic group
00:08:43.160 and then think of somebody you know personally
00:08:45.500 who just happens to not be an asshole.
00:08:48.260 Do you like them?
00:08:49.580 Yeah.
00:08:50.580 Would you hire them?
00:08:52.200 Yeah.
00:08:53.320 Would you date them?
00:08:55.060 Probably.
00:08:56.060 Yeah,
00:08:56.260 I mean,
00:08:56.540 that's a slightly different question,
00:08:57.780 but probably.
00:08:59.860 So,
00:09:00.220 we're all looking at the wrong question here.
00:09:03.500 Really,
00:09:03.820 we're just looking at the wrong question.
00:09:06.160 All right.
00:09:07.540 Kudos to the Biden administration.
00:09:09.820 Yeah,
00:09:10.120 I know.
00:09:10.660 Weird,
00:09:11.000 huh?
00:09:11.640 I'm going to give credit
00:09:12.720 to the Biden administration
00:09:13.940 for something.
00:09:15.200 Now,
00:09:15.600 it's conditional
00:09:16.280 because I'm not sure
00:09:17.880 this isn't a trick.
00:09:20.000 I don't know
00:09:21.000 that this will lead to something good,
00:09:22.580 but it's a good start.
00:09:24.060 And here's the setup.
00:09:25.760 Apparently,
00:09:26.160 the Biden administration
00:09:27.080 is expecting to name
00:09:28.900 Kim Wyman,
00:09:29.660 a Republican,
00:09:32.140 to a high position.
00:09:34.640 So,
00:09:35.260 Biden is going to put
00:09:36.180 a Republican
00:09:36.840 in a high position
00:09:38.760 in Homeland Security.
00:09:42.500 So,
00:09:42.860 remember,
00:09:43.360 Biden said he would, you know,
00:09:45.160 try to
00:09:46.060 unite the country,
00:09:47.400 be a little more bipartisan.
00:09:49.560 Now,
00:09:50.020 the one he picked
00:09:50.860 happens to be,
00:09:51.880 and it's not a coincidence,
00:09:53.200 of course,
00:09:54.860 someone who went hard at Trump
00:09:56.140 for
00:09:57.020 what CNN calls
00:09:58.100 his false claims
00:09:59.280 of fraud
00:10:00.840 in the election.
00:10:01.760 So,
00:10:02.160 it's a Republican
00:10:02.820 who has criticized Republicans.
00:10:06.520 Now,
00:10:07.140 yeah,
00:10:07.320 I know what you're saying.
00:10:08.280 Everybody's typing
00:10:08.980 RINO,
00:10:09.520 RINO,
00:10:09.900 RINO,
00:10:10.240 not a real Republican.
00:10:11.520 I get it.
00:10:12.080 I get it.
00:10:12.400 Can we stipulate?
00:10:14.480 Just so you'll stop typing
00:10:15.820 rhino in the comments
00:10:16.900 and type something else.
00:10:17.740 Can we stipulate
00:10:19.060 that a Republican
00:10:20.660 who's criticizing
00:10:21.800 other Republicans,
00:10:23.560 you have a name for them?
00:10:25.560 You don't have to
00:10:26.220 just keep printing it
00:10:27.300 over and over again
00:10:27.960 in the comments.
00:10:28.660 Just print anything else.
00:10:30.360 Just stop saying RINO.
00:10:32.620 I will stipulate
00:10:34.480 that you have that opinion.
00:10:36.300 Okay?
00:10:37.580 Okay?
00:10:37.920 Can we stipulate
00:10:38.700 so we don't have to
00:10:39.440 just keep saying it
00:10:40.280 over and over?
00:10:41.340 I get it.
00:10:42.300 You don't like RINOs.
00:10:44.000 I get it.
00:10:44.900 Stipulate it.
00:10:46.540 But,
00:10:46.940 who is more likely
00:10:49.060 to criticize
00:10:50.540 both Republicans
00:10:51.760 and Democrats?
00:10:53.440 Do you think
00:10:54.120 a RINO
00:10:54.540 doesn't
00:10:55.420 criticize Democrats?
00:10:58.080 Of course they do.
00:10:59.400 The RINO
00:10:59.900 is going to
00:11:00.640 criticize Democrats
00:11:01.840 but also Republicans.
00:11:03.660 I know you don't
00:11:04.360 like that.
00:11:05.460 And in some cases
00:11:06.220 you say,
00:11:07.020 oh,
00:11:07.220 they don't have
00:11:07.620 the right views.
00:11:09.480 Man,
00:11:10.000 you do not like
00:11:10.640 people who go
00:11:11.420 against your team,
00:11:12.220 do you?
00:11:13.220 The Republicans
00:11:14.040 out here
00:11:14.500 are just going
00:11:14.960 freaking nuts.
00:11:15.960 Let me finish
00:11:18.600 my point
00:11:19.020 and see what
00:11:19.340 you think.
00:11:19.840 Okay?
00:11:20.480 Now,
00:11:20.960 if your choice
00:11:21.520 was to
00:11:22.080 put a pure
00:11:23.560 Democrat
00:11:23.960 in the job,
00:11:24.760 would that be
00:11:25.260 better or worse
00:11:25.940 than picking a RINO?
00:11:29.240 Let me know.
00:11:30.460 Better or worse?
00:11:33.140 What's better
00:11:33.840 or worse?
00:11:35.700 The same.
00:11:36.660 Somebody says
00:11:37.120 the same.
00:11:37.900 You think
00:11:38.400 that a RINO
00:11:39.800 would not
00:11:41.440 criticize a Democrat?
00:11:42.340 What would
00:11:44.520 stop a RINO
00:11:45.100 from criticizing
00:11:45.940 a Democrat?
00:11:47.280 Nothing,
00:11:47.920 right?
00:11:48.660 I think
00:11:49.500 Biden actually
00:11:50.700 picked the one
00:11:51.580 kind of person
00:11:52.480 who's definitely
00:11:53.800 not a Republican
00:11:54.580 in the classic
00:11:56.180 way,
00:11:57.460 but someone
00:11:57.940 who has a
00:11:58.520 pretty good
00:11:59.080 chance of
00:11:59.780 at least
00:12:00.280 understanding
00:12:01.700 the argument
00:12:02.300 on two sides.
00:12:05.020 I think
00:12:05.620 that's a
00:12:06.580 reasonable step.
00:12:08.320 But anyway,
00:12:08.800 this person
00:12:09.160 will be in
00:12:09.740 charge of
00:12:10.340 protecting
00:12:10.760 future elections.
00:12:11.580 So being a
00:12:12.140 resource to
00:12:12.800 the states
00:12:13.320 to help
00:12:14.460 them prevent
00:12:15.520 being hacked
00:12:16.240 specifically.
00:12:17.900 So her job
00:12:19.200 is to keep
00:12:19.640 foreign interference
00:12:20.520 out of the
00:12:20.940 elections.
00:12:21.800 And she
00:12:22.080 would be a
00:12:22.420 resource to
00:12:22.960 the states.
00:12:23.900 Now,
00:12:24.780 is that good?
00:12:26.800 All right.
00:12:28.980 I'm a little
00:12:29.780 disappointed in
00:12:30.660 you,
00:12:30.980 honestly.
00:12:32.580 I'm a little
00:12:33.360 disappointed in
00:12:34.360 the audience.
00:12:35.140 I have to
00:12:35.820 admit.
00:12:36.860 I can't
00:12:37.460 hide it.
00:12:39.180 If you
00:12:39.840 don't understand
00:12:40.500 the point
00:12:40.920 that a
00:12:41.280 RINO would
00:12:42.200 criticize
00:12:42.600 both sides
00:12:43.500 and you
00:12:44.420 really think
00:12:44.900 that a
00:12:45.140 RINO is
00:12:45.560 really just
00:12:46.060 a Democrat,
00:12:47.980 I mean,
00:12:48.340 maybe.
00:12:48.860 I mean,
00:12:49.100 that could
00:12:49.340 apply to
00:12:49.780 somebody,
00:12:50.440 I guess.
00:12:51.140 But I
00:12:51.460 don't think
00:12:51.740 that's keen
00:12:53.400 insight.
00:12:53.880 I think you
00:12:55.920 should be able
00:12:56.380 to agree with
00:12:57.060 me that a
00:12:57.580 RINO could
00:12:58.220 criticize both
00:12:59.120 sides.
00:13:00.480 Can I get
00:13:00.860 that?
00:13:01.960 Can I get
00:13:02.960 that a
00:13:04.120 RINO would
00:13:04.640 criticize both
00:13:05.460 sides?
00:13:07.180 Just that.
00:13:07.780 You could
00:13:08.720 say you
00:13:08.980 don't like
00:13:09.300 it for
00:13:09.480 other reasons.
00:13:10.300 But just
00:13:10.640 that.
00:13:12.700 All right.
00:13:13.160 So some
00:13:13.840 of you agree
00:13:14.300 with it and
00:13:14.760 others say
00:13:15.600 no.
00:13:16.680 I think some
00:13:17.280 of you think
00:13:17.760 a RINO is
00:13:18.400 somebody who's
00:13:18.860 trying to
00:13:19.200 destroy the
00:13:19.840 Republican Party.
00:13:20.760 Is that what
00:13:21.060 you think?
00:13:22.320 Do you think
00:13:22.720 a RINO is
00:13:23.420 somebody who's
00:13:23.880 trying to
00:13:24.220 destroy the
00:13:24.900 party?
00:13:26.060 Because I
00:13:26.560 don't see
00:13:26.820 that.
00:13:28.240 I see it as
00:13:28.700 somebody who
00:13:29.160 likes them,
00:13:29.680 just wants
00:13:30.040 them to be a
00:13:30.480 little different
00:13:30.880 than they are.
00:13:31.380 All right.
00:13:33.360 So I think
00:13:34.000 that whole
00:13:34.340 RINO thing
00:13:35.000 is just
00:13:35.300 sort of a
00:13:35.820 weird
00:13:37.680 stimulant
00:13:39.740 that just
00:13:40.320 causes people
00:13:40.900 to go nuts.
00:13:43.500 Oh, you
00:13:44.020 know what
00:13:44.200 it is?
00:13:44.440 Maybe it's
00:13:44.840 the uncanny
00:13:46.760 valley.
00:13:48.100 That if
00:13:48.440 somebody's
00:13:48.860 exactly like
00:13:49.640 a Republican
00:13:50.180 and you're
00:13:50.880 a Republican,
00:13:51.540 you say,
00:13:51.880 yeah, right
00:13:52.720 on, that's
00:13:53.160 exactly what
00:13:53.800 I expected.
00:13:54.920 But if
00:13:55.300 somebody's
00:13:55.720 like a
00:13:56.360 Republican but
00:13:57.060 not quite,
00:13:58.500 you know,
00:13:58.820 the RINO
00:13:59.240 thing, well
00:14:00.440 they're like
00:14:00.880 one but
00:14:01.260 not quite,
00:14:01.840 does it
00:14:02.160 cause
00:14:02.400 revulsion
00:14:02.920 like the
00:14:04.380 uncanny
00:14:04.860 valley?
00:14:05.420 If you've
00:14:05.940 never heard
00:14:06.300 of a
00:14:06.720 phrase, the
00:14:07.820 phrase uncanny
00:14:09.040 valley, just
00:14:10.380 Google it
00:14:10.920 separately.
00:14:11.660 It's kind of
00:14:12.300 a cool concept
00:14:13.000 to know.
00:14:13.740 It's about
00:14:14.120 how robots
00:14:14.960 become grotesque
00:14:16.500 when they get
00:14:17.040 close to looking
00:14:17.880 exactly like
00:14:18.600 people, but
00:14:19.900 not quite
00:14:20.360 exactly.
00:14:21.220 They're just
00:14:21.560 close to
00:14:22.060 exact and
00:14:23.160 they become
00:14:23.740 disgusting like
00:14:24.980 zombies.
00:14:26.460 You know,
00:14:26.560 they just,
00:14:26.980 ugh, you
00:14:27.480 see somebody
00:14:28.400 who's almost
00:14:28.960 a person,
00:14:29.500 you're like,
00:14:29.740 ugh, what
00:14:30.200 the hell's
00:14:30.480 wrong?
00:14:31.260 Maybe that's
00:14:32.140 what it is
00:14:32.520 with the
00:14:33.120 RINOs.
00:14:34.180 They're like,
00:14:34.680 they're almost
00:14:35.240 Republicans, but
00:14:36.060 ugh, what is
00:14:36.940 wrong with
00:14:37.360 them?
00:14:37.840 Ugh, might be
00:14:39.440 something like
00:14:39.940 that.
00:14:40.760 Anyway, I'm
00:14:42.520 going to
00:14:43.120 withhold my
00:14:44.520 criticism because
00:14:45.320 if the job of
00:14:47.180 this RINO,
00:14:49.540 as you call her,
00:14:50.460 is to protect
00:14:51.620 elections
00:14:53.280 from foreign
00:14:53.780 interference, is
00:14:55.180 there any way to
00:14:55.700 do that without
00:14:56.320 knowing that you
00:14:57.160 can also fully
00:14:59.440 audit the
00:15:00.020 election?
00:15:01.240 So here's my
00:15:01.840 stand.
00:15:02.820 Anything short
00:15:03.760 of an explicit
00:15:05.400 effort to make
00:15:07.960 all elections
00:15:08.740 instantly
00:15:09.460 auditable, so
00:15:10.860 you can track
00:15:11.440 your vote
00:15:11.980 individually all
00:15:12.900 the way through,
00:15:14.300 anything but
00:15:15.220 that is not
00:15:16.920 good enough.
00:15:18.200 Not good enough
00:15:19.120 because you
00:15:19.440 would never
00:15:19.740 know if any
00:15:21.140 influence happened.
00:15:22.000 Now, influence
00:15:23.100 may be beyond
00:15:24.380 just vote
00:15:25.140 counting, right?
00:15:25.960 It could cause
00:15:26.820 you to vote the
00:15:27.380 way you do.
00:15:28.160 That's a separate
00:15:28.840 question.
00:15:29.980 But somebody's
00:15:31.260 got to be in
00:15:31.680 charge of making
00:15:32.180 sure the elections
00:15:33.140 are instantly
00:15:33.940 auditable.
00:15:35.280 And if this
00:15:35.860 Kim Wyman is
00:15:37.820 not that person,
00:15:39.420 then this is a
00:15:40.160 complete failure.
00:15:41.640 Complete failure
00:15:42.480 by the Biden
00:15:43.040 administration.
00:15:44.260 I mean, it's
00:15:44.840 good that they
00:15:45.400 have a resource
00:15:45.960 to help against
00:15:46.840 hacking, but
00:15:48.480 it's not even
00:15:49.400 close to what
00:15:51.180 the job
00:15:51.960 requires.
00:15:52.980 The job
00:15:53.460 requires that
00:15:54.480 we have at
00:15:54.900 least an
00:15:55.340 explicit goal.
00:15:57.040 Now, I get
00:15:57.540 that the states
00:15:58.240 have control
00:15:59.020 of the elections,
00:15:59.760 right?
00:16:00.360 I will stipulate
00:16:01.440 that the states
00:16:02.880 have control
00:16:03.480 of the elections.
00:16:04.600 But somebody,
00:16:05.940 maybe the
00:16:06.520 federal government
00:16:07.200 should say,
00:16:08.220 the objective
00:16:08.940 is to be
00:16:09.820 fully auditable
00:16:10.720 and fairly
00:16:11.900 instantly.
00:16:13.320 That should at
00:16:14.080 least be the
00:16:14.540 objective, even
00:16:15.220 if the states
00:16:15.840 say we can't
00:16:16.460 get there or
00:16:17.180 we need
00:16:18.080 resources to
00:16:18.900 do it or
00:16:19.420 we don't want
00:16:19.980 to do it for
00:16:20.420 one reason or
00:16:21.000 not, at
00:16:21.740 the very
00:16:22.100 least it
00:16:22.760 should be
00:16:23.000 the objective.
00:16:25.320 So I'm
00:16:26.340 going to say
00:16:26.700 it's a small
00:16:27.900 step in
00:16:28.540 slightly the
00:16:29.200 right direction
00:16:29.800 and I'll
00:16:30.140 give them
00:16:30.360 credit for
00:16:30.780 that, but
00:16:31.740 I don't see
00:16:32.460 evidence that
00:16:33.140 it's anywhere
00:16:33.540 near enough.
00:16:37.500 Rhinos have
00:16:38.160 no natural
00:16:38.920 predators.
00:16:40.640 That's
00:16:40.960 interesting.
00:16:42.540 Somebody says
00:16:43.260 that's not the
00:16:43.860 goal.
00:16:44.080 It's my
00:16:44.400 goal.
00:16:45.420 It's my
00:16:45.880 goal.
00:16:46.100 It should be
00:16:47.880 your goal,
00:16:48.420 right?
00:16:49.060 Is there
00:16:49.300 anything?
00:16:51.340 A RINO
00:16:51.340 criticizes the
00:16:52.220 GOP for
00:16:52.820 the wrong
00:16:53.220 things, but
00:16:57.260 that's just
00:16:57.780 an opinion
00:16:59.060 that it's
00:16:59.500 the wrong
00:16:59.860 things, right?
00:17:01.620 I mean, they
00:17:01.980 just have a
00:17:02.400 different opinion.
00:17:05.780 You don't
00:17:06.360 need to argue
00:17:06.900 with me that
00:17:07.420 you disagree
00:17:07.980 with a
00:17:08.980 RINO.
00:17:10.960 I mean,
00:17:11.760 there's nothing
00:17:12.520 there.
00:17:13.380 I get it.
00:17:14.060 You disagree
00:17:14.540 with them.
00:17:14.900 There's nothing
00:17:15.260 else to be
00:17:15.680 said about it,
00:17:16.280 really.
00:17:16.880 It's not
00:17:17.300 interesting in
00:17:17.960 any way.
00:17:21.540 All right.
00:17:23.800 So,
00:17:25.360 Saturday Night
00:17:26.180 Live, how
00:17:26.720 many people
00:17:27.580 saw Saturday
00:17:28.280 Night Live
00:17:28.900 with Jason
00:17:29.520 Sudeikis
00:17:31.320 roasting Biden?
00:17:34.100 Did anybody
00:17:34.420 see that?
00:17:35.740 Because it
00:17:36.120 felt like a
00:17:36.860 change in
00:17:38.880 tone, didn't
00:17:39.440 it?
00:17:39.660 Did it
00:17:41.420 look to
00:17:41.780 you like
00:17:42.220 SNL just
00:17:43.360 said, oh,
00:17:44.460 hell, we
00:17:46.240 just can't
00:17:47.060 ignore this
00:17:47.700 anymore?
00:17:48.680 Biden's a
00:17:49.820 train wreck?
00:17:51.220 It looked
00:17:51.700 like they
00:17:52.040 went at him
00:17:52.460 pretty hard.
00:17:53.340 They went
00:17:53.660 after his
00:17:54.280 sniffing.
00:17:56.260 They went
00:17:56.920 after his
00:17:57.600 decline, you
00:17:58.920 know, how
00:17:59.160 he used to
00:17:59.520 be all
00:17:59.900 energetic, and
00:18:01.020 now there's
00:18:01.660 not much
00:18:02.020 left of
00:18:02.440 him.
00:18:03.060 I thought
00:18:03.600 it was
00:18:03.860 pretty brutal.
00:18:06.180 Now, are
00:18:07.200 you surprised?
00:18:07.860 Because, you
00:18:09.740 know, you've
00:18:10.020 seen that
00:18:10.460 CNN has
00:18:11.240 started to
00:18:11.700 move against
00:18:12.180 him.
00:18:12.500 You've seen
00:18:12.900 Biden's poll
00:18:14.200 numbers are
00:18:14.860 crashing.
00:18:16.980 CNN feels
00:18:17.900 like one of
00:18:19.060 the last
00:18:19.540 holdouts,
00:18:20.520 doesn't it?
00:18:21.320 Like, if you
00:18:22.020 can get
00:18:22.360 SNL to
00:18:23.560 say, all
00:18:24.000 right, you
00:18:24.820 know, we
00:18:25.340 had good
00:18:26.120 hopes, but
00:18:26.600 it didn't
00:18:26.880 work out,
00:18:28.060 which is,
00:18:29.600 somebody says
00:18:30.760 that they
00:18:31.020 went after
00:18:31.440 him very
00:18:31.820 soft.
00:18:32.700 It was
00:18:33.440 gentle in
00:18:34.280 a sense, but
00:18:34.900 they hit all
00:18:35.320 the points.
00:18:36.700 It did
00:18:37.080 seem, it
00:18:37.640 did seem
00:18:38.180 like there
00:18:38.500 was a
00:18:38.940 gentle
00:18:39.360 edge to
00:18:39.840 it.
00:18:39.960 You're
00:18:40.100 right.
00:18:40.400 You're
00:18:40.640 right about
00:18:40.920 that.
00:18:41.540 But they
00:18:42.020 did hit
00:18:42.300 all the
00:18:42.560 points.
00:18:43.300 They did
00:18:43.600 get his
00:18:43.900 mental
00:18:44.160 decline.
00:18:44.840 They did
00:18:45.080 get, you
00:18:46.180 know, his
00:18:47.320 craziness.
00:18:53.900 Who's a
00:18:54.860 zaobiding?
00:18:58.280 Who is that?
00:18:59.820 Sounds like
00:19:00.320 there's something
00:19:00.760 good that I
00:19:01.560 don't know
00:19:01.800 about.
00:19:03.600 Anyway, the
00:19:04.260 Wall Street
00:19:04.600 Journal has
00:19:05.280 also sounded
00:19:07.420 the alarm,
00:19:08.080 the editorial
00:19:08.740 board.
00:19:09.520 So it's
00:19:09.840 not just
00:19:10.200 an article.
00:19:11.560 The editorial
00:19:12.220 board of the
00:19:12.860 Wall Street
00:19:13.160 Journal basically
00:19:14.420 said we have
00:19:15.120 to stop ignoring
00:19:15.880 the fact that
00:19:16.660 Biden is
00:19:17.380 incompetent.
00:19:18.660 Like, you
00:19:19.140 know, mentally
00:19:19.940 incompetent, not
00:19:20.800 just bad at his
00:19:21.500 job.
00:19:22.660 That's a pretty
00:19:23.120 big deal.
00:19:23.940 So now SNL has
00:19:25.020 basically just put
00:19:25.860 it out there.
00:19:26.980 And now the
00:19:27.740 Wall Street Journal
00:19:28.400 editorial board
00:19:29.360 basically just
00:19:30.820 reports it like an
00:19:31.920 obvious fact.
00:19:32.720 You don't even
00:19:33.100 have to, you
00:19:34.020 don't even need the
00:19:34.640 medical examination.
00:19:35.880 It's just sort of
00:19:36.700 obvious now.
00:19:40.440 That's pretty
00:19:41.100 amazing.
00:19:43.340 And I would
00:19:45.060 love to ask
00:19:45.860 people privately
00:19:47.780 who were big
00:19:48.480 supporters of
00:19:49.500 Biden if they
00:19:50.820 got what they
00:19:51.340 wanted.
00:19:52.980 I would love,
00:19:53.580 has anybody had
00:19:54.200 that conversation
00:19:54.900 privately?
00:19:55.900 Has anybody had a
00:19:56.980 private conversation?
00:19:58.100 Because you act
00:19:58.660 differently in public.
00:19:59.860 But has anybody
00:20:00.320 had a private
00:20:01.000 conversation with a
00:20:02.020 Democrat who is
00:20:04.320 regretting their
00:20:06.080 choice?
00:20:09.100 Because people
00:20:10.260 tend to have
00:20:10.780 cognitive dissonance
00:20:11.720 and say they
00:20:12.220 were right even
00:20:13.220 if they were
00:20:13.600 wrong.
00:20:17.880 Yeah, you don't
00:20:18.480 see much of it.
00:20:20.460 Look how little
00:20:21.180 there is.
00:20:21.600 You'd expect there
00:20:22.260 to be a lot of
00:20:22.760 it because the
00:20:23.260 poll numbers are
00:20:24.000 plunging, right?
00:20:24.840 So therefore there
00:20:25.560 should be a lot of
00:20:26.040 people who change
00:20:26.620 their minds.
00:20:28.320 But you can't
00:20:28.960 find them.
00:20:30.480 Yeah.
00:20:30.600 Oh, here's
00:20:31.560 some.
00:20:32.420 So, yep, my
00:20:33.160 neighbor, Lisa
00:20:34.220 says.
00:20:37.620 There's still
00:20:38.200 anything but
00:20:38.780 Trump.
00:20:39.260 Yeah.
00:20:39.740 So I guess they
00:20:40.360 can defend their
00:20:40.960 choice because it's
00:20:41.720 anything but Trump.
00:20:42.680 I guess you always
00:20:43.400 have that to fall
00:20:43.920 back on.
00:20:44.920 Here's my question.
00:20:45.940 If we could put a
00:20:46.700 Tesla automobile in
00:20:48.400 space, and when I
00:20:50.080 say we, I don't mean
00:20:51.020 me, because I had
00:20:52.280 nothing to do with
00:20:53.000 it.
00:20:53.540 But humans, at
00:20:55.420 least some of us,
00:20:56.020 are smart enough to
00:20:56.820 get together and
00:20:57.520 launch an automobile
00:20:59.420 into space.
00:21:01.320 And yet, we can't
00:21:03.540 invent a fake gun that
00:21:05.520 looks real in a
00:21:06.340 movie.
00:21:07.640 Really?
00:21:08.900 Really.
00:21:09.420 The only way to
00:21:10.240 shoot a movie, which
00:21:11.540 is all make-believe,
00:21:13.200 is with a real gun
00:21:14.220 that could at least
00:21:15.980 potentially fire real
00:21:17.320 ammunition.
00:21:18.540 Nobody can make a
00:21:19.400 fake gun that would
00:21:21.300 be an actual prop as
00:21:22.580 opposed to a real gun
00:21:23.500 that they pretend is a
00:21:24.420 prop.
00:21:25.820 Can't do that?
00:21:27.780 Nobody's figured that
00:21:28.620 out yet?
00:21:28.940 It does feel as though
00:21:30.540 there is a market for
00:21:31.880 that.
00:21:32.820 Now, my understanding is
00:21:33.920 that there are such
00:21:34.860 things as prop guns.
00:21:37.260 And a prop gun,
00:21:38.240 actually, you couldn't
00:21:38.960 put a bullet in it if
00:21:39.840 you wanted.
00:21:40.240 It would just be
00:21:40.680 blocked.
00:21:41.900 But I imagine the
00:21:43.420 prop gun doesn't have
00:21:46.120 the same action, or
00:21:47.480 maybe it doesn't flash
00:21:48.700 the same, or have the
00:21:49.700 kick, or whatever it is
00:21:50.740 that makes it look
00:21:51.960 realistic in a movie.
00:21:52.920 But we couldn't figure
00:21:55.020 out how to solve those
00:21:56.240 problems?
00:21:57.880 I mean, that seems like
00:21:58.700 a solvable problem to me,
00:22:01.220 to have a gun that makes
00:22:02.440 a flash without a bullet.
00:22:04.520 You couldn't make an
00:22:06.040 electronic gun that just
00:22:08.080 flashes a light
00:22:09.480 out the barrel or
00:22:11.760 something.
00:22:13.180 I mean, really?
00:22:15.300 How hard is that?
00:22:16.880 Well, we're also finding
00:22:17.840 out how dumb Hollywood is.
00:22:20.120 I'm going to read you
00:22:21.060 this story, and you're
00:22:22.340 going to swear, if you
00:22:23.240 haven't already heard
00:22:23.880 the story, you're going
00:22:25.520 to swear I'm making this
00:22:26.440 up.
00:22:27.500 All right?
00:22:28.140 Anybody who owns a gun,
00:22:29.700 or has been around guns,
00:22:30.680 or knows anything about
00:22:31.880 guns, you're going to
00:22:33.640 think I made this up.
00:22:35.480 This will blow your
00:22:37.420 frickin' mind.
00:22:39.320 All right?
00:22:40.040 I swear to God, this is
00:22:41.820 true.
00:22:42.900 And I'm going to read
00:22:43.600 it.
00:22:45.060 This is one from CNN.
00:22:46.520 There's this pastime that
00:22:50.040 crew members sometimes
00:22:51.160 do, and this is coming
00:22:52.120 from somebody who knows
00:22:54.300 that world, somebody named
00:22:56.220 Waxman.
00:22:57.120 It's called plinking.
00:22:59.680 So there's something that
00:23:00.720 movie crew members do
00:23:02.020 called plinking.
00:23:03.540 And they go out into
00:23:04.800 rural areas, and they
00:23:06.420 shoot at beer cans.
00:23:08.240 This is with live ammunition.
00:23:11.640 In other words, they take
00:23:13.920 the guns that are used for
00:23:15.800 the movie, they take
00:23:18.620 them to somewhere in a
00:23:21.100 rural area, they replace
00:23:23.620 the blanks with real
00:23:26.780 bullets, and then they
00:23:28.940 shoot at targets, and then
00:23:31.480 they return the gun.
00:23:36.160 I'm not making that up.
00:23:39.240 They actually do that.
00:23:40.860 Now, everybody who's had
00:23:44.700 gun training, hold on to the
00:23:47.500 top of your head, because
00:23:49.000 it's coming off.
00:23:49.960 It's coming off, isn't
00:23:50.960 it?
00:23:51.640 Okay, ah, ah, ah, the top
00:23:53.480 of my head.
00:23:54.180 Ah, ah, ah, ah.
00:23:58.340 Honestly, can you think of
00:23:59.520 anything dumber?
00:24:01.420 If you were going to teach a
00:24:03.660 class in gun safety, can you
00:24:06.460 imagine a better anecdote of
00:24:08.440 bad behavior?
00:24:10.660 Now, I know what the movie
00:24:12.260 people are going to say.
00:24:14.480 They're going to say, Scott,
00:24:15.840 Scott, Scott.
00:24:17.220 There's somebody in charge of
00:24:18.540 checking all the guns before
00:24:20.660 they're handed to an actor.
00:24:21.740 I mean, it's not like a live
00:24:24.560 round is still going to be in
00:24:25.980 the chamber.
00:24:27.920 Scott, you fool.
00:24:29.760 That's why we hire somebody.
00:24:31.280 We hire somebody just for that
00:24:32.900 job, to make sure that gun is
00:24:34.340 safe before it gets handed to an
00:24:35.680 actor.
00:24:37.900 And all the people who have
00:24:39.160 any gun training whatsoever
00:24:40.600 say, you are the dumbest
00:24:43.580 person in the world if you
00:24:44.560 think that's a good idea.
00:24:45.880 Because now you've taken the
00:24:47.420 failure point and you've put
00:24:50.140 it on one person who's
00:24:51.520 underpaid and may not be
00:24:53.080 paying attention, might come
00:24:54.280 in with a hangover that day.
00:24:56.240 And lives would be at stake for
00:24:58.840 somebody just to skip a step or
00:25:01.200 something.
00:25:02.100 You would never, ever put a
00:25:05.140 live round in a gun that's
00:25:06.740 going to be used in a movie,
00:25:08.460 no matter how sure you were
00:25:10.740 that you were going to remove
00:25:12.280 it.
00:25:12.700 This gets me to a related
00:25:14.260 topic.
00:25:15.480 So I've had a side conversation
00:25:17.320 with somebody who works on
00:25:19.340 movie sets a lot.
00:25:20.860 You know, he's often on movie
00:25:21.820 sets.
00:25:22.300 It's his job.
00:25:24.040 And he's arguing on the
00:25:26.540 internet and can't understand
00:25:27.880 why people don't get his
00:25:29.060 argument.
00:25:29.380 And he argued that Alec
00:25:32.140 Baldwin is either not
00:25:35.120 responsible or not 100%
00:25:36.560 responsible because there
00:25:38.980 was somebody whose job it
00:25:40.020 was, and it's the routine
00:25:41.700 way that movies are done,
00:25:43.560 somebody whose job it was to
00:25:44.860 make sure that gun was safe
00:25:46.040 before it was given to him.
00:25:47.460 And therefore, the people who
00:25:49.240 are saying that Alec Baldwin is
00:25:50.580 fully responsible don't
00:25:52.860 understand because clearly it
00:25:54.700 was somebody else's
00:25:55.400 responsibility.
00:25:56.860 And the whole point was to
00:25:58.740 keep the actor just as an
00:26:00.140 actor and not have to worry
00:26:01.280 about this stuff.
00:26:02.560 So, Scott, don't you
00:26:03.540 understand?
00:26:04.620 Don't you understand that the
00:26:06.860 responsibility was, you know,
00:26:09.240 sort of shared, at least, with the
00:26:11.940 person whose job it was?
00:26:14.120 Here's my counter-argument.
00:26:15.800 Are you ready?
00:26:16.040 There's no such thing as one kind of
00:26:19.260 responsibility.
00:26:21.300 And as soon as you imagine that
00:26:22.900 there's one thing called
00:26:24.140 responsibility and it can be
00:26:25.720 defined one way, well, you're lost
00:26:28.600 because there's nothing like that.
00:26:30.800 There's legal responsibility, which
00:26:33.840 has a standard.
00:26:35.580 There's financial responsibility, which
00:26:38.120 might be different than the legal
00:26:40.000 responsibility.
00:26:40.740 Could be the same, but could be
00:26:42.520 different.
00:26:42.800 Then there's what I would call
00:26:44.760 common sense responsibility.
00:26:46.940 It has nothing to do with the law
00:26:48.980 or anything else.
00:26:49.720 It's just, what makes sense to you?
00:26:52.520 Like, as an ordinary person, what
00:26:54.840 makes sense?
00:26:56.460 And then there's gun responsibility.
00:26:59.840 Back me on this, gun owners, so the
00:27:02.820 rest of you who are not maybe trained
00:27:04.440 in firearms, look to the comments of
00:27:06.920 the people who are and see if you agree
00:27:08.540 with the statement, gun responsibility
00:27:11.540 is absolute.
00:27:14.760 It's absolute.
00:27:15.980 So there's no such thing as sharing it.
00:27:18.520 It's absolute.
00:27:20.020 And it's absolute for everybody who
00:27:21.700 touches it.
00:27:23.240 In other words, the gun owner, or the
00:27:25.920 person who has it in their hand, let's
00:27:27.120 say, the person who has it in their
00:27:28.640 hand is 100% responsible for what
00:27:30.960 happens.
00:27:32.200 100%.
00:27:32.560 Even if the person who gave it to
00:27:36.680 him is also 100% responsible.
00:27:39.540 So gun ownership is the only situation
00:27:41.480 in which the responsibility can add up
00:27:43.240 to over 100%.
00:27:44.460 Am I right?
00:27:46.400 Which is illogical, right?
00:27:48.340 Because responsibility can't add up to
00:27:50.120 more than 100%.
00:27:50.840 But let me say it again.
00:27:52.780 It does.
00:27:53.380 Because gun owners, gun owners, don't
00:27:58.560 fuck around when it comes to safety.
00:28:02.140 Right?
00:28:02.580 There are absolutes.
00:28:04.780 And when you're trying to argue with a
00:28:06.180 gun owner, but don't you get that the
00:28:08.800 system was that this other person takes
00:28:11.180 responsibility?
00:28:12.340 Both people had agreed.
00:28:14.660 You know, Alec Baldwin knew that.
00:28:16.440 The person responsible knew it.
00:28:18.060 They had agreed that it was her
00:28:19.600 responsibility.
00:28:20.600 So therefore, no.
00:28:22.340 No.
00:28:22.580 Blah, blah, blah, blah, blah.
00:28:25.640 Blah, blah, blah, your reasons.
00:28:28.260 Nobody wants your reasons.
00:28:30.040 If you're a gun owner, or if you have a
00:28:31.820 gun in your hand, it's your
00:28:32.760 responsibility, period.
00:28:34.280 Period.
00:28:35.460 And nobody's going to listen to a
00:28:37.120 counterargument.
00:28:38.220 Because the moment you allow the
00:28:40.400 counterargument in, people get killed.
00:28:44.260 Right?
00:28:44.700 The moment you allow, well, there could
00:28:46.860 be an exception.
00:28:49.020 Bam.
00:28:49.500 Somebody's dead.
00:28:50.080 Because this was the exception.
00:28:52.580 This was exactly the exception.
00:28:55.420 This is why it always has to be 100%.
00:28:57.360 Because the moment you make the
00:28:59.300 exception, somebody gets shot.
00:29:01.000 Every time.
00:29:02.460 So you don't make that exception.
00:29:04.740 And it has nothing to do with legal
00:29:06.640 responsibility.
00:29:07.460 It's a different standard.
00:29:09.220 Legal responsibility is nuanced.
00:29:11.520 And, you know, you really have to figure out who did
00:29:13.840 what and blah, blah, blah.
00:29:15.160 But gun responsibility is irrational by design.
00:29:21.720 Right?
00:29:22.540 It's rational irrationality.
00:29:25.720 It's rational to be irrational on gun safety.
00:29:30.220 And by irrational, I mean, you don't even put your
00:29:33.200 finger on it unless you're going to do something
00:29:35.520 productive with it.
00:29:36.580 Right?
00:29:36.800 Like, this is how irrational gun safety is.
00:29:40.220 If your gun's on, you know, if this is your gun, you
00:29:43.120 don't even, you don't even do this unless you're going to
00:29:47.680 move it productively, like with a purpose.
00:29:50.860 You don't do anything with a gun.
00:29:53.780 Anything.
00:29:55.320 Unless you're doing it with purpose and safety.
00:29:57.940 Right?
00:29:58.860 So, anyway, people are not arguing the same argument.
00:30:01.520 They're pretending it's all responsibility is one thing,
00:30:04.000 or it's logical, or it's common sense.
00:30:06.120 It's none of those things.
00:30:07.020 And it shouldn't be.
00:30:08.220 As soon as you make that stuff common sense, people die.
00:30:11.540 So, there's a reason for it.
00:30:14.920 I'm going to tell you something, but I can't tell you why.
00:30:18.600 A lot of the big political stories you're hearing now,
00:30:21.360 you are completely misled on what's really behind them.
00:30:25.700 And I can't tell you which stories, and I can't tell you why.
00:30:28.620 But just trust me, I'm seeing into more windows than you see,
00:30:31.860 and I can't even frickin' believe the stuff I'm seeing.
00:30:34.700 You're never going to hear why, I don't think.
00:30:37.180 I don't think you'll ever know what I'm talking about.
00:30:40.180 But, oh my God, the biggest stories are just so completely wrong
00:30:45.740 in important ways.
00:30:48.420 Anyway, I'll just leave that there.
00:30:50.480 CNN keeps dumping on Facebook, and I'm loving it.
00:30:55.820 CNN's attacked from the left and the right.
00:30:57.660 So, CNN is attacking Facebook for not censoring enough.
00:31:05.020 And their examples would be, you know, January 6th, for example.
00:31:08.440 They didn't censor enough, so it caused people to organize and do bad stuff.
00:31:11.740 And then, of course, the right thinks they censor too much.
00:31:18.600 And you're not going to like this, but it probably means they're doing a good job.
00:31:23.780 I hate to say it.
00:31:28.060 I'm not a fan of Facebook, by the way.
00:31:29.740 I don't use it.
00:31:30.680 I think it should go away.
00:31:32.500 I just don't like anything about Facebook, honestly.
00:31:34.900 So, I'm not a fan.
00:31:37.000 But if you see that Facebook is being criticized for not being censorious enough,
00:31:45.100 at the same time they're being criticized for being too censorious.
00:31:48.800 Is that a word?
00:31:49.420 Censorious?
00:31:50.440 Censoring too much?
00:31:51.320 I would think that's kind of where you need to be.
00:31:55.520 How could they possibly have both sides happy about that?
00:31:59.400 Does anybody think there's something they could do that would make both sides happy?
00:32:04.380 I don't know that that's even an option.
00:32:06.740 Now, I'm not saying they're doing it the way I would do it or that it's flaw-free.
00:32:12.300 Clearly, there are flaws.
00:32:13.860 But it looks to be the kind of flaws that you would expect in a free system, right?
00:32:18.940 You know, civilization has flaws, but we like to be civilized.
00:32:23.780 Capitalism has flaws, but it's still pretty good compared to the alternatives.
00:32:29.640 Democracy and the republic and all that, lots and lots of flaws,
00:32:34.580 but nobody's figured out a better thing to do.
00:32:38.000 Facebook is starting to feel like democracy.
00:32:43.220 It's a terrible system, but we can't figure out a better one.
00:32:46.740 Now, I'm not supporting any decision they made,
00:32:50.200 and I'm not going to criticize any specific thing today.
00:32:53.220 I'm just saying that I wouldn't want one side to be happy with Facebook
00:32:58.940 and the other side to be unhappy.
00:33:01.000 Would you?
00:33:03.460 Tell me, would you want one side to be happy and the other side to be unhappy?
00:33:07.560 No matter which side was happy.
00:33:09.760 I wouldn't.
00:33:10.760 That feels like a very unhealthy situation.
00:33:13.100 The healthiest situation is exactly what we have.
00:33:16.740 Everybody's mad at Facebook all the time.
00:33:20.240 I hate to say it, but that's literally the healthiest situation.
00:33:24.220 Everybody mad at Facebook all the time.
00:33:28.680 All right.
00:33:33.000 So what else we got going on here?
00:33:34.900 There's a reason that I follow Michael Malice on Twitter.
00:33:40.820 If you don't follow Michael Malice, and he's at Michael Malice,
00:33:46.660 just his names put together.
00:33:50.240 Here's why.
00:33:51.980 I'm going to read you one of his tweets,
00:33:54.640 and you'll know why I follow him.
00:33:57.120 Okay?
00:33:57.580 Just one tweet.
00:33:59.500 He tweeted today, or it might have been yesterday,
00:34:02.020 I wonder when Alec Baldwin will resume shooting.
00:34:09.400 And that's why I follow Michael Malice.
00:34:13.700 Maybe you should, too.
00:34:15.340 Now, a little more explanation.
00:34:18.120 I've talked before how there are some humorists,
00:34:20.580 such as Norm MacDonald,
00:34:22.980 who, if you don't understand his humor,
00:34:25.240 you don't realize that the joke is always on the audience.
00:34:28.580 So sometimes he's, you know, just joking.
00:34:30.220 But lots of times the joke is about how the audience is responding.
00:34:34.720 Now, Dave Chappelle sometimes does the same thing.
00:34:37.680 Dave Chappelle is operating at that, you know, that high level.
00:34:41.520 So that he's telling jokes,
00:34:43.200 but also it's sort of about your reaction to the jokes, too.
00:34:47.420 Right?
00:34:48.500 And you have to understand that Michael Malice
00:34:51.720 is also about your reaction.
00:34:53.700 So don't automatically jump to imagine he's a monster
00:34:59.160 because he says things that get you wound up.
00:35:02.580 His act, if I could call it that,
00:35:05.320 his approach, maybe,
00:35:07.140 is he finds the worst thing you can say about every topic.
00:35:13.120 Whatever is the worst thing you can say.
00:35:15.520 And then he tweets it.
00:35:16.500 And if you can't appreciate, you know, the, I don't know,
00:35:21.800 I guess the edge that he brings to it,
00:35:24.740 you probably shouldn't follow him.
00:35:26.380 But if you get that he's doing it for the reaction,
00:35:29.620 then it's just a great show.
00:35:32.360 So I recommend him highly.
00:35:34.780 And by the way, you won't like it sometimes,
00:35:37.440 which is the point, right?
00:35:39.000 Sometimes it's really going to rattle your chain, too.
00:35:42.140 All right.
00:35:49.100 I've been accused credibly,
00:35:51.880 so here I'd like to agree with my critics.
00:35:54.800 Sometimes people say,
00:35:56.080 Scott, Scott, Scott, you can't admit when you're wrong.
00:36:00.420 I think I do.
00:36:01.760 I think I do, actually,
00:36:02.860 but maybe I'm not admitting when I'm wrong.
00:36:05.280 So I'll try to do it now.
00:36:07.600 And I'm going to agree with my critics
00:36:09.580 that although I've said over and over again
00:36:12.780 that, for example,
00:36:14.540 I don't care if you get vaccinations,
00:36:16.300 the way I talk about the pandemic
00:36:18.020 leads you to believe that I'm persuading you
00:36:21.760 or that I have a certainty
00:36:23.800 about things I don't have certainty about.
00:36:26.060 So I think my critics are completely right
00:36:28.080 that you can accidentally influence people
00:36:32.400 just by what you talk about most, right?
00:36:35.000 So if I happen to talk about things
00:36:37.400 that were pro-vaccine,
00:36:38.760 even if I'm not telling you to get it,
00:36:42.260 you would internalize that as,
00:36:43.900 well, he's saying pro-vaccine things.
00:36:46.460 How can you say you're not trying to convince me to get it?
00:36:48.900 You keep saying pro-vaccine things.
00:36:50.840 So as a public service,
00:36:53.360 because I know that it annoys a lot of you,
00:36:55.380 I'm going to argue the opposite of my opinion today,
00:36:59.100 the same way that, you know,
00:37:02.860 I make the mistake in the other direction.
00:37:02.860 So it won't be so much I'm making an argument
00:37:05.540 against my opinion
00:37:07.100 as much as I'm giving the other side of the story.
00:37:11.860 Is that fair?
00:37:13.220 So I'm going to concentrate on
00:37:15.300 something that's pro-ivermectin,
00:37:18.440 because usually I talk about the opposite.
00:37:20.920 So I'm going to tell you some things that are pro-ivermectin,
00:37:23.600 even though I'm almost always talking the opposite.
00:37:25.560 I'm not trying to convince you.
00:37:28.140 All right, nothing you hear now
00:37:29.680 will be to convince you it's good or bad or anything.
00:37:32.320 I'm just going to take the other side.
00:37:33.980 Now, this is a good exercise.
00:37:36.660 It's a good exercise
00:37:37.820 to see if you can argue the other point.
00:37:41.200 All right?
00:37:41.940 If you can't argue the other side's point,
00:37:45.460 you have to ask yourself why.
00:37:47.400 Is it because the other side is just whack?
00:37:49.800 Or because you're so cognitively biased
00:37:52.540 that you can't even say the words
00:37:55.560 that the other side would say?
00:37:58.220 So in the spirit of testing my own cognitive dissonance,
00:38:03.200 I'm going to show you an exercise
00:38:05.120 in which I speak the opposite of my opinion.
00:38:09.240 Okay?
00:38:10.180 And it'll be on the vaccines
00:38:11.380 and also on ivermectin,
00:38:13.540 I think one other thing.
00:38:15.080 I'm going to start with a little context
00:38:16.800 and we'll lead you into it.
00:38:18.820 Rasmussen has a poll about climate change
00:38:20.980 and asks people,
00:38:22.980 is climate change a crisis?
00:38:23.940 51% of the people said,
00:38:26.360 you know,
00:38:26.820 hell yes
00:38:27.780 or,
00:38:29.120 you know,
00:38:29.900 leaning yes.
00:38:31.300 So over half of the people
00:38:32.620 think climate change is a crisis.
00:38:34.760 And it breaks down exactly like you think.
00:38:36.580 75% of Democrats
00:38:37.800 and 31% of Republicans.
00:38:41.720 And interestingly,
00:38:43.080 the moderates are right in the middle,
00:38:44.360 which tells you something disturbing,
00:38:49.040 but that's another story.
00:38:52.000 And then Rasmussen also asked,
00:38:54.760 and I'm paraphrasing their questions;
00:38:56.460 they put the questions in better form
00:38:59.340 than I'm going to.
00:39:01.380 So just know that they know how to do questions right,
00:39:04.160 even if I say them wrong.
00:39:05.240 All right.
00:39:07.400 They asked,
00:39:08.620 will the country that goes the hardest at climate change
00:39:12.140 and green energy stuff
00:39:14.480 reap the economic and jobs benefits
00:39:17.480 of green energy?
00:39:18.640 In other words,
00:39:20.700 do people think that going hard against climate change
00:39:23.800 ends up being an economic positive?
00:39:25.620 And 56% said yes.
00:39:29.380 56% of the public
00:39:31.860 thinks that going hard at climate change
00:39:34.880 would be a positive economic thing.
00:39:37.040 38% disagree.
00:39:39.560 All right.
00:39:40.700 Again,
00:39:42.200 remember,
00:39:43.540 I'm just sort of giving you some views here.
00:39:47.440 So I'm going to do climate change
00:39:49.320 and then I'm going to do COVID as soon as I'm done.
00:39:52.240 And then do you agree with Biden's approach to climate?
00:39:54.380 48% supported it,
00:39:55.980 but 46% do not.
00:39:58.440 And 6% are unsure.
00:39:59.840 But I think the unsure
00:40:00.760 should be thrown in the do not support category.
00:40:04.200 That would be...
00:40:05.000 So pretty much a tie
00:40:06.680 between supporting Biden and not supporting him.
00:40:10.880 And as Michael Schellenberger is pointing out,
00:40:13.640 the world's green renewable experiment is over.
00:40:17.320 So Biden's climate bill is dead.
00:40:19.360 This is in a Michael Schellenberger tweet today.
00:40:22.180 Norway affirmed oil drilling.
00:40:23.680 So Norway is basically saying,
00:40:25.600 OK, green stuff isn't working.
00:40:26.940 We better drill.
00:40:28.100 Let's get busy drilling for oil.
00:40:30.280 And even Uganda says that solar and wind
00:40:32.900 cause poverty in Africa.
00:40:37.340 So Africa is giving up on green energy.
00:40:40.200 Norway,
00:40:41.560 in some ways,
00:40:42.520 the United States,
00:40:43.740 in a minor way,
00:40:44.660 at least the climate bill.
00:40:45.540 And Germany is having the same issue.
00:40:50.340 And I think the UK is looking at nuclear and stuff.
00:40:55.240 So basically,
00:40:56.260 everything you thought you knew
00:40:58.240 or people thought they knew
00:41:00.860 about green energy and its benefits
00:41:03.220 seems to be wrong.
00:41:05.360 So at the same time that Rasmussen is saying,
00:41:07.560 hey,
00:41:08.480 look at all these people
00:41:10.800 who think the climate is a crisis
00:41:12.400 and Biden is doing the right thing,
00:41:14.020 we're looking at countries all over the world
00:41:17.280 who also thought the climate was a crisis
00:41:19.860 and going green was the right thing.
00:41:21.980 And they've all learned that they were wrong
00:41:23.840 and they're reversing their decisions.
00:41:26.300 At the same time,
00:41:27.340 people are saying,
00:41:27.820 yeah,
00:41:27.980 Biden looks good.
00:41:29.140 The majority of people are saying,
00:41:30.600 yeah,
00:41:30.780 it looks good.
00:41:31.820 While the rest of the world
00:41:32.760 who already went there
00:41:33.980 is changing their mind.
00:41:37.660 Okay.
00:41:38.360 Also good.
00:41:39.860 All right.
00:41:40.260 I'm going to argue the other side
00:41:41.420 quickly for ivermectin and stuff.
00:41:44.320 Here's my question.
00:41:45.500 If you had a drug
00:41:46.560 that you knew worked
00:41:47.800 in a laboratory,
00:41:50.580 in a test tube,
00:41:51.820 and it kills the virus in a test tube,
00:41:54.020 why doesn't it work in people?
00:41:56.140 Now,
00:41:56.500 I know,
00:41:56.960 I know the reason you do the controlled trials
00:41:59.180 is because it's actually rare
00:42:01.140 for something that works in a test tube
00:42:03.600 to also work in a human
00:42:05.460 without hurting them.
00:42:07.920 Do we agree?
00:42:09.460 Everybody knows
00:42:10.400 that the test tube results
00:42:11.920 actually rarely work in people.
00:42:15.640 So you would expect
00:42:16.920 that just because it works in a test tube
00:42:18.400 doesn't really mean anything to people.
00:42:20.920 But,
00:42:22.040 suppose you had a drug,
00:42:23.680 and the reason I ask is, why is that?
00:42:26.180 Why can't you tell
00:42:27.160 that it will work in people?
00:42:28.180 What's the mechanism
00:42:29.900 that ruins it?
00:42:31.700 And we know it doesn't work,
00:42:33.040 but what's the mechanism?
00:42:34.480 And I heard
00:42:35.220 a number of smart people say,
00:42:37.660 well,
00:42:37.940 the body
00:42:38.500 rapidly breaks down chemicals.
00:42:41.060 So whatever you put into a body,
00:42:43.500 as soon as it interacts
00:42:44.400 with your,
00:42:44.860 you know,
00:42:45.080 your blood
00:42:45.540 and your hormones
00:42:46.320 and everything,
00:42:47.160 it might break down.
00:42:48.600 And so the effectiveness
00:42:49.460 you saw in the test tube
00:42:51.500 could get just destroyed
00:42:53.180 by the body.
00:42:54.820 Makes sense.
00:42:55.620 But,
00:42:56.960 suppose you have the special case
00:42:59.660 that the drug you're testing
00:43:01.800 is already known
00:43:03.620 not to break down in the body.
00:43:06.540 Let's take ivermectin
00:43:07.900 as my example.
00:43:09.460 Now remember,
00:43:11.200 I'm not promoting ivermectin
00:43:12.500 as working
00:43:13.560 because I don't have any idea.
00:43:15.120 I'm just arguing the other side
00:43:16.440 to see if I can do it.
00:43:18.500 Ivermectin,
00:43:19.000 we know,
00:43:19.340 doesn't break down in the body
00:43:20.700 so fast
00:43:22.180 that it stops working
00:43:23.060 for what it was invented for.
00:43:24.180 So we know
00:43:26.400 that the body
00:43:27.380 seems to keep
00:43:29.040 ivermectin intact
00:43:30.460 long enough
00:43:32.020 to work for
00:43:33.520 its original purpose,
00:43:35.320 you know,
00:43:35.900 as a dewormer
00:43:36.820 for horses,
00:43:37.460 for example,
00:43:38.160 but in humans as well,
00:43:39.940 originally created
00:43:40.880 for humans.
00:43:42.020 So if you know
00:43:43.100 it's not going to break down
00:43:44.400 and you know
00:43:45.900 it kills a virus
00:43:46.920 in a test tube,
00:43:50.020 And you know
00:43:51.140 what parts of the body
00:43:52.660 it gets into.
00:43:53.340 Presumably you could find out,
00:43:55.480 it wouldn't be hard,
00:43:56.640 I imagine,
00:43:57.620 to take some blood
00:43:58.300 from a person
00:43:58.960 and say,
00:43:59.600 okay,
00:44:00.480 there's the coronavirus,
00:44:02.220 it's right there
00:44:02.700 in the blood,
00:44:04.060 then you give them
00:44:04.640 some ivermectin
00:44:05.480 and you test their blood again
00:44:06.580 and you go,
00:44:06.920 okay,
00:44:07.740 their blood has ivermectin
00:44:09.140 and it also has
00:44:10.780 coronavirus.
00:44:13.040 Do they both exist
00:44:13.980 in the blood
00:44:14.460 at the same time?
00:44:16.660 And are you telling me
00:44:17.400 that if you put something
00:44:18.260 that doesn't break down,
00:44:20.400 because we know
00:44:20.860 it doesn't break down
00:44:21.660 for its other purposes,
00:44:23.340 that doesn't mean
00:44:24.420 it doesn't break down
00:44:25.140 for this purpose,
00:44:25.960 right?
00:44:26.200 It could break down
00:44:27.000 in a specific way
00:44:28.160 that luckily
00:44:29.600 doesn't matter
00:44:30.580 to one purpose
00:44:31.440 but matters to the other,
00:44:32.560 right?
00:44:32.740 So there's all kinds
00:44:33.400 of possibilities.
00:44:34.940 But,
00:44:36.080 wouldn't you expect
00:44:37.200 it to work
00:44:37.820 more often than not?
00:44:40.680 As long as
00:44:41.460 the drug itself
00:44:42.580 doesn't change
00:44:43.480 too much in the body
00:44:44.420 and it lasts
00:44:45.520 a little while.
00:44:46.960 If your blood,
00:44:48.960 let me ask
00:44:49.660 the question this way,
00:44:50.420 if I drew
00:44:51.440 your blood out
00:44:52.100 and it had
00:44:52.500 coronavirus in it,
00:44:54.560 so I've got
00:44:55.040 a little petri dish
00:44:56.880 full of blood,
00:44:58.080 and let's say
00:44:58.480 I put a Bunsen burner
00:44:59.860 under it so it stays
00:45:00.940 about the same temperature
00:45:01.840 as your body.
00:45:03.300 So now you've got
00:45:03.920 real blood
00:45:04.520 from a real person
00:45:05.320 with real virus in it.
00:45:08.140 If you drop
00:45:08.880 an ivermectin pill
00:45:09.820 in there,
00:45:10.620 I'm simplifying,
00:45:11.500 right?
00:45:11.860 It wouldn't be a pill.
00:45:12.560 So put some
00:45:13.580 ivermectin in there
00:45:14.500 and you just keep it
00:45:16.100 at 98.6 degrees Fahrenheit
00:45:17.200 and you check
00:45:18.740 in an hour,
00:45:20.360 is it going to have
00:45:21.180 the same amount
00:45:21.720 of virus?
00:45:24.360 What do you think?
00:45:25.740 If you tested
00:45:26.620 it outside the body
00:45:27.660 but it was a real
00:45:28.380 person's blood
00:45:29.500 with virus
00:45:30.080 and real hormones
00:45:30.960 and real everything
00:45:32.560 else that's in real
00:45:33.400 blood,
00:45:33.760 because it's real blood,
00:45:35.300 and you drop
00:45:36.160 the ivermectin in,
00:45:38.480 would the ivermectin
00:45:39.040 kill the virus
00:45:39.700 in the blood?
00:45:40.180 Now, I don't think
00:45:42.560 we test it in blood,
00:45:43.960 right?
00:45:45.620 Somebody give me
00:45:46.300 a fact check on this.
00:45:47.360 When they test it
00:45:48.120 in the laboratory,
00:45:48.740 they don't put it
00:45:49.260 in anything like blood,
00:45:50.460 do they?
00:45:50.860 They just put it
00:45:52.480 in a dish with the virus
00:45:54.220 and some kind of medium.
00:45:56.920 Am I right?
00:45:57.820 But shouldn't there
00:45:58.520 be a step where
00:45:59.560 they drop it
00:46:00.160 in human blood
00:46:00.920 and see if it makes
00:46:01.520 any difference
00:46:02.020 or it just immediately
00:46:02.900 breaks down?
00:46:04.980 So here's my question.
00:46:06.880 How in the world
00:46:07.680 do we not know
00:46:08.800 that ivermectin
00:46:09.740 works if it doesn't
00:46:12.540 break down
00:46:13.020 for its other purposes?
00:46:14.420 It doesn't give you
00:46:15.700 side effects.
00:46:16.420 Those are the things
00:46:16.920 we know.
00:46:18.420 It does kill
00:46:19.460 the virus in a laboratory.
00:46:22.040 Are you telling me
00:46:22.640 nobody ever did a test
00:46:23.800 where they dropped it
00:46:24.580 in a little pint of blood,
00:46:26.940 real blood,
00:46:27.420 to see if it made
00:46:27.960 a difference?
00:46:28.560 Nobody ever tested that?
00:46:30.200 Because if it did,
00:46:32.120 if you could neutralize
00:46:34.180 a virus in real blood,
00:46:36.520 let's say in an hour,
00:46:37.680 and there was less of it,
00:46:40.800 it doesn't have to be gone,
00:46:42.520 but would it be
00:46:43.320 noticeably less?
00:46:46.060 How could it not work?
00:46:49.360 Now,
00:46:50.700 is this a good argument?
00:46:54.480 Is my argument good?
00:46:56.020 Now,
00:46:58.980 remember,
00:46:59.340 I'm not giving you certainty.
00:47:01.440 There's no certainty
00:47:02.380 involved here.
00:47:03.560 I'm just saying
00:47:04.360 that I would love
00:47:06.320 to see an explanation.
00:47:08.720 Now,
00:47:09.360 if you're saying to yourself,
00:47:10.760 Scott,
00:47:11.180 this is why we run
00:47:12.440 randomized controlled trials,
00:47:14.700 that's exactly your problem,
00:47:16.640 because you can't tell
00:47:17.820 that the laboratory
00:47:19.080 extends to the real world.
00:47:21.420 But are you telling me
00:47:22.480 if you knew the drug
00:47:23.940 at the level given,
00:47:27.000 let's say we knew
00:47:28.040 that the drug
00:47:28.580 was going to be administered
00:47:29.520 at the same level
00:47:30.300 it is for other stuff,
00:47:31.980 and we know
00:47:32.480 it doesn't have side effects,
00:47:34.020 and we know
00:47:34.680 it doesn't break down
00:47:35.680 quickly,
00:47:36.800 because it works
00:47:37.680 for something else.
00:47:38.660 It doesn't break down
00:47:39.380 too fast for that.
00:47:41.480 Are you telling me
00:47:42.280 that you couldn't test
00:47:43.300 that in blood
00:47:44.300 in a laboratory
00:47:47.000 and really,
00:47:48.620 really be close
00:47:49.840 to knowing
00:47:50.320 if it worked?
00:47:50.960 Again,
00:47:51.600 no certainty.
00:47:52.500 That's why you do
00:47:53.040 the randomized controlled trials.
00:47:55.840 All right.
00:47:57.540 Now,
00:47:57.920 keep in mind
00:47:58.520 what I'm doing here.
00:48:00.160 I'm not promoting
00:48:01.200 ivermectin
00:48:02.400 or talking against it.
00:48:03.920 I'm saying that
00:48:04.920 I haven't spent enough time
00:48:06.460 talking about
00:48:07.920 the possibility
00:48:08.720 it would work.
00:48:10.160 The possibility.
00:48:11.720 Now,
00:48:12.160 many of you
00:48:12.600 have pointed out
00:48:13.300 ad nauseam,
00:48:14.580 follow the money,
00:48:15.460 and it's that
00:48:16.000 big pharma
00:48:17.260 doesn't want
00:48:17.840 the cheap thing to work
00:48:18.840 because it's off patent,
00:48:20.140 blah, blah, blah, blah, blah.
00:48:21.740 Is it possible
00:48:23.080 that big pharma
00:48:24.980 could block
00:48:27.340 ivermectin
00:48:28.960 in every way
00:48:30.160 that people are looking at it?
00:48:31.820 Because it'd be pretty massive.
00:48:34.420 Do you think
00:48:35.020 big pharma
00:48:35.640 could influence
00:48:37.460 like everybody
00:48:38.560 everywhere in the world,
00:48:39.640 like all the governments
00:48:40.820 and all the experts?
00:48:41.620 Yeah, they could.
00:48:46.180 Yeah, they could.
00:48:47.580 If they can find
00:48:48.560 a choke point.
00:48:50.140 Now, I doubt
00:48:50.900 they could do it retail,
00:48:52.320 meaning go to every person
00:48:53.560 and bribe every person
00:48:54.760 and, you know,
00:48:55.260 get to them individually
00:48:56.240 or something like that.
00:48:57.120 So you can't get
00:48:57.800 to them individually.
00:48:59.240 But who controls
00:49:00.160 the data?
00:49:02.360 Who controls the data?
00:49:03.640 Where does it come from?
00:49:07.660 Because if they control
00:49:08.960 the data,
00:49:10.280 then yeah,
00:49:11.140 they can control
00:49:12.120 what everybody
00:49:12.960 thinks of it.
00:49:14.420 Now,
00:49:15.200 do the big pharma companies,
00:49:16.420 do they get involved
00:49:17.960 in the trials?
00:49:19.660 Who is it
00:49:20.200 who's funding
00:49:20.940 the trials of ivermectin?
00:49:23.460 Do you know?
00:49:25.160 I don't know.
00:49:26.980 Suppose they're funded,
00:49:28.140 but maybe some money
00:49:29.120 went from pharma,
00:49:30.420 to some middleman,
00:49:31.540 to some other middleman,
00:49:33.040 and then to the study.
00:49:35.120 Would you know it?
00:49:36.360 I wouldn't know it.
00:49:37.900 I mean,
00:49:38.140 does anybody know
00:49:38.740 who funds studies?
00:49:39.760 I'm not even sure.
00:49:40.300 Is that always reported?
00:49:41.360 I don't even know
00:49:41.780 if it's always reported.
00:49:43.600 But it's possible.
00:49:46.420 So part of my argument
00:49:48.440 had been,
00:49:49.140 you know,
00:49:49.380 there's no way
00:49:49.980 this conspiracy theory
00:49:51.180 could possibly be true
00:49:53.120 because too many people
00:49:54.400 would have noticed
00:49:55.480 it did work.
00:49:57.100 Too many countries
00:49:57.880 would have run a trial
00:49:58.740 where it did work,
00:49:59.680 and then it would be
00:50:00.600 working in the real world
00:50:01.640 and everybody
00:50:02.360 would notice.
00:50:03.620 And so I've argued
00:50:04.360 that it would be impossible
00:50:05.680 to conceal
00:50:08.560 such a mass conspiracy
00:50:10.240 that affects
00:50:11.340 so many people
00:50:11.980 and is such a big,
00:50:13.360 you know,
00:50:13.580 dollar and life
00:50:14.640 and death
00:50:15.180 and everything else.
00:50:16.420 Unless
00:50:17.220 they control the data.
00:50:20.240 And they could.
00:50:22.000 So I'm not saying they do.
00:50:23.460 I'm saying you can't
00:50:24.400 rule out the possibility
00:50:25.480 that the big pharma
00:50:26.640 controls enough
00:50:28.120 of the trials.
00:50:28.920 They wouldn't have to
00:50:29.440 control all the trials.
00:50:30.760 They would only have to
00:50:31.740 control the big ones.
00:50:34.040 Or put people in charge
00:50:35.580 who don't know
00:50:36.100 how to do trials.
00:50:37.240 There's probably
00:50:37.860 a million ways
00:50:38.520 you can ruin data
00:50:39.660 or manage it.
00:50:41.300 All right.
00:50:42.900 So that's my
00:50:45.220 pro-ivermectin argument.
00:50:47.840 And again,
00:50:48.280 I'm not pro-ivermectin.
00:50:49.640 If I had to guess,
00:50:51.460 I'm going to guess
00:50:52.300 against it being
00:50:53.300 a big deal.
00:50:55.040 Meaning maybe
00:50:55.700 it makes some difference,
00:50:56.580 but I doubt
00:50:57.440 it would change
00:50:58.240 the pandemic.
00:50:59.700 So that's where
00:51:00.440 my opinion is.
00:51:01.140 But I haven't spent
00:51:01.660 enough time talking
00:51:02.340 on the other side,
00:51:02.960 so there you go.
00:51:03.940 All right.
00:51:04.160 How about on vaccinations?
00:51:05.260 I tweeted out earlier.
00:51:08.280 I've got to run
00:51:08.660 in a minute.
00:51:11.060 Graphs from,
00:51:12.520 who was it?
00:51:13.800 Jason Lewis,
00:51:15.160 who is a scientist,
00:51:17.080 a materials scientist,
00:51:18.560 but he apparently
00:51:19.060 has experience
00:51:19.760 as a medical data analyst.
00:51:23.260 So he's somebody
00:51:23.940 who's good at data systems
00:51:25.400 and analytics,
00:51:26.600 and he's a scientist.
00:51:28.000 Not the right kind
00:51:29.160 for virology.
00:51:30.380 But he does do
00:51:31.460 medical data analysis.
00:51:32.900 All right.
00:51:33.040 So he knows data,
00:51:34.000 and he knows where
00:51:35.040 to get it,
00:51:35.760 and he knows
00:51:36.120 how to look at it right.
00:51:37.220 So it's sort of
00:51:38.080 the right person, right?
00:51:40.040 And he did studies
00:51:41.240 of vaccinations
00:51:42.300 to see if vaccinations
00:51:43.560 work,
00:51:44.060 and sure enough,
00:51:45.000 there's like a
00:51:45.800 totally solid correlation.
00:51:48.160 The more you're vaccinated,
00:51:49.640 the fewer problems you have.
00:51:53.500 Correlation is unmistakable.
00:51:55.560 You can't miss it.
00:51:56.860 Very, very clear correlation.
00:51:59.380 Vaccination, boom,
00:52:01.040 lower problems.
00:52:01.980 But turns out
00:52:06.860 there's a little bit
00:52:08.280 of a surprise involved
00:52:09.580 in the data,
00:52:10.580 which is that the infections
00:52:12.760 turned down
00:52:13.480 before the vaccinations
00:52:14.880 pretty reliably,
00:52:18.780 which means,
00:52:20.720 and this is just speculation
00:52:22.220 from Jason,
00:52:24.100 the vaccinations
00:52:25.240 might not be
00:52:26.200 the cause of the decline
00:52:27.540 because the decline
00:52:28.920 starts well before
00:52:29.880 the vaccinations
00:52:30.660 and just continues.
00:52:33.560 And he suggests that
00:52:34.540 by the time
00:52:35.660 you've got a lot
00:52:36.340 of problems,
00:52:37.720 you've already masked up
00:52:39.620 and done social isolation.
00:52:43.280 Remember,
00:52:43.820 I'm not arguing
00:52:44.380 my point of view, right?
00:52:46.260 If you're joining late,
00:52:47.760 this is not my point of view.
00:52:49.080 I'm just trying
00:52:49.600 to give the opposite
00:52:50.380 point of view
00:52:50.920 of what I usually do
00:52:51.800 to show that I can do it, okay?
00:52:53.840 So he's got a suggestion
00:52:58.000 that the vaccinations
00:53:00.120 might not make any difference
00:53:01.460 and the masking
00:53:03.340 and the social distancing
00:53:04.640 might make a lot of difference.
00:53:06.480 Not concluding,
00:53:08.300 that's not a conclusion,
00:53:10.060 it's just that somebody
00:53:10.740 who's really good
00:53:11.380 at looking at the numbers
00:53:12.180 said the numbers
00:53:13.980 don't explain
00:53:14.980 why the dip
00:53:16.860 comes well
00:53:18.280 before the vaccinations.
00:53:19.280 So there is
00:53:23.360 at least one person
00:53:24.540 who's good with data
00:53:25.520 who says
00:53:26.180 the data
00:53:27.120 is,
00:53:28.340 I think he used
00:53:29.100 the word
00:53:29.520 agnostic
00:53:31.000 about whether
00:53:32.660 the vaccinations work.
00:53:35.360 Think about that.
00:53:37.440 That here's somebody
00:53:38.120 who's really good
00:53:38.860 with data
00:53:39.360 and he's using
00:53:40.100 public data
00:53:40.900 so you can check
00:53:42.080 his sources, I guess,
00:53:43.360 and he says
00:53:44.260 it doesn't
00:53:45.560 make the case.
00:53:47.480 It doesn't
00:53:48.280 not make the case.
00:53:50.140 But it doesn't
00:53:50.980 make the case
00:53:51.600 because the dip
00:53:52.540 starts before
00:53:53.160 the vaccinations.
00:53:55.780 All right?
00:53:56.540 Now,
00:53:57.620 again,
00:53:58.500 personally,
00:53:59.320 I think,
00:54:00.380 if I had to guess,
00:54:01.420 I think there's
00:54:01.920 slightly more chance,
00:54:03.420 or maybe a lot more chance,
00:54:04.640 that the vaccinations
00:54:05.720 work.
00:54:07.260 I got vaccinated.
00:54:08.720 I don't recommend
00:54:09.420 you do
00:54:09.880 because
00:54:11.580 there is some doubt,
00:54:13.400 right?
00:54:14.240 If I told you
00:54:15.060 I'm certain
00:54:15.740 I made the right decision,
00:54:16.780 I'd just be an idiot.
00:54:18.360 If I told you
00:54:19.020 I was certain
00:54:19.420 you made the right decision,
00:54:21.000 I'd be a bigger idiot.
00:54:22.680 Nobody could be certain
00:54:23.580 about this stuff.
00:54:24.960 So,
00:54:25.320 just for balance,
00:54:26.860 I show you
00:54:27.320 the argument
00:54:27.700 on the other side.
00:54:29.300 Now,
00:54:29.760 since you also
00:54:30.500 don't believe,
00:54:31.220 many of you,
00:54:31.720 that masks
00:54:32.140 and distancing work,
00:54:33.080 you probably don't
00:54:33.600 buy his alternative
00:54:34.620 explanation,
00:54:35.820 but it does suggest
00:54:36.640 that there's something
00:54:37.260 in the data
00:54:37.680 we don't understand.
00:54:38.840 That's probably
00:54:39.200 the only thing
00:54:39.660 you can conclude.
00:54:41.080 The only thing
00:54:41.660 you know for sure
00:54:42.220 is there's something
00:54:42.720 you don't understand.
00:54:44.380 Not necessarily
00:54:45.260 that vaccinations
00:54:46.040 are not going
00:54:47.620 to work.
00:54:48.640 All right.
00:54:50.380 How many
00:54:51.100 anecdotal reports
00:54:52.660 have you seen
00:54:53.200 of somebody
00:54:53.680 who forgot
00:54:55.160 to get vaccinated
00:54:56.020 or didn't get vaccinated
00:54:57.120 and they died?
00:54:58.560 A lot, right?
00:54:59.700 CNN used to do
00:55:00.500 one every day.
00:55:01.980 Every day.
00:55:02.820 Didn't get vaccinated.
00:55:03.820 I'm dying.
00:55:04.380 Didn't get vaccinated.
00:55:05.240 I wish I had.
00:55:05.940 Didn't get vaccinated.
00:55:06.700 Where are all
00:55:08.660 the anecdotal reports
00:55:10.200 of people
00:55:11.540 who took
00:55:12.020 ivermectin early
00:55:13.160 when they first
00:55:14.260 got symptoms
00:55:14.860 and died anyway?
00:55:16.840 Not horse dewormer
00:55:17.980 and not overdosing
00:55:19.400 on horse dewormer.
00:55:20.940 But where are the reports
00:55:22.020 of even one person,
00:55:24.160 just one,
00:55:25.680 just one,
00:55:26.960 who took
00:55:27.680 ivermectin early
00:55:28.680 and then died anyway
00:55:31.160 and didn't take
00:55:32.200 anything else,
00:55:32.840 just ivermectin
00:55:33.600 and died anyway?
00:55:36.240 Not one?
00:55:37.720 Because CNN
00:55:38.480 is pushing pretty hard
00:55:39.500 on this anecdotal
00:55:40.440 persuasion.
00:55:41.900 They can't find
00:55:42.380 one person
00:55:43.080 who took
00:55:44.620 the ivermectin
00:55:45.420 and then died anyway.
00:55:48.420 Now,
00:55:49.520 again,
00:55:50.440 I'll remind you
00:55:51.120 if you're coming here late,
00:55:52.340 I'm not pro-ivermectin.
00:55:54.960 My personal opinion
00:55:56.260 is that we'll probably
00:55:57.400 find out it's weak
00:55:58.400 or doesn't make
00:55:59.280 much difference.
00:56:01.360 But,
00:56:02.400 isn't it a fair question
00:56:04.500 why we haven't seen
00:56:05.300 one example
00:56:06.140 of what the news
00:56:07.700 tells us is true?
00:56:09.500 The news tells us
00:56:10.280 it doesn't work.
00:56:11.920 Probably there are,
00:56:12.880 I don't know,
00:56:13.260 tens of thousands
00:56:13.920 of people
00:56:14.380 using it anyway.
00:56:16.460 You've got a lot
00:56:17.340 of people
00:56:17.760 that you can look at
00:56:19.480 and ask yourself,
00:56:20.280 all right,
00:56:20.600 did any of them die?
00:56:22.380 Now,
00:56:22.860 you can say,
00:56:23.380 okay,
00:56:23.560 that's what the trials are for.
00:56:26.160 The trials
00:56:26.780 sort of sum it up,
00:56:28.400 but where's
00:56:28.840 that one person?
00:56:31.100 I get that the trials
00:56:32.380 show that,
00:56:32.960 you know,
00:56:33.320 some people died
00:56:34.140 even on ivermectin.
00:56:35.440 That must be true.
00:56:36.480 But where's
00:56:38.260 the anecdotal story?
00:56:39.580 Because you know
00:56:40.220 they want to give it to you
00:56:41.180 because they want you
00:56:42.240 to hear that anecdotal story.
00:56:43.580 Where is it?
00:56:45.760 All right.
00:56:47.660 I also ask the question,
00:56:48.900 is it unethical
00:56:49.600 to test a new therapeutic?
00:56:52.040 Because if we know
00:56:53.200 Regeneron works,
00:56:54.220 which we do,
00:56:55.580 how can you not give it
00:56:56.860 to somebody
00:56:57.280 who has COVID?
00:56:58.980 How can you give somebody
00:57:00.260 some new therapeutic
00:57:01.500 that you're testing
00:57:02.340 without also
00:57:03.800 giving them Regeneron?
00:57:06.680 Because if you don't
00:57:07.460 give them Regeneron,
00:57:09.060 they might die.
00:57:10.940 So ethically,
00:57:12.060 can you do a test
00:57:13.440 of another therapeutic
00:57:14.420 once you have one
00:57:15.180 that works?
00:57:16.500 I don't know
00:57:16.860 that you can.
00:57:19.180 All right.
00:57:20.080 Adam Dopamine
00:57:20.860 had an interesting tweet.
00:57:23.500 Listen to this tweet
00:57:24.500 all the way
00:57:24.940 to the last word.
00:57:27.060 Listen to it
00:57:27.720 all the way
00:57:28.220 to the last word.
00:57:30.720 Adam says,
00:57:31.480 the dog not barking.
00:57:34.180 It's been 10 weeks
00:57:34.980 since Pfizer
00:57:35.600 was fully FDA approved
00:57:37.100 to much fanfare.
00:57:38.980 Moderna still not approved,
00:57:41.320 even though
00:57:41.860 the emergency authorization
00:57:43.980 was granted
00:57:45.440 only one week apart
00:57:46.540 from Pfizer.
00:57:48.360 Really raises concern
00:57:49.720 that there is something
00:57:51.100 wrong with Moderna's
00:57:52.680 lobbyists.
00:57:56.120 Good tweet.
00:57:58.120 There's something wrong
00:57:59.140 with Moderna's
00:58:00.120 lobbyists.
00:58:01.480 That is a well-constructed tweet.
00:58:05.760 Ladies and gentlemen,
00:58:06.540 this is my show.
00:58:07.660 I must go now.
00:58:08.940 I hope that I have satisfied
00:58:10.320 some of my critics
00:58:11.720 in the limited sense
00:58:13.680 that I can give you
00:58:15.240 an argument
00:58:15.640 for the other side.
00:58:17.080 Doesn't mean it's right,
00:58:19.100 but I think I gave you
00:58:20.240 an argument that at least
00:58:21.280 has a little bit of weight
00:58:22.480 on the other side.
00:58:23.680 Wouldn't you agree?
00:58:25.100 Good.
00:58:25.640 Thank you.
00:58:26.080 Some of you were liking it.
00:58:27.100 I've got to run.
00:58:27.740 I wish I could stay longer,
00:58:28.840 but it's been a treat
00:58:30.060 to be with you.
00:58:31.320 Thank you.
00:58:31.780 Thank you.