Real Coffee with Scott Adams - March 23, 2021


Episode 1322 Scott Adams: Let's Talk About the Bad Arguments in the News About the Second Amendment, Vaccine Safety


Episode Stats

Length

1 hour and 18 minutes

Words per Minute

141.7

Word Count

11,172

Sentence Count

700

Misogynist Sentences

12

Hate Speech Sentences

25


Summary

Trump is back in the news, and there's a new app that can tell you how much brain damage you're giving yourself based on how much of your time on social media you spend on the left versus the right.


Transcript

00:00:00.000 Bum, bum, bum, bum.
00:00:02.020 Bum, bum, bum.
00:00:04.880 Hey, everybody.
00:00:06.200 Come on in.
00:00:06.960 Come on in.
00:00:08.020 Now, if I were to ask you
00:00:10.160 what would be the best moment of your entire life,
00:00:16.960 well, it'd be hard to come up with that answer, maybe.
00:00:19.940 You'd be thinking about the birth of your children,
00:00:22.320 maybe a big promotion you got,
00:00:24.720 maybe the day you got married.
00:00:26.640 But no, the best day of your life
00:00:29.640 is right now.
00:00:32.800 And if you'd like to make this even better,
00:00:36.500 I know it's hard to believe,
00:00:38.660 but you can, just a little bit better.
00:00:40.920 All you need is a cup or a mug or a glass,
a tank or chalice or stein,
a canteen, jug, or flask,
00:00:45.480 a vessel of any kind.
00:00:48.000 Fill it with your favorite liquid.
00:00:49.560 I like coffee.
00:00:51.580 And join me now for the unparalleled pleasure,
00:00:54.840 the dopamine hit of the day,
the thing that makes everything,
everything better.
00:01:01.460 It's called the simultaneous sip,
00:01:02.780 and you're about to experience it,
00:01:05.140 and you're lucky if you are.
00:01:07.500 And it happens now.
00:01:08.640 Go!
00:01:08.840 Oh, yeah.
00:01:14.740 Oh, yeah.
00:01:15.820 Ha, ha, ha, ha, ha, ha.
00:01:19.400 Well, Trump is back in the news,
00:01:22.280 so everything's looking a little brighter now.
00:01:25.960 I guess he had some long interview yesterday
00:01:28.780 with Fox News,
00:01:30.480 and he was on Newsmax,
00:01:31.980 and blada, blada.
00:01:33.620 So it looks like we'll have a steady diet of Trump.
00:01:38.780 And so far, he's not making a lot of news.
00:01:41.800 He's just complaining about Biden,
00:01:44.040 which makes a little bit of news.
00:01:47.540 But apparently the scoop is
00:01:52.080 that Trump will be returning
00:01:53.960 with some kind of a platform,
00:01:56.260 some kind of a social media,
00:01:58.400 or maybe it's a news platform.
00:02:00.380 We don't know what kind of platform it is,
00:02:02.840 but he's going to be coming at us
00:02:04.380 with a platform.
00:02:06.520 If I were him,
00:02:07.840 this is how I would have done it.
00:02:09.360 I would have tried to get investors
00:02:12.660 to do a roll-up,
00:02:14.880 which is combining,
00:02:16.160 let's say,
00:02:17.680 Rumble and Parler
00:02:19.840 and some of the other properties
00:02:23.040 that try to compete with the major brands.
00:02:27.180 And I'd make it an alternative internet, basically.
00:02:30.520 But we'll see what he does.
00:02:33.640 Here is the coolest thing ever.
00:02:37.280 All right, I may have oversold this a little bit.
00:02:41.600 Back it up, back it up.
00:02:42.980 It's a cool thing.
00:02:44.900 You heard about how scientists say
00:02:47.500 that if you consume too much news from one source,
either all leaning left or all leaning right,
00:02:55.800 that it will cause brain damage.
00:02:58.940 Brain damage.
00:03:01.120 And they don't use that word lightly.
00:03:03.700 If you only follow the left
00:03:05.580 or you only follow the right,
00:03:06.960 you actually get brain damage.
00:03:08.940 Now, wouldn't you like to know
00:03:10.200 how much brain damage you're getting?
00:03:14.020 Because nobody is perfect, right?
00:03:17.200 Nobody is perfectly balanced.
00:03:19.500 Wouldn't you like to know?
00:03:20.920 Well, it turns out that there's a brand new tool
00:03:23.360 that I just tweeted.
00:03:25.260 So you can just go to my Twitter account.
00:03:27.160 It's right at the top at the moment.
00:03:28.560 And you've heard of ground news.
00:03:33.180 Ground news.
00:03:34.760 They tell you how much of...
00:03:37.380 They tell you which news sites,
00:03:39.720 left or right, are covering which stories.
00:03:42.740 So they'll often hilariously post something
00:03:45.400 that says everybody on the left is talking about it
00:03:48.560 and literally no one on the right.
00:03:50.620 Or vice versa.
00:03:51.760 So that's very illuminating.
00:03:54.580 But they came out with a new little tool,
00:03:56.360 a new app,
00:03:57.280 in which you can put in the name of any Twitter user,
00:04:01.380 including yourself,
00:04:02.740 and it will tell you,
00:04:05.040 based on your Twitter activity
00:04:06.440 and who you're following and tweeting, I guess,
00:04:09.300 how biased your followings are.
00:04:14.380 So if you're interacting primarily with the left
00:04:17.680 or primarily with the right,
00:04:19.720 science says brain damage.
00:04:23.300 Brain damage.
00:04:25.280 And so you can find out how much brain damage
00:04:27.120 you're giving yourself with this app.
00:04:31.040 Is that about the coolest thing
00:04:32.660 you've ever heard in your life?
00:04:34.960 So ground news, congratulations.
00:04:37.620 So I just ran my own through it.
00:04:40.900 And I came out relatively balanced.
00:04:44.740 I definitely lean a little bit more right.
00:04:47.600 The people I interact with
00:04:49.060 are more likely to be on the right.
00:04:51.180 But the bottom line for me
00:04:53.600 is that I follow enough news on the left
00:04:56.220 and enough people
00:04:57.100 and I interact with them enough
00:04:59.080 that I am not giving myself brain damage.
00:05:04.340 How about that?
00:05:05.580 Yay for me.
00:05:06.980 Some of you may get a different result.
00:05:09.940 And it'll be even more interesting
00:05:11.160 if you put other people in there.
00:05:12.640 So let's say somebody's arguing with you on Twitter
00:05:15.980 and they're just being a moron.
00:05:18.880 Stick them into the app
00:05:20.180 and find out if they have brain damage.
00:05:25.440 Do you see why I'm so excited about this?
00:05:30.340 Yeah, look in my...
00:05:31.540 Somebody's asking for the link.
00:05:33.200 So I just tweeted it
00:05:34.420 just a moment before I came on.
00:05:35.960 So you could go to...
00:05:37.820 Just search for ground news.
00:05:41.040 Ground like the ground you're standing on.
00:05:43.600 And you can follow them.
00:05:45.520 And you should.
00:05:46.300 Or you can go to my Twitter account
00:05:47.820 and it's the one I just tweeted this morning.
00:05:50.240 So you'll see it right at the top.
00:05:52.380 So you can actually have a Twitter debate with somebody
00:05:54.600 and send to them the result
00:05:58.000 of how brain damaged they are.
00:05:59.940 Could anything be better?
00:06:03.100 Is there anything you've waited for
00:06:06.080 that would make you happier than this?
00:06:09.060 Seriously.
00:06:10.040 If it looks like I'm just like dancing in my seat,
00:06:13.600 it's because nothing could make me happier
00:06:15.700 than being able to prove to somebody...
00:06:19.100 I mean, I don't know how much science you want to say this is,
00:06:22.080 but certainly it would be entertaining,
00:06:25.900 even if not strictly scientific,
00:06:29.200 to say, you know, according to this app,
00:06:32.700 the things you follow and the things you interact with
00:06:35.660 strongly suggest according to science.
00:06:39.780 And you do follow science, right?
00:06:41.940 Do you love science?
00:06:43.480 Do you?
00:06:44.360 Do you love it?
00:06:46.040 Because I love it.
00:06:48.620 If you love science,
00:06:50.000 here's the evidence that you've got brain damage.
00:06:53.500 All right.
00:06:55.340 Of course, there's yet another mass shooting.
00:06:59.000 Apparently there have been seven mass shootings
00:07:01.260 in the last seven days.
00:07:04.000 I'm seeing a pattern develop here.
00:07:07.640 Now, some of the mass shootings
00:07:09.280 are more like inner city situations
00:07:12.540 where things got out of control, I think,
00:07:15.460 as opposed to the kind where somebody gets a gun
00:07:17.740 and plans it and, you know,
00:07:19.240 goes somewhere just to shoot people.
00:07:22.680 But seven of them in seven days?
00:07:26.100 You know, even if you're pro-Second Amendment,
00:07:28.940 as I am,
00:07:30.040 I'm very pro-Second Amendment,
00:07:32.420 you've got to worry about seven shootings
00:07:34.260 in seven days,
00:07:35.140 seven mass shootings.
00:07:37.640 We learned today that the Denver shooting,
00:07:40.740 in which ten people tragically died,
00:07:42.760 including a police officer,
we're told that the suspect is male.
00:07:48.780 Well, I didn't see that coming.
00:07:51.380 Is anybody else as surprised as I am
00:07:53.380 that this mass shooter was male?
00:07:57.940 What are the odds of that?
00:08:02.500 But interestingly,
00:08:04.420 the ethnicity of the shooter
00:08:06.220 has not been released, has it?
00:08:10.100 Yeah.
00:08:10.780 If I'm going to take a guess,
00:08:13.320 probably a youngish white guy.
00:08:18.080 Just going to put it out there.
00:08:20.440 Because if you're young and white,
00:08:22.840 there's a pretty good chance
00:08:24.500 you're in the same category
00:08:25.620 with a lot of mass shooters.
00:08:27.380 Now, obviously,
00:08:28.080 there's a lot of, you know,
00:08:28.940 urban shootings that are,
00:08:30.820 they have a different character to them
00:08:32.420 than more, you know,
00:08:33.640 crime-related or gang-related
00:08:35.220 or, you know,
00:08:36.580 a party got out of control.
00:08:37.720 But when somebody actually plans it,
00:08:41.020 the odds of them being a white male
00:08:43.240 with an AR-15,
00:08:45.020 pretty, pretty high.
00:08:47.080 Pretty high.
00:08:50.340 Why is it the AR-15s are used
00:08:53.520 in so many of these mass shootings?
00:08:56.180 I just saw a list this morning
00:08:57.500 of how often they're used.
00:08:59.400 Pretty often.
00:09:00.940 Pretty often.
00:09:01.900 So why is it that the AR-15
00:09:03.880 is the most used weapon,
00:09:06.760 handguns being second?
00:09:09.260 Now, you say to yourself,
00:09:11.560 well, the reason it's being used
00:09:13.600 is because it's the best weapon
00:09:16.500 that's easily available.
00:09:18.220 Best in terms of killing power.
00:09:20.960 But that's not exactly true.
00:09:22.300 So, I think most gun,
00:09:25.220 you know, gun experts
00:09:26.620 or gun hobbyists will tell you
00:09:28.500 that if you went into
00:09:30.360 exactly those same situations
00:09:32.500 with the right, you know,
00:09:35.000 automatic handguns,
00:09:36.880 you would do the same amount of damage.
00:09:40.020 Right?
00:09:40.880 Because if it's a crowded space,
00:09:43.060 the handgun is going to do
00:09:44.500 just as much damage.
00:09:45.400 So, you're not really getting
00:09:48.040 that much extra killing power
00:09:50.600 from the AR-15.
00:09:53.160 And by the way,
00:09:54.280 if there's some gun,
00:09:55.740 I know there's lots of gun enthusiasts
00:09:58.120 watching this.
00:09:59.840 So, check my math on that, right?
00:10:01.960 Now, here's the first thing we learned.
00:10:04.660 None of those mass shootings
00:10:06.220 involved a fully automatic weapon.
00:10:08.400 None of them involved
00:10:12.520 a fully automatic weapon.
00:10:16.640 Why is that?
00:10:18.580 Yeah, I meant semi-automatic, sorry.
00:10:20.880 Gun people are correcting me as we go.
00:10:23.220 So, I often use the wrong gun terms
00:10:25.520 because guns are not exactly my hobby.
00:10:28.940 I'm pro-Second Amendment,
00:10:30.380 but, you know,
00:10:31.000 I don't make a hobby of it.
00:10:32.960 So, yeah,
00:10:35.180 why is it that fully automatics
00:10:37.380 were not used?
00:10:38.400 Because they're illegal.
00:10:41.480 Well, not illegal.
00:10:42.660 You can own a fully automatic,
00:10:44.140 but it's harder.
00:10:45.300 It's just more difficult.
00:10:47.140 Yeah.
00:10:47.900 So,
00:10:49.440 doesn't this give you
00:10:51.780 a lot of evidence,
00:10:53.180 you know,
00:10:53.400 short of a scientific proof,
00:10:55.100 but doesn't this strongly suggest
00:10:57.200 that banning fully automatic weapons
00:11:00.600 may have worked?
00:11:04.940 In the sense that
00:11:06.120 nobody used an automatic weapon.
00:11:08.360 But, like I said earlier,
00:11:10.140 maybe the total number of people killed
00:11:11.920 ends up being about the same.
Because a person with handguns
00:11:15.580 in a public place
00:11:17.420 is going to do just as much damage
00:11:19.240 no matter what kind of weapon you have.
00:11:21.660 You know,
00:11:21.780 maybe there's a little faster firing
00:11:23.240 with the...
00:11:24.280 Because my understanding is that
00:11:25.700 even if you had an automatic weapon,
00:11:28.380 unless you were shooting
00:11:29.660 into a crowd at close range,
00:11:31.760 you'd still rather take a shot at a time.
Gun people,
00:11:36.720 confirm this for me,
00:11:38.800 that if your goal was to kill a bunch of people,
00:11:42.900 and unless they were clustered all together,
00:11:45.500 you know,
00:11:45.780 as long as there's a little separation,
00:11:47.940 your best technique
00:11:49.620 for getting the most people dead
00:11:52.160 would be one shot at a time.
00:11:54.980 Can anybody confirm that?
00:11:58.800 Now,
00:11:59.440 somebody's saying
00:11:59.900 there are fewer rounds in a handgun,
00:12:01.580 but, of course,
00:12:02.080 you would bring extra rounds with you.
00:12:04.380 You know,
00:12:04.700 you'd be slapping a clip in there,
00:12:06.120 I guess,
00:12:07.180 pretty quickly.
00:12:09.100 But the gun experts
00:12:10.680 have schooled me,
00:12:12.160 you know,
00:12:12.360 people with actual military experience,
00:12:14.500 et cetera,
00:12:14.820 and saying that
00:12:16.240 a shot at a time,
00:12:18.200 you know,
00:12:18.580 well-aimed,
00:12:19.880 is going to get you more kills
00:12:21.480 than...
00:12:21.980 And one of the reasons
00:12:24.400 is that automatic weapons
00:12:26.000 will ride up.
00:12:27.420 Have you heard of this?
00:12:29.320 If you're...
00:12:30.120 Here,
00:12:30.300 let me do the example for you
00:12:31.680 using my back scratcher.
00:12:33.820 So,
00:12:35.140 apparently,
00:12:35.900 if you're...
00:12:36.720 if you have an automatic weapon
00:12:37.960 and you put it on automatic
00:12:39.580 and it starts shooting,
00:12:40.780 it will go like this,
00:12:41.960 and your gun will rise.
00:12:44.820 So you actually end up
00:12:47.100 being very inaccurate
00:12:48.300 if you keep it
00:12:49.640 on the fully automatic mode.
00:12:51.780 So the point being
00:12:52.980 that a non...
00:12:54.580 non-fully automatic
00:12:56.600 and even non-semi-automatic,
00:12:59.480 you know,
00:12:59.740 a revolver
00:13:00.460 is going to kill a lot of people,
00:13:02.240 even if it doesn't shoot as quickly
00:13:03.780 and doesn't carry as many rounds.
00:13:05.920 So,
00:13:06.340 there's your
00:13:06.960 mass shooter education for today.
00:13:11.000 How long will it be
00:13:12.240 before we find out
00:13:13.060 the ethnicity of the killer?
00:13:14.820 And why would they
00:13:17.220 keep that from us?
00:13:22.040 Makes you curious,
00:13:23.240 doesn't it?
00:13:24.140 Makes you curious.
00:13:26.060 All right.
00:13:27.580 How about some good news?
00:13:30.100 In Appalachia,
00:13:31.920 the place in the United States
00:13:33.400 most famous for poverty,
00:13:36.620 they've...
00:13:36.820 somebody built a greenhouse
00:13:38.460 that is the size
00:13:39.540 of 58 football fields.
00:13:43.180 What?
00:13:44.360 It's an indoor greenhouse
00:13:45.820 for growing all kinds of stuff,
00:13:47.460 and it's the size
00:13:48.320 of 58 football fields.
00:13:51.380 One of the largest
00:13:52.260 in the world.
00:13:53.100 Now,
00:13:53.420 here are the benefits
00:13:54.460 of growing indoors.
00:13:56.720 Saves on water,
00:13:58.480 like,
00:13:58.840 a lot,
00:13:59.780 right?
00:14:00.040 And agricultural use
00:14:01.800 of water,
00:14:02.440 one of the biggest
00:14:02.960 problems in the world.
00:14:04.980 So,
00:14:05.480 that's good.
00:14:06.540 Number two,
00:14:07.320 saves on shipping.
00:14:09.360 Because,
00:14:10.140 in theory,
00:14:10.740 you could locate
00:14:11.680 your indoor
00:14:12.480 growing
00:14:14.640 closer to the market,
00:14:17.320 so you don't have to
00:14:18.220 ship it from another country,
00:14:19.500 et cetera.
00:14:21.720 So,
00:14:22.240 you save on water,
00:14:23.180 you save on shipping.
00:14:24.680 I'm assuming
00:14:25.360 you save on pesticides
00:14:26.600 because it's a
00:14:27.660 controlled environment.
00:14:29.400 So,
00:14:29.660 in theory,
00:14:30.160 you wouldn't need any,
00:14:30.880 right?
00:14:31.520 So,
00:14:31.880 they would be...
00:14:32.800 I don't know if this is true.
00:14:34.320 I'd have to get an opinion
00:14:36.180 on this.
00:14:37.020 But,
00:14:37.400 that might make them organic
00:14:38.700 just because you don't
00:14:40.940 have to control for pests.
00:14:43.120 I think that might be true,
00:14:44.240 although you would still
00:14:44.960 fertilize them,
00:14:45.700 so maybe that's not organic.
00:14:48.080 And then,
00:14:48.720 the third thing
00:14:49.280 is that
00:14:49.920 our food supply
00:14:51.220 would be safer
00:14:52.080 from climate change.
00:14:54.620 So,
00:14:55.260 if you believe
00:14:56.600 that climate change
00:14:57.480 is going to make it
00:14:58.220 worse for growing
00:14:59.460 in some way
00:15:01.280 because it disrupts
00:15:02.360 the weather pattern,
00:15:03.600 and I know
00:15:03.980 you're already typing,
00:15:05.960 Scott,
00:15:06.360 don't you know
00:15:06.880 that more CO2
00:15:07.780 is good for plant life?
00:15:09.640 And there is more
00:15:10.360 vegetation in the world
00:15:11.480 now than there has been
00:15:12.580 in years
00:15:13.040 because of all that CO2.
00:15:15.400 So,
00:15:15.780 climate change
00:15:16.320 is really good for crops,
00:15:17.540 Scott.
00:15:17.880 Don't you know that?
00:15:19.160 Scott,
00:15:19.620 Scott,
00:15:19.980 Scott.
00:15:20.960 Well,
00:15:21.360 I do know that.
00:15:22.380 But,
00:15:22.860 if climate change,
00:15:24.000 I'll put an if there
00:15:24.960 because I know
00:15:25.540 some of you are skeptics,
00:15:26.980 but if climate change
00:15:28.260 causes massive
00:15:29.440 weather disruptions,
00:15:31.900 those disruptions
00:15:33.080 and floods
00:15:33.660 and whatever
00:15:34.200 could be a larger
00:15:36.080 impact than the CO2.
00:15:38.220 So,
00:15:38.460 there is some risk.
00:15:39.280 We just don't know
00:15:39.940 which way that goes.
00:15:41.680 Could end up,
00:15:42.400 climate change
00:15:43.400 could end up
being the greatest thing
00:15:45.080 that ever happened
00:15:45.800 to farms.
00:15:48.000 But,
00:15:48.480 it could also
00:15:49.380 make farms
00:15:50.180 in some places
00:15:51.380 non-viable
00:15:52.960 while making
00:15:54.300 farms in other places
00:15:55.620 the best year
00:15:56.540 they've ever had.
00:15:57.940 So,
00:15:58.500 it's nice to have
00:15:59.060 some greenhouse
00:16:00.000 options here.
00:16:01.800 To me,
00:16:02.340 this is one of the
00:16:02.920 best stories ever.
00:16:03.920 It just doesn't
00:16:04.480 seem interesting.
00:16:06.300 Right?
00:16:06.700 But the fact that
00:16:07.640 you can make
00:16:08.220 greenhouses this big
00:16:09.400 and apparently
00:16:09.920 they figured out
00:16:10.560 how to make it
00:16:11.040 profitable,
00:16:12.040 it's a big deal,
00:16:13.340 I think.
00:16:13.660 If I were to
00:16:15.100 design a city,
00:16:17.160 and I do plan
00:16:18.120 to do this
00:16:18.660 before I die
00:16:19.360 sometime,
00:16:20.540 I would put
00:16:21.460 a massive
00:16:22.200 greenhouse
00:16:23.020 right there.
00:16:25.280 So,
00:16:25.700 I would make
00:16:26.120 sure that
00:16:26.780 you paired
00:16:27.640 maybe a
00:16:29.160 nuclear power
00:16:30.320 facility,
00:16:31.260 possibly a
00:16:32.000 Gen 4,
00:16:32.940 you know,
00:16:33.160 smaller,
00:16:33.660 safer kind of
00:16:34.300 thing.
00:16:35.000 Then I would
00:16:35.700 put a
00:16:36.340 massive
00:16:37.520 indoor farm
00:16:38.720 and then I
00:16:40.820 would make
00:16:41.140 sure that,
00:16:41.660 you know,
00:16:41.920 your water
00:16:42.600 situation was
00:16:43.460 taken care
00:16:44.020 of,
00:16:44.640 you were
00:16:45.000 close enough
00:16:45.700 to a major
00:16:46.320 airport,
00:16:48.340 boom,
00:16:49.540 perfect city.
00:16:51.260 More to
00:16:51.640 that later.
00:16:53.700 There's a
00:16:54.320 statistic that
00:16:55.080 says 42%
00:16:56.360 of Americans
00:16:57.080 report undesired
00:16:59.460 weight gain
00:17:00.060 during the
00:17:00.920 pandemic.
00:17:02.580 And the
00:17:03.000 average weight
00:17:03.460 gain is 29
00:17:04.540 pounds.
00:17:06.180 And 41
00:17:07.320 pounds for
00:17:07.960 millennials.
00:17:10.960 Do you
00:17:11.740 believe any
00:17:12.240 of that?
Aren't we
00:17:16.500 at a point
00:17:17.020 where you
00:17:18.240 just don't
00:17:18.660 believe any
00:17:19.200 statistics?
00:17:21.140 All right.
00:17:22.060 No,
00:17:22.420 seriously,
00:17:23.020 do you
00:17:23.320 believe that
00:17:24.000 the millennials
00:17:25.280 were gaining
00:17:26.000 41 pounds
00:17:27.060 on average?
00:17:29.520 41 pounds?
00:17:31.760 I don't
00:17:32.540 believe that
00:17:33.000 anyone gained
00:17:33.620 41 pounds
00:17:34.520 this year.
00:17:36.180 I don't even
00:17:36.860 know if you
00:17:37.200 could gain
00:17:37.720 41 pounds
00:17:38.540 in a year.
00:17:39.420 Could you?
00:17:40.040 If you
00:17:40.960 tried really
00:17:41.860 hard,
00:17:42.260 could you
00:17:42.520 gain 41
00:17:43.200 pounds?
00:17:43.920 I mean,
00:17:44.160 you'd have
00:17:44.420 to start
00:17:44.920 at 300
00:17:45.560 to gain
00:17:47.520 that.
00:17:48.360 So the
00:17:48.880 first thing
00:17:49.340 you need
00:17:49.600 to know
00:17:49.980 is that
00:17:50.920 this is
00:17:51.280 very unlikely
00:17:51.940 to be
00:17:52.340 true.
00:17:53.200 That's
00:17:53.460 very unlikely.
00:17:55.080 And secondly,
00:17:56.660 how many
00:17:57.620 people got
00:17:58.760 healthier?
00:18:00.480 In the
00:18:01.140 comments.
00:18:02.300 All right,
00:18:02.760 are you ready?
00:18:03.820 In the
00:18:04.260 comments,
00:18:05.100 you're already
00:18:05.780 starting it,
00:18:06.360 but I want
00:18:06.640 the rest of
00:18:07.040 you to do
00:18:07.340 it.
00:18:08.060 Tell me
00:18:08.500 your weight
00:18:09.320 gain or
00:18:10.780 loss.
00:18:11.540 All right?
00:18:12.720 So tell me
00:18:13.320 your weight
00:18:13.680 gain or
00:18:14.180 loss.
00:18:14.560 I want
00:18:14.820 everybody to
00:18:15.460 participate.
00:18:16.920 Either the
00:18:17.360 same,
00:18:18.500 plus 10,
00:18:19.800 minus 10,
00:18:20.540 give me your
00:18:21.260 number.
00:18:22.880 I'll tell
00:18:23.500 you mine.
00:18:24.120 I think
00:18:24.460 I'm plus
00:18:25.180 five,
00:18:26.560 but
00:18:26.880 intentionally.
00:18:28.480 So I
00:18:29.200 intentionally
00:18:29.720 gained five
00:18:30.440 pounds for
00:18:31.120 cosmetic reasons.
00:18:32.840 Christina likes
00:18:33.520 me,
00:18:34.440 you know,
00:18:34.760 five pounds
00:18:35.300 heavier.
00:18:35.540 I like
00:18:36.840 myself
00:18:37.320 five pounds
00:18:37.900 lighter because
00:18:38.560 when I look
00:18:39.000 in the mirror,
00:18:39.720 my stomach
00:18:40.180 is tighter
00:18:41.500 and I can
00:18:41.940 see my
00:18:42.280 abs.
00:18:43.420 Christina
00:18:43.840 likes me a
00:18:44.380 little beefier
00:18:45.000 because my
00:18:45.420 face looks a
00:18:46.800 little more
00:18:47.120 filled out.
00:18:47.680 I look
00:18:47.900 younger that
00:18:48.340 way.
00:18:49.420 So I
00:18:50.620 gained five,
00:18:51.360 but it was
00:18:51.640 under control.
00:18:53.240 All right,
00:18:53.400 I'm going to
00:18:54.200 read off your
00:18:54.620 numbers.
00:18:55.480 Negative 10,
00:18:56.220 negative 20,
00:18:57.140 plus 20,
00:18:57.840 plus 10.
00:18:59.340 Wow,
00:18:59.940 minus 4,
00:19:00.620 plus 40.
00:19:01.740 Jeez,
00:19:02.140 sorry about
00:19:02.580 that.
00:19:02.860 Plus 7,
00:19:06.260 plus 15,
00:19:07.200 these will
00:19:07.540 all be
00:19:07.880 pluses.
00:19:08.960 12,
00:19:09.600 even,
00:19:10.100 same,
00:19:11.020 20 pounds
00:19:11.640 lost,
00:19:12.280 congratulations,
00:19:13.280 minus 18,
00:19:14.320 good work.
00:19:15.700 Lost 18,
00:19:16.640 and again,
00:19:17.100 good work.
00:19:18.020 Plus 10,
00:19:18.700 minus 7,
00:19:19.560 good work.
00:19:22.280 I feel as
00:19:23.440 if this
00:19:23.920 pandemic
00:19:24.420 really,
00:19:25.620 really told
00:19:26.220 you who
00:19:26.600 you were,
00:19:27.740 didn't it?
About 5,
00:19:31.360 minus 1,
00:19:32.620 plus 10,
00:19:33.680 plus 4,
00:19:34.940 minus 5,
00:19:36.660 plus 10,
00:19:37.200 plus 2,
00:19:37.720 plus 5.
00:19:40.560 Whoa,
00:19:41.100 hold on,
00:19:41.580 hold on.
00:19:42.320 Did I really
00:19:42.720 see that?
00:19:46.500 Jungle,
00:19:47.260 well,
00:19:47.500 I'm not going
00:19:47.840 to read your
00:19:48.220 name because
00:19:48.780 it's offensive,
00:19:49.600 but did you
00:19:50.340 really lose
00:19:50.820 64 pounds
00:19:51.720 this year?
00:19:53.780 Could be.
Lost 10,
00:19:59.040 lost 10,
00:19:59.520 all right,
00:20:00.720 somebody lost,
00:20:01.600 somebody says
00:20:02.280 they lost 41,
00:20:03.360 I don't know
00:20:03.720 about that.
00:20:04.740 Minus 25,
00:20:05.840 good job,
00:20:06.540 lost 15,
00:20:07.600 good job.
00:20:08.600 All right,
00:20:08.840 here's,
00:20:09.260 here's a,
00:20:10.500 what I think
00:20:11.420 happened here.
00:20:13.140 When you have
00:20:13.960 something like
00:20:14.500 a pandemic
00:20:15.100 and suddenly
00:20:16.320 you don't know
00:20:17.420 what to do
00:20:18.080 because you
00:20:18.620 haven't been
00:20:19.200 in a pandemic
00:20:19.820 before,
00:20:20.840 and you have
00:20:21.500 to make
00:20:21.760 your own
00:20:22.240 decisions,
00:20:22.780 most of
00:20:25.100 the time
00:20:25.500 you're put
00:20:25.940 into a
00:20:26.340 situation
00:20:26.800 where the
00:20:27.360 decisions
00:20:27.760 are kind
00:20:28.200 of made
00:20:28.480 for you,
00:20:29.660 right,
00:20:30.000 you kind
00:20:30.660 of know
00:20:30.920 you've got
00:20:31.240 to get
00:20:31.480 a job
00:20:31.980 and you
00:20:32.200 have to
00:20:32.400 wear clothes
00:20:33.120 and,
00:20:34.160 you know,
00:20:34.300 if you
00:20:34.480 want to
00:20:34.680 go somewhere
00:20:35.260 there's a
00:20:35.740 certain way
00:20:36.120 to do it,
00:20:36.940 et cetera,
00:20:37.420 but you
00:20:37.740 hit the
00:20:38.000 pandemic
00:20:38.400 and suddenly
00:20:39.920 all the
00:20:40.700 rules are
00:20:41.200 wrong and
00:20:41.840 nothing's
00:20:42.580 the same
00:20:42.980 and now
00:20:43.240 you have
00:20:43.520 to figure
00:20:44.740 out what
00:20:45.100 to do.
00:20:46.820 I think
00:20:47.520 this really
00:20:47.980 told you
00:20:48.400 who you
00:20:48.720 were.
00:20:51.640 There are
00:20:52.260 people who
00:20:52.940 see crises
00:20:54.400 and say,
00:20:56.620 oh wow,
00:20:57.540 this crisis
00:20:58.360 is going to
00:20:59.400 give me an
00:20:59.800 opportunity to
00:21:00.580 do something
00:21:01.000 I never
00:21:01.460 could have
00:21:01.800 done without
00:21:02.480 the crisis.
00:21:03.800 And one
00:21:04.080 of the things
00:21:04.520 that I
00:21:04.900 wanted to
00:21:05.320 do was
00:21:05.920 take a
00:21:07.420 year to
00:21:07.740 work on
00:21:08.060 my fitness
00:21:08.580 more than
00:21:09.180 normal.
00:21:10.640 And so
00:21:11.040 I did
00:21:11.540 that.
00:21:12.380 And so
00:21:12.620 at least
00:21:14.080 my muscle
00:21:14.840 definition
00:21:15.360 is probably
00:21:16.220 better than
00:21:16.640 ever.
And I
00:21:20.340 said to
00:21:20.720 myself,
00:21:21.220 I'm not
00:21:21.480 going to
00:21:21.720 waste this
00:21:22.380 crisis.
00:21:23.840 I started
00:21:24.680 a new
00:21:25.660 line of
00:21:26.080 business.
00:21:27.180 I got
00:21:27.780 healthier,
00:21:29.200 got a lot
00:21:29.900 of stuff
00:21:30.200 in order,
00:21:31.000 and so
00:21:31.640 I came
00:21:32.120 out ahead.
00:21:33.380 And I
00:21:33.940 knew I
00:21:34.260 would because
00:21:35.100 it was my
00:21:35.680 plan and
00:21:36.580 there was
00:21:36.840 nothing to
00:21:37.360 stop it.
00:21:37.920 There was
00:21:38.120 no friction.
00:21:39.180 I had every
00:21:40.060 opportunity in
00:21:40.780 the world to
00:21:41.240 come out
00:21:41.540 ahead.
00:21:41.880 I had more
00:21:42.260 time to
00:21:42.620 work on
00:21:43.000 stuff so
00:21:44.280 I could do
00:21:44.660 things that
00:21:45.140 I've been
00:21:45.400 putting off.
00:21:46.000 And I
00:21:47.060 could work
00:21:48.440 on my
00:21:48.740 fitness because
00:21:49.440 I don't
00:21:49.760 have a
00:21:50.000 social life
00:21:50.520 like most
00:21:51.780 of them.
00:21:52.900 And I
00:21:53.720 tried on
00:21:54.180 day one,
00:21:55.180 I tried to
00:21:57.440 get you a
00:21:59.060 little bit
00:21:59.360 pregnant with
00:21:59.980 that philosophy
00:22:00.700 that you
00:22:01.420 should be
00:22:01.720 looking at
00:22:02.100 this as
00:22:02.440 an opportunity,
00:22:03.300 not just
00:22:03.880 a problem.
00:22:04.720 It's a
00:22:05.000 big problem,
00:22:05.920 but it
00:22:06.500 comes with
00:22:06.880 an opportunity.
00:22:08.500 Now,
00:22:10.180 here's
00:22:13.780 the payoff
00:22:14.760 for this
00:22:15.540 topic.
00:22:16.000 Number
00:22:18.020 one,
00:22:18.760 ask yourself,
00:22:19.720 look at
00:22:20.180 the people
00:22:20.520 who lost
00:22:20.980 weight.
00:22:22.060 If you
00:22:22.580 look at
00:22:22.880 all the
00:22:23.140 people in
00:22:23.520 the comments
00:22:23.980 who said
00:22:24.360 they lost
00:22:24.840 weight,
00:22:25.440 I don't
00:22:25.860 think it
00:22:26.140 was a
00:22:26.380 majority,
00:22:27.120 maybe 25%
00:22:28.620 of you,
00:22:29.440 perhaps,
00:22:30.480 said you
00:22:30.860 lost weight.
00:22:32.580 That could
00:22:33.260 have been
00:22:33.500 you.
00:22:34.760 That could
00:22:35.140 have been
00:22:35.360 you.
00:22:36.280 Right?
00:22:36.700 Because
00:22:37.060 nothing stops
00:22:37.700 you from
00:22:38.000 taking a
00:22:38.480 walk,
00:22:39.540 and nothing
00:22:39.980 stops you
00:22:40.500 from eating
00:22:40.960 more healthily.
00:22:42.060 If you
00:22:42.460 need some
00:22:43.440 help in
00:22:44.100 losing weight,
00:22:44.800 see my
00:22:46.380 book behind
00:22:46.920 me,
How to
Fail at
Almost
Everything and
Still Win
Big.
00:22:49.740 Two of the
00:22:50.520 chapters in
00:22:51.080 there are
00:22:51.600 about fitness
00:22:53.060 and diet
00:22:53.920 systems.
00:22:55.520 Now,
00:22:55.940 when I talk
00:22:56.300 about a
00:22:56.680 system,
00:22:57.580 it's as
00:22:58.020 opposed to
00:22:58.400 having a
00:22:58.800 goal of
00:22:59.360 losing weight.
00:23:00.300 If you have
00:23:00.940 a goal,
00:23:01.680 usually you're
00:23:02.300 just going to
00:23:02.640 work harder.
00:23:03.700 I'm going to
00:23:04.260 try harder to
00:23:04.960 not eat,
00:23:05.580 and then that's
00:23:06.240 just something
00:23:07.380 that most
00:23:07.780 people can't
00:23:08.340 do.
00:23:08.500 But if
00:23:09.880 you have
00:23:10.100 a system
00:23:10.700 that makes
00:23:11.900 it easy
00:23:12.520 to lose
00:23:13.040 weight,
00:23:13.980 or easy
00:23:14.660 to exercise,
00:23:16.120 well,
00:23:16.360 then most
00:23:16.680 people can
00:23:17.180 succeed.
00:23:18.060 Because you
00:23:18.520 took something
00:23:19.060 hard and you
00:23:19.720 just made
00:23:20.140 it easy.
00:23:21.160 Let me
00:23:21.480 give you an
00:23:21.820 example.
00:23:23.020 I remove
00:23:23.700 from my
00:23:24.460 home food
00:23:26.120 that's bad
00:23:26.680 for me.
00:23:28.080 That's the
00:23:28.700 system.
00:23:29.800 How hard
00:23:30.580 is it for
00:23:31.100 me to
00:23:31.420 simply not
00:23:32.220 buy food
00:23:33.620 that's bad
00:23:34.100 for me?
00:23:34.920 It's not
00:23:35.280 hard.
00:23:36.460 Because as
00:23:37.440 long as I'm
00:23:37.920 not hungry
00:23:38.420 when I'm
00:23:38.820 shopping,
00:23:39.740 which is
00:23:40.140 also a
00:23:40.680 system,
00:23:41.100 right?
00:23:41.280 Make sure
00:23:41.680 you're not
00:23:41.980 hungry when
00:23:42.460 you're
00:23:42.620 shopping.
00:23:43.280 I just
00:23:43.780 don't have
00:23:44.160 anything in
00:23:44.600 the house
00:23:45.020 that's bad
00:23:45.880 for me.
00:23:47.060 So when
00:23:47.780 I'm hungry,
00:23:49.360 I eat
00:23:49.860 what's in
00:23:50.280 the house
00:23:50.700 because it's
00:23:51.460 easy.
00:23:52.060 So I've
00:23:52.460 created a
00:23:53.000 system in
00:23:53.940 which eating
00:23:54.520 bad food
00:23:55.200 is sort
00:23:55.740 of impractical
00:23:56.520 because I
00:23:57.500 made it
00:23:57.800 that way.
00:23:59.140 Now,
00:23:59.780 there's a
00:24:00.200 whole bunch
00:24:00.580 of other
00:24:01.100 system
00:24:02.380 tweaks,
00:24:03.120 such as
00:24:03.900 a system
00:24:04.560 of continuously
00:24:06.100 learning how
00:24:07.380 to prepare
00:24:08.060 food that's
00:24:08.720 good for
00:24:09.140 you in
00:24:10.040 a way
00:24:10.280 that you
00:24:10.600 enjoy eating.
00:24:12.220 So I
00:24:13.560 never stop
00:24:14.300 experimenting
00:24:14.920 on how to
00:24:15.760 make a
00:24:16.380 yam
00:24:17.700 taste good.
00:24:19.800 And by the
00:24:20.380 way,
00:24:20.540 I have a
00:24:20.840 really good
00:24:21.200 way to
00:24:21.520 eat a
00:24:21.880 yam that
00:24:22.280 makes it
00:24:22.560 taste great.
00:24:23.540 Soy sauce
00:24:24.060 and pepper,
00:24:24.760 it's good
00:24:25.460 stuff.
00:24:26.140 But if
00:24:27.580 you're not
00:24:27.880 experimenting
00:24:28.500 continuously on how
00:24:29.860 to make the
00:24:30.580 food that's
00:24:31.260 good for you
00:24:31.820 taste good
00:24:32.380 also,
00:24:33.740 the difference
00:24:34.420 between how
00:24:35.260 good your
00:24:35.740 bad food
00:24:36.340 tastes and
00:24:37.340 how good
00:24:37.700 the stuff
00:24:38.080 that's good
00:24:38.620 for you
00:24:38.940 tastes is
00:24:39.360 going to
00:24:39.500 be gigantic.
00:24:40.720 But you
00:24:41.200 can learn
00:24:41.640 to close
00:24:42.140 that gap
00:24:42.880 by learning
00:24:43.960 to prepare
00:24:44.520 it just
00:24:44.900 the way
00:24:45.180 you like
00:24:45.580 and figuring
00:24:46.060 out what
00:24:46.480 works and
00:24:46.940 what doesn't.
00:24:47.540 Anyway,
00:24:48.020 that's the
00:24:48.520 short version.
00:24:49.260 If you want
00:24:49.480 to see the
00:24:49.760 longer version,
00:24:51.000 it's in my
00:24:51.380 book.
00:24:51.780 People who
00:24:52.140 have read
00:24:52.360 that book
00:24:52.820 report losing
00:24:53.840 massive amounts
00:24:54.680 of weight,
00:24:55.380 gaining muscle,
00:24:56.680 and none of
00:24:57.080 the book is
00:24:58.120 really about
00:24:58.880 weight or diet.
00:25:00.360 It's only
00:25:00.740 about systems.
00:25:02.180 So you would
00:25:02.740 design your
00:25:03.240 own systems.
00:25:04.100 I give you
00:25:04.400 some examples.
00:25:05.640 But it's
00:25:05.920 about coming
00:25:07.260 up with a
00:25:07.740 system that's
00:25:08.440 designed for
00:25:09.100 you, not
00:25:09.820 one that I
00:25:10.260 give you.
00:25:11.760 But here's
00:25:12.420 the payoff
00:25:13.260 for this
00:25:13.980 bullshit
00:25:15.740 statistic.
00:25:17.140 About 42%
00:25:18.160 of Americans
00:25:19.100 report undesired
00:25:20.940 weight gain.
00:25:21.600 And this
00:25:21.820 comes from
00:25:22.300 Nate Silver,
00:25:23.360 who you
00:25:24.040 should be
00:25:24.440 following on
00:25:25.140 Twitter.
00:25:26.020 Nate Silver,
00:25:26.880 even if you
00:25:27.500 disagree with
00:25:28.300 him on
00:25:28.640 everything,
00:25:30.100 he can do
00:25:30.660 math.
00:25:31.820 He can do
00:25:32.640 math.
00:25:33.820 Better than
00:25:34.320 you can.
00:25:35.340 Probably.
00:25:35.980 At least
00:25:36.260 statistics,
00:25:37.020 anyway.
00:25:38.140 So here's
00:25:39.140 what he
00:25:39.460 says.
00:25:40.200 This really
00:25:40.860 calls out
00:25:41.360 for a
00:25:41.680 control,
00:25:42.940 meaning a
00:25:43.480 controlled
00:25:43.920 part of
00:25:44.320 the
00:25:44.420 experiment.
00:25:45.540 How many
00:25:45.860 Americans
00:25:46.260 have undesired
00:25:47.120 weight gain
00:25:47.660 in a typical
00:25:48.260 year?
00:25:50.300 Did any
00:25:51.040 of you think
00:25:51.660 of that
00:25:51.980 question?
00:25:53.300 If this
00:25:54.120 year 42%
00:25:55.180 of Americans
00:25:56.020 gained weight
00:25:56.640 they didn't
00:25:57.060 want,
00:25:58.260 what's a
00:25:59.060 normal year
00:25:59.640 look like?
00:26:00.180 I'll bet
00:26:01.900 it's pretty
00:26:02.300 close to
00:26:02.800 42% of
00:26:03.720 Americans
00:26:04.180 gained weight
00:26:04.680 they didn't
00:26:05.040 want,
00:26:05.880 right?
00:26:06.760 It's probably
00:26:07.460 exactly the
00:26:08.140 same.
00:26:09.400 But you
00:26:10.040 don't know
00:26:10.400 that because
00:26:11.580 they don't
00:26:11.900 tell you
00:26:12.240 that.
00:26:12.560 And it
00:26:13.240 takes somebody
00:26:15.740 who lives
00:26:16.360 and breathes
00:26:16.960 statistics to
00:26:18.620 flag this
00:26:19.440 obviously
00:26:20.200 missing part
00:26:21.360 of the
00:26:21.580 analysis,
00:26:22.700 which could
00:26:23.680 be that
00:26:24.440 there's nothing
00:26:24.960 to see here.
00:26:26.020 This might
00:26:26.740 be just like
00:26:27.400 every other
00:26:27.860 year.
00:26:28.700 In fact,
00:26:30.040 the thing
00:26:30.480 that can't
00:26:30.960 be ruled
00:26:31.380 out is
00:26:32.380 that people
00:26:32.780 gained less
00:26:34.060 weight this
00:26:35.640 year than
00:26:36.060 before.
00:26:36.540 I don't
00:26:36.760 think that's
00:26:37.160 true, by
00:26:37.500 the way.
00:26:38.000 But you
00:26:38.280 can't rule
00:32:38.660 it out
00:26:39.060 based on
00:26:39.660 what we
00:26:39.980 know.
00:26:41.040 So that's
00:26:41.640 why Nate
00:26:42.000 Silver is
00:26:42.780 sort of
00:26:43.480 a national
00:26:44.000 treasure,
00:26:44.960 because Americans
00:26:46.060 are statistically
00:26:47.300 ignorant,
00:26:48.540 and I would
00:26:49.260 include myself
00:26:50.000 in this
00:26:50.380 category.
00:26:51.920 My skill
00:26:52.940 stack has
00:26:54.560 some statistics
00:26:55.540 in it,
00:26:56.380 but not
00:26:56.940 enough that I
00:26:57.460 would consider
00:26:57.920 myself capable
00:26:59.600 of being a good
00:27:00.580 analyst in that
00:27:01.420 area.
00:27:05.020 So,
00:27:06.140 Sidney
00:27:08.080 Powell,
00:27:09.120 you all
00:27:09.880 know Sidney
00:27:10.380 Powell.
00:27:10.900 She made
00:27:11.220 lots of
00:27:11.660 claims about
00:27:12.320 Dominion
00:27:12.780 Software,
00:27:13.740 and Dominion
00:27:14.520 Software is
00:27:15.200 suing her
00:27:16.280 for saying
00:27:17.880 these things
00:27:18.460 they say
00:27:18.940 are untrue.
00:27:20.420 Guess what
00:27:20.820 her defense
00:27:21.420 is?
00:27:22.720 You don't
00:27:23.360 have to
00:27:23.660 guess,
00:27:23.980 I'm just
00:27:24.260 going to
00:27:24.480 tell you.
00:27:25.540 Her defense
00:27:26.120 is,
00:27:27.060 she's arguing
00:27:28.040 in new
00:27:28.500 court filings,
00:27:29.820 that no
00:27:30.200 reasonable
00:27:30.760 people would
00:27:32.220 believe what
00:27:32.940 she said.
00:27:37.080 That's right.
00:27:37.800 Her defense
00:27:39.200 is that no
00:27:40.900 reasonable
00:27:41.400 person would
00:27:42.300 have believed
00:27:42.860 anything she
00:27:43.420 said about
00:27:44.980 Dominion.
00:27:46.160 What do you
00:27:46.900 think of that?
00:27:48.540 Legally
00:27:49.220 speaking, is
00:27:49.860 that a good
00:27:50.280 defense?
00:27:50.740 So when she
00:27:52.700 was saying
00:27:53.300 things such
00:27:54.580 as the
00:27:56.180 Dominion
00:27:57.280 software had
00:27:58.020 some
00:27:58.320 Venezuelan
00:27:59.420 dictator
00:28:00.060 connection,
00:28:02.780 and that it
00:28:03.760 was, you
00:28:04.520 know, her
00:28:04.840 claim is that
00:28:05.600 it was rigged,
00:28:06.260 et cetera,
00:28:07.400 did you,
00:28:10.360 as a
00:28:11.140 reasonable
00:28:11.520 person,
00:28:12.760 as a
00:28:13.500 reasonable
00:28:13.900 person,
00:28:14.400 did you say
00:28:15.020 to yourself,
00:28:16.340 well, I
00:28:17.600 don't think I
00:28:18.260 could believe
00:28:18.660 that, just
00:28:19.300 because she's
00:28:19.700 saying it.
00:28:21.000 Now, there's
00:28:21.840 a little bit
00:28:22.240 more to the
00:28:22.720 argument, which
00:28:23.520 is that a
00:28:24.760 reasonable
00:28:25.180 person should
00:28:26.020 know that
00:28:27.580 just because
00:28:28.380 Sidney Powell
00:28:29.040 says something
00:28:29.840 is true,
00:28:31.500 that you
00:28:32.460 would still
00:28:32.860 need to
00:28:33.280 check,
00:28:34.920 you know,
00:28:35.280 legally, you'd
00:28:35.920 have to do
00:28:36.300 audits or
00:28:36.760 whatever, you
00:28:37.580 would still
00:28:37.880 have to
00:28:38.160 check to
00:28:38.700 know if
00:28:39.080 it were
00:28:39.360 true.
00:28:40.700 So the
00:28:41.340 point is,
00:28:42.820 even if you
00:28:43.560 agreed with
00:28:44.400 her,
00:28:45.540 like your
00:28:46.080 instinct,
00:28:47.140 et cetera,
00:28:47.440 was the
00:28:48.000 same as
00:28:48.360 hers,
00:28:48.960 so even
00:28:49.680 if you
00:28:50.040 did believe
00:28:50.620 there was
00:28:50.940 some gigantic
00:28:51.680 irregularity
00:28:52.780 here,
00:28:54.060 you would
00:28:55.000 also be
00:28:55.560 smart enough
00:28:56.140 to know
00:28:56.620 that you
00:28:58.180 wouldn't
00:28:58.460 take it
00:28:58.860 based on
00:28:59.320 just
00:28:59.580 Sidney
00:29:00.180 Powell
00:29:00.420 said so,
00:29:01.760 right?
00:29:03.420 No matter
00:29:04.220 how much
00:29:04.720 you think
00:29:05.260 she's right,
00:29:07.080 isn't it
00:29:07.640 also true
00:29:08.680 that you,
00:29:10.500 as a
00:29:10.720 reasonable
00:29:11.060 person,
00:29:12.220 would still
00:29:12.820 say,
00:29:13.480 but you
00:29:13.740 still have
00:29:14.060 to check,
00:29:15.540 I mean,
00:29:16.540 it's not
00:29:16.780 just because
00:29:17.380 a lawyer
00:29:17.780 is pretty
00:29:18.260 sure it
00:29:18.660 happened,
00:29:19.380 because she
00:29:20.060 didn't say,
00:29:20.780 here's my
00:29:21.220 source code,
00:29:22.160 look,
00:29:22.440 you can see
00:29:22.860 it yourself.
00:29:24.160 She did
00:29:24.880 not say,
00:29:25.820 here's the
00:29:26.540 proof,
00:29:27.500 you can see
00:29:28.320 it yourself,
00:29:29.000 although she
00:29:29.540 did, I
00:29:29.900 guess she
00:29:30.200 showed some
00:29:30.620 evidence,
00:29:31.380 but I
00:29:31.640 think it
00:29:31.940 should have
00:29:32.260 been obvious
00:29:33.100 to anybody
00:29:34.600 who is a
00:29:35.520 reasonable
00:29:36.460 person,
00:29:37.960 how do you
00:29:38.440 define a
00:29:38.880 reasonable
00:29:39.140 person in
00:29:39.940 2021,
00:29:41.060 when literally
00:29:41.780 nobody is a
00:29:42.520 reasonable
00:29:42.800 person?
00:29:43.240 It's just
00:29:44.160 not even
00:29:44.540 a thing.
00:29:45.600 But here's
00:29:46.040 the thing,
00:29:47.040 I consider
00:29:48.080 myself a
00:29:49.620 reasonable
00:29:50.040 person.
00:29:51.600 The evidence
00:29:52.580 is in the
00:29:53.700 ground news
00:29:54.760 app,
00:29:55.740 which suggests
00:29:56.760 that I
00:29:57.220 follow enough
00:29:58.200 news on the
00:29:59.040 left and the
00:29:59.680 right to be
00:30:01.140 relatively
00:30:01.760 non-brain
00:30:03.120 damaged.
00:30:04.700 That's right.
00:30:05.900 So now I
00:30:06.460 have an app
00:30:07.160 which has
00:30:08.180 looked at my
00:30:08.880 body of
00:30:10.160 tweeting and
00:30:10.840 commentary,
00:30:11.300 and decided
00:30:12.700 that unlike
00:30:13.260 many of
00:30:13.840 you, I
00:30:15.920 don't have
00:30:16.340 brain damage
00:30:17.000 from politics.
00:30:19.260 Now what
00:30:19.740 if you ran
00:30:20.780 your Twitter
00:30:23.860 account through
00:30:25.220 the ground
00:30:25.840 news app and
00:30:27.080 it came out
00:30:27.580 showing that
00:30:28.100 you're only
00:30:28.560 following news
00:30:29.560 on the
00:30:30.020 right?
00:30:31.740 Well that
00:30:32.260 would say
00:30:32.600 that you
00:30:32.860 have brain
00:30:33.220 damage,
00:30:34.560 scientifically
00:30:35.100 speaking.
00:30:36.640 Or the
00:30:37.260 same as if
00:30:37.820 you are only
00:30:38.420 following news
00:30:38.980 on the
00:30:39.260 left, it
00:30:39.640 doesn't matter
00:30:40.040 if it's
00:30:40.280 left or
00:30:40.660 right.
00:30:41.300 If you're
00:30:41.620 only following
00:30:42.180 one side,
00:30:43.940 science says
00:30:45.000 you've got
00:30:45.560 brain damage
00:30:46.180 and you're
00:30:46.560 giving it
00:30:46.900 to yourself.
00:30:48.440 Now would
00:30:49.120 you say
00:30:49.440 that a
00:30:49.700 person who
00:30:50.060 has brain
00:30:50.480 damage
00:30:50.940 would be
00:30:52.620 a reasonable
00:30:54.000 person?
00:30:55.640 No, that's
00:30:56.480 the opposite,
00:30:57.040 right?
00:30:57.720 A reasonable
00:30:58.360 person, by
00:31:00.020 definition,
00:31:00.820 wouldn't you
00:31:01.140 say is
00:31:01.540 somebody without
00:31:02.220 brain damage?
00:31:04.040 Right?
00:31:05.240 That would
00:31:05.760 be pretty
00:31:06.720 much understood
00:31:07.800 to be in
00:31:08.240 the definition.
00:31:08.780 So I'm
00:31:10.480 a reasonable
00:31:11.080 person, I
00:31:12.100 can demonstrate
00:31:12.820 with this
00:31:13.360 app that I
00:31:14.180 don't seem
00:31:14.600 to have an
00:31:15.000 indication of
00:31:15.700 brain damage,
00:31:16.640 and when
00:31:17.280 Sidney Powell
00:31:17.940 was making
00:31:18.700 her claims
00:31:19.240 about
00:31:19.540 Dominion,
00:31:20.100 do you
00:31:20.540 know what
00:31:20.920 I said
00:31:21.520 out loud
00:31:22.920 many times
00:31:24.720 in public?
00:31:27.140 Well, there's
00:31:27.920 no way that's
00:31:28.560 true.
00:31:30.620 That's what I
00:31:31.460 said.
00:31:32.300 And I'm a
00:31:33.100 reasonable
00:31:33.440 person and I
00:31:34.340 can prove it
00:31:34.900 with this
00:31:35.260 ground news
00:31:35.820 app.
00:31:37.620 So how
00:31:38.620 good is
00:31:38.980 her defense?
00:31:40.580 I'm a
00:31:41.300 reasonable
00:31:41.660 person and
00:31:43.160 to me, her
00:31:44.880 defense is
00:31:45.620 100% solid.
00:31:47.880 Because there
00:31:48.980 was no way
00:31:49.780 that me, you
00:31:52.100 know, I as
00:31:52.780 a reasonable
00:31:53.320 person, believed
00:31:54.280 any of her
00:31:54.760 claims.
00:31:55.840 I did
00:31:56.560 believe it
00:31:58.420 was worth
00:31:58.860 looking into.
00:32:00.280 I did
00:32:01.060 believe that
00:32:02.120 if you
00:32:02.520 looked into
00:32:03.040 it, you
00:32:04.060 might be
00:32:04.480 able to
00:32:04.780 determine if
00:32:05.540 her instinct
00:32:07.200 was correct
00:32:07.960 or if her
00:32:09.200 instinct was
00:32:09.780 off, right?
00:32:12.400 You'd be able
00:32:13.160 to figure that
00:32:14.440 out.
00:32:16.540 I think she
00:32:17.380 has an
00:32:17.680 airtight
00:32:18.080 defense.
00:32:19.360 I honestly
00:32:19.880 do.
00:32:21.060 Because I
00:32:22.780 would defend
00:32:23.240 it just the
00:32:23.660 way I said.
00:32:24.840 Show me
00:32:25.480 some reasonable
00:32:26.840 people who
00:32:28.080 according to
00:32:28.740 the app
00:32:29.520 Ground News
00:32:30.440 do not have
00:32:31.180 brain damage.
00:32:31.980 I'll show
00:32:32.320 you a few.
00:32:33.680 Here's one.
00:32:34.320 his name
00:32:34.900 is Scott
00:32:35.680 Adams.
00:32:36.900 Check for
00:32:37.380 yourself.
00:32:38.560 Look at
00:32:38.980 his body
00:32:39.400 of work.
00:32:40.720 Check his
00:32:41.220 Ground News
00:32:42.080 app and
00:32:43.260 find out if
00:32:43.860 he has
00:32:44.060 brain damage.
00:32:45.220 There's a
00:32:45.780 reasonable
00:32:46.060 person.
00:32:46.620 Scott, you're
00:32:47.680 reasonable.
00:32:48.280 Did you believe
00:32:48.920 everything I
00:32:49.480 said or did
00:32:50.120 you think,
00:32:51.100 like a
00:32:51.420 reasonable
00:32:51.720 person would,
00:32:52.920 that you
00:32:53.240 would have
00:32:53.480 to validate
00:32:54.020 it by
00:32:54.540 really looking
00:32:55.680 into it?
00:32:58.260 I think she
00:32:59.080 wins.
00:33:00.280 I think she
00:33:00.760 does.
00:33:01.000 Now, correct
00:33:02.800 me if I'm
00:33:03.240 wrong, when I
00:33:04.220 get into the
00:33:04.720 legal stuff, I'm
00:33:05.620 way over my
00:33:06.340 head, but I
00:33:07.960 think this type
00:33:08.860 of case, it's
00:33:10.520 not a reasonable
00:33:11.120 doubt case,
00:33:11.820 right?
00:33:12.160 It's one of
00:33:12.660 those civil
00:33:13.540 things where
00:33:14.300 the jury just
00:33:17.000 has to be
00:33:17.520 weighted in one
00:33:18.520 direction.
00:33:19.200 They don't have
00:33:19.900 to have 100%
00:33:20.700 agreement.
00:33:21.860 So she does
00:33:22.520 have some risk,
00:33:23.540 but her defense
00:33:24.160 is solid, I
00:33:24.980 would say.
00:33:25.300 So I
00:33:32.400 guess there's
00:33:32.920 a lot of
00:33:34.520 movement in
00:33:36.780 King
00:33:38.240 County, the
00:33:40.440 county that
00:33:40.820 includes Seattle,
00:33:41.900 Washington, which will
00:33:42.940 spend $5 million
00:33:43.880 of their COVID
00:33:44.720 relief that
00:33:46.500 they're getting
00:33:46.800 from the
00:33:47.100 government to
00:33:49.740 fight what
00:33:50.540 they call
00:33:50.860 anti-Asian
00:33:51.580 bias in
00:33:53.060 response to
00:33:53.720 recent shootings
00:33:54.480 in Atlanta,
00:33:54.980 Georgia, etc.
00:33:56.580 Now, I
00:33:58.680 think that
00:33:59.220 anything that
00:33:59.900 reduces bias
00:34:00.900 is good, and
00:34:01.680 anything that
00:34:02.260 reduces violence
00:34:03.220 is good, and
00:34:04.560 anything that
00:34:05.240 reduces anti-Asian
00:34:06.640 violence is
00:34:07.320 good, but I
00:34:09.680 didn't even need
00:34:10.280 to throw in the
00:34:11.620 anti-Asian part,
00:34:12.620 right?
00:34:13.020 Because anything
00:34:13.660 that reduces
00:34:14.220 violence is good.
00:34:17.180 But here's the
00:34:18.260 question.
00:34:19.960 How do you
00:34:20.780 do that?
00:34:22.700 What would
00:34:23.400 be the way
00:34:24.160 to reduce
00:34:25.760 anti-Asian
00:34:26.880 bias?
00:34:28.620 Because what's
00:34:29.440 it based
00:34:29.820 on?
00:34:32.540 If you
00:34:33.340 knew what
00:34:33.660 the anti-Asian
00:34:35.620 bias was
00:34:36.500 based on,
00:34:37.940 then you
00:34:38.960 could work
00:34:39.380 on the
00:34:40.720 base
00:34:41.240 assumptions
00:34:42.640 to make
00:34:43.040 them change.
00:34:43.940 But
00:34:45.180 I have a,
00:34:48.080 let's say,
00:34:50.760 a hypothesis
00:34:51.620 that the
00:34:53.180 anti-Asian
00:34:54.100 bias has to
00:34:55.000 do with
00:34:55.300 their success.
00:34:57.700 Maybe not
00:34:58.560 100%, but
00:35:00.600 don't you
00:35:00.900 think the
00:35:01.300 biggest part
00:35:02.200 of why
00:35:03.060 Asians are
00:35:03.820 being targeted
00:35:05.200 has more to
00:35:06.240 do with the
00:35:06.620 fact that
00:35:07.120 they're doing
00:35:07.560 great?
00:35:08.540 Like, in
00:35:09.280 American society,
00:35:10.180 they're doing
00:35:10.860 better than
00:35:11.380 almost any
00:35:12.000 other group.
00:35:13.540 And do
00:35:15.940 you really
00:35:16.420 think that
00:35:17.840 a Chinese
00:35:19.140 American
00:35:19.820 citizen is
00:35:21.020 being targeted
00:35:21.740 because the
00:35:22.740 Chinese Communist
00:35:24.300 Party, which
00:35:26.100 is, you
00:35:26.820 know, X
00:35:27.320 number of
00:35:27.760 people, it's
00:35:28.180 not China,
00:35:29.120 and it's
00:35:29.460 certainly not
00:35:30.020 all Chinese
00:35:30.680 people.
00:35:32.120 Is there
00:35:32.620 anybody who
00:35:33.780 is targeting
00:35:34.300 Asian Americans
00:35:35.460 because of
00:35:37.200 what the
00:35:37.520 Chinese Communist
00:35:38.360 Party is doing?
00:35:39.880 Like, we're
00:35:40.340 being told
00:35:40.820 that that's
00:35:41.220 a thing, but
00:35:42.840 is it?
00:35:44.040 And did the
00:35:45.100 violence suddenly
00:35:46.120 spike, or are
00:35:46.960 we just more
00:35:47.440 aware of it?
00:35:49.700 You know, it
00:35:50.380 only takes a
00:35:51.460 few high-profile
00:35:52.460 incidents.
00:35:53.540 And if you
00:35:54.000 look at the
00:35:54.420 Atlanta one,
00:35:55.140 that looked
00:35:55.580 like a
00:35:55.960 coincidence.
00:35:57.220 The fact that
00:35:57.920 the people who
00:35:58.700 were killed
00:35:59.080 were mostly
00:35:59.960 Asian was
00:36:01.720 because it
00:36:02.280 was a
00:36:02.560 massage parlor.
00:36:03.560 It was the
00:36:03.980 massage parlor
00:36:04.640 that was the
00:36:05.180 problem, according
00:36:06.380 to the shooter.
00:36:07.620 It wasn't the
00:36:08.760 ethnicity at
00:36:09.400 all.
00:36:10.780 So, first
00:36:11.980 of all, I
00:36:13.180 assume anti-Asian
00:36:15.000 bias and
00:36:15.740 violence are
00:36:16.340 real things,
00:36:17.180 of course, and
00:36:18.500 that they
00:36:19.200 should be
00:36:19.740 lessened.
00:36:21.860 But I don't
00:36:22.620 know if you
00:36:23.060 can do it if
00:36:24.580 the real
00:36:25.020 problem is that
00:36:26.020 Asian Americans
00:36:26.900 are too
00:36:27.420 successful.
00:36:28.800 Because I
00:36:29.420 feel like that's
00:36:30.120 the problem.
00:36:31.640 Let me ask
00:36:32.340 you, who is
00:36:32.860 the most hated
00:36:34.340 group in
00:36:35.240 America right
00:36:37.220 now?
00:36:37.500 Who is the
00:36:38.780 most hated
00:36:39.460 group?
00:36:40.500 I would say
00:36:41.460 white, rich
00:36:43.440 people.
00:36:45.720 You know, the
00:36:46.500 elites, the
00:36:47.180 people in
00:36:47.540 power.
00:36:48.740 They're the
00:36:49.220 most, probably
00:36:50.160 the most hated.
00:36:51.780 Why is it
00:36:52.540 that the
00:36:54.140 most hated
00:36:54.720 people are
00:36:55.560 elite, rich,
00:36:58.280 white people?
00:37:00.360 Isn't it
00:37:01.020 because they're
00:37:02.240 successful?
00:37:04.380 You know, it's
00:37:04.880 also because the
00:37:06.140 people who are
00:37:07.120 in charge, or
00:37:08.800 the people you
00:37:09.300 think are in
00:37:09.740 charge, create
00:37:10.920 all the rules,
00:37:11.780 and if the
00:37:12.160 rules seem bad,
00:37:13.280 then who are
00:37:14.040 you going to
00:37:14.300 blame?
00:37:14.960 But don't you
00:37:15.420 have a problem
00:37:15.980 that the people
00:37:17.900 who are the
00:37:18.360 most successful
00:37:19.400 are always
00:37:20.900 hated?
00:37:22.180 Is there any
00:37:22.940 exception to
00:37:23.520 that?
00:37:24.320 Can you think
00:37:24.940 of any exception
00:37:25.700 where the
00:37:26.160 people who have
00:37:26.760 the power don't
00:37:28.500 also have a
00:37:29.840 third of the
00:37:30.440 public or more
00:37:31.080 just hating
00:37:31.580 them?
00:37:32.440 It's sort of
00:37:32.980 automatic, isn't
00:37:33.800 it?
00:37:34.360 So I have a
00:37:35.160 real question of
00:37:35.840 whether anti-Asian
00:37:36.900 bias can be
00:37:38.000 addressed in
00:37:39.120 like a
00:37:39.480 substantial way,
00:37:40.720 as long as
00:37:41.860 they continue to
00:37:42.620 be successful.
00:37:44.260 It feels like
00:37:45.180 that's the
00:37:45.620 problem.
00:37:47.660 So I did a
00:37:48.520 little very
00:37:49.540 unscientific poll
00:37:50.800 on Twitter.
00:37:51.900 They're all
00:37:52.520 unscientific, but
00:37:53.540 this is the
00:37:54.120 most unscientific
00:37:55.340 of them all.
00:37:56.880 The least
00:37:57.700 credible poll
00:37:58.620 ever taken.
00:38:00.360 I asked this
00:38:00.960 question, in
00:38:01.640 your opinion,
00:38:02.200 which ethnic
00:38:02.980 slash racial
00:38:04.280 bias is the
00:38:05.700 biggest problem
00:38:06.380 in the United
00:38:06.980 States in
00:38:07.600 2021?
00:38:09.100 So forget
00:38:09.700 about the
00:38:10.120 past, we're
00:38:11.260 just talking
00:38:11.600 about 2021.
00:38:13.040 Which ethnic
00:38:13.900 racial group
00:38:14.660 is subject
00:38:16.260 to the
00:38:16.680 most bias?
00:38:19.100 Now, my
00:38:19.820 audience is
00:38:21.420 probably 87%
00:38:23.140 white, and
00:38:25.620 surprisingly,
00:38:27.560 or not,
00:38:29.180 the results
00:38:30.520 were that
00:38:31.440 87% of the
00:38:32.680 respondents said
00:38:33.360 that white
00:38:33.860 people are
00:38:34.980 subject to
00:38:35.720 the biggest
00:38:36.180 ethnic bias.
00:38:38.960 Surprise!
00:38:40.640 If 87% of
00:38:42.480 my audience
00:38:43.080 had been
00:38:43.520 Asian-Americans,
00:38:45.300 I think 87%
00:38:46.780 would have said
00:38:47.360 Asian-Americans
00:38:47.980 are the biggest
00:38:48.700 bias victims.
00:38:52.820 Likewise,
00:38:53.660 black,
00:38:54.360 likewise,
00:38:54.980 Hispanic.
00:38:55.340 But, just
00:38:56.480 for fun,
00:38:57.360 knowing that
00:38:58.460 my audience
00:38:59.080 is overwhelmingly
00:39:00.060 white and
00:39:02.940 male,
00:39:04.680 right?
00:39:05.080 So it's as
00:39:05.860 non-scientific
00:39:06.880 a group as
00:39:07.580 you could ever
00:39:08.020 come up with.
00:39:09.360 But they
00:39:10.000 say, that
00:39:11.360 group says,
00:39:12.180 that the
00:39:12.520 biggest bias
00:39:13.180 is against
00:39:13.800 white people
00:39:14.380 by 87%
00:39:15.480 say that.
00:39:16.620 7% say the
00:39:17.900 biggest problem
00:39:18.560 is with
00:39:19.360 black Americans,
00:39:20.760 5% say
00:39:22.520 Asian-Americans
00:39:23.220 or Asians,
00:39:24.140 I guess,
00:39:24.820 because not
00:39:25.660 all are
00:39:26.400 citizens,
00:39:27.480 and 2%
00:39:29.120 said Hispanic.
00:39:31.520 Now, to me,
00:39:32.560 that's the
00:39:32.980 most interesting
00:39:33.700 one, isn't
00:39:34.320 it?
00:39:35.960 How could
00:39:36.820 it be that
00:39:38.120 we have
00:39:38.460 hordes of
00:39:39.300 migrants
00:39:39.960 trying to
00:39:40.660 come across
00:39:41.180 the border?
00:39:42.480 One of the
00:39:43.000 biggest political
00:39:43.860 questions is
00:39:45.240 the number of
00:39:45.960 Hispanic migrants
00:39:47.120 coming into
00:39:48.160 the country.
00:39:48.700 It's, you
00:39:50.360 know, one
00:39:50.820 of the main
00:39:51.300 things we
00:39:51.820 talk about.
00:39:53.760 And yet,
00:39:56.540 my audience,
00:39:57.720 again, deeply
00:39:58.680 unscientific poll,
00:40:00.280 my audience
00:40:00.840 says that only
00:40:01.720 2% think that
00:40:03.240 Hispanics are
00:40:04.540 suffering the
00:40:05.060 biggest ethnic
00:40:05.920 and racial
00:40:06.440 bias.
00:40:08.160 Why is that?
00:40:10.860 I have a
00:40:12.540 hypothesis.
00:40:14.580 They don't
00:40:15.020 complain as
00:40:15.600 much.
00:40:17.100 Isn't that the
00:40:17.980 whole story?
00:40:19.700 Hispanics just
00:40:20.440 don't complain
00:40:21.000 as much.
00:40:22.720 Especially the
00:40:24.340 immigrants, right?
00:40:26.280 Because if you
00:40:27.700 willingly came
00:40:28.800 here and took
00:40:29.720 a big risk to
00:40:30.520 get here, at
00:40:31.500 the very least,
00:40:32.580 cognitive dissonance
00:40:33.660 would make you
00:40:34.220 think you were
00:40:34.820 in the country
00:40:35.340 you wanted to
00:40:35.900 be in.
00:40:37.100 So, number
00:40:37.800 one, one of
00:40:40.300 my biggest
00:40:41.120 problems with
00:40:42.360 trying to be
00:40:43.220 tough on
00:40:44.020 border security,
00:40:46.080 because I think
00:40:46.680 rationally you
00:40:47.520 have to have a
00:40:48.480 good border
00:40:48.940 security.
00:40:49.760 I just think
00:40:50.200 you have to.
00:40:50.980 Now, how much
00:40:51.520 you open it is
00:40:52.420 a separate
00:40:52.800 question, but
00:40:53.840 you have to
00:40:54.260 control it
00:40:54.900 first.
00:40:55.600 That's my
00:40:56.100 view.
00:40:58.060 And anybody
00:40:58.880 who's willing
00:41:00.260 to take the
00:41:01.200 risks and
00:41:02.060 they're ambitious
00:41:02.900 enough and
00:41:04.040 they want to
00:41:04.660 live in this
00:41:05.160 country with
00:41:05.900 me, it's
00:41:07.380 hard to not
00:41:08.080 like them.
00:41:09.060 It really is.
00:41:10.520 If somebody
00:41:11.120 says, just
00:41:12.220 walks up to you
00:41:12.820 and says, you
00:41:13.140 know, I really
00:41:13.620 like you.
00:41:14.120 I really
00:41:15.020 like you.
00:41:15.840 In fact, I'd
00:41:16.480 like to spend
00:41:16.940 more time around
00:41:17.780 you and
00:41:19.020 people like
00:41:19.680 you.
00:41:20.860 That's how
00:41:21.480 much I like
00:41:22.080 you and
00:41:22.540 people like
00:41:23.080 you.
00:41:24.120 How do you
00:41:24.880 dislike that
00:41:25.580 person?
00:41:27.360 And to a
00:41:28.380 large extent,
00:41:29.180 that's what
00:41:29.800 immigrants tell
00:41:30.640 us without
00:41:31.280 saying it.
00:41:32.620 They're saying
00:41:33.160 they're voting
00:41:33.920 with their feet
00:41:34.660 and their
00:41:35.040 lives.
00:41:36.060 They're risking
00:41:36.700 their lives,
00:41:38.400 risking their
00:41:39.760 lives to be
00:41:41.960 more like
00:41:42.420 me and
00:41:44.040 to be
00:41:44.320 closer to
00:41:44.880 me.
00:41:45.840 That's not
00:41:46.460 the purpose,
00:41:47.660 of course,
00:41:48.220 but it works
00:41:48.900 out that way.
00:41:51.420 From an
00:41:52.800 economic
00:41:53.260 perspective and
00:41:54.360 safety and
00:41:55.280 all that,
00:41:56.060 you have to
00:41:56.600 control
00:41:56.940 immigration.
00:41:58.740 But I
00:41:59.540 can't feel
00:42:00.120 bad about
00:42:00.760 it.
00:42:01.640 I can't
00:42:03.260 feel bad
00:42:03.780 about Hispanics
00:42:04.720 wanting to
00:42:05.280 be part of
00:42:06.020 what I'm
00:42:06.960 part of.
00:42:09.720 So I
00:42:10.200 think that
00:42:10.500 makes sense.
00:42:11.020 And that
00:42:13.240 population
00:42:13.820 seems mostly
00:42:14.900 interested in
00:42:15.660 work.
00:42:17.920 They're less
00:42:18.840 interested in
00:42:19.600 complaining and
00:42:21.000 they're more
00:42:21.500 interested in
00:42:22.180 work.
00:42:23.920 I can't
00:42:26.260 feel bad
00:42:26.640 about that.
00:42:29.820 So there
00:42:31.000 we go.
00:42:31.360 I think
00:42:31.800 everybody thinks
00:42:32.540 that their
00:42:32.900 group is the
00:42:33.560 one that has
00:42:34.080 the most
00:42:34.380 bias, so
00:42:35.100 that just
00:42:35.400 makes sense.
00:42:37.340 All right.
00:42:39.280 So Biden's coming up with this new $3 trillion infrastructure proposal on top of all the trillions we've spent already.
00:42:47.640 And here's the thing.
00:42:52.400 Do we understand anything about national debt?
00:42:56.500 Like anything?
00:42:59.160 You know, I've been on the side who said that during the pandemic you should just pick the biggest number and get that money out there, because the risk of the whole system falling over is much bigger than the risk of adding 20% to the debt, or whatever we're going to add.
00:43:18.320 And so it made sense during the pandemic.
00:43:21.360 But once you get to spending outside the focus of the pandemic, to just stuff we need (infrastructure, clean energy, et cetera), shouldn't there be some kind of limit on that?
00:43:37.680 Shouldn't there be some kind of limit on how much we spend?
00:43:41.440 Or at the very least, shouldn't our leadership say, look, here's the deal: we're going to spend trillions of dollars on stuff that we think we need, and there'll be good investments, et cetera, but here's how we're going to pay back the debt.
00:43:59.180 I feel as if they need to tell us that: whether they can, or how much risk there is, or whatever.
00:44:05.980 Because I'm not saying that we shouldn't spend this money.
00:44:09.260 I'm just saying it's an incomplete plan.
00:44:11.900 It's half a plan.
00:44:13.500 Half a plan says we're going to spend this much money.
00:44:16.860 The other part of the plan is: and we're going to pay it back?
00:44:22.760 And if we didn't pay it back, what does that do?
00:44:27.740 What happens to inflation?
00:44:29.300 What happens to taxes?
00:44:31.480 So I believe that we, the country, will be presented with this ginormous $3 trillion jobs and infrastructure proposal, and that our news industry will not insist on a complete picture.
00:44:48.680 A complete picture that explains: how much debt can the United States take on? How are we going to pay it back?
00:44:55.900 Now again, during a pandemic, you don't have the luxury of asking those questions.
00:45:00.460 But we certainly have the luxury now.
00:45:03.400 I mean, it's the end of the pandemic-ish.
00:45:05.800 But we should be asking that.
00:45:07.860 But that won't happen.
00:45:09.360 And that is a gigantic failure of the press.
00:45:12.960 And you can predict it.
00:45:14.520 You don't even have to wonder if that will happen.
00:45:17.080 There's no way in hell we will ever get any good information on this question of: do we even need to pay back this debt?
00:45:24.720 Because national debt is different from personal debt.
00:45:27.040 You can keep national debt forever.
00:45:30.380 Personal debt you kind of have to pay off or you're in trouble.
00:45:34.240 All right.
00:45:35.580 I was asked on Twitter what I thought about this effort to study the origins of the virus.
00:45:42.140 And apparently there's some science that you can apply, if you have enough samples from the right places.
00:45:47.820 You can apply your science and maybe you could find out the origin of the virus.
00:45:53.340 To which I say, that sounds like a complete waste of time.
00:45:58.660 Here's why.
00:46:01.000 What is the most predictable thing that you could say if there's a team of scientists studying the origin of the virus?
00:46:10.120 The most predictable outcome, I would say with 100% certainty, is that if they were to come up with a decision and say, okay, it started in Wuhan or it started somewhere else, no matter what they said, and no matter how much they showed you their data and how they got there, there would be other scientists who say they studied it wrong.
00:46:34.540 100% chance.
00:46:37.220 So what would you do as a citizen, or as a leader, if you knew that one group of scientists said, yeah, we studied it and it came from a banana in Switzerland, and another group of scientists said, we just looked at your data and it doesn't look right, here's why?
00:46:56.660 Now it's your job to follow the science.
00:46:59.880 What do you do?
00:47:01.980 What do you do?
00:47:03.420 Do you look at the work and say, all right, let me dig into this a little bit?
00:47:07.360 Let me look into the work of how these scientists came up with this idea of where it came from.
00:47:13.760 I'll look at their statistics that I don't understand, I'll look at the genetics that I don't understand, and then what?
00:47:21.100 Am I going to decide who's right?
00:47:24.180 No.
00:47:24.980 This is the problem with the dumbest thing anybody's ever said, which is "trust science."
00:47:31.940 Trust science is one of the dumbest pieces of advice.
00:47:36.820 After "be yourself." Maybe the worst advice in the world is "be yourself."
00:47:41.460 You should try to be better than yourself.
00:47:44.260 You should try to raise your game.
00:47:46.580 Being yourself is the worst advice.
00:47:51.720 But "follow the science" doesn't work, because we don't know what the science is.
00:47:57.680 We don't see it.
00:47:59.680 It would be like replacing the sentence "follow the science" with "follow the magic leprechaun."
00:48:09.200 Whatever the magic leprechaun tells you to say, do that.
00:48:14.240 Now, that wouldn't be easy, would it?
00:48:16.200 Because you don't know where to find a magic leprechaun.
00:48:21.240 And if you did, you wouldn't believe him, because it's a magic leprechaun.
00:48:26.160 Not very credible.
00:48:28.440 So, you could tell us to follow the science all day long, but we can't.
00:48:33.900 That's not a thing.
00:48:35.620 All we can do is hear what people say.
00:48:39.820 Noise is not science.
00:48:43.440 What you hear when the experts are talking, the scientists, is noise.
00:48:48.100 Noise is not science.
00:48:50.480 Then your brain does this thing where the noise is translated into thoughts, the thoughts are formed into opinions, and then you say to yourself, oh yeah, I guess I just heard some science.
00:49:01.740 I guess I'll follow the science.
00:49:04.700 You never came close to science.
00:49:08.360 What happened to you was that some noise, some vibration of air, entered your ear, and then your brain did something with it.
00:49:17.260 You are so far from science.
00:49:21.840 It's like science is in a sealed lead container on the other side of the universe.
00:49:27.520 It might be in there, but you can't get to it, and you can't see it, and nobody who is credible can tell you what's in there.
00:49:37.000 That's it.
00:49:39.180 Now, science, of course, is better than anything else we have, but the whole fetish of trusting science is for stupid people, if I can say it as plainly as that.
00:49:54.740 "Follow the science" is for stupid people, just like "be yourself."
00:50:01.320 It's advice for stupid people.
00:50:02.900 It's just the worst advice.
00:50:04.900 All right.
00:50:07.000 There was a nursing home in New York State that burned up, and some people were trapped inside.
00:50:14.200 I didn't see the injury and death count.
00:50:17.260 Well, let me tell you, if there's one place you don't want to be on this whole earth, it would be a nursing home in New York, because nothing good happens in those places, apparently.
00:50:30.640 Rasmussen, the polling people, say that 58% say that masks should be required until everyone is vaccinated.
00:50:42.060 58%.
00:50:42.700 So a nice, fairly solid majority say wear masks until everyone is vaccinated.
00:50:50.400 What do you think of that?
00:50:55.700 Of course, it breaks down by political lines.
00:50:58.760 76% of Democrats say you should wear your masks until everybody's vaccinated, but nearly half of Republicans, 46%.
00:51:06.740 So nearly half of Republicans say the same thing.
00:51:11.600 I know my audience is more anti-mask, but here's my take on this.
00:51:21.120 As much as I think masks were a good risk management decision, we now have enough data to see that it doesn't seem to move the curve.
00:51:32.880 And if it doesn't move the curve, it doesn't mean they don't work.
00:51:36.440 It just means that maybe you can't see the effect because the other variables are just bigger.
00:51:43.660 And I think at some point the death rate is low enough that even if masks work, you shouldn't wear one, because it does lower your quality of life quite a bit.
00:51:57.840 So I think you should drop the masks before it's safe.
00:52:05.920 You know what I mean?
00:52:07.240 So even if masks help, you should probably drop them sooner rather than later, just because it's not that big of an effect and the citizens want some freedom.
00:52:19.640 We'd like some freedom.
00:52:22.000 All right, Oakland is going to do a big UBI test, the universal basic income.
00:52:27.720 So basically some group of people in Oakland, not all of them, but some group, will be selected to get free money, and then they can do whatever they want.
00:52:38.700 They just give free money.
00:52:40.660 What do you think of that?
00:52:42.540 Well, I love an experiment.
00:52:45.980 I think I've told you this before.
00:52:47.720 So I am 100% in favor of this experiment.
00:52:52.100 It's Oakland's money.
00:52:53.240 They want to spend it on this.
00:52:55.120 Okay.
00:52:56.960 But I've often said that the difference between the Democrats and Republicans is that Republicans consistently understand human motivation, whereas Democrats consistently act like it's not even a thing.
00:53:11.780 As if, when you reward people for stuff, they won't do more of it.
00:53:18.340 Weird.
00:53:19.260 Or that if you penalize somebody for something, they won't do less of it.
00:53:23.500 Whereas Republicans think, no, if you reward people, they'll do more of it.
00:53:28.760 If you punish them, they'll do less of it.
00:53:30.240 It's never different.
00:53:32.080 Friction works every time.
00:53:34.540 Not 100%, but it works at least a little every time.
00:53:40.200 So testing this UBI makes sense, but what would you expect to come out of the test?
00:53:46.260 Here's what I expect.
00:53:49.240 People who are intrinsically motivated will very quickly not need it, because they'll go get their own work, etc.
00:53:58.940 And people who are not intrinsically motivated will just spend the money, and it won't make any difference.
00:54:04.740 And I feel as though the real way we should be thinking of this is that modern society has evolved to the point where it's so complicated and hard just to live and navigate our complicated world that a huge percentage of the population will never be able to work, never be able to thrive in civilization as it has evolved.
00:54:31.720 So there's going to be some percentage, maybe 20%, who are going to do great because they can handle complexity; they can find all the advantages and strategic things and put it together.
00:54:45.480 And if you're unusually smart and capable and ambitious, this is a really good time to be alive.
00:54:51.060 But if you happen to be just a person who wants to kind of get by in life, you are in the wrong civilization, because you have to work to get by, and if you're just not a worker, it's going to be tough.
00:55:06.820 So I would expect that the universal basic income would keep unambitious people unambitious; they would continue to be that way, whereas they might otherwise have been forced to do something productive.
00:55:19.760 So I'm going to say two things about this.
00:55:25.840 One is that we need a second path for human beings.
00:55:30.320 One path is the common one that we see now, which is, if you go to school and you're smart enough and you're not addicted to too many things, you can probably make a nice life for yourself.
00:55:43.300 But I feel as if we need to create a path for people who will never be competitive or successful in such a complicated civilization.
00:55:56.840 That's going to be a lot of people.
00:55:58.740 It could be that a third of the public just won't be able to work or succeed when drugs are freely available and temptations are all over the place.
00:56:11.260 So I think we need this second path, which would require a lower cost of living.
00:56:16.520 Maybe you're building new communities from green fields, but basically a tribal living situation where everybody can do a little bit to contribute.
00:56:29.940 They can chip in a little bit, but they don't have to be a real go-getter, and they don't have to be able to thrive in the complicated world, as long as they can do well and help out in a sort of tribal, more cooperative environment.
00:56:44.600 So when you hear stuff like that, you say to yourself, my God, Scott, that's communism, keep that away from me.
00:56:52.100 To which I say, maybe people are different.
00:56:55.580 Personally, I would not want to be anywhere near communism, because I feel I can thrive better if I compete, and I feel I have some competitive advantages, so I'd be better off just competing.
00:57:08.060 But there are a lot of people who can't compete, and we just have to face that, because it's just real.
00:57:14.600 A lot of people just can't compete in the modern world.
00:57:17.680 They need a path where they can have a reasonable life; it just costs less, so they don't have as many expenses.
00:57:24.420 Maybe they don't have cars, because you build a community where you just don't need one, so they don't have to pay for the car.
00:57:30.820 All right, so anyway, lowering the cost of living for the second path, it has to happen.
00:57:39.440 It has to happen.
00:57:41.400 And then the other thing, if you want to get really controversial: shouldn't we only be comparing ethnicities by happiness?
00:57:52.240 Suppose you were to do a poll and you found out that black Americans were just as happy as Asian Americans, or anybody else, white Americans.
00:58:04.600 Do you need to fix that?
00:58:09.520 So let's say one group has more money and more advantages, and that difference is real, and it's caused by bias, it's caused by racism.
00:58:18.560 Let's say that's real, but what if both sides are just as happy?
00:58:23.740 Isn't happiness the point?
00:58:26.600 If you had two groups that are happy, and let's say one person has a million dollars and the other person has a hundred thousand dollars (a very artificial example), but they're just as happy, what would be the argument for taking the money from the rich person and giving some amount of it to the less rich person?
00:58:48.620 What would be the argument for that?
00:58:50.700 Like, why does that need to be balanced when the point of life is happiness and they're both just as happy?
00:58:57.100 So, I would say that looking at economics and trying to balance it, we're just looking at the wrong things.
00:59:07.180 I do like, you know, economic equality; I'm all for it, but I'm not sure that's the point of life.
00:59:15.040 Elon Musk said something provocative.
00:59:19.920 He tweeted a little meme; what he tweeted was the strongest argument against aliens, meaning against the idea that aliens are visiting in their spaceships, and he showed two graphs.
00:59:34.760 One showed that camera resolution has been improving greatly in recent years, especially recently, but the photographs of UFOs are still low definition, which is kind of funny.
00:59:50.900 How much further into the future can we go where every photograph is really, really clear, you know, and everybody's got a camera, at the same time that every photograph of an alien spaceship is a grainy smudge?
01:00:06.620 You know, at the moment, you can say to yourself, okay, it's because the spaceship is moving fast, and, yeah, maybe it's up in the clouds, and maybe it's got some alien technology that keeps it from being photographed or something, I don't know.
01:00:22.780 But the longer you go, the more it becomes a little obvious that somebody would have gotten at least one good picture, at least one.
01:00:31.860 So that was Elon Musk's provocative tweet today, or yesterday.
01:00:39.580 The AstraZeneca vaccine everybody's talking about: the company itself says they did some more studies and found a 79% overall efficacy, and 100% success in keeping people out of severe disease and hospitalization.
01:00:58.200 100%.
01:00:58.800 Do you believe that?
01:01:01.020 Do you believe that there's a vaccination that is 100% effective at preventing hospitalizations and severe disease?
01:01:10.660 No.
01:01:11.780 No.
01:01:13.560 No, you should not believe that.
01:01:15.260 It could be close to 100%, which would be cool.
01:01:21.080 But of course, I tweeted this because it looked like good news.
01:01:24.840 It took 10 seconds for somebody else to say, but what about all these other stories?
01:01:31.420 For example, the Wall Street Journal is reporting that U.S. officials said they were told that AstraZeneca may have released outdated information.
01:01:41.220 What?
01:01:41.820 In its disclosure of trial results.
01:01:44.560 What?
01:01:45.260 And they may have provided an incomplete view of the efficacy data.
01:01:52.320 What?
01:01:53.880 Now, do you see the problem with following the science?
01:01:57.480 Which science?
01:01:59.280 You've got two sciences here.
01:02:01.460 One science says, yeah, this stuff's good and it works.
01:02:05.360 And the other science says, we don't know if it works.
01:02:09.260 And then there's yet other science that says it causes blood clots.
01:02:12.660 But then there's yet other science that says it doesn't.
01:02:17.280 So trust the science.
01:02:18.640 It's on both sides.
01:02:20.380 It works and it doesn't work, according to science.
01:02:24.500 So I don't have an opinion on whether this is a good vaccine or not, but I'd probably take it.
01:02:33.600 I'd probably take it.
01:02:34.540 Breaking news
01:02:37.600 that senior
01:02:38.440 citizens are
01:02:39.340 dying, and
01:02:40.720 they're dying
01:02:41.740 within days
01:02:42.540 of doing,
01:02:44.900 it turns out,
01:02:45.400 anything.
01:02:46.540 Anything.
01:02:47.240 So senior
01:02:47.720 citizens are
01:02:49.000 dropping dead
01:02:49.760 all over the
01:02:50.400 world within
01:02:51.940 just a few
01:02:52.660 days of
01:02:54.720 having done,
01:02:56.060 well,
01:02:56.620 anything.
01:02:58.020 Watching TV,
01:03:00.580 eating an
01:03:01.080 apple,
01:03:02.240 getting a
01:03:02.620 vaccination.
01:03:03.120 Yeah,
01:03:04.760 pretty much
01:03:05.240 senior
01:03:05.960 citizens are
01:03:06.780 dying after
01:03:07.600 everything.
01:03:10.420 And so
01:03:10.860 the EU is
01:03:13.840 considering a
01:03:14.640 ban on all
01:03:16.080 human activity.
01:03:17.940 Because if
01:03:19.020 people are
01:03:19.520 dying within
01:03:20.300 days after
01:03:21.300 every human
01:03:22.220 activity, I
01:03:24.160 don't think we
01:03:24.920 can continue
01:03:25.680 having human
01:03:26.440 activities.
01:03:28.000 People are
01:03:28.780 dying within
01:03:29.520 days all
01:03:31.100 over the
01:03:31.540 world, and
01:03:32.080 it's not
01:03:32.460 stopping.
01:03:33.120 Do you
01:03:36.740 know why
01:03:37.200 there
01:03:38.260 are senior
01:03:38.620 citizens
01:03:39.140 dying after
01:03:41.700 human
01:03:42.160 activities?
01:03:43.580 There is a
01:03:44.360 cause.
01:03:45.720 It turns out,
01:03:46.740 and this is
01:03:47.120 science too,
01:03:48.240 I looked into
01:03:49.020 this, did some
01:03:49.520 research, it
01:03:50.480 turns out that
01:03:50.960 the number of
01:03:51.480 people who
01:03:51.820 die after
01:03:53.700 they've already
01:03:54.360 died? Zero.
01:03:57.860 Or at least I
01:03:58.660 couldn't find
01:03:59.100 any.
01:03:59.580 I mean, maybe a
01:04:00.440 zombie here or
01:04:01.020 there.
01:04:01.160 But it
01:04:01.800 turns out
01:04:02.180 that only
01:04:03.440 people who
01:04:04.040 are already
01:04:04.680 alive are
01:04:06.820 dying at
01:04:07.380 all.
01:04:08.480 People who
01:04:09.180 are already
01:04:09.740 dead, the
01:04:11.240 death rate for
01:04:11.940 them, the
01:04:12.420 additional death
01:04:13.120 rate of being
01:04:14.420 dead and then
01:04:15.080 being more
01:04:15.660 dead? Zero.
01:04:17.160 So exactly
01:04:20.160 zero dead
01:04:21.060 people have
01:04:21.640 died after
01:04:23.320 getting this
01:04:23.840 vaccination.
01:04:25.900 It's good to
01:04:26.780 know.
01:04:28.180 But a lot of
01:04:29.100 people who
01:04:30.340 are alive have
01:04:31.820 died after
01:04:32.640 literally
01:04:33.840 everything.
01:04:35.020 Riding a
01:04:35.620 bike, taking
01:04:37.180 a shower,
01:04:39.020 died within
01:04:39.600 two weeks.
01:04:40.740 Tragic.
01:04:45.520 All right,
01:04:46.580 so that is
01:04:50.100 my show for
01:04:50.720 today.
01:04:51.080 Ainsley
01:04:57.360 says,
01:04:57.860 careful, you're
01:04:59.000 prescribing
01:04:59.520 genocide to
01:05:01.400 cure death.
01:05:04.760 That explains
01:05:05.660 why Biden
01:05:06.240 isn't doing
01:05:06.860 anything.
01:05:08.600 Oh, yeah, so
01:05:09.520 there's a story
01:05:10.140 that says Biden
01:05:10.920 and Harris are
01:05:11.720 traveling together.
01:05:16.920 Here's what I
01:05:16.920 think is
01:05:17.500 happening.
01:05:18.660 Now, the
01:05:19.100 thinking is that
01:05:19.920 the reason for
01:05:20.880 that is that
01:05:21.500 she's learning
01:05:22.480 on the job and
01:05:23.420 she's going to
01:05:23.860 take over any
01:05:24.500 minute.
01:05:25.060 I think that
01:05:25.820 that is part
01:05:26.440 of it.
01:05:27.000 I do think
01:05:27.560 that they're
01:05:27.960 giving her
01:05:28.560 more presidential
01:05:31.180 exposure because
01:05:32.500 they think she's
01:05:33.680 likely to take
01:05:34.540 over before his
01:05:35.300 turn is over.
01:05:36.260 I feel like
01:05:36.880 everybody believes
01:05:37.640 that, right?
01:05:38.560 But I think
01:05:39.220 there might be a
01:05:39.940 second explanation
01:05:40.960 and I'm going
01:05:42.260 to use the
01:05:42.960 science of body
01:05:44.060 language for
01:05:45.900 this hypothesis.
01:05:47.600 And when I say
01:05:48.660 the science of
01:05:49.440 body language,
01:05:50.220 I mean a lot
01:05:51.240 of it's guessing.
01:05:52.660 But if you saw
01:05:53.920 there's a video
01:05:54.460 I was watching
01:05:54.940 this morning of
01:05:56.220 Biden and
01:05:56.900 Harris, they were
01:05:57.500 traveling together
01:05:58.260 and they got off
01:05:59.200 or they were
01:05:59.580 getting on or
01:06:00.160 off a helicopter
01:06:00.960 or something and
01:06:01.940 they were chatting
01:06:02.400 with some other
01:06:03.140 official looking
01:06:03.820 people.
01:06:05.080 And on a
01:06:06.240 number of
01:06:06.620 occasions you saw
01:06:08.200 Kamala Harris
01:06:09.760 put her hands
01:06:10.700 on Biden.
01:06:12.800 Have you seen
01:06:13.260 that video?
01:06:14.240 So she'll just
01:06:14.760 sort of, you
01:06:15.420 know, she'll be
01:06:16.260 talking and then
01:06:17.020 she'll pat him
01:06:17.520 on the shoulder
01:06:18.100 or she'll just
01:06:19.400 lightly touch his
01:06:20.480 back while she's
01:06:21.220 talking to him.
01:06:22.900 Watch how often
01:06:23.820 she touches him.
01:06:26.220 What's that mean?
01:06:28.140 Well, if it were
01:06:29.020 the other way
01:06:29.760 around it would
01:06:30.440 be sexually
01:06:31.920 inappropriate.
01:06:33.380 But when a
01:06:33.980 woman in a
01:06:36.000 business setting
01:06:36.940 touches a man,
01:06:39.520 it's fine,
01:06:40.000 because, well,
01:06:41.160 the man doesn't
01:06:41.740 care, so he
01:06:42.520 actually is fine.
01:06:44.000 And, but watch
01:06:45.340 how often she
01:06:46.040 touches him.
01:06:47.420 Here's what I
01:06:48.080 think is going
01:06:48.620 on, just a
01:06:49.720 hypothesis.
01:06:50.960 I believe that
01:06:52.020 Joe's mental
01:06:54.120 state is
01:06:54.660 deteriorating.
01:06:56.540 And I believe
01:06:57.480 that he needs
01:06:58.460 people that he
01:06:59.460 has a certain
01:07:01.000 level of trust
01:07:02.400 in to keep
01:07:05.220 him on the
01:07:05.700 right track.
01:07:06.340 And I believe
01:07:07.720 that Kamala
01:07:08.220 Harris's real
01:07:09.100 job is a
01:07:10.180 little bit like
01:07:10.780 a nurse.
01:07:11.880 And I mean
01:07:12.620 that literally.
01:07:13.720 Meaning that I
01:07:14.660 believe she's
01:07:15.320 there to tell
01:07:17.200 him things that
01:07:18.020 other people can't
01:07:18.900 tell him.
01:07:20.060 Because he's not
01:07:20.900 quite, he's not
01:07:22.380 quite mentally
01:07:23.100 capable.
01:07:24.120 So I think a lot
01:07:24.840 of people are
01:07:25.240 saying, Kamala,
01:07:26.640 could you tell
01:07:27.560 him he needs
01:07:28.080 to do this?
01:07:29.400 And then maybe
01:07:29.980 she's the only
01:07:30.560 one he trusts.
01:07:31.920 Because he's got a,
01:07:32.720 he's got a history
01:07:33.420 with her and she
01:07:34.320 had a relationship
01:07:34.960 with, you know,
01:07:36.000 Beau, et cetera.
01:07:37.400 So, yeah, I'm
01:07:39.400 not sure a
01:07:39.940 handler is exactly
01:07:41.160 the right word.
01:07:42.020 Although there
01:07:42.520 might be a little
01:07:43.020 of that.
01:07:43.660 I believe that
01:07:44.440 she's there as
01:07:45.260 an emotional,
01:07:46.440 mental, let's
01:07:48.400 say, support.
01:07:50.240 And I believe
01:07:51.080 she's there to
01:07:51.700 like maybe step
01:07:52.500 in if things,
01:07:53.540 things look a
01:07:54.740 little weird.
01:07:55.740 That's my
01:07:56.360 guess.
01:08:02.300 Yeah, and
01:08:03.020 his wife serves
01:08:05.120 a similar role.
01:08:06.460 I would say
01:08:06.860 that, yes.
01:08:09.940 Somebody's
01:08:10.380 saying she
01:08:11.020 prompts him
01:08:11.740 and helps him
01:08:12.620 finish sentences.
01:08:13.880 Maybe.
01:08:14.380 I haven't seen
01:08:15.000 that, but it
01:08:15.900 wouldn't surprise
01:08:16.420 me.
01:08:18.200 And then I
01:08:18.620 guess there's
01:08:19.020 a story of
01:08:19.640 Kamala Harris
01:08:20.400 was asked about
01:08:21.100 the crisis on
01:08:22.020 the border and
01:08:23.160 she, was she
01:08:25.200 going to go
01:08:25.620 there and she
01:08:26.340 laughed and
01:08:27.120 said, not
01:08:27.520 today.
01:08:28.560 And people
01:08:28.980 said, it's
01:08:31.060 not funny.
01:08:31.760 Why are you
01:08:33.440 laughing?
01:08:34.520 But of course
01:08:35.360 the laugh was
01:08:36.500 about not
01:08:37.640 today.
01:08:38.660 It wasn't
01:08:38.980 about the
01:08:39.720 situation, etc.
01:08:40.960 But it turns
01:08:41.560 into a news
01:08:42.100 story that
01:08:43.040 makes her
01:08:43.460 look bad.
01:08:47.480 So, yeah,
01:08:49.180 Kamala Harris
01:08:49.640 does have a
01:08:50.240 problem that
01:08:50.760 she can't
01:08:51.340 stop laughing
01:08:52.840 when she's
01:08:53.380 talking about
01:08:53.840 a tragedy.
01:08:54.880 I don't feel
01:08:55.800 bad about that
01:08:56.460 because I've
01:08:56.840 got the same
01:08:57.360 problem.
01:08:58.680 Sometimes I
01:08:59.440 laugh at
01:08:59.900 tragedies.
01:09:00.560 I know I
01:09:01.840 shouldn't, but
01:09:03.020 I do.
01:09:08.880 Oh, do we
01:09:09.500 know the
01:09:09.920 name of the
01:09:10.400 shooter?
01:09:11.460 I'm not going
01:09:12.080 to say that
01:09:12.520 name out loud,
01:09:13.880 but somebody
01:09:14.420 in the comments
01:09:15.040 says that we
01:09:15.980 know the name
01:09:16.700 of the
01:09:16.940 shooter.
01:09:18.540 But I don't
01:09:19.240 want to say
01:09:19.620 the name because
01:09:20.980 that would be...
01:09:20.980 let's just say
01:09:23.300 there would be
01:09:23.660 some racial
01:09:24.160 bias if you
01:09:24.920 heard the
01:09:25.200 name.
01:09:26.000 At least the
01:09:26.440 one that was
01:09:26.800 in the comments.
01:09:27.480 I don't know
01:09:27.740 if that's the
01:09:28.180 real name.
01:09:29.500 All right,
01:09:29.800 that's all I
01:09:30.180 got, and I
01:09:30.960 will talk to
01:09:31.420 you tomorrow.
01:09:36.460 Well, if I
01:09:37.180 can turn this
01:09:37.680 off, I
01:09:38.000 will.
01:09:40.340 All right,
01:09:40.980 YouTubers.
01:09:47.580 I'm just
01:09:48.120 looking at your
01:09:48.740 comments.
01:09:49.720 So it does
01:09:50.320 look like there's
01:09:50.840 some news about
01:09:51.560 the shooter that
01:09:52.420 we're hearing.
01:09:52.920 I guess there's a
01:09:53.940 name out there
01:09:54.500 now.
01:09:56.320 Do we suffer
01:09:57.140 brain damage
01:09:57.860 getting our
01:09:58.320 news from you?
01:09:59.000 No, your
01:10:00.100 brain is
01:10:00.600 healthier because
01:10:01.300 you got your
01:10:01.780 news from me
01:10:02.460 because I'm
01:10:03.460 not partisan.
01:10:05.600 It might seem
01:10:06.160 like it, but
01:10:07.320 I'm not.
01:10:11.680 BitClout.
01:10:12.980 I haven't
01:10:13.780 checked my
01:10:14.500 value on
01:10:17.340 BitClout.
01:10:19.160 I need to
01:10:19.880 check that.
01:10:20.800 So yeah, you
01:10:21.280 can buy an
01:10:22.220 NFT of me,
01:10:24.640 I guess, on
01:10:25.660 something called
01:10:26.280 BitClout.
01:10:27.540 All right,
01:10:39.200 my coin is in
01:10:40.060 the top ten,
01:10:41.300 Christian says.
01:10:42.800 Yeah, I'll go
01:10:43.200 check that.
01:10:47.400 I have not
01:10:48.220 read that book
01:10:49.600 by Rod.
01:10:49.600 I'm reading
01:10:58.100 your comments.
01:10:58.660 If you're
01:10:58.900 listening to
01:10:59.320 this on a
01:10:59.760 podcast, I'm
01:11:00.580 sorry that you
01:11:02.120 can't see the
01:11:02.680 comments.
01:11:03.700 By the way, do
01:11:04.460 you know that if
01:11:05.000 you have a
01:11:05.580 digital device,
01:11:07.600 the one made
01:11:08.260 by Amazon or
01:11:09.380 the one that's
01:11:10.720 built into your
01:11:11.580 Apple phone, I
01:11:13.080 don't want to say
01:11:13.580 their names because
01:11:14.540 I don't want to
01:11:15.000 activate your
01:11:15.520 device, but you
01:11:16.660 can say to
01:11:17.260 them, device
01:11:18.820 name, which
01:11:20.260 you know, and
01:11:21.440 then say, play
01:11:23.420 coffee with Scott
01:11:24.640 Adams, and my
01:11:27.860 podcast will just
01:11:28.840 pop right up.
01:11:31.500 Have I heard of
01:11:32.380 Decentral Land?
01:11:33.520 No, but I like
01:11:34.320 it.
01:11:35.200 Just based on the
01:11:36.020 name, I think I
01:11:36.720 would like that.
01:11:38.840 Have we been
01:11:39.600 hypnotized into
01:11:40.500 watching you every
01:11:41.240 day?
01:11:41.600 Yes, you have.
01:11:43.080 That's what the
01:11:43.760 simultaneous sip
01:11:44.520 does.
01:11:46.660 Glad to hear
01:11:51.900 you've come to
01:11:52.380 your senses on
01:11:53.120 masks probably
01:11:53.980 having no... well,
01:11:55.380 you were wrong.
01:11:56.520 You're wrong.
01:11:57.780 So the thing
01:11:58.880 that I'll have to
01:11:59.600 explain for the
01:12:00.320 rest of my life
01:12:01.220 is risk
01:12:02.760 management.
01:12:04.340 When the
01:12:05.020 decision of
01:12:05.740 hydroxychloroquine
01:12:06.840 came up early
01:12:07.680 on, I said,
01:12:09.640 the science does
01:12:10.360 not prove it
01:12:11.060 works, but as a
01:12:12.060 risk management
01:12:12.920 decision, you
01:12:14.500 might want to
01:12:15.000 try it,
01:12:15.520 because the
01:12:16.800 downside is
01:12:17.940 low.
01:12:19.000 Certainly try it
01:12:19.920 in one state or
01:12:20.980 one country or
01:12:21.760 something to see
01:12:22.280 if it makes a
01:12:22.740 difference.
01:12:23.780 That doesn't
01:12:24.600 mean that I am
01:12:25.760 right or wrong
01:12:26.800 about whether that
01:12:28.080 drug works.
01:12:30.280 And at least
01:12:31.640 75% of the
01:12:33.040 population can't
01:12:33.860 understand that
01:12:34.500 point.
01:12:35.380 When you make a
01:12:36.240 risk management
01:12:37.240 decision, the
01:12:38.680 decision can be
01:12:39.720 right even when
01:12:42.180 the outcome is
01:12:42.860 wrong.
01:12:43.820 Do you get
01:12:44.260 that?
01:12:45.180 So as it is
01:12:46.220 with masks,
01:12:47.520 since the
01:12:48.060 science was
01:12:48.880 undependable,
01:12:50.800 and we didn't
01:12:51.360 know how it
01:12:52.560 applied to this
01:12:53.260 new situation,
01:12:54.500 we wouldn't
01:12:55.260 know.
01:12:56.540 We wouldn't
01:12:57.040 know.
01:12:57.760 And we
01:12:58.400 wouldn't trust
01:12:58.940 the data that
01:12:59.680 said it worked
01:13:00.240 or that it
01:13:00.740 didn't work,
01:13:01.540 because nothing
01:13:02.060 is dependable
01:13:02.740 data-wise.
01:13:04.840 But common
01:13:05.920 sense says that
01:13:07.540 if the amount
01:13:08.120 of the viral
01:13:08.840 shedding matters,
01:13:10.600 if you get
01:13:11.860 more viral
01:13:12.760 exposure, you
01:13:13.880 get more
01:13:14.380 sick, then
01:13:15.660 the masks, if
01:13:16.580 they reduce
01:13:17.760 moisture coming
01:13:18.580 out of your
01:13:18.900 mouth, and
01:13:19.400 they do, and
01:13:20.640 moisture is what
01:13:21.440 carries the
01:13:21.960 virus and it
01:13:22.580 does, it
01:13:23.900 makes sense that
01:13:24.620 you'd spread less
01:13:25.740 of it.
01:13:27.100 But it is
01:13:28.340 true that we're
01:13:30.220 not seeing it in
01:13:31.040 the data.
01:13:32.640 I've not seen
01:13:33.580 any data that
01:13:34.280 would suggest
01:13:34.860 that mask
01:13:36.060 wearing by
01:13:37.740 itself makes
01:13:39.240 any difference
01:13:39.800 at all.
01:13:40.580 I still think
01:13:41.260 it probably
01:13:41.820 does, but I
01:13:44.060 would admit that
01:13:45.340 we don't see it
01:13:45.980 in the data.
01:13:47.500 That does seem
01:13:48.200 to be the
01:13:49.100 fact.
01:13:50.060 Now, years
01:13:51.120 from now, will
01:13:51.780 we know for
01:13:52.320 sure if masks
01:13:53.840 worked?
01:13:54.960 Maybe.
01:13:55.780 We might find a
01:13:56.380 way to test it
01:13:57.160 better or
01:13:57.660 something.
01:13:58.600 But I'm
01:13:59.640 going to guess
01:14:00.240 that it works
01:14:01.580 a little.
01:14:02.880 It might not
01:14:03.640 work enough that
01:14:04.640 we would be
01:14:05.440 happy to use
01:14:06.640 them in the
01:14:07.080 next pandemic.
01:14:08.100 I don't know.
01:14:09.240 So for those
01:14:11.880 of you who
01:14:12.320 are mistakenly
01:14:13.260 imagining that
01:14:14.080 my opinion
01:14:14.600 went from
01:14:15.360 masks work
01:14:16.380 to masks
01:14:17.900 don't work
01:14:19.060 or probably
01:14:19.600 don't work,
01:14:20.220 you're
01:14:20.500 misinterpreting
01:14:21.220 completely.
01:14:22.540 Your risk
01:14:23.380 management
01:14:23.820 decision might
01:14:24.520 change, but
01:14:25.740 in both cases
01:14:26.500 you're not
01:14:26.900 saying definitely
01:14:27.560 yes or
01:14:27.980 definitely no.
01:14:29.240 You're just
01:14:29.560 saying, I
01:14:29.920 don't know,
01:14:30.140 it looks like
01:14:30.500 the odds
01:14:31.040 are in
01:14:31.920 favor of
01:14:32.360 this.
01:14:33.560 And I
01:14:34.080 would say
01:14:34.440 that even
01:14:35.520 still, I
01:14:36.520 don't mind
01:14:36.900 putting on
01:14:37.340 my mask
01:14:37.880 because I
01:14:38.260 think there's
01:14:39.600 a good
01:14:39.800 chance there's
01:14:40.400 some little
01:14:41.640 improvement.
01:14:43.020 I'll take
01:14:43.500 the extra
01:14:43.920 edge.
01:14:50.620 Yeah, they
01:14:51.460 might help,
01:14:52.060 so why not?
01:14:52.840 Yeah, the
01:14:53.120 why not is
01:14:53.820 whether or not
01:14:54.620 it makes you
01:14:55.980 crazy.
01:14:57.400 Because it's
01:14:58.180 definitely not
01:14:58.740 free to wear
01:14:59.720 a mask.
01:15:00.400 It does make
01:15:00.940 you a little
01:15:01.200 crazy.
01:15:07.140 Somebody says
01:15:07.140 the risk
01:15:07.480 level is
01:15:07.940 how much
01:15:08.200 time you
01:15:08.580 spend with
01:15:09.540 an infected
01:15:09.540 person.
01:15:10.080 Yeah, I
01:15:10.360 agree with
01:15:10.760 that.
01:15:12.600 But also
01:15:13.620 common sense
01:15:14.340 tells me
01:15:14.880 that maybe
01:15:16.160 without a
01:15:16.940 mask you
01:15:18.040 could be
01:15:18.400 totally
01:15:18.740 infected in
01:15:19.660 five minutes,
01:15:21.380 but maybe
01:15:21.840 with a
01:15:22.360 mask you
01:15:23.800 wouldn't be
01:15:24.220 totally
01:15:24.520 infected for
01:15:25.540 seven minutes,
01:15:28.040 right?
01:15:28.460 Something like
01:15:29.080 that.
01:15:29.340 So you
01:15:30.880 might not
01:15:31.260 ever see
01:15:31.640 that difference
01:15:32.180 in the
01:15:32.640 statistics,
01:15:33.800 but there
01:15:34.540 must be a
01:15:35.080 lot of
01:15:35.420 interactions
01:15:36.000 where the
01:15:38.020 length of
01:15:38.760 the interaction
01:15:39.360 is changed
01:15:40.300 by wearing
01:15:40.820 the mask,
01:15:41.660 and whether you
01:15:42.200 could get
01:15:42.520 it from
01:15:42.720 that interaction.
01:15:43.120 It seems
01:15:43.440 logical.
01:15:47.060 Are
01:15:47.580 lockdowns
01:15:48.460 poor risk
01:15:49.440 management?
01:15:50.460 In retrospect,
01:15:51.420 yes.
01:15:52.660 When we
01:15:53.300 were first
01:15:53.840 doing the
01:15:54.220 lockdowns and
01:15:55.380 we were
01:15:55.600 talking about
01:15:56.120 two weeks,
01:15:57.480 that was a
01:15:58.020 good decision.
01:15:59.340 Because maybe
01:16:00.500 it would
01:16:00.820 have worked.
01:16:01.860 In retrospect,
01:16:02.860 it didn't
01:16:03.180 work, but
01:16:03.840 it could
01:16:04.180 have.
01:16:04.800 It wasn't
01:16:05.940 unreasonable to
01:16:06.860 close down for
01:16:07.600 a few weeks.
01:16:08.700 That was
01:16:09.060 actually a
01:16:09.640 good leadership
01:16:10.900 decision.
01:16:11.820 It didn't
01:16:12.100 work, but
01:16:13.440 that doesn't
01:16:13.860 make it a
01:16:14.200 bad decision
01:16:14.760 because it
01:16:15.620 was a risk
01:16:16.200 management
01:16:16.700 decision.
01:16:20.100 Somebody
01:16:20.580 says,
01:16:20.960 anger is
01:16:21.580 killing me.
01:16:22.880 You know,
01:16:23.420 I felt a
01:16:23.940 little of that
01:16:24.360 lately.
01:16:27.400 Keeping the
01:16:28.080 mask on is
01:16:28.740 unhealthy.
01:16:30.340 We might
01:16:31.060 know more
01:16:31.460 about that
01:16:31.900 in the
01:16:32.100 future.
01:16:33.340 Now,
01:16:33.660 it's easier
01:16:34.920 for me to
01:16:35.400 be pro-mask
01:16:36.220 because I
01:16:36.900 have that
01:16:37.280 elite situation
01:16:38.300 where I
01:16:40.580 don't really
01:16:41.020 need to put
01:16:41.600 on a mask
01:16:42.180 unless I
01:16:42.660 leave the
01:16:42.980 house and
01:16:43.500 I don't
01:16:43.720 have to
01:16:44.140 leave the
01:16:44.440 house if
01:16:44.780 I don't
01:16:44.960 want to.
01:16:46.200 And if
01:16:46.620 I do
01:16:46.880 leave the
01:16:47.200 house and
01:16:47.520 I'm
01:16:47.660 outdoors,
01:16:48.240 I don't
01:16:48.440 wear a
01:16:48.780 mask.
01:16:49.980 So masks
01:16:51.420 are not the
01:16:52.000 biggest problem
01:16:52.980 in my life,
01:16:53.820 but imagine
01:16:54.320 if I had a
01:16:54.840 job where I
01:16:55.380 had to wear
01:16:55.740 one for
01:16:56.240 eight hours
01:16:57.420 a day.
01:16:57.740 I'd be
01:16:58.900 pretty
01:16:59.260 anti-mask
01:17:00.020 then,
01:17:00.420 especially
01:17:00.780 if I
01:17:01.080 were younger.
01:17:04.000 The camera
01:17:04.660 looks great,
01:17:05.220 somebody says,
01:17:05.760 no, it's
01:17:06.100 the same
01:17:06.560 camera.
01:17:07.360 Yeah, it's
01:17:11.740 a luxury
01:17:12.660 opinion.
01:17:14.680 I tweeted
01:17:15.700 around an
01:17:16.100 article by
01:17:16.580 Rob Henderson
01:17:17.200 that's a
01:17:17.520 few years
01:17:17.900 old, talking
01:17:18.940 about how
01:17:19.440 elites such
01:17:20.760 as myself
01:17:21.360 can have
01:17:23.000 these luxury
01:17:23.740 opinions,
01:17:24.960 where I'll
01:17:25.720 say an
01:17:26.040 opinion that's
01:17:26.620 bad for
01:17:27.040 poor white
01:17:27.680 people, but
01:17:28.240 not me,
01:17:28.960 because it'll
01:17:30.380 make me sound
01:17:30.920 good.
01:17:31.880 Yeah, open
01:17:33.040 those borders
01:17:33.640 and let
01:17:34.760 everybody come
01:17:35.440 in, because
01:17:36.820 they're not
01:17:37.820 going to be
01:17:38.160 competing for
01:17:38.700 my job,
01:17:39.660 they're going
01:17:40.000 to be
01:17:40.240 competing for
01:17:40.920 somebody else's
01:17:41.660 job.
01:17:42.620 So isn't it
01:17:43.260 great that I
01:17:43.840 could have
01:17:44.140 such a
01:17:45.060 liberal,
01:17:46.280 open-minded
01:17:46.800 opinion about
01:17:47.600 immigration?
01:17:48.580 Because it's
01:17:49.160 only bad for
01:17:49.700 you.
01:17:50.600 It's not bad
01:17:51.140 for me.
01:17:51.720 For me, I
01:17:52.160 just get
01:17:52.600 a cheaper
01:17:53.660 gardener.
01:17:54.720 That's it.
01:17:57.760 Thoughts on
01:17:58.380 crypto.
01:17:59.140 So I'm not a
01:18:00.120 crypto guy,
01:18:01.420 meaning that I'm
01:18:02.600 involved with
01:18:03.180 crypto, but I'm
01:18:04.640 not an
01:18:04.640 expert on
01:18:05.240 it.
01:18:06.140 And here's
01:18:08.120 my take.
01:18:09.480 If I were
01:18:10.340 young or I
01:18:12.260 had a big
01:18:12.680 enough portfolio
01:18:13.420 where I could
01:18:14.880 diversify, I
01:18:16.460 would definitely
01:18:16.920 own Bitcoin
01:18:17.600 at this point.
01:18:19.460 So there was a
01:18:20.220 time when you
01:18:20.820 could say,
01:18:21.220 ah, I don't
01:18:21.760 know if
01:18:22.020 Bitcoin's going
01:18:22.680 to stay
01:18:22.940 around, but I
01:18:23.460 think we're
01:18:23.760 past that
01:18:24.280 time.
01:18:24.860 It does
01:18:25.340 seem like
01:18:25.780 it's going
01:18:26.700 to be here.
01:18:27.800 So any
01:18:28.680 young person's
01:18:30.100 portfolio should
01:18:30.920 have some
01:18:31.260 Bitcoin.
01:18:32.140 Every old
01:18:32.920 person who
01:18:33.360 has a big
01:18:33.860 enough
01:18:34.140 portfolio
01:18:34.640 should put
01:18:35.680 a little
01:18:35.980 bit in
01:18:36.320 there.
01:18:39.040 But any
01:18:40.000 other kind
01:18:41.240 of coin I
01:18:42.020 think is
01:18:42.540 riskier.
01:18:44.720 That's
01:18:45.000 basically the
01:18:45.560 whole story.
01:18:47.320 All right.
01:18:48.140 That's all for
01:18:48.680 now.
01:18:49.140 I'll talk to
01:18:49.580 you.