Real Coffee with Scott Adams - March 24, 2021


Episode 1323 Scott Adams: Senate Goes Full Racist, Legacy Media Gets Competition, Gun Control Laws, and More


Episode Stats

Length: 55 minutes
Words per Minute: 142.5
Word Count: 7,968
Sentence Count: 610
Misogynist Sentences: 8
Hate Speech Sentences: 29


Summary

Scott argues that genetics, on top of comorbidities, appear to drive coronavirus outcomes, and that governments will inevitably hold citizens' DNA. He also covers Substack and Locals as competition for legacy media, China's long game on Taiwan, Sidney Powell's defamation defense, the gun control debate after the latest mass shooting, and two senators' pledge to block white, non-LGBTQ nominees.


Transcript

00:00:00.000 Hey, everybody. Come on in. Come on in. Yeah. It's time for Coffee with Scott Adams, the
00:00:11.180 bestest place in the entire universe. There are other dimensions we haven't checked yet,
00:00:18.660 but all indications are that this is the best place to be of all reality. Don't know for
00:00:26.900 sure, but let's test it. All you need is a cup or a mug or a glass, a tank or a chalice or
00:00:31.940 a canteen, a jug or a glass, a vessel of any kind, fill it with your favorite liquid. I like coffee.
00:00:36.860 And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that makes
00:00:43.000 everything better, including the vaccinations. It's called the Simultaneous Sip, and it's going
00:00:49.060 to be rocking people all over the world at the same time, hence Simultaneous. And it happens now. Go.
00:00:56.900 Oh, yeah. Never disappoints.
00:01:06.360 I just saw a tweet by Richard Grenell. He says that the border to Mexico was opened up before
00:01:14.540 California was reopened. And if that doesn't bother you, you haven't been paying attention.
00:01:19.540 Well, that does bother me. I got to say, that's a good point. So how many of you have Neanderthal
00:01:28.100 genes? Have you ever had your DNA tested? Well, I have. And I got a little Neanderthal genes in me.
00:01:35.820 And it turns out that that can have a big impact on your coronavirus outcomes.
00:01:42.180 Now, who told you really early in the pandemic that we would probably find out that the biggest
00:01:51.800 difference in outcomes was genetic? And it looks like the biggest difference in outcomes is genetic.
00:02:00.140 Now, that's on top of comorbidities. So the comorbidities are still bigger. But within the
00:02:06.060 comorbidities, there are plenty of people who don't have bad outcomes. So there are people
00:02:13.160 who are, you know, overweight, and they're older and whatever, and they just do fine. And apparently
00:02:18.520 it's genetic. So the genes can protect you, even if you've not protected yourself with your lifestyle.
00:02:26.040 And I think that we will find out more and more about this now. Raises an interesting question,
00:02:34.660 doesn't it? Suppose, just brainstorm here, suppose the United States had a national registry of DNA.
00:02:45.680 Oh, you don't like that, do you? You don't want the government having your DNA, do you?
00:02:50.740 But what if they did? Let's just walk it through. If they had your DNA, and if a pandemic came through,
00:02:58.760 and they could fairly quickly determine, which wouldn't take long, that some genetic situations
00:03:06.440 are dangerous, and some are not. Given that the biggest challenge with the vaccinations is to get
00:03:14.020 them to the people who are in most danger, what would happen if you could take the people you know
00:03:20.560 are in most danger, the comorbidity people, and that you could further divide them by which ones are
00:03:27.440 really in danger because of genetics. Suddenly, you're giving your vaccines to exactly the right
00:03:34.820 people. At the moment, we're giving vaccines to sort of generally the population that we think
00:03:41.900 statistically has the biggest problem. But once you have their genetics, you can give it to exactly
00:03:50.020 the people who need it. You know, you're getting really close to knowing exactly what a person
00:03:55.560 needs. Now, of course, there are all kinds of, you know, privacy questions and whatnot about
00:04:02.620 anybody having your genes. But here's the thing you need to know. China is already collecting your
00:04:09.100 genetic material. So the question you have is, do you want to live in a future where China
00:04:16.500 has your genetic material, but the United States does not? Because that's
00:04:23.520 where we're heading, and we're heading there fast. One of the stories that didn't get much play
00:04:27.300 is that there were a number of lab resources that people were using that were Chinese-based
00:04:35.520 companies. Do you want your blood and your DNA to be analyzed by an American company that's owned by
00:04:43.580 China? Do you? Because everybody they test, they're going to have your DNA. So I think the world is
00:04:54.520 heading in a direction where governments will have your DNA. So do you want your own government to have
00:05:00.400 it? Because China's going to. China will have your DNA. Now, to the question of whether China had
00:05:08.100 developed this virus to, let's say, work on some people less than others, well, there's no evidence
00:05:20.300 of that, by the way. So that was just a conspiracy theory. But apparently there's more Neanderthal genes
00:05:27.900 in Eastern Asia than there is in most places. So China actually has a high percentage of Neanderthal
00:05:35.920 DNA. But apparently there are two kinds, you know, just having Neanderthal DNA doesn't mean you're good
00:05:43.920 or bad. Some people with some subsets of that situation can do worse than others. So it's a little
00:05:51.440 more complicated than that. But it doesn't seem to be true that the virus is less effective against
00:06:01.280 Chinese ethnic people. One of the most fun stories of the day is that a writer for Vox, Aaron Rupar,
00:06:12.120 became so famous for tweeting misleading videos that he got in the dictionary. So the Urban Dictionary
00:06:20.940 now calls Rupar, his last name, they turned it into a verb, and it means to purposely mislead,
00:06:28.280 mischaracterize a video. So the video of Trump at the, you know, the Charlottesville
00:06:38.480 fine people hoax, that's a Rupar. Because it's a misleading video, they always cut out the end where
00:06:45.420 he says he disavows the racists completely. So you just take that part out.
00:06:50.940 And it looks the opposite. It looks like he didn't. Same with the drinking bleach hoax. It's
00:06:55.500 just an edit. You take out the part where he's specifying light, which he did before he talked
00:07:00.280 about it. So you knew he was talking about light. And then when he was done, he brought it back to
00:07:05.840 light. So you knew he was talking about light the whole time, which is a real thing.
00:07:10.200 But it got Rupar'd. It got Rupar'd by taking off the end and the beginning. So you didn't see the
00:07:17.700 reference to light. And then they Rupar it and say it's bleach. And then that's the story. It just
00:07:23.460 got Rupar'd. Now, there's nothing that makes me happier. And by the way, Rupar has come after me
00:07:32.100 personally. So we've had some exchanges online. So I can't say he's my natural enemy on Twitter. But
00:07:40.200 you know what I mean. He sort of is. But he made it into the dictionary before I did. So I call that
00:07:48.800 winning. So here's an interesting thing. Rasmussen is going to be reporting today in a poll that says
00:08:01.620 47% of Democrats think that border security is not a vital national security concern.
00:08:08.980 What? What? 47% of Democrats don't think that protecting your border is a national security
00:08:21.940 concern. Now, remember what I was telling you about how consuming news from one source doesn't
00:08:30.520 matter if it's just the left or just the right. Science has shown that it causes brain damage.
00:08:36.140 Now, if you're looking at this story and you're saying to yourself, wait a minute,
00:08:40.380 nearly half of Democrats think that protecting the border doesn't have a national security element
00:08:45.560 to it? And this is a perfect example. If you were trying to explain this, how would you explain that?
00:08:55.020 Right? Would you say to yourself, huh, it looks like 47% of Democrats are really stupid.
00:09:00.620 But you know that's not true. You know that's not true. Half of all Democrats are stupid. I mean,
00:09:09.060 not more stupid than anybody else, right? So it's not their intelligence, right? But is there some kind of
00:09:18.960 bias in play? Well, I'm not sure it's exactly a bias. It looks like brain damage, doesn't it? Because
00:09:28.400 this is the kind of opinion that you could only have if your brain wasn't working. Because there's
00:09:35.460 no argument. It's not like there are two arguments and they're pretty good. And you know, well, I can see
00:09:41.340 the other side, but I prefer this one. There's no argument in favor of opening the border. There's none.
00:09:50.660 And, you know, there's certainly arguments about how many people you let through, but there's certainly
00:09:56.840 not an argument that it's not a vital security concern. Anyway, so once you realize and you reframe your
00:10:03.800 understanding of the world to say that these people are probably literally, so this is not,
00:10:10.780 I'm not giving an analogy, I'm not speaking, you know, figuratively, literally and scientifically brain
00:10:16.820 damaged. And almost certainly, you can tell the source of the brain damage: consuming too much news on
00:10:24.980 one side. Once you see it, you can't unsee it. Because it explains everything, right? It just explains
00:10:32.700 everything. Why doesn't this make any sense when you're talking to these people? This doesn't make
00:10:37.140 any sense. It's like you're talking to somebody with brain damage. And it turns out that's literally
00:10:44.980 exactly what's happening. Exactly what's happening. So sometimes things are just the way they look.
00:10:56.160 Here's a story that is sort of teasing us about becoming a big story. And boy, is it going to be a big
00:11:02.560 story. So you know Taiwan? It's a little island country that sits just off of China. And you
00:11:09.920 know that China has for decades claimed them as their own, even though they're officially a separate
00:11:16.580 country. And the United States is sort of, you know, backing Taiwan and giving them weapons and stuff
00:11:25.020 so they can protect themselves. But here's the thing you need to know. Probably in the next 10 years,
00:11:31.500 say the experts, China will take Taiwan. And it's going to be easy. It's going to be easy.
00:11:40.680 And the things I know about this, apparently it's been, you know, war-gamed. And every time the United
00:11:47.640 States does a war game on this, to figure out what would happen if China tried to capture Taiwan.
00:11:55.000 And the answer is, China captures Taiwan. Every time. It's not even close. Because of proximity.
00:12:05.660 Now what China is doing is quite clever for a long-term plan. If they were to just start a fight,
00:12:11.920 it's going to get pretty bloody and it's going to hurt them in international relationships forever, etc.
00:12:17.020 But as China builds up more and more capability around Taiwan, the number of hours it would take
00:12:25.360 to conquer all of Taiwan starts shrinking. So we're at the point now where China could conquer Taiwan
00:12:33.260 in about a day. It's about a one-day process currently with their current military situation.
00:12:40.200 Nobody believes that Taiwan could defend itself against a legitimate attack. Nobody thinks
00:12:47.700 that. And that timing will probably get down to an hour because they'll just keep adding assets.
00:12:54.460 You know, more missiles, more ships, more capability. And once Taiwan is completely surrounded and the
00:13:01.540 time it will take to conquer it gets down to about an hour,
00:13:05.040 they'll just negotiate to become part of China. Because it'll be their only choice.
00:13:12.100 So I don't see there's any way that Taiwan could go any other direction, really, in long run.
00:13:17.100 And I think our military is starting to...
00:13:21.240 Well, not starting to. They're warning the same thing.
00:13:24.340 But the United States has this, you know...
00:13:26.900 I assume there's some kind of military pact where we would have some responsibility
00:13:31.660 or moral authority to help defend Taiwan.
00:13:37.300 But what do you do when it's impossible?
00:13:41.760 It's just impossible.
00:13:43.740 You couldn't possibly militarily defend it. It's just not a thing.
00:13:48.100 So here's what you should expect.
00:13:50.360 I would expect sometime in the next 10 years,
00:13:53.000 China will just take Taiwan.
00:13:54.660 And it will become a big issue for a few years, like Hong Kong.
00:14:03.780 And then we'll get over it.
00:14:05.660 And they will just own Taiwan.
00:14:07.100 I don't see any other future.
00:14:09.060 I just can't imagine it would happen any other way.
00:14:12.380 Bernie Sanders continues to be interesting.
00:14:15.720 And he said he was very uncomfortable about Twitter banning Trump.
00:14:20.160 Now, you know, he's no lover of Trump, of course.
00:14:23.660 But he says, but if you're asking me, do I feel particularly comfortable
00:14:27.400 that the then president of the United States could not express his views on Twitter?
00:14:32.580 And Bernie says, I don't feel comfortable about that.
00:14:36.040 So thank you.
00:14:37.380 Thank you, Bernie Sanders, for being a voice of reason on this.
00:14:44.480 But there's an interesting trend developing.
00:14:47.140 And if you're not sort of in the writer's world, you haven't seen it.
00:14:54.840 And it goes like this.
00:14:55.860 Right now, we, the public, are at the mercy of the big platforms.
00:15:02.580 So if you're on a big platform, Twitter, Facebook, YouTube, whatever,
00:15:07.420 you pretty much have to do what they will allow you to do,
00:15:11.600 or else you're going to get booted off like the president.
00:15:13.920 So that gives the platforms narrative control.
00:15:19.340 So if you're saying something that's against the narrative,
00:15:22.260 you just get shut down.
00:15:24.120 You either booted off or your traffic is depressed.
00:15:28.160 But there's something happening that you might not be aware of.
00:15:32.320 You've heard of, I've talked about Locals, which is a subscription service.
00:15:36.220 So you could be free of the platform control because the individuals on Locals
00:15:42.460 just charge you a small subscription fee.
00:15:45.440 And you can just see them directly, and you don't have to get filtered by any platforms
00:15:49.140 because Locals doesn't do that.
00:15:51.840 Each of the creators own their own content there.
00:15:54.940 So it's a different situation.
00:15:57.140 But even bigger than that is Substack.
00:15:59.780 Are you all familiar with Substack?
00:16:01.440 Now, it's essentially a, I guess I'd call it a platform for writers
00:16:06.860 who want to escape the narrative machines,
00:16:11.920 and they get paid a lot, a lot.
00:16:17.120 In fact, some of them are being paid in advance
00:16:19.640 so they can get a very large paycheck just for joining Substack,
00:16:24.540 more than they would make in legacy media.
00:16:26.780 And suddenly, they have full access to lots of people.
00:16:32.320 You have to subscribe, but they have lots of reach.
00:16:35.700 And suddenly, no platform tells them what to say.
00:16:39.100 So Glenn Greenwald is on Substack and making quite a lot of noise.
00:16:45.300 And he's become even more interesting than he was,
00:16:48.280 and he's always been interesting.
00:16:49.220 So to watch somebody of his talent level suddenly completely free,
00:16:57.900 it's really kind of fun to watch.
00:17:00.080 Matt Taibbi, I believe, is also over there.
00:17:02.460 Another perfect example.
00:17:04.180 Somebody's saying Bari Weiss.
00:17:06.060 I think she is over there as well.
00:17:08.760 So Substack and Locals are both sort of recruiting, if you will,
00:17:13.780 getting the interesting voices,
00:17:15.180 the people who don't quite fit in the official narrative platforms.
00:17:21.360 And here's the shift that's happening.
00:17:25.160 The shift is moving away from the platform controlling the narrative
00:17:29.020 back to individual stars.
00:17:33.960 Andrew Sullivan's another one who's on Substack.
00:17:36.580 That's right.
00:17:38.180 And Locals is pulling in lots of people.
00:17:40.880 Greg Gutfeld's on there.
00:17:42.080 I'm on there, et cetera.
00:17:43.080 So this is a big thing because as the number of people
00:17:51.360 who escape the narrative machines, the big platforms,
00:17:56.120 they will become the new opinion makers.
00:18:01.220 And there is a trend toward taking the mainstream media
00:18:05.300 a little bit out of the power structure.
00:18:08.880 So just keep looking at this and see if there's any threat
00:18:13.000 to this subscription model.
00:18:15.120 Now, what I think will happen eventually,
00:18:17.520 and it might require some new company to do this,
00:18:20.360 but I think eventually you will be able to choose your voices
00:18:23.720 and then some service will stitch them together into a channel
00:18:28.240 so that you can say,
00:18:30.980 well, rather than watching who CNN thinks should be on TV
00:18:34.060 or who Fox News wants to interview,
00:18:37.060 I'll just pick my individual writers.
00:18:39.160 I'll take one Glenn Greenwald.
00:18:41.620 I'll take Scott Adams.
00:18:44.020 I'll take a Bari Weiss.
00:18:45.260 And I'll just make a channel
00:18:46.420 that's just the people that I know are not big fucking liars.
00:18:52.900 Basically, it would be a channel of people who are not liars.
00:18:56.340 They could be wrong, right?
00:18:58.940 I could be wrong about a lot of stuff,
00:19:01.160 but I'm pretty sure I haven't lied to you.
00:19:04.920 I mean, I can't think of any reason I need to.
00:19:07.460 What would be the reason to lie to you?
00:19:08.960 I don't have any reason.
00:19:10.400 So the people who don't have a reason to lie,
00:19:13.460 I'm not part of, you know,
00:19:15.080 I've got enough money that if I get canceled, I get canceled.
00:19:18.600 And same with Greenwald and Taibbi, et cetera.
00:19:23.840 Their voices are unleashed now,
00:19:25.740 and that's a really good thing to watch for.
00:19:29.580 I told you yesterday
00:19:31.740 that I thought Sidney Powell had an airtight defense.
00:19:35.600 Not airtight in the sense that she'll definitely win,
00:19:38.220 but solid defense.
00:19:39.940 Let's just say it's solid.
00:19:40.980 It doesn't mean she'll win.
00:19:42.740 And then I watched CNN's Eli Honig,
00:19:46.300 who apparently is a brilliant lawyer, according to CNN,
00:19:49.540 and he disagrees with my completely uneducated legal opinion.
00:19:55.500 So let's talk about that.
00:19:59.420 So according to Elie,
00:20:02.540 in order for Dominion to prevail over Sidney Powell,
00:20:06.920 if you don't know the issue,
00:20:08.600 Sidney Powell had made claims about Dominion election systems being,
00:20:13.080 I don't want to use the word,
00:20:16.540 because I'll get banned from social media,
00:20:18.700 but she said there were some problems with Dominion.
00:20:22.960 Dominion is suing her for making these claims.
00:20:27.140 And Elie Honig says that Dominion only has to prove
00:20:32.860 that Sidney Powell knew what she was saying was false
00:20:37.220 and that she's basically admitted it in her filing.
00:20:41.060 Because in her filing, she says,
00:20:43.660 you shouldn't have believed me, basically, I'm paraphrasing,
00:20:46.480 you shouldn't have taken it as fact,
00:20:48.820 you should have taken it as my opinion
00:20:50.500 that would need to be verified.
00:20:54.260 So do you take the Scott Adams legal opinion
00:20:58.540 or the Elie Honig legal opinion?
00:21:01.140 So my opinion is that I'm a reasonable person,
00:21:08.400 and when I heard Sidney Powell talk,
00:21:10.760 I believed exactly what she said,
00:21:13.840 that she had an opinion, like a hunch.
00:21:18.480 It was based on stuff, but it was basically a strong opinion,
00:21:21.900 and that she was advocating that we find out.
00:21:27.680 In other words, her whole thing was that she doesn't know.
00:21:31.140 She believes it to be true, and we need to find out.
00:21:35.340 Now, if she said we know it to be true
00:21:37.440 and we don't need to do any research
00:21:40.020 because we already have the information,
00:21:42.020 we know it to be true, well, that would be libel, right?
00:21:46.580 If she knew it was not true and she said it was true,
00:21:49.300 that's pretty bad.
00:21:50.920 But I think her case is solid.
00:21:54.000 So now I've heard the opposite.
00:21:56.560 And how do you prove somebody knew something
00:21:59.200 that they were saying was false?
00:22:02.120 How do you do that?
00:22:04.300 How do you prove somebody knew what they were saying was false?
00:22:09.220 Especially if somebody's job is an advocate.
00:22:12.900 If you're an advocate,
00:22:15.420 you're saying some stuff that, you know,
00:22:17.980 certainly could be questionable.
00:22:20.940 But it's her job to be an advocate.
00:22:23.520 Of course lawyers say things that might not be exactly true.
00:22:27.320 It's sort of the job they're in.
00:22:29.060 So I'm going to say again that I think Sidney Powell will win this,
00:22:32.600 will prevail,
00:22:35.000 because it was clearly an opinion.
00:22:37.820 Clearly an opinion.
00:22:39.160 In my opinion,
00:22:41.520 it's obviously an opinion.
00:22:42.980 Because how could somebody know something that can't be known?
00:22:45.760 The whole problem with Dominion is that it was non-transparent.
00:22:51.560 Nobody looked at the code
00:22:52.840 and nobody looked at the whole process from beginning to end.
00:22:57.020 It hasn't been audited in that fashion.
00:23:00.720 So I think she wins.
00:23:02.920 Gun sales are up, of course.
00:23:05.540 Of course.
00:23:06.200 People are going to buy more guns
00:23:07.340 because there's mass shootings
00:23:08.400 and because the Democrats are talking about more gun control.
00:23:12.400 So good luck.
00:23:14.740 Good work on that, Democrats.
00:23:17.900 More guns coming.
00:23:22.200 So let's see.
00:23:23.760 Let's talk about the mass shooter.
00:23:25.280 Everybody's talking about the guy.
00:23:27.140 I guess he was an immigrant from Syria.
00:23:29.980 He had mental problems.
00:23:31.620 Seems pretty clear.
00:23:32.800 He was anti-Trump.
00:23:35.720 And he was against Islamophobes.
00:23:39.180 But beyond that,
00:23:40.920 there's not any specific motivation for it.
00:23:45.660 So we don't know the immediate cause of why he did this.
00:23:50.160 We just sort of know he might have some mental problems.
00:23:52.620 It looks likely he does.
00:23:56.280 I'm going to make a statement.
00:23:57.580 You're not going to like it at all.
00:24:00.220 I hate to do this to you.
00:24:01.640 But I have to do this
00:24:02.520 because otherwise you won't believe I'm telling you the truth
00:24:07.140 unless I occasionally tell you something you don't like.
00:24:11.380 Otherwise you'll just think I'm telling you all you want to hear.
00:24:14.260 And here it is.
00:24:15.060 You're going to freaking hate this.
00:24:17.820 In my opinion,
00:24:20.440 and it's just an opinion,
00:24:22.040 and this opinion is informed by my experience with hypnosis,
00:24:25.760 my experience with just the way the brain works.
00:24:30.800 And it goes like this.
00:24:31.820 If you banned the cool-looking weapons,
00:24:38.060 you would have far fewer mass shootings.
00:24:43.880 I hate to say it.
00:24:45.640 Now, when I say the cool-looking weapons,
00:24:48.400 I'm talking about the coolness.
00:24:50.660 I'm not talking about the killing power.
00:24:54.220 Because as we discussed yesterday,
00:24:56.260 somebody with a handgun can do a lot of killing,
00:24:58.620 and it's going to be almost no difference, right,
00:25:01.420 if they're good at it.
00:25:04.860 But imagine kids playing video games
00:25:07.280 and watching war things,
00:25:08.640 and then they're imagining their last moments.
00:25:11.380 Because anybody who goes into this mass shooting thing,
00:25:13.960 they certainly have to imagine it before they do it.
00:25:17.700 Imagine yourself going in
00:25:19.320 with this really cool,
00:25:21.940 somewhat military-ish-looking gun
00:25:24.260 that you just love holding in your hand.
00:25:26.780 I mean, you just love it.
00:25:30.440 Now, if you're a female,
00:25:32.900 or if you don't have an affinity for,
00:25:35.320 let's say, cool objects,
00:25:39.540 you don't understand what I'm saying.
00:25:41.800 This may be harder for women to understand.
00:25:44.380 And I'm going to be a little over-sexist here
00:25:48.680 to make the point.
00:25:49.560 Obviously, some women like guns.
00:25:52.280 We're all adults, right?
00:25:53.460 When I make a generalization about gender
00:25:56.360 in your head,
00:25:57.740 you should always say,
00:25:58.420 well, that's not everybody.
00:26:00.040 Obviously.
00:26:01.040 It's not everybody.
00:26:02.680 Plenty of women, I'm sure,
00:26:05.400 like the look and feel of a cool gun.
00:26:08.840 That's the thing.
00:26:10.040 But I would say, in general,
00:26:11.740 guys are more likely to like
00:26:13.720 just the physicality of it,
00:26:16.500 just the coolness of it.
00:26:17.760 What happens if you take that away?
00:26:21.980 It doesn't end all mass shootings, of course.
00:26:25.000 Like I said, somebody who's crazy
00:26:27.200 or has a motivation is just going to get a handgun
00:26:29.420 or some other weapon.
00:26:31.580 But I do think that people like this guy,
00:26:34.660 this particular guy who had some mental problems,
00:26:37.360 if you took away the coolness factor,
00:26:40.440 does it look the same?
00:26:42.760 I just don't know if it looks the same.
00:26:45.380 You've heard that if all you have is a hammer,
00:26:48.460 everything looks like a nail.
00:26:50.400 If you build up a big military,
00:26:53.780 don't you think you're more likely to use it
00:26:55.760 than if you hadn't?
00:26:58.200 Of course.
00:26:59.080 You tend to use a tool just because you have it.
00:27:02.860 So if you have, you know,
00:27:04.740 an awesome, somewhat military-ish looking gun,
00:27:08.700 not really military,
00:27:10.180 it's going to feel like you need to use it.
00:27:14.700 Because you have it.
00:27:16.960 And if you're not a hunter,
00:27:19.040 and you have a bad day,
00:27:21.140 and you've got this cool thing,
00:27:23.960 one plus one,
00:27:25.900 and you've got a bad situation.
00:27:27.460 Now, I don't imagine that this would have an effect
00:27:29.660 on most mass shootings,
00:27:32.120 because they're either, you know,
00:27:33.360 ideological or somebody's crazy or whatever.
00:27:35.700 But this guy probably wasn't so crazy
00:27:39.720 that he wouldn't be influenced by the,
00:27:43.620 let's say, the vibe of the whole situation
00:27:46.100 that makes him look kind of awesome
00:27:48.100 in his diseased mind.
00:27:51.360 Now, somebody says,
00:27:52.440 nice mind reading.
00:27:53.440 So it's not mind reading
00:27:54.500 if you're talking about people in general.
00:27:57.020 Mind reading would be
00:27:58.140 if you're talking about an individual.
00:28:00.480 But if you're making a general statement
00:28:02.220 about how people act,
00:28:03.980 you can be pretty safe about that
00:28:05.580 relative to guessing
00:28:06.800 what one person thinks.
00:28:10.340 Now, of course,
00:28:11.680 I'm watching your comments.
00:28:13.700 A lot of people are going crazy on this.
00:28:15.640 No, the numbers don't show that.
00:28:19.500 I'm not aware of any numbers
00:28:21.080 that would suggest one way or the other.
00:28:24.220 I'm telling you that there's no way
00:28:26.220 the coolness of it has no effect.
00:28:29.220 I just don't know how much.
00:28:31.620 Could it be 5%?
00:28:34.320 10%?
00:28:35.380 I would say that it's at least 10%.
00:28:37.960 I don't think it's 80%.
00:28:40.380 Is it 50%?
00:28:43.460 Could be.
00:28:45.160 Could be.
00:28:46.140 I think it's less than 50%,
00:28:48.100 but it could be.
00:28:49.840 Now, I'll add one other factor,
00:28:52.120 the news.
00:28:52.760 What would it look like
00:28:55.060 if the news did not report these
00:28:58.760 and did not almost always report
00:29:00.480 that it's the same kind of weapon?
00:29:03.280 Do you think that this guy
00:29:05.160 would have had this idea
00:29:07.100 to be a mass shooter
00:29:08.940 at this grocery store?
00:29:10.820 Do you think he would have even thought of it
00:29:12.760 except for the news?
00:29:16.300 Probably not.
00:29:17.780 Probably not.
00:29:19.040 No.
00:29:19.500 So you need two things.
00:29:20.620 You need a news that can't shut up about it.
00:29:24.340 And you need a really cool weapon.
00:29:27.740 And then you need somebody
00:29:29.160 whose life is not going well.
00:29:31.520 Apparently he couldn't get a girlfriend.
00:29:33.860 He had some mental problems.
00:29:35.260 I don't think his life was going to go that well.
00:29:39.760 So, anyway.
00:29:41.660 Now, this is all separate from
00:29:43.600 whether there should be gun control.
00:29:45.900 I want you to hear this clearly.
00:29:48.000 The question of whether
00:29:49.180 there should be gun control
00:29:50.560 opens up all kinds of other questions.
00:29:53.760 So the question is not just
00:29:55.700 if you made this change,
00:29:57.540 would it reduce the number of shootings?
00:29:59.840 You have to weigh that against freedom
00:30:01.440 and protecting the country, etc.
00:30:04.600 Now, one of the things that I often say
00:30:06.560 about gun control,
00:30:07.540 and the reason that I'm in favor of guns,
00:30:09.860 is that there's at least a small chance
00:30:12.520 that you have to protect yourself
00:30:15.040 from the government itself.
00:30:16.220 And somebody said to me,
00:30:18.340 what are the odds of that?
00:30:19.900 We've been around 300 years or whatever,
00:30:22.440 and the country hasn't
00:30:24.800 fallen apart yet.
00:30:26.600 Do you really need those guns
00:30:28.180 to protect yourself against
00:30:29.400 the government of the United States?
00:30:32.600 Really?
00:30:33.600 To which I say,
00:30:34.740 I don't think it's a big chance.
00:30:37.440 2%.
00:30:38.000 I would say the odds are 2%.
00:30:41.380 Now, if you say it's zero,
00:30:44.520 I think you're not in
00:30:45.660 intellectually secure territory.
00:30:48.980 So I think it's closer to 100%
00:30:51.880 if you wait long enough.
00:30:53.540 It's just how long you wait.
00:30:55.080 Because there are no dynasties
00:30:57.160 that lasted forever.
00:30:59.240 They all have an upheaval.
00:31:01.720 So the odds of the United States
00:31:04.100 someday not being the United States
00:31:06.880 is 100%.
00:31:08.060 It's just how long you wait.
00:31:10.240 Is it 1,000 years or 50 years?
00:31:13.060 So there is some chance.
00:31:15.240 I would put the odds at,
00:31:16.820 just for discussion purposes,
00:31:19.260 2%.
00:31:19.920 2%.
00:31:21.460 This is just to do some math, right?
00:31:23.780 You don't have to agree with that number.
00:31:25.000 Now, how would you look at the risk
00:31:28.560 of a 2% chance of this big problem?
00:31:33.640 You know, like the government
00:31:35.120 turning on the citizens.
00:31:37.120 2%.
00:31:37.600 How would you rate that
00:31:39.480 compared to the deaths by guns?
00:31:41.500 Well, the deaths by guns you can count.
00:31:44.000 And you can say,
00:31:44.740 all right, we get this many thousand people per year
00:31:46.900 die from guns.
00:31:49.280 What is a 2% risk
00:31:51.260 that the entire United States
00:31:53.880 would be devastated
00:31:55.860 by some civil war
00:31:57.580 or the government
00:31:58.640 would turn on the people?
00:31:59.900 Well, you multiply
00:32:00.660 370 million people
00:32:02.360 times 2%,
00:32:04.080 you get
00:32:04.660 7 million people
00:32:07.380 at risk.
00:32:08.880 Now, that would be
00:32:09.580 if the whole country
00:32:10.440 was fighting each other
00:32:11.380 with weapons,
00:32:11.980 which wouldn't happen.
00:32:13.220 But you very quickly,
00:32:15.440 you very quickly,
00:32:16.420 this,
00:32:16.880 by the way,
00:32:17.560 this type of analysis
00:32:18.940 is called
00:32:20.120 an expected value.
00:32:22.120 You multiply the odds
00:32:23.360 of something happening
00:32:24.320 times the cost
00:32:26.060 if it did happen.
00:32:27.100 Now, both of these are guesses,
00:32:28.800 but it allows you to
00:32:29.860 at least
00:32:30.420 have some kind of
00:32:32.180 rational comparison
00:32:33.160 to an alternative.
00:32:34.880 So the alternative
00:32:35.600 is, what,
00:32:37.080 30,000 people a year
00:32:38.700 die from guns
00:32:39.460 in the United States?
00:32:40.880 But you've got a 2% chance
00:32:42.580 of preventing
00:32:43.280 millions
00:32:44.700 from dying.
00:32:46.560 Millions.
00:32:47.680 So I would say
00:32:48.520 that that's worth it.
00:32:49.720 To me,
00:32:50.980 as an insurance policy,
00:32:53.020 against that 2%
00:32:54.340 in the short term,
00:32:56.500 but really 100%
00:32:57.640 in the long term,
00:32:59.060 I'd say that's
00:33:00.460 a good
00:33:01.360 insurance policy.
00:33:07.180 Yeah,
00:33:07.780 so 100% chance
00:33:08.880 doesn't mean
00:33:09.420 that everybody dies,
00:33:11.200 right?
00:33:11.580 And I didn't say that.
00:33:13.080 100% chance
00:33:14.240 is that there would be
00:33:15.140 a lot of upheaval.
00:33:16.160 So let's say
00:33:19.780 a million people died
00:33:21.780 in the Civil War.
00:33:23.680 Let's say you did
00:33:24.640 2% of a million.
00:33:27.980 All right?
00:33:28.860 So 10% would be
00:33:30.880 100,000.
00:33:32.880 Do the math.
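If you do want to do the math, here is a minimal sketch of the back-of-the-envelope expected-value arithmetic described above. The 2% probability, the 370 million population, the roughly 30,000 annual gun deaths, and the one million Civil War deaths are simply the numbers used in the discussion, not verified statistics.

```python
# Back-of-the-envelope expected-value comparison, using only the numbers
# quoted in the discussion above (all of them rough guesses, not data).

p_upheaval = 0.02             # the 2% chance of a government-vs-citizens upheaval
us_population = 370_000_000   # population figure used above
civil_war_deaths = 1_000_000  # Civil War death toll used above
annual_gun_deaths = 30_000    # annual U.S. gun-death figure used above

# Expected value = probability of the event times the cost if it happens.
ev_population_at_risk = p_upheaval * us_population        # about 7.4 million
ev_civil_war_scale_2pct = p_upheaval * civil_war_deaths   # 20,000
ev_civil_war_scale_10pct = 0.10 * civil_war_deaths        # 100,000

print(f"2% of 370M people:            {ev_population_at_risk:,.0f}")
print(f"2% of 1M Civil War deaths:    {ev_civil_war_scale_2pct:,.0f}")
print(f"10% of 1M Civil War deaths:   {ev_civil_war_scale_10pct:,.0f}")
print(f"Annual gun deaths (as given): {annual_gun_deaths:,}")
```

At 2%, the products come to about 7.4 million and 20,000 respectively; the 100,000 figure mentioned above corresponds to applying a 10% probability to the one-million estimate.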
00:33:34.180 All right.
00:33:37.300 Yeah,
00:33:37.860 I think the news
00:33:38.720 plus the coolness
00:33:40.100 of the weapons
00:33:40.720 are the two variables
00:33:41.760 that we won't talk about,
00:33:43.720 but are at least
00:33:44.500 a big part of it.
00:33:45.760 It's not the whole story.
00:33:49.480 I guess Biden
00:33:50.520 is talking about
00:33:51.460 maybe using
00:33:53.620 his power
00:33:55.940 of executive orders
00:33:57.400 to do something
00:33:58.580 with
00:33:58.900 these weapons.
00:34:02.000 And I'm thinking,
00:34:02.700 what kind of executive order
00:34:03.940 can you do about that?
00:34:07.220 I'm still confused
00:34:08.540 about the
00:34:09.220 scope of executive orders.
00:34:13.200 Like,
00:34:13.300 when can you use them?
00:34:14.600 How could you use
00:34:16.280 an executive order
00:34:17.300 to make a constitutional
00:34:19.560 right
00:34:20.020 less available?
00:34:22.220 That feels like
00:34:22.980 a questionable
00:34:24.300 thing to do.
00:34:25.780 But maybe he can.
00:34:26.720 I don't know.
00:34:29.500 So he hasn't ruled out
00:34:30.700 the executive orders.
00:34:32.240 And I guess
00:34:32.680 what they mostly want
00:34:33.700 or they're asking for
00:34:34.560 is
00:34:34.940 background check
00:34:36.720 improvements.
00:34:37.240 I don't know
00:34:39.660 why anybody
00:34:40.300 disagrees
00:34:40.920 with background
00:34:41.540 checks.
00:34:44.000 And I don't know
00:34:44.860 if this guy
00:34:45.380 could have been
00:34:45.940 stopped
00:34:47.120 with a background
00:34:47.820 check.
00:34:48.940 But
00:34:49.320 here's the scariest
00:34:51.000 thing you'll hear
00:34:51.620 today.
00:34:52.620 Do you believe
00:34:53.800 that you could
00:34:54.480 write an algorithm
00:34:55.980 that would
00:34:57.500 detect
00:34:57.940 somebody
00:34:58.720 who is a
00:34:59.240 potential
00:34:59.680 mass shooter
00:35:00.420 before they
00:35:00.980 did the shooting?
00:35:02.900 Here's the things
00:35:03.800 that could have
00:35:04.360 been detected
00:35:04.960 about this
00:35:05.740 shooter in Denver.
00:35:07.860 He's
00:35:08.380 concerned about
00:35:10.300 Islamophobia.
00:35:13.320 He doesn't
00:35:14.280 like Trump.
00:35:15.440 Couldn't get a
00:35:16.220 girlfriend.
00:35:17.440 And there's some
00:35:17.940 indication of
00:35:18.640 mental illness.
00:35:20.780 Now, if they
00:35:21.520 also had access
00:35:22.340 to the fact that
00:35:23.040 he'd recently
00:35:23.640 bought a weapon,
00:35:25.500 could there be
00:35:26.200 any kind of
00:35:26.800 algorithm that
00:35:27.520 could see all
00:35:28.220 of those things?
00:35:29.680 Say a government
00:35:30.500 algorithm that
00:35:31.280 could look into
00:35:31.820 your financial
00:35:32.540 stuff and also
00:35:34.480 look into your
00:35:35.320 postings and
00:35:36.680 say, wait a
00:35:37.180 minute, wait a
00:35:37.520 minute, we've
00:35:37.920 got a bad
00:35:38.580 match here.
00:35:39.800 Here's a guy
00:35:40.380 who's got sort
00:35:41.280 of the
00:35:41.540 characteristics that
00:35:43.360 doesn't look
00:35:43.860 safe.
00:35:45.240 He doesn't have
00:35:45.820 a happy life,
00:35:46.780 little mental
00:35:47.340 problems, and
00:35:48.600 he's got a
00:35:49.200 political motive.
00:35:51.820 That's bad.
00:35:53.340 And then he
00:35:53.700 just bought this
00:35:54.980 particular type
00:35:56.260 of weapon.
00:35:57.500 And there's
00:35:58.160 nothing on his
00:35:58.780 social media to
00:36:00.140 suggest he's a
00:36:00.960 hunter.
00:36:01.960 And there's
00:36:02.500 nothing on his
00:36:03.100 social media to
00:36:03.860 suggest he's
00:36:05.040 just a Second
00:36:05.680 Amendment guy.
00:36:07.620 Right?
00:36:08.840 Do you think you
00:36:09.800 couldn't write an
00:36:10.600 algorithm that
00:36:11.260 would find the
00:36:11.880 shooters before
00:36:12.540 they shot?
00:36:13.460 Now, you would
00:36:14.040 find too many
00:36:14.720 people.
00:36:15.120 That's the
00:36:15.400 problem, right?
00:36:16.420 You'd find way
00:36:17.240 more people than
00:36:18.060 actual people who
00:36:19.020 commit crimes.
00:36:21.260 But I feel like
00:36:22.320 we're heading in
00:36:22.840 that direction.
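If it helps to picture what such a matching algorithm might look like, here is a purely hypothetical sketch of a rule-based score built from the signals listed above. Every feature name, weight, and threshold is invented for illustration; nothing here describes the FBI's actual methods, and the closing comment restates the false-positive problem just raised.

```python
# Hypothetical, illustrative sketch only. The signals, weights, and threshold
# are invented for this example and do not describe any real system.

HYPOTHETICAL_WEIGHTS = {
    "political_grievance_posts": 2,   # angry ideological postings
    "signs_of_mental_illness": 3,
    "social_isolation": 1,            # e.g. "couldn't get a girlfriend"
    "recent_rifle_purchase": 3,
    "hunting_or_2a_context": -3,      # a benign explanation lowers the score
}

def risk_score(person: dict) -> int:
    """Sum the weights of whichever flags apply to this person."""
    return sum(weight for flag, weight in HYPOTHETICAL_WEIGHTS.items() if person.get(flag))

profile = {
    "political_grievance_posts": True,
    "signs_of_mental_illness": True,
    "social_isolation": True,
    "recent_rifle_purchase": True,
    "hunting_or_2a_context": False,
}

THRESHOLD = 7  # arbitrary cutoff, for illustration only
print(risk_score(profile), risk_score(profile) >= THRESHOLD)  # 9 True

# The catch raised above: these traits are common and mass shooters are
# extremely rare, so any such threshold flags far more innocent people
# than actual offenders (the base-rate / false-positive problem).
```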
00:36:29.020 Hello?
00:36:29.460 Hello?
00:36:29.540 Hello, Adam.
00:36:34.280 Yes?
00:36:36.400 Good morning.
00:36:37.400 This is Jack with
00:36:38.400 Today Capital.
00:36:39.420 How are you doing
00:36:39.920 today?
00:36:40.600 Oh, I'm great.
00:36:42.280 But, you know, I
00:36:43.080 don't need any
00:36:43.600 capital.
00:36:45.380 I understand that.
00:36:46.860 Actually, as a
00:36:47.680 direct lender, we
00:36:48.580 have 15-plus loan
00:36:49.820 programs where we
00:36:51.340 don't need any
00:36:51.860 guarantee, no
00:36:52.780 personal use.
00:36:54.480 So do you need
00:36:55.500 any additional
00:36:56.020 capital for your
00:36:56.880 business?
00:36:57.140 Yeah, if you
00:36:57.880 could send me a
00:36:58.440 check, that'd be
00:36:58.940 great.
00:37:01.880 Probably had more
00:37:02.600 to say about that,
00:37:03.460 but I think he's
00:37:04.620 going to send me a
00:37:05.240 check.
00:37:06.840 All right, let's
00:37:07.460 say...
00:37:08.460 So everybody's
00:37:13.300 talking about the
00:37:13.940 ethnicity of the
00:37:15.240 shooter.
00:37:16.340 And in the
00:37:17.600 beginning, people
00:37:19.300 like me and
00:37:20.480 people like
00:37:22.320 Kamala Harris'
00:37:25.080 niece, I think it
00:37:26.020 was.
00:37:26.260 Was it her
00:37:27.120 niece?
00:37:27.500 Yeah.
00:37:28.140 Her niece said,
00:37:29.680 well, it's probably
00:37:30.820 a white guy with
00:37:31.840 a big weapon.
00:37:34.600 You may recall
00:37:35.520 that I said the
00:37:36.360 same thing.
00:37:37.720 I think it's
00:37:38.360 probably a white
00:37:39.860 guy.
00:37:40.920 How's that feeling
00:37:42.540 about it?
00:37:43.560 Turns out it
00:37:44.120 wasn't.
00:37:45.320 And yesterday, I
00:37:46.320 had also raised
00:37:47.680 the question about
00:37:48.560 what his ethnicity
00:37:50.600 might be because it
00:37:52.200 hasn't been
00:37:52.660 released.
00:37:53.220 The longer it
00:37:54.680 takes them to
00:37:55.200 release the
00:37:55.900 ethnicity, the
00:37:56.720 more you suspect
00:37:57.480 it wasn't a
00:37:58.180 white guy.
00:37:59.920 I feel like if
00:38:00.840 it's a white guy,
00:38:01.540 you know right
00:38:01.960 away what the
00:38:02.480 race is.
00:38:03.320 But people were
00:38:03.920 saying that since
00:38:04.660 they took him
00:38:05.260 alive, he must
00:38:07.000 have been a
00:38:07.360 white guy.
00:38:10.680 But that
00:38:11.580 turned out not to be
00:38:12.420 the case.
00:38:13.120 So everybody got
00:38:13.920 to have their
00:38:14.680 biases tested by
00:38:16.800 this situation.
00:38:17.560 I got my
00:38:19.020 biases tested.
00:38:22.280 Now the bias
00:38:23.260 is just, in my
00:38:24.320 case, it was
00:38:24.780 statistical.
00:38:26.180 I just thought
00:38:26.760 statistically, the
00:38:27.980 odds are good it
00:38:28.940 was a white guy.
00:38:30.400 But it wasn't.
00:38:33.500 I saw a quote
00:38:34.540 from, I think, a
00:38:36.220 gentleman who
00:38:37.080 lives in India, if
00:38:38.160 I'm correctly
00:38:39.240 seeing that.
00:38:41.560 And he tweeted
00:38:42.400 this.
00:38:42.700 He said, this is
00:38:43.220 very brazen, he
00:38:48.260 just said it was
00:38:49.140 brazen that they're
00:38:50.140 just openly
00:38:50.760 discriminating against
00:38:52.000 white people now
00:38:52.880 in the Senate.
00:38:53.940 You probably heard
00:38:54.620 that Tammy Duckworth
00:38:55.660 and Mazie
00:38:57.980 Hirono said
00:38:59.120 they'll vote
00:38:59.640 against all Biden
00:39:00.540 nominees who
00:39:01.360 are not racial
00:39:03.000 minorities.
00:39:04.900 So these stories
00:39:06.240 end up becoming
00:39:06.940 the same story.
00:39:08.040 Because every story
00:39:09.020 just turns into a
00:39:10.180 racial filter.
00:39:12.120 Oh, the shooter,
00:39:13.220 it's racial.
00:39:14.920 Yeah, in the
00:39:15.720 comments, somebody
00:39:16.280 is saying the
00:39:16.720 shooter was already
00:39:17.540 on FBI radar.
00:39:18.860 Which means they
00:39:19.760 already do have an
00:39:20.860 algorithm.
00:39:22.440 Apparently, they do
00:39:23.480 have an algorithm
00:39:24.280 that can put you
00:39:25.560 on their radar
00:39:26.360 just by your
00:39:27.540 social media traffic.
00:39:29.480 And if they
00:39:30.120 haven't yet bashed
00:39:31.180 that against
00:39:31.800 firearm purchases,
00:39:34.140 why haven't they?
00:39:35.820 That seems like
00:39:36.380 the obvious thing
00:39:36.980 to do.
00:39:40.180 So, amazingly,
00:39:41.560 two Democratic
00:39:42.180 senators said that
00:39:43.140 you can't,
00:39:44.000 that they won't
00:39:44.540 vote for white
00:39:45.220 people to fill
00:39:46.580 these cabinet
00:39:47.500 positions.
00:39:48.800 And I almost
00:39:53.680 don't know what
00:39:54.360 to say about
00:39:55.000 that.
00:39:56.920 They literally
00:39:57.900 won't vote for
00:39:59.860 white people.
00:40:00.840 And they said
00:40:01.500 that out loud.
00:40:02.720 Now, I think it
00:40:03.340 was Duckworth who
00:40:04.240 may have pulled
00:40:05.420 that back a little
00:40:06.160 bit.
00:40:06.420 But they would
00:40:09.740 only make an
00:40:10.260 exception if
00:40:11.000 you're LGBTQ.
00:40:13.980 Which is an
00:40:14.780 interesting exception
00:40:15.660 because if I were
00:40:16.960 a nominee
00:40:17.480 and I were
00:40:19.920 an adult white
00:40:21.580 guy who was
00:40:22.240 qualified for the
00:40:23.140 office, and I
00:40:24.540 knew that Tammy
00:40:25.260 Duckworth and
00:40:25.980 Mazie Hirono would
00:40:26.820 not vote for me
00:40:27.640 unless I were
00:40:29.700 LGBTQ, I would
00:40:32.160 become LGBTQ
00:40:33.420 within minutes.
00:40:36.960 Because you just
00:40:37.760 have to say you
00:40:38.440 are, right?
00:40:39.780 I'm not wrong
00:40:40.680 about that.
00:40:42.700 And if you
00:40:44.160 think I'm
00:40:44.560 joking, no.
00:40:46.820 No.
00:40:47.280 If I were a
00:40:48.580 nominee and I
00:40:49.480 were a white
00:40:49.880 guy, and I
00:40:50.880 thought I couldn't
00:40:51.440 get the job
00:40:52.020 because of my
00:40:52.820 race, unless I
00:40:54.520 were LGBTQ, I
00:40:56.480 would stand in
00:40:57.200 front of the
00:40:57.600 public and say,
00:40:59.200 you know, I
00:41:01.600 maybe am not
00:41:02.720 naturally leaning
00:41:03.720 that way, but as
00:41:05.680 a lifestyle choice
00:41:06.680 it is my right to
00:41:07.700 make it.
00:41:08.920 And I declare
00:41:09.760 myself to be
00:41:10.660 gay for the
00:41:12.480 purposes of
00:41:13.120 employment, and
00:41:15.240 I understand I
00:41:16.640 don't have to
00:41:17.080 actually have sex
00:41:17.860 with anybody.
00:41:19.340 Because you can
00:41:20.460 be gay without
00:41:21.380 the actual sexual
00:41:22.340 act, obviously.
00:41:24.020 You know,
00:41:24.200 heterosexuals who
00:41:25.080 are not having
00:41:25.780 sex are still
00:41:26.440 heterosexual.
00:41:26.940 So I don't
00:41:28.260 have to prove
00:41:29.060 I'm gay.
00:41:30.500 I just have to
00:41:31.600 be gay in my
00:41:33.840 personal preferences.
00:41:36.640 And I would
00:41:38.360 play it completely,
00:41:40.340 ironically, I
00:41:40.980 would play it
00:41:41.420 straight.
00:41:42.540 I wouldn't joke,
00:41:44.140 and I wouldn't
00:41:44.780 say I was kidding,
00:41:45.720 no matter how
00:41:46.420 many people asked.
00:41:47.840 I would look you
00:41:48.840 right in the eye
00:41:49.580 and say, yes,
00:41:52.360 totally honest,
00:41:54.240 I'm as gay as
00:41:55.380 you can get.
00:41:56.940 For the job.
00:41:58.260 I'm doing it for
00:41:59.140 the job.
00:42:00.180 But I'm
00:42:00.500 definitely gay.
00:42:02.260 For the job.
00:42:05.860 Now, this is
00:42:06.780 one of my
00:42:09.600 perma-trolls,
00:42:10.860 somebody on
00:42:11.460 Twitter called
00:42:11.940 Hampton Stevens.
00:42:13.660 He saw me
00:42:14.400 tweeting about
00:42:14.940 these issues.
00:42:16.640 And replying to
00:42:18.680 somebody else,
00:42:19.260 he goes,
00:42:19.680 right, Adams is
00:42:20.960 going full-on
00:42:21.980 white grievance.
00:42:24.260 To which I
00:42:25.060 tweeted in reply,
00:42:26.020 yes, I am.
00:42:29.660 Our system works
00:42:30.720 best with guard
00:42:31.640 rails in every
00:42:32.380 direction, and
00:42:33.540 only the people
00:42:34.200 who can survive
00:42:35.020 the blowback
00:42:35.820 have the ability
00:42:37.280 to perform that
00:42:38.320 public task.
00:42:40.280 You can sit it
00:42:41.300 out if you're
00:42:41.820 frightened, I
00:42:43.040 understand.
00:42:44.420 So I told my
00:42:45.800 critic that if
00:42:49.440 he's frightened to
00:42:51.000 be part of this
00:42:51.620 conversation, I
00:42:52.360 understand.
00:43:54.160 He has every
00:42:54.160 right to be
00:42:54.640 frightened.
00:42:56.200 It's a scary
00:42:57.140 thing.
00:42:57.880 And the only
00:42:58.680 people who can
00:42:59.160 even talk about
00:42:59.840 this topic in
00:43:00.660 public are
00:43:01.840 people who are
00:43:02.420 bulletproof.
00:43:03.780 And I'm kind of
00:43:05.040 bulletproof.
00:43:06.220 So it's almost
00:43:07.520 like this Spider-Man
00:43:08.420 problem, which is,
00:43:09.840 I don't really want
00:43:10.760 to talk about this.
00:43:12.400 I really don't.
00:43:13.560 I'm just not
00:43:14.160 interested.
00:43:15.320 But I'm the only
00:43:16.160 one who can.
00:43:17.200 I mean, not the
00:43:17.680 only one.
00:43:18.100 But there's a
00:43:19.140 small number of
00:43:19.920 people who can
00:43:21.080 even talk about
00:43:21.800 this in public and
00:43:22.760 not have their
00:43:23.660 life ruined.
00:43:25.120 So I guess I'm
00:43:27.580 one of them now
00:43:28.240 because I can.
00:43:30.780 And it's
00:43:31.400 important.
00:43:33.060 Full-out racism
00:43:34.180 in the Senate.
00:43:35.380 Tom Cotton
00:43:36.100 called it out as
00:43:37.460 well on Twitter.
00:43:38.660 And I feel as if
00:43:39.940 they should be
00:43:40.460 removed from the
00:43:41.240 Senate.
00:43:43.420 Like they should
00:43:44.400 really be removed
00:43:45.280 from the Senate.
00:43:45.880 As in, all of
00:43:49.500 the white people
00:43:50.480 in the Senate
00:43:51.240 should walk out
00:43:52.120 until they are.
00:43:53.760 If I were a
00:43:55.040 white, straight
00:43:57.340 Republican, I
00:43:59.500 would refuse to
00:44:00.660 do service.
00:44:01.720 I would refuse to
00:44:02.620 do the work of
00:44:03.180 the people until
00:44:04.200 they're gone.
00:44:06.020 I wouldn't do
00:44:06.820 any work of the
00:44:07.720 public.
00:44:08.140 I wouldn't do
00:44:08.540 anything in the
00:44:09.220 Senate.
00:44:09.820 I wouldn't look
00:44:10.520 at a nomination.
00:44:13.440 I wouldn't even
00:44:14.360 look at the
00:44:15.280 nomination.
00:44:15.880 I wouldn't even
00:44:17.200 show up until
00:44:19.580 they get rid of
00:44:20.200 these two people.
00:44:21.940 Because if you
00:44:22.720 think there's
00:44:23.140 something more
00:44:23.700 important than
00:44:24.280 this that they're
00:44:24.840 working on,
00:44:25.960 well, you're
00:44:26.480 wrong.
00:44:27.640 This is the
00:44:28.480 most important
00:44:29.100 thing that
00:44:29.520 happened in
00:44:30.560 the country
00:44:31.040 that has to
00:44:32.160 get fixed.
00:44:33.160 That people
00:44:33.900 were openly
00:44:34.740 racist in the
00:44:36.080 Senate and
00:44:38.040 don't think
00:44:38.520 there'll be
00:44:38.800 repercussions for
00:44:39.720 that.
00:44:41.020 So you need
00:44:41.820 to play by
00:44:42.360 their own game,
00:44:43.400 which is, if
00:44:44.160 it's racist,
00:44:44.880 it's racist.
00:44:45.340 And you
00:44:45.660 have to
00:44:45.900 deal with
00:44:46.240 it.
00:44:46.380 It's the
00:44:46.680 biggest issue.
00:44:47.700 And this
00:44:48.060 is really
00:44:48.680 racist.
00:44:49.780 This is
00:44:50.240 like super
00:44:50.880 racist.
00:44:52.060 This is
00:44:52.380 as racist
00:44:52.800 as you
00:44:53.200 can be.
00:44:55.820 Does
00:44:56.260 anybody feel
00:44:56.840 sorry for
00:44:57.380 the white
00:44:57.960 people who
00:44:58.440 will be
00:44:58.720 victimized by
00:44:59.420 this?
00:45:00.000 No.
00:45:00.760 But you
00:45:01.140 still can't
00:45:01.760 be racist.
00:45:03.040 It doesn't
00:45:03.320 work in the
00:45:03.740 long run.
00:45:04.720 Now, I've
00:45:05.280 told you
00:45:05.540 before the
00:45:05.940 only way to
00:45:06.440 stop the
00:45:07.360 crazy brain
00:45:08.580 damaged part of
00:45:09.720 the left
00:45:10.240 from the
00:45:12.520 slippery slope
00:45:13.340 that just
00:45:14.360 goes forever
00:45:14.980 into madness
00:45:15.660 is
00:45:18.280 aggressively
00:45:20.940 agreeing with
00:45:21.780 them.
00:45:23.700 Disagreeing
00:45:24.180 with somebody
00:45:24.860 who has
00:45:25.120 brain damage
00:45:25.720 doesn't work.
00:45:27.480 Do you
00:45:27.740 know why?
00:45:28.980 It's sort
00:45:29.960 of in the
00:45:30.360 setup.
00:45:31.020 It's the
00:45:31.380 brain damage
00:45:31.920 part.
00:45:32.880 If they
00:45:33.520 didn't have
00:45:33.980 brain damage,
00:45:34.680 well, you
00:45:34.980 could probably
00:45:35.280 reason with them.
00:45:35.900 But they
00:45:36.660 do.
00:45:37.680 So you
00:45:38.040 can't.
00:45:39.140 So you
00:45:39.420 have to use
00:45:39.880 persuasion,
00:45:40.620 not reason.
00:45:42.420 And I
00:45:43.760 would say
00:45:44.060 the way to
00:45:44.460 do that
00:45:44.920 would be
00:45:45.760 to embrace
00:45:47.920 their policies
00:45:48.760 and watch
00:45:49.500 them break.
00:45:50.680 So that's
00:45:51.360 happened with
00:45:52.020 immigration,
00:45:52.960 right?
00:45:53.740 So the
00:45:54.380 Democrats got
00:45:55.140 to try their
00:45:55.880 immigration plan
00:45:56.800 and got to
00:45:58.140 see it didn't
00:45:58.560 work.
00:45:59.860 And that
00:46:00.540 would be true
00:46:00.980 of a lot of
00:46:01.420 stuff they're
00:46:02.040 doing.
00:46:02.360 If they
00:46:02.640 got their
00:46:03.140 way, if
00:46:04.360 the dog
00:46:04.820 caught the
00:46:05.320 car,
00:46:05.900 it wouldn't
00:46:06.760 know what
00:46:07.040 to do
00:46:07.280 with it.
00:46:08.160 So sometimes
00:46:08.860 you have to
00:46:09.260 let the dog
00:46:09.760 catch the
00:46:10.160 car and
00:46:10.520 say,
00:46:10.740 all right,
00:46:11.100 dumbass.
00:46:12.280 Now what
00:46:12.560 are you going
00:46:12.800 to do
00:46:12.980 with it?
00:46:13.920 Chew on
00:46:14.320 the bumper?
00:46:15.380 How's it
00:46:15.760 taste?
00:46:17.120 Sometimes you
00:46:17.740 have to let
00:46:18.180 them get
00:46:18.560 their way
00:46:19.140 because they're
00:46:20.140 brain damaged
00:46:20.800 and just see
00:46:21.940 what happens
00:46:22.420 and then talk
00:46:23.260 themselves out
00:46:23.980 of it.
00:46:24.300 They can
00:46:24.520 talk themselves
00:46:25.080 out of it
00:46:25.420 that way.
00:46:26.740 So that's
00:46:28.800 why I think
00:46:29.340 the GOP
00:46:29.980 should just
00:46:30.420 shut down
00:46:30.780 the Senate
00:46:31.140 because that's
00:46:33.380 what the
00:46:33.640 Democrats would
00:46:34.300 do.
00:46:34.500 If there
00:46:36.280 had been
00:46:36.760 let's say
00:46:38.240 a GOP
00:46:38.960 member who
00:46:39.520 had said
00:46:39.840 something that
00:46:40.380 was not
00:46:41.520 just maybe
00:46:42.640 a little
00:46:43.040 racist,
00:46:44.360 not just
00:46:44.920 a dog
00:46:45.320 whistle,
00:46:46.480 but direct.
00:46:47.620 if some
00:46:49.540 white adult
00:46:51.180 Republican
00:46:52.220 had made
00:46:52.740 a direct
00:46:53.700 racist
00:46:54.780 statement
00:46:55.420 like this,
00:46:58.020 I think
00:46:58.920 the Democrats
00:46:59.420 would just
00:46:59.840 stop the
00:47:00.260 Senate,
00:47:00.680 wouldn't they?
00:47:01.440 They would
00:47:01.740 just stop
00:47:02.160 everything and
00:47:02.620 say,
00:47:02.780 oh,
00:47:03.060 wait a minute,
00:47:04.300 this is the
00:47:04.960 biggest problem.
00:47:07.740 Nothing else
00:47:08.580 is as big a
00:47:09.400 problem as
00:47:10.080 this is.
00:47:10.560 These two
00:47:10.960 people have
00:47:12.120 to be gone
00:47:12.900 or we're not
00:47:14.760 going to do
00:47:15.060 any other
00:47:15.500 business.
00:47:17.620 Somebody's
00:47:18.860 using the
00:47:19.340 example of
00:47:19.980 Representative
00:47:20.520 King.
00:47:21.760 King is an
00:47:22.620 example of
00:47:23.360 the dog
00:47:24.300 whistle type.
00:47:25.720 Some people
00:47:26.180 thought they
00:47:26.680 heard it in
00:47:27.300 his words,
00:47:28.080 other people
00:47:28.680 said he's
00:47:29.520 just bad at
00:47:30.260 talking in
00:47:30.780 public.
00:47:31.860 It's not
00:47:32.360 really there,
00:47:32.960 but you can
00:47:33.280 imagine it if
00:47:34.060 you want to.
00:47:34.900 That's different.
00:47:36.460 The dog
00:47:36.920 whistle stuff is
00:47:37.800 a little bit
00:47:38.240 mind reading and
00:47:39.160 you have to
00:47:39.500 assume you know
00:47:40.200 what they're
00:47:40.480 thinking and
00:47:41.300 you can't.
00:47:41.880 But if
00:47:42.820 somebody says
00:47:43.320 it directly,
00:47:44.920 as Duckworth
00:47:45.940 and Hirono
00:47:46.700 did, you
00:47:48.720 have to stop
00:47:49.500 the Senate
00:47:50.020 and fix
00:47:51.320 that.
00:47:52.580 Nothing else
00:47:53.520 matters today.
00:47:54.900 You gotta
00:47:55.540 fix it.
00:47:57.700 So,
00:47:59.180 the other
00:48:02.160 way to go
00:48:02.720 would be for
00:48:04.420 the GOP to
00:48:05.540 embrace it
00:48:06.480 and use the
00:48:08.240 aggressive
00:48:08.740 agreeing idea.
00:48:09.940 Imagine, if
00:48:12.100 you will,
00:48:12.560 that the GOP
00:48:13.400 introduced some
00:48:14.180 legislation to
00:48:15.640 ban white men
00:48:16.940 from serving in
00:48:18.120 the Senate from
00:48:19.200 that point on,
00:48:20.700 or serving in
00:48:21.600 a cabinet
00:48:21.980 position from
00:48:23.720 that point on.
00:48:25.360 What would
00:48:26.080 you do?
00:48:27.600 What would
00:48:28.000 happen if the
00:48:28.840 GOP said,
00:48:29.860 well, let's
00:48:30.720 take your
00:48:31.260 lead, Duckworth
00:48:33.220 and Hirono.
00:48:33.860 We're going to
00:48:34.360 put together a
00:48:35.100 bill that says
00:48:35.900 that white people
00:48:36.620 can't serve in
00:48:37.440 government.
00:48:37.780 What would happen? Because if they disagree with the idea, it's just another disagreement; it means nothing. But what would happen if they said, yeah, let's take this to where you want it to be? Let's put it into law.
00:48:53.400 And then put all the Democrats on record to see if they'd vote for it. See if they'll vote for it. Put them on record. See how many Democrats will vote that no white people can serve in the government. You like it, don't you?
00:49:11.000 So this is why I could never be in the Senate, because I would just be trolling the other side nonstop. All right. Purge the whites.
00:49:30.140 Yeah, Schumer wouldn't put it on the calendar. Well, it doesn't matter. It doesn't matter if it gets voted on. It matters if they would do it, because it would be a news attractor.
00:49:47.520 That makes you overqualified for the Senate. Yeah, you definitely wouldn't want me in the Senate. Although that would be fun. Maybe the Senate would be better than being president, because I could cause more trouble in the Senate.
00:50:02.220 They're so brain damaged, they would probably vote for it. Yeah, but you'd really be going for the other half of the Democrats, who are not brain damaged. They just have different policy preferences.
00:50:15.300 All right. That's all I got for now. And follow me on Locals. I'm doing lots of stuff on reframing.
00:50:23.440 So I've discovered that people like my lessons on reframing best: how to look at the stuff you're already looking at in a different way, so that you can deal with it more productively. So I'm putting up more reframing lessons on there.
00:50:38.240 And I've been asked a number of times lately by my critics to explain how I was so sure that Trump would win the election, even after the election, because I was suggesting that there might be some improprieties discovered.
00:50:59.860 People say, well, I guess you're wrong. To which I say, am I? Am I wrong? We only know that it hasn't been checked. How do you know I'm wrong on the thing you haven't checked? Because that's the whole point, isn't it?
00:51:21.800 Now, we know that the courts did not overrule anything. We know that Biden's in office. I'm not arguing any of that. I'm just saying, you don't know if I'm wrong. You just know we didn't check.
00:51:34.620 So I tweeted the other day that sometimes you think I'm wrong, but really you just haven't waited long enough.
00:51:41.700 That's one of these. I cannot.
00:51:50.340 Can we get a simultaneous sip every time you speak publicly? Well, I don't know if you've noticed, but I've stopped speaking publicly. Has anybody noticed? I just stopped taking interviews.
00:52:03.980 Unless it gets postponed, I'm going to be doing a Clubhouse event a little later this morning. And I haven't been on Clubhouse yet. I just signed up and have been playing with it this morning.
00:52:19.780 And if this happens, and I think it's going to happen, I'm going to be defending the indefensible. So I'm going to be picking a topic which can't be defended, and then I'm going to defend it anyway.
00:52:33.800 And I'm going to have a guest, a controversial guest. Controversial. Oh yeah, really controversial. So look for that, unless it gets postponed. And I will talk to you tomorrow.
00:52:48.900 I just saw a comment go by that says the Beatles are the best band ever. I am totally obsessed with looking at old Beatles videos and stories, and learning about how they succeeded. Because that's really interesting.
00:53:12.280 And you see that they had systems. The Beatles had systems; they didn't have goals. And when you look at their systems, it becomes almost obvious why they succeeded. They just had better systems.
00:53:30.140 And I'll give you one example. One example would be when they were writing a song, different band members would have little bits of ideas that are completely different.
00:53:41.740 And the Beatles would say, well, your little bit is awesome, but you couldn't build a whole song around it. And if George has a little bit that's awesome, why don't we just put the bits together and make a song of all the random bits?
00:53:57.960 And then you think, well, what about the vocals? And the vocals? Well, we'll just do random sentences.
00:54:05.900 What? You're going to make music that has random vocals that literally don't mean anything, and your song will be a hodgepodge of different things you just like, put together?
00:54:19.780 And that's what they did. And it turns out that that's a better system, because people like music where they like all the good parts.
00:54:27.720 You've noticed that a lot of modern music has this: you might have the hip-hop artist paired with, you know, Rihanna or somebody, somebody who can sing. And then you get a little bit of rap and a little bit of singing, and you kind of like it, because it's, oh, I like a little bit of that, a little bit of that.
00:54:45.680 So the Beatles sort of pioneered putting things together that shouldn't be together. Sort of the pineapple-on-the-pizza sort of thing. And they did tons of experimenting and tons of practice.
00:54:58.020 They were stifled doing live performances because they couldn't hear themselves; the crowds were too loud. And so they changed their system. They moved to the studio and started making sounds that you'd never heard before.
00:55:14.660 So another part of their system is they would mic their equipment differently than other people. I think in one case a piano was miked from underneath. They would have different devices that nobody else used, different sounds. And they would just make music out of sounds.
00:55:34.500 So once you dig into what the Beatles did, systems-wise, it's pretty impressive. So it's not exactly an accident that they succeeded. I mean, their talent was off the charts and they had exactly the right people, but their systems were amazing. That's the real story of it.
00:55:52.820 All right. That's all for now. Talk to you tomorrow.