Real Coffee with Scott Adams - April 18, 2021


Episode 1348 Scott Adams: How to Defund Police With Technology, Masks After Vaccinations, Why Liberal Men Are Unhappy, More


Episode Stats

Length

57 minutes

Words per Minute

152.7

Word Count

8,850

Sentence Count

600

Misogynist Sentences

7

Hate Speech Sentences

9


Summary

In my ongoing segment about Joe Biden evolving into Donald Trump, we now have Biden playing golf while there's a crisis on the border. What's the deal with that? And why does he look so good doing it?


Transcript

00:00:00.000 It's time for Coffee with Scott Adams. Best part of your day, guaranteed.
00:00:05.600 And if it's not, double your coffee back.
00:00:08.960 And you like coffee, so everybody wins.
00:00:12.260 If you'd like this to really exceed your expectations, and I think it will,
00:00:18.760 all you need is a cup or mug or a glass, a tank or chalice or stein, a canteen jug or flask,
00:00:22.780 a vessel of any kind. Fill it with your favorite liquid.
00:00:25.720 I like coffee. And join me now for the unparalleled pleasure, the dopamine hit of the day,
00:00:32.680 the thing that makes everything better. It's called the simultaneous sip.
00:00:37.300 And I hope you are ready now.
00:00:46.160 Well, if that wasn't enough excitement for you, you can see that there's going to be a whiteboard
00:00:51.820 presentation later. Yes, yes, it's that good.
00:00:57.740 But let's talk about all the things.
00:01:01.220 In my ongoing segment of Joe Biden evolving into Donald Trump,
00:01:06.220 we now have Biden playing golf while there's a crisis on the border.
00:01:11.280 So remember, every time you see Biden just evolving into a version of Trump,
00:01:18.880 just remember who told you it was going to happen months before it did.
00:01:22.660 Now, we know from the Project Veritas undercover video,
00:01:30.740 at least according to CNN's technical director who was caught on camera,
00:01:35.500 that they actively try to make Joe Biden look like he's, you know, vital and young and fit.
00:01:42.720 And they said it out loud, you know, directly said,
00:01:46.240 we're trying to make him look like he's a healthy guy.
00:01:49.360 And I have to compliment whoever dressed him for the photo shoot,
00:01:55.020 because you don't see him golfing.
00:01:56.900 I don't know if there's any photo.
00:01:58.280 I don't know if there's any video.
00:02:00.500 Is there any video of him actually golfing?
00:02:04.360 Because I'd sort of like to see his swing, if you know what I mean.
00:02:08.120 But my compliments to the wardrobe propagandist.
00:02:13.580 He looks really good.
00:02:15.660 There's a photograph of him in a really flattering golf outfit that looks new and shiny.
00:02:21.540 And, you know, if I'm going to be objective, there is a thing that Joe Biden does very well,
00:02:29.880 which is he's a pretty good role model for fitness, right?
00:02:33.160 I believe he's been a lifelong exerciser, keeps his weight under control, even at his age.
00:02:40.820 You've got to admit, that's not nothing, right?
00:02:45.100 I wish we had more presidents who would have sort of a fitness bias.
00:02:50.480 But it's funny, because we know that this is, you know, sort of a staged propaganda kind of thing.
00:02:56.580 But it still works.
00:02:57.740 I don't mind when our leaders, I don't mind at all when a president does a staged thing to look, you know, strong and athletic.
00:03:08.340 That's fine.
00:03:09.840 That's good technique, actually.
00:03:13.940 What about this argument?
00:03:15.240 You know, the argument against letting kids use, let's say, private schools is that if you move the public funding out of the public schools and let it follow the students when they go to private schools, there'll be less funding for what's left behind.
00:03:34.720 So the public schools will fall apart, so goes the argument, if you let money flow to the private ones.
00:03:40.600 I would like to challenge that theory.
00:03:46.120 And here's the assumption where I think we differ.
00:03:49.760 Obviously, math is math.
00:03:52.400 If money moves away from the public schools, they'll have less money.
00:03:56.400 That does seem like a problem, right?
00:03:58.620 So, so far, that's not crazy.
00:04:01.200 But there's a difference in assumption.
00:04:02.900 My assumption is that massive extra funding for the schools is out there, both from private individuals, billionaires, you know, endowed whatevers, as well as the government, which probably could put in more.
00:04:18.820 So what keeps anybody from putting more into the school system?
00:04:23.860 What smart person pours money into a broken system that can't be fixed?
00:04:32.860 Nobody, right?
00:04:34.360 So you would be forgiven for believing that whatever money we have is a fixed pot.
00:04:40.900 And that if we move some of it, there will be less left over here.
00:04:45.200 Certainly true in the short run.
00:04:48.000 But it seems to me that because of teachers' unions, they've created this broken and unfixable system called the public schools.
00:04:56.560 And they're preventing competition because the unions are so strong they can prevent anybody from competing with them.
00:05:03.440 They can just get the laws changed and such, which they do.
00:05:07.200 So here's the thing.
00:05:11.320 Anyway, it seems to me that we live in a country where we could get unlimited funding for, let's say, startup education projects.
00:05:22.360 Because so many people want it, it's such an obvious public good, that I think billionaires would just open their wallets.
00:05:30.380 The only thing that they lack is something to fund.
00:05:35.360 Which sounds weird, doesn't it?
00:05:36.980 You know, wouldn't your first assumption be that schools are underfunded?
00:05:42.100 And of course they are.
00:05:43.360 But there's a very good reason.
00:05:45.160 If you put more money into the bad schools, apparently it doesn't make that much difference.
00:05:50.340 Because, you know, the system is broken.
00:05:53.480 So without competition, it wouldn't matter how much money you put into it.
00:05:57.420 Nobody has really an incentive to compete.
00:05:59.740 There is no competition.
00:06:01.240 And that's because of the teachers' unions.
00:06:02.860 So just summarizing, the difference between, let's say, the left's philosophy and the right's, and I've said this before, but you'll see it in all different topics,
00:06:17.220 is the left likes to assume that it's a zero-sum game.
00:06:22.140 Meaning that there's a fixed pot of wealth or resources, and if I have some of it, that's something you can't have.
00:06:30.080 So if I have some, that takes away what could have been yours.
00:06:34.920 That's not what the right tends to think.
00:06:37.580 The right tends to approach things from a perspective of abundance, which is, there's never a limited amount of anything.
00:06:46.640 You can make more.
00:06:48.460 How you would make more money for funding schools is by giving funders better things to fund.
00:06:53.420 There's always a way to make more for everybody.
00:06:55.840 But only the right seems to think that's even a thing.
00:06:58.420 The left seems to think, ah, it's a limited amount.
00:07:01.680 We've got to get ours.
00:07:02.920 And then everything falls apart.
00:07:06.300 Here's the most alarming thing that's happening today, that we're acting like it's not alarming.
00:07:12.440 I've now heard three high-end attorneys who understand the legal defense world.
00:07:22.020 Three of them so far. I think Dershowitz was one, Robert Barnes was another, and there was a third I saw, people who really know what they're talking about, predicting that it's unlikely that the facts of the case and the law will be used to decide the outcome.
00:07:43.120 And that it will come down to, I guess, how the jurors feel.
00:07:48.880 You know, the emotional impact of the whole thing will almost certainly decide it, meaning, you know, a greater than 80% chance, according to people who know what they're talking about.
00:07:58.180 Not me.
00:07:58.960 I don't know what I'm talking about.
00:08:00.340 But there's an 80% chance that there's an American citizen, an American citizen, who is innocent until proven guilty.
00:08:13.160 And we're sitting here like it's okay with the complete knowledge that there's an 80% chance that he will go to jail for at least 12 years because he's a white male.
00:08:25.920 Now, you're going to say to yourself, that's not why.
00:08:31.460 It's because he freaking killed a guy on video.
00:08:34.440 That's why he's going to jail.
00:08:36.420 You know, if he goes to jail, it's because he killed a guy right in front of us.
00:08:40.800 I get that, right?
00:08:42.000 That's a completely legitimate point of view.
00:08:45.740 It may not be technically exactly what happened because you kind of need to know his state of mind and such.
00:08:50.900 But certainly we watched something that the professionals have deemed a homicide.
00:08:57.220 And I'm not arguing that it's a homicide.
00:09:00.060 So certainly the human actions had something to do with it.
00:09:02.820 But still, you have to support the system, right?
00:09:09.780 It's not always about the person.
00:09:11.720 You know, you can imagine there are cases where you just need to punish some person badly enough that you'd bend a rule to do it.
00:09:17.380 But can you break the system in this case and just say, well, it's fine? And apparently this wouldn't be the one time this happens, obviously.
00:09:26.500 But why are we okay with this?
00:09:29.940 Why is it okay?
00:09:32.000 Because, and I have to be careful about how I say this.
00:09:37.080 I'm not saying that Chauvin is innocent or guilty.
00:09:40.280 That's actually irrelevant to my point.
00:09:42.200 My point is that his innocence or guilt won't be part of what puts him in jail or not.
00:09:50.860 It's just not part of the decision.
00:09:53.580 That ain't right.
00:09:55.140 So what will be part of the decision if not the law and the facts?
00:10:00.180 And again, this isn't my opinion.
00:10:01.600 This is people who are in the law and know what they're talking about.
00:10:04.100 If they're not going to be used, isn't it a racial outcome?
00:10:11.300 And how is it not, really?
00:10:14.320 How is this not?
00:10:15.800 I don't want to use the L word, because as soon as you use the L-Y-N-C-H, you know, you know that word, then it becomes a whole, you know, it becomes a distraction.
00:10:26.360 But clearly, the entire world is watching him getting ready to go to jail for 12 or more years, because he's a white male.
00:10:38.560 Now, he might also, let me be very careful, he might also be completely guilty.
00:10:46.660 I think the prosecution made a pretty good case.
00:10:48.900 But that's not what he's going to jail for, according to the people who know what they're talking about.
00:10:55.460 They're not even going to look at that.
00:10:57.240 They're just going to vote on how they feel.
00:11:00.260 How is that okay?
00:11:02.680 I guess we just don't have an alternative, right?
00:11:04.940 Nobody's got a better idea, and so we'll just say, well, okay, here's my theory.
00:11:10.440 The only reason that's okay is that we've been brainwashed.
00:11:15.160 That's my theory.
00:11:16.060 That you would not put up with this for a second if the media had not told you to do it.
00:11:24.960 Now, not directly, but the media has created a situation where this guy's guilty, and justice requires him to be punished.
00:11:35.200 Right?
00:11:36.260 And that narrative existed long before we even saw the defense's case, long before we saw the prosecution's case,
00:11:44.260 which I assert, again, is very strong.
00:11:47.940 So I'm not saying he's innocent of anything.
00:11:52.160 Prosecution made a good case.
00:11:53.580 It's just not going to be used to decide his fate.
00:11:57.380 I'm not cool with that.
00:11:58.900 I just don't know what to do about it.
00:12:01.040 So here's an interesting thing.
00:12:02.980 So now, you know, we know this.
00:12:04.440 And by the way, fact-check me on this following thing, because I don't follow the vaccination stuff too closely.
00:12:09.920 But my understanding is that the J&J vaccination is sort of being pulled back temporarily while they're looking into whether it's causing some exotic or rare blood problems.
00:12:22.280 But, apparently, the state of our data and reporting is so bad that we don't know if it's rare.
00:12:30.720 Apparently, it's kind of hard to tell who's having what kind of reactions.
00:12:35.620 Now, I hate to be a conspiracy theorist.
00:12:39.500 He lied.
00:12:40.800 Okay, I love to be a conspiracy theorist.
00:12:43.360 If I could be one thing, it would be a conspiracy theorist.
00:12:47.560 So, I kind of like it.
00:12:49.400 Okay, here's my conspiracy theory.
00:12:53.960 So, you've got billions of dollars at stake in these vaccinations.
00:12:58.700 I guess the government is paying it, but billions of dollars.
00:13:02.400 And you've got three competitors, right?
00:13:04.980 Three big pharmaceutical competitors.
00:13:09.140 Two of them are making this mRNA newer kind of vaccination, which requires two shots.
00:13:17.260 Would you rather have something that's newer and requires two shots, or something that's been around so long we understand the risks a little better in terms of the technology?
00:13:29.540 Not the specific shot, but the technology.
00:13:32.360 And you only need one.
00:13:34.180 Which of those do you like better?
00:13:35.480 Well, it's obvious, right?
00:13:37.080 If you could take the one that's always worked forever, and you only need one, right?
00:13:44.420 Now, here's the part I need to fact check.
00:13:46.420 The people who are getting the second shot and having all the symptoms, the J&J one doesn't have those side effects, right?
00:13:55.820 Can I get a fact check on that?
00:13:57.200 When I say side effects, I mean the feeling of the flu and like you got knocked down and stuff.
00:14:05.740 And people are saying that the other vaccinations had some side effects, too.
00:14:09.480 We just don't know if they're significant.
00:14:12.740 All right.
00:14:13.500 So, on paper, the J&J shot would be better because it's one shot instead of two,
00:14:22.720 better because it doesn't have the side effects that might make you lose a day of work,
00:14:27.820 better because it's not new technology, although, let's acknowledge, the new technology might be way better.
00:14:35.660 But it's new.
00:14:37.140 So, you just have a little less visibility, right?
00:14:41.020 So, what are the odds that the problems that J&J is having are real?
00:14:47.520 You see where I'm at with this?
00:14:48.760 Do you know who, I think, just visually, it seems like,
00:14:54.800 would be the biggest funder of the news?
00:14:58.920 It would be pharmaceuticals, wouldn't it?
00:15:01.480 They're the biggest funders of the news because they buy advertising on the news.
00:15:05.960 And so, the news has taken a side, basically.
00:15:10.400 Now, it doesn't look like it.
00:15:12.200 If you're just consuming the news, it doesn't look like they took a side.
00:15:15.800 It looks like they're just reporting the news, right?
00:15:18.760 The news says the J&J thing has a problem.
00:15:22.300 The news, these are just the facts.
00:15:24.560 No propaganda here, right?
00:15:26.000 Just the facts.
00:15:27.520 A problem has been raised.
00:15:30.180 J&J is looking into it.
00:15:32.060 A little bit more reporting.
00:15:33.480 A little bit more reporting on that.
00:15:35.060 Hey, did you notice?
00:15:36.040 A little bit more reporting.
00:15:37.760 Do you see the subtle difference between paid propaganda?
00:15:42.820 I'm not saying that's happening, by the way.
00:15:45.140 I'm just saying, could you tell the difference between just reporting the news and deciding that you wanted to kill one of the shots because the other two companies spend more on advertising?
00:15:58.140 Which, by the way, I don't know if that's the case.
00:16:00.020 I doubt that's exactly the case.
00:16:03.380 But my problem is, I used to live in a world where I could rule this out.
00:16:09.240 I could just say to myself, I don't live in a world where some company is going to start false rumors about another company, plant stories in the media, just so they can sell more vaccinations.
00:16:21.700 I certainly don't live in a world where something like that could, yeah, I do.
00:16:27.880 I do.
00:16:28.760 I'm not making a claim.
00:16:30.500 There's no allegation being made here.
00:16:33.980 I'm just saying, we live in a world where you see this and you say, is it a coincidence that the unambiguously best shot just got ruled out by the media?
00:16:47.600 Is that a coincidence?
00:16:48.680 It could be nothing but the news.
00:16:52.280 I have no information that would suggest this is anything but exactly what's happening.
00:16:57.900 Science spotted some problems.
00:17:00.460 Science is cautious.
00:17:01.840 Science pulled back.
00:17:03.460 Science is looking into it.
00:17:05.540 That's all we know.
00:17:07.440 But man, we don't live in a world where you can just assume that's what's happening.
00:17:11.000 We do live in a world where there's a billion dollars on the table if you can place a story in the media.
00:17:19.800 What would you do if you worked for a big company and you personally could just get insanely rich if just you could get a few friendly reporters to just boost the little story and maybe see if it gets picked up by the other ones so you don't have to bribe them?
00:17:36.980 It would be kind of tempting.
00:17:40.620 All you've got to do is give somebody a story that's completely true.
00:17:44.260 Hey, reporter, this is totally true.
00:17:47.040 You can check it yourself.
00:17:48.460 We've seen some problems with side effects with our competitors' vaccination.
00:17:52.760 Do you think this is a story?
00:17:53.900 I'll package it up for you so you don't even have to do much research.
00:17:57.720 Here are the people to talk to.
00:17:59.220 Here are the data and the source.
00:18:01.080 And here it is.
00:18:02.140 That would be a good story, wouldn't it?
00:18:05.160 That's how the world works.
00:18:07.340 Stories are planted.
00:18:08.660 They're not always found.
00:18:11.420 Big controversy about whether you should wear masks after vaccination.
00:18:15.940 Dr. Fauci says that since you could still get the virus, even vaccinated, that means you could still, in theory, spread the virus, because anybody who has it can spread it.
00:18:28.940 And therefore, getting the vaccination, Dr. Fauci says, is no excuse not to wear a mask.
00:18:36.060 Well, Dr. Jay Bhattacharya, who appeared on Fox News, argues this.
00:18:40.920 He said that because of that argument, Dr. Fauci is probably the number one anti-vaxxer in the country.
00:18:47.660 Boom.
00:18:48.460 Reframed.
00:18:49.880 Boom.
00:18:50.740 Here's his argument.
00:18:53.140 Dr. Jay Bhattacharya's argument is that because Fauci is telling people they still have to wear masks, he's kind of telling them they're not going to get enough benefit from the vaccination.
00:19:06.900 He's also kind of saying maybe, I'm not saying this, but if you were a conspiracy theorist and you were already worried that these vaccinations aren't even real, they're real, by the way, I wouldn't worry about them not being real, but it plays into every fear.
00:19:26.500 If you still have to wear a mask, don't you say to yourself, Dr. Fauci, you've been wrong about masks before, maybe once or twice.
00:19:37.060 So everything about this is like a caution flag to people who are already cautious.
00:19:43.500 This is a pretty good argument.
00:19:45.140 It's also a good reframe.
00:19:46.320 So by reframing Fauci as the obstacle to vaccinations and then showing a perfectly solid argument for why that would be the case, well done, persuasion-wise.
00:20:00.620 Now, let me speculate about why Dr. Fauci might be saying we need masks.
00:20:07.820 I think other people probably say it too.
00:20:09.960 You see, he's not alone.
00:20:11.160 But why would he say that?
00:20:12.280 One of the possible reasons he would say that when many of you are saying, no way, I'm not going to wear a mask after I get a vaccination.
00:20:20.320 That's cray-cray.
00:20:21.760 Well, reason number one, Dr. Fauci's job is not to make you happy.
00:20:29.140 It's not in his job description.
00:20:30.760 So if he had a choice of, you know, keeping one extra person alive because of his, you know, tight, tight recommendations versus, you know, 20 million people are unhappy and God knows what happens because of that, he's going to pick the thing that he gets, you know, measured on, how many people are you keeping alive?
00:20:52.420 So he's not exactly, not exactly, he's completely the wrong person to give you advice.
00:21:00.860 You get that, right?
00:21:02.380 If you're a Democrat, you think he's exactly the right person.
00:21:05.860 He's the expert.
00:21:07.500 I mean, if you're not an epidemiologist, people are going to come after you.
00:21:12.240 But you know who I would rather make that decision?
00:21:15.860 Nate Silver, right?
00:21:17.740 Now, he got a lot of crap because he said something about the statistics of the pandemic.
00:21:24.500 And people say, you're not an epidemiologist.
00:21:29.700 You're not a virologist.
00:21:31.240 So what are you saying about this?
00:21:33.000 No, he's a guy who knows risk management.
00:21:35.700 It's a risk management question with lots of variables.
00:21:39.520 Dr. Fauci's job is not to manage all the variables.
00:21:43.260 So when he gives you advice, what should you do?
00:21:45.520 You should ignore it.
00:21:49.120 He's the wrong person.
00:21:51.640 Suppose you have a brain tumor.
00:21:56.640 And the gardener from across the street comes over and says,
00:21:59.940 Hey, I heard you had a brain tumor.
00:22:01.740 Have you considered, you know, stuffing weeds in your ear?
00:22:06.160 You wouldn't really listen to your gardener.
00:22:08.520 Because your gardener might be very wise about the wrong things.
00:22:12.140 The gardener is not really who should give you advice about your brain tumor.
00:22:17.320 And I would argue that Dr. Fauci is exactly the wrong person to give you advice about wearing a mask.
00:22:24.100 Because he's sort of not allowed to consider all the variables.
00:22:27.760 He's sort of the medical guy.
00:22:31.560 So he just can't look at the whole risk management and quality of life situations.
00:22:37.320 He's just the wrong guy.
00:22:38.620 And you should know that.
00:22:39.900 Which is nothing about him as a person.
00:22:44.220 He's just not the right job for this.
00:22:46.520 Here is how you would make this decision.
00:22:52.020 Let me walk you through it.
00:22:54.040 So, is the real reason that we want masks on everybody because the non-vaccinated would start cheating?
00:23:04.680 They'd say, Oh, yeah.
00:23:06.280 There's no passports yet.
00:23:08.080 So, yeah, I had my vaccination.
00:23:10.380 I can go to the bar.
00:23:12.140 Yeah, don't worry.
00:23:13.140 I had my vaccination.
00:23:14.260 So, you have to worry because, remember, Fauci already told us masks don't work, when what he really meant early on in the pandemic was that there weren't enough of them.
00:23:27.400 So, we know Fauci has lied once, a really big lie, about the value of masks.
00:23:33.980 All right?
00:23:34.440 He might say he was mistaken, so we'll let him have that.
00:23:37.460 But we know he was wrong because he changed his mind.
00:23:41.240 Or maybe he was right the first time.
00:23:42.980 Whatever.
00:23:43.480 But he changed his mind, so he was wrong one of those times.
00:23:47.060 And so, he might be just telling people to wear masks because it's the only way you can tell without passports.
00:23:56.660 It's the only way you can tell that the people who are not vaccinated are also wearing masks.
00:24:02.640 We don't really know the risk of transmitting, right?
00:24:08.200 Has anybody told you?
00:24:09.980 What is the percentage risk that a vaccinated person could catch the virus?
00:24:17.140 Now, we know that can happen.
00:24:19.100 But then we also know that you have to have a pretty good dose of it to spread it.
00:24:23.920 And we know that the super spreaders are most of the spread.
00:24:26.500 But how many super spreaders will there ever be who had the vaccination?
00:24:33.380 And if they were a super spreader, I mean, it's going to be a special case with no immune system or something.
00:24:38.400 But, you know, they were going to be trouble anyway.
00:24:44.960 Right.
00:24:45.720 All right.
00:24:46.160 So, I don't think you can figure out what makes sense here.
00:24:52.520 And you're going to end up defaulting to a lifestyle decision.
00:24:56.000 In my opinion, you will never have enough data to know if it is wise or unwise to wear masks once you're vaccinated.
00:25:03.440 But if you have no data, and it's certainly not obvious that it would make a difference, and maybe there's a little risk.
00:25:12.180 I think you would admit there must be some risk.
00:25:14.040 I feel as if, you know, I'm very solidly in the don't wear a mask after you're vaccinated camp.
00:25:21.300 But I don't think we could know.
00:25:23.180 You know, you end up using your hunch or your instinct because you just don't know.
00:25:29.820 But everything we know suggests that masking after vaccination might approach just dumb.
00:25:37.560 Right.
00:25:38.260 But you have to be careful.
00:25:40.120 Your common sense can be misleading.
00:25:41.920 So, you never know.
00:25:42.480 Here's a trend that's happening.
00:25:44.600 I always say that if you know economics and business models, you understand how business models work, that you can see the future.
00:25:54.320 You can predict things because economics can be kind of predictable in some sense.
00:25:59.380 And here's a trend that's happening.
00:26:01.080 It's a really big deal.
00:26:03.180 All right, this following trend, if you're not watching this, you're really going to be blindsided by what could happen really quickly.
00:26:12.440 It looks like this.
00:26:14.000 There's a subscription service for authors called Substack.
00:26:19.880 Have you all heard of it yet?
00:26:21.280 It's a pretty big deal.
00:26:23.440 It's a real big deal, actually.
00:26:24.700 It's called Substack.
00:26:26.920 So, if you're an author, you can work there.
00:26:29.440 And at least at the moment, some of those notable authors, people who have brought an audience with them,
00:26:35.740 are making up to half a million a year by charging a subscription for their regular writing.
00:26:41.860 At the same time, New York Times and Wall Street Journal writers are somewhere in the $120,000-per-year range.
00:26:47.620 The Substack writers can write anything they want.
00:26:55.360 There must be some terms of service that they can't violate.
00:26:58.700 They couldn't go full Nazi, I'm sure.
00:27:01.240 But in terms of standard other platform censorship,
00:27:07.520 and in terms of somebody telling them from the top,
00:27:11.200 like a CEO saying, this is what you're going to cover, and this is your narrative, they don't have that.
00:27:15.880 So, Substack, because it's way more profitable than working for one of these propaganda legacy platforms,
00:27:26.060 is going to suck off all of, well, let me say that differently,
00:27:30.640 might absorb all the best writers, because why would you work for one-fifth of what you could earn?
00:27:37.040 So, Substack is going to suck all the people away from the ad-based businesses.
00:27:41.700 So, if you're an advertising-based business, you're in real trouble,
00:27:46.280 because you're not going to be able to hire talent, and without that, you're dead.
00:27:51.220 Similarly, there's a situation with YouTube.
00:27:55.440 So, YouTube is, you know, ad revenue, plus you could subscribe.
00:28:00.420 But if you're a creator making stuff for YouTube,
00:28:03.720 you would make a small fraction of what the same creator would make on Locals.
00:28:08.760 So, I put a lot of my content that you don't see here,
00:28:12.840 most of the micro-lessons on success and persuasion and stuff, fitness and whatnot.
00:28:18.720 I put those on Locals, because if I only put them on YouTube,
00:28:24.200 it would be, you know, just a fraction of the same financial impact.
00:28:29.880 And, now YouTube has a response, because they have a subscription service.
00:28:35.680 If you are a consumer of YouTube, and you should be, it's like the best thing out there, right?
00:28:43.000 As much as I complain about YouTube for, you know, this or that,
00:28:47.400 as a platform and a service, oh my God, it's good.
00:28:52.420 It's just the best thing, YouTube is.
00:28:55.360 But, it's not so good if you have to watch the ads.
00:28:58.440 So, for my money, it's one of the few things that's just so totally worth the subscription.
00:29:03.060 However, that does not help the creators as much.
00:29:07.360 You know, that's just good for the people watching it.
00:29:10.040 So, I highly recommend YouTube if you are a consumer.
00:29:13.800 For creators, they might be able to do a little better in some other platform.
00:29:19.160 Locals being the one I would recommend, because that's where I am.
00:29:22.400 By the way, full disclosure, I have a small equity stake in Locals, just so you know.
00:29:29.420 All right, apparently we're having a mass shooting just about every day now.
00:29:35.860 And even way more than last year, and last year was crazy.
00:29:39.380 Now, we don't know how much of this is because of the pandemic, right?
00:29:42.740 We don't know how much of this is because of, I don't know, police maybe pulling back.
00:29:48.560 It's hard to know exactly what's behind this all.
00:29:51.100 But I'll tell you one thing that's definitely behind it, and experts talk about this all the time, the copycat people.
00:30:00.720 Imagine a world where all of the news wasn't covering these mass shootings all the time.
00:30:08.360 What would people even think of to do?
00:30:10.560 If you didn't know there was such a thing as a mass shooting that was happening once a day, and you had some kind of feelings of violence, what would you do?
00:30:22.100 Well, you wouldn't even think of it.
00:30:24.740 It's pressed into our heads to the point where it's literally the first thing you think of if you have a bad day.
00:30:30.740 Yeah, now, we have to know it's happening, so you can't not cover it.
00:30:38.700 But I would certainly make some kind of law that says the only coverage is the statistics, and none of the coverage is human.
00:30:48.720 So that you just couldn't show any footage of the event.
00:30:52.680 All you could show is the box score.
00:30:55.300 You know, three people dead, suspect dead.
00:30:58.420 Boom.
00:30:59.300 Tuesday.
00:30:59.680 And at least you know what's going on, you know, in maybe West City, et cetera, et cetera.
00:31:05.580 So you wouldn't be in the dark, but you wouldn't be encouraging it the same way.
00:31:10.260 And it could also be that, you know, video games may have an effect on people's psychology as well.
00:31:19.820 But, man, we're in a situation where the news basically causes these mass shootings.
00:31:25.600 Let me say that again.
00:31:26.440 Now, one of the things that confuses people is the difference between legal responsibility and cause and effect.
00:31:39.960 Cause and effect is just physics.
00:31:43.700 This causes this, causes this, causes this, causes this.
00:31:46.980 Legal responsibility can be its whole separate thing, right?
00:31:50.280 So in terms of who's responsible for shooting people, it's the person with the gun from a legal standpoint.
00:31:57.860 If all you're looking at is the law, we all agree it's the person with the gun who's the one responsible.
00:32:03.840 But if you're looking at a chain of cause and effect, outside of the context of the law, it's the news.
00:32:11.160 The news makes the mass shootings, and then the news covers them, and the news makes more money, and then they generate another one.
00:32:19.520 And so the news business, basically, is the cause.
00:32:26.580 All right.
00:32:27.820 Let's see what else we've got going on.
00:32:32.440 Rob Henderson, who writes about human nature.
00:32:35.600 He's a good follow on Twitter.
00:32:38.040 Rob Henderson.
00:32:39.480 So look for him.
00:32:40.260 And he was talking about a new survey that came out about who's the happiest, you know, conservatives or liberals, men and women.
00:32:50.640 And the top line result is conservative women are particularly blissful.
00:32:56.060 Forty percent say they are very happy.
00:32:58.540 That makes them slightly happier than conservative men.
00:33:01.640 So both conservative women and conservative men, fairly happy compared to the average, and significantly happier than liberal women.
00:33:11.940 So that's, you know, pretty far down the list.
00:33:15.280 But the unhappiest are liberal men.
00:33:18.160 Only a fifth of liberal men consider themselves very happy.
00:33:24.480 Are you surprised?
00:33:26.440 Now, I was trying to think before I got on here today, I was thinking, all right, I should probably talk about at least what the possible reasons are.
00:33:37.640 You know, I can at least mention a few reasons.
00:33:40.640 But you know what stymied me?
00:33:42.920 There were too many reasons.
00:33:45.200 There are so many reasons why this must be true that I got tired listing them.
00:33:51.080 I just saw somebody here say, low T.
00:33:54.260 You're not wrong.
00:33:55.360 I think there is actually literally, no joke, probably a scientifically valid difference between the testosterone levels of conservative versus liberal men.
00:34:08.980 And I don't think that's, I think that would actually hold up, like literally.
00:34:14.480 And we know that your testosterone level influences your happiness.
00:34:19.140 It's pretty direct.
00:34:21.140 So that might be part of it.
00:34:23.240 But let me speculate on some other possible causes.
00:34:27.380 This is just speculation.
00:34:29.560 So we're just being, you know, we're just being idiots, you know, pretending we're scientists.
00:34:33.560 Okay.
00:34:34.100 So don't take any of this too seriously.
00:34:36.540 One possibility is that conservatives are more likely to follow traditional gender roles.
00:34:44.000 Yeah, I said it.
00:34:45.600 I said it right out loud.
00:34:47.420 How about that?
00:34:48.320 Now, you might know that I consider myself left of Bernie on especially the social issues.
00:34:56.940 And I certainly don't feel like anybody should ever be pressured to do any traditional gender roles.
00:35:04.580 Like, I don't want anybody to feel any pressure for traditional gender roles.
00:35:10.420 If that's not your thing, you know, you don't fit into that, I'm all for you.
00:35:15.820 Just, you know, follow what makes you happy.
00:35:18.500 So no suggestion that you should do this.
00:35:20.980 I'm just saying there's a really good chance that this is one of the keys to happiness.
00:35:27.280 And my take on it is that whatever gets you closest to your biological evolutionary truth is what makes you happiest.
00:35:40.080 What I mean by that is if you're on the path toward finding a mate, creating new people and taking care of them,
00:35:49.580 that will always feel like some kind of a satisfying, happy thing.
00:35:52.980 Now, there's lots of trials and tribulations, but at least you'll feel like you have meaning, etc.
00:35:59.080 But again, not everybody's doing that.
00:36:01.580 There just might be more of that happening on the conservative side.
00:36:05.520 And it might just connect them a little bit more closely with their biological reality.
00:36:12.060 Perfectly optional, but it might work.
00:36:15.440 Of course, religion.
00:36:16.940 You know, there seems to be a difference in religiosity.
00:36:20.060 And while I am not religious myself, there's no doubt that it makes people happy, right?
00:36:26.180 It's a, I'm pro-religion, very pro-religion, because I observe it makes people happy.
00:36:32.000 And why would I deny anybody happiness?
00:36:35.880 Here's another possibility.
00:36:38.560 Tinder.
00:36:40.260 Tinder.
00:36:41.360 You know that Tinder is ruining the world for young people, right?
00:36:44.700 Oh, not all young people, apparently the top, you know, one or two percent of good-looking men
00:36:51.940 are having a great time on Tinder.
00:36:54.720 It's all the other ones who are not getting a date.
00:36:58.140 So, if something like Tinder, and let's say online ability to find people,
00:37:03.540 is allowing pretty much all women to simply pick the most exceptional men and ignore everybody else,
00:37:10.180 should we not have a massive problem with male loneliness?
00:37:15.040 And would it not be, let's say, more pronounced in the, shall we say,
00:37:22.000 less muscular part of the world?
00:37:26.600 Would that be fair?
00:37:28.860 Do you think that people could tell the difference
00:37:31.220 between a beefy Trump supporter
00:37:34.800 and a maybe-didn't-quite-go-to-the-gym Hillary supporter?
00:37:40.660 Do you think if you're flipping on Tinder,
00:37:43.080 you can pick out the ones with the most testosterone?
00:37:48.080 I feel like there might be some kind of a correlation there.
00:37:51.940 Because I do think that,
00:37:54.040 I do think if you did a scientific experiment
00:37:56.820 where you just said,
00:37:57.680 here's a bunch of male faces,
00:37:59.500 just tell us which ones have the most testosterone.
00:38:02.380 I think people could do it.
00:38:04.800 You know, better than a chance would suggest.
00:38:08.340 So, if it's true,
00:38:10.580 and again, you could fact-check this,
00:38:13.160 I don't know if it's true,
00:38:14.520 but observationally, it feels like it could be
00:38:17.120 that conservative men are more likely to have a little,
00:38:21.260 especially young,
00:38:22.740 have a little extra testosterone.
00:38:25.040 And maybe when you're flipping through Tinder,
00:38:27.960 you can see it.
00:38:29.480 And maybe they get more dates.
00:38:31.540 I don't know.
00:38:32.300 Maybe.
00:38:32.520 Maybe.
00:38:33.180 Again, fact-check me on that if there's any way.
00:38:36.500 All right, so,
00:38:37.680 here's my favorite one.
00:38:38.980 I made this one up just for fun.
00:38:41.500 Imagine you're planning a party, okay?
00:38:44.280 You're the party planner,
00:38:45.900 and there are two, let's say,
00:38:48.480 groups of people you hang out with,
00:38:50.240 but you don't want to mix them together.
00:38:51.940 So you're going to have a party,
00:38:53.160 and you're either going to invite one of these groups
00:38:55.580 or the other,
00:38:57.060 but you're not going to have them at the same party
00:38:58.900 because it won't work.
00:39:01.040 And let me describe these two groups.
00:39:03.420 You tell me which one you'd invite to your party.
00:39:06.620 One group believes that America is a toxic wasteland of bigotry
00:39:11.300 that's based on race, gender, and sexual orientation.
00:39:14.480 So if you invite that group,
00:39:17.560 that's what you get.
00:39:19.060 The other group,
00:39:20.320 which will remain nameless,
00:39:22.700 thinks that America is a great place to pursue happiness
00:39:26.040 because everyone is equal under the law.
00:39:28.000 Now, your party has to have one of those two groups.
00:39:33.500 Which one do you think would make a better party?
00:39:39.300 So I feel like there are really good
00:39:42.760 and somewhat obvious reasons
00:39:44.740 why liberal men are the unhappiest they've ever been.
00:39:48.980 And I don't think it's funny,
00:39:50.360 even though I'm laughing at it.
00:39:52.360 There's nothing about the situation that's funny.
00:39:54.600 But when you get to the individual person
00:39:57.100 who's dying of loneliness and doesn't know why,
00:40:00.300 that's not funny.
00:40:01.780 I mean, that's legitimately like a big, big problem.
00:40:06.180 So I don't want to minimize the problem,
00:40:07.940 but we do kind of know what the problem is.
00:40:11.000 I think we know what the source is.
00:40:13.580 All right, you want to go to the whiteboard?
00:40:14.960 Are you ready?
00:40:15.780 Okay.
00:40:17.160 I believe that one of the problems that liberals have,
00:40:20.020 that conservatives have less of,
00:40:22.500 and again, everybody's different,
00:40:24.280 but sort of a generality,
00:40:25.600 is that sometimes,
00:40:31.280 and maybe I shouldn't even make this liberal versus conservative.
00:40:34.380 Just forget that part.
00:40:35.580 This is just different ways of looking at the world.
00:40:39.300 There are two ways to look at success,
00:40:41.760 and this is just a slice of success, right?
00:40:44.040 Not all of it.
00:40:45.140 And I'm going to reframe something for you.
00:40:46.960 Wouldn't you believe, in your common sense
00:40:50.560 and the way your brain is organized,
00:40:52.500 doesn't it make sense that in order to succeed,
00:40:56.900 you should understand the facts as well as you could?
00:41:01.860 How could that not be true?
00:41:03.780 The better command you have of reality and the facts,
00:41:07.600 the more likely you'll be successful.
00:41:10.120 How many of you agree with that statement?
00:41:11.680 The better you understand the facts,
00:41:14.520 the more successful you'll be.
00:41:16.500 We'll see in the comments if you agree with that.
00:41:19.880 Because there's a little bit of a problem with that,
00:41:22.680 as logical as that sounds.
00:41:26.540 You can't tell what the facts are.
00:41:30.120 Didn't you used to think you could?
00:41:31.740 You used to think you could tell the facts.
00:41:36.440 Now, in a world that doesn't exist,
00:41:39.100 but like a hypothetical world,
00:41:40.860 where you could actually tell what was true,
00:41:43.920 that would be a pretty good strategy for success.
00:41:47.340 Oh, the more you know about what's true,
00:41:50.080 the more you can craft the right strategy.
00:41:52.280 That makes perfect sense.
00:41:53.980 But you don't live in that world.
00:41:56.340 You have a perfect strategy for a world that just doesn't exist.
00:42:00.180 How many more times do you have to see in the last 12 months
00:42:03.980 that the facts change right in front of you?
00:42:07.300 What you thought was true yesterday just isn't true today.
00:42:10.800 So if you use what's true as your main filter,
00:42:15.120 you're just going to run into a wall over and over again.
00:42:18.320 Sometimes it'll work great, but there'll be a lot of walls.
00:42:21.660 Now, let's say you were irrational,
00:42:24.020 and you said to yourself,
00:42:25.620 I don't know why this thing works, but it seems to work,
00:42:28.620 so I'll just keep doing it.
00:42:30.180 Well, what are the facts you're using to determine you should do that?
00:42:33.180 I don't really have any.
00:42:35.100 Because your life is mostly this stuff,
00:42:37.520 where you test stuff and you observe what happened.
00:42:40.340 If you walked out of one door of your house
00:42:42.700 and every day a baseball hit you in the head,
00:42:45.700 but you couldn't figure out where it was coming from
00:42:47.700 or why it always hit you in the head
00:42:49.100 when you walked out one door but not the other,
00:42:52.080 wouldn't you still stop walking out that door?
00:42:54.300 You wouldn't have a randomized gold standard trial
00:42:58.560 that is scientifically valid,
00:43:00.640 but you still have to make decisions.
00:43:02.960 So if one door always hurts you and the other door doesn't,
00:43:05.760 you're going to do what works
00:43:08.180 without knowing anything about the facts.
00:43:12.120 So if I can sell you just on the concept first,
00:43:16.060 now let me fill in some examples,
00:43:18.140 because until you see some examples,
00:43:19.840 this is just nonsense, right?
00:43:21.220 All right, here are some examples.
00:43:23.440 Here's something that's true, as far as we know,
00:43:27.880 that bigotry exists in the United States especially.
00:43:31.160 Well, that's just what we're talking about.
00:43:33.200 Bigotry exists and it influences the outcomes of things
00:43:36.800 and that it's unfair.
00:43:40.520 That feels like those are facts in the United States.
00:43:43.520 Bigotry exists, it influences outcomes, and it's unfair.
00:43:47.060 So that's what's true.
00:43:48.180 So suppose you built a strategy around what's true.
00:43:51.740 How would that work out for you?
00:43:53.420 Well, you would be miserable
00:43:54.660 and you would probably be directing your attentions
00:43:58.440 in the wrong places,
00:43:59.420 but you'd be working on facts.
00:44:01.600 Suppose you worked on what works instead.
00:44:04.880 Let me give you an example of that.
00:44:07.400 Acting as if you control your own destiny by your actions.
00:44:11.040 Is that true?
00:44:12.540 Do you control your own destiny by your actions?
00:44:15.460 It feels like that's only a little bit true, right?
00:44:19.140 Well, yeah, somewhat, but what about all these other things?
00:44:22.300 Those are pretty big, right?
00:44:24.620 I'm not saying it's true
00:44:25.920 that you can control your future by your actions.
00:44:29.680 What I'm saying is that everybody who believes it's true
00:44:32.420 gets a better outcome.
00:44:36.840 You're going to start to see it in a minute, right?
00:44:39.260 Following what is true
00:44:40.600 throws you completely off the path of success.
00:44:45.500 Following what you're simply acting like is true,
00:44:49.660 which is that you control your own outcomes,
00:44:52.440 will make you act in that way
00:44:54.280 and you're going to get pretty good results.
00:44:56.560 Let me give you another one.
00:44:59.620 This might be true,
00:45:00.820 that you're not the best at a given task.
00:45:03.980 So let's say there's something you're going to be doing
00:45:05.620 and it's just true.
00:45:07.480 You're not one of the good ones at this thing.
00:45:10.080 So if you base your strategy
00:45:12.120 on what's true,
00:45:13.940 you're going to be going into it with low confidence.
00:45:16.760 And what does that predict?
00:45:18.660 Well, low confidence predicts low performance.
00:45:22.020 But suppose instead of dealing with what's true,
00:45:25.220 that you're bad at whatever this thing is,
00:45:27.960 you say to yourself,
00:45:29.400 I'm going to exceed expectations.
00:45:32.620 And you just tell yourself that.
00:45:33.700 Yeah, I'm totally going to exceed expectations.
00:45:35.720 I'm going to nail it this time.
00:45:37.600 Is it true?
00:45:39.160 Well, there's no truth or fact to it whatsoever.
00:45:41.900 It's just something in your head.
00:45:43.640 But it would make you perform better.
00:45:46.440 A little bit of confidence might calm you down
00:45:48.520 and you might perform better than you expected.
00:45:51.400 So again, if you start your strategy
00:45:54.020 based on the facts and what's true,
00:45:56.180 you just run off the trail.
00:45:58.460 If you just imagine a world where it's true,
00:46:01.560 it's very useful.
00:46:02.440 Here's another one.
00:46:06.300 Here's a fact.
00:46:07.320 Science can't prove God exists.
00:46:09.220 All right, hold on, hold on.
00:46:10.900 There are lots of different religions.
00:46:13.120 So for this example,
00:46:14.920 let's say that science can't prove
00:46:17.060 that somebody else's religion exists.
00:46:20.420 Okay?
00:46:21.920 It can't prove somebody else's religion exists.
00:46:24.340 Yours is fine.
00:46:24.920 So the facts are that science can't prove
00:46:28.560 somebody else's religion exists.
00:46:32.140 But it's also true that religion makes people happy.
00:46:36.240 Definitely.
00:46:37.560 Religion makes people happy.
00:46:39.520 Makes them more successful.
00:46:41.080 I think they live longer, right?
00:46:42.520 And healthier.
00:46:43.020 I believe that the science is completely consistent about that.
00:46:47.560 So should you go with what's true,
00:46:49.620 which is,
00:46:50.120 I can't do a science experiment
00:46:53.200 and find me some God,
00:46:55.140 so I don't know if I can deal with it that way.
00:46:58.080 But suppose you just go with what works.
00:47:01.400 Seven generations of your family
00:47:03.260 worshiped God.
00:47:04.800 They all did great.
00:47:06.300 So you do it.
00:47:07.560 And it works out great.
00:47:08.460 Let me give you another one.
00:47:14.060 Let's say unions are good for workers
00:47:16.620 and we should support them.
00:47:19.960 Doesn't that sound completely reasonable?
00:47:22.600 Now, I don't want to get into too much
00:47:24.580 the argument of unions are good or bad.
00:47:28.640 But I would say most people,
00:47:30.580 most Americans would say unions have a role
00:47:33.240 and they've made the world better,
00:47:34.660 especially for workers.
00:47:36.020 So isn't it a fact
00:47:37.180 that unions are good for workers
00:47:39.040 and we should support them?
00:47:41.420 And if you took that as a fact,
00:47:43.440 you'd have a certain set of policies.
00:47:45.880 But here's another view.
00:47:48.440 Teachers' unions don't work
00:47:51.380 because their opponents are children.
00:47:55.300 The only union
00:47:56.620 that has as its opponents
00:47:59.340 children.
00:48:01.960 You can't take the teachers' unions
00:48:04.380 and throw them in with the argument
00:48:05.860 in which it's adults against adults.
00:48:09.620 The teachers' union is the only one
00:48:11.440 where it's adults in the union
00:48:14.040 whose opponents are literally children.
00:48:18.280 That's not where you want a union.
00:48:20.720 I want a union
00:48:21.980 where I've got a powerful force,
00:48:24.600 let's say the owners of the company
00:48:26.320 on one side,
00:48:27.080 and then the workers,
00:48:29.420 you know,
00:48:30.520 they get together
00:48:31.320 to equal the force
00:48:32.820 and then you've got
00:48:33.740 a nice competitive,
00:48:35.340 you know,
00:48:35.720 productive,
00:48:36.520 competitive situation.
00:48:37.800 Everybody gets what they want.
00:48:39.840 That's not the teachers' unions.
00:48:42.160 The teachers' unions,
00:48:43.660 they're treating the children
00:48:45.000 as the enemy.
00:48:46.140 That's not fair.
00:48:47.280 Now I know technically,
00:48:48.620 you know,
00:48:49.340 they're negotiating
00:48:49.940 with their bosses for raises,
00:48:51.700 but that's not how it works out.
00:48:53.740 Right?
00:48:54.440 So here's my point.
00:48:56.160 Every time you get stuck
00:48:57.640 in what are the facts,
00:48:59.160 you're probably leaving out
00:49:00.940 some options
00:49:01.720 that might be better.
00:49:03.960 No two situations are alike.
00:49:06.200 Clearly there are plenty of cases
00:49:07.800 where you should just follow the facts.
00:49:10.000 Maybe you can recognize them
00:49:11.260 when you see them.
00:49:12.360 But my point is
00:49:13.300 that following the facts
00:49:14.900 slavishly
00:49:15.840 is a loser strategy
00:49:17.880 because you don't know the facts.
00:49:20.520 And believing you do
00:49:21.660 is what leads you over the cliff.
00:49:23.980 All right.
00:49:27.520 I got one more story here.
00:49:29.720 Maybe two.
00:49:32.020 In the Washington Examiner,
00:49:35.160 writer Eddie Scarry,
00:49:37.480 or S-C-A-R-R-Y,
00:49:40.240 Scarry,
00:49:41.640 is writing about
00:49:42.700 how the media
00:49:43.300 is treating
00:49:43.980 the Daunte Wright
00:49:45.840 tragic shooting
00:49:48.360 and death.
00:49:50.060 You all know that story, right?
00:49:51.500 That's the one
00:49:51.920 where the police officer
00:49:53.680 mistook a taser
00:49:55.800 for a handgun
00:49:56.900 and killed this guy
00:49:59.600 in the attempt,
00:50:01.380 hoping to taser him,
00:50:02.320 but killed him.
00:50:04.860 And so,
00:50:05.600 Columbia University
00:50:07.500 professor
00:50:10.840 Sarah Seo,
00:50:10.840 she weighed in on it,
00:50:12.300 I guess,
00:50:12.660 in the New York Times,
00:50:13.640 and she said this,
00:50:14.360 that traffic stops
00:50:15.440 should not be
00:50:16.680 harrowing or dangerous
00:50:17.700 experiences,
00:50:18.660 but too often they are
00:50:19.760 for people of color.
00:50:21.520 And then she proposed
00:50:22.600 that if somehow,
00:50:24.080 you know,
00:50:24.400 we just,
00:50:25.420 a camera had picked up
00:50:26.660 his license plate
00:50:27.500 and just sent him a ticket,
00:50:28.980 then he never would have
00:50:29.940 been stopped,
00:50:31.180 and there never would have
00:50:32.060 been any opportunity
00:50:32.920 for anybody to get hurt.
00:50:34.860 Is that a good idea?
00:50:35.920 Well,
00:50:37.840 as Eddie
00:50:38.660 Scarry points out,
00:50:40.800 that's not exactly
00:50:41.900 covering the whole
00:50:44.020 context here,
00:50:44.820 right?
00:50:45.280 The context is
00:50:46.540 that he,
00:50:48.080 the only reason
00:50:49.480 there was a problem
00:50:50.360 is that he had a warrant,
00:50:52.620 an arrest warrant.
00:50:53.820 I think a weapon
00:50:54.740 was involved
00:50:55.340 in the arrest warrant.
00:50:56.160 I can't remember
00:50:56.720 what the warrant
00:50:57.800 was for.
00:50:58.900 And he resisted arrest.
00:51:01.100 And then there was
00:51:01.760 a tragic mistake
00:51:03.020 of the confusion
00:51:05.080 of the taser.
00:51:07.580 So,
00:51:09.240 is it,
00:51:10.240 is it fair to say
00:51:11.660 that
00:51:13.600 if,
00:51:14.660 if we could have
00:51:15.460 changed things
00:51:16.280 police-wise,
00:51:17.220 we would have gotten
00:51:18.080 a better result?
00:51:19.760 I don't know.
00:51:20.480 So as
00:51:21.100 Eddie Scarry writes,
00:51:22.920 you know,
00:51:25.080 the idea that
00:51:26.160 Wright ended up dead
00:51:27.920 because of expired
00:51:28.920 car tags,
00:51:29.720 which was the original
00:51:30.780 reason for pulling
00:51:31.720 him over,
00:51:32.180 I guess,
00:51:32.380 even though, actually,
00:51:34.800 it wasn't expired,
00:51:35.720 it just wasn't posted
00:51:36.700 in the right place.
00:51:38.200 There's something about
00:51:39.020 an air freshener
00:51:39.800 that nobody believes
00:51:40.560 has anything to do
00:51:41.400 with that.
00:51:42.580 But why do you think
00:51:43.640 he got shot?
00:51:44.440 Do you think he got shot
00:51:45.420 because he got stopped
00:51:48.420 or did he get shot
00:51:51.060 because he resisted arrest?
00:51:53.680 Which of those
00:51:54.520 is the truth?
00:51:56.780 Well,
00:51:57.360 you could argue
00:51:57.880 about this all day long
00:51:58.960 and obviously
00:51:59.520 you needed everything
00:52:00.700 to happen
00:52:01.240 the way it did
00:52:02.320 for it to go
00:52:02.820 the way it did.
00:52:03.380 But let me suggest this.
00:52:05.120 Every time somebody
00:52:06.160 says to me,
00:52:07.160 there's no way
00:52:08.160 to reduce
00:52:08.900 police funding
00:52:10.520 and still get
00:52:11.940 a good result,
00:52:13.040 I always feel like
00:52:14.280 that's a limit
00:52:15.760 of imagination.
00:52:17.400 I do not believe
00:52:18.720 that it would be
00:52:20.520 impossible
00:52:21.120 to defund
00:52:22.960 the police,
00:52:23.640 you know,
00:52:23.880 in some percentage,
00:52:25.280 not all the police,
00:52:26.260 of course.
00:52:26.600 But I do think
00:52:27.940 you could get
00:52:28.480 a whole bunch
00:52:29.240 of technical
00:52:30.540 advantages
00:52:31.420 to lower
00:52:32.720 the cost
00:52:33.480 of it
00:52:33.840 and get rid
00:52:35.140 of the danger.
00:52:35.840 Let me give you a specific example.
00:52:38.240 Suppose you said to Detroit, and I'll just pick a city where there's gun violence and police stopping people.
00:52:46.620 You say, Detroit, we're going to let the big car manufacturers who want to build self-driving cars bid for your city, and then they're just going to sort of own the roads, and they'll be able to put their own self-driving cars there and maybe phase them in over time, whatever it is.
00:53:02.720 And Detroit will become the test bed for, I'll pick a company, let's say it's Tesla, but it could be any big car company.
00:53:12.000 And Tesla will own Detroit, and eventually they'll have the franchise, so that if you want to call a car, like an Uber, it's going to be a Tesla, and that's all there is, and they'll just own the city.
00:53:23.300 And the advantage is that the cars will all be networked, so that, you know, everything knows where everything else is, and nothing bumps into anything.
00:53:33.120 So under that situation, how would you ever have a traffic stop?
00:53:38.360 The self-driving cars will never be pulled over for a traffic violation. So the entire reason for pulling somebody over is gone.
00:53:46.700 Now what are the odds that someday we'll have self-driving cars? A hundred percent. There's no chance that's not going to happen.
00:53:53.960 What are the odds that someday we'll only have self-driving cars? Well, that might take longer, but it's a hundred percent.
00:54:02.260 I don't think there's any chance that we're going to put perfectly safe self-driving cars on the road alongside, you know, crazy, organic, defective humans driving their own cars. That's crazy.
00:54:16.600 You know, maybe human-driven cars will survive in recreational areas, but you're not going to use them for basic transportation. There's no chance of that in the future.
00:54:25.020 So that might be 20 years away, but suppose we just sped it up and said, let every major car company adopt a city.
00:54:32.480 Maybe you start with one block or whatever, make that just the self-driving car area, and then you expand it until you own the city.
00:54:40.560 And by the way, the car manufacturer could be Apple Computer, could be Google, right? Because a lot of companies are going to be making self-driving cars.
00:54:50.420 And you just say, you own the city, but you also have to do all the transportation, and you've got to make it affordable, and you've just eliminated traffic stops.
00:54:59.580 Now, how do you arrest somebody who's got an arrest warrant if you don't stop them in a car?
00:55:04.800 Well, I somewhat tongue-in-cheek said the police could have an override button and just make your car drive to the police station, so they could, you know, safely take you out of the car without trouble.
00:55:16.180 I don't think you literally want the car to drive to the police station, but it could. Yeah, it could.
00:55:24.660 Now, I heard some people say that we'll never have self-driving cars everywhere because they could get hacked. And then, you know, what would happen if they got hacked?
00:55:36.740 And I would just say that every new technology has the same kind of problem. Every new technology.
00:55:44.180 I used to work in a bank when the very first ATMs were being introduced, and the argument against them was, nobody's going to trust these machines with their money.
00:55:55.160 Of course, ATMs are everywhere.
00:55:57.080 I'm sure when airplanes were invented, well, I know, the thinking was, you're never going to have an industry of travel by airplane, because there's no way you could make an airplane safe, right?
00:56:12.920 Or Elon Musk. You're never going to have a rocket that shoots up, you know, unloads its payload, and then comes back down and somehow stays upright until it lands on a platform in the ocean.
00:56:33.160 That's not going to work. But it does.
00:56:37.200 So, yeah, self-driving cars are guaranteed. It's just a question of how long it will take.
00:56:43.960 And flying cars are guaranteed too, but not until batteries are cost-effective, and they're almost there.
00:56:50.680 Flying cars are pretty much here. Just a little bit more battery efficiency, which is guaranteed, and you'll have your flying car.
00:56:59.400 The laws will be the hard part.
00:57:01.000 Nothing is 100%, you say. Well, okay, that's true. I'll back that up to 98%.
00:57:07.620 You know, self-driving cars are going to change everything.
00:57:13.800 Imagine if you could just go on vacation anywhere you wanted in the United States, or, you know, wherever you are, and you could just tell the car to go somewhere.
00:57:24.760 Like, I have this rule that if I'm driving a car, I'm not on vacation.
00:57:29.800 And unfortunately, if you happen to be the oldest adult male in whatever group you're traveling with, you're usually going to be the one driving the car.
00:57:41.620 And anytime I'm driving a car, I am not on vacation. I'm just working. That's just a job.
00:57:48.360 So that's where I draw the line. If I have to drive a car, that's work, that's not a vacation.
00:57:54.680 All right, that's all for me, and I will talk to you tomorrow.
00:57:57.640 Tomorrow.