Real Coffee with Scott Adams - January 17, 2023


Episode 1991 Scott Adams: Lots Of Political Intrigue And Fake News Today, And That Spells Fun


Episode Stats

Length

58 minutes

Words per Minute

149.0

Word Count

8,732

Sentence Count

658

Misogynist Sentences

12

Hate Speech Sentences

15


Summary

In this episode of the highlight of civilization, Scott Adams tells the story of a man who shot a robber nine times, including when he was down, and dumped his beverage on him before leaving.


Transcript

00:00:00.400 Good morning, everybody, and welcome to the highlight of civilization.
00:00:05.180 It's called Coffee with Scott Adams, the finest thing that's ever happened.
00:00:08.640 And today we have intrigue and fake news and all kinds of good stories.
00:00:15.320 It's going to be a good one.
00:00:16.660 And if you'd like to enjoy this, with the maximum amount of oxytocin and, oh, all those good chemicals,
00:00:24.520 all you need is a cup or mug or a glass, a tank or chalice or stein, a canteen jug or flask, a vessel of any kind,
00:00:30.740 fill it with your favorite liquid. I like coffee. And join me now for the unparalleled pleasure,
00:00:36.880 the dopamine hit of the day, the thing that makes everything better.
00:00:40.980 It's called the simultaneous sip, and it happens now.
00:00:44.280 Go.
00:00:48.700 Ah, yeah, that was good.
00:00:50.840 That was good.
00:00:52.060 Dopamine.
00:00:52.580 I feel the rush.
00:00:54.520 Good stuff.
00:00:57.580 Well, there was a feel-good story in Houston.
00:01:01.480 Do you mind if I start with the feel-good stories?
00:01:05.000 Would you like a nice, warm, fuzzy feeling?
00:01:08.320 You know, get the week going after the holiday.
00:01:11.180 Warm, fuzzy.
00:01:12.700 So there was a local taqueria in Houston where a 30-year-old gentleman robbed all the patrons,
00:01:21.480 so he took out a gun and pointed it at all the patrons and made them hand over their wallets and stuff.
00:01:28.400 Did I mention this was Houston?
00:01:31.760 Yeah.
00:01:32.200 There was somebody who decided that brandishing a gun in a public area in Houston would turn out okay.
00:01:40.740 Um, so there was a patron who was armed.
00:01:45.500 Did I mention it was Houston?
00:01:49.860 There was a patron who was armed who shot the intruder, which apparently is legal.
00:01:57.660 It's totally legal because the guy was brandishing a gun.
00:02:00.240 I like saying brandishing.
00:02:02.520 He was brandishing all over.
00:02:04.520 Oh, man, he was brandishing.
00:02:06.060 He was brandishing like crazy.
00:02:08.300 So the guy shot him.
00:02:09.200 But the other part of the story is that he shot him like nine times, including when he was down.
00:02:17.260 And then he dumped his beverage on him before he left.
00:02:22.380 And then he just left.
00:02:27.720 Now, there's some suggestion that he actually was guilty of a crime.
00:02:34.480 Because the thinking is that once he had neutralized the attacker, there was no reason to do any additional shooting.
00:02:40.900 I would like to present his defense for him, if I may.
00:02:47.660 Here now, the defense for the person who shot nine times and dumped his beverage on him.
00:02:54.800 Number one, I'm not a doctor.
00:02:58.620 I don't know if somebody can get up from a particular kind of injury.
00:03:03.220 Do you?
00:03:04.760 So put me on the witness stand.
00:03:07.540 All right.
00:03:07.900 After you shot him once and he went down and the danger was neutralized, why did you keep shooting?
00:03:14.620 To which I say, well, I'm not a doctor.
00:03:18.640 I don't know what kind of injury somebody can take and not be able to fight back.
00:03:24.260 You say, yes, but, you know, they took the gun away from him.
00:03:26.880 I don't know if that's true.
00:03:27.920 But let's say the gun, you know, maybe fell or something.
00:03:31.460 Then I'd say, well, I don't know if he has another one.
00:03:34.600 Do you know if he has another gun?
00:03:36.540 I didn't know.
00:03:37.300 How would I know that?
00:03:38.560 Why would I take that chance?
00:03:40.460 He's obviously a guy that points guns at people.
00:03:43.500 And I just shot him.
00:03:44.740 If he had another gun, I'm dead.
00:03:47.160 So why would I take that chance?
00:03:49.760 So you're saying that my risk was over when he went down.
00:03:53.860 And I say, that's for me to decide.
00:03:57.240 That's for me to decide.
00:03:58.500 You don't get to decide my risk.
00:04:02.720 If there's a guy who shoots people and uses guns and robs people just because I saw one of his guns was no longer in action, I don't know what else he could do.
00:04:12.260 And I don't know how many times have you heard a story about the guy who got shot six times and still attack somebody.
00:04:18.700 Lots of times because, you know, they're on, I don't know, some kind of weird drug or something.
00:04:24.960 But that's the thing.
00:04:26.600 I would say if you tell me I'm supposed to be an expert on military injuries, how is that reasonable?
00:04:34.980 Now, I have one job, which is to keep myself safe and the other patrons, and I made sure I got it done.
00:04:43.340 What's wrong with making sure you're safe?
00:04:46.220 What is the percentage of safety that you believe is appropriate to me?
00:04:51.680 How much safety should, is my safety based on your assessment of my safety?
00:04:56.180 Or is my assessment of my safety the one that matters in this situation?
00:05:00.920 Pretty sure it's mine.
00:05:02.560 Pretty sure it's mine.
00:05:03.300 So here would be the argument.
00:05:05.920 Nobody can tell you what your risk is when you're in the moment.
00:05:10.080 Here's the other thing I would say, if it's true.
00:05:12.800 It might be true.
00:05:13.880 I didn't even know I shot nine times.
00:05:17.500 I literally didn't even know it.
00:05:20.000 Because in the moment, I'm not some kind of trained policeman.
00:05:24.780 I'm not in the military.
00:05:26.680 I saw a threat, I pulled my gun, and honestly, I don't even know what happened in the next ten seconds.
00:05:31.280 Because it was just fog of war.
00:05:33.940 My brain was on fire.
00:05:36.580 I would just say, my brain was on fire.
00:05:38.340 I barely remember what happened.
00:05:40.760 Did I fire nine times?
00:05:42.000 The only reason I know is because when I pulled the trigger the tenth time, there was no bullets or something like that.
00:05:47.260 So, I don't see how he could possibly be convicted.
00:05:52.720 Do you?
00:05:53.160 If he had a good attorney.
00:05:54.980 If he has a bad attorney, I suppose anything's possible.
00:05:57.700 Or a bad jury.
00:05:58.900 The last shot was in the head.
00:06:03.200 Yep.
00:06:03.720 Doesn't change a thing.
00:06:05.580 I would say, if you can't be sure, you can't be sure.
00:06:08.260 And it wasn't his problem to solve.
00:06:10.520 Well, a reparations panel in San Francisco was tasked with coming up with a suggestion of what reparations should be.
00:06:19.120 I think there was an earlier report, they came up with some number like 200,000 per black person or something in California.
00:06:28.280 But this group came up with a different number, 5 million per black person.
00:06:34.160 That's the recommendation, 5 million per black person.
00:06:38.220 Now, the total budget of the city of San Francisco is $14 billion a year.
00:06:44.320 And the recommendation is to spend $5 million in reparations on each San Francisco black person who has been black for at least 10 years.
00:06:53.980 You have to be black for at least 10 years.
00:06:58.260 People who have only been black for, like, say, 3 to 5 years, nothing.
00:07:02.500 So I've only been black for, I don't know, maybe 8 years.
00:07:09.320 So I get nothing.
00:07:10.740 I get nothing.
00:07:11.780 I've only been black for, like, 8 years.
00:07:14.260 So, but this is why, you know, I've been teaching you.
00:07:17.640 This is why you start early.
00:07:19.360 Because they put that 10-year thing.
00:07:21.420 I'm only about 2 years away from qualifying.
00:07:25.320 $5 million is free money, right?
00:07:27.500 So I've been planning way ahead.
00:07:30.080 So 2 years from now, if this comes up, I'm going to say, I got my 10 years in.
00:07:36.300 But I guess you have to prove it with government documents.
00:07:38.540 That would be a little harder.
00:07:41.160 The committee also proposed wiping out all debts associated with educational loans, personal credit cards, and payday loans for black households.
00:07:49.220 And so the 50-member panel was established by the San Francisco supervisors.
00:07:57.020 So let me give you a suggestion for how to make bad ideas go away.
00:08:04.940 You tell them to form a committee and get real specific about what it is they're asking.
00:08:10.100 Because then it's easy to ignore it.
00:08:14.140 Because it'll be so ridiculous.
00:08:16.620 See, this is the same thing that Gavin Newsom did.
00:08:19.200 Apparently there was at least two panels.
00:08:22.140 So I think Gavin Newsom had a panel.
00:08:24.260 And they came back with that quarter million dollars.
00:08:26.680 You know, that was impractical, too.
00:08:28.800 I think that was always the play.
00:08:31.560 And Newsom played it completely right.
00:08:33.840 He took it seriously in the sense that he formed a committee to make a recommendation and then gave them attention.
00:08:41.780 That seems perfectly fair.
00:08:43.860 And then once they got attention and they showed you what their idea was, it was completely impractical.
00:08:49.940 Problem solved.
00:08:52.740 So I don't think, you know, the thing you want to watch out for is that Newsom is a strong player.
00:09:00.940 Like, if you don't like his politics, I get it.
00:09:02.940 Blah, blah, blah.
00:09:03.900 But don't overlook the fact that he has game.
00:09:07.320 All right?
00:09:08.080 He could be coming.
00:09:10.560 All right.
00:09:11.320 Let's see.
00:09:11.880 That was fun.
00:09:16.240 More reports that China has more deaths than births this year.
00:09:20.540 So there was something like, in very rough numbers, 10 million deaths and 9 million-some births.
00:09:27.940 So President Xi actually wants to boost the population of China.
00:09:32.660 How many of you had that on your predictions?
00:09:36.360 How many of you were predicting, you know, I think China's going to really want to increase their population?
00:09:41.820 That would have been hard to predict, wouldn't it?
00:09:47.260 But, you know, I think Peter Zeihan is saying that they probably have been decreasing the population for 14 years or so.
00:09:53.500 They've probably just been lying about their data.
00:09:55.340 And all the experts say that it's a demographic time bomb and there won't be enough young people to support all the old people fairly soon.
00:10:06.000 And it's a problem.
00:10:06.860 So it's going to take a while to convince everybody that population growth isn't the problem, that prosperity actually solves that.
00:10:17.920 So you just do prosperity right, and your biggest problem is not enough people.
00:10:23.780 The United States...
00:10:25.860 Do me a fact check on this.
00:10:27.220 I thought I saw this, but I don't have the source right here.
00:10:29.400 I thought I saw that last year, the United States population would have decreased, if not for immigration, including illegal.
00:10:40.360 Is that true?
00:10:42.420 Was the population of the United States not going to grow, or would it slightly decrease, except for immigration?
00:10:50.960 Give me a fact check on that.
00:10:55.560 I think it's true.
00:10:59.400 But here's what I think I should do before 2024.
00:11:07.340 I'm thinking of creating a series of maybe like, you know, one-minute videos in which I explain a policy position that would work for everybody.
00:11:17.020 Because a lot of these things have policy positions that nobody has staked out, that both a Democrat and a Republican would totally agree with.
00:11:24.800 I think.
00:11:26.460 And here's how I would do it if I were...
00:11:28.440 Let's say I'm Trump, and I want to avoid all the, you know, the border racism questions, right?
00:11:34.320 If I'm Trump, I say, oh, let's do this.
00:11:38.000 Let's form an economic board who decides who to let in under what circumstances that benefits the economy of the United States the most.
00:11:47.380 And make sure that board is super diverse.
00:11:51.560 Who says no to that?
00:11:54.140 Really.
00:11:54.540 If he creates this super diverse board of actual economists, and the economists say, you know, I think we should let in no people this year, then don't.
00:12:06.820 Or if they say we should let in 5 million because otherwise, you know, we'll have a shrinking population.
00:12:11.820 And even though it's a burden on our systems, we're still better off, you know, hypothetically.
00:12:16.780 I'm not saying they would say that, but whatever they say.
00:12:20.080 Because I think that the decision should be offloaded to economists.
00:12:24.800 And they should be a diverse group of economists so you don't worry about the racial stuff.
00:12:29.020 And they should just say, we think the country is better off with this level.
00:12:34.080 And then just manage to that level.
00:12:35.800 Who says no to that?
00:12:39.260 Right?
00:12:40.100 Who says no?
00:12:41.540 Now, somebody would find some reason to complain about something.
00:12:44.860 But it's way better than what the Democrats are doing, which is nothing.
00:12:50.840 And what Trump will suggest, which will sound too harsh.
00:12:55.180 Right?
00:12:56.300 Why do we want to choose between basically nothing, open borders?
00:13:00.560 You know, I know it's not technically open borders, but you know what I mean.
00:13:03.580 Versus somebody who's going to get everybody mad that it sounds racist.
00:13:08.560 Why are those our choices?
00:13:11.580 Anybody could just choose the smarter one.
00:13:14.220 Instead of having Congress decide how many people come in, which is dumb,
00:13:20.600 do what we do when we do budgets.
00:13:22.420 You have the, what is the OMB?
00:13:26.160 Sort of somewhat objectively scores your budgets and says, is this good or bad?
00:13:30.980 Just do that.
00:13:31.800 How many people, how many people argue with the OMB lately?
00:13:36.300 I mean, it happens.
00:13:37.560 But I feel like that's become a credible, least credible feelings system led by, no, I don't,
00:13:48.340 Gates?
00:13:50.100 No.
00:13:50.620 I don't know what his views are on immigration exactly.
00:13:54.280 All right.
00:13:54.960 Well, that's my suggestion.
00:13:56.700 I see more talk about the Biden residences not having guest logs.
00:14:03.260 And I saw Tom Fitton say that if the, if the Secret Service is doing its job, there has to be visitor logs.
00:14:11.140 To which I ask the question, is that true?
00:14:14.180 Is it true that the Secret Service has a log of everybody who comes in?
00:14:21.040 Because didn't we learn that the White House did not keep a guest log during the Trump administration?
00:14:27.680 They had one and then they got rid of it.
00:14:30.520 How could it be true that they got rid of the White House log?
00:14:33.520 Well, it would also be true that the Secret Service would keep a log of everybody who visited, which sounds reasonable, seems reasonable.
00:14:43.340 And then somebody suggested that they need background checks of everybody who visits the president.
00:14:49.700 Do you think so?
00:14:50.320 Do you think they really do background checks of everybody who visits the president?
00:14:56.860 You do.
00:14:59.960 I'm not entirely sure.
00:15:01.840 You might be right.
00:15:02.780 I mean, it's a bureaucracy, so maybe they just do it.
00:15:06.820 Could be it's just the rule and they just do it all the time.
00:15:09.180 But, so when I visited the White House, I gave them, you know, my name and information, probably gave them my Social Security number.
00:15:18.700 Probably.
00:15:19.140 Probably, I think I gave them my driver's license at some point, either before or when I checked in.
00:15:24.900 I can't remember.
00:15:25.680 Maybe both.
00:15:27.020 So, they could certainly do a quick digital check, right?
00:15:32.700 They could check government records.
00:15:35.760 But nobody talked to me.
00:15:38.960 And nobody talked to relatives or nobody talked to my close associates or anything like that.
00:15:45.400 So, how much of a check do they really do?
00:15:47.240 And then here's the question.
00:15:49.840 If they see a name on the list and it's a public figure, do they really check?
00:15:54.920 Really?
00:15:55.780 Like, if when Kim Kardashian was invited to the White House, did they do a background check on her?
00:16:04.080 The most transparent person on the planet Earth?
00:16:09.460 But probably it was perfunctory, don't you think?
00:16:12.800 Same with me.
00:16:14.340 Don't you think it was just perfunctory?
00:16:15.880 I've never said that word out loud.
00:16:19.020 I've waited my whole life to say perfunctory.
00:16:23.420 I think I know what it means.
00:16:25.380 You know, just sort of going through the motions.
00:16:27.840 Yeah, I don't think that anybody even, like, looked at it.
00:16:31.360 I think they just probably typed my name into the system.
00:16:34.480 Something spit out and they said, go ahead.
00:16:36.120 All right, so I guess we have some interesting questions about whether they have a list and how much they actually check.
00:16:43.900 My suspicion is, unless you're a complete unknown, like a citizen who did something famous,
00:16:50.000 I'm sure they check all the normal citizens, if that makes sense.
00:16:54.060 I don't think they check as much if you're a public figure.
00:16:58.860 Because I think you just, you know, I mean, there's not any real surprises with public figures.
00:17:05.040 So McCarthy kept his promise to try to pass a bill to cancel the funding of the 87,000 new IRS employees that were trying to be hired.
00:17:16.940 And it looks like that's popular.
00:17:18.520 64% of Republicans are in favor of canceling it.
00:17:22.060 But even 52% of Democrats and unaffiliated voters.
00:17:28.320 So a solid majority of every type of voter, well, 52% is not that solid, but a majority, don't want it.
00:17:37.700 And McCarthy is following through.
00:17:40.380 Now, yeah, I guess it still goes to the Senate, so who knows if that's going to happen.
00:17:45.200 All right, who would like to hear an inspirational story?
00:17:49.000 Anybody?
00:17:50.300 It's even better than the Houston shooting.
00:17:53.540 Better.
00:17:54.700 All right.
00:17:55.420 Now, the people on the Locals platform heard this, but I'm going to add a detail that you didn't hear.
00:18:02.720 And it's a fun detail.
00:18:04.640 It's worth it.
00:18:05.620 So don't tune out because you've already heard it.
00:18:07.240 So yesterday, I was planning to go to Starbucks, do some work.
00:18:13.480 My car's in the shop because I have a BMW, so it always has some warning light or another on.
00:18:21.180 So it's in the shop.
00:18:22.540 And, of course, it always goes in the shop on a three-day weekend.
00:18:24.960 Of course.
00:18:26.260 Of course.
00:18:26.700 I don't bring it in on a Friday, but it got extended into the three-day weekend.
00:18:31.120 So I called an Uber.
00:18:33.920 Uber shows up, and it's a Tesla.
00:18:36.980 And I realized that I'd never opened a Tesla car door.
00:18:44.820 And it's got the door handles, like, flat with the door itself.
00:18:50.440 And so I'm standing outside the Uber, and I'm like a monkey with a coconut trying to open this car.
00:18:56.720 I'm like, ooh, ooh, ooh, ooh, ooh, ooh, ooh, ooh, ooh.
00:18:59.920 And, you know, the driver sees that I'm struggling, and I'm, ooh, ooh, ooh, boomer, ooh, ooh.
00:19:06.580 Yes, you can say boomer.
00:19:08.520 Totally appropriate under these circumstances.
00:19:11.420 Now, to be clear, I've been in a few Teslas.
00:19:15.500 I've ridden in them before.
00:19:17.260 But I can't remember ever opening the door.
00:19:20.380 I think maybe the door was opened or something.
00:19:23.300 I don't know.
00:19:24.320 But I've never opened one.
00:19:25.760 So anyway, I finally figured out how to push it, and it comes out, and I open the door.
00:19:29.280 So I'm getting in the door, and the driver jokes with me, and he jokes this way.
00:19:35.520 He goes, you know, Elon Musk, he doesn't make these cars.
00:19:38.960 He doesn't design these cars for the common people like us.
00:19:43.440 He goes, I think he just makes these cars for, like, you know, geniuses or something.
00:19:47.440 So everybody has trouble with the car door.
00:19:49.380 We can't figure it out.
00:19:50.900 So we're laughing about, you know, Elon Musk designing this car that I can't figure out how to get into.
00:19:56.520 So, this young Hispanic driver told me he was 26 years old, and I asked him where he lived.
00:20:04.620 He said, Stockton, but he likes to drive.
00:20:08.120 Now, Stockton, if you don't know California, is a higher crime, lower income place.
00:20:14.940 It's sort of not an ideal place to be an Uber driver.
00:20:18.780 You know, you'd want to go to a better neighborhood.
00:20:20.920 But one of the reasons he goes to my neighborhood is that we're just, it's a rich neighborhood.
00:20:25.680 So we're just full of CEOs and tech bros and, you know, just successful people of all kinds.
00:20:32.520 And so he drives in that neighborhood, and he explicitly has this strategy.
00:20:39.480 He's been reading motivational books and stuff, and he had a tough childhood.
00:20:45.140 I guess his mom was a drug addict and high crime and, you know, just the worst situation, poverty.
00:20:52.420 And he was going to be the one who made it out, right?
00:20:56.740 So he figured out how to become an Uber driver, and his strategy was, his system, his system was to drive where he would be exposed to successful people.
00:21:09.120 And his theory was that that exposure would have a variety of benefits which would help him in his, you know, his race to the top.
00:21:20.280 So we're chatting away.
00:21:21.740 Now, just hold this in your mind.
00:21:26.740 Here's somebody whose strategy was to try to meet the right people who could help him.
00:21:33.600 And of all the people in the whole world, 8 billion people in the world, there wasn't anybody better for him to be in that car than me.
00:21:43.780 Now, I know, I know what you're going to say.
00:21:46.520 Oh, my God, the ego on this man.
00:21:48.920 No, I'm just saying that it's exactly my job.
00:21:51.540 My precise, exact job is giving career and success advice to exactly people like him.
00:22:01.300 You know, my book, How to Fail, is probably the most influential book in that genre at the moment.
00:22:07.780 8 billion people on earth, and I got in his car.
00:22:11.640 Now, everybody who's saying braggart and ego, I do this intentionally to find out who are the weak people to get rid of.
00:22:22.200 Because you can't even handle the conversation.
00:22:25.660 All right.
00:22:26.080 But there's no way to tell the story without saying, the truth is, there's probably nobody better on the planet than me.
00:22:35.940 At this very narrow thing, right?
00:22:38.380 There's lots of things I'm bad at, that you're better than I am.
00:22:40.960 I'm not saying I'm better than you.
00:22:42.520 I'm saying at this very narrow thing, it's what I do.
00:22:45.900 It was like somebody asked, I'm not going to make that comparison.
00:22:49.940 It'll get me in trouble.
00:22:51.920 So, I tell this story on Twitter.
00:22:55.780 I tell this story about the young man whose system was to meet people,
00:22:59.360 and that he was lucky enough to meet me.
00:23:01.160 And I left off cryptically, and this is true, as I was leaving the car at Starbucks.
00:23:10.100 I said, you don't know it yet, but this is the luckiest day of your life.
00:23:16.460 Now, by then, I'd already told him what my book was and who I was,
00:23:20.000 and we talked a little bit about the whole genre of self-help, etc.
00:23:24.980 So, he knew where I fit into the ecosystem.
00:23:28.020 But the story's not over.
00:23:31.160 I tweet about this, and I tell the story about how this young man came from bad circumstances
00:23:38.600 and was using his system to meet people, and it worked that he met me.
00:23:44.220 Elon Musk sees my tweet and responds to his strategy of being where the successful people are.
00:23:53.780 And Elon Musk tweets in reply, he's right.
00:23:56.840 Just hold this in your head for a moment.
00:24:02.200 You know, this is the model of the world that I have, that some people are players, and some people are NPCs.
00:24:09.400 The other person who has that model is Elon Musk, where it just seems as though we're in a simulation,
00:24:15.720 and people like Elon Musk can sort of author the simulation for ridiculous outcomes.
00:24:22.160 I believe it as well, and I believe I have also routinely changed reality in ways that just defy all logic.
00:24:30.740 So, if we're not a simulation, I don't have a better explanation of what's going on.
00:24:35.120 This young Uber driver had a very clear intention to move the universe.
00:24:45.420 He wanted to nudge the universe through his own success.
00:24:49.120 I got in his car, against all odds, all odds.
00:24:57.300 He makes a joke about Elon Musk not wanting to deal with us.
00:25:05.540 Elon Musk tweets a confirmation of this kid's, 26 years old, I call him a kid,
00:25:12.780 of this kid's system and validates it.
00:25:17.200 He validates his system.
00:25:19.580 Now, I don't have a way to contact him, because, you know, Uber is a first name kind of thing.
00:25:25.020 He might contact me someday, you know, just to follow up if he reads my book or something.
00:25:29.520 And I'll let him know the, you know, I'll connect the dots and let him know the story.
00:25:32.460 But what happens when Elon Musk, who has, I don't know, 125 million followers, etc.,
00:25:39.380 what happens when he validates a success strategy?
00:25:44.780 That, you know, being around successful people helps you succeed.
00:25:47.700 It's present.
00:25:50.760 This kid actually moves the universe from his Uber car without knowing it.
00:25:56.960 Some day, that kid's a player.
00:26:04.080 He's a player, right?
00:26:05.800 The only thing I know for sure is he's not an NPC, because he moves the universe.
00:26:12.580 A little bit, just a nudge.
00:26:14.360 But he did that.
00:26:16.220 Is that the coolest story you've heard today?
00:26:18.320 It's early, right?
00:26:20.020 Yeah, it's cool.
00:26:23.100 All right.
00:26:23.360 Well, I don't know if 126 million people saw it, but...
00:26:25.480 All right, that was your inspirational story for the day.
00:26:30.300 Speaking of Elon Musk, I also tweeted about the WEF.
00:26:35.780 And I said this, I'm skeptical of anything that can't be explained in a sentence.
00:26:41.640 What exactly do they do and why?
00:26:44.660 Have you ever thought of that?
00:26:46.240 Here's a good standard for detecting bullshit.
00:26:48.900 If somebody can't explain their situation or what they're offering or their product in one sentence, it's a scam.
00:27:01.020 Now, the first time you hear that, you're going to say, that can't be true.
00:27:04.780 There must be plenty of legitimate things that are a little hard to explain.
00:27:09.800 Nope.
00:27:11.400 No, there are not.
00:27:12.740 There are no legitimate things that are hard to explain.
00:27:15.460 Nope.
00:27:16.040 You'll never find one.
00:27:16.980 The only things that are hard to explain are sketchy things.
00:27:21.160 Sketchy things are hard to explain because you don't want to say the truth.
00:27:25.180 So ask anybody who is pro-World Economic Forum.
00:27:30.060 What do they do and why?
00:27:32.260 And then watch what happens.
00:27:33.680 It's going to be word salad.
00:27:35.340 Well, we like to bring together the future leaders to coordinate and the communication of the synergies,
00:27:42.580 because then the world can be moving in a way that's compatible with the future of both the technology and freedom and equality,
00:27:49.920 and we'll also have the no racism and save the world because of climate change and all the good things and leaders in your community.
00:28:01.460 Right?
00:28:02.020 Now, I'm totally serious.
00:28:05.800 We need to get them on record saying, what is this thing and why does it exist?
00:28:15.020 Where do I send my money?
00:28:17.260 Right?
00:28:17.440 So, after I said, I'm skeptical of anything that can't be explained in a sentence.
00:28:27.420 Then Elon Musk was looking at the same video, and he responded,
00:28:32.900 because Klaus Schwab sounds like a Bond villain.
00:28:36.020 You all know that, right?
00:28:37.260 Like, I'm not the first person to say that.
00:28:40.100 Do you know who else says that Klaus Schwab, the head of the World Economic Forum,
00:28:43.580 do you know who else says he sounds like a Bond villain?
00:28:47.720 Everybody who hears him.
00:28:50.580 Everybody.
00:28:53.300 Everybody.
00:28:54.340 Like, we all thought of it at the same time.
00:28:57.840 Is that the head of Spectre?
00:29:02.340 I think that's the head of Spectre.
00:29:04.480 Like, everybody.
00:29:05.860 Same time.
00:29:07.240 So, that's never a good look, right?
00:29:09.220 Never a good look.
00:29:09.920 But, you know, in his defense, Klaus Schwab is a Swiss economist slash engineer.
00:29:20.080 So, Swiss economists slash engineers are not exactly Mr. Charisma.
00:29:25.740 So, he's got that working against him.
00:29:29.220 But, he talked about the WEF and the need to, quote, master the future,
00:29:35.800 which Elon Musk said, master the future doesn't sound ominous at all.
00:29:41.800 And then he says, how is the WEF slash Davos even a thing?
00:29:46.840 Are they trying to be the boss of Earth?
00:29:50.740 Now, that's my point as well, right?
00:29:54.480 Like, Elon Musk is a master of the universe.
00:29:59.040 You know, he's one of the people who's figured it out.
00:30:01.780 He's invited to the WF and turned it down because it sounded boring AF, as he said.
00:30:07.840 He doesn't know what they do.
00:30:10.860 Like, what's the point of this?
00:30:13.080 What exactly are you trying to do here?
00:30:15.320 Now, you tell me, if Elon Musk, who is invited to it,
00:30:20.520 can't figure out why they exist and what they do,
00:30:23.220 are they just trying to be the boss of Earth?
00:30:25.460 You should worry about that, right?
00:30:27.820 Now, that's basically another way to say,
00:30:31.160 if you can't explain it in a sentence, it's sketchy.
00:30:35.480 Now, by any objective, reasonable consideration,
00:30:42.300 are they not trying to inject their influence
00:30:45.280 between the citizens of a country and their government?
00:30:48.960 That part would be confirmed by both people pro and con, right?
00:30:56.140 That that is a description.
00:30:58.240 They're inserting themselves between the leadership of a country and the citizens.
00:31:03.420 Who in the world would agree to that?
00:31:06.280 Who in the world would say yes to that?
00:31:08.660 Well, the fact that it's a non-profit means little.
00:31:16.740 Yeah.
00:31:17.340 It just seems like a crazy thing to be in favor of.
00:31:21.280 So I asked if there's any kind of a master list on Twitter, I asked,
00:31:25.320 of people, at least Americans, who are attending,
00:31:27.580 so we can know who all the dumb people are.
00:31:30.840 Doesn't that feel useful?
00:31:32.640 Wouldn't you have to, like, have a master list
00:31:34.840 of who is gullible enough to go to this thing?
00:31:38.660 Well, it turns out there are quite a few of American politicians,
00:31:43.580 both Republican and Democrat.
00:31:45.980 So it's totally bipartisan, it looks like.
00:31:48.460 Yeah.
00:31:49.100 Four Republican governors.
00:31:52.140 Kyrsten Sinema.
00:31:53.960 Somebody said Joe Manchin.
00:31:55.080 I didn't see his name on the list, but I don't know if that's true.
00:31:59.880 Yeah, Christopher Wray, our FBI director, is at the WEF.
00:32:05.420 Seriously?
00:32:06.020 Our FBI director is attending the organization
00:32:11.400 that wants to get between our government and the people.
00:32:15.040 What could be worse than that?
00:32:17.520 Like, he should be fired for that.
00:32:20.320 I think.
00:32:21.500 I think he should be fired for that.
00:32:22.980 Like, actually fired for that.
00:32:24.960 Is that wrong?
00:32:25.600 I think attending that is such a bad look for the FBI,
00:32:30.440 such a bad look, that that's disqualifying.
00:32:33.940 I mean, that would be similar to just taking his dick out
00:32:36.200 and, like, pissing on Congress, to me.
00:32:39.960 I mean, maybe it's not technically illegal,
00:32:43.300 but I think you should get fired for it.
00:32:46.200 You know, it's probably approved by somebody above him,
00:32:48.180 so that's different.
00:32:49.440 But I don't approve of that whatsoever.
00:32:54.060 But anyway, it's nice to have a list.
00:32:56.120 Code Monkey Z, who's back on Twitter.
00:32:59.620 Do you remember Code Monkey Z?
00:33:02.140 He's back on Twitter.
00:33:03.360 He had a good list.
00:33:05.700 And I think everybody on the list should be asked in public
00:33:08.220 to explain what the WEF is and why they attended.
00:33:13.300 Oops.
00:33:15.140 Sorry, I just disappeared there for a minute.
00:33:16.780 So I think everybody on the American list of attendees,
00:33:20.500 yeah, Ron Watkins is the name behind Code Monkey Z.
00:33:25.720 And some say he was behind Q.
00:33:30.760 I don't know if that's confirmed or not.
00:33:33.540 He's the name they most often say.
00:33:35.720 Anyway, I saw a tweet by Rob Reiner today,
00:33:45.600 and I didn't know how to comment on it
00:33:48.820 because it's now indistinguishable from parody.
00:33:53.600 I looked at it, and I swear I couldn't tell if he was joking
00:33:56.520 because he keeps saying that, you know,
00:33:58.680 Biden has been proven completely honest, unlike Trump.
00:34:02.140 And I think you couldn't possibly be looking at the news.
00:34:05.720 Are you looking at the same news I'm looking at?
00:34:08.860 And even Musk weighed in.
00:34:15.680 Somebody tweeted,
00:34:19.040 you're saying I should stay away from trashing Biden
00:34:21.480 because I'll see more Rob Reiner and Keith Olbermann tweets.
00:34:24.760 The context here was Musk had tweeted
00:34:27.500 that if you interact with an account you don't like,
00:34:30.640 the Twitter algorithm will give you more accounts you don't like.
00:34:34.920 And then he laughed because he didn't think that was necessarily a mistake,
00:34:38.960 and I agree.
00:34:39.900 If you enjoyed interacting with an account you didn't like,
00:34:43.960 why wouldn't it give you more of those?
00:34:45.580 You enjoyed it.
00:34:46.680 You spent time on it.
00:34:48.180 So it's a weird little kink of the algorithm
00:34:51.200 that'll give you what you don't want
00:34:52.680 because sometimes you want what you don't want.
00:34:54.700 You know, it's like a confusing little situation.
00:34:58.140 So Musk called that out,
00:35:00.000 and he was agreeing with his tweet.
00:35:03.400 So talking about Rob Reiner and Keith Olbermann's tweets,
00:35:07.280 Musk says,
00:35:08.520 is it even possible to parody their tweets?
00:35:11.400 I'm just literally wondering right now if it's even possible.
00:35:15.220 Totally agree.
00:35:16.600 Like actually, literally, no joke, no hyperbole.
00:35:19.520 I don't know if you could do it.
00:35:20.820 You know, I saw a parody account today where it took me 10 minutes
00:35:25.120 to decipher that it was a parody.
00:35:28.780 Like, I really couldn't tell.
00:35:30.360 I had to look pretty deep in the tweet stream.
00:35:32.960 I'm like, this could be just a Democrat.
00:35:36.740 And then finally I saw interacting with Michael Knowles,
00:35:42.140 and that like gave away the parody.
00:35:43.920 But at first I actually couldn't tell.
00:35:46.200 It was way over the top, but still, you know, can't tell.
00:35:50.820 So if you haven't seen this thread, it's just required.
00:35:56.260 Jeez, another glitch.
00:36:00.400 This is just a required thread.
00:36:02.860 All right.
00:36:03.380 Now, I can't require you to do anything,
00:36:05.980 but I would suggest that if you don't see the...
00:36:09.120 I tweeted a thread by Kanakoa the Great,
00:36:13.600 who has a substack as well.
00:36:15.200 But his tweet threads are just outrageously good.
00:36:19.120 Like, better than anything you're seeing in the news.
00:36:21.380 I don't know how, like, what his background is.
00:36:24.580 I don't know anything about him.
00:36:25.500 But the work is, oh my God.
00:36:27.820 It's like better than anything you're seeing
00:36:29.480 in terms of really describing, you know, the situation.
00:36:33.640 Anyway, he's reporting on...
00:36:35.200 I didn't know about any of this,
00:36:37.500 but apparently the DHS, Department of Homeland Security,
00:36:42.900 they outsourced censorship of the Internet platforms
00:36:47.180 to the Election Integrity Partnership,
00:36:51.060 which was comprised of four organizations.
00:36:55.620 Okay, let me stop here.
00:36:58.660 Kanakoa the Great correctly uses the word comprised.
00:37:02.380 You almost never see that.
00:37:05.760 Most people would say it was composed of four organizations,
00:37:09.200 which is actually improper.
00:37:10.920 He uses the correct word, it was comprised of.
00:37:14.040 That's actually a tell for somebody who's operating
00:37:17.540 on at least a higher level of writing talent,
00:37:20.760 and probably that spills over into his reasoning ability.
00:37:25.620 So, four organizations,
00:37:27.180 the Stanford Internet Observatory,
00:37:28.820 the University of Washington's Center for an Informed Public.
00:37:35.320 What the hell is that?
00:37:36.280 The Atlantic Council's Digital Forensic Research Lab,
00:37:39.560 and something called Graphika.
00:37:42.480 All right, do you see what's going on?
00:37:45.180 They're laundering the censorship
00:37:46.760 through, like, organizations.
00:37:50.220 So it's going to look legitimate
00:37:51.840 because they can say these organizations,
00:37:54.040 but it would also be so complicated
00:37:55.760 that it's hard to figure out what's going on.
00:37:59.100 All right, sort of a perfect cover, the complexity.
00:38:05.600 And they said that the EIP
00:38:09.460 claimed every, quote,
00:38:12.060 repeat spreader of election misinformation
00:38:14.300 was on the right.
00:38:15.420 What?
00:38:22.100 Every one of the repeat spreaders of misinformation,
00:38:25.480 according to the censors,
00:38:27.460 they were all on the right.
00:38:28.860 Every one of them.
00:38:30.020 Huh.
00:38:30.940 There are very few things
00:38:32.160 that are that completely binary.
00:38:35.280 And this isn't one of them.
00:38:37.600 Obviously, there are plenty of tweets
00:38:39.420 of Hillary Clinton and Democrats
00:38:41.420 and, you know, Stacey Abrams
00:38:43.520 questioning elections.
00:38:46.780 But all of the repeat offenders,
00:38:48.900 all of them were on the right.
00:38:51.260 Now, here's my question.
00:38:54.200 Is that telling you that
00:38:56.160 they were simply political?
00:39:01.020 That they knew what they were doing,
00:39:02.640 they knew they were just being political,
00:39:04.520 and that's all it was?
00:39:05.980 Because otherwise, how do you explain
00:39:07.440 100% of all the bad guys
00:39:10.040 being on one side according to them?
00:39:13.940 Well, I'm going to propose
00:39:17.240 that there's at least one other possibility.
00:39:20.000 There is one other possibility.
00:39:22.140 That they believed it.
00:39:24.240 That they believed it.
00:39:26.480 Because the left actually believes
00:39:28.620 everything the left says is actually true.
00:39:31.080 They actually believe it.
00:39:34.200 So when they're looking at it,
00:39:35.620 they say, well, it looks like
00:39:36.720 100% of the lies are on this side.
00:39:38.360 But that isn't necessarily
00:39:40.760 purely political.
00:39:43.440 It might actually be
00:39:45.100 that they believe it.
00:39:47.440 And by the way,
00:39:48.380 that wouldn't even be unusual.
00:39:50.460 That would require no leap of logic,
00:39:53.300 no believing in something
00:39:54.760 non-scientific.
00:39:55.860 It would be the way
00:39:56.520 normal people act.
00:39:58.660 You don't think you could get
00:39:59.980 a bunch of Republicans in a room
00:40:03.040 to say that the only people lying
00:40:05.340 are the Democrats?
00:40:07.880 Of course you could.
00:40:09.580 Would they be acting only politically?
00:40:12.740 Nope.
00:40:13.800 No, they would actually believe it.
00:40:16.060 Right.
00:40:16.700 If you put a bunch of Republicans in a room,
00:40:18.520 they would say
00:40:19.040 a whole bunch of conspiracy theories
00:40:21.760 that they believe are just facts.
00:40:24.600 And they would say
00:40:25.380 these other guys are just spewing garbage.
00:40:27.260 The truth is,
00:40:28.460 both sides have a healthy dose
00:40:30.640 of conspiracy theories and bullshit.
00:40:33.220 But if only one side
00:40:34.360 is doing the fact-checking,
00:40:35.840 you've got a real big problem.
00:40:37.860 And that's what
00:40:38.640 Kanakoa the Great
00:40:39.860 is calling out
00:40:40.600 in great detail.
00:40:41.580 So he's got all the receipts
00:40:42.740 and he shows it
00:40:43.500 in a way that
00:40:45.020 I just recommend
00:40:45.740 that you look at the details.
00:40:47.320 The big picture is
00:40:48.540 that our government
00:40:49.860 was definitely involved
00:40:51.180 in censorship
00:40:52.280 in a big way,
00:40:54.060 in an important way
00:40:55.680 that probably influenced elections.
00:40:58.200 Probably.
00:40:59.540 And that they use
00:41:01.700 these cutouts,
00:41:02.840 if you will,
00:41:03.180 it's probably the wrong term,
00:41:04.560 but they use these organizations
00:41:05.660 that are not terribly credible
00:41:08.200 to me anyway
00:41:09.300 to, you know,
00:41:11.200 to make this happen
00:41:11.960 and make it confusing
00:41:12.840 and give us some
00:41:14.100 laundered legitimacy.
00:41:17.540 So 22 million tweets
00:41:19.420 were labeled misinformation.
00:41:21.660 None of them on the left.
00:41:24.060 So the left
00:41:25.240 has got a good record.
00:41:27.280 They're 22 million to zero.
00:41:31.180 That's pretty good.
00:41:32.260 22 million to zero.
00:41:35.360 All right.
00:41:36.300 This next segment
00:41:37.100 I call backwards science.
00:41:40.120 Backwards science.
00:41:41.980 The people who get
00:41:42.800 cause and effect
00:41:43.860 backwards,
00:41:45.240 or might have.
00:41:47.520 So I'm going to give you
00:41:48.580 two situations
00:41:49.360 in which
00:41:49.920 it's entirely possible
00:41:51.720 that the interpretation
00:41:52.600 is accurate.
00:41:54.340 Maybe the cause and effect
00:41:55.620 is exactly what they say.
00:41:57.600 But you decide.
00:41:59.380 Number one.
00:42:01.420 There's a new study
00:42:02.440 that says
00:42:02.900 going for a walk
00:42:03.780 in the park,
00:42:04.380 or along a lake,
00:42:05.960 or through tree-lined spaces,
00:42:07.160 basically in nature.
00:42:08.780 Doing some exercise
00:42:09.620 in nature
00:42:10.440 may reduce the need
00:42:13.400 for medications
00:42:14.100 for anxiety,
00:42:15.160 asthma,
00:42:15.600 depression,
00:42:16.080 high blood pressure
00:42:16.860 or insomnia.
00:42:21.280 That
00:42:21.720 physical activity
00:42:23.500 is thought to be
00:42:24.640 the key mediating factor
00:42:26.100 when you're out
00:42:27.180 in these green spaces.
00:42:28.100 So it's not just being there.
00:42:29.760 It's being active out there.
00:42:31.320 And the study
00:42:32.620 found that
00:42:33.100 visiting nature
00:42:33.840 three to four times
00:42:34.860 a week
00:42:35.300 was associated with
00:42:37.020 wait,
00:42:37.460 associated with
00:42:38.260 they didn't even use
00:42:40.680 the word causation.
00:42:42.480 They don't say
00:42:43.360 it's a cause.
00:42:44.580 Just associated with.
00:42:47.320 So associated with
00:42:48.860 36% lower odds
00:42:50.760 of using blood pressure pills,
00:42:52.400 33% lower odds
00:42:53.740 of using mental health
00:42:54.540 medications,
00:42:55.200 and 26% lower odds
00:42:56.560 of using asthma medications.
00:42:58.080 Now,
00:43:00.640 let me ask you
00:43:02.040 this question.
00:43:04.760 If you were to
00:43:05.720 divide the world
00:43:06.500 into people
00:43:07.300 who are already healthy
00:43:08.560 and people
00:43:10.160 who are not healthy,
00:43:11.360 which one of those
00:43:12.560 do more hiking?
00:43:15.480 Do the unhealthy people
00:43:17.140 do as much hiking
00:43:18.080 as the healthy people?
00:43:20.360 Because I wouldn't
00:43:21.720 have to do
00:43:22.180 any science whatsoever
00:43:23.280 to determine
00:43:24.500 that people
00:43:24.980 who go hiking
00:43:25.780 are healthier
00:43:27.500 than people
00:43:28.000 who don't.
00:43:29.180 And I don't need
00:43:30.060 any science
00:43:30.620 to tell me that.
00:43:32.260 Do you know why?
00:43:32.980 Because unhealthy people
00:43:33.840 don't hike.
00:43:35.120 Healthy people do.
00:43:37.340 Is this
00:43:38.040 probably backward science?
00:43:40.720 Don't you think
00:43:41.180 just healthy people
00:43:42.120 go hiking?
00:43:42.820 It's not that
00:43:43.260 hiking
00:43:43.820 makes you healthy?
00:43:44.780 Now, of course
00:43:45.800 hiking makes you
00:43:47.000 healthier.
00:43:48.000 Let me be clear.
00:43:49.580 There's no question
00:43:50.520 whatsoever
00:43:50.980 it works both ways.
00:43:52.660 Right?
00:43:53.280 We know it works
00:43:54.140 both ways.
00:43:54.580 For sure it works
00:43:55.360 both ways.
00:43:55.860 But don't you think
00:43:57.500 the bigger factor
00:43:58.860 is that healthy people
00:44:00.600 can hike
00:44:01.220 and unhealthy people
00:44:02.820 can't get off the couch?
00:44:05.040 I'm just going to
00:44:05.860 put it out there.
00:44:07.140 Possibility.
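The self-selection argument above can be sketched as a toy simulation. To be clear, this is only an illustration of the statistical point, not the study's data; every number in it is invented. Even when the activity has a small true benefit, the observed gap between hikers and non-hikers comes out many times larger, because healthy people select into hiking.

```python
import random

random.seed(0)  # deterministic toy example

# Invented numbers for illustration: baseline health drives both
# who hikes and who gets sick, while hiking itself helps only a little.
N = 100_000
TRUE_HIKING_BENEFIT = 0.02  # hiking's real reduction in illness risk

people = []
for _ in range(N):
    healthy = random.random() < 0.5                      # baseline health
    hikes = random.random() < (0.7 if healthy else 0.1)  # self-selection
    base_risk = 0.10 if healthy else 0.40                # health dominates risk
    sick = random.random() < base_risk - (TRUE_HIKING_BENEFIT if hikes else 0.0)
    people.append((hikes, sick))

def sick_rate(group):
    return sum(s for _, s in group) / len(group)

hikers = [p for p in people if p[0]]
non_hikers = [p for p in people if not p[0]]
observed_gap = sick_rate(non_hikers) - sick_rate(hikers)

print(f"sick rate, hikers:     {sick_rate(hikers):.3f}")
print(f"sick rate, non-hikers: {sick_rate(non_hikers):.3f}")
print(f"observed gap {observed_gap:.3f} vs true hiking effect {TRUE_HIKING_BENEFIT}")
```

The observed gap is dominated by who chooses to hike, not by what hiking does, which is exactly why "associated with" is the careful wording for this kind of study.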
00:44:08.620 Let's do another one.
00:44:11.120 There is a report
00:45:13.160 out of Australia,
00:45:14.500 and you're not
00:45:15.140 going to believe this,
00:45:16.620 but people
00:44:16.620 who are healthy
00:44:17.480 don't go to the
00:44:21.140 doctors and die
00:44:22.840 nearly as often
00:45:24.660 as people who
00:44:25.200 are super unhealthy.
00:44:27.220 Did you know that?
00:44:29.460 Yeah?
00:44:30.800 Well, that's what
00:44:31.380 the Australians
00:44:31.820 figured out
00:44:32.260 that healthy people
00:44:33.540 live longer
00:44:34.420 than unhealthy people.
00:44:35.900 Now, they didn't
00:44:36.400 say it that way.
00:44:36.980 They said it a little
00:44:37.440 differently.
00:44:38.240 What they said was
00:44:38.940 that the people
00:44:39.400 who had the most
00:44:40.220 vaccinations
00:44:41.260 and boosters
00:44:42.180 were the only ones
00:44:44.300 dying.
00:44:45.480 Literally the only ones.
00:44:46.680 It was like a
00:44:47.500 subset of a thousand
00:44:49.200 some people.
00:44:49.960 So it wasn't
00:44:50.440 the biggest study
00:44:51.040 in the world.
00:44:51.500 So I think
00:44:53.300 you'd need
00:44:53.620 confirmation.
00:44:54.820 But they found
00:44:55.300 that zero
00:44:56.780 in the period
00:44:57.900 that they looked
00:44:58.460 at this recently
00:44:59.140 zero unvaccinated
00:45:01.100 people died
00:45:01.760 in the hospital
00:45:02.440 from COVID.
00:45:03.880 Zero.
00:45:05.100 Not a single
00:45:06.200 unvaccinated person.
00:45:11.360 Who do you think
00:45:12.360 got vaccinated
00:45:13.100 the most?
00:45:15.040 Was it
00:45:15.880 the people
00:45:16.840 with no
00:45:17.540 comorbidities
00:45:18.480 who looked
00:45:19.360 perfectly healthy
00:45:20.280 and said
00:45:21.060 to themselves
00:45:21.560 I'm perfectly
00:45:22.260 healthy
00:45:22.660 and we're
00:45:23.400 already in the
00:45:23.940 age of Omicron
00:45:24.840 and if I haven't
00:45:25.900 got vaccinated
00:45:26.540 by now
00:45:27.220 it would be
00:45:28.100 maybe unwise
00:45:28.980 to do so
00:45:29.700 because Omicron
00:45:30.720 is not going
00:45:31.180 to hurt me.
00:45:31.700 I just ran
00:45:32.140 a marathon.
00:45:34.260 Or
00:45:34.700 are you more
00:45:35.860 likely to get
00:45:36.520 vaccinated
00:45:36.900 if you're
00:45:37.840 400 pounds
00:45:38.680 and you have
00:45:39.060 diabetes?
00:45:42.980 I don't know.
00:45:43.900 I think you're
00:45:44.400 more likely to get
00:45:45.160 vaccinated
00:45:45.560 if you're
00:45:46.740 unhealthy.
00:45:47.440 So
00:45:48.720 to me
00:45:49.660 all it
00:45:50.060 proved
00:45:50.320 is that
00:45:50.960 unhealthy
00:45:51.760 people die
00:45:52.620 and healthy
00:45:53.240 people do
00:45:53.800 not.
00:45:55.520 The other
00:45:56.340 possibility
00:45:57.020 is that
00:45:58.940 the jabs
00:45:59.460 are killing
00:45:59.860 people.
00:46:03.020 Which one
00:46:03.760 is it?
00:46:05.840 Would you
00:46:06.500 agree that
00:46:06.900 they're both
00:46:07.300 possible?
00:46:08.540 Are they
00:46:08.860 both possible?
00:46:10.340 That the
00:46:10.780 only explanation
00:46:11.740 is the jabs
00:46:12.500 are killing
00:46:12.920 you?
00:46:13.960 Or
00:46:14.400 maybe the
00:46:16.760 most likely explanation
00:46:16.760 is that
00:46:17.180 healthy people
00:46:17.700 didn't get
00:46:18.080 vaccinated?
00:46:20.940 Probably
00:46:21.420 both.
00:46:22.960 Probably
00:46:23.440 both.
00:46:24.000 Do you
00:46:24.220 know why
00:46:24.440 I say
00:46:24.680 probably
00:46:25.020 both?
00:46:26.000 Because we
00:46:26.660 know
00:46:26.880 vaccinations
00:46:27.480 are dangerous
00:46:28.940 to some
00:46:29.560 people.
00:46:30.680 So if you've
00:46:31.380 got a thousand
00:46:31.900 people,
00:46:32.540 well,
00:46:32.760 probably.
00:46:34.020 Probably
00:46:34.500 maybe one
00:46:35.520 of them?
00:46:35.960 I don't know.
00:46:36.540 Not probably
00:46:37.240 because I
00:46:37.640 think,
00:46:38.640 I don't think
00:46:39.120 one in a
00:46:39.540 thousand are
00:46:40.000 being killed
00:46:41.160 by a
00:46:41.520 vaccination.
00:46:42.400 It's not
00:46:42.780 one in a
00:46:43.120 thousand.
00:46:43.500 That would be
00:46:43.700 too much.
00:46:44.720 But you
00:46:45.220 know,
00:46:45.380 you can
00:46:45.620 imagine if
00:46:46.180 you looked
00:46:46.460 at a
00:46:46.660 thousand
00:46:46.880 people,
00:46:47.300 you get
00:46:47.600 maybe one
00:46:48.960 or two.
00:46:50.120 Who knows?
00:46:51.520 And I
00:46:51.860 can also
00:46:52.220 imagine if
00:46:52.940 it's,
00:46:54.080 you know,
00:46:54.580 I'm reasonably
00:46:55.480 healthy,
00:46:56.300 but I didn't
00:46:57.140 tolerate the
00:46:57.800 first two
00:46:58.220 shots at
00:46:58.660 all.
00:46:59.460 Like,
00:46:59.800 they kicked
00:47:00.120 my ass.
00:47:01.620 If the
00:47:02.260 shot itself
00:47:02.880 kicks your
00:47:03.360 ass and
00:47:03.760 you're already
00:47:04.120 near death,
00:47:05.560 I'm not
00:47:06.180 too surprised
00:47:06.860 if it kills
00:47:07.400 some people
00:47:07.860 too.
00:47:09.080 Would that
00:47:09.760 surprise you?
00:47:11.100 If the
00:47:11.480 virus could
00:47:12.020 kill you,
00:47:13.300 and I
00:47:14.040 don't know
00:47:14.220 about you,
00:47:14.680 but I
00:47:14.940 felt just
00:47:15.400 as sick
00:47:15.740 with the
00:47:16.040 shot as
00:47:16.480 I did
00:47:16.760 with the
00:47:17.020 virus.
00:47:18.460 The shot
00:47:19.220 was before
00:47:20.680 Omicron,
00:47:21.620 but when I
00:47:22.140 got Omicron,
00:47:23.700 it kicked
00:47:24.200 my ass,
00:47:24.820 but about
00:47:25.180 the same
00:47:25.560 as the
00:47:25.820 shot did.
00:47:26.920 There wasn't
00:47:27.300 that much
00:47:27.740 difference.
00:47:28.980 So if
00:47:30.400 you can
00:47:30.760 take out
00:47:31.840 an old
00:47:32.220 person with
00:47:32.800 just a
00:47:33.320 little bit
00:47:33.600 of Omicron,
00:47:34.440 which is
00:47:34.840 happening,
00:47:36.060 why wouldn't
00:47:36.760 the vaccination
00:47:38.360 take them
00:47:38.920 out?
00:47:39.100 It feels
00:47:40.640 just as
00:47:41.080 aggressive,
00:47:41.720 but in
00:47:42.080 a different
00:47:42.420 way.
00:47:45.200 Just put
00:47:45.800 that out
00:47:46.120 there.
00:47:46.680 So I
00:47:47.020 want to
00:47:47.200 be very
00:47:47.540 clear for
00:47:48.060 the
00:47:48.220 clopperts.
00:47:49.920 I would
00:47:50.480 be worried
00:47:50.900 about this
00:47:51.320 data.
00:47:53.120 There's
00:47:53.560 definitely a
00:47:54.040 flag, you
00:47:54.560 should look
00:47:54.880 at it.
00:47:55.520 But I
00:47:56.020 always see
00:47:57.020 Dr.
00:47:57.480 McCullough,
00:47:58.360 he's always
00:47:58.860 the one
00:47:59.180 who's on
00:47:59.560 these
00:47:59.920 correlation-causation
00:48:02.140 claims,
00:48:03.460 and he
00:48:05.480 could easily
00:48:06.080 turn out to
00:48:06.640 be right
00:48:07.000 about everything.
00:48:07.600 It's well
00:48:08.640 within the
00:48:09.520 realm of
00:48:10.040 possibility.
00:48:11.400 But he
00:48:11.760 consistently
00:48:12.500 treats this
00:48:14.360 kind of
00:48:14.760 study like
00:48:15.840 there's only
00:48:16.240 one
00:48:16.440 explanation.
00:48:18.660 There's at
00:48:19.280 least two.
00:48:20.460 At least
00:48:20.860 two, there
00:48:21.320 might be
00:48:21.600 more, which
00:48:22.080 is bad
00:48:22.620 data.
00:48:23.320 Bad data
00:48:23.860 would be
00:48:24.120 the other
00:48:24.420 explanation.
00:48:27.860 Well,
00:48:28.540 somebody's
00:48:28.980 saying Scott
00:48:29.580 was shilling
00:48:30.200 pro-vax in
00:48:31.100 my opinion.
00:48:32.420 What does
00:48:32.980 your opinion
00:48:33.380 have to do
00:48:33.840 with anything?
00:48:34.140 Why does
00:48:35.780 anybody care
00:48:36.220 about your
00:48:36.520 opinion?
00:48:37.200 I mean,
00:48:37.600 it's wrong,
00:48:38.440 obviously.
00:48:40.300 Why do we
00:48:41.180 care?
00:48:43.800 As long as
00:48:44.520 you say it's
00:48:44.900 your opinion,
00:48:46.400 don't say it's
00:48:47.320 a fact.
00:48:48.960 If you say
00:48:49.620 it's a fact,
00:48:50.520 well, then I
00:48:50.960 have to bury
00:48:52.220 you.
00:48:53.180 But if you
00:48:53.620 say it's
00:48:53.880 your opinion,
00:48:54.320 that's fine.
00:48:55.380 I mean, it's
00:48:55.720 an uninformed
00:48:56.360 opinion based
00:48:57.200 on sketchy
00:48:58.420 data, probably.
00:48:59.480 But opinions
00:49:00.040 are fine.
00:49:00.400 All right.
00:49:06.120 I saw
00:49:06.900 Twitter user
00:49:08.200 Eliza, who
00:49:10.340 is Eliza
00:49:11.100 Blue, who
00:49:12.260 is a
00:49:12.700 notable
00:49:13.160 anti-trafficking
00:49:15.600 advocate.
00:49:18.100 So she does
00:49:18.660 important good
00:49:19.940 work of
00:49:21.940 advocacy for
00:49:22.920 groups that
00:49:23.460 need it,
00:49:24.520 trafficked
00:49:24.960 people and
00:49:28.860 victims of pedophilia.
00:49:29.420 So a very
00:49:30.760 credible person,
00:49:33.220 doing good
00:49:34.080 work.
00:49:34.640 But she
00:49:35.000 asked this
00:49:35.340 question.
00:49:35.740 She said,
00:49:36.200 honest question,
00:49:37.380 what happened
00:49:38.100 where men are
00:49:39.180 buying into
00:49:39.740 these alpha
00:49:40.380 male scams?
00:49:41.920 Now, she
00:49:43.240 didn't say
00:49:43.720 Andrew Tate,
00:49:45.760 but I feel
00:49:46.500 like she
00:49:46.820 might have
00:49:47.080 been at
00:49:47.440 least thinking
00:49:47.900 of him a
00:49:48.580 little bit.
00:49:49.800 Yeah.
00:49:50.420 So honest
00:49:51.360 question, what
00:49:51.920 happened where
00:49:52.460 men are buying
00:49:53.080 into these
00:49:53.500 alpha male scams?
00:49:55.480 And then she
00:49:57.080 said, what can
00:49:57.700 I do as a
00:49:58.360 woman to
00:49:59.340 let men
00:49:59.840 know that
00:50:00.360 it's not
00:50:00.800 the move?
00:50:01.840 At least
00:50:02.260 not for
00:50:02.680 me.
00:50:03.240 It's not
00:50:03.660 attractive energy
00:50:04.600 in the
00:50:05.020 slightest.
00:50:06.780 And people
00:50:07.760 had a lot
00:50:08.140 of replies,
00:50:09.040 but I
00:50:09.860 noted this
00:50:11.040 one from
00:50:11.420 Jason Andrews,
00:50:13.060 who's a
00:50:13.680 hypnotist, by
00:50:14.900 the way.
00:50:15.720 And
00:50:16.100 Persuasion
00:50:16.980 Rising is his
00:50:17.760 account, if
00:50:18.280 you like to
00:50:18.700 follow
00:50:19.060 hypnotists.
00:50:20.320 And he
00:50:20.760 responded,
00:50:21.540 this is
00:50:22.800 pretty savage.
00:50:23.920 Now, I
00:50:25.220 don't endorse
00:50:25.800 this opinion.
00:50:26.800 I'll tell you
00:50:27.240 my opinion
00:50:27.680 in a
00:50:27.880 moment, but
00:50:28.680 it was so
00:50:29.340 savage, I
00:50:29.860 just had to
00:50:30.260 read it.
00:50:31.460 He goes,
00:50:32.500 first of all,
00:50:33.160 she said,
00:50:33.600 honest question,
00:50:34.420 blah, blah,
00:50:34.700 blah.
00:50:34.980 And he
00:50:35.340 says, no
00:50:36.260 honest person
00:50:37.040 could have
00:50:37.460 lived through
00:50:37.940 the last 20
00:50:38.740 to 30
00:50:39.180 years and
00:50:40.140 not be
00:50:40.620 aware of
00:50:41.180 why men
00:50:41.640 do this.
00:50:44.540 Men,
00:50:45.600 true or
00:50:45.920 false, you
00:50:46.940 couldn't
00:50:47.200 possibly have
00:50:47.880 lived through
00:50:48.180 the last 20
00:50:48.760 to 30 years
00:50:50.300 and be
00:50:50.980 confused at
00:50:51.660 all about
00:50:52.640 why this
00:50:53.080 is happening.
00:50:53.540 Then he
00:50:55.960 goes on,
00:50:56.520 because he's
00:50:56.780 not done,
00:50:57.680 he goes,
00:50:58.100 but you
00:50:58.420 have no
00:50:58.820 standing to
00:50:59.560 tell men
00:51:00.160 if that is
00:51:01.160 or isn't
00:51:01.640 the right
00:51:02.000 move because
00:51:02.940 you are not
00:51:03.480 one of
00:51:03.820 them.
00:51:05.140 Ouch.
00:51:06.980 Ouch.
00:51:08.360 Yeah,
00:51:09.300 yeah, I
00:51:10.180 will endorse
00:51:10.740 that opinion.
00:51:12.180 Yes, I
00:51:12.560 endorse that
00:51:13.060 opinion.
00:51:14.060 Now, but
00:51:14.840 to be clear,
00:51:16.460 let me
00:51:17.080 defend her
00:51:17.600 as well.
00:51:18.760 She didn't
00:51:19.240 say what
00:51:19.880 you should
00:51:20.280 do.
00:51:20.700 She was
00:51:20.920 more about
00:51:22.060 what her
00:51:22.620 preferences
00:51:23.100 would be.
00:51:24.160 So she
00:51:24.520 was saying
00:51:24.860 if you
00:51:25.180 act like
00:51:25.540 this, I
00:51:25.880 don't
00:51:26.060 personally
00:51:26.480 like it.
00:51:27.660 But if
00:51:27.980 you act
00:51:28.260 like this
00:51:28.620 other thing,
00:51:29.080 I would
00:51:29.360 personally
00:51:29.780 like it.
00:51:30.600 So I'm
00:51:31.440 going to
00:51:31.620 give her
00:51:31.940 a break.
00:51:34.340 I'm going
00:51:34.840 to defend
00:51:35.320 her a
00:51:35.560 little bit.
00:51:35.880 It was
00:51:36.060 more about
00:51:36.600 there are
00:51:37.840 people who
00:51:38.180 have personal
00:51:38.660 preferences,
00:51:39.580 and maybe
00:51:40.000 that's not
00:51:40.460 their personal
00:51:40.940 preference.
00:51:41.720 So that's
00:51:42.060 fair.
00:51:43.840 But I'm
00:51:44.420 also going
00:51:44.800 to disagree
00:51:45.220 with Jason
00:51:46.300 when he
00:51:46.800 says no
00:51:47.200 honest person
00:51:48.000 could have
00:51:48.380 lived through
00:51:48.720 the last
00:51:49.000 20 to 30
00:51:49.580 years and
00:51:50.520 not be
00:51:50.960 aware of
00:51:51.400 it.
00:51:52.620 You don't
00:51:53.300 think men
00:51:53.840 have just
00:51:54.400 as big
00:51:54.780 blind spots
00:51:55.520 about women?
00:51:57.360 All I
00:51:58.000 see is a
00:51:58.400 giant blind
00:51:58.880 spot.
00:52:00.140 I don't
00:52:00.960 really see
00:52:01.560 dishonesty.
00:52:03.140 But it was
00:52:03.800 a funny
00:52:04.120 comment, so
00:52:04.920 it's a good
00:52:06.260 Twitter exchange.
00:52:07.760 But I'm
00:52:08.120 going to
00:52:08.300 defend her a
00:52:08.800 little bit.
00:52:09.500 I feel like
00:52:10.260 the explanation
00:52:11.300 is far more
00:52:12.460 likely to be
00:52:13.220 that men and
00:52:14.320 women have
00:52:14.740 enormous blind
00:52:16.000 spots about
00:52:16.840 the other,
00:52:17.660 and there's
00:52:18.220 almost no way
00:52:18.820 to fix it.
00:52:20.060 It's just
00:52:20.360 too many of us
00:52:21.100 who are too
00:52:21.500 confused.
00:52:22.620 So I
00:52:25.400 definitely
00:52:25.800 agree with
00:52:27.020 Jason that
00:52:28.200 maybe men
00:52:30.180 could explain
00:52:30.740 this to you
00:52:31.260 and it
00:52:31.440 wouldn't be
00:52:31.660 that complicated.
00:52:34.580 But I
00:52:35.920 don't think
00:52:36.300 this exchange
00:52:37.000 should have
00:52:37.540 turned as
00:52:38.240 savage as it
00:52:39.200 did, because
00:52:39.700 she was really
00:52:40.260 just asking a
00:52:40.960 question.
00:52:42.040 And I do
00:52:42.760 appreciate her.
00:52:44.880 I appreciate
00:52:45.500 the work she
00:52:46.040 does, and so
00:52:46.800 I don't want
00:52:47.360 to sound like
00:52:47.840 I'm dumping
00:52:48.220 on her.
00:52:48.500 But as a
00:52:50.340 representative of
00:52:52.160 men versus
00:52:53.080 women and
00:52:53.760 what blind
00:52:54.660 spots they
00:52:55.140 have, I
00:52:55.720 think this
00:52:56.100 is just a
00:52:56.480 perfect exchange.
00:52:58.960 All right, I
00:52:59.640 would like to
00:53:00.020 add one thing
00:53:00.780 to the analysis
00:53:01.680 of Ukraine
00:53:02.480 versus Russia.
00:53:04.460 I'm sure
00:53:05.220 somebody has
00:53:06.420 already said
00:53:06.920 this, probably
00:53:08.360 lots of people,
00:53:09.280 but I haven't
00:53:10.160 seen it, so
00:53:11.080 I'll say it.
00:53:11.540 When the
00:53:14.200 predictors are
00:53:15.700 telling us how
00:53:16.460 Russia and
00:53:17.520 Ukraine are
00:53:18.080 going to turn
00:53:18.540 out, what I
00:53:20.300 hear the most
00:53:21.000 is you don't
00:53:21.860 understand that
00:53:23.220 Russia can be
00:53:23.940 patient, and
00:53:25.360 they have more
00:53:26.040 resources, and
00:53:27.380 they can just
00:53:27.980 grind on Ukraine
00:53:29.800 until they get
00:53:30.540 what they want,
00:53:31.220 or there's
00:53:31.680 nothing left.
00:53:32.960 So Russia
00:53:34.140 almost can't
00:53:35.020 lose, because
00:53:36.560 they'll just
00:53:37.060 grind away, and
00:53:37.920 they don't have
00:53:38.360 to worry about
00:53:38.800 public opinion,
00:53:39.760 and they'll just
00:53:40.620 keep doing it.
00:53:41.540 But Ukraine
00:53:42.260 might run out
00:53:42.980 of people, and
00:53:44.720 maybe the West
00:53:45.460 will get tired
00:53:46.720 of supporting
00:53:47.240 them.
00:53:47.920 So if you
00:53:48.500 were a smart
00:53:49.100 bettor, say
00:53:50.040 some people, you
00:53:51.280 would bet on
00:53:51.740 the one with
00:53:52.100 the most
00:53:52.380 resources and
00:53:53.340 staying power.
00:53:54.640 I submit that
00:53:55.980 that's the wrong
00:53:56.560 analysis, because
00:53:59.060 I think the
00:54:00.020 critical thing
00:54:00.840 will be collapse
00:54:02.540 points.
00:54:04.400 In other words, both Ukraine and Russia might be, we don't know, they might both be close to collapse, meaning a system collapse, a cascade of things that makes the whole system collapse.
00:54:18.860 Imagine, for example, it's easier to imagine on the Russian side, but on the Russian side, imagine if there's just too many qualified people who get killed to handle logistics, like getting the food and the ammunition.
00:54:33.520 Let's just say that's the only thing that goes wrong, that the Russian army is so degraded in talent, that they have nothing but literally drunks to deliver the food and ammunition, and it just stops getting there.
00:54:48.120 That's the system collapse.
00:54:49.860 They still have plenty of food, and they still have plenty of guns and weapons, but the system collapsed, because it was just one weak point.
00:54:58.360 You only need one weak point, and the whole thing collapses.
00:55:00.400 So, I think both should be looked at as system collapse problems, and a little less, no, that's not mind reading.
00:55:13.120 Where is there mind reading in this?
00:55:16.100 There's no mind reading in this.
00:55:17.940 This is literally describing a system, it's not even people.
00:55:21.720 No people have been discussed yet.
00:55:24.460 So, somebody says that NATO might collapse first.
00:55:30.660 I can see that.
00:55:31.720 I can see that the, let's say, the support for Ukraine could collapse.
00:55:36.820 Imagine if something happened to Zelensky.
00:55:40.100 Game over.
00:55:41.620 What happens if we found out that Zelensky, I'm not suggesting this, but what if we found out he did some huge corrupt thing, or killed some people, right?
00:55:53.060 Suddenly, the public opinion would just completely change, and there'd be a system collapse on the left.
00:55:58.680 So, you have two entities that are both very, very vulnerable to total system collapse if any one peg gets pulled out of the Jenga thing.
00:56:12.000 Now, you can't ignore the fact that one of them might run out of resources to the point where it collapses.
00:56:19.660 That could happen.
00:56:20.540 But there are way more ways it collapses than just who has the most resources that they're applying to it.
00:56:27.660 Would you agree with that?
00:56:29.520 Would you agree that the system collapse thing is a little underappreciated?
00:56:34.400 Because that's the thing you can't see until it happens.
00:56:37.460 It's invisible until it happens.
00:56:39.620 But everybody can see, everybody can see, oh, they used up their missiles, a lot of their military is already dead or injured.
00:56:47.760 So that's stuff we talk about because you can see it.
00:56:50.660 But a potential system collapse would just be one component and you wouldn't even notice it until it was gone.
00:56:57.820 So just keep that in mind.
00:57:00.640 I think you'll see stories in the future about system collapse.
00:57:08.480 Center of gravity analysis.
00:57:10.120 Clausewitz calls it the critical point, culminating points.
00:57:14.300 So surely this is already, you know, military doctrine, right?
00:57:20.180 I mean, it's too obvious not to be part of the military doctrine.
00:57:23.800 Is it a prediction?
00:57:26.980 It's short of a prediction.
00:57:29.400 It's short of a prediction.
00:57:30.940 Because we're definitely in an anything-could-happen situation.
00:57:34.120 And I don't believe, you know, data that comes out of that area.
00:57:37.820 So I think either side could win at this point.
00:57:39.840 What do you say?
00:57:41.600 Would you agree either side could win?
00:57:43.520 It's unpredictable at this point.
00:57:46.420 Yeah.
00:57:46.720 And then you also have to decide what winning looks like, right?
00:57:50.320 Because we disagree what winning looks like.
00:57:53.800 Right.
00:57:56.840 I see Stephen saying Russia is winning for certain.
00:58:00.080 Well, would you say they're winning for certain if their economy is forever shut out of the main economic engine of the world?
00:58:07.480 Would that be winning?
00:58:08.220 I don't know.
00:58:10.300 Well, we'll see.
00:58:13.960 All right.
00:58:14.340 Ladies and gentlemen of YouTube, I'm going to say goodbye for today and I'll see you tomorrow.
00:58:22.080 I'm going to talk to the locals people privately in their secret little platform which you're not invited to.
00:58:28.580 Well, actually, you are invited but you'd have to subscribe.
00:58:31.160 You should also subscribe here.
00:58:32.400 There's a button there.
00:58:33.040 Hit that button.
00:58:34.500 Bye for now.
00:58:34.980 Bye.