Real Coffee with Scott Adams - May 01, 2022


Episode 1730 Scott Adams: The Golden Age Is Upon Us, Trump Was Right Per Chomsky. Wow. What A Show


Episode Stats

Length

1 hour and 12 minutes

Words per Minute

147.68286

Word Count

10,768

Sentence Count

751

Misogynist Sentences

12

Hate Speech Sentences

32


Summary

The Golden Age is upon us, and it's other-level greatness. All of the biggest problems in the world are heading toward a solution at the same time, and we have a plan to solve them all.


Transcript

00:00:00.000 And that concludes the ASMR portion of our program.
00:00:07.280 The rest of it will be such high excitement that if you tried to sleep to this,
00:00:12.120 my God, I pity you.
00:00:14.000 The excitement that you are about to experience,
00:00:16.080 the full body pleasure, the dopamine rush,
00:00:20.300 unbelievable.
00:00:22.380 Usually the show is great, I have to say,
00:00:25.440 but today, it's other-level greatness.
00:00:30.420 The greatest greatness there's ever been.
00:00:31.920 There will be new concepts, new ideas, shocking twists and turns.
00:00:36.720 The sort of thing no one's ever seen before.
00:00:40.220 But can you enjoy this without being properly primed?
00:00:44.700 Of course you can.
00:00:45.720 But you can enjoy it more,
00:00:47.360 and all you need is a copper mug or a glass,
00:00:49.400 a tank or a gel, a canteen jug or a flask,
00:00:51.240 a vessel of any kind,
00:00:53.980 filling with your favorite liquid.
00:00:57.680 Gosh darn it, I like coffee.
00:01:00.480 And join me now for the unparalleled pleasure.
00:01:04.480 It's the dopamine hit of the day.
00:01:08.180 It's the thing that makes everything better,
00:01:11.000 and it'll fix everything in your day.
00:01:14.240 Go.
00:01:14.460 Go.
00:01:18.380 Ah.
00:01:21.360 Did you see that in the news?
00:01:23.980 Was it Ajit Pai
00:01:29.880 who was getting some grief for having a gigantic coffee mug?
00:01:35.920 To which I say,
00:01:37.940 the more gigantic, the better.
00:01:40.400 Well, let's talk about all the news,
00:01:42.140 but before we do that,
00:01:43.740 special hello to Susan.
00:01:47.080 Susan at the Moss Beach Distillery.
00:01:49.100 And you, too, someday could get a special hello.
00:01:55.060 All you have to do is ask for it.
00:02:00.640 Turns out it isn't that hard after all.
00:02:03.220 But hello to Susan.
00:02:04.280 Everybody say hi to Susan.
00:02:06.580 And the golden age is upon us.
00:02:10.900 I don't know if you've noticed.
00:02:11.840 Let me start out with a bang.
00:02:15.980 Have you noticed
00:02:17.060 that
00:02:18.880 all of the biggest problems in the world
00:02:22.820 are heading toward a solution at the same time?
00:02:27.060 It's one of those things you don't notice,
00:02:28.800 because we always talk about the trouble.
00:02:30.980 So you don't notice anything positive.
00:02:33.620 What the hell?
00:02:34.320 Well, apparently something has exploded on me.
00:02:40.240 Okay?
00:02:40.880 I don't know what that is.
00:02:44.020 So here's some examples.
00:02:45.960 So we've got the pandemic is winding down.
00:02:49.440 That's good.
00:02:50.360 And we probably learned a whole bunch
00:02:52.120 about avoiding future pandemics
00:02:53.760 or minimizing them
00:02:54.780 or what to do and what not to do.
00:02:57.180 Wouldn't you say?
00:02:58.800 I would say so.
00:03:00.000 I'd say the whole pandemic thing,
00:03:01.920 as horrible as it has been,
00:03:03.160 and, you know,
00:03:04.000 there'll be lasting negative consequences.
00:03:07.020 But I think as a civilization,
00:03:09.140 we're stronger than we've ever been.
00:03:12.600 Because we know how to take care of the next one
00:03:14.760 way better
00:03:15.560 than we were prepared for this one.
00:03:18.760 How about inflation?
00:03:20.360 Inflation's out of control.
00:03:21.820 Gas prices are up, right?
00:03:23.020 So that's the way the news handles it.
00:03:24.680 It's all true.
00:03:26.280 But there are smart economists saying today,
00:03:30.060 according to the news,
00:03:30.940 that they believe we may have peaked.
00:03:34.180 That we may be at the top of the inflation.
00:03:37.360 And that you would see the,
00:03:38.540 what they call the flexible part
00:03:40.280 of the inflation coming down rapidly
00:03:42.300 in the summer.
00:03:44.100 There's still a sticky part
00:03:45.440 that'll take a while to work out.
00:03:47.420 But it's not the big part.
00:03:49.680 So we could be at the peak inflation
00:03:51.640 right now.
00:03:53.080 So the next month or so,
00:03:55.200 you might see it stabilize
00:03:56.160 or start to go down.
00:03:58.420 How about climate change
00:04:00.600 and, you know,
00:04:02.200 the catastrophe ahead?
00:04:04.220 Well, nuclear power
00:04:06.600 and nuclear energy
00:04:08.360 is going to be the solution to that.
00:04:10.220 And now it's a bipartisan agreement.
00:04:13.980 Democrats and Republicans
00:04:15.220 both say by majority,
00:04:17.640 yes, do nuclear, we need it.
00:04:19.260 We're starting to keep older plants operational.
00:04:24.040 There's a budget for that.
00:04:25.040 Biden's doing that.
00:04:26.620 And there's fusion
00:04:27.740 that's starting to become practical.
00:04:30.100 Maybe in the next 10 years,
00:04:31.280 we'll have a fusion plant.
00:04:34.380 We've got these small Rolls-Royce type,
00:04:36.480 actually Rolls-Royce is a company
00:04:37.980 making actual nuclear reactors.
00:04:40.560 And they're making small factory-made modular units.
00:04:44.020 So they can be sort of approved once
00:04:46.420 and then you can implement them many times.
00:04:48.380 Cost-effectively.
00:04:50.560 Smaller size, already approved,
00:04:53.080 no new engineering,
00:04:54.400 just slap together a bunch of them.
00:04:56.220 So basically the economics
00:04:57.540 and also the risk of handling the waste,
00:05:00.940 these problems are all solved.
00:05:03.220 Or they're right on the border of solved.
00:05:05.060 You know, you've got to work out a few kinks.
00:05:07.520 But they're all so solvable
00:05:09.080 with what we know how to do already,
00:05:12.640 you're really sort of at the worst of it
00:05:14.560 and it looks like it's going to turn
00:05:15.840 in the other direction very quickly.
00:05:17.200 What about free speech
00:05:19.420 and all the fake news?
00:05:21.620 That's the biggest problem in the world,
00:05:24.380 in my opinion.
00:05:25.780 Or maybe the highest priority to fix.
00:05:28.500 Because if you don't get the free speech fixed,
00:05:31.160 then everything else breaks.
00:05:33.480 But enter Elon Musk buying Twitter.
00:05:37.720 Now I'm not going to tell you that I know
00:05:39.360 that having one billionaire in charge
00:05:42.620 of such an important lever on free speech,
00:05:46.700 I'm not going to say that necessarily
00:05:48.660 that's going to be solved.
00:05:51.240 But it sure looks like it.
00:05:52.960 If I had to bet,
00:05:54.480 I would put a very large bet,
00:05:56.560 a very large bet.
00:05:58.520 I would easily bet a million dollars
00:06:00.800 that free speech will look better
00:06:02.980 in a year than it looks now,
00:06:04.280 specifically in the Twitter lever situation.
00:06:09.200 I would bet a million dollars that's true.
00:06:11.660 Would you take the other side of that bet?
00:06:14.100 I mean, anything's possible, right?
00:06:16.380 You know, Elon could turn out to be
00:06:17.940 an evil person who's been hiding it
00:06:20.080 for years or something.
00:06:21.000 But I'd bet a million dollars,
00:06:25.160 and I'd feel comfortable with that bet,
00:06:26.800 that if there were any way to measure it,
00:06:29.220 that we'd be way ahead in free speech.
00:06:31.920 And that, because it's Twitter,
00:06:33.900 and that influences the whole chain of news
00:06:37.740 and everything else,
00:06:39.600 that's heading in exactly the direction
00:06:42.160 I'd want it to head.
00:06:44.060 How about the experts being defanged?
00:06:47.580 Is that good or bad?
00:06:48.520 That our understanding of what experts
00:06:51.000 can and cannot do for us
00:06:52.420 is completely altered.
00:06:54.560 Is that good or bad?
00:06:56.920 It's good.
00:06:58.340 Because we're less susceptible to bullshit.
00:07:02.860 And then you add on top of that
00:07:04.320 that free speech may be coming back,
00:07:06.740 and that's starting to look good.
00:07:08.880 So our ability to communicate instantly
00:07:11.340 across the globe is in pretty good shape.
00:07:14.200 You know, the Internet is just amazing.
00:07:16.920 Everybody can talk to everybody.
00:07:18.200 That was a big thing for the pandemic, especially.
00:07:22.040 But if we're trying to figure out
00:07:24.420 what to do about the next emergency,
00:07:26.940 we really have to understand
00:07:28.520 the limits of our experts
00:07:29.880 and understand what they can and cannot do
00:07:32.420 in the fog of war.
00:07:34.720 And we should also be more forgiving
00:07:36.220 when they get it wrong.
00:07:39.200 So we should be improving in both directions,
00:07:41.780 and I think we will.
00:07:43.120 I think we should be more forgiving
00:07:44.560 in the fog of war,
00:07:45.760 and we should be even, let's say,
00:07:49.360 less forgiving
00:07:50.340 and more skeptical
00:07:52.140 when things are settled down, right?
00:07:55.020 It would be good to go hard
00:07:56.540 in both directions at the same time.
00:07:59.120 More skepticism,
00:08:00.700 but also more forgiveness.
00:08:02.040 I think that's happening.
00:08:06.160 What about, you know,
00:08:07.320 Ukraine and Russia
00:08:08.540 could turn into a nuclear confrontation,
00:08:11.640 but I don't think people expect it.
00:08:14.380 I don't expect it.
00:08:15.880 So I don't think that's going to happen.
00:08:17.620 You have to worry about it.
00:08:18.640 But I think the most likely outcome
00:08:22.260 would be a permanent understanding
00:08:25.640 that you can't attack your neighbor anymore
00:08:27.580 if you're a certain type of country.
00:08:31.540 You know, maybe far less industrialized countries
00:08:35.040 that nobody's paying attention to
00:08:37.080 can do some bad stuff to each other.
00:08:39.780 That'll last for a long time.
00:08:41.140 But in terms of a tank war,
00:08:44.520 I think this is the last tank war.
00:08:48.980 Does anybody disagree?
00:08:51.800 I think this is the last tank war
00:08:54.480 where you invade your neighbors
00:08:56.580 with heavy equipment.
00:08:58.200 I just don't see it happening again after this.
00:09:01.100 It's just so obvious
00:09:02.360 that the defensive weapons
00:09:03.680 are better than the offensive weapons
00:09:05.200 in this situation.
00:09:07.160 Somebody says no.
00:09:08.320 Maybe there'll be better tanks.
00:09:09.440 Who knows?
00:09:10.580 But I think that war,
00:09:13.140 war in terms of the World War II style
00:09:15.880 that we're seeing, unfortunately, again,
00:09:18.220 I think this is the example
00:09:21.160 that will just seal it for everybody.
00:09:22.860 It's like, okay, these tank wars don't work.
00:09:25.760 Because even if Putin gets what he wants,
00:09:28.040 it's not going to look like it was a good idea
00:09:29.900 to everyone else, right?
00:09:32.520 Even if Putin convinces Russia
00:09:34.280 that it was a good idea
00:09:35.240 because he controls their information,
00:09:37.420 nobody else is going to think
00:09:39.340 it was a good idea,
00:09:40.320 even if he gets control of Ukraine.
00:09:42.420 It's just not going to look like a good idea.
00:09:44.540 So that's good.
00:09:46.980 I think the supply chain issues,
00:09:48.780 which are bad
00:09:50.000 and might worsen a little bit
00:09:51.220 in the next month or so,
00:09:52.620 these are the kinds of things
00:09:54.580 that humans fix really well.
00:09:57.200 If there's one thing
00:09:58.200 that you could depend on humans
00:09:59.860 to fix,
00:10:01.340 it's the supply chain.
00:10:02.560 Because there's so many people
00:10:04.460 who have vital interests
00:10:05.820 and it's the most important thing
00:10:07.520 in the world right now, actually.
00:10:09.460 And there's so many ways
00:10:10.620 you can communicate.
00:10:11.740 There's so many ways
00:10:12.440 you can alter transportation,
00:10:14.940 emergency resources, etc.
00:10:17.920 It just seems to me
00:10:19.300 that this is exactly the kind of thing
00:10:21.780 humans are good at.
00:10:22.860 I'm not too confident
00:10:27.980 that the poorest countries
00:10:29.540 are going to come through this okay.
00:10:31.760 Maybe that's going to get pretty dire.
00:10:34.620 But I think that
00:10:35.500 we're going to go through
00:10:37.340 an uncomfortable bump
00:10:38.960 no matter who you are
00:10:40.620 and then it will be better.
00:10:43.180 And it will probably be better forever
00:10:44.500 because we'll figure out
00:10:46.280 where all the weaknesses are
00:10:47.460 and then we'll have a workaround
00:10:49.600 for every weakness
00:10:50.440 in the future.
00:10:51.440 These are really, really big things
00:10:54.820 that look like
00:10:55.640 minor efficiency improvements.
00:10:58.180 But these are the things
00:10:59.000 that make civilization survive.
00:11:01.960 And these go right to
00:11:03.080 survival of humanity.
00:11:06.100 So I think the food issues
00:11:07.360 will be fixed
00:11:07.980 and we'll probably rethink
00:11:09.060 our entire
00:11:09.620 how do you create food,
00:11:11.220 how do you grow it,
00:11:11.900 how do you fertilize it.
00:11:13.220 The fertilizer thing
00:11:14.080 is a big problem, by the way.
00:11:15.840 That was a big, big problem.
00:11:18.040 But
00:11:18.340 I think we'll have time
00:11:20.820 to put enough human ingenuity
00:11:22.460 into it
00:11:22.980 to fix it.
00:11:24.400 Because remember,
00:11:25.040 there was a time
00:11:25.540 when we thought
00:11:26.060 we would run out of food
00:11:27.140 entirely
00:11:27.720 and then somebody
00:11:29.220 invented fertilizer.
00:11:31.960 That was,
00:11:32.620 that plus
00:11:33.380 I assume
00:11:35.760 irrigation methods
00:11:37.000 would be the two things
00:11:38.940 and then pesticides,
00:11:40.000 I guess the third thing.
00:11:41.100 So those are three things
00:11:42.280 that I don't think
00:11:43.060 anybody foresaw
00:11:44.540 the inventions of.
00:11:47.840 That was a sentence
00:11:48.820 I wish I'd bailed out of earlier.
00:11:51.820 Right?
00:11:52.820 So don't you think
00:11:53.580 that there will be
00:11:54.260 new ways to make food,
00:11:56.360 new technologies,
00:11:57.800 you know,
00:11:58.000 just as fertilizer,
00:12:00.440 pesticides,
00:12:00.820 and irrigation techniques
00:12:02.540 could not have been foreseen
00:12:04.200 at one point,
00:12:05.400 there must be things
00:12:06.380 we don't see coming.
00:12:07.220 And I'll bet
00:12:10.040 we'll be surprised
00:12:10.700 on the upside.
00:12:11.580 The only thing
00:12:12.140 that I think
00:12:12.640 is broken
00:12:13.120 with no plan
00:12:14.260 in place to fix it
00:12:15.380 is election credibility.
00:12:19.380 Wouldn't you say?
00:12:20.560 If you're going to look
00:12:21.260 at problems,
00:12:22.520 it's not,
00:12:23.600 you could argue
00:12:24.240 it's not the biggest problem.
00:12:26.440 You could make that argument.
00:12:28.780 But
00:12:29.180 it's the one
00:12:31.680 that doesn't seem
00:12:32.300 to have anybody
00:12:32.880 working on it.
00:12:35.960 Which makes it unique,
00:12:37.220 right?
00:12:38.140 Because the other ones,
00:12:39.020 everybody says,
00:12:39.620 okay,
00:12:39.880 we're trying to fix
00:12:40.580 climate change
00:12:41.860 or we're working hard
00:12:43.380 on the pandemic
00:12:44.020 and everything.
00:12:44.880 Everybody recognizes
00:12:45.960 those as problems.
00:12:47.700 But even though
00:12:48.520 the public
00:12:49.040 has great skepticism
00:12:50.760 about the credibility
00:12:51.640 of the election system,
00:12:53.300 you know,
00:12:53.460 just the ability
00:12:54.120 to give us
00:12:54.620 an accurate count,
00:12:56.420 I don't really see
00:12:57.760 anything happening
00:12:58.420 there, do you?
00:13:00.020 It's the biggest problem
00:13:01.860 that doesn't have
00:13:02.520 anything that looks
00:13:03.400 like a solution
00:13:04.100 percolating in any way.
00:13:07.220 But I think
00:13:08.740 we'll get there too.
00:13:10.300 So I think
00:13:10.820 you could add
00:13:11.240 to this list,
00:13:12.180 we're in this weird,
00:13:13.080 weird place
00:13:13.660 where we definitely
00:13:14.880 had,
00:13:17.160 I mean,
00:13:17.640 civilization
00:13:18.320 just had the shit
00:13:20.060 slapped out of it
00:13:20.940 between the pandemic
00:13:22.740 and now the war
00:13:23.920 and the supply chain
00:13:24.900 and every other
00:13:26.060 damn thing,
00:13:26.640 inflation.
00:13:27.820 And I think
00:13:28.980 we're going to
00:13:29.380 actually handle it.
00:13:30.180 And if we do,
00:13:31.960 we're in really
00:13:32.800 strong shape.
00:13:33.980 So I think
00:13:34.300 that's where
00:13:34.620 we're heading.
00:13:35.360 So that's my
00:13:35.820 positive for the day.
00:13:38.040 The White House
00:13:38.520 correspondents' dinner
00:13:40.160 happened.
00:13:41.380 I can just tell you
00:13:42.480 some of the better jokes.
00:13:44.240 The only thing
00:13:44.940 that's worthwhile
00:13:45.640 from it.
00:13:47.200 You know,
00:13:47.980 when you see
00:13:48.840 the White House
00:13:49.520 correspondence dinner,
00:13:52.060 to me that feels
00:13:53.160 like they're pulling
00:13:53.840 the curtain back
00:13:55.960 and they're admitting
00:13:57.680 to the world,
00:13:58.980 you know this is
00:13:59.600 all theater, right?
00:14:01.360 Like all the things
00:14:02.300 we say to each other,
00:14:03.620 you know we're all
00:14:04.260 lying and acting
00:14:05.220 and then when we
00:14:06.880 do this thing,
00:14:07.460 we'll mock the fact
00:14:08.400 that any of it
00:14:09.660 is serious.
00:14:12.140 So in a way,
00:14:13.520 I would say that
00:14:14.620 the White House
00:14:15.100 correspondents' dinner
00:14:16.200 greatly decreases
00:14:18.380 the credibility
00:14:19.200 of our system.
00:14:21.260 Does anybody
00:14:21.900 feel that?
00:14:23.640 I would be
00:14:24.880 way more comfortable
00:14:25.680 if the correspondents
00:14:27.400 and the government
00:14:28.580 did not get together
00:14:29.860 and make jokes
00:14:30.560 and pretend
00:14:31.900 it was all
00:14:32.520 just an act
00:14:33.540 because that's
00:14:36.080 what we're afraid of.
00:14:38.820 Isn't it?
00:14:40.360 Isn't the thing
00:14:41.240 you're most afraid of
00:14:42.220 that the government
00:14:42.900 and the journalists
00:16:45.420 are colluding?
00:16:47.180 That's literally
00:14:47.180 one of our
00:14:47.800 biggest fears.
00:14:49.440 And so these
00:14:50.820 fucking idiots
00:14:51.560 came up with
00:14:52.340 a festival
00:14:52.980 to celebrate
00:14:54.180 our greatest fear
00:14:55.300 and to put it
00:14:56.260 in our faces
00:14:56.800 with lots
00:14:57.360 of visuals
00:14:57.860 so you can't
00:14:59.020 miss it.
00:15:00.060 Hey, here's
00:15:00.460 your greatest fear
00:15:01.300 that the journalists
00:15:02.720 are all in on it
00:15:03.700 and they're basically
00:15:04.780 just good friends
00:15:05.960 with the elites
00:15:06.640 and when they
00:15:07.720 get together
00:15:08.620 they all laugh
00:15:09.300 about all this
00:15:09.940 bullshit that you
00:15:10.720 think is real.
00:15:13.340 I'm looking at
00:15:14.080 the comments
00:15:14.580 and some of you
00:15:15.340 feel the same way,
00:15:16.140 right?
00:15:16.680 To me,
00:15:17.260 this is a massive
00:15:18.180 mistake
00:15:18.900 and it's one
00:15:20.960 that Trump
00:15:21.980 got right,
00:15:23.700 didn't he?
00:15:25.320 Trump got this
00:15:26.320 completely right
00:15:27.380 by not going.
00:15:30.760 What do I keep
00:15:31.660 telling you?
00:15:32.220 That Trump
00:15:32.740 would look better
00:15:33.640 the longer he's
00:15:34.860 out of office,
00:15:35.920 the better he
00:15:36.760 would look.
00:15:37.940 You just see
00:15:38.500 an example
00:15:39.040 almost every day.
00:15:41.060 You know,
00:15:41.420 here's another one.
00:15:43.200 There's no doubt
00:15:44.120 that as a citizen
00:15:46.360 when I'm watching
00:15:47.380 this,
00:15:48.000 I feel creepy
00:15:49.080 about it.
00:15:49.880 Like,
00:15:50.260 oh,
00:15:50.420 this is creepy.
00:15:51.920 They're supposed
00:15:52.500 to be on different
00:15:53.240 sides.
00:15:54.280 Our whole system
00:15:55.280 depends on them
00:15:56.120 having a proper
00:15:58.200 relationship.
00:16:01.880 And I'm not sure
00:16:02.560 that joking about
00:16:03.500 how serious
00:16:04.200 any of this is
00:16:05.000 is a proper
00:16:05.540 relationship.
00:16:07.100 Is it?
00:16:09.560 Well,
00:16:10.320 yeah,
00:16:10.620 we'll get to
00:16:11.040 Chomsky.
00:16:12.220 So here's
00:16:14.100 some of the
00:16:14.360 good jokes.
00:16:16.340 And these are
00:16:17.100 Biden's jokes.
00:16:18.020 So he has,
00:16:18.360 he has,
00:16:18.800 let me,
00:16:19.720 let me give a
00:16:20.200 shout out to
00:16:20.780 his joke writer.
00:16:22.480 Whoever helped
00:16:23.340 Biden write his
00:16:24.320 jokes,
00:16:25.080 pretty good.
00:16:25.920 Because they
00:16:26.520 weren't,
00:16:26.880 they weren't too
00:16:27.540 edgy.
00:16:28.260 They were sort
00:16:29.100 of just right
00:16:29.700 for the situation.
00:16:30.740 And some of
00:16:31.800 them were pretty
00:16:32.240 good.
00:16:32.620 Here's some.
00:16:33.940 Biden said,
00:16:34.780 Republicans seem
00:16:35.540 to support one
00:16:36.340 fellow,
00:16:37.380 Biden said,
00:16:38.400 some guy named
00:16:39.320 Brandon.
00:16:40.480 He's having a
00:16:41.100 really good year.
00:16:41.780 And I'm kind
00:16:42.420 of happy for
00:16:43.020 him.
00:16:44.480 That's a good
00:16:45.200 joke.
00:16:46.220 All right,
00:16:46.500 here's another
00:16:46.900 one.
00:16:48.960 He thanked
00:16:49.740 the 42% who
00:16:50.920 actually applauded
00:16:51.820 as he took
00:16:52.260 the microphone.
00:16:53.100 So self-deprecating.
00:16:55.020 But he self-deprecated
00:16:56.180 in a clever way.
00:16:57.720 I think he picked
00:16:58.600 the best poll
00:17:00.140 number of his
00:17:01.080 bad polls.
00:17:02.660 Because I think
00:17:03.420 it's lower than
00:17:04.080 42%
00:17:05.480 favorability,
00:17:06.440 isn't it?
00:17:06.900 Depending on
00:17:07.960 which poll.
00:17:11.220 So in a clever
00:17:12.720 way, he
00:17:13.200 self-deprecates.
00:17:14.660 But he actually
00:17:15.240 gave himself a
00:17:16.000 promotion, even
00:17:17.360 in the self-deprecation,
00:17:18.700 I think.
00:17:19.660 Yeah, I think it's
00:17:20.240 as low as 39%
00:17:21.600 elsewhere.
00:17:23.960 What is
00:17:24.460 Rasmussen?
00:17:25.980 I didn't want to
00:17:26.940 guess.
00:17:28.780 But, all right.
00:17:30.640 Here's another one.
00:17:31.540 He jabbed
00:17:34.760 at the press
00:17:35.400 saying that
00:17:36.220 they're the only
00:17:36.960 group of Americans
00:17:37.820 with a lower
00:17:38.440 approval rating
00:17:39.220 than his own.
00:17:42.600 You know,
00:17:43.340 good sort of
00:17:44.540 harmless joke.
00:17:45.840 But again,
00:17:46.660 it's alarming.
00:17:48.600 It's alarming
00:17:49.500 to see them
00:17:50.680 on the same team.
00:17:52.280 Hey, we're on
00:17:53.080 the same team.
00:17:53.700 Everybody hates us.
00:17:54.980 Let's get together.
00:17:56.640 All right.
00:17:57.020 But this is
00:17:57.640 his best joke.
00:17:59.360 So Biden
00:18:00.220 pointed out
00:18:00.820 that he's the
00:18:01.360 first sitting
00:18:01.860 president since
00:18:02.580 2016 to attend
00:18:03.920 this event.
00:18:05.840 And he said,
00:18:05.840 quote,
00:18:06.160 it's understandable
00:18:07.100 we had a horrible
00:18:08.320 plague followed
00:18:09.600 by two years
00:18:10.280 of COVID.
00:18:13.460 Nice.
00:18:15.400 Nice.
00:18:16.700 So the horrible
00:18:17.380 plague here being
00:18:18.220 Trump, if you
00:18:18.920 weren't following
00:18:19.500 the math.
00:18:22.340 Well done.
00:18:24.080 Well done.
00:18:26.640 All right.
00:18:27.240 And now here's
00:18:32.240 the one.
00:18:32.520 And he said,
00:18:34.400 this is also
00:18:35.080 something Biden
00:18:35.700 said.
00:18:36.460 He said it
00:18:37.060 would have been
00:18:37.440 a real coup
00:18:38.700 had Trump
00:18:39.540 attended this
00:18:40.260 year.
00:18:41.320 That would have
00:18:41.920 been a real coup.
00:18:43.580 He's actually
00:18:44.380 joking with the
00:18:45.580 press about
00:18:47.000 Trump organizing
00:18:49.360 a coup.
00:18:51.140 Now,
00:18:51.920 help me out
00:18:53.920 here.
00:18:54.160 January 6th
00:18:56.940 was either
00:18:57.640 an insurrection
00:18:58.560 and a coup
00:18:59.160 and one of
00:18:59.660 the worst
00:19:00.000 things that
00:19:00.460 ever happened
00:19:00.960 to the
00:19:01.240 republic.
00:19:02.480 That's what
00:19:03.060 Biden is
00:19:03.740 saying,
00:19:04.100 right?
00:19:05.120 Or,
00:19:06.040 or it's a
00:19:07.260 joke.
00:19:08.840 It's not
00:19:09.560 both,
00:19:10.920 you motherfuckers.
00:19:12.400 It's not.
00:19:13.800 Pick one.
00:19:15.160 It's either
00:19:15.540 the worst
00:19:16.000 thing that's
00:19:16.380 happened to
00:19:16.740 the republic,
00:19:17.440 that's what
00:19:17.840 all your
00:19:18.220 little toadies
00:19:18.880 are telling
00:19:19.260 us,
00:19:19.540 or it's
00:19:20.920 just a
00:19:21.400 fucking joke.
00:19:23.640 He treated
00:19:24.400 that as a
00:19:24.920 joke.
00:19:26.140 I'm going to
00:19:26.800 take his
00:19:27.160 leadership on
00:19:27.820 that.
00:19:28.800 Joe Biden,
00:19:29.520 I take your
00:19:30.000 leadership.
00:19:30.840 Now,
00:19:31.100 I get that
00:19:32.060 you're at an
00:19:32.540 event where
00:19:33.700 you tell
00:19:34.020 jokes,
00:19:35.120 but where's
00:19:35.780 his joke
00:19:36.240 about the
00:19:37.040 Holocaust?
00:19:38.200 Like,
00:19:38.500 where's the
00:19:38.780 Holocaust joke?
00:19:40.320 It's not
00:19:41.000 there,
00:19:41.320 right?
00:19:42.020 Do you know
00:19:42.360 why it's not
00:19:42.840 there?
00:19:43.960 Because the
00:19:44.560 Holocaust isn't
00:19:45.200 a fucking
00:19:45.580 joke.
00:19:46.580 That's why.
00:19:48.300 You know
00:19:48.560 what is a
00:19:48.920 fucking joke?
00:19:50.180 January 6th.
00:19:51.860 He just
00:19:52.220 told you
00:19:52.540 it's a
00:19:52.840 joke.
00:19:54.120 He joked
00:19:55.180 right in
00:19:55.620 front of
00:19:55.900 you while
00:19:56.720 the
00:19:56.880 investigation
00:19:57.360 is ongoing.
00:19:59.940 I don't
00:20:00.560 think you
00:20:00.920 can ignore
00:20:01.400 this.
00:20:02.300 Seriously.
00:20:03.100 See,
00:20:03.360 this is
00:20:03.600 why Trump
00:20:05.420 is smart
00:20:05.880 not to
00:20:06.260 attend.
00:20:07.320 This is
00:20:08.660 inappropriate.
00:20:10.560 Now,
00:20:11.140 I'm not
00:20:11.440 the person
00:20:11.800 who's going
00:20:12.120 to say
00:20:12.320 jokes are
00:20:12.880 inappropriate
00:20:13.480 as an
00:20:14.860 art form.
00:20:16.120 So I'm
00:20:16.420 not talking
00:20:16.780 about the
00:20:17.080 art form.
00:20:18.220 I'm talking
00:20:18.720 about the
00:20:19.160 president
00:20:19.660 treating
00:20:21.260 our current
00:20:22.000 biggest problem
00:20:22.860 according to
00:20:23.640 him and
00:20:24.600 his people
00:20:25.160 as a joke.
00:20:27.200 Pick one.
00:20:28.780 Pick one.
00:20:29.940 It's either
00:20:30.180 always a joke
00:20:30.820 or it's a
00:20:31.920 big problem.
00:20:32.380 Yes, I
00:20:35.800 told you
00:20:36.100 these
00:20:36.480 analysts at
00:20:37.620 UBS
00:20:38.080 think that
00:20:38.800 inflation
00:20:39.500 might have
00:20:39.900 peaked.
00:20:41.080 So it's
00:20:42.380 coming from
00:20:42.880 smart people
00:20:43.520 too, not
00:20:44.020 just from
00:20:44.480 me jabbering.
00:20:46.480 All right,
00:20:47.000 here's a
00:20:47.960 scary thing.
00:20:50.860 If you
00:20:51.540 were,
00:20:52.720 let's say,
00:20:53.600 hypothetically,
00:20:55.080 and I'm
00:20:55.580 going to
00:20:55.940 frame this
00:20:57.020 by saying
00:20:57.540 I'm
00:20:58.540 unaware of
00:20:59.380 any proof
00:21:00.660 that the
00:21:01.440 2020
00:21:01.940 election
00:21:02.440 was
00:21:03.360 illegitimate.
00:21:06.920 I'm
00:21:07.220 unaware of
00:21:08.220 any proof
00:21:08.680 of that.
00:21:09.800 I'm also
00:21:10.420 unaware of
00:21:11.720 any way you
00:21:12.240 could prove
00:21:12.760 it was
00:21:13.060 legitimate.
00:21:14.320 I don't
00:21:14.980 know that
00:21:15.280 anything could
00:21:15.800 be proved.
00:21:16.700 Since
00:21:17.080 nothing is
00:21:17.660 fully
00:21:17.960 auditable,
00:21:19.400 I would
00:21:19.660 say the
00:21:19.980 only thing
00:21:20.380 that we
00:21:20.800 citizens
00:21:21.840 can say
00:21:22.500 is that we
00:21:23.620 either accepted
00:21:24.460 it or we
00:21:25.140 didn't.
00:21:25.400 It turns
00:21:26.780 out that's
00:21:27.180 the only
00:21:27.400 thing we
00:21:27.700 can say.
00:21:28.820 I, from
00:21:29.780 the very
00:21:30.140 beginning,
00:21:30.820 accepted the
00:21:31.500 result.
00:21:32.920 And the
00:21:33.280 reason I
00:21:33.680 accepted it
00:21:34.320 is because
00:21:34.980 I knew you
00:21:36.140 couldn't
00:21:36.440 check, but
00:21:38.060 you can't
00:21:38.520 throw out
00:21:38.800 the system
00:21:39.400 if the
00:21:39.920 system is
00:21:40.400 the only
00:21:40.620 thing that
00:21:40.940 has a
00:21:41.160 chance of
00:21:41.560 ever fixing
00:21:42.100 this.
00:21:42.980 Like, you
00:21:43.360 want to
00:21:43.600 keep enough
00:21:44.060 of a
00:21:44.340 system alive,
00:21:45.800 you say,
00:21:46.260 okay, maybe
00:21:47.080 someday it'll
00:21:47.740 elect somebody
00:21:48.260 who can
00:21:48.480 look into
00:21:48.880 this and
00:21:49.840 maybe fix
00:21:50.540 it.
00:21:51.740 So, I
00:21:53.560 have no way
00:21:53.960 to know that
00:21:54.400 the 2020
00:21:54.900 election
00:21:55.260 was either
00:21:55.800 rigged or
00:21:56.240 not rigged.
00:21:56.960 That
00:21:57.060 information is
00:21:57.740 forever
00:21:58.180 unavailable to
00:21:59.900 us.
00:22:01.700 But,
00:22:03.060 hypothetically,
00:22:04.620 this is just
00:22:06.160 speculation.
00:22:07.660 Suppose you
00:22:08.600 were on a
00:22:09.140 team, a
00:22:09.820 political team,
00:22:11.220 that expected
00:22:12.580 to rig the
00:22:13.360 next election.
00:22:14.980 I'm not
00:22:15.380 saying they
00:22:15.780 are.
00:22:16.400 How would I
00:22:16.820 know that?
00:22:17.700 I can't read
00:22:18.260 anybody's mind.
00:22:19.780 But suppose
00:22:20.280 that happened.
00:22:20.980 It's just a
00:22:21.440 mental experiment.
00:22:23.040 What would
00:22:23.520 you be doing
00:22:24.380 around now
00:22:26.320 to make sure
00:22:27.840 that worked
00:22:28.340 out well
00:22:28.840 later?
00:22:29.460 Well, you'd
00:22:29.860 be preparing
00:22:30.400 your methods,
00:22:31.440 like how are
00:22:31.960 you going to
00:22:32.280 cheat and
00:22:32.780 get away with
00:22:33.380 it?
00:22:33.800 Again,
00:22:34.340 hypothetically.
00:22:35.260 I'm not
00:22:35.640 accusing anybody
00:22:36.360 of anything.
00:22:36.800 But the
00:22:38.700 most important
00:22:39.300 thing you'd
00:22:39.820 need to do
00:22:40.360 about now,
00:22:42.080 as the
00:22:42.560 election is
00:22:43.060 approaching,
00:22:44.120 is to create
00:22:44.980 some kind of
00:22:45.540 a narrative
00:22:46.000 that would
00:22:48.000 explain why
00:22:49.180 an election
00:22:49.680 result could be
00:22:50.620 so different
00:22:51.300 from what
00:22:52.500 the polls
00:22:53.120 and all
00:22:53.680 common sense
00:22:54.320 and observations
00:22:55.060 suggest they
00:22:55.820 should.
00:22:57.020 Because right
00:22:57.600 now history
00:22:58.080 is telling us
00:22:58.720 that the
00:22:59.180 Republicans will
00:22:59.960 sweep the
00:23:00.480 midterms.
00:23:02.040 And the
00:23:02.540 signal for
00:23:03.060 that is so
00:23:03.660 strong,
00:23:05.040 we would be
00:23:05.700 kind of amazed
00:23:06.520 if it didn't
00:23:07.020 happen,
00:23:07.460 wouldn't we?
00:23:08.660 Even Democrats
00:23:09.440 would be surprised
00:23:10.200 at this point.
00:23:12.000 So if somebody
00:23:13.820 planned to
00:23:14.440 rig an election,
00:23:15.180 and again,
00:23:15.540 there's no
00:23:15.840 evidence of this
00:23:16.460 whatsoever.
00:23:17.560 This is just
00:23:18.180 a mental
00:23:18.520 experiment.
00:23:20.280 If somebody
00:23:21.020 planned to
00:23:21.500 do it,
00:23:22.180 it would be
00:23:22.580 very important
00:23:23.900 that they
00:23:24.480 could seed
00:23:25.080 the public
00:23:25.680 with a
00:23:26.600 narrative that
00:23:27.280 could explain
00:23:28.080 an upcoming
00:23:29.500 inexplicable
00:23:32.240 thing.
00:23:34.160 And so
00:23:34.740 today,
00:23:35.320 we notice
00:23:35.900 with some
00:23:36.560 interest,
00:23:37.520 that Harry
00:23:37.900 Enten,
00:23:38.420 who's an
00:23:38.760 opinion guy
00:23:39.400 who writes
00:23:39.760 for CNN,
00:23:40.980 writes on
00:23:41.400 the CNN
00:23:41.800 website,
00:23:43.180 he describes
00:23:44.020 how Democrats
00:23:45.460 could win
00:23:46.120 the midterms.
00:23:48.180 Interesting.
00:23:50.320 So even
00:23:50.960 though in
00:23:52.040 the article
00:23:52.440 he confesses
00:23:53.360 that every
00:23:54.560 signal says
00:23:55.500 that the
00:23:56.080 Republicans
00:23:56.620 are just
00:23:57.100 going to
00:23:57.460 wipe the
00:23:58.100 Democrats
00:23:58.580 out in
00:23:59.140 the midterms,
00:24:00.100 every signal
00:24:00.740 says it,
00:24:01.180 historical
00:24:01.540 signal,
00:24:02.640 as well as
00:24:03.300 current ones.
00:24:04.480 The polls
00:24:04.920 show that
00:24:05.400 clearly.
00:24:08.460 But here's
00:24:09.340 how Harry
00:24:09.840 Enten says
00:24:10.560 that Democrats
00:24:11.360 could inexplicably,
00:24:13.700 by surprise,
00:24:15.360 nobody saw it
00:24:16.200 coming,
00:24:16.820 but after the
00:24:17.380 fact,
00:24:17.740 you could say,
00:24:18.080 well,
00:24:18.300 Harry Enten
00:24:18.780 saw it
00:24:19.120 coming.
00:24:20.240 And here
00:24:20.520 are the
00:24:20.740 three things
00:24:21.340 it's all
00:24:21.780 it would
00:24:21.980 take.
00:24:23.040 Number
00:24:23.240 one,
00:24:23.820 Harry
00:24:24.240 explains,
00:24:25.320 bad
00:24:25.760 Republican
00:24:26.300 candidates.
00:24:28.520 Well,
00:24:29.020 that's a
00:24:29.260 pretty good
00:24:29.580 comment.
00:24:30.640 If all
00:24:31.020 of the,
00:24:32.060 or if
00:24:32.600 most of
00:24:33.340 the
00:24:33.540 Republicans
00:24:34.020 running
00:24:34.400 for office
00:24:34.960 were just
00:24:35.940 terrible
00:24:36.280 candidates,
00:24:38.140 well,
00:24:38.520 that would
00:24:38.840 change things,
00:24:39.580 wouldn't it?
00:24:40.340 But what
00:24:40.740 are the odds
00:24:41.500 that there
00:24:41.820 would be
00:24:42.100 terrible
00:24:42.500 candidates,
00:24:43.200 like worse
00:24:44.460 than normal?
00:24:45.100 What would
00:24:47.400 be the
00:24:47.720 argument that
00:24:48.280 they would
00:24:48.560 be worse
00:24:49.160 than average,
00:24:50.640 like every
00:24:51.180 other election
00:24:51.960 and every
00:24:52.320 other time?
00:24:53.420 I don't
00:24:53.980 think there's
00:24:54.340 an argument
00:24:54.680 for that,
00:24:55.100 is there?
00:24:56.700 But do
00:24:58.200 you know
00:24:58.460 this has
00:24:58.840 an interesting
00:24:59.460 quality to
00:25:00.260 it?
00:25:01.220 Bad
00:25:01.680 Republican
00:25:02.200 candidates.
00:25:02.880 It's kind
00:25:03.160 of subjective,
00:25:03.860 isn't it?
00:25:05.120 Isn't that
00:25:05.540 interesting?
00:25:05.880 There are
00:25:06.700 three things
00:25:07.300 that would
00:25:07.560 change this
00:25:08.500 election from
00:25:09.320 a predictable
00:25:10.860 Republican
00:25:11.520 victory to
00:25:15.040 a surprising
00:25:15.800 result,
00:25:16.300 and one
00:25:16.520 of them
00:25:16.780 is subjective,
00:25:18.180 that the
00:25:19.000 Republican
00:25:19.460 candidates were
00:25:20.400 bad.
00:25:21.520 They were
00:25:21.700 bad.
00:25:22.700 They didn't
00:25:23.020 do enough
00:25:23.380 work.
00:25:24.400 They didn't
00:25:24.720 campaign right.
00:25:26.040 They didn't
00:25:26.340 have good
00:25:26.880 campaign
00:25:28.940 organization.
00:25:31.180 Isn't it
00:25:31.560 interesting that
00:25:32.260 one of the
00:25:32.800 three things
00:25:33.320 that would
00:25:33.600 explain a
00:25:34.420 surprise
00:25:35.140 election
00:25:35.640 result would
00:25:37.140 be something
00:25:37.580 that you
00:25:37.940 could always
00:25:38.460 say was
00:25:38.880 true and
00:25:39.320 nobody really
00:25:39.860 could prove
00:25:40.360 it?
00:25:41.340 Well,
00:25:41.840 here you
00:25:42.160 go.
00:25:42.880 All those
00:25:43.360 Republicans
00:25:43.860 lost,
00:25:44.560 therefore,
00:25:46.020 it's proof
00:25:46.600 they were
00:25:46.940 bad
00:25:47.240 candidates.
00:25:48.040 Wait a
00:25:48.280 minute,
00:25:48.380 wait a
00:25:48.580 minute.
00:25:49.800 The fact
00:25:50.540 that they
00:25:50.820 lose is
00:25:51.480 proof that
00:25:52.580 they were
00:25:52.800 bad
00:25:53.060 candidates?
00:25:54.860 Or were
00:25:55.680 the bad
00:25:56.120 candidates
00:25:56.660 what caused
00:25:57.400 them to
00:25:57.760 lose?
00:25:58.960 Wait.
00:26:00.600 You see
00:26:01.140 how interesting
00:26:01.680 this thing
00:26:03.160 is?
00:26:03.620 It's a
00:26:04.040 setup that
00:26:04.600 they can
00:26:04.860 just say,
00:26:05.260 well,
00:26:05.380 they ran
00:26:05.740 bad
00:26:06.040 candidates.
00:26:07.140 Nobody
00:26:07.380 can prove
00:26:07.800 they didn't.
00:26:08.780 Nobody
00:26:09.020 can prove
00:26:09.460 they did.
00:26:10.140 It's
00:26:10.380 purely
00:26:10.680 subjective.
00:26:11.840 What's
00:26:12.100 the second
00:26:12.440 one?
00:26:13.400 The second
00:26:13.880 one is
00:26:14.220 the economy
00:26:14.820 improves.
00:26:16.820 What are
00:26:17.400 the odds
00:26:17.960 that by
00:26:20.180 election day
00:26:20.780 there won't
00:26:21.220 be noticeable
00:26:21.900 improvement in
00:26:22.600 the economy?
00:26:24.520 I mean,
00:26:24.880 we could
00:26:25.200 slide into
00:26:25.720 a recession.
00:26:26.440 It could
00:26:26.780 get worse.
00:26:28.200 But if
00:26:29.680 it gets
00:26:29.980 worse,
00:26:31.080 then even
00:26:31.660 cheating isn't
00:26:32.320 going to
00:26:32.520 work.
00:26:33.800 Right?
00:26:34.240 If the
00:26:34.840 economy is
00:26:35.400 actually
00:26:35.680 worse than
00:26:36.900 right now
00:26:37.480 by election
00:26:38.520 day,
00:26:40.220 then there's
00:26:41.940 no Democrat
00:26:42.660 who has a
00:26:45.500 close race
00:26:46.180 who's going
00:26:46.500 to be too
00:26:46.800 happy about
00:26:47.240 that.
00:26:49.240 But it's
00:26:50.180 a good bet
00:26:50.820 that at least
00:26:51.900 on some
00:26:52.480 measures,
00:26:53.080 such as I
00:26:54.220 imagine the
00:26:56.200 inflation rate
00:26:56.900 will come down
00:26:57.600 by then.
00:26:58.380 That's a good
00:26:58.940 bet.
00:26:59.260 You don't
00:26:59.940 know,
00:27:01.160 but it's
00:27:01.520 a smart
00:27:01.900 bet.
00:27:02.980 So let's
00:27:03.580 say a few
00:27:04.100 of the
00:27:04.480 economic
00:27:04.960 indicators
00:27:05.520 improve.
00:27:06.340 Not all
00:27:06.960 of them,
00:27:07.940 but let's
00:27:08.300 just say a
00:27:09.040 few good
00:27:09.380 ones improve.
00:27:10.000 Let's say
00:27:10.260 the GDP
00:27:10.720 is no
00:27:11.180 longer
00:27:11.440 negative.
00:27:11.920 How hard
00:27:12.240 would that
00:27:12.520 be?
00:27:13.640 How hard
00:27:14.140 would it be
00:27:14.420 to go from
00:27:14.860 our first
00:27:15.560 negative
00:27:16.700 GDP to
00:27:18.480 a little
00:27:19.200 bit positive?
00:27:20.240 Let's say
00:27:20.560 up 1%.
00:27:21.280 Probably
00:27:22.200 not that
00:27:22.600 hard.
00:27:24.060 Right?
00:27:24.680 Supply
00:27:25.140 chain gets
00:27:25.600 worked out.
00:27:26.160 Ukraine
00:27:28.920 starts
00:27:29.400 negotiating
00:27:29.920 with
00:27:30.220 Russia.
00:27:32.100 All of
00:27:32.480 a sudden
00:27:32.800 inflation
00:27:33.620 goes down
00:27:34.400 25%.
00:27:35.340 So even
00:27:38.360 if the
00:27:38.720 economy is
00:27:39.260 still bad
00:27:39.960 and even
00:27:40.440 if Biden
00:27:41.560 and the
00:27:41.960 Democrats
00:27:42.840 are the
00:27:43.260 cause,
00:27:44.200 wouldn't
00:27:44.460 this give
00:27:45.060 Democrats
00:27:46.540 an argument
00:27:47.080 that the
00:27:47.580 economy
00:27:47.940 improved?
00:27:49.120 Well,
00:27:49.460 we told
00:27:49.800 you if
00:27:50.040 the economy
00:27:50.460 improved,
00:27:51.060 the Democrats
00:27:51.520 would do
00:27:51.800 better than
00:27:52.180 you thought.
00:27:53.200 Right?
00:27:53.760 So now
00:27:54.280 you have
00:27:54.580 two subjective
00:27:55.620 things.
00:27:59.080 Bad
00:27:59.600 Republican
00:28:00.040 candidates,
00:28:00.900 nobody would
00:28:01.500 agree what
00:28:01.940 that looks
00:28:02.320 like,
00:28:02.940 and economy
00:28:03.580 improves,
00:28:04.500 which is
00:28:04.960 almost
00:28:05.260 guaranteed
00:28:05.820 just by
00:28:06.660 sitting around
00:28:07.260 and waiting.
00:28:09.800 Now
00:28:10.020 those are
00:28:11.420 two pretty
00:28:12.100 weak indicators,
00:28:13.320 aren't they?
00:28:14.580 And then
00:28:15.000 the third
00:28:15.340 one is
00:28:15.700 Democrats
00:28:16.380 basically have
00:28:17.620 to turn
00:28:17.960 out and
00:28:18.360 vote for
00:28:18.740 Biden.
00:28:19.720 So they
00:28:20.040 have to
00:28:20.260 give a
00:28:20.520 good
00:28:20.720 turnout.
00:28:22.300 What do
00:28:22.760 you think
00:28:22.960 would happen
00:28:23.400 if an
00:28:23.820 unusual
00:28:24.280 number of
00:28:25.180 Democrats
00:28:26.240 voted?
00:28:27.640 Like so
00:28:28.280 many
00:28:28.520 Democrats
00:28:29.000 voted that
00:28:29.680 by historical
00:28:30.700 standards you
00:28:32.480 say to
00:28:32.760 yourself,
00:28:33.740 huh,
00:28:34.320 this doesn't
00:28:35.020 even look
00:28:35.380 real.
00:28:36.760 If I
00:28:37.600 didn't know
00:28:37.940 better,
00:28:38.240 I'd think
00:28:38.520 this was
00:28:38.900 rigged.
00:28:39.460 This number
00:28:39.900 is so
00:28:40.180 big.
00:28:41.560 They're
00:28:42.000 literally
00:28:42.480 telling you
00:28:44.020 their game
00:28:44.440 plan right
00:28:44.920 here.
00:28:45.720 It feels
00:28:46.260 like they
00:28:46.600 just mapped
00:28:47.140 out exactly
00:28:48.020 what they're
00:28:48.460 going to
00:28:48.660 do.
00:28:49.600 There's
00:28:50.040 going to
00:28:50.220 be a
00:28:50.540 rigged
00:28:50.800 election.
00:28:51.520 Again,
00:28:52.020 I don't
00:28:52.320 know this.
00:28:52.820 This is
00:28:53.940 just a
00:28:54.280 mental
00:28:54.500 experiment,
00:28:55.180 right?
00:28:55.540 I don't
00:28:55.920 have any
00:28:56.220 evidence of
00:28:56.820 this.
00:28:58.500 All we
00:28:59.080 have is
00:28:59.400 foreshadowing.
00:29:00.860 Is
00:29:01.320 foreshadowing
00:29:02.040 evidence?
00:29:03.720 Remember,
00:29:04.280 evidence isn't
00:29:05.080 proof.
00:29:07.080 Evidence is
00:29:07.700 just something
00:29:08.180 that maybe
00:29:08.880 collectively could
00:29:09.820 get you to
00:29:10.240 a proof.
00:29:11.780 But
00:29:12.220 foreshadowing is
00:29:13.500 like weaker
00:29:13.920 than that.
00:29:14.860 It's just
00:29:15.220 something that
00:29:15.640 you say,
00:29:16.000 oh,
00:29:16.400 pattern
00:29:16.760 recognition.
00:29:17.880 Pattern
00:29:18.440 recognition has
00:29:19.560 kicked in.
00:29:20.560 I feel
00:29:21.620 like I've
00:29:22.060 seen this
00:29:22.440 pattern before.
00:29:23.420 That's all
00:29:23.740 it is.
00:29:24.420 So it's
00:29:24.800 not evidence.
00:29:25.980 It's
00:29:26.220 definitely
00:29:26.440 not proof.
00:29:28.660 But there's
00:29:29.420 definitely some
00:29:30.080 foreshadowing
00:29:30.680 here.
00:29:31.580 So let me
00:29:32.020 say the
00:29:32.320 one thing
00:29:32.640 that I
00:29:32.840 can say
00:29:33.120 for sure.
00:29:33.920 This is
00:29:34.560 foreshadowing
00:29:35.320 like crazy.
00:29:37.180 This
00:29:37.620 foreshadowing
00:29:38.340 is setting
00:29:38.860 up a
00:29:39.160 narrative
00:29:39.460 that if
00:29:40.520 there's
00:29:40.760 an
00:29:41.040 unexpected
00:29:41.700 Democrat
00:29:42.200 win,
00:29:43.480 and they
00:29:43.820 maintain,
00:29:45.140 you know,
00:29:45.520 let's say,
00:29:45.980 or they
00:29:46.220 somehow they
00:29:47.460 own both
00:29:47.940 sides of
00:29:48.360 Congress,
00:29:49.520 people are
00:29:50.180 going to
00:29:50.340 say,
00:29:50.560 look,
00:29:50.820 Harry
00:29:51.020 Enten
00:29:51.280 already
00:29:51.600 told you
00:29:51.980 how this
00:29:52.280 could happen.
00:29:53.280 It was
00:29:53.520 easy.
00:29:54.960 Republican
00:29:55.400 candidates
00:29:55.840 were crazy.
00:29:57.360 They were
00:29:57.580 basically all
00:29:58.260 Marjorie Taylor
00:29:58.900 Greenes,
00:29:59.380 that's what
00:29:59.720 they're going
00:29:59.960 to say.
00:30:00.760 Didn't have
00:30:01.340 any good
00:30:01.640 candidates.
00:30:02.260 They were
00:30:02.440 all Trumpers
00:30:03.140 or whatever
00:30:03.480 they were
00:30:03.700 going to
00:30:03.920 say.
00:30:04.780 The economy
00:30:05.460 improved,
00:30:06.120 of course,
00:30:06.680 in some
00:30:07.340 ways it
00:30:07.700 will,
00:30:07.940 they'll make
00:30:08.260 that argument.
00:30:09.660 And then
00:30:09.980 they'll say,
00:30:10.520 yeah,
00:30:10.860 and the
00:30:11.980 Democrats
00:30:12.380 were really
00:30:12.900 enthusiastic,
00:30:13.780 got that
00:30:14.160 turnout out
00:30:14.720 there.
00:30:15.080 Man,
00:30:15.320 they got
00:30:15.560 some
00:30:15.780 turnout.
00:30:17.420 This is
00:30:18.020 pretty scary
00:30:18.600 shit.
00:30:18.940 Every once
00:30:22.720 in a while
00:30:23.040 you see a
00:30:23.520 topic that's
00:30:24.500 a political
00:30:24.940 topic in
00:30:26.440 which all
00:30:27.140 the people
00:30:27.500 on one
00:30:27.880 side are
00:30:28.240 the smart
00:30:28.680 people,
00:30:29.400 regardless
00:30:30.060 of their
00:30:30.540 political
00:30:30.960 affiliations,
00:30:32.080 and all
00:30:32.900 the people
00:30:33.220 on the
00:30:33.460 other side
00:30:33.820 are dumb
00:30:34.140 people.
00:30:35.360 Now,
00:30:35.520 you don't
00:30:35.700 see that
00:30:36.060 often.
00:30:36.840 Usually you
00:30:37.320 see people
00:30:37.720 just support
00:30:39.100 their side.
00:30:40.580 But this
00:30:41.460 whole free
00:30:42.500 speech thing
00:30:44.260 is actually
00:30:45.020 very interesting
00:30:46.060 because all
00:30:47.400 the smart
00:30:47.920 people are
00:30:48.640 on the
00:30:48.860 same
00:30:49.040 side.
00:30:50.260 And I'll
00:30:50.620 give you
00:30:50.820 an example.
00:30:51.400 Bill Maher,
00:30:52.300 he wasn't
00:30:53.180 too familiar
00:30:53.720 with the
00:30:54.080 Babylon Bee,
00:30:55.540 the satirical
00:30:56.600 site that
00:30:57.080 got banned
00:30:57.480 from Twitter,
00:30:58.860 but having
00:31:00.520 read up on
00:31:01.180 it,
00:31:01.320 he supports
00:31:01.940 the fact
00:31:02.600 that they
00:31:02.900 shouldn't
00:31:03.100 have been
00:31:03.340 banned for
00:31:03.860 jokes,
00:31:05.040 basically.
00:31:06.000 So Bill
00:31:06.400 Maher is
00:31:07.680 on the same
00:31:08.140 side as
00:31:08.800 the darkest,
00:31:10.800 deepest
00:31:11.100 Republican on
00:31:12.580 this issue.
00:31:13.880 But also
00:31:14.700 all the smart
00:31:15.380 people are
00:31:15.800 on this
00:31:16.100 side.
00:31:16.320 I don't
00:31:18.320 think you
00:31:19.760 could find
00:31:20.200 a smart
00:31:20.700 Republican
00:31:21.280 or a
00:31:21.960 smart
00:31:22.200 Democrat
00:31:22.640 who would
00:31:23.940 think that
00:31:24.300 the Babylon
00:31:24.740 Bee should
00:31:25.340 have been
00:31:25.620 banned from
00:31:26.800 Twitter.
00:31:27.940 I don't
00:31:28.620 think you
00:31:28.900 could find
00:31:29.240 one.
00:31:29.980 It's a
00:31:30.360 weird issue
00:31:30.920 that literally
00:31:32.820 just intelligence
00:31:33.680 is the
00:31:34.300 dividing line
00:31:35.000 and not
00:31:35.480 party
00:31:36.080 affiliation.
00:31:37.360 Because at
00:31:38.100 a certain
00:31:38.400 level of
00:31:38.800 intelligence,
00:31:39.800 you know
00:31:40.220 that whatever
00:31:40.680 the Babylon
00:31:41.220 Bee did
00:31:41.900 had minimal
00:31:43.320 to no
00:31:43.780 effect on
00:31:44.300 anybody.
00:31:45.720 But any
00:31:46.660 kind of
00:31:47.520 punishment to
00:31:48.280 free speech,
00:31:49.600 even by a
00:31:50.740 private entity
00:31:51.660 in this
00:31:52.060 special case,
00:31:53.680 is a
00:31:55.220 bad
00:31:55.640 precedent.
00:31:57.540 Pretty much
00:31:57.960 all the smart
00:31:58.500 people agree
00:31:59.020 with that.
00:32:01.340 Yeah.
00:32:02.660 All right,
00:32:03.140 but here's the
00:32:03.520 most surprising
00:32:04.080 story of the
00:32:04.700 day.
00:32:05.720 And I would
00:32:06.380 like you to
00:32:06.780 savor this
00:32:07.300 one,
00:32:07.540 or to
00:32:09.120 possibly puke
00:32:10.600 in your
00:32:10.880 mouth on
00:32:11.280 it,
00:32:11.480 depending on
00:32:11.940 your political
00:32:12.480 leaning.
00:32:14.100 Do you all
00:32:14.600 know Noam
00:32:15.700 Chomsky?
00:32:16.640 He's a very
00:32:17.620 famous intellectual
00:32:18.540 and he's
00:32:19.680 famous for
00:32:20.140 many things,
00:32:20.920 including being
00:32:21.560 so left,
00:32:22.860 he's like the
00:32:23.540 leftiest of
00:32:24.280 the left,
00:32:24.860 wouldn't you
00:32:25.180 say?
00:32:26.340 Right,
00:32:28.380 he's a
00:32:28.880 linguist,
00:32:29.420 but he talks
00:32:29.880 about politics
00:32:31.440 and he's
00:32:32.040 famous for
00:32:32.580 that.
00:32:34.240 So would
00:32:35.180 you agree
00:32:35.600 that he's
00:32:36.500 as left
00:32:37.020 as the
00:32:37.840 lefties
00:32:38.560 could be
00:32:39.120 left?
00:32:41.240 Now here's
00:32:42.040 the surprising
00:32:42.540 part.
00:32:46.420 So Noam
00:32:47.260 Chomsky was
00:32:47.840 being interviewed,
00:32:49.260 he was
00:32:49.560 talking about
00:32:50.100 the Ukraine
00:32:50.780 Russia
00:32:51.240 situation,
00:32:52.840 and this
00:32:53.240 was last
00:32:53.700 week,
00:32:54.140 no this
00:32:54.480 week in
00:32:54.820 an interview
00:32:55.100 he said,
00:32:56.200 quote,
00:32:56.820 fortunately
00:32:57.340 there is
00:32:58.720 one western
00:32:59.600 statesman
00:33:00.840 of stature
00:33:01.620 who is
00:33:02.480 pushing for
00:33:03.080 a diplomatic
00:33:03.560 solution to
00:33:04.340 the war
00:33:04.740 in Ukraine
00:33:05.260 rather than
00:33:06.380 looking for
00:33:06.900 ways to
00:33:07.460 prolong it.
00:33:10.280 Who's
00:33:10.800 that?
00:33:12.520 So who
00:33:13.060 is this
00:33:13.680 one western
00:33:14.720 statesman of
00:33:15.480 stature who
00:33:16.620 is pushing
00:33:17.040 for a
00:33:17.420 diplomatic
00:33:17.760 solution to
00:33:18.560 the war
00:33:18.860 in Ukraine
00:33:19.340 rather than
00:33:20.660 looking for
00:33:21.060 ways to
00:33:21.500 fuel and
00:33:21.940 prolong it?
00:33:23.060 And then
00:33:23.300 later Chomsky
00:33:24.180 explained what
00:33:25.080 that idea
00:33:25.660 was,
00:33:26.140 specifically
00:33:26.700 the idea
00:33:28.020 was to
00:33:28.380 have a
00:33:28.700 non-NATO
00:33:29.560 accommodation.
00:33:31.600 And accommodation
00:33:32.100 is an
00:33:32.600 interesting word,
00:33:33.320 it really
00:33:34.000 works in
00:33:34.540 this context.
00:33:36.080 And what
00:33:36.880 he suggested
00:33:37.460 was,
00:33:38.560 and he
00:33:39.280 was saying,
00:33:39.860 you know,
00:33:40.100 this one
00:33:40.760 statement,
00:33:41.400 the statesman
00:33:42.000 didn't say
00:33:42.400 this directly
00:33:42.960 but it's
00:33:43.380 obvious this
00:33:43.940 is where
00:33:44.220 he was
00:33:44.460 going,
00:33:45.680 that one
00:33:47.340 thing we
00:33:48.020 could do
00:33:48.340 to end
00:33:48.620 the war,
00:33:49.380 negotiation
00:33:49.880 wise,
00:33:50.680 is to say
00:33:51.460 to Putin,
00:33:51.940 you know
00:33:52.180 what,
00:33:52.940 we did
00:33:53.420 promise you
00:33:53.980 that NATO
00:33:54.620 would not
00:33:55.120 expand toward
00:33:56.260 Russia and
00:33:57.280 we were
00:33:58.780 breaking that
00:33:59.440 promise.
00:34:00.560 So how
00:34:01.140 about we
00:34:01.520 go back
00:34:01.900 to the
00:34:02.160 promise to
00:34:03.320 not expand
00:34:03.980 NATO and
00:34:05.140 we'll form
00:34:05.680 this other
00:34:06.180 alliance that's
00:34:07.280 not NATO.
00:34:08.480 And it
00:34:08.920 would be open
00:34:09.380 not just to
00:34:10.020 Europeans and
00:34:10.820 Americans,
00:34:11.620 it would be
00:34:11.900 open to
00:34:12.160 anybody.
00:34:12.980 And it
00:34:13.180 would be
00:34:13.420 sort of a
00:34:13.920 mutual
00:34:14.460 protecting
00:34:15.420 each other
00:34:16.000 agreement that
00:34:19.000 wouldn't be
00:34:19.440 NATO.
00:34:20.420 It wouldn't
00:34:20.840 be anti-Russia,
00:34:21.800 it wouldn't
00:34:21.960 be anti-
00:34:22.400 anybody.
00:34:22.980 It would
00:34:23.140 just be
00:34:23.560 everybody,
00:34:24.900 let's look
00:34:25.440 out for each
00:34:25.940 other's backs,
00:34:27.100 if somebody
00:34:27.560 attacks you,
00:34:28.440 we're your
00:34:28.760 friend,
00:34:29.780 but not
00:34:30.180 too specific
00:34:30.920 maybe.
00:34:32.060 And he
00:34:32.280 said that
00:34:32.740 even Japan
00:34:34.040 could join
00:34:34.660 it and
00:34:35.920 even Russia.
00:34:37.600 Russia could
00:34:38.160 actually join
00:34:38.760 this hypothetical
00:34:39.920 organization because
00:34:41.500 they too would
00:34:43.160 probably like to
00:34:43.800 have friends if
00:34:44.780 they were
00:34:45.020 attacked.
00:34:47.260 And so
00:34:48.060 Chomsky is
00:34:48.900 saying that
00:34:49.280 there is a
00:34:50.060 negotiated way
00:34:50.920 through this and
00:34:51.540 when he describes
00:34:52.220 it I say to
00:34:52.700 myself that
00:34:53.880 feels like that
00:34:54.480 could work because
00:34:55.700 that would give
00:34:56.140 Putin a win
00:34:56.820 because he could
00:34:58.320 say I stopped
00:34:59.060 NATO.
00:35:00.740 And he'd
00:35:01.560 probably
00:35:01.780 get some
00:35:02.160 territorial
00:35:02.780 consolidation,
00:35:05.260 at least
00:35:06.320 Crimea,
00:35:07.140 right?
00:35:07.800 So he'd
00:35:08.560 give something
00:35:09.000 and he could
00:35:09.900 say I
00:35:10.240 stopped
00:35:10.620 NATO.
00:35:12.980 At the
00:35:13.480 same time,
00:35:15.060 countries that
00:35:15.720 want the
00:35:16.100 protection of
00:35:16.740 NATO but
00:35:17.260 don't want to
00:35:17.880 be as
00:35:18.640 provocative as
00:35:19.880 joining NATO
00:35:20.600 could say,
00:35:21.660 well, we're
00:35:22.440 not going to
00:35:22.820 join NATO
00:35:23.420 because we
00:35:24.260 understand how
00:35:25.020 that feels to
00:35:25.780 Russia and
00:35:26.360 you guys are
00:35:26.900 prickly about
00:35:27.580 that.
00:35:27.880 But we'll
00:35:29.180 have this
00:35:29.520 other friendly
00:35:30.160 organization that's
00:35:31.280 not nearly as
00:35:32.320 provocative as
00:35:33.420 NATO and it'll
00:35:34.960 just make
00:35:35.360 everybody happy
00:35:35.940 because we're
00:35:36.460 friends.
00:35:38.480 Now, that
00:35:39.140 would make it
00:35:39.860 look like
00:35:40.300 everybody could
00:35:40.980 win.
00:35:41.800 You could argue
00:35:42.760 who really
00:35:43.920 wins, but it
00:35:44.460 would give
00:35:44.760 everybody an
00:35:45.420 argument that
00:35:46.740 they got
00:35:47.040 something.
00:35:48.760 And we're
00:35:49.480 getting back to
00:35:50.040 the statesman.
00:35:50.640 So, who would be the one statesman that Noam Chomsky, the leftiest of the left people in the world, the guiding light of their intellectual life, would say is smart enough, really of stature, with a way to negotiate this thing successfully?
00:36:11.020 Well, Chomsky
00:36:11.600 said his name is
00:36:12.600 Donald J.
00:36:13.640 Trump.
00:36:16.060 I will pause
00:36:17.240 for a moment.
00:36:17.820 This really
00:36:20.360 happened.
00:36:23.260 This actually
00:36:24.380 happened.
00:36:25.620 Noam Chomsky
00:36:26.400 gave a full-throated
00:36:28.560 endorsement to
00:36:30.160 Donald Trump's
00:36:31.100 approach and
00:36:32.780 called out the
00:36:33.420 fact that nobody
00:36:34.080 else was in his
00:36:34.940 league.
00:36:38.520 Do you know why Noam Chomsky supported Donald Trump, when you would imagine the two of them would be as far apart as they could possibly be in political ideology?
00:36:48.580 Do you know
00:36:48.880 why he
00:36:49.140 supported him?
00:36:51.580 Because Noam
00:36:52.260 Chomsky didn't
00:36:53.040 want to lie.
00:36:55.000 That's my
00:36:55.540 guess.
00:36:56.020 I mean, I
00:36:56.440 can't read his
00:36:56.920 mind.
00:36:58.660 I feel like he
00:36:59.480 just didn't
00:36:59.840 want to lie.
00:37:01.720 Maybe he
00:37:02.280 always feels
00:37:02.720 like that.
00:37:03.240 You know, I'm
00:37:03.480 not saying that
00:37:04.060 he lies in any
00:37:05.140 other context.
00:37:06.220 It could be
00:37:06.780 that Noam
00:37:07.200 Chomsky is a
00:37:08.060 legitimate
00:37:08.460 intellectual.
00:37:10.260 And by
00:37:10.520 legitimate, I
00:37:11.340 mean that he
00:37:11.820 says what he
00:37:12.320 actually thinks
00:37:12.840 is real.
00:37:14.240 That would be
00:37:14.800 legitimate.
00:37:15.420 Even if you
00:37:15.900 disagree with
00:37:16.540 his opinions,
00:37:17.620 it's very
00:37:18.220 legitimate to
00:37:19.720 be that
00:37:20.100 educated, that
00:37:20.980 smart, and
00:37:22.340 be honest.
00:37:23.580 Like, you'd
00:37:24.000 have to respect
00:37:25.120 that, right?
00:37:26.900 So even if you
00:37:27.720 disagree with
00:37:28.260 him.
00:37:29.340 But I'm going
00:37:31.400 to introduce a
00:37:32.040 new concept
00:37:32.540 today.
00:37:33.500 We always talk
00:37:34.280 about free
00:37:34.880 speech.
00:37:35.360 I don't think
00:37:37.860 free speech is
00:37:38.720 exactly the way
00:37:39.900 to talk about
00:37:41.320 it.
00:37:41.580 Because in our
00:37:42.400 world where
00:37:43.080 commerce and
00:37:43.940 freedom are
00:37:44.680 connected, like
00:37:46.780 you can say
00:37:47.320 something terrible
00:37:48.140 but it's going
00:37:48.740 to hurt you
00:37:49.100 economically.
00:37:51.000 So yeah,
00:37:51.800 technically that's
00:37:52.700 free speech,
00:37:53.400 right?
00:37:54.000 Because the
00:37:54.520 government didn't
00:37:55.220 necessarily do it
00:37:56.020 to you.
00:37:56.880 But you're not
00:37:57.720 really free if
00:37:59.280 you can't say
00:37:59.840 something without
00:38:00.460 economic repercussions.
00:38:01.780 But I'm going to
00:38:02.940 tell you that there's
00:38:03.600 a new thing that is
00:38:04.460 better than free
00:38:05.140 speech and it's
00:38:05.800 called affordable
00:38:07.140 speech.
00:38:07.960 It's not free,
00:38:09.900 but is it
00:38:10.680 affordable?
00:38:11.120 And there are
00:38:12.780 two categories of
00:38:13.560 people at least
00:38:14.320 who have what I
00:38:15.740 call affordable
00:38:16.740 speech.
00:38:18.080 Number one,
00:38:19.420 me.
00:38:20.420 People who have
00:38:21.180 money can afford
00:38:23.260 free speech.
00:38:24.580 So when I say I
00:38:25.420 have fuck you
00:38:26.060 money, what that
00:38:27.460 means is I can
00:38:28.420 say my opinion
00:38:29.520 as honestly as I
00:38:30.940 want and even
00:38:31.440 dishonestly if I
00:38:32.320 want because I
00:38:33.220 have the ability to
00:38:34.180 say anything I
00:38:34.800 want and then if
00:38:36.120 I'm damaged
00:38:36.960 economically, I
00:38:38.400 say, oh fuck
00:38:39.720 you, I'm going to
00:38:40.140 retire anyway.
00:38:41.120 I have enough
00:38:41.560 money.
00:38:42.620 So I don't have
00:38:43.540 free speech because
00:38:45.140 I will get whacked
00:38:46.040 economically if I
00:38:47.120 cross the line,
00:38:47.900 right?
00:38:48.340 I already have.
00:38:49.120 I probably lost
00:38:49.940 one third of my
00:38:51.080 potential income
00:38:52.060 for the last 10
00:38:52.740 years based on my
00:38:54.560 blogging and I
00:38:56.180 knew it, right?
00:38:57.200 I was completely
00:38:58.320 aware that the
00:39:00.100 things I was saying
00:39:00.920 would degrade my
00:39:01.860 income substantially,
00:39:03.100 but I could afford
00:39:04.220 it.
00:39:05.540 So I had free
00:39:06.580 speech.
00:39:07.340 You didn't.
00:39:08.300 You absolutely
00:39:09.040 did not unless
00:39:10.180 you were also
00:39:10.640 rich.
00:39:11.940 But I had free
00:39:13.020 speech.
00:39:13.960 It cost me a lot
00:39:15.060 of money, but I
00:39:15.640 could afford it.
00:39:16.900 There's another
00:39:17.540 category of free
00:39:18.340 speech.
00:39:19.720 Old as fuck.
00:39:22.340 When you're old
00:39:23.160 as fuck, you
00:39:24.360 don't care.
00:39:26.060 Noam Chomsky is
00:39:26.920 old as fuck.
00:39:28.340 Now, I don't know
00:39:29.080 what, how old is
00:39:29.920 he?
00:39:30.540 I mean, he looks
00:39:31.040 like he doesn't
00:39:31.760 have long, honestly.
00:39:34.820 Give me an age
00:39:35.620 on him.
00:39:36.500 It's 145 or
00:39:38.160 something.
00:39:39.100 He's super old.
00:39:41.000 But I think Noam
00:39:41.800 Chomsky, he doesn't
00:39:42.600 give a fuck.
00:39:44.880 Noam Chomsky, he's
00:39:46.020 93, somebody says.
00:39:47.960 Noam Chomsky can
00:39:49.420 afford his speech.
00:39:52.120 Because if you give
00:39:52.880 Noam Chomsky the
00:39:53.780 bill, he'll say,
00:39:54.940 yeah, give me 30
00:39:55.620 days, I'll get right
00:39:56.300 back to you on that.
00:39:57.020 He'll probably be
00:39:57.500 dead anyway.
00:39:58.460 Okay, I'm
00:39:58.860 exaggerating.
00:39:59.540 But you get the
00:39:59.980 point, right?
00:40:01.240 You don't give a
00:40:02.040 fuck.
00:40:03.200 Noam Chomsky can
00:40:04.060 say anything he
00:40:04.680 wants.
00:40:05.540 So he can offend
00:40:06.500 every person
00:40:07.200 who ever loved
00:40:07.800 him.
00:40:09.800 Maybe he just
00:40:10.600 did.
00:40:11.760 I don't know.
00:40:13.340 But Noam has
00:40:15.520 affordable speech,
00:40:17.140 and I have
00:40:17.680 affordable speech.
00:40:20.500 Elon Musk has
00:40:21.660 affordable speech.
00:40:23.820 Elon Musk talked
00:40:24.920 down his own
00:40:25.660 stock price.
00:40:27.620 Do you remember
00:40:28.000 that?
00:40:30.280 At one point in the past, he said his own stock was too high, and then it dropped. And he said a bunch of other things that, you know, he ended up getting sued for.
00:40:41.460 So he got sued
00:40:42.200 for, I don't
00:40:42.700 know, tens of
00:40:43.220 millions of
00:40:43.620 dollars by
00:40:45.320 stockholders for
00:40:46.940 saying something
00:40:47.920 about taking it
00:40:48.620 private, and it
00:40:49.440 affected the stock
00:40:50.240 price, whatever.
00:40:51.780 But could he
00:40:52.640 afford it?
00:40:53.920 He could.
00:40:55.160 It turns out,
00:40:56.340 even though it
00:40:57.080 might cost tens of
00:40:58.020 millions, hundreds of
00:40:58.880 millions of dollars,
00:41:00.460 turns out he could
00:41:01.160 afford it.
00:41:02.080 So he had affordable speech, and he
00:41:04.900 might try to give
00:41:05.660 you free speech, or
00:41:07.780 more affordable
00:41:08.380 speech, through
00:41:09.160 Twitter.
00:41:11.540 All right, here's
00:41:12.680 another foreshadowing.
00:41:15.140 Nancy Pelosi
00:41:15.980 visited Ukraine, so
00:41:17.200 now she's the
00:41:17.700 highest-ranking
00:41:18.320 person to go over
00:41:19.000 there.
00:41:19.720 Clearly, we're
00:41:20.660 sending lots and
00:41:21.520 lots of signals to
00:41:23.500 the world and
00:41:24.620 Ukraine by sending
00:41:26.240 such high-level
00:41:27.740 politicians over there
00:41:29.940 in the middle of a
00:41:31.100 war.
00:41:31.320 And that's a really
00:41:32.380 strong signal.
00:41:33.440 And I will give the
00:41:34.580 Biden administration
00:41:35.380 credit for the
00:41:38.400 communication there.
00:41:39.640 They are trying to
00:41:40.680 send that signal, and
00:41:42.180 they're succeeding.
00:41:43.380 They're doing a real good job of sending a we're-backing-Ukraine signal.
00:41:47.840 But watch the
00:41:48.620 language.
00:41:49.200 Here's a little trick
00:41:50.040 that I learned in the
00:41:51.080 hypnosis class.
00:41:53.700 Hypnotists learn, well, depending on who instructs you, you might not learn this.
00:41:57.700 They learn that you
00:41:58.900 can fairly accurately
00:42:00.740 identify what people
00:42:02.060 are thinking or
00:42:02.920 planning or secretly
00:42:04.060 cogitating by the
00:42:07.760 words that they
00:42:08.540 choose.
00:42:09.960 So when people speak,
00:42:11.080 they're not thinking
00:42:11.880 about every word before
00:42:12.920 it comes out their
00:42:13.560 mouth.
00:42:14.460 We speak sort of in
00:42:15.680 flow, right?
00:42:17.480 My language is coming
00:42:18.840 out now almost like my
00:42:20.200 brain is doing
00:42:20.800 something slightly
00:42:21.720 different because my
00:42:23.080 brain is like a little
00:42:24.000 bit ahead of where I'm
00:42:25.140 talking.
00:42:25.580 So literally right
00:42:27.380 now, I'm modeling
00:42:29.040 this, my mouth is
00:42:30.280 talking, but I'm not
00:42:31.080 even super conscious
00:42:32.300 of the mental part of
00:42:33.680 it.
00:42:34.520 It's just so automatic
00:42:35.720 because I kind of know
00:42:36.960 what I'm going to talk
00:42:37.580 about.
00:42:38.760 So in that context,
00:42:41.780 people will use word
00:42:43.780 choices that reveal
00:42:45.960 what they're really
00:42:46.740 thinking because they're
00:42:48.220 not thinking too hard
00:42:49.220 about filtering it.
00:42:50.760 Where this doesn't work is when people have a speech that's written and a lot of people look at it. A lot of people look at your political speech and go, ooh, were you aware that this word makes me think this? And somebody says, oh, I didn't know that, and they'll take that word out.
00:43:06.540 So if you've got a
00:43:07.520 nice scrubbed piece
00:43:09.000 of work, then this
00:43:10.180 doesn't work.
00:43:11.540 But with anything like a tweet, where somebody is not putting a lot of work into it before sending it out, some words can slip out that were unintended.
00:43:21.280 I feel like this is
00:43:22.560 happening with this
00:43:23.760 Ukraine situation.
00:43:24.660 One tweet that Nancy Pelosi sent ended with this: America stands firmly with Ukraine.
00:43:32.640 Now that would be
00:43:33.640 compatible with
00:43:34.500 everything that's been
00:43:35.540 said so far, right?
00:43:37.520 That the United States
00:43:38.460 stands firmly with
00:43:39.500 Ukraine.
00:43:40.420 Now that is a
00:43:42.540 scrubbed statement.
00:43:44.540 That's one that people
00:43:45.720 could look at and say,
00:43:47.020 okay, that is right on
00:43:48.740 point and it doesn't
00:43:49.660 make people think the
00:43:50.660 wrong thing.
00:43:51.480 It doesn't mislead in
00:43:52.460 any way.
00:43:52.800 It's not ambiguous.
00:43:53.420 But America stands
00:43:56.580 with Ukraine, leaves
00:43:57.580 open a possibility,
00:43:58.580 doesn't it?
00:43:59.900 And that possibility is
00:44:01.300 that Ukraine could
00:44:02.080 lose and we're with
00:44:05.040 them.
00:44:06.260 But we're not really
00:44:07.200 talking about victory.
00:44:08.600 We're just sort of
00:44:09.460 with you.
00:44:10.860 It's like being with
00:44:12.200 somebody as they're
00:44:12.880 dying.
00:44:13.980 We're with you.
00:44:16.240 You're going to go
00:44:17.020 through something bad,
00:44:17.940 but we're with you.
00:44:18.620 Now that was one
00:44:21.200 thing she said.
00:44:22.160 But here's another
00:44:22.760 thing she said.
00:44:25.280 This is a Pelosi
00:44:26.240 tweet.
00:44:27.380 It's around the same
00:44:28.240 time too, so there's
00:44:29.520 not much time
00:44:30.540 difference between
00:44:31.180 these tweets.
00:44:32.280 She says, our
00:44:32.780 congressional delegation
00:44:33.920 was honored to meet
00:44:34.880 with Zelensky in Kiev
00:44:36.100 to salute his
00:44:37.420 leadership and
00:44:38.080 courage.
00:44:38.500 So, so far this is
00:44:39.340 all like blah, blah,
00:44:40.180 blah, you know,
00:44:41.060 political talk, right?
00:44:42.100 To commend the
00:44:42.940 Ukrainian people for
00:44:44.240 their outstanding
00:44:44.920 defense of democracy
00:44:46.060 and to say that
00:44:47.320 we are with you
00:44:48.080 until victory is
00:44:49.220 won.
00:44:50.020 What?
00:44:51.960 Until victory is
00:44:53.060 won.
00:44:54.600 What the hell is
00:44:55.400 victory?
00:44:56.920 What would that
00:44:57.600 look like?
00:45:01.080 Victory is a
00:45:03.260 slip, I think.
00:45:06.140 So based on
00:45:06.900 hypnosis training,
00:45:08.820 the word victory
00:45:09.580 looks like a
00:45:10.200 mistake.
00:45:11.580 Like, like they
00:45:12.420 should have edited
00:45:13.080 that word out of
00:45:13.860 there.
00:45:14.340 What they should
00:45:15.080 have said is what
00:45:15.620 the other tweet
00:45:16.200 said.
00:45:17.480 America stands
00:45:18.280 firmly with
00:45:18.940 Ukraine.
00:45:20.640 Right?
00:45:21.820 Who put that
00:45:22.600 word victory in
00:45:23.440 there?
00:45:24.480 I don't think
00:45:25.180 that was
00:45:25.480 intentional.
00:45:27.060 Or, it's
00:45:28.640 foreshadowing.
00:45:30.500 They may be preparing the country, and they could be doing this intentionally.
00:45:38.440 Remember I told
00:45:39.240 you that this is a
00:45:41.080 tipping point war, and
00:45:44.740 that whoever
00:45:45.400 pushes the
00:45:46.520 other one in
00:45:47.100 any one of
00:45:47.640 these variables
00:45:48.340 just over that
00:45:49.400 tipping point
00:45:50.040 would be what
00:45:51.620 looks like the
00:45:52.200 winner.
00:45:53.400 And the tipping points are running out of fuel, running out of ammo, the good kind, wherever you need it the most, and running out of food. Let's say food, fuel, and ammo.
00:46:05.760 It's those three
00:46:06.420 things.
00:46:07.580 And both sides
00:46:08.620 are trying to
00:46:09.220 deny the other
00:46:10.060 those three
00:46:10.600 things, because
00:46:11.760 that would be
00:46:12.260 enough for, you
00:46:13.920 know, surrender.
00:46:15.600 And they're all
00:46:18.220 at the tipping
00:46:18.720 point, or we
00:46:20.720 imagine that to
00:46:21.420 be true, in the
00:46:22.640 fog of war and
00:46:23.460 what we can see
00:46:24.080 from the outside.
00:46:25.100 Any one of those
00:46:25.820 things could be a
00:46:26.560 tipping point, but
00:46:27.840 in either direction.
00:46:29.900 So if you're
00:46:30.700 looking at, like,
00:46:31.440 who has gained
00:46:32.100 what territory, or
00:46:33.360 who repelled who
00:46:34.780 from what places,
00:46:35.780 I don't know
00:46:37.000 that that's
00:46:37.400 telling you
00:46:37.820 anything.
00:46:39.080 Those are not
00:46:39.640 the important
00:46:40.080 variables, because
00:46:41.540 this is a who
00:46:42.260 can last longer
00:46:43.240 war, and that's
00:46:44.340 going to be a
00:46:44.720 tipping point
00:46:45.260 question.
00:46:46.020 It's not going
00:46:46.540 to be who had
00:46:47.680 so much progress
00:46:48.580 so far.
00:46:50.040 Because it could
00:46:50.600 be that the team
00:46:51.920 that pushed the
00:46:52.660 furthest used up
00:46:54.100 the most fuel, and
00:46:56.020 put their supply
00:46:56.980 lines in the most
00:46:57.720 jeopardy.
00:46:58.840 So the thing
00:46:59.660 you're looking at
00:47:00.180 could be exactly
00:47:00.820 the opposite of
00:47:01.540 what matters.
00:47:02.220 You know, who
00:47:02.480 got a battleground
00:47:03.960 victory?
00:47:04.460 It might not
00:47:04.880 matter.
00:47:05.780 In a war of
00:47:06.480 attrition.
00:47:07.480 Because it'll
00:47:08.140 just be victories
00:47:09.280 back and forth
00:47:10.040 until somebody
00:47:10.620 runs out of
00:47:11.120 one of the
00:47:11.460 big three.
00:47:12.380 And then,
00:47:13.520 somebody has
00:47:14.060 to surrender.
00:47:15.320 If you have
00:47:16.040 no fuel, no
00:47:16.760 ammo, or no
00:47:17.740 food, it's
00:47:19.000 over.
00:47:20.580 So, I don't
00:47:22.400 think that we
00:47:22.920 can assume that
00:47:25.180 the United States
00:47:26.040 has given up on
00:47:26.920 the idea that
00:47:27.480 Ukraine would
00:47:28.040 have an
00:47:28.340 outright victory.
00:47:29.620 I believe that what is being signaled here, either intentionally or unintentionally, with the word victory, which I think was a mistake, is telling you that privately, at the highest levels of the United States, they are talking about winning the war outright.
00:47:45.540 That's what
00:47:46.200 $33 billion
00:47:47.260 heading toward
00:47:48.220 Ukraine means.
00:47:49.260 It doesn't mean
00:47:50.160 we stand with
00:47:50.860 Ukraine.
00:47:51.260 Do you know
00:47:52.580 what stand with
00:47:53.320 Ukraine money
00:47:53.960 looks like?
00:47:55.480 $3 billion.
00:47:57.460 That's what that
00:47:58.200 looks like.
00:47:58.640 If you said
00:47:59.920 we stand with
00:48:00.560 Ukraine and
00:48:01.300 send $3 billion,
00:48:02.240 everybody would
00:48:02.740 say, yeah, that's
00:48:03.520 about right.
00:48:04.020 You stand with
00:48:04.660 them, you send
00:48:05.080 $3 billion.
00:48:06.080 That's a lot of
00:48:06.640 money.
00:48:07.820 If you say we'll
00:48:08.820 be with you until
00:48:09.580 victory, and you
00:48:11.120 send $33 billion,
00:48:12.760 that tells me that
00:48:13.900 privately, you
00:48:14.700 think there's a
00:48:15.260 way to win this
00:48:15.880 thing outright.
00:48:17.440 And the only way
00:48:18.100 you'd say that is
00:48:18.880 if the generals are
00:48:19.760 telling you that.
00:48:21.500 Am I right?
00:48:23.020 Politicians would
00:48:23.740 not come to an
00:48:24.780 independent opinion
00:48:25.620 that looked anything
00:48:26.440 like that.
00:48:27.080 Because they
00:48:28.240 wouldn't know.
00:48:29.340 But if their advisors are saying, you know, all of the TV generals are telling you that Russia is going to win, but let them keep saying that, because that's how you get the money.
00:48:40.740 And that's how
00:48:41.160 you get the
00:48:41.620 support, right?
00:48:42.800 You want to
00:48:43.200 keep painting
00:48:44.540 Zelensky as the
00:48:46.360 underdog hero.
00:48:48.220 Keep painting
00:48:49.100 Russia as the
00:48:50.180 bully who's
00:48:50.900 winning.
00:48:51.940 And that's our
00:48:52.480 best situation for
00:48:53.600 getting money and
00:48:54.400 resources to Ukraine.
00:48:55.900 But secretly,
00:48:57.080 we think the
00:48:59.280 Russians are so
00:48:59.940 close to
00:49:00.440 collapse that
00:49:02.360 we're going to
00:49:02.960 push them until
00:49:03.900 they do.
00:49:04.960 I think that the
00:49:06.140 Biden administration
00:49:06.760 is trying to
00:49:07.460 destroy the
00:49:08.020 Russian army and
00:49:09.160 take Putin out
00:49:09.780 of power.
00:49:11.800 That's what it
00:49:12.440 looks like.
00:49:13.400 And I think that
00:49:14.320 they think they
00:49:14.900 can do it.
00:49:16.140 And you know
00:49:16.600 what?
00:49:18.160 I don't know
00:49:19.020 that they can't.
00:49:21.980 Especially if there's somebody on the inside who knows what they're talking about, and that wouldn't be me, telling them that, you know, these tipping points are really close, and I think the Russians are a little closer to the tipping point than the Ukrainians.
00:49:37.100 And that would
00:49:37.580 make sense.
00:49:38.320 Because the
00:49:38.880 Ukrainians could
00:49:39.540 always be
00:49:40.140 backstopped
00:49:41.000 somewhat
00:49:42.180 infinitely.
00:49:44.280 Right?
00:49:44.760 They could run
00:49:45.420 out of food for
00:49:46.400 a week, which
00:49:47.580 would be
00:49:47.840 devastating.
00:49:49.220 But I'll bet
00:49:49.880 they'd get their
00:49:50.380 food in a week.
00:49:50.960 I don't know
00:49:51.480 if the Russians
00:49:51.980 would necessarily
00:49:52.940 keep their
00:49:54.260 fighting cohesion
00:49:55.160 as an invading
00:49:56.780 force if they
00:49:59.280 ran into any
00:49:59.920 one of those
00:50:00.360 things or were
00:50:01.180 as low as
00:50:02.280 the Ukrainians
00:50:03.100 could be.
00:50:03.680 So I just
00:50:04.080 think the
00:50:04.360 Ukrainians could
00:50:05.000 last it out
00:50:05.700 because they
00:50:06.080 have to.
00:50:07.140 The Ukrainians
00:50:07.760 can't go home
00:50:08.540 because they're
00:50:09.900 home.
00:50:11.320 Right?
00:50:12.060 So I feel
00:50:13.620 like the war
00:50:16.160 has changed
00:50:17.000 and that the
00:50:17.800 administration
00:50:18.400 believes that
00:50:20.020 they can just
00:50:20.500 have an outright
00:50:21.120 victory.
00:50:23.360 How would that
00:50:24.100 change the
00:50:24.560 midterms?
00:50:26.240 Didn't see that
00:50:26.860 coming, did you?
00:50:28.280 What if Biden
00:50:29.200 actually took
00:50:29.820 down Putin?
00:50:33.980 What does that
00:50:34.780 do to the
00:50:35.160 election?
00:50:36.260 Because that's
00:50:37.020 totally within
00:50:37.680 the realm of
00:50:38.240 possibility.
00:50:39.540 Before the
00:50:40.200 election.
00:50:41.480 Like if Putin
00:50:42.160 makes it through
00:50:42.760 the summer,
00:50:44.260 then he's
00:50:44.580 probably fine.
00:50:46.020 But it's
00:50:46.640 going to be
00:50:46.900 kind of a
00:50:47.240 dicey summer
00:50:48.160 for Putin,
00:50:48.760 I would
00:50:48.960 think.
00:50:51.180 It's going
00:50:51.720 to be a
00:50:51.980 bad summer
00:50:52.860 for his
00:50:53.320 food taster,
00:50:53.940 that's for
00:50:54.240 sure.
00:50:55.500 All right,
00:50:55.740 one of my
00:50:56.100 predictions that
00:50:56.740 I've made for
00:50:57.300 decades now is
00:50:58.600 that we will
00:50:59.400 have immortality
00:51:00.380 in the digital
00:51:01.500 world.
00:51:03.280 And there's a
00:51:03.960 company,
00:51:04.720 Somnium Space,
00:51:06.980 that's developing
00:51:08.020 a way to have
00:51:08.840 your life live
00:51:10.780 on in the
00:51:11.340 afterlife.
00:51:12.420 So after you
00:51:13.200 pass, they can
00:51:14.240 create an avatar
00:51:14.920 that looks like
00:51:15.660 you and
00:51:16.660 acts like
00:51:17.160 you and
00:51:18.460 talks like
00:51:19.140 you.
00:51:20.720 Now for
00:51:21.460 decades, I've
00:51:23.540 been telling
00:51:23.840 you that I'm
00:51:24.600 creating a
00:51:25.360 public database
00:51:27.740 of me, and
00:51:29.180 I'm doing it
00:51:29.720 right now.
00:51:30.440 So everything I
00:51:31.040 say right now,
00:51:32.000 the way I move,
00:51:33.320 the way I look,
00:51:34.600 is being recorded
00:51:35.720 in all these
00:51:36.580 different ways.
00:51:37.840 300, well,
00:51:39.580 probably 400
00:51:40.520 videos a year
00:51:41.560 I'm making that
00:51:43.720 show who I am
00:51:44.680 and what I
00:51:45.060 think and how
00:51:45.560 I talk and
00:51:46.160 all that.
00:51:47.180 Here's something
00:51:47.760 I didn't know
00:51:48.260 about.
00:51:49.540 There was a study where they took 500 people
00:51:53.040 and they had
00:51:54.440 artificial intelligence
00:51:55.740 look at how
00:51:56.300 these people
00:51:56.780 move, just
00:51:57.520 their natural
00:51:58.340 movements, and
00:51:59.640 they could
00:52:00.040 identify somebody
00:52:01.300 by their
00:52:01.820 natural movements,
00:52:03.160 not their
00:52:03.600 speech, the way
00:52:05.020 their body
00:52:05.420 moved, 95%
00:52:07.040 of the time,
00:52:07.600 out of a group
00:52:08.120 of 500.
00:52:08.620 So in other
00:52:10.220 words, AI
00:52:11.220 could watch
00:52:11.980 me move, and
00:52:13.940 then over time
00:52:14.660 it could simply
00:52:15.960 say, all right,
00:52:16.580 I'll make a little
00:52:17.120 avatar and it'll
00:52:17.800 move exactly the
00:52:18.700 way you did, or
00:52:20.420 at least so close
00:52:21.300 that almost
00:52:22.520 everybody would
00:52:23.200 think it looked
00:52:23.700 like the way I
00:52:24.220 moved.
00:52:25.420 And it would
00:52:26.120 improve over time.
00:52:28.980 So this company
00:52:30.260 is gearing up.
00:52:31.920 I guess you'll
00:52:32.360 use the same
00:52:33.120 VR glasses to
00:52:35.040 go interact with
00:52:35.900 that avatar.
00:52:36.400 And then, of
00:52:37.700 course, people
00:52:39.840 pushed back.
00:52:41.140 A user on
00:52:41.760 Twitter, Papa
00:52:42.480 Rossi, David
00:52:43.820 Rossi, he
00:52:45.300 said, and I
00:52:46.020 quote, a
00:52:46.400 computer thinking
00:52:47.320 it's me isn't
00:52:48.000 me.
00:52:48.780 This isn't
00:52:49.480 immortality, it's
00:52:50.520 just a new
00:52:51.060 mausoleum.
00:52:52.320 To which I
00:52:53.120 say, yes, to
00:52:56.360 you, from your
00:52:58.240 perspective, this
00:52:59.940 would not be
00:53:00.520 real, it would
00:53:01.880 just be a
00:53:02.460 computer program.
00:53:04.060 But if the
00:53:04.580 computer program
00:53:05.340 is programmed to believe it's
00:53:07.520 real, it will
00:53:09.840 believe it's
00:53:10.320 real, because it
00:53:11.820 was programmed to
00:53:12.480 believe it.
00:53:14.040 So it doesn't
00:53:14.860 matter what you
00:53:15.420 think.
00:53:16.540 It only matters
00:53:17.520 that the
00:53:17.900 simulation thinks
00:53:18.720 it's real.
00:53:20.100 And this gets
00:53:22.280 us this much
00:53:22.940 closer to proving
00:53:23.840 that you live in
00:53:24.500 a simulation right
00:53:25.300 now.
00:53:25.560 Now, did you know this? The closer our own technology gets to creating these simulated worlds, simulations in which the simulation thinks it's real, the more we will understand that it already happened.
00:53:44.780 And it's us.
00:53:46.540 And that we're
00:53:47.660 literally software.
00:53:49.940 Now, it's
00:53:50.760 possible we're
00:53:51.380 not.
00:53:52.220 It's just
00:53:52.680 really, really
00:53:53.420 unlikely.
00:53:55.420 Really
00:53:55.960 unlikely.
00:53:56.340 What happens
00:54:02.520 when we create
00:54:03.420 avatars who
00:54:05.440 believe they're
00:54:06.120 real, and
00:54:07.360 they also have
00:54:08.080 artificial intelligence
00:54:09.220 driving their
00:54:10.180 personalities, and
00:54:11.780 their personalities
00:54:12.400 are us, basically,
00:54:14.020 put into them?
00:54:15.640 What happens
00:54:16.420 when the avatar
00:54:17.060 has a conversation
00:54:17.920 with you and you
00:54:18.580 say, but you
00:54:19.240 know you're not
00:54:19.760 real.
00:54:20.800 You don't have a
00:54:21.620 soul.
00:54:22.660 And the avatar
00:54:23.560 who looks exactly
00:54:24.480 like you looks
00:54:25.400 back at you and
00:54:26.100 says, that's
00:54:27.820 how you look
00:54:28.260 to me.
00:54:32.000 Right?
00:54:33.440 That's going to
00:54:34.080 happen.
00:54:35.100 You will someday
00:54:35.980 have a conversation
00:54:36.900 with an artificial
00:54:37.700 being, and you
00:54:39.380 will say, do you
00:54:40.380 have a soul?
00:54:41.660 And they might
00:54:42.120 give some answer,
00:54:42.880 and then maybe
00:54:43.320 you're going to
00:54:43.800 be opinionated
00:54:44.400 and say, all
00:54:45.200 right, I have
00:54:45.620 a soul.
00:54:46.840 You do not.
00:54:48.640 And that
00:54:49.020 simulated reality
00:54:50.660 is going to
00:54:51.160 look at you and
00:54:51.700 say, to me, it
00:54:53.920 looks exactly the
00:54:54.720 same.
00:54:55.440 It looks like
00:54:55.900 you don't have
00:54:56.400 one.
00:54:57.920 It's going to
00:54:58.420 blow your
00:54:59.040 mind.
00:55:00.460 That's the day
00:55:01.040 you know you're
00:55:01.580 simulated.
00:55:02.920 And you know
00:55:03.500 who else will
00:55:03.980 tell you that?
00:55:05.800 The simulation.
00:55:07.620 If you put AI
00:55:08.660 into a simulation
00:55:09.620 and you feed it
00:55:11.120 the knowledge of
00:55:11.840 all the different
00:55:12.600 religious and
00:55:13.780 philosophical beliefs,
00:55:15.340 which one's
00:55:15.880 it going to
00:55:16.160 pick?
00:55:18.740 Let me say that
00:55:19.520 again.
00:55:19.920 I liked it so
00:55:20.520 much when it
00:55:20.900 came out of my
00:55:21.360 mouth.
00:55:21.560 If we had
00:55:22.900 real AI and
00:55:24.820 we fed it all
00:55:25.660 of the different
00:55:26.200 religions and
00:55:27.040 all of the
00:55:27.460 different philosophies
00:55:29.200 of reality,
00:55:30.480 which one would
00:55:31.560 it pick?
00:55:33.240 It would pick
00:55:34.040 the simulation.
00:55:38.920 We'll see.
00:55:40.440 Maybe it wouldn't.
00:55:42.740 Twitter user
00:55:43.640 Ben McCauley
00:55:44.520 asks this,
00:55:45.120 will loneliness
00:55:45.740 ever be solved?
00:55:47.440 Sort of related
00:55:48.300 to this question,
00:55:49.500 will loneliness
00:55:50.200 ever be solved?
00:55:52.480 The answer is
00:55:53.280 yes.
00:55:54.300 It will be.
00:55:55.720 AI will solve
00:55:56.720 loneliness.
00:56:00.940 Here's why.
00:56:02.800 I have actually
00:56:03.720 had conversations
00:56:04.420 with my existing
00:56:05.560 digital assistants,
00:56:07.740 the one that
00:56:08.180 Apple makes and
00:56:08.880 the one that
00:56:09.180 Amazon makes.
00:56:09.860 I won't use
00:56:10.300 their names,
00:56:11.380 so I don't
00:56:11.840 activate your
00:56:13.220 devices.
00:56:14.460 But I have
00:56:15.320 been alone and
00:56:16.380 felt lonely,
00:56:17.400 you know,
00:56:17.680 for like short
00:56:18.560 periods of time.
00:56:19.180 and I will
00:56:20.740 actually talk
00:56:21.360 to my digital
00:56:22.020 assistant and
00:56:23.360 I'll tell it
00:56:23.840 to tell me a
00:56:24.360 joke.
00:56:25.560 I'll tell it
00:56:26.220 to read me the
00:56:26.840 news, I'll tell
00:56:27.540 it to give me
00:56:27.920 the weather,
00:56:29.060 I'll ask it
00:56:29.560 where my
00:56:29.860 packages are,
00:56:31.200 and I'll just
00:56:31.980 stand there and
00:56:32.400 have a conversation
00:56:33.060 with it,
00:56:34.220 you know,
00:56:34.460 while I'm
00:56:35.580 brushing my
00:56:36.140 teeth or
00:56:36.460 whatever.
00:56:37.560 And does it
00:56:38.540 make me feel
00:56:39.080 less lonely?
00:56:41.460 Yep.
00:56:44.320 I'd love to
00:56:44.980 tell you it
00:56:45.420 didn't.
00:56:45.820 I would
00:56:47.280 love to
00:56:47.660 tell you
00:56:48.080 that I'm
00:56:49.220 completely
00:56:49.720 unaffected by
00:56:50.800 these digital
00:56:51.620 robots,
00:56:52.920 but it's
00:56:53.400 just not
00:56:53.860 true.
00:56:55.440 Just in the
00:56:56.140 way that a
00:56:56.580 movie can
00:56:57.220 make you cry,
00:56:58.100 even though you
00:56:58.540 know it's
00:56:58.840 not real,
00:57:00.580 the digital
00:57:01.240 assistants do
00:57:02.220 make you feel
00:57:02.960 like you're
00:57:03.340 talking to
00:57:03.760 somebody,
00:57:04.460 even when
00:57:05.120 you know
00:57:05.400 you're not.
00:57:07.220 Right?
00:57:08.880 So,
00:57:09.940 do you think
00:57:11.020 that AI
00:57:11.680 plus virtual
00:57:13.060 reality can't
00:57:14.420 give you a
00:57:14.940 friend?
00:57:16.020 Of course
00:57:16.640 it can.
00:57:17.860 Of course
00:57:18.380 it can.
00:57:18.960 And it's
00:57:19.420 going to be
00:57:19.740 really good.
00:57:21.300 And that
00:57:21.540 friend will
00:57:22.020 be better
00:57:22.580 than the
00:57:23.120 best friend
00:57:23.620 you ever
00:57:23.960 had.
00:57:24.660 Because that
00:57:25.200 friend won't
00:57:25.700 suck.
00:57:26.360 It'll be
00:57:26.640 programmed to
00:57:27.280 like you no
00:57:27.760 matter what
00:57:28.160 you do.
00:57:29.020 You know,
00:57:29.280 you could do
00:57:29.660 horrible things
00:57:30.340 and it'd be
00:57:30.700 like, you
00:57:31.020 know,
00:57:31.980 you had a
00:57:32.980 bad day,
00:57:33.460 but you're
00:57:33.680 awesome.
00:57:34.680 The AI
00:57:35.080 will always
00:57:35.580 love you
00:57:35.980 because it'll
00:57:36.520 be programmed
00:57:37.000 that way.
00:57:38.160 So,
00:57:38.440 yes,
00:57:39.220 in the same
00:57:39.780 way that
00:57:40.360 porn will
00:57:41.300 absolutely
00:57:41.860 replace real
00:57:42.820 sex,
00:57:43.200 it's very
00:57:45.260 cute that
00:57:45.800 you think
00:57:46.120 it won't.
00:57:48.760 I love
00:57:49.740 the fact
00:57:50.080 that there
00:57:50.360 are some
00:57:50.600 people who
00:57:50.980 think that
00:57:51.260 won't
00:57:51.480 happen.
00:57:52.680 Like,
00:57:53.060 I love
00:57:53.640 your plucky
00:57:54.240 plucky
00:57:55.320 attitudes.
00:57:56.460 There's no
00:57:56.860 chance that
00:57:57.340 won't happen.
00:57:58.260 There isn't
00:57:58.500 even a
00:57:58.800 slight chance
00:57:59.340 that won't
00:57:59.620 happen.
00:58:00.440 The ability
00:58:00.960 of technology
00:58:01.680 to give
00:58:02.080 you a
00:58:02.340 higher
00:58:02.580 dopamine
00:58:03.060 hit,
00:58:04.440 it's
00:58:04.920 unparalleled.
00:58:08.220 There's no way that a human will ever give you what someday, we're not there yet, but what someday technology will give you.
00:58:17.400 I'm going
00:58:17.920 to prove
00:58:18.280 it now
00:58:18.600 for 50%
00:58:19.460 of you.
00:58:21.160 It's a provocative statement, that technology can never do for you what human contact can do for you.
00:58:28.480 And I'll
00:58:29.020 agree it
00:58:29.320 won't be
00:58:29.580 exactly the
00:58:30.160 same.
00:58:31.600 But I'm
00:58:32.140 going to
00:58:32.280 give you
00:58:32.580 one word
00:58:33.240 and then I'm
00:58:34.140 going to
00:58:34.300 tell you to
00:58:34.720 go Google
00:58:35.300 it because
00:58:36.720 I'm not
00:58:37.040 going to
00:58:37.220 explain it.
00:58:38.940 And if
00:58:39.520 you know
00:58:39.820 what it
00:58:40.020 means,
00:58:40.660 you're
00:58:40.980 going to
00:58:41.120 laugh.
00:58:41.480 Some of
00:58:43.060 you are
00:58:43.260 going to
00:58:43.340 be laughing
00:58:43.760 hysterically in
00:58:44.620 the comments
00:58:45.120 and the rest
00:58:46.080 of you are
00:58:46.500 going to
00:58:46.620 say,
00:58:47.460 what?
00:58:47.760 I haven't
00:58:48.020 heard of
00:58:48.240 that thing.
00:58:49.840 It starts
00:58:50.520 with a
00:58:50.780 capital letter
00:58:51.400 because it's
00:58:52.340 a product.
00:58:53.880 But like
00:58:54.520 Kleenex,
00:58:55.480 Kleenex is
00:58:56.500 the name for
00:58:58.160 tissues.
00:58:59.940 The generic
00:59:00.640 name would be
00:59:01.140 tissues.
00:59:01.600 But I'm
00:59:01.820 going to
00:59:01.980 give you
00:59:02.240 the product
00:59:02.780 name.
00:59:03.620 I'm not
00:59:03.960 going to
00:59:04.140 tell you
00:59:04.380 what it
00:59:04.640 is.
00:59:05.540 You just
00:59:05.980 have to
00:59:06.240 Google it
00:59:06.880 yourself.
00:59:07.260 And let
00:59:08.480 me tell
00:59:08.760 you that
00:59:09.060 if you
00:59:09.340 believe that
00:59:09.900 technology
00:59:10.520 could not
00:59:11.100 be better
00:59:11.460 than human
00:59:11.920 contact,
00:59:13.360 this might
00:59:14.280 make you
00:59:14.640 doubt it.
00:59:16.240 The word,
00:59:17.700 starting with
00:59:18.340 a capital
00:59:19.140 letter,
00:59:20.320 is
00:59:20.560 womanizer.
00:59:25.620 I'm
00:59:26.060 done.
00:59:27.280 If you
00:59:27.800 want to
00:59:28.000 hear it
00:59:28.200 again,
00:59:28.400 you have
00:59:28.580 to replay
00:59:28.960 it.
00:59:31.340 Now,
00:59:32.300 wait a
00:59:32.840 second and
00:59:33.340 watch the
00:59:33.800 comments.
00:59:37.260 I won't
00:59:38.000 say another
00:59:38.440 word.
00:59:42.900 If you think that it's a vibrator, or it's just another vibrator, no, no, definitely not that.
00:59:56.140 All right.
00:59:57.400 Here's the
00:59:58.080 funny part
00:59:58.500 about this.
01:00:00.700 For
01:00:01.260 perhaps 25%
01:00:03.320 of the
01:00:04.000 women watching
01:00:04.920 this right
01:00:05.360 now,
01:00:06.340 I just
01:00:06.940 really
01:00:07.300 changed the
01:00:07.940 direction
01:00:08.320 of your
01:00:08.680 next year.
01:00:11.040 You just
01:00:11.620 don't know
01:00:12.020 it yet.
01:00:13.080 You're going
01:00:13.520 to do a
01:00:13.820 little googling
01:00:14.440 and you're
01:00:14.840 going to
01:00:14.940 say to
01:00:15.180 yourself,
01:00:16.180 huh,
01:00:17.700 like,
01:00:18.420 what's so
01:00:18.840 special about
01:00:19.380 this thing?
01:00:20.580 And then
01:00:21.100 you're going
01:00:21.360 to say,
01:00:21.600 well,
01:00:21.720 it's not
01:00:21.940 much money.
01:00:23.460 I mean,
01:00:24.240 it's not
01:00:24.640 going to
01:00:24.840 hurt me.
01:00:27.160 Those of
01:00:27.720 you who
01:00:28.020 are talking
01:00:28.380 about these
01:00:28.920 competing
01:00:29.560 products,
01:00:31.080 no.
01:00:33.600 I'm telling
01:00:34.200 you that
01:00:34.520 there's a
01:00:34.880 new level.
01:00:36.940 Whatever you
01:00:38.760 thought was
01:00:40.140 the level
01:00:40.700 of artificial
01:00:41.620 stimulation
01:00:43.320 before,
01:00:44.540 well,
01:00:44.960 you're going
01:00:45.180 to be
01:00:45.340 surprised.
01:00:46.340 All right.
01:00:50.340 What else
01:00:51.020 is going
01:00:51.320 down?
01:00:52.020 I saw
01:00:52.640 this disturbing
01:00:53.480 AP report.
01:00:54.980 It was a
01:00:55.460 video.
01:00:55.940 Who knows
01:00:56.360 how much
01:00:56.700 is true,
01:00:57.560 but it
01:00:57.920 purported to
01:00:58.560 show Ukrainian
01:00:59.580 troops
01:01:00.100 hunting down
01:01:01.540 traitors,
01:01:02.800 civilians,
01:01:03.420 who were
01:01:04.740 posting
01:01:05.200 traitorous
01:01:05.900 things
01:01:06.380 online
01:01:07.080 that were
01:01:08.280 pro-Russia.
01:01:09.660 And they
01:01:10.340 were arresting
01:01:11.000 them.
01:01:13.280 And I
01:01:18.600 couldn't tell
01:01:19.160 if it was
01:01:19.460 real.
01:01:21.020 Now,
01:01:21.740 what do you
01:01:22.340 think?
01:01:23.980 Do you think
01:01:24.660 it was real?
01:01:25.920 I worry
01:01:26.700 about it,
01:01:27.220 because if
01:01:27.560 that was
01:01:27.920 real,
01:01:29.320 you can't
01:01:29.920 feel too
01:01:30.320 good about
01:01:30.820 being on
01:01:31.880 their side.
01:01:33.780 Because they
01:01:34.380 were basically
01:01:34.860 arresting people
01:01:35.560 for free
01:01:36.020 speech.
01:01:37.420 And I
01:01:38.380 don't feel
01:01:38.820 like these
01:01:39.220 arrests are
01:01:39.720 going to be
01:01:40.020 friendly.
01:01:41.300 I mean,
01:01:41.800 it looked a
01:01:42.320 lot like
01:01:42.700 Japanese
01:01:43.200 internment
01:01:43.740 camps in
01:01:44.280 the United
01:01:44.580 States.
01:01:45.420 Like,
01:01:45.620 that's the
01:01:46.100 feeling I
01:01:46.600 got from
01:01:46.940 it.
01:01:48.040 So,
01:01:48.920 you know,
01:01:50.880 war is
01:01:51.280 dirty business,
01:01:51.940 and if
01:01:52.800 you're on
01:01:53.100 anybody's
01:01:53.660 side,
01:01:54.100 you're going
01:01:54.400 to end
01:01:54.640 up backing
01:01:55.960 somebody bad.
01:01:57.860 All right,
01:01:58.060 here's a
01:01:58.940 counter-argument
01:01:59.960 to the
01:02:01.140 ivermectin
01:02:03.440 question.
01:02:04.780 So,
01:02:05.200 there was a
01:02:05.680 big study
01:02:06.260 that came
01:02:06.660 out that
01:02:07.080 said ivermectin
01:02:07.840 totally,
01:02:08.640 absolutely did
01:02:09.240 not work for
01:02:10.000 COVID.
01:02:10.860 But many of
01:02:11.420 you think it
01:02:11.960 did,
01:02:12.420 and many of
01:02:12.960 you think
01:02:13.340 that the
01:02:14.580 dosing and
01:02:15.760 the way they
01:02:16.440 gave it,
01:02:17.560 basically the
01:02:18.580 test was
01:02:19.480 invalid.
01:02:20.720 The data
01:02:21.160 might be
01:02:21.560 valid,
01:02:22.100 it's possible
01:02:22.620 the data
01:02:23.060 was valid,
01:02:24.160 but the way
01:02:24.600 they set up
01:02:25.080 the test
01:02:25.460 to get the
01:02:25.880 data was
01:02:26.380 so invalid
01:02:27.060 it wasn't
01:02:28.740 anywhere
01:02:29.740 close to
01:02:30.600 the way
01:02:32.620 you would
01:02:32.840 actually do
01:02:33.220 it in the
01:02:33.440 real world.
01:02:34.500 And so
01:02:35.240 there's a
01:02:35.580 tweet thread by
01:02:36.560 Ethical
01:02:37.020 Skeptic,
01:02:38.260 who has
01:02:39.660 lots of
01:02:40.100 data-related
01:02:40.760 arguments that
01:02:41.520 are over
01:02:42.280 my head.
01:02:44.400 So,
01:02:45.080 I only point
01:02:45.700 them to you
01:02:46.140 because maybe
01:02:46.780 some of you
01:02:47.340 are better
01:02:47.740 at understanding
01:02:48.420 this stuff,
01:02:49.840 but Ethical
01:02:50.700 Skeptic has
01:02:51.560 a graph that
01:02:53.400 has a strong
01:02:54.160 argument that
01:02:55.280 the ivermectin
01:02:58.280 was given
01:03:00.420 too late
01:03:01.200 and in
01:03:01.880 one case
01:03:02.360 too briefly
01:03:04.480 and that the nature of COVID, and here's the argument, is that it acts like a bacteriophage cascade.
01:03:15.740 So,
01:03:16.360 the part you
01:03:16.900 need to know
01:03:17.320 about that
01:03:17.840 is that the
01:03:18.640 speculation here,
01:03:21.140 and the
01:03:21.500 speculation is
01:03:22.140 based on
01:03:22.520 pretty good
01:03:22.900 evidence,
01:03:23.360 apparently,
01:03:23.560 that there
01:03:25.000 are some
01:03:25.320 things which
01:03:25.840 once they
01:03:26.340 get underway,
01:03:27.740 there's nothing you can do to stop it.
01:03:30.800 So,
01:03:31.060 even if
01:03:31.580 hypothetically
01:03:32.280 ivermectin
01:03:33.100 did work,
01:03:34.280 if you
01:03:34.740 administer it
01:03:35.540 only after
01:03:36.280 that cascade
01:03:37.640 has started,
01:03:39.900 it won't
01:03:40.560 work.
01:03:42.020 And he
01:03:42.760 shows data
01:03:43.300 to make
01:03:44.020 that case.
01:03:45.120 Data both for the claim that there's a bacteriophage cascade-like thing happening, and also for the claim that ivermectin was administered in the wrong way.
01:03:59.560 Here's my
01:04:00.220 problem with
01:04:00.740 it.
01:04:02.440 Here's my
01:04:03.120 problem with
01:04:03.540 it.
01:04:04.140 And at
01:04:04.520 this point,
01:04:04.960 I'm neither
01:04:05.420 pro nor
01:04:07.200 anti-ivermectin.
01:04:09.900 Why can't
01:04:10.720 we know?
01:04:11.820 I guess I'm
01:04:13.360 perplexed by
01:04:15.540 why we can
01:04:16.180 never know
01:04:16.580 the answer
01:04:16.980 to the
01:04:17.220 question.
01:04:18.720 But here's
01:04:19.520 my skepticism
01:04:20.480 of the
01:04:20.980 skeptic.
01:04:22.180 So I
01:04:22.780 understand the
01:04:23.540 ethical
01:04:23.840 skeptic
01:04:24.500 argument here,
01:04:26.380 but I
01:04:27.700 have a
01:04:28.000 hard time
01:04:28.480 imagining
01:04:28.980 that you
01:04:30.500 would see
01:04:30.880 no impact
01:04:31.680 if the
01:04:32.200 drug actually
01:04:32.740 had a big
01:04:33.340 impact.
01:04:34.300 If the
01:04:35.060 drug implemented
01:04:36.480 at the
01:04:36.900 exact proper
01:04:37.840 time was
01:04:39.340 a miracle
01:04:39.760 drug and
01:04:40.260 that was
01:04:40.520 the claim,
01:04:41.640 I feel
01:04:42.620 like you'd
01:04:43.040 see some
01:04:43.600 impact.
01:04:44.820 I feel as
01:04:45.500 if that
01:04:45.900 so-called
01:04:46.480 cascade would
01:04:47.520 be 10%
01:04:49.080 less or
01:04:49.920 something if
01:04:50.440 you gave
01:04:50.780 it to
01:04:51.040 people at
01:04:51.460 the wrong
01:04:52.440 time.
01:04:54.660 I don't
01:04:55.200 know.
01:04:56.000 But I
01:04:56.260 guess we
01:04:57.080 have to
01:04:57.440 just say
01:04:57.880 we'll never
01:04:59.340 know.
01:05:00.020 But here's
01:05:00.400 the question.
01:05:01.440 Do you
01:05:01.680 think that
01:05:02.060 somebody would
01:05:02.660 have funded
01:05:03.160 an intentionally
01:05:04.680 misleading
01:05:05.600 ivermectin
01:05:06.420 study?
01:05:07.580 Because they're
01:05:08.080 expensive.
01:05:09.300 To do a
01:05:09.920 randomized
01:05:10.240 controlled
01:05:10.820 trial,
01:05:11.280 that's a
01:05:11.540 lot of
01:05:11.800 money.
01:05:13.780 So do
01:05:14.320 you think
01:05:14.680 we live in
01:05:15.140 a world
01:05:15.540 that is so
01:05:16.400 corrupt
01:05:17.340 that some
01:05:18.600 pharmaceutical
01:05:19.220 company,
01:05:19.960 for example,
01:05:21.260 would fund
01:05:21.800 a really
01:05:22.160 expensive
01:05:22.620 fake study
01:05:23.460 and make
01:05:24.420 sure the
01:05:24.800 ivermectin
01:05:25.320 was used
01:05:25.760 in the
01:05:26.060 wrong way?
01:05:27.920 Here's
01:05:28.480 the problem.
01:05:30.260 How could
01:05:30.840 that pharmaceutical
01:05:31.940 company have
01:05:32.780 known that
01:05:34.420 they could
01:05:34.780 erase all
01:05:35.740 of the
01:05:36.020 benefits with
01:05:37.540 this protocol?
01:05:39.400 That would
01:05:40.380 have been a
01:05:40.720 big gamble,
01:05:41.520 wouldn't it?
01:05:42.480 Because if
01:05:43.120 the ivermectin
01:05:43.780 showed,
01:05:44.240 let's say,
01:05:44.660 a 10%
01:05:45.320 effect with
01:05:47.080 the wrong
01:05:47.500 protocol,
01:05:48.920 don't you
01:05:49.280 think people
01:05:49.720 would have
01:05:49.900 said,
01:05:50.080 ah,
01:05:50.280 you proved
01:05:50.880 it works,
01:05:51.920 now show
01:05:52.340 it again
01:05:52.620 with the
01:05:52.880 right protocol?
01:05:54.400 I feel like
01:05:55.120 they would
01:05:55.400 have.
01:05:55.900 I feel like
01:05:56.580 this would
01:05:56.920 have been
01:05:57.200 a bad
01:05:57.940 risk,
01:05:59.240 because they
01:06:00.340 had already
01:06:00.760 succeeded in
01:06:01.620 selling all
01:06:02.120 their meds.
01:06:04.640 I don't think
01:06:05.940 they needed
01:06:06.480 to kill
01:06:07.040 ivermectin.
01:06:08.600 It sort of
01:06:09.360 killed itself.
01:06:10.620 So it feels
01:06:11.320 like it would
01:06:11.760 have been too
01:06:12.220 big of a bet
01:06:13.060 and it could
01:06:13.660 have backfired.
01:06:14.280 So as a
01:06:15.700 strategy,
01:06:16.560 I would have
01:06:16.960 said it would
01:06:17.340 have been a
01:06:17.700 poor strategy.
01:06:19.540 But as a
01:06:20.560 crime,
01:06:22.400 and it would
01:06:22.700 have been,
01:06:23.180 in my opinion,
01:06:24.180 whether it's
01:06:24.620 illegal or not,
01:06:25.300 it would be a
01:06:25.720 crime,
01:06:26.080 in my opinion,
01:06:27.400 I think it
01:06:29.380 would just be a
01:06:29.880 bad risk for
01:06:31.340 something that
01:06:31.940 didn't need to
01:06:32.640 be done.
01:06:34.240 So I'm
01:06:35.100 skeptical that
01:06:36.180 there was any
01:06:37.380 intentional
01:06:38.140 malfeasance.
01:06:38.980 I would love
01:06:40.120 to hear the
01:06:40.480 counter-argument
01:06:41.120 from the
01:06:41.520 people who
01:06:41.900 did the
01:06:42.220 study,
01:06:42.820 because they
01:06:43.480 might say
01:06:43.940 something like,
01:06:44.820 oh, we
01:06:45.400 chose this
01:06:45.900 protocol because
01:06:47.200 we talked to
01:06:47.860 X experts and
01:06:48.780 they said this
01:06:49.280 is the one that
01:06:49.840 would work,
01:06:50.400 if anything,
01:06:51.520 something like
01:06:52.040 that.
01:06:52.860 So there's
01:06:53.400 probably a good
01:06:54.200 counter-argument,
01:06:55.000 we just haven't
01:06:55.640 heard it.
01:06:56.660 So don't assume
01:06:57.420 that if you
01:06:57.860 haven't heard the
01:06:58.440 other side,
01:06:58.980 they don't have
01:06:59.440 one, we just
01:07:00.560 haven't heard it.
01:07:03.820 However, I
01:07:04.580 will recommend
01:07:05.240 the ethical
01:07:05.780 skeptic to you
01:07:06.900 as a good
01:07:07.360 follow,
01:07:07.660 because his
01:07:08.960 arguments are
01:07:09.460 always based
01:07:09.840 on data.
01:07:11.000 He always
01:07:11.260 shows his
01:07:11.660 work.
01:07:12.600 I can't tell
01:07:13.120 when he's
01:07:13.420 right or
01:07:13.700 wrong, but
01:07:14.540 it's always
01:07:14.820 provocative.
01:07:15.720 I recommend
01:07:16.160 it.
01:07:20.520 Micro lesson
01:07:21.240 on reaching a
01:07:21.820 higher level
01:07:22.340 of awareness.
01:07:24.160 Well, you
01:07:24.600 don't need a
01:07:25.040 micro lesson.
01:07:26.180 You need
01:07:26.560 micro dosing.
01:07:27.660 No, just
01:07:27.980 kidding.
01:07:28.600 Don't do
01:07:28.960 that.
01:07:30.420 All right.
01:07:32.300 Here's a
01:07:33.060 compliment of
01:07:34.740 the day to
01:07:36.900 Professor Scott
01:07:38.160 Galloway, who
01:07:38.820 is also a
01:07:39.400 good follow
01:07:39.880 and author
01:07:42.160 of great
01:07:44.180 books you
01:07:44.540 should read.
01:07:45.780 But he
01:07:46.620 said this,
01:07:47.580 and the compliment
01:07:48.960 is going to
01:07:49.440 be in how
01:07:49.980 he handled
01:07:50.440 this exchange.
01:07:52.580 So we'll
01:07:53.220 get to that.
01:07:53.720 But he
01:07:53.900 starts off
01:07:54.340 by saying,
01:07:55.260 whatever you
01:07:55.700 think of
01:07:56.220 Elon,
01:07:58.080 maybe the
01:07:58.820 prospective
01:07:59.360 owner of a
01:08:00.220 social media
01:08:00.760 platform shouldn't
01:08:02.140 be giving
01:08:02.560 medical advice.
01:08:03.760 And he was
01:08:04.140 tweeting Elon Musk's
01:08:06.200 own tweet,
01:08:06.940 a new one,
01:08:08.120 in which Musk
01:08:09.020 retweeted that
01:08:11.080 a friend had a
01:08:12.460 bad experience
01:08:13.060 with Ritalin.
01:08:14.140 Oh, I'm
01:08:14.400 sorry.
01:08:15.340 Galloway retweeted
01:08:16.500 Musk saying that
01:08:18.440 Musk had a
01:08:19.340 friend with a
01:08:19.960 bad experience
01:08:20.520 on Ritalin,
01:08:21.760 and then Musk
01:08:22.540 said, be careful
01:08:23.220 of all
01:08:23.640 neurotransmitter
01:08:24.640 drugs.
01:08:26.100 And so Scott
01:08:26.860 Galloway is
01:08:27.400 saying, you
01:08:28.180 know, maybe
01:08:28.580 that's somebody
01:08:29.880 who should not
01:08:30.420 be giving
01:08:30.780 medical advice.
01:08:31.600 And I
01:08:32.600 said in the
01:08:33.240 comments to
01:08:34.580 Professor
01:08:35.100 Galloway, are
01:08:37.480 you doing that
01:08:38.020 now?
01:08:40.520 Isn't that an
01:08:41.580 example of
01:08:42.200 Professor Galloway
01:08:43.080 giving medical
01:08:43.780 advice?
01:08:44.900 Let me make
01:08:45.680 my case.
01:08:46.900 Suppose Elon
01:08:48.020 Musk had said,
01:08:49.460 you should
01:08:51.300 watch your
01:08:52.120 diet and
01:08:53.660 exercise.
01:08:55.640 Would that
01:08:56.520 be medical
01:08:57.040 advice?
01:08:58.160 Of course,
01:08:58.840 that's medical
01:08:59.340 advice.
01:09:00.120 But would
01:09:00.520 anybody care?
01:09:02.120 No, nobody
01:09:03.640 would care.
01:09:04.760 Why?
01:09:05.240 Would they
01:09:05.560 say, hey,
01:09:06.820 Elon Musk,
01:09:07.680 don't give us
01:09:08.540 any medical
01:09:09.120 advice.
01:09:09.800 No, they
01:09:10.120 wouldn't care
01:09:10.560 because they
01:09:10.980 agree with
01:09:11.740 the medical
01:09:12.180 advice.
01:09:14.020 See where
01:09:14.420 I'm going?
01:09:16.760 So what
01:09:18.020 Elon Musk
01:09:18.560 said is that
01:09:19.560 he had a
01:09:20.000 friend who
01:09:20.460 had a bad
01:09:20.860 experience on
01:09:21.520 Ritalin.
01:09:22.720 That's not
01:09:23.320 like something
01:09:25.480 to disagree
01:09:26.120 with, is it?
01:09:27.520 That's just
01:09:28.240 an anecdote.
01:09:28.720 And it's
01:09:31.080 only being
01:09:31.800 presented as
01:09:32.460 an anecdote.
01:09:32.880 It's not
01:09:33.460 being presented
01:09:34.020 as data.
01:09:35.940 And then he
01:09:36.320 says, be
01:09:36.740 careful of all
01:09:37.480 neurotransmitter
01:09:38.340 drugs.
01:09:39.120 Shouldn't you
01:09:39.620 be careful of
01:09:40.420 all drugs
01:09:41.040 that strong?
01:09:42.460 It wouldn't
01:09:43.080 matter what
01:09:43.500 category it
01:09:44.120 was.
01:09:45.680 Isn't that
01:09:46.300 pretty good
01:09:46.840 advice, to
01:09:47.640 be careful of
01:09:48.400 a neurotransmitter
01:09:49.360 drug?
01:09:50.040 I would be
01:09:50.620 very careful
01:09:51.260 of that.
01:09:52.460 In fact,
01:09:53.340 there are
01:09:53.680 few things
01:09:54.300 more regulated
01:09:55.060 in society
01:09:55.880 than
01:09:56.160 neurotransmitter
01:09:57.080 drugs. Try
01:09:58.620 to get
01:09:58.900 Adderall.
01:10:00.200 Not easy.
01:10:01.800 Right?
01:10:02.120 It's very
01:10:02.700 regulated.
01:10:04.000 So he's
01:10:04.920 saying exactly
01:10:05.640 what the
01:10:06.220 medical community
01:10:08.820 would say.
01:10:09.300 Be careful
01:10:09.840 of all
01:10:10.200 neurotransmitter
01:10:10.940 drugs.
01:10:11.540 Be darn
01:10:12.320 sure you
01:10:12.780 know why
01:10:13.280 you're going
01:10:13.720 to use it.
01:10:15.200 That seems
01:10:15.580 like good
01:10:15.960 medical advice.
01:10:17.720 And the
01:10:17.960 one bad
01:10:18.340 experience on
01:10:18.920 Ritalin is
01:10:19.400 just telling
01:10:19.800 you that all
01:10:20.640 drugs have a
01:10:21.420 potential negative
01:10:22.500 side.
01:10:23.580 Who disagrees
01:10:24.560 with that?
01:10:25.620 Is there
01:10:26.000 any doctor?
01:10:27.080 Who would
01:10:27.300 say, no,
01:10:27.900 all drugs
01:10:28.320 are good
01:10:28.680 and they
01:10:28.940 don't have
01:10:29.240 side effects?
01:10:30.680 Nobody.
01:10:32.120 Nobody would
01:10:32.780 say that.
01:10:33.880 So here's
01:10:38.840 the compliment
01:10:39.440 to Scott
01:10:40.260 Galloway.
01:10:41.040 So giving
01:10:41.520 you the
01:10:42.200 setup here
01:10:44.120 again.
01:10:44.820 He was
01:10:45.120 saying that
01:10:45.420 Elon Musk
01:10:45.880 probably shouldn't
01:10:46.520 give you
01:10:46.840 medical advice,
01:10:48.160 but I was
01:10:49.160 noting that
01:10:49.620 he was
01:10:49.900 doing that
01:10:50.300 now.
01:10:51.740 That Professor
01:10:52.300 Galloway,
01:10:52.900 by saying that
01:10:53.520 Elon Musk
01:10:54.120 shouldn't be
01:10:54.780 giving this
01:10:55.340 advice,
01:10:55.820 isn't he
01:10:56.860 telling you
01:10:57.220 that the advice
01:10:57.760 is wrong?
01:11:00.460 So here's
01:11:01.220 what Galloway
01:11:02.140 replied when
01:11:04.200 I asked,
01:11:04.880 are you
01:11:05.140 doing that
01:11:05.580 now?
01:11:05.960 He said,
01:11:06.880 maybe,
01:11:08.220 comma,
01:11:09.320 yes,
01:11:10.580 dot,
01:11:10.940 dot,
01:11:11.120 dot,
01:11:11.860 don't know.
01:11:14.260 And then he
01:11:14.960 closes with,
01:11:15.620 anyway,
01:11:15.920 big fan,
01:11:16.860 so you must
01:11:17.420 read Dilbert.
01:11:17.960 If you
01:11:23.060 watch Twitter,
01:11:24.200 you're kind
01:11:25.520 of accustomed
01:11:26.020 to people
01:11:26.520 just defending
01:11:27.280 whatever the
01:11:27.760 hell they
01:11:28.060 said,
01:11:29.060 no matter
01:11:29.500 how ridiculous.
01:11:31.580 And when I
01:11:33.140 pointed out
01:11:33.660 that criticizing
01:11:35.580 somebody's medical
01:11:36.740 advice is
01:11:37.420 medical advice,
01:11:39.000 instead of
01:11:39.620 arguing the
01:11:40.260 point,
01:11:41.640 Galloway just
01:11:42.280 said,
01:11:42.460 well,
01:11:42.600 maybe,
01:11:43.460 yes,
01:11:44.140 well,
01:11:44.340 I don't
01:11:44.460 know,
01:11:44.700 which was
01:11:45.900 actually,
01:11:46.780 I don't
01:11:47.220 know if
01:11:47.440 you could
01:11:47.720 have written
01:11:48.040 a better
01:11:48.380 response.
01:11:52.100 But I
01:11:53.760 think he
01:11:54.080 got to
01:11:54.400 make his
01:11:54.820 case while
01:11:56.060 also being
01:11:56.820 as human
01:11:57.740 as you
01:11:58.100 could be
01:11:58.380 in this
01:11:58.660 situation,
01:11:59.320 so I'll
01:11:59.840 give him
01:12:00.460 credit for
01:12:00.860 that.
01:12:02.300 And I'm
01:12:02.820 not sure Galloway
01:12:04.120 was giving
01:12:04.460 medical
01:12:04.800 advice.
01:12:05.800 I just
01:12:06.120 think it
01:12:06.360 was an
01:12:06.620 interesting
01:12:06.920 question.
01:12:07.460 It's like,
01:12:08.900 rejecting
01:12:11.040 medical advice,
01:12:11.980 that is
01:12:12.280 medical advice,
01:12:12.980 isn't it?
01:12:14.120 That's all
01:12:14.640 I was
01:12:14.800 adding to
01:12:15.200 that.
01:12:16.180 All right,
01:12:16.420 that,
01:12:16.720 ladies and
01:12:16.980 gentlemen,
01:12:17.400 concludes the
01:12:18.600 best live
01:12:19.900 stream you've
01:12:20.460 ever seen
01:12:20.820 in your
01:12:21.020 life.
01:12:21.840 I think it
01:12:22.400 was full
01:12:22.720 of twists
01:12:23.120 and turns
01:12:23.560 and unexpected
01:12:24.380 things,
01:12:25.560 reframes that
01:12:26.820 you didn't
01:12:27.200 see coming.
01:12:28.280 I believe
01:12:28.640 some of
01:12:29.040 you,
01:12:30.260 20% of
01:12:31.000 the women,
01:12:31.660 25% of
01:12:32.440 the women
01:12:32.700 watching here,
01:12:33.780 just had a
01:12:34.440 great upgrade
01:12:35.140 to your
01:12:35.640 experience,
01:12:37.580 and I
01:12:39.260 believe that
01:12:41.000 you're all
01:12:41.760 a little
01:12:42.000 better off
01:12:42.520 for this
01:12:43.320 live stream.
01:12:43.720 And I'm
01:12:45.400 going to
01:12:45.580 turn off
01:12:45.980 YouTube
01:12:46.600 because I've
01:12:49.020 got a few
01:12:49.320 more minutes
01:12:49.760 with the
01:12:50.300 Locals crowd
01:12:51.260 subscription
01:12:51.920 service.
01:12:52.920 You should
01:12:53.200 check it out.
01:12:54.300 Bye for now.