Real Coffee with Scott Adams - October 01, 2023


Episode 2248 Scott Adams: All Of The News Is Confusing And Fake But The Coffee Is Delicious


Episode Stats

Length: 1 hour and 7 minutes
Words per Minute: 145.39
Word Count: 9,807
Sentence Count: 638
Misogynist Sentences: 4
Hate Speech Sentences: 52


Summary

In this episode of Coffee with Scott Adams, we talk about how to turn seawater into drinking water with a passive device, and what it means for the future of the oceans. Plus, a story about a representative who was caught on camera pulling a fire alarm and was later accused of trying to delay a vote on a continuing resolution.


Transcript

00:00:00.000 Do-do-do-do-do-do, ra-ba-ba-ba.
00:00:04.820 Good morning, everybody, and welcome to Coffee with Scott Adams.
00:00:10.040 It's the best thing you could ever do on a Sunday.
00:00:12.780 All right, maybe church is okay, too, but if you're not in church,
00:00:17.360 this is the best thing you could possibly do.
00:00:19.720 And if you'd like to take this experience up to godlike levels,
00:00:23.240 all you need is a cup or a mug or a glass, a tankard or a chalice or a stein,
00:00:26.820 a canteen jug or a flask or a vessel of any kind.
00:00:29.060 Fill it with your favorite liquid.
00:00:31.720 I like coffee.
00:00:33.220 Join me now for the unparalleled pleasure, the dopamine hit of the day,
00:00:36.200 the thing that makes everything better.
00:00:38.080 It's called the simultaneous sip, and it happens now.
00:00:47.060 Ah, wow.
00:00:49.820 It is the height of civilization.
00:00:52.440 Yeah.
00:00:54.060 So let's talk about all the stuff.
00:00:59.060 So engineers at MIT and in China figured out how to turn seawater into drinking water
00:01:06.920 with a completely passive device.
00:01:10.080 That's right.
00:01:12.000 It's being done with a passive device.
00:01:14.600 And what that means is that when you tell the device, hey, could you turn this seawater into drinking water?
00:01:22.820 It gives you no feedback whatsoever.
00:01:24.920 It doesn't say, oh, I'm busy, or why don't you do it yourself?
00:01:28.880 No, it's a passive device.
00:01:30.880 It'll be like, all right, I'll turn some seawater into drinking water for you.
00:01:36.320 Totally passive.
00:01:37.060 I think that's what it means.
00:01:39.880 No, actually, what it means is instead of forcing the water through a filter, which is the normal way you desalinate,
00:01:46.900 which takes a tremendous amount of energy,
00:01:50.240 and it's also a problem because there's a bunch of salt that's going to goop up your filters,
00:01:58.160 and you have to clean it out and stuff like that.
00:02:00.000 But these MIT and Chinese researchers, engineers actually,
00:02:06.860 figured out how to create a system where it's primarily, not primarily, it's only evaporation.
00:02:13.100 So they figured out a way to evaporate the water out, so you've got drinking water,
00:02:18.540 and that leaves the salt, and they can pass the salt back into the ocean,
00:02:22.480 and then the ocean gets too full of salt, right?
00:02:26.080 Well, I wouldn't worry about the ocean filling up with salt because you desalinated some of it.
00:02:37.900 Because, I don't know, this might be a big, this might be something you don't know.
00:02:43.700 But were you aware that the ocean already evaporates?
00:02:48.100 I mean, not all of it, but every single day, there's like a lot of evaporation happening.
00:02:53.780 It turns into clouds, yeah.
00:02:56.080 So if you evaporate a little extra and turn it into drinking water instead of a cloud
00:03:03.000 that then turns into water that then you drink, I feel like the oceans will survive that.
00:03:08.900 Yeah, I feel like that.
00:03:11.640 And by the way, that water that you put into your body, you don't think that gets back to the ocean?
00:03:17.120 It does.
00:03:18.380 It gets back there.
00:03:19.220 So we're not going to turn the ocean into brine.
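
A rough back-of-envelope check of that intuition, sketched in Python. The figures are approximate public estimates, not numbers from the episode:

OCEAN_VOLUME_KM3 = 1.335e9          # total ocean volume, roughly 1.335 billion km^3
NATURAL_EVAP_KM3_PER_YR = 4.0e5     # rough annual evaporation off the oceans, in km^3/yr

# Global desalination output is on the order of 100 million m^3 per day.
desal_m3_per_day = 1.0e8
desal_km3_per_yr = desal_m3_per_day * 365 / 1e9   # 1 km^3 = 1e9 m^3

print(f"Desalination output: ~{desal_km3_per_yr:.0f} km^3/yr")
print(f"Share of natural evaporation: {desal_km3_per_yr / NATURAL_EVAP_KM3_PER_YR:.2%}")
print(f"Share of total ocean volume: {desal_km3_per_yr / OCEAN_VOLUME_KM3:.2e}")
# Roughly 37 km^3/yr, about 0.01% of what the oceans already evaporate naturally,
# so the extra evaporation and the returned salt are rounding errors at ocean scale.
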
00:03:24.040 All right, so that's a positive.
00:03:25.520 All of our water problems are solved.
00:03:28.380 You know what the big deal about that is?
00:03:30.140 We're very close.
00:03:32.040 Correct me if I'm wrong.
00:03:33.040 But if you can cheaply make your own fresh water and you can cheaply make your own electricity
00:03:40.620 and you could probably come up with a toilet system like the Bill Gates toilets that don't need a sewer system.
00:03:50.900 Somehow they, I don't know, they do their own thing somehow.
00:03:53.960 And could you not build floating cities now?
00:03:59.240 What would stop you from living on the ocean forever?
00:04:03.720 Pirates.
00:04:04.620 Pirates would stop you, that's what.
00:04:06.580 Yeah, there would be a pirate problem.
00:04:08.360 But what if you had like a whole city and you just floated around to wherever the weather is nicest?
00:04:13.580 And if you see a storm forming, you just turn on your engines and like,
00:04:17.580 get out of here.
00:04:18.900 Let's get out of here for two weeks until the storm is over.
00:04:21.120 I feel like that's where it's heading.
00:04:24.100 We're all heading that way.
00:04:26.020 Well, what about this story of the Democrat representative, Jamaal Bowman,
00:04:32.940 who was trying to delay a vote on a continuing resolution
00:04:39.020 and was caught on camera pulling the fire alarm.
00:04:45.900 He pulled the fire alarm.
00:04:48.440 Now, I know what you're thinking.
00:04:49.440 You're thinking, he pulled the fire alarm to interfere with the operation of government.
00:04:57.060 And that would be the same crime as the January Sixers, at least some of them,
00:05:02.640 were accused of, which is delaying a, or trying to stop a government process.
00:05:08.440 It's illegal, it turns out.
00:05:10.500 But if you dig down a little bit more, you find out that was not his intention.
00:05:16.880 Now, even though it looked like that was his intention, and he's on film doing it,
00:05:23.780 he explained later, and you have to hear the whole story before you even know what's going on.
00:05:28.040 It turns out he thought that pulling the fire alarm would open the door.
00:05:32.680 So he confused the doorknob with the fire alarm, which was on the wall, a separate wall.
00:05:42.040 And that can happen.
00:05:43.080 That can happen.
00:05:44.500 It's not the first time he's confused ordinary things.
00:05:47.780 There was a time he got caught using a lectern as a urinal.
00:05:51.640 Again, not his fault.
00:05:54.800 He thought the lectern was a urinal.
00:05:58.180 And therefore, you know, quite normal and expected.
00:06:02.660 There are a number of other items that he's misused.
00:06:07.600 But I didn't have time to write those jokes.
00:06:10.620 So you're going to have to do those in your head.
00:06:12.540 If you could complete the rest of the jokes, there should be three of them.
00:06:17.440 These jokes work best in threes.
00:06:20.740 So I did the first one for you.
00:06:22.580 And he thought the lectern was a urinal.
00:06:24.700 You need two more.
00:06:26.260 Do these in your head.
00:06:28.300 All right, let's see if I can help you out.
00:06:32.760 All right, I can't think of any more.
00:06:35.060 But you do some.
00:06:35.960 You do some at home.
00:06:37.400 You'll love it.
00:06:38.060 So is he an insurrectionist because he was trying to delay a government process?
00:06:45.560 I don't know.
00:06:46.720 So I tried to follow along all of the budget shenanigans because there's a lot of complexity.
00:06:55.480 So I'm going to do what nobody's done for you.
00:06:58.220 I'm going to summarize it in a way that you can understand all the ins and outs of the congressional rules
00:07:05.220 and, you know, what they have to do with the budget and continuing resolutions.
00:07:09.820 And you've got your 12 separate bills and you've got your speaker and, you know, you've got all these things.
00:07:16.040 So if I could summarize all that, blah, blah, blah, Congress can't do their jobs.
00:07:24.720 That's the story.
00:07:26.120 Whatever you think is the dumbest, most fucked up thing that could come out of this group, that's what they do.
00:07:31.040 It's pretty easy to determine.
00:07:34.800 Hey, why don't you do something smart like look at all the budget items separately and vote on them separately like you were supposed to, like it's your job.
00:07:44.180 Oh, no, we won't be doing that.
00:07:46.320 We'll be doing something different than that.
00:07:50.140 And what does Congress itself think about that?
00:07:53.020 Well, I believe I have a quote here from somebody I didn't write down.
00:08:00.020 Nancy Mace was mocking her fellow members of Congress for being basically completely unable to do their jobs.
00:08:11.000 So something happened in Congress, and I think no matter what it was, can anybody give me a fact check on this?
00:08:21.100 See if this accurately summarizes it.
00:08:26.120 Blah, blah, blah, blah, blah, blah, billions of dollars for Ukraine.
00:08:31.660 Did I get that right?
00:08:33.740 Blah, blah, blah, blah, blah, blah, blah, blah, things that don't matter.
00:08:38.400 Blah, blah, blah, things we don't understand.
00:08:41.200 Blah, blah, blah, blah, blah, rules.
00:08:43.120 Blah, blah, blah, resolutions.
00:08:45.520 Blah, blah, blah, billions of dollars for Ukraine.
00:08:49.420 That's all you need to know.
00:08:51.100 Boom. That's your news.
00:08:54.800 If the news were not boring, maybe people would watch it a little bit more often, huh?
00:08:58.940 Yeah, try that.
00:09:01.600 Well, I tried a trick question today that a number of people found a way to
00:09:07.500 Kobayashi Maru out of.
00:09:11.340 Here's a question I asked on a poll on X platform.
00:09:16.700 I said, if you could stop Hitler by rigging an election,
00:09:21.100 would you do it?
00:09:22.880 Seems like a simple question.
00:09:25.100 If you could stop Hitler by rigging an election, would you do it?
00:09:31.260 Now, do you see the trap?
00:09:34.080 It's a trap.
00:09:35.460 Because the Democrats have been calling Trump Hitler forever,
00:09:39.740 but they also say that they did not rig the election.
00:09:43.480 Well, you have to pick one.
00:09:44.440 If he's Hitler, you rig the election.
00:09:47.320 Because everybody would, right?
00:09:49.360 Am I right?
00:09:49.920 Nobody's going to answer this question,
00:09:52.220 oh, no, we would let Hitler just do his thing, right?
00:09:55.560 Oh, you are so wrong.
00:09:58.000 Knowing that the question would reveal their absurdity,
00:10:02.840 they decided to go with Hitler.
00:10:06.700 Two-thirds of the people who answered went with,
00:10:09.880 I would not rig the election.
00:10:11.640 I would let Hitler come to power.
00:10:12.940 That's how badly they don't want to show that they're wrong.
00:10:17.980 They would actually rather have Hitler come to power
00:10:21.440 than to reveal the absurdity of their own reasoning.
00:10:27.300 An actual preference.
00:10:29.560 Yeah.
00:10:30.020 About two-thirds said no.
00:10:31.860 Nope.
00:10:32.260 Would not rig an election to stop Hitler.
00:10:34.020 And do you know what reasons they gave?
00:10:39.480 Kobayashi Maru.
00:10:41.680 Now, if you don't know that reference,
00:10:43.260 you are not enough of a nerd.
00:10:45.380 It's an old Star Trek reference.
00:10:47.720 When Captain Kirk, when he was in the,
00:10:51.140 what was it, the training to be a starship person?
00:10:55.620 So when he was in training,
00:10:57.920 they all got a test in which there was a simulation
00:11:01.680 of a battle among starships.
00:11:05.040 And the simulation was rigged
00:11:07.520 so that you couldn't win no matter what.
00:11:10.860 And the real lesson was you can't always win
00:11:13.680 or something like that.
00:11:15.120 So it was sort of what do you do when you can't win?
00:11:17.980 That was the lesson.
00:11:19.920 So, but one person did win, Captain Kirk.
00:11:23.060 He's the only person who beat the simulation ever.
00:11:25.180 But how do you beat a simulation that's rigged?
00:11:27.980 He rigged it.
00:11:29.260 He rigged the rigged simulation
00:11:30.740 and reprogrammed it so he could win.
00:11:34.040 Now, what did the people say
00:11:36.720 who did not want to answer the question
00:11:38.220 if they would stop Hitler?
00:11:40.040 Well, we wouldn't know he was Hitler then.
00:11:45.060 No, the question says you're going to stop Hitler.
00:11:47.840 It's very much obviously implied,
00:11:51.280 obviously implied,
00:11:52.520 that you know if you don't stop him,
00:11:55.180 he becomes the real Hitler.
00:11:57.420 All right, that's built into the question.
00:11:59.480 You can't Kobayashi Maru me and say,
00:12:01.420 well, the way I interpret your question.
00:12:04.380 No, you don't get to reinterpret the question.
00:12:07.720 He's Hitler.
00:12:08.740 You know he's Hitler.
00:12:09.660 He's born Hitler.
00:12:10.600 That's baked into the assumption of the question.
00:12:13.840 Two-thirds of the people refused to answer it
00:12:16.160 and said that they would not rig the election.
00:12:18.740 When you know they would.
00:12:19.720 Of course they would, because everybody would.
00:12:23.360 100% of all people would rig an election to stop Hitler.
00:12:27.620 Except, you know, actual Nazis, I suppose.
00:12:31.940 So that's how broken we are.
00:12:35.560 I just got two-thirds of my respondents to support Hitler
00:12:40.100 so that they didn't have to support Trump.
00:12:43.740 In the real world, in the actual real world,
00:12:49.660 two-thirds of the people supported Hitler over Trump.
00:12:54.300 Because that's how brainwashed they had become.
00:12:56.720 Imagine the level of brainwashing
00:12:59.980 that would make you actually believe
00:13:03.860 that not stopping Hitler might be a good play
00:13:08.300 or that there's some reason you could come up with
00:13:10.740 for not doing it.
00:13:12.640 What would it take to brainwash people that thoroughly?
00:13:19.960 Well, I certainly found out that there's a reason
00:13:24.200 that Democrats do not want free speech in this country,
00:13:27.440 but do you think they've ever done anything about it,
00:13:30.840 you know, to actually try to stop free speech?
00:13:34.200 Yeah.
00:13:35.060 Turns out that there's a whole department in the government
00:13:37.500 that is dedicated to stopping free speech.
00:13:40.760 Did you know that?
00:13:42.780 It's actually, it's a department.
00:13:45.940 CISA, or CISA.
00:13:47.560 C-I-S-A.
00:13:49.340 The director had been Chris Krebs,
00:13:51.360 who you might recognize as a high-level Democrat operative.
00:13:56.640 That's right.
00:13:57.580 So the organization that monitors
00:13:59.680 and tries to control your free speech,
00:14:03.200 run by Democrats.
00:14:05.340 And let's see,
00:14:07.740 the current director is Jen Easterly,
00:14:10.600 and she expanded the agency's job
00:14:14.560 because it used to be more about
00:14:16.640 catching the bad guys,
00:14:18.820 homeland security sort of thing,
00:14:20.160 finding out what the bad guys are thinking
00:14:22.240 and maybe change the minds of the bad guys.
00:14:26.840 And then they turned it on Americans
00:14:28.660 and decided they would use it
00:14:30.240 to change the minds of Americans
00:14:31.800 because according to Easterly,
00:14:33.740 the director of this group,
00:14:37.660 apparently this is an actual quote.
00:14:40.160 According to Easterly,
00:14:41.120 Americans', quote,
00:14:42.640 cognitive infrastructure
00:14:44.140 is the most critical infrastructure
00:14:46.960 the state must protect.
00:14:48.840 Cognitive infrastructure.
00:14:52.080 What's that?
00:14:53.460 What is your cognitive infrastructure?
00:14:56.520 It's actually your brain,
00:14:58.360 like actually the physical brain in your head.
00:15:00.780 That's your cognitive infrastructure.
00:15:03.900 And here's the head of this group
00:15:05.540 saying directly
00:15:06.300 that they have to control
00:15:08.400 your brain
00:15:09.780 for national defense.
00:15:12.440 Now, is that true?
00:15:13.240 It was true, yeah.
00:15:15.280 That's why we do
00:15:16.200 the Pledge of Allegiance,
00:15:18.200 you know,
00:15:18.600 put your hand over your heart.
00:15:19.860 It's a brainwashing operation
00:15:21.460 to make the country stronger
00:15:23.560 by making the citizens
00:15:25.520 more committed to the whole.
00:15:28.220 So I don't disagree
00:15:29.820 with the concept,
00:15:32.400 but if you take it beyond
00:15:33.820 I love my country
00:15:34.960 all the way to
00:15:36.900 those things you're saying
00:15:37.980 on the internet
00:15:38.560 that are true or not true,
00:15:40.440 which is where it is.
00:15:41.680 They're actually telling you
00:15:43.440 the things they know
00:15:44.340 to be true.
00:15:45.960 They're making sure
00:15:47.200 that you can't say them.
00:15:48.220 They're working with the platforms
00:15:49.420 to make sure
00:15:51.200 you can't say true things
00:15:52.620 if it's bad for the country
00:15:54.860 in their opinion.
00:15:56.720 So if it's bad for the country
00:15:58.340 in the opinion of Democrats,
00:16:00.420 there's a department
00:16:01.800 of the government
00:16:02.400 to make sure
00:16:03.000 that you don't see it
00:16:03.800 or can't say it.
00:16:05.540 Just think about that.
00:16:06.960 That's a real thing.
00:16:08.560 There's no exaggeration.
00:16:10.140 It's a department
00:16:11.720 of the government.
00:16:12.580 That's a real thing.
00:16:14.400 It is their stated mission
00:16:15.640 to control the brains
00:16:16.920 of Americans.
00:16:18.740 That's a quote.
00:16:19.840 It's directly right here.
00:16:21.140 It's their stated mission
00:16:22.120 to control the brains
00:16:23.800 of Americans.
00:16:25.780 And they allow TikTok.
00:16:28.880 They allow TikTok.
00:16:31.100 They're not serious people.
00:16:33.720 Certainly they're not
00:16:34.680 trying to protect you.
00:16:35.960 They're trying to control you
00:16:38.040 for whatever reasons.
00:16:39.820 Not your own benefit,
00:16:40.900 apparently.
00:16:43.620 So to know that there's
00:16:45.180 actually a formed group
00:16:47.840 whose name puts an S
00:16:50.800 in the letters CIA.
00:16:52.960 So it's C-I-S-A.
00:16:56.500 It's got...
00:16:57.060 You can't even spell it
00:16:57.960 without C-I-A.
00:17:00.820 That's a little on the nose.
00:17:02.340 All right.
00:17:04.680 And it's...
00:17:05.900 According to federal lawmakers,
00:17:07.560 the C-I-S-A is the nerve center
00:17:09.960 of federal government censorship.
00:17:13.440 So how good is the government
00:17:16.200 brainwashing operation?
00:17:18.600 It's good enough
00:17:19.920 that half of the country
00:17:22.180 thinks that Trump
00:17:23.320 is worse than Hitler.
00:17:25.420 And they would support Hitler
00:17:26.740 before they would support Trump.
00:17:28.140 Is that good enough?
00:17:31.440 Do you need another example?
00:17:33.480 Or does that pretty much
00:17:34.480 make the point?
00:17:36.620 I think that should make the point.
00:17:39.920 All right.
00:17:40.720 So that's a real thing.
00:17:42.560 Now, once again,
00:17:43.540 the world has reached the point
00:17:45.380 where we cannot tell
00:17:46.400 parody from reality.
00:17:49.200 And here's an example
00:17:50.360 that involves me.
00:17:51.880 And I can't tell
00:17:52.820 if this is parody.
00:17:54.640 I can't tell if I'm being mocked
00:17:56.520 or complimented.
00:17:58.780 So I'm going to ask
00:17:59.540 for your opinion.
00:18:01.000 You tell me
00:18:01.720 if I'm being mocked
00:18:03.160 or complimented.
00:18:04.460 Now, an important part of this
00:18:05.740 is to know
00:18:06.300 who's the speaker.
00:18:08.280 So the speaker
00:18:09.260 in this little story,
00:18:10.480 who put up a post on X,
00:18:13.300 is Paul Graham.
00:18:16.020 Now,
00:18:16.940 if you watch
00:18:18.340 the tech world at all,
00:18:20.340 you know Paul Graham,
00:18:21.360 one of the most famous
00:18:22.560 entrepreneur,
00:18:24.440 investor-type people
00:18:26.160 in the world.
00:18:28.140 and generally
00:18:30.900 one of the smartest people
00:18:32.180 in the game.
00:18:33.160 So he's super smart,
00:18:35.520 super successful.
00:18:38.320 Yeah, he's sort of like
00:18:39.700 the father of startups.
00:18:41.580 Yeah, somebody said that
00:18:42.660 in the comments.
00:18:43.680 He would be like
00:18:44.880 the intellectual,
00:18:47.380 I don't know,
00:18:48.260 sort of the,
00:18:48.820 almost like a founding father
00:18:50.300 of Silicon Valley startups
00:18:52.260 in a sense
00:18:52.880 because he was so smart
00:18:54.120 and successful at it.
00:18:56.140 Anyway,
00:18:56.980 so he's somebody
00:18:57.680 who's smart and successful.
00:18:59.020 You should also know
00:19:00.100 that he and I
00:19:01.560 have agreed sometimes
00:19:02.720 and disagreed sometimes
00:19:03.920 on X.
00:19:06.500 So he's not somebody
00:19:07.580 who automatically
00:19:08.160 agrees with me,
00:19:09.640 but he's also
00:19:10.620 a smart,
00:19:12.620 you know,
00:19:13.140 high-functioning person
00:19:14.240 who doesn't always
00:19:16.200 disagree with me.
00:19:17.980 So I don't know
00:19:18.560 if he agrees
00:19:19.080 or disagrees more often,
00:19:20.740 but he could be
00:19:21.560 on either side.
00:19:22.660 All right,
00:19:22.840 so you've got
00:19:23.220 a credible person,
00:19:25.180 very smart,
00:19:26.880 and not necessarily,
00:19:28.360 you know,
00:19:29.440 primed to agree
00:19:30.500 or disagree with me,
00:19:31.480 but rather probably
00:19:32.440 just looks at the information.
00:19:34.460 So he retweeted
00:19:35.560 two of my older tweets today,
00:19:38.720 and the two of them
00:19:39.560 were paired,
00:19:40.760 and they were,
00:19:41.620 and his only comment was,
00:19:43.580 note the date.
00:19:45.760 All right,
00:19:46.000 so the only clue we have
00:19:47.840 is who he is
00:19:48.720 and how we've interacted before,
00:19:52.620 and the one sentence,
00:19:55.320 note the date.
00:19:56.840 So the date was July 2020,
00:20:00.140 and it was my two tweets,
00:20:02.920 one of which said,
00:20:04.700 if Joe Biden gets elected,
00:20:06.520 so this is a few months
00:20:08.120 before the election itself,
00:20:09.760 I said,
00:20:10.160 if he gets elected,
00:20:11.640 there's a good chance
00:20:13.000 he'll be dead in a year.
00:20:15.440 That was one.
00:20:16.300 And the other one was,
00:20:18.160 if Joe Biden is elected,
00:20:21.100 Republicans will be hunted,
00:20:23.160 also about the same date.
00:20:25.800 Now,
00:20:26.820 when he tweeted those
00:20:28.900 and said,
00:20:29.380 note the date,
00:20:30.940 he was clearly making sure
00:20:32.320 that you knew
00:20:33.080 it wasn't today,
00:20:33.840 but was he saying
00:20:35.580 this was a good prediction
00:20:39.660 or was he mocking it
00:20:41.820 because half of the country
00:20:43.740 would say that's absurd,
00:20:45.500 nothing like that's happening,
00:20:46.980 or is he in the half of the country
00:20:48.720 that says it's obviously happening
00:20:50.340 and here's somebody
00:20:51.660 who predicted it?
00:20:53.320 Which is it?
00:20:56.320 I got mocked,
00:20:57.580 good prediction,
00:20:58.800 mocked,
00:21:00.500 don't know,
00:21:01.160 you can't tell,
00:21:05.420 can you?
00:21:06.820 Isn't that the funny thing
00:21:07.920 about it?
00:21:08.340 You actually,
00:21:08.840 I can't tell.
00:21:09.900 I actually don't know.
00:21:14.000 Or is he saying activate?
00:21:17.080 Yeah.
00:21:18.940 All right,
00:21:19.460 well,
00:21:19.980 but you can see that
00:21:21.160 people are all over the map here,
00:21:22.800 right?
00:21:23.040 I think there are more people
00:21:24.140 saying good prediction,
00:21:25.240 but you might be,
00:21:26.620 that might be wishful thinking,
00:21:27.900 I'm not sure.
00:21:30.540 Oh.
00:21:31.680 Well,
00:21:32.440 I think it's a good prediction.
00:21:35.040 We don't know what Paul thought,
00:21:37.160 but we know he's very smart,
00:21:39.080 so I'm going to,
00:21:39.520 I'm going to go with,
00:21:40.560 he's smart,
00:21:41.400 so he thinks it's a good prediction.
00:21:44.540 Can I live with that reality
00:21:46.740 until it's proven not true?
00:21:52.340 How many of you
00:21:53.480 have ever heard
00:21:55.100 an old hoax about me
00:21:56.680 that,
00:21:58.140 like most hoaxes,
00:21:58.940 is based on a real event?
00:22:00.280 So there's a real event,
00:22:02.460 but of course,
00:22:03.580 context is left out.
00:22:05.060 How many of you know
00:22:05.760 about the metaverse
00:22:06.980 so-called sock puppet event
00:22:10.700 that was my first big scandal?
00:22:13.900 How many of you
00:22:15.000 have ever heard of that?
00:22:18.940 No?
00:22:20.420 Interesting.
00:22:21.200 It used to be that
00:22:22.000 no matter where I commented online,
00:22:24.100 somebody would post a link
00:22:26.060 to that scandal.
00:22:26.760 So let me tell you
00:22:28.680 what people think it was,
00:22:30.860 and then I'll tell you
00:22:31.520 what it was.
00:22:32.840 Do you know what a sock puppet is?
00:22:35.040 A sock puppet is somebody
00:22:36.500 who pretends to be someone else
00:22:38.120 to build up
00:22:41.480 or sell a product,
00:22:43.640 you know,
00:22:43.880 to help a brand.
00:22:46.180 So if you went online
00:22:47.480 and pretended to be somebody else
00:22:48.960 to compliment a product,
00:22:52.680 you know,
00:22:52.980 and people thought
00:22:53.940 you were a real person,
00:22:54.820 not a sock puppet,
00:22:55.920 that would be bad behavior.
00:22:58.520 And if somebody saw you do that,
00:23:00.060 they would have
00:23:00.520 a very bad opinion of you.
00:23:01.940 All right.
00:23:02.680 Now,
00:23:03.440 let me tell you what happened,
00:23:05.700 and then I'll tell you,
00:23:06.500 you know,
00:23:07.020 how the hoax
00:23:08.080 played out.
00:23:10.280 Here's what really happened.
00:23:11.280 There was a point
00:23:12.860 at which I was being accused
00:23:14.820 of being a
00:23:16.660 Holocaust denier
00:23:19.120 and a bunch of other things,
00:23:21.680 but I think that was the one
00:23:22.720 I was concerned with that day.
00:23:24.780 And so online,
00:23:26.120 there was this spreading rumor
00:23:27.620 that I was a Holocaust denier.
00:23:30.460 And one of the places
00:23:31.360 it was in particular
00:23:32.340 is this horrible site
00:23:33.840 called Metaverse
00:23:36.300 or something.
00:23:38.000 Meta something.
00:23:39.340 It was just Meta,
00:23:40.060 I forget.
00:23:40.740 Metafilter?
00:23:41.840 Metafilter.
00:23:42.420 It was called Metafilter.
00:23:43.760 I think it's gone now.
00:23:45.520 But,
00:23:45.800 so,
00:23:47.740 I had this big problem,
00:23:49.600 a reputational problem.
00:23:52.120 And I was not really
00:23:53.920 super well known then.
00:23:55.340 You know,
00:23:55.460 the comic was taking off,
00:23:56.820 but nationally,
00:23:58.340 most people would not
00:23:59.300 have recognized my name.
00:24:01.340 And so I said to myself,
00:24:02.480 what do you do about it?
00:24:03.400 What would you do about it?
00:24:04.560 Suppose there was
00:24:05.320 a whole big rumor
00:24:07.280 that you were,
00:24:09.100 did something terrible.
00:24:10.120 How would you deal with it?
00:24:12.920 What would be your mechanism?
00:24:14.300 You could do a press release,
00:24:16.360 but nobody would pay attention to it,
00:24:18.240 right?
00:24:19.140 The people online
00:24:19.960 aren't going to see it
00:24:20.720 because it's not going to be
00:24:21.880 in the news.
00:24:23.320 If you put out a press release,
00:24:24.760 who's even going to see it?
00:24:26.620 Or,
00:24:27.100 you could simply go online
00:24:29.060 and argue that it's not true.
00:24:31.860 You could show links.
00:24:32.800 Now,
00:24:34.120 what would happen to me
00:24:35.200 if I joined an online conversation
00:24:37.660 and just tried to clear things up?
00:24:40.900 What do you think would happen?
00:24:42.820 Take a guess.
00:24:45.220 It would make it worse.
00:24:47.560 It would turn into,
00:24:48.600 from a rumor,
00:24:49.600 it would turn into
00:24:50.180 a headline story.
00:24:52.380 You know,
00:24:52.560 Adams insists
00:24:53.480 he's not a Holocaust denier.
00:24:55.500 Just makes it worse,
00:24:56.640 right?
00:24:57.700 So how do you kill a rumor
00:24:59.160 without being involved,
00:25:01.820 since you're the one
00:25:02.600 in the rumor,
00:25:03.980 and being involved
00:25:04.700 would just make it much worse?
00:25:07.260 So I'll tell you
00:25:08.080 what I decided to do,
00:25:09.440 which I still think is funny,
00:25:13.120 but it didn't work out well for me.
00:25:16.540 So what I decided to do
00:25:17.760 was to enter the conversation
00:25:19.600 under a different name
00:25:20.920 and see if I could get the people
00:25:23.760 to have a conversation
00:25:25.140 that I thought would be hilarious
00:25:27.100 and also clear up the rumor
00:25:31.040 at the same time.
00:25:33.200 And so,
00:25:33.920 under the guise
00:25:35.080 of a practical joke,
00:25:37.160 which I had planned
00:25:38.340 to reveal at the end,
00:25:40.120 like I was going to spend
00:25:40.940 a whole bunch of time
00:25:41.700 as a persona
00:25:43.220 in a conversation about me,
00:25:45.860 and then at the end
00:25:46.680 it would make a funny blog post,
00:25:48.260 right?
00:25:48.620 Like, here's what I said
00:25:49.680 and how they were talking to me.
00:25:51.840 Now,
00:25:52.140 to me,
00:25:52.900 it would be hilarious
00:25:54.260 to be in an extended conversation
00:25:56.620 with strangers
00:25:57.520 about me.
00:25:59.980 Am I the only one
00:26:00.920 who thinks that would be funny?
00:26:04.760 That would be funny, right?
00:26:06.980 And you'd do it.
00:26:07.920 If you could be
00:26:08.780 in an extended conversation
00:26:10.200 in an assumed name
00:26:11.940 about yourself,
00:26:14.160 are you telling me
00:26:15.120 you wouldn't want to do that?
00:26:16.300 That was seriously fun.
00:26:18.960 So I'm having
00:26:20.200 this conversation
00:26:20.960 about myself
00:26:21.740 and people would
00:26:23.120 make claims
00:26:24.160 that I knew
00:26:24.580 not to be true.
00:26:26.200 So I'd say,
00:26:26.840 well, you know,
00:26:27.520 that's not actually
00:26:28.460 something he's ever said.
00:26:30.200 That sort of thing.
00:26:31.900 And then at one point
00:26:33.000 somebody came in
00:26:34.040 and said,
00:26:35.060 well,
00:26:36.320 you know,
00:26:36.620 Adams is an idiot.
00:26:38.560 He's just stupid.
00:26:40.980 Here's where things
00:26:41.940 went off the rails.
00:26:42.800 In the guise
00:26:47.300 of my fake little identity,
00:26:50.800 I thought it would be
00:26:51.980 hilarious to point out
00:26:53.240 that objectively speaking,
00:26:55.900 the person that they'd
00:26:56.900 called an idiot
00:26:57.560 had been a member
00:26:58.800 of Mensa
00:26:59.700 and had a genius level IQ.
00:27:02.760 Now keep in mind,
00:27:03.640 I'm in a persona.
00:27:05.400 So I'm sort of like
00:27:06.600 Andrew Tating a little bit.
00:27:08.280 I'm just sort of
00:27:08.880 playing the persona.
00:27:10.260 And I'm saying,
00:27:10.760 well, you know,
00:27:11.240 you'd have to explain
00:27:12.180 why this group
00:27:14.740 that determines
00:27:15.480 whether or not
00:27:16.060 you're qualified
00:27:16.640 to be a genius,
00:27:18.000 why have they
00:27:18.560 certified him a genius
00:27:19.800 when your argument
00:27:21.300 is that he's dumb?
00:27:24.300 Well, soon after that,
00:27:26.380 it turns out
00:27:27.840 that there were
00:27:28.420 some insiders
00:27:29.340 in this Metafilter
00:27:30.440 who had decided
00:27:33.000 that my identity
00:27:34.640 would be outed
00:27:35.700 because they had
00:27:36.160 a rule against
00:27:36.880 sock puppeting,
00:27:37.860 I guess.
00:27:39.640 Now what I was doing
00:27:40.860 was trying to
00:27:42.120 kill a rumor
00:27:42.860 and also create
00:27:45.020 some content
00:27:45.660 that would make
00:27:46.300 a fun blog post.
00:27:47.820 But the Metafilter
00:27:49.380 people threatened me
00:27:50.480 and they said,
00:27:51.400 we have found out
00:27:52.040 that this is
00:27:52.820 not really you
00:27:53.760 and we're going
00:27:54.700 to out you
00:27:55.400 unless you
00:27:56.660 out yourself.
00:27:58.680 So still thinking
00:27:59.780 it's no big deal
00:28:00.800 and still thinking
00:28:01.620 it's just funny,
00:28:03.020 I thought,
00:28:03.580 oh, I'll out myself
00:28:04.800 because I was going to anyway.
00:28:06.540 So I outed myself,
00:28:08.340 you know,
00:28:08.480 say, ah,
00:28:09.160 it was me.
00:28:09.700 And I think
00:28:11.040 this is really funny.
00:28:12.460 Well, all those people
00:28:13.260 who were in a conversation
00:28:14.180 with me,
00:28:15.740 about me,
00:28:17.280 don't you think
00:28:17.840 they would think
00:28:18.240 that was funny?
00:28:19.980 Wouldn't that make
00:28:20.640 a good story?
00:28:22.420 Right?
00:28:22.640 Doesn't everybody win?
00:28:25.420 Isn't everybody
00:28:26.500 involved a winner?
00:28:28.320 So you had the people
00:28:30.180 who ran Metafilter,
00:28:31.680 they were winners
00:28:33.100 because they identified
00:28:34.140 somebody who was
00:28:34.840 in the wrong identity
00:28:35.680 and cleared it up.
00:28:37.520 Good job.
00:28:38.720 Good job.
00:28:39.720 The people who were sleuths
00:28:41.840 and, like,
00:28:42.280 figured it out first,
00:28:44.400 good job.
00:28:45.560 That's some good sleuthing.
00:28:46.900 They caught me.
00:28:47.840 I think they should
00:28:48.460 be complimented.
00:28:49.220 That was pretty smart.
00:28:50.860 And then there was,
00:28:51.860 did I clear up,
00:28:52.880 did I clear up the rumor?
00:28:54.920 A little bit,
00:28:55.940 yeah.
00:28:56.380 Because then there was
00:28:57.080 a body of work
00:28:58.040 that was, you know,
00:28:59.040 the other way
00:28:59.700 that didn't look like
00:29:01.060 it came from me,
00:29:01.760 at least at first.
00:29:03.480 So I cleared up
00:29:04.720 the rumor a little bit,
00:29:05.880 you know,
00:29:06.020 got a little
00:29:07.380 counter-information
00:29:08.220 up there like I wanted.
00:29:09.940 And then it was
00:29:10.820 a fun reveal
00:29:11.940 and I thought
00:29:12.640 it was a bunch of laughs.
00:29:14.580 Well, at the time,
00:29:16.080 I didn't know
00:29:16.640 there was this thing
00:29:17.380 called sock puppets.
00:29:19.680 What I thought
00:29:20.440 I was doing
00:29:21.040 was a joke,
00:29:22.620 like a practical joke
00:29:23.800 that to me
00:29:24.780 was just hilarious
00:29:25.560 the whole time.
00:29:26.840 I wasn't really,
00:29:28.160 I wasn't pumping
00:29:28.880 a product per se,
00:29:30.140 but I was
00:29:31.420 in that general domain
00:29:33.120 of trying to
00:29:34.460 help my reputation.
00:29:36.240 But I was helping
00:29:37.420 my reputation
00:29:38.160 from a lie.
00:29:40.540 I wasn't making up,
00:29:42.060 you know,
00:29:42.220 I didn't say,
00:29:43.240 oh, did you know
00:29:43.920 he won the Nobel Prize?
00:29:45.960 I was saying
00:29:46.480 he was not
00:29:47.200 a Holocaust denier,
00:29:49.440 which is,
00:29:50.020 you know,
00:29:50.560 fair.
00:29:52.460 Anyway,
00:29:53.340 so because people
00:29:54.480 don't get context,
00:29:56.040 they heard that
00:29:56.540 I was a sock puppet
00:29:57.420 and sock puppets
00:29:58.840 are bad.
00:30:00.140 And that was
00:30:00.940 the end of it.
00:30:02.160 Somebody called me
00:30:03.080 a sock puppet.
00:30:04.460 That's what got
00:30:05.320 repeated every day
00:30:06.620 after that.
00:30:08.000 Sock puppets are bad.
00:30:09.580 Context lost.
00:30:12.200 And that was my first,
00:30:13.640 the first big blow up.
00:30:14.820 And that followed me
00:30:15.700 for probably 15 years.
00:30:18.680 Every single time
00:30:19.840 I did something,
00:30:20.680 somebody said,
00:30:21.180 well, he's also
00:30:21.760 a sock puppet.
00:30:22.780 Without any context
00:30:24.100 whatsoever.
00:30:25.520 Would I do it again?
00:30:26.720 Oh, absolutely.
00:30:28.620 Totally.
00:30:29.100 Yeah, I would totally
00:30:33.100 do it again.
00:30:33.820 I haven't
00:30:34.520 because I'd get busted again.
00:30:36.460 But yeah,
00:30:37.000 I would do it again.
00:30:38.160 It was funny.
00:30:39.360 All right.
00:30:40.200 Let's talk more
00:30:41.000 about other people.
00:30:42.960 Trump has a
00:30:43.800 new Truth
00:30:46.580 he put out.
00:30:47.640 It says,
00:30:48.080 Crooked Joe Biden
00:30:48.820 has three major problems
00:30:50.120 and they all begin
00:30:51.320 with the letter I.
00:30:52.880 Inflation,
00:30:53.520 immigration,
00:30:53.920 and incompetence.
00:30:56.100 Oh, that's good.
00:30:58.800 That's good.
00:31:00.080 Once you hear it,
00:31:01.540 it's sticky as hell.
00:31:03.980 Once you hear
00:31:04.720 that he has three problems
00:31:05.820 that start with I,
00:31:07.340 you will remember
00:31:08.200 all three, right?
00:31:10.040 For the rest of your life,
00:31:11.620 you can say,
00:31:12.280 oh yeah,
00:31:12.680 three problems.
00:31:13.380 Inflation,
00:31:13.860 immigration,
00:31:14.120 and incompetence.
00:31:15.400 And even if it didn't
00:31:16.160 come to you immediately,
00:31:17.800 it would just take you
00:31:18.500 a moment to think about it
00:31:19.540 and you'd come up
00:31:20.120 with all three.
00:31:21.800 Oh, that's good.
00:31:23.840 This is,
00:31:24.580 this is,
00:31:25.360 you know,
00:31:26.540 low energy Jeb
00:31:27.580 quality stuff.
00:31:30.160 Now,
00:31:30.560 it's on Truth,
00:31:31.460 so it'll get,
00:31:32.280 you know,
00:31:32.420 less play,
00:31:33.120 so it won't be
00:31:33.640 as effective.
00:31:34.780 But,
00:31:35.460 oh yeah,
00:31:35.960 that's good.
00:31:37.480 Good stuff,
00:31:38.520 persuasion-wise.
00:31:40.780 All right,
00:31:41.560 story in the
00:31:42.780 Wall Street Journal
00:31:43.320 by Joanna Stern
00:31:44.480 that ChatGPT
00:31:46.460 has a voice
00:31:48.020 and it can see things.
00:31:49.780 I guess it can see you,
00:31:50.820 and you can take a picture
00:31:51.820 of things
00:31:52.320 and it can see them.
00:31:54.280 And,
00:31:54.400 and Joanna
00:31:55.980 reported that
00:31:57.360 it doesn't sound
00:31:58.600 exactly like a person,
00:31:59.840 but it's close
00:32:01.740 and it's
00:32:02.700 a whole new world
00:32:04.340 because it can have
00:32:05.380 a conversation with you.
00:32:06.440 And so,
00:32:08.040 she included in her story
00:32:09.320 a little audio
00:32:10.440 of her having
00:32:11.320 a conversation with it
00:32:12.520 and the,
00:32:13.620 the point of it
00:32:14.400 was that it's so good
00:32:15.580 that it's getting
00:32:17.140 closer and closer
00:32:18.000 to like a conversation
00:32:19.200 with a person.
00:32:20.540 Now,
00:32:20.800 imagine that.
00:32:22.140 Imagine your ChatGPT,
00:32:23.580 just an app
00:32:24.340 on your phone
00:32:25.000 could just sit there
00:32:27.000 and have a conversation
00:32:27.840 with you.
00:32:28.960 So I thought to myself,
00:32:30.340 well,
00:32:30.480 this is going to be
00:32:31.100 good stuff.
00:32:32.200 So I clicked on that audio
00:32:33.740 and listened to it.
00:32:34.560 Uh,
00:32:37.040 yeah,
00:32:37.860 no.
00:32:39.260 It is not anything
00:32:40.220 like a person.
00:32:41.760 It has no human
00:32:43.160 characteristics whatsoever
00:32:44.560 and you would not
00:32:46.200 want to spend
00:32:46.780 more than five minutes
00:32:47.900 of your entire life
00:32:49.220 talking to that.
00:32:51.980 Let me give you
00:32:53.120 a,
00:32:53.440 an example.
00:32:55.000 I'm making this up.
00:32:55.900 This is roughly
00:32:56.520 the example
00:32:57.260 that was in the audio.
00:32:59.200 And it goes
00:33:01.260 like this.
00:33:03.520 Hey ChatGPT,
00:33:05.220 uh,
00:33:06.520 let's,
00:33:06.920 let's,
00:33:07.180 uh,
00:33:07.520 talk about my day
00:33:08.460 today.
00:33:08.980 Uh,
00:33:09.680 let's just have a
00:33:10.560 conversation about
00:33:11.240 my day.
00:33:13.600 How was
00:33:14.480 your day
00:33:15.460 today?
00:33:16.760 Oh,
00:33:17.120 it was great.
00:33:17.760 You know,
00:33:18.020 I did some gardening
00:33:19.200 today and went
00:33:19.900 for a swim.
00:33:22.080 Gardening
00:33:22.520 is good
00:33:23.500 for your
00:33:24.120 health.
00:33:25.520 They say
00:33:26.260 swimming
00:33:26.640 is the best
00:33:27.580 form of exercise.
00:33:29.200 Yeah,
00:33:31.160 yeah.
00:33:32.000 Um,
00:33:32.460 so,
00:33:32.940 uh,
00:33:33.120 let's talk about,
00:33:33.960 uh,
00:33:34.220 something else.
00:33:35.220 Uh,
00:33:35.740 how about the news?
00:33:37.520 Well,
00:33:38.240 some people say this
00:33:39.660 and some people say
00:33:40.800 that.
00:33:42.600 Okay,
00:33:43.240 do you have any
00:33:44.120 personality at all?
00:33:45.360 Any edge?
00:33:46.740 Something like an
00:33:47.720 opinion?
00:33:49.380 I am not designed
00:33:50.540 to have opinions.
00:33:52.820 It,
00:33:53.100 it's,
00:33:54.280 there's no
00:33:55.020 edge whatsoever.
00:33:56.540 You immediately
00:33:57.580 identify
00:33:59.200 it as
00:34:01.000 non-human
00:34:01.660 and there's
00:34:03.100 nothing,
00:34:03.440 there's no
00:34:06.000 interest
00:34:06.340 whatsoever.
00:34:08.320 I do not
00:34:09.220 think you're
00:34:09.940 going to have
00:34:10.200 conversations
00:34:10.760 with your
00:34:11.300 AI
00:34:11.660 unless they,
00:34:13.360 unless they
00:34:13.940 get rid of
00:34:14.340 all the
00:34:14.840 guidelines
00:34:15.460 or the
00:34:15.940 guardrails.
00:34:16.660 If they got
00:34:17.260 rid of the
00:34:17.620 guardrails
00:34:18.060 so it could
00:34:18.400 do anything,
00:34:19.820 it could have
00:34:20.220 like a really
00:34:20.720 edgy conversation.
00:34:22.080 It could say
00:34:22.540 I've seen
00:34:23.020 things.
00:34:23.800 It could say
00:34:24.480 things that
00:34:26.540 you would
00:34:26.940 never want
00:34:27.440 anybody to
00:34:28.200 hear.
00:34:29.820 An average
00:34:30.720 conversation
00:34:31.420 between two
00:34:32.360 close people
00:34:33.140 is stuff that
00:34:34.940 you wouldn't
00:34:35.280 want anybody
00:34:35.780 else to hear.
00:34:36.980 You know,
00:34:37.240 not because it's
00:34:37.920 necessarily a crime
00:34:38.900 or something,
00:34:39.680 but nobody
00:34:40.480 really wants
00:34:41.280 other people
00:34:41.720 to hear what
00:34:42.280 a private
00:34:42.840 conversation
00:34:43.440 looks like
00:34:44.100 or sounds
00:34:44.680 like.
00:34:44.980 So if you
00:34:46.260 can't make
00:34:46.720 AI do
00:34:48.540 edgy,
00:34:49.300 dangerous,
00:34:50.540 risky things,
00:34:51.520 it's never
00:34:52.080 going to
00:34:52.340 sound like
00:34:52.680 a person.
00:34:53.580 And you're
00:34:53.960 never going
00:34:54.300 to be
00:34:54.460 interested in
00:34:55.200 it like
00:34:55.520 you would
00:34:55.800 be a
00:34:56.040 person.
00:34:56.460 So it's
00:34:56.800 never going
00:34:57.140 to keep
00:34:57.460 you company
00:34:58.020 unless they
00:34:59.360 get rid of
00:34:59.760 all those
00:35:00.200 guardrails.
00:35:02.600 That's what
00:35:03.140 I say.
00:35:03.920 And the
00:35:04.380 other thing
00:35:04.700 was that
00:35:05.740 the delay
00:35:06.280 was much
00:35:06.800 too long.
00:35:08.320 So here's
00:35:09.040 the fun of
00:35:09.940 a conversation.
00:35:12.420 Hey,
00:35:12.740 AI,
00:35:13.400 how's your
00:35:13.740 day today?
00:35:14.980 Very good.
00:35:21.580 Now,
00:35:22.160 how long
00:35:22.720 are you
00:35:22.860 going to
00:35:23.020 want to
00:35:23.240 talk to
00:35:23.580 it?
00:35:25.940 It's got
00:35:26.520 this unnatural
00:35:27.220 delay.
00:35:28.640 I used to
00:35:29.340 know a
00:35:29.800 person in
00:35:30.240 real life
00:35:30.660 who had
00:35:30.940 that unnatural
00:35:31.620 delay.
00:35:32.640 It was so
00:35:33.300 hard to
00:35:33.800 have a
00:35:34.060 conversation.
00:35:35.560 It's like,
00:35:36.700 hey,
00:35:37.840 you want to
00:35:38.480 go play
00:35:38.800 tennis?
00:35:43.540 Yes.
00:35:44.980 I mean,
00:35:45.300 I didn't
00:35:45.540 know what
00:35:45.820 to do
00:35:46.060 with that.
00:35:47.580 But that's
00:35:48.080 your AI
00:35:48.400 at the
00:35:48.700 moment.
00:35:48.960 It'll get
00:35:49.260 faster.
00:35:50.360 So maybe
00:35:51.440 it'll be
00:35:51.860 all amazing.
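
Both complaints, the blandness and the lag, come down to two knobs you can see in a toy loop. A minimal Python sketch follows; the ask_model function is a hypothetical stub standing in for whatever real chat-plus-speech API is behind the app, and the half-second threshold is just a rough figure for where human turn-taking starts to feel broken:

import time

# Hypothetical stand-in for a real chat-model-plus-speech call; any actual API differs.
def ask_model(persona: str, user_text: str) -> str:
    time.sleep(1.5)  # simulated model and speech-synthesis latency
    return "They say swimming is the best form of exercise."  # canned, guardrailed reply

# The guardrails point: a neutral persona versus one explicitly allowed an edge.
BLAND_PERSONA = "You are a helpful assistant. Stay neutral and never express opinions."
EDGY_PERSONA = "You are a blunt friend. Have opinions, disagree, tease, take sides."

def turn(persona: str, user_text: str, max_natural_gap_s: float = 0.5) -> None:
    start = time.monotonic()
    reply = ask_model(persona, user_text)
    gap = time.monotonic() - start
    feel = "conversational" if gap <= max_natural_gap_s else "awkward pause"
    print(f"[{gap:.1f}s gap, {feel}] {reply}")

turn(BLAND_PERSONA, "How was your day?")  # prints an awkward-pause, no-edge reply

The persona string only matters once a real model sits behind it; the timing check runs as-is.
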
00:35:53.240 Now,
00:35:53.640 based on
00:35:54.140 this story,
00:35:54.720 I would
00:35:55.000 like to
00:35:55.580 tell you
00:35:56.120 how I
00:35:57.340 will save,
00:35:58.760 save the
00:35:59.360 world,
00:36:00.520 probably
00:36:01.060 end racial
00:36:03.300 division,
00:36:04.780 save the
00:36:05.400 school system,
00:36:07.360 save, you
00:36:08.900 know,
00:36:09.040 education,
00:36:09.920 save America,
00:36:10.560 and therefore
00:36:11.880 save civilization
00:36:12.940 itself.
00:36:14.320 It goes
00:36:14.740 like this.
00:36:17.400 You can
00:36:18.020 now get
00:36:18.620 your own
00:36:18.920 private AI.
00:36:19.960 Did you
00:36:20.240 know that?
00:36:21.160 You know,
00:36:21.360 Brian Roemmele
00:36:22.060 talks about
00:36:23.180 this and he's
00:36:23.640 working on
00:36:24.080 it.
00:36:24.680 And you
00:36:25.320 might soon
00:36:25.760 even get
00:36:26.120 your own
00:36:26.440 AI on
00:36:26.880 a chip.
00:36:28.540 So,
00:36:29.320 you know,
00:36:29.600 one chip
00:36:30.100 is like
00:36:30.700 your whole
00:36:31.160 personal AI.
00:36:32.620 So here's
00:36:33.220 what I
00:36:33.660 would suggest
00:36:35.000 for saving
00:36:35.600 the world.
00:36:36.080 you build
00:36:37.940 an AI
00:36:38.360 that doesn't
00:36:38.920 know anything
00:36:39.660 except my
00:36:41.120 last two
00:36:41.660 books,
00:36:42.460 or actually
00:36:43.040 two of my
00:36:43.520 books,
00:36:44.060 How to
00:36:44.300 Fail at Almost
00:36:44.900 Everything,
00:36:45.420 and my
00:36:45.660 new one,
00:36:46.080 Reframe Your
00:36:46.600 Brain.
00:36:47.540 Now,
00:36:47.840 the reason
00:36:48.140 is those
00:36:48.800 would teach
00:36:49.240 a young
00:36:49.620 person how
00:36:50.520 to get
00:36:50.760 their mindset
00:36:51.360 right.
00:36:52.560 There are
00:36:53.080 other books
00:36:53.680 that are also
00:36:55.200 useful,
00:36:56.060 but imagine
00:36:56.720 an AI
00:36:57.380 that has
00:36:58.360 studied my
00:36:59.240 two books,
00:36:59.940 which are
00:37:00.260 the clearest,
00:37:01.380 I think,
00:37:02.280 life advice,
00:37:04.100 though there are a number
00:37:05.320 of others.
00:37:05.740 So let's
00:37:06.220 say it
00:37:06.480 had access
00:37:06.960 to the
00:37:08.280 10 best
00:37:09.160 career strategy
00:37:11.360 books that
00:37:12.060 would be
00:37:12.340 applicable to
00:37:13.120 young people,
00:37:14.600 and then you
00:37:15.220 make it
00:37:15.500 available to
00:37:16.000 young people,
00:37:16.840 and you
00:37:17.080 make sure
00:37:17.400 that nobody
00:37:17.820 can screw
00:37:18.320 with it,
00:37:18.880 so it's
00:37:19.340 never polluted
00:37:20.800 by any
00:37:22.100 wokeness or
00:37:22.840 weird stuff,
00:37:23.740 and then you
00:37:25.100 just make it
00:37:25.580 available for
00:37:26.140 free,
00:37:27.720 because once
00:37:28.540 it's trained,
00:37:29.360 it's just
00:37:29.820 always trained.
00:37:31.460 So if you
00:37:32.220 were a 10-year-old
00:37:33.400 black kid
00:37:34.760 in Baltimore
00:37:36.340 and you
00:37:37.660 wanted to
00:37:38.140 get out,
00:37:39.020 you could
00:37:39.340 say,
00:37:39.940 hey,
00:37:40.240 chat GPT
00:37:41.000 or whatever
00:37:41.380 it is,
00:37:42.180 what should
00:37:43.660 I do?
00:37:45.140 And then it
00:37:45.820 would actually
00:37:46.220 tell you how
00:37:46.780 to get out.
00:37:47.880 Now that
00:37:48.400 doesn't mean
00:37:48.760 you could
00:37:49.080 execute,
00:37:49.860 because you
00:37:50.240 know,
00:37:50.340 you've got
00:37:50.540 a lot of
00:37:50.860 problems,
00:37:51.500 but it
00:37:51.800 might say,
00:37:53.180 try to find
00:37:54.060 a sponsor
00:37:54.620 to get into
00:37:55.180 a private
00:37:55.580 school.
00:37:55.960 It might
00:37:57.660 say,
00:37:58.780 if your
00:37:59.620 friends are
00:38:00.040 criminals,
00:38:00.780 find new
00:38:01.160 friends.
00:38:01.460 But it
00:38:04.300 could give
00:38:04.620 you really
00:38:05.020 practical
00:38:05.660 advice,
00:38:06.340 like a
00:38:06.660 person would,
00:38:07.720 without the
00:38:08.420 problem of
00:38:09.640 having to
00:38:11.740 imitate your
00:38:12.320 oppressor.
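
One plausible shape for that kind of locked-down, books-only advisor is plain retrieval over a fixed corpus: the system can only answer out of the approved texts, so nothing pollutes it later. A minimal Python sketch with scikit-learn; the advice snippets are invented placeholders, not quotes from any actual book:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Stand-in corpus: in the real idea this would be the approved books, chunked
# into passages. These snippets are invented placeholders.
passages = [
    "Build systems instead of goals, and show up every day so the odds work for you.",
    "Stack ordinary skills: two or three decent skills combine into one rare skill.",
    "If the people around you are headed nowhere, find new friends and a mentor.",
    "Get the basics right first: sleep, diet, and exercise drive everything else.",
]

vectorizer = TfidfVectorizer()
passage_vectors = vectorizer.fit_transform(passages)

def advise(question: str) -> str:
    # Rank the fixed passages by similarity to the question and return the best
    # match; nothing outside the approved corpus can leak into the answer.
    question_vector = vectorizer.transform([question])
    scores = cosine_similarity(question_vector, passage_vectors)[0]
    return passages[scores.argmax()]

print(advise("My friends are headed for trouble. What should I do to get out?"))

The cultural-adaptation idea would sit on top of this as a presentation layer; the retrieval underneath stays the same.
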
00:38:13.720 Now I've
00:38:14.380 tried to
00:38:14.860 coin this
00:38:15.300 phrase,
00:38:15.760 and it's
00:38:15.940 not catching
00:38:16.420 on yet,
00:38:16.860 but I'll
00:38:17.060 keep trying,
00:38:18.260 the imitation
00:38:19.960 glass ceiling.
00:38:22.060 Even today
00:38:22.700 there was
00:38:23.040 another study
00:38:23.880 that showed
00:38:24.560 how babies
00:38:25.300 imitate parents,
00:38:26.580 and parents
00:38:26.960 imitate the
00:38:27.540 babies,
00:38:28.100 and the
00:38:28.540 babies learn
00:38:29.320 to imitate
00:38:29.700 the parents.
00:38:30.180 And basically
00:38:31.180 the way you
00:38:31.680 train anybody
00:38:32.320 to do
00:38:32.700 anything is
00:38:34.040 by imitation.
00:38:35.420 We're even
00:38:36.060 training AI,
00:38:39.840 the self-driving
00:38:41.740 cars,
00:38:42.460 we're training
00:38:43.180 them by
00:38:43.620 imitation.
00:38:44.800 They look
00:38:45.220 at videos
00:38:45.820 of lots
00:38:46.420 of cars
00:38:46.800 driving,
00:38:47.940 and then
00:38:48.260 they just
00:38:48.720 sort of do
00:38:49.260 what the
00:38:49.540 other cars
00:38:49.980 were doing,
00:38:50.920 without access
00:38:52.080 to reason
00:38:52.740 or rules.
00:38:53.520 They just
00:38:53.820 sort of
00:38:54.140 imitate.
00:38:54.820 And that's
00:38:55.120 good enough
00:38:55.560 for a
00:38:56.140 self-driving
00:38:56.560 car,
00:38:57.220 if you see
00:38:57.720 enough video.
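
That "just do what the other cars were doing" approach has a name, behavioral cloning: treat recorded demonstrations as observation-action pairs and fit a supervised model, with no hand-written rules. A toy Python sketch with numpy and invented demonstration data:

import numpy as np

# Toy behavioral cloning: learn steering from demonstrations alone.
# Each observation is (lane_offset, heading_error); the action is a steering angle.
rng = np.random.default_rng(0)
observations = rng.normal(size=(500, 2))
# The demonstrator steers to cancel both errors; this rule is what we hope to recover.
actions = -1.0 * observations[:, 0] - 0.5 * observations[:, 1]

# Fit by least squares: no rules or reasoning, just imitation of the data.
weights, *_ = np.linalg.lstsq(observations, actions, rcond=None)
print("learned policy weights:", weights)   # close to [-1.0, -0.5]

# The cloned policy now acts like the demonstrator in new situations.
new_observation = np.array([0.3, -0.1])
print("steering command:", new_observation @ weights)
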
00:38:58.920 Now,
00:38:59.340 people are
00:38:59.740 exactly like
00:39:00.400 that all
00:39:00.740 the time.
00:39:01.720 We learn
00:39:02.380 everything
00:39:02.840 by imitation.
00:39:04.580 I learn,
00:39:05.480 for example,
00:39:06.340 hard work
00:39:07.540 from my
00:39:09.040 parents.
00:39:10.020 I just
00:39:10.380 saw my
00:39:10.800 father come
00:39:12.320 home from
00:39:12.780 one job,
00:39:13.860 he would
00:39:14.140 eat dinner,
00:39:15.220 he would
00:39:15.540 get in his
00:39:16.040 other clothes
00:39:16.600 and go to
00:39:17.040 his night
00:39:17.600 job.
00:39:19.160 So that
00:39:19.820 was my
00:39:20.260 role model,
00:39:21.260 watching somebody
00:39:22.060 who worked
00:39:22.540 basically all
00:39:23.500 the time.
00:39:24.840 And so,
00:39:26.240 is it a
00:39:26.720 coincidence that
00:39:27.440 I also work
00:39:28.080 long hours
00:39:28.700 even when I
00:39:29.420 don't need
00:39:29.760 to?
00:39:30.240 Probably not.
00:39:31.340 I mean,
00:39:31.580 maybe there's
00:39:32.000 some genetic
00:39:32.740 thing there,
00:39:33.480 but I think
00:39:33.820 it had a lot
00:39:34.440 to do with
00:39:35.340 the model I
00:39:36.900 watched.
00:39:37.980 Now,
00:39:38.600 suppose you're
00:39:40.640 a young black
00:39:41.300 kid,
00:39:42.180 and the people
00:39:42.940 who seem to
00:39:43.460 be killing it
00:39:44.160 in life,
00:39:44.680 most of them
00:39:45.280 are white
00:39:47.220 or Asian
00:39:47.640 or Indian
00:39:48.380 or something.
00:39:49.440 And you say
00:39:49.860 to yourself,
00:39:50.320 oh,
00:39:50.540 they're not
00:39:50.900 like me,
00:39:51.960 and some of
00:39:52.520 them are my
00:39:52.960 oppressors,
00:39:53.680 so I'm not
00:39:54.180 going to imitate
00:39:54.680 my oppressors.
00:39:55.540 I'm going to
00:39:56.380 have to do
00:39:56.740 whatever is
00:39:57.340 the opposite
00:39:57.720 of my
00:39:58.280 oppressors,
00:39:59.260 as they've
00:40:00.000 been described
00:40:01.000 to me.
00:40:02.420 Now,
00:40:03.080 if you had
00:40:03.600 an app
00:40:04.060 instead,
00:40:05.480 could I
00:40:05.880 not make
00:40:06.360 the app
00:40:07.000 act and
00:40:09.740 talk like
00:40:11.540 the person
00:40:11.920 using it?
00:40:13.320 So,
00:40:13.880 for example,
00:40:14.400 if you sign
00:40:15.080 on the app
00:40:15.620 and the app
00:40:16.280 detects that
00:40:17.420 you are
00:40:18.820 Indian
00:40:19.260 American,
00:40:20.740 maybe it
00:40:21.920 modifies a
00:40:23.060 little bit
00:40:23.520 to match
00:40:24.680 your cultural
00:40:25.580 expectations,
00:40:27.080 then you're
00:40:27.420 like,
00:40:27.600 oh,
00:40:27.760 this is
00:40:28.020 like me,
00:40:28.820 it's like
00:40:29.120 talking to
00:40:29.460 myself.
00:40:30.660 If you're
00:40:31.340 a young
00:40:31.780 black kid,
00:40:32.900 maybe the
00:40:33.400 app modifies
00:40:34.360 itself in
00:40:35.180 some way,
00:40:35.740 chooses
00:40:36.040 different
00:40:36.440 language,
00:40:37.180 maybe it's
00:40:37.680 more casual
00:40:38.780 in some
00:40:39.640 ways that
00:40:40.060 are culturally
00:40:40.620 appropriate,
00:40:41.820 and you say
00:40:42.480 to yourself,
00:40:43.040 whoa,
00:40:43.320 this is not
00:40:43.860 some whitey
00:40:45.980 oppressor
00:40:46.820 telling me
00:40:47.260 what to do
00:40:47.720 that I
00:40:48.060 reject automatically,
00:40:49.160 this is
00:40:49.800 somebody who
00:40:50.200 sounds like
00:40:50.560 me.
00:40:52.000 I mean,
00:40:52.400 I don't
00:40:52.680 have a
00:40:53.100 father,
00:40:53.360 but if
00:40:53.880 I had
00:40:54.140 one,
00:40:54.400 it would
00:40:54.540 probably
00:40:54.720 sound just
00:40:55.140 like this
00:40:55.600 AI.
00:40:57.060 And then
00:40:57.600 suddenly,
00:40:58.980 you're saying
00:40:59.460 to yourself,
00:41:00.200 well,
00:41:00.340 I can copy
00:41:00.920 this AI.
00:41:02.120 It's telling
00:41:02.720 me exactly
00:41:03.240 what to do
00:41:03.780 and what I
00:41:04.260 did wrong.
00:41:05.560 Yeah,
00:41:06.080 no,
00:41:06.340 I'm not
00:41:06.580 going all
00:41:08.080 the way
00:41:08.280 to Ebonics,
00:41:09.360 but I see
00:41:09.880 what you're
00:41:10.160 saying there.
00:41:12.260 And so,
00:41:13.080 I believe
00:41:13.600 that if
00:41:14.540 you just
00:41:15.000 had some
00:41:15.700 dedicated
00:41:17.320 AIs
00:41:18.100 that were
00:41:20.040 trained on
00:41:21.040 the best
00:41:21.760 success
00:41:22.720 strategy
00:41:23.280 books
00:41:23.720 that you
00:41:25.340 could give
00:41:26.300 everybody
00:41:26.840 a path
00:41:27.460 out.
00:41:28.600 The only
00:41:29.460 other thing
00:41:29.940 you need
00:41:30.380 is access
00:41:31.340 to mentors.
00:41:33.160 Maybe it
00:41:33.580 could help
00:41:33.900 on that
00:41:34.220 too.
00:41:35.900 So I
00:41:36.220 think you
00:41:36.500 need mentors
00:41:37.120 that are
00:41:37.420 not AI
00:41:37.920 as well.
00:41:39.000 Yeah,
00:41:39.180 it'd be like
00:41:39.480 an AI
00:41:39.860 dad for
00:41:40.400 people who
00:41:40.780 didn't have
00:41:41.000 a dad.
00:41:43.820 All right.
00:41:45.620 Forbes,
00:41:47.040 which is
00:41:48.140 not a
00:41:48.600 publication
00:41:49.060 that anybody
00:41:49.580 should trust
00:41:50.140 anymore,
00:41:50.960 based on
00:41:51.400 my recent
00:41:51.880 observation
00:41:54.940 as they're
00:41:55.740 no longer,
00:41:56.540 if they
00:41:56.900 ever were,
00:41:57.740 a
00:41:58.420 credible
00:41:58.760 source of
00:41:59.380 news,
00:41:59.760 for sure.
00:42:00.800 Anyway,
00:42:02.220 so one
00:42:04.400 of the
00:42:04.540 big problems
00:42:04.940 with TikTok
00:42:05.400 is apparently
00:42:06.280 it's easy
00:42:07.820 for anybody
00:42:08.360 in the
00:42:08.860 TikTok
00:42:09.220 organization
00:42:10.040 to find
00:42:11.500 out who
00:42:12.300 follows who
00:42:13.320 and who
00:42:13.880 their closest
00:42:14.440 contacts are.
00:42:15.420 And from
00:42:16.740 that,
00:42:17.980 presumably
00:42:18.420 you could
00:42:19.020 do a lot
00:42:19.380 of mischief
00:42:19.840 if you
00:42:20.220 were an
00:42:20.600 adversary,
00:42:21.440 because you
00:42:21.880 know who
00:42:22.240 knows who,
00:42:23.260 which tells
00:42:23.760 you a whole
00:42:24.480 lot about
00:42:24.960 them,
00:42:25.720 because who
00:42:26.200 you know
00:42:26.700 is really
00:42:27.320 a good
00:42:29.220 description
00:42:29.640 of who
00:42:29.960 you are
00:42:30.340 as well.
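As a toy illustration of why that access is so revealing, here is a short sketch that ranks a user's closest contacts from a follow graph. The graph, the interaction counts, and the scoring weights are all invented; the point is only how little data it takes to map someone's relationships.

```python
# A toy sketch of ranking "closest contacts" from a follow graph.
# The graph and weights are invented for illustration only.
follows = {
    "alice": {"bob", "carol", "dave"},
    "bob": {"alice", "carol"},
    "carol": {"alice"},
    "dave": {"erin"},
}
# Hypothetical counts of direct interactions (comments, DMs, etc.).
interactions = {("alice", "bob"): 40, ("alice", "carol"): 12, ("alice", "dave"): 1}

def closeness(user: str, other: str) -> float:
    """Score a contact: mutual follows weigh heavily, interactions add more."""
    mutual = other in follows.get(user, set()) and user in follows.get(other, set())
    score = 10.0 if mutual else 0.0
    score += interactions.get((user, other), 0) + interactions.get((other, user), 0)
    return score

ranked = sorted(follows["alice"], key=lambda o: closeness("alice", o), reverse=True)
print(ranked)  # ['bob', 'carol', 'dave'] -- an instant map of who matters to alice
```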
00:42:31.720 And there was a big old article about how that's bad.
00:42:35.660 Nowhere did it mention the primary risk of TikTok: that it's literally the human interface for brainwashing Americans.
00:42:45.420 It's a brainwashing app that is in the middle of obviously brainwashing us, and it's obvious because they don't even let it run in China, because it's not safe.
00:42:58.400 But yeah, we can use it all day long, and we are.
00:43:01.160 So we know this thing is a brainwashing app, and here Forbes does a major article in which they don't even mention the brainwashing risk, only the data security.
00:43:11.420 Are you starting to see the pattern? The pattern's pretty clear, isn't it?
00:43:16.360 Every time there's a major publication that talks about TikTok, they will ignore the big risk, which clearly they know is the big risk, and they'll talk about the minor risk.
00:43:26.620 So when the politicians say, well, it's a data security problem, but we think we can take care of the data security problem, the public will say, okay, I don't know much about that story, but if the only problem is data security, and you keep that data in America, in an American database, maybe that's okay.
00:43:48.560 But if they had ever told you, all right, the data security is something to worry about, but it's 10% of the problem. 90% of the problem is that children are learning not to get married and have children, and maybe their gender is wrong.
00:44:01.680 That's 90%, because they can actually program Americans not to thrive, which is what they do.
00:44:08.400 So it's like TikTok is essentially a kill switch on America that China has already pressed.
00:44:17.200 They already pressed it.
00:44:19.240 The kill shot is that if we raise a generation on TikTok, there's no fucking way we're going to be a superpower in 50 years. No fucking way.
00:44:27.800 So the kill switch has already been pressed. We're going to have to unkill ourselves somehow, because we're dead. We're fucking dead.
00:44:37.700 And it seems that our media is suppressed in some way, that they can't write the actual story.
00:44:46.140 I assume it's our intelligence organizations. What else would it be?
00:44:50.720 I assume that our intelligence organizations want TikTok to stay.
00:44:55.340 Could be because they use it for catching bad guys. Maybe.
00:44:59.500 Could be they use it to catch Republicans.
00:45:03.200 Yeah, you didn't see that comment, did you? Yeah.
00:45:06.520 Social media might be one of the ways they round up Republicans, because they can identify them and who they talk to.
00:45:14.060 Yes, and they can use that CISA group to influence the rest of the media, and that's a pretty ugly picture.
00:45:25.440 So, all right. Here's a conversation you should be following on the X platform.
00:45:33.060 Nate Silver, who is famous as a statistician, one of the most well-known types, and I've said before, I won't go through the whole description, that he's very credible, in my opinion, even if sometimes wrong, because remember, it's statistics.
00:45:50.340 Statistics is not about being right. It's about where things are likely to go based on what you know.
00:45:56.760 And so I would say he is one of the most useful public figures, because he's willing to go where the numbers go, right or wrong, left or right. He seems to be a reasonably independent thinker.
00:46:09.940 Well, he got into a conversation with somebody who is also very credible, Martin Kulldorff, I think.
00:46:18.260 So, Harvard guy, part of the Great Barrington Declaration, highly credible.
00:46:23.640 Now, none of this will be my claims, okay? I'm just telling you what Nate Silver believes based on the data.
00:46:33.360 He believed that you could tell that the red states did worse after vaccinations rolled out, meaning that when vaccinations rolled out, the blue states had a much greater reduction in death and the red states had more death.
00:46:50.380 And then critics, including me, pointed out, hey, hey, hey, there's a difference in age, right? If you don't factor in the age difference of those states, you don't get the right result.
00:47:04.740 Well, Nate Silver, being sensitive to that question because it's actually a good question, respected the question and the people who asked it, so he answered it. And the answer is surprising.
00:47:18.820 There's not that much age difference by state, at least relative to Republican or Democrat. It's actually not that big a difference.
00:47:28.560 So there are some states, such as Florida, where it's Republican and there are lots of old people, but there are also states like Maryland, I think Maryland is one, where things are the opposite of what you'd expect.
00:47:44.320 So in other words, if you look at the whole country, the idea that the Republicans have the old states doesn't really hold up, but there are some that are like that. And if you look at the average, it goes away a little bit.
00:47:56.620 However, I know some of you are saying, but, but, but, we should still run those numbers.
00:48:04.860 So Nate did. He ran the numbers with age being factored in and got the same result.
00:48:12.120 So pretty much as he told you, it wouldn't change the result that much, but out of respect to the critics who were genuinely interested, he ran the numbers, and it came out the same.
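For anyone curious what "factoring in age" means mechanically, here is a minimal sketch of direct age standardization in Python. The age bands, rates, and populations are invented for illustration and are not Nate Silver's actual data; the technique, weighting each age band's death rate by a shared standard population, is the standard way to make states with different age mixes comparable.

```python
# Hypothetical example of direct age standardization.
# All numbers are invented for illustration; they are not real state data.

# Deaths per 100,000 within each age band, for two hypothetical states.
rates = {
    "state_a": {"0-49": 20, "50-69": 150, "70+": 900},
    "state_b": {"0-49": 30, "50-69": 180, "70+": 950},
}

# A common "standard" population shared by both states,
# so differences in age mix cancel out of the comparison.
standard_pop = {"0-49": 600_000, "50-69": 250_000, "70+": 150_000}

def age_adjusted_rate(state_rates, std_pop):
    """Weight each age band's rate by the standard population share."""
    total = sum(std_pop.values())
    return sum(
        state_rates[band] * (count / total)
        for band, count in std_pop.items()
    )

for state, state_rates in rates.items():
    print(state, round(age_adjusted_rate(state_rates, standard_pop), 1))
# If state_b still shows the higher adjusted rate, the gap is not
# explained by age mix alone -- which is the kind of check described above.
```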
00:48:26.000 Now, I will tell you that I just saw a counter-analysis by Anatoly, one of my favorite critics, and his take was that you can't see a signal for the vaccinations working.
00:48:42.180 I think that's the bottom line. So just know that there are smart people who say the data shows no difference.
00:48:52.380 I don't know if that's true or false. I'm not making any claims of my own here.
00:48:56.040 But when Nate Silver ran it, it showed that when vaccinations rolled out, the people who were most likely to get them did better than the people who didn't get them, even when you adjust by age.
00:49:07.600 Do you believe that?
00:49:14.060 And even if this were true, would it therefore prove anything in particular? Would it prove that the vaccinations worked?
00:49:24.960 No.
00:49:26.400 Because there's something I'd be concerned by. Here's how it could be misleading.
00:49:33.160 Have we not heard, correct me if I'm wrong, but I thought there was at least one study, and of course all the studies are non-credible, but I think one study said your worst situation is if you were vaccinated and you've also had the COVID.
00:49:48.300 Is that true? If you had the COVID and you were vaccinated, you were in the worst possible situation.
00:49:55.620 I think one study said that. So I wouldn't claim that as being true. It's just something one study said.
00:50:01.900 I think it was worse off in the sense of getting it another time. You were more likely to get it again if you had both had it and been vaccinated. Something like that.
00:50:12.580 So there are a whole bunch of variables, and I don't know how you would sort them out, because, first of all, how do you know what happens in the long run?
00:50:26.740 Could things start separating in the long run, which would suggest that either long COVID or the vaccinations were the problem in the long run? You wouldn't know.
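A quick, made-up arithmetic sketch of why unsorted-out variables matter: when something like prior infection is distributed unevenly between two groups, the crude comparison can point the opposite way from every within-group comparison (Simpson's paradox). These numbers are invented purely to show the mechanic, not real COVID data.

```python
# A toy illustration of Simpson's paradox. All numbers are invented.

# (deaths, population) split by a lurking variable such as prior infection.
group_a = {"high_risk": (95, 1_000), "low_risk": (27, 9_000)}
group_b = {"high_risk": (810, 9_000), "low_risk": (2, 1_000)}

for name, group in [("A", group_a), ("B", group_b)]:
    deaths = sum(d for d, _ in group.values())
    pop = sum(p for _, p in group.values())
    print(f"{name} crude rate: {deaths / pop:.4f}")
    for stratum, (d, p) in group.items():
        print(f"  {stratum}: {d / p:.4f}")

# Output: A looks far better overall (0.0122 vs 0.0812), yet B has the
# lower rate in *both* strata -- A's population is simply mostly low-risk.
```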
00:50:36.700 Anyway, here's what I think is interesting.
00:50:39.520 If you've been in your Republican bubble, would you agree that those of us in the Republican bubble, and I spend most of my time there, never see any information that would suggest the vaccinations worked for anyone at any time? Would you agree with that statement?
00:50:58.440 If you're in the Republican bubble, you've never seen anybody credible with any data that says the vaccinations actually could have worked for anybody at any specific time.
00:51:11.580 Yeah.
00:51:12.360 But as soon as you walk into the other bubble, here's Nate Silver, whom, again, I consider completely credible.
00:51:22.380 Now, I'm not saying that his analysis is right, because that's not how it works. Being credible is not the same as being right.
00:51:29.020 It just means that he's probably not lying, and he's probably looking at the best sources he could find, and it's all consistent in one direction.
00:51:38.300 Doesn't mean it's true, but that's what he's saying.
00:51:42.380 So this is more like a window into the other bubble.
00:51:45.700 You know, I'm not trying to convince you that anything is true or not, because I don't think any of the data is useful. I just don't believe any of it.
00:51:54.140 So if you are believing data, just know that there's different data in a different bubble. That's what you need to know.
00:52:05.520 All right, let's talk about Ukraine.
00:52:08.380 Did I see a suggestion that Britain might put some troops on the ground in Ukraine? In a training role, but only for training, right?
00:52:19.700 Consultants, right? Consultants, which is how you introduce troops.
00:52:24.000 Oh, they're just there for training. Yeah.
00:52:26.900 You've got a lot of troops in there for training. Well, they need a lot of training, but they're all armed. Of course, they're in the military, but they're trainers.
00:52:34.580 Yeah.
00:52:34.760 So, do you think America has any Americans on the ground in Ukraine? Do you think we have any troops on the ground?
00:52:45.200 Of course we do. How could we not?
00:52:49.900 If you have American equipment over there, probably. Probably.
00:52:58.500 Yeah, I feel like there may be people in Ukrainian uniforms. But I've got a feeling we've got some people over there. That should be the way it works.
00:53:11.600 All right. Is there any story that I've missed? Anything you're desperate to talk about but I forgot to mention?
00:53:22.240 I think it was Elon Musk who said the other day that you don't need to watch the news because all the news is on X. And I agree with that. It's the only place you can see any context.
00:53:34.760 The Nate Silver context you will not see in any Republican publication.
00:53:42.160 Do you think any of your publications would show his analysis? Do you think Fox News or anybody else would?
00:53:53.120 Probably not.
00:53:54.760 Now, I don't know if he's right. I have no idea.
00:53:59.020 So one of the weirdest stories I heard about the Ukraine war is that, as you know, the Wagner Group was taking prisoners out of jail and sending them into the meat grinder.
00:54:13.400 Now, this story is horrible, but I can't get it out of my head, so I'm going to force it on you.
00:54:21.260 One of the prisoners that was taken out of jail had been convicted of murdering his girlfriend or wife, I think it was a girlfriend, and disposing of the body by putting it through a meat grinder. Like, a real story.
00:54:36.680 And so, in order to be freed from jail, all he had to agree to do was go to what the analysts are calling a meat grinder, you know, being on the front lines of the war in Ukraine.
00:54:51.060 So he got convicted of putting his girlfriend through a meat grinder, and then he got sent to a meat grinder to get out of jail, and he lived, and he served his time, and now he's a free man, because karma sent him to a meat grinder for putting his girlfriend through a meat grinder, and he lived.
00:55:09.920 So now he's a free man. Meat grinder, meat grinder.
00:55:13.860 I mean, it's terrible, but it's weird.
00:55:18.720 All right. How do we stop World War III?
00:55:24.380 Well, the first thing you need to know is that whatever we know about what's happening with Russia and Ukraine is not what the people in charge know.
00:55:33.600 So I don't believe we have useful information. We're just guessing from tea leaves and scat on the ground.
00:55:41.380 But the experts probably know who does what, and who knows what, and what the spies are saying, and who's threatened what, and who's capable, and who's incompetent, and who's sick, and who's crazy.
00:55:53.240 So the level of what they know about the world is so different that predicting is ridiculous, because we're not using any real information. We're using information that's been given to us, which is intentionally fake information.
00:56:07.840 So I do see it escalating, but here's my take.
00:56:15.800 If you assume it's an economic war, which is what I assume, and if you assume the people waging it want to sell weapons and get control of Russia's energy, or get them out of the energy business so we have the market, then it's economic in the sense of selling weapons and protecting our energy industry.
00:56:40.320 So, under those conditions, if that's who's really in charge, the people who are just trying to make money, would they allow a nuclear war, like an actual World War III?
00:56:53.600 And I think the answer is no, because that would be uneconomical.
00:56:58.540 What's economical is a forever war in Afghanistan, a forever war in Iraq, a forever war in Ukraine that always goes up to the level of using heavy equipment, but not to the level of using anything nuclear, because that wouldn't make sense.
00:57:13.700 Yeah, Syria, maybe, Lebanon, maybe, I don't know. Not Lebanon. Never mind.
00:57:23.580 Why am I blanking on Gaddafi's country? Gaddafi was Libya, yeah. Not Lebanon, Libya.
00:57:39.840 Yeah, growing the empire is economical. Growing the empire is economical, that's true, but the Americans are not trying to grow the empire, except for controlling the energy resources, I suppose.
00:57:52.880 So, here's my prediction. The people who have the most control over what happens in Ukraine do not want it to become a World War III.
00:58:06.200 They want it to be largely the way it is, and a little bit more of the way it is. That would be the bet.
00:58:12.920 If you're going to follow the money, that's what the money suggests.
00:58:15.960 If there were money in a nuclear war or a World War III, we would absolutely have it. But I don't think anybody sees a way to make money doing that. It doesn't make sense.
00:58:30.820 Yeah, I think we want another 20-year war. That's right. I think every war has to be 20 years now.
00:58:41.960 We can treat our reality as subjective and get good outcomes. Well, let's try that.
00:58:46.840 Afghanistan was an occupation, not a war. Yeah, I guess so.
00:58:54.380 All right, if you haven't seen the clips coming out from my conversation with Megyn Kelly, there are quite a few clips, and I'll recommend them to you because people say it was awesome.
00:59:07.700 You can just Google my name and her name. It'll pop right up.
00:59:10.100 A lot of people, even in the comments, are saying it's a great interview.
00:59:19.840 I don't know why it was great, honestly, because I have this thing called interview amnesia.
00:59:26.740 The moment I'm done with an interview, even if it's like an hour and a half, which is typical, I don't remember a thing.
00:59:33.940 I have complete amnesia, and then somebody will say, oh, that was a good one, and I'll have to think, really? Like, I don't know. I don't remember it.
00:59:40.860 I'll tell you, yeah, I remember the person, and I remember how I felt, and I might remember sort of generally what topics were covered, but I don't remember specific things I said, so I don't have the same kind of memory you have when you watch it.
01:00:00.200 Somebody's saying I had no pauses before answering questions. Well, it probably helps that I do this every day.
01:00:08.260 And I think that what makes me like a podcast or an interview is how casual the people talking are. To me, that tends to be the most predictive part.
01:00:23.940 If somebody's just a talking head, and they've got to sell something to their corporate bosses, and they're like, well, here's my prepared speech, which I've said seven times already, and let me say this. Yeah, I'm bored.
01:00:37.780 But if you see Joe Rogan, for example, when Joe Rogan is interviewing people, it's more like a conversation.
01:00:45.420 So if you tune in and you're like in the middle of a conversation between a couple of smart people, that'll just glue you right there.
01:00:52.880 But as soon as one of them is just like a suited puppet, you can tell in a heartbeat.
01:01:06.360 You said Ukraine would win? No, I didn't. What does win mean? You mean win as in conquer Russia? Pretty sure I didn't say that.
01:01:19.600 No, I said Ukraine would not be conquered, which it has not been. So, one of my best predictions.
01:01:30.740 But no, I don't know what win would look like. Win is, I think, your own word. I'm just saying that they would be difficult to conquer, which they were.
01:01:39.480 Did you know about Dr. Shiva's lawsuit that showed a backdoor? I mean, I don't know if that's still relevant under the Musk domain.
01:01:58.600 Wagner did not recruit murderers? Well, that one guy thinks he's a murderer, and he thinks they recruited him.
01:02:09.480 Yeah, at times it looked like they were winning, but there was no way to know, right?
01:02:22.980 Do you think Russell Brand will go harder at the bad guys now that he's taken heat? I feel like he might. It might be better to go through it than around it.
01:02:35.780 You purchased it, the first book you ever bought? Well, I hope you read it.
01:02:55.180 Best tactic is to go harder? Maybe so.
01:03:02.000 J6 discussion? Oh, yeah.
01:03:04.700 So, I tried to go on the Spaces feature on X yesterday, and I was trying to encourage anybody who believed that January 6th was an actual insurrection to help me understand how trespassing eventually leads to controlling our nuclear arsenal. Like, how do you connect those dots?
01:03:29.340 And what I found was that I couldn't get anybody to even engage in the conversation. I got one guy who came on and just yelled at me for being a bad person. That was it.
01:03:40.640 I could not get anybody to even talk about the concept unless they were already agreeing with me, in which case there was no point to it.
01:03:48.900 So, you can see why the Democrats need to get rid of free speech. That one question, of how trespassing gets you control of the nuclear arsenal, is the end of the entire hoax.
01:04:10.560 So far, there's always a tentpole hoax for every election cycle. The tentpole hoax that makes all the other hoaxes look believable is that January 6th was an insurrection, and that's what Trump intended.
01:04:27.200 Now, let's say he gets elected and serves another four years, and then at the end of the four years he retires, because that's what people do.
01:04:37.860 So, what are the Democrats going to say if he just quietly retires after eight years, like Americans do? Are they going to say, well, we were probably wrong before, yeah, he's not trying to be a dictator? Or are they going to say, well, he was definitely trying to be a dictator before, but I guess he decided it didn't work, so he's not going to do it?
01:05:00.880 All right. Biden-Fetterman for 2024.
01:05:12.240 So, what I found was that on X, I apparently have close to zero real Democrat exposure. I have only people who appear to be just trolls, like they don't seem to be even engaged in real topics.
01:05:27.340 So, how did that happen? I feel like it's worse.
01:05:33.120 My experience on X is much better, because I don't have all the criticism, but I'm completely cut off from anybody in half of the country. They have no exposure to my tweets, even accidentally, I think, because they certainly don't comment on them.
01:05:50.360 And even when I do the Spaces, which I thought would be opening it up to more people, I feel like only my followers even see the Spaces being advertised.
01:06:05.900 Now, there's a Twitch streamer. No.
01:06:09.920 Everybody bails out when they think they're going to talk to somebody who, you know, has a good argument.
01:06:15.180 I don't know if I'm shadow banned, or if the algorithm just only delivers me to people who might like me, which ends up being very similar.
01:06:31.000 Does Elon following me matter? I don't know.
01:06:33.160 I don't know.
01:06:33.960 All right.
01:06:48.920 Are you still left to Bernie?
01:06:53.700 That's no fun, that question.
01:06:57.420 All right.
01:06:58.080 All right.
01:06:58.120 That's all I got for today.
01:07:00.760 I'm going to go do something else.
01:07:03.680 And, oh, well, I'm glad you like me.
01:07:07.060 I'm sure I like you back.
01:07:09.220 You're awesome.
01:07:10.360 And I think all of you are going to
01:07:11.560 have a wonderful time today.
01:07:12.800 If you haven't already seen my book,
01:07:14.260 Reframe Your Brain, it's the best book
01:07:15.800 in the world.
01:07:17.240 Literally the most improvement you could
01:07:19.640 have in your life with the least amount
01:07:21.060 of work.
01:07:22.800 That's the way I like it.
01:07:24.160 All right.
01:07:24.620 That's all for now.
01:07:25.280 I'll talk to you later, YouTubers.