Real Coffee with Scott Adams - August 29, 2023


Episode 2215 Scott Adams: Persuasion Grades For DeSantis & Vivek. Lots Of Fake News. Bring Coffee


Episode Stats

Length

1 hour and 29 minutes

Words per Minute

146.3

Word Count

13,030

Sentence Count

1,051

Misogynist Sentences

6

Hate Speech Sentences

9


Summary

How would you like me to solve the problem of you not trusting ballot boxes in the upcoming election? Would you like to see me solve it? All you have to do is monetize the process, and I'll show you how.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:17.140 It's called Coffee with Scott Adams, and I'm pretty sure you've never had a better time.
00:00:22.300 I don't care what your spouse says.
00:00:24.100 And if you'd like to take this experience up to a level where the dopamine and the oxytocin flow freely,
00:00:34.600 well, all you need is a cup or a mug or a glass, a tank or chalice or stein, a canteen, jug or flask, a vessel of any kind.
00:00:41.460 Fill it with your favorite liquid.
00:00:43.680 I like my coffee.
00:00:45.540 And join me now for the unparalleled pleasure.
00:00:48.660 The dopamine hit of the day, the thing that makes everything better.
00:00:50.620 It's called the Simultaneous Sip, and it happens now.
00:00:54.620 Go.
00:00:58.680 Yeah, that's good.
00:01:00.580 That's good.
00:01:02.760 Well, today is going to be an extra good, amazing show.
00:01:06.820 How would you like me to completely solve the problem of you not trusting ballot boxes in this coming election?
00:01:16.220 Would you like to see me solve it?
00:01:18.760 All right.
00:01:19.160 Here we go.
00:01:20.620 All you have to do is monetize the process.
00:01:26.860 So if I were to say to you, you know what?
00:01:30.000 I'm a big news organization, and I'll offer $10,000.
00:01:35.000 I'm not offering this.
00:01:36.240 I'm just suggesting this could be an idea.
00:01:38.140 Let's say somebody offers $10,000 for a video of something that is confirmed to be illegal and makes a difference, you know, big enough to make a difference for a drop box.
00:01:52.180 So you can say to people, you know, if you get lucky and you get a video of the same person coming back several times.
00:02:00.740 Let's say you're a store owner and you've got a store window that's across from a ballot box.
00:02:07.300 Maybe you put up a trail cam, you know, put up some video security, stick it over there and just see if you get anything.
00:02:16.180 So it'd be a little like mining for a Bitcoin.
00:02:20.800 You know, you don't know if you're going to find a Bitcoin, but it's kind of fun to mine for it just in case you get a hit.
00:02:26.740 So could you, could you monetize the capture of, let's say, any voting irregularities?
00:02:37.000 And then if you did a good job and no irregularities were found, wouldn't you feel more comfortable?
00:02:44.900 Wouldn't you feel better if you'd monetized it to the point where you're pretty sure people are watching?
00:02:49.820 Now, one important point, very important, you can't have humans hanging around the drop box because that would be intimidation.
00:02:59.080 So no intimidation, not even accidental intimidation.
00:03:03.080 Don't be hanging around the drop boxes, the ballot drop boxes.
00:03:06.980 But if you can put a camera up in a legal, public way, you might be able to monetize that stuff.
00:03:15.080 Think about it.
00:03:16.120 All right.
00:03:17.560 Fake news update.
00:03:21.180 I like to keep you up to date on all the fakeness in the fake news.
00:03:26.940 Well, here's one that should bother you.
00:03:28.680 Did you know that the Associated Press, the AP, which is a source of much of the news that other entities report,
00:03:37.340 or they report about what the AP reports,
00:03:39.760 but apparently last year they announced that they were, in order to help fund the organization,
00:03:48.880 they were going to find these strategic partners,
00:03:52.360 and they'd have these partnerships to subsidize reporters.
00:03:55.780 So the reporters would get some extra money from these partnerships.
00:03:59.320 I wonder what kind of partnerships, and what kind of political organizations fund AP?
00:04:09.180 Well, let's see.
00:04:10.360 We've got the Ida B. Wells Society, and that was founded by the 1619 Project writer.
00:04:21.720 What's her name?
00:04:24.780 Nikole Hannah-Jones.
00:04:27.740 And she teamed up with filmmaker Steven Spielberg's Hearthland Foundation
00:04:33.820 so that they could foster, quote, more inclusive storytelling.
00:04:39.020 So there you go.
00:04:42.460 So that would be an example.
00:04:44.440 Let me list, you know, that, obviously you're going to say,
00:04:47.380 oh, that's a left-leaning organization funding them, so it's going to have influence.
00:04:52.160 But that's not a complete story, right?
00:04:55.800 If I only told you about the left-leaning people funding the AP,
00:04:59.840 you'd probably say to yourself, well, you're leaving something out, right?
00:05:04.180 Like, how many right-leaning organizations fund the AP, for example?
00:05:09.020 I completely left that out of the story.
00:05:11.740 So let me read the complete list of right-leaning organizations funding the AP.
00:05:20.480 Well, that's the complete list.
00:05:22.220 There was nothing on that list.
00:05:24.300 But at least I'm now complete.
00:05:26.380 I've told both sides.
00:05:27.900 Because sometimes you tell a story like this, it's all biased.
00:05:30.860 You're, like, only telling one side.
00:05:32.800 But I want to be complete.
00:05:33.840 That is the complete list of right-leaning funders for the AP.
00:05:40.440 So when you see news from the AP, what do you think you're seeing?
00:05:46.380 Do you think you're seeing news?
00:05:48.340 Or do you think you're seeing some writing by people, and who knows who's paying them?
00:05:54.080 Hmm.
00:05:54.680 You decide.
00:05:55.220 Well, I like that the Trump trial date has been set by Judge Chutkan, I think that's the pronunciation,
00:06:06.320 to be right about the time of Super Tuesday.
00:06:12.540 One day before Super Tuesday.
00:06:15.000 Yep.
00:06:15.580 And that's totally legitimate.
00:06:16.960 And there were no political considerations whatsoever.
00:06:19.980 No, that's not true.
00:06:23.580 Let me tell you what's true.
00:06:25.560 We live in a zero-trust environment.
00:06:29.740 But we still should treat individuals, people like you, people like me, as innocent until proven guilty.
00:06:36.620 That's a good standard.
00:06:37.860 But when you see any part of the government or the judiciary do something that looks sketchy,
00:06:45.400 what should be your working assumption?
00:06:49.500 Innocent until proven guilty?
00:06:52.360 No.
00:06:53.720 That would be insane.
00:06:55.780 No, your working assumption should be that they're guilty.
00:06:59.080 That should be your starting assumption.
00:07:00.860 That doesn't mean it's true.
00:07:02.840 It also doesn't mean it's true that an individual is actually, literally innocent until proven guilty.
00:07:08.940 It's just that we have to treat it that way because that's a better system.
00:07:12.060 But when it comes to the official people, the people that we know are corrupt in a sort of a general, usual way,
00:07:21.380 to assume that anything they do is credible and real and for the right reasons is not really a good working assumption.
00:07:29.900 In 2023, you have to assume that these things are political.
00:07:33.400 You have to assume it's exactly what it looks like.
00:07:36.720 Somebody chose the least good date for Trump.
00:07:41.020 Now, that doesn't mean it's true.
00:07:43.980 If you asked me to prove it, I'd say, well, how do I prove what somebody's thinking?
00:07:48.640 Can't prove that.
00:07:50.140 I'm just saying that if you can't prove it, and they have all those, you know, let's say, mysterious intentions,
00:07:58.400 if they can't prove it with transparency, then your best operating assumption is that it's exactly as crooked as it looks,
00:08:06.140 even if it's not.
00:08:07.580 It's the right assumption.
00:08:08.460 All right.
00:08:10.500 But I like the fact that they're now so obvious about it.
00:08:14.800 You know, if you had any doubts about this being a political process to take Trump off the board,
00:08:20.240 there's no more doubt about it, right?
00:08:22.800 You know, once you get this obvious.
00:08:25.620 But how would you like a little wake-up call?
00:08:28.780 Anybody want a little wake-up call?
00:08:30.220 Well, I would like to recount conversations I've had in the past week with other citizens of the United States.
00:08:43.020 It goes like this.
00:08:45.240 So, did you hear that story about Vivek Ramaswamy?
00:08:49.860 And whoever I'm talking to will say, who?
00:08:54.080 Vivek?
00:08:54.940 Vivek Ramaswamy, running for president?
00:08:58.500 Who?
00:09:00.080 Seriously.
00:09:01.340 You've never heard of Vivek Ramaswamy?
00:09:04.340 Vivek?
00:09:05.940 No.
00:09:07.060 Nope.
00:09:07.720 Nope.
00:09:07.900 And then I'll say, how about Prigozhin?
00:09:14.340 Have you heard of this guy Prigozhin?
00:09:16.920 What?
00:09:18.240 Prigozhin?
00:09:19.860 Who's he?
00:09:21.040 Died in a plane crash?
00:09:22.700 Head of Wagner?
00:09:24.420 Not ringing a bell.
00:09:26.160 Wagner?
00:09:26.620 What's Wagner?
00:09:27.080 Do you think, how much of the country do you think even knows that the Trump trial date is on Super Tuesday?
00:09:38.820 What percentage of the country knows that?
00:09:42.140 Less than 1%.
00:09:43.400 Right?
00:09:45.120 Yeah, we're all weirdos.
00:09:46.840 If you're watching this, you're in the weirdo of the weirdo of the weirdo situation.
00:09:51.000 You're like a double weirdo.
00:09:52.640 It's like a weirdo if you're watching the news at all, apparently.
00:09:57.080 Yeah, I think fewer than 10% of the country are watching the news.
00:10:00.820 But of the ones who watch the news, how many are watching at a level that they would understand about the Trump trial date
00:10:10.120 or even know which of the four indictments it refers to?
00:10:14.420 Nobody knows that.
00:10:16.280 I'm barely hanging on, and I do this every day.
00:10:20.080 Every day I'm checking the news.
00:10:22.920 Basically, I guess you could call it my job, if that's what this is.
00:10:27.080 And I'm just hanging on.
00:10:29.540 I can't keep the four indictments straight.
00:10:32.200 Can you?
00:10:34.260 If I said, quickly, quick, name all four indictments.
00:10:38.200 Can you get them?
00:10:39.800 All right.
00:10:40.120 If you could name all...
00:10:41.220 A lot of people said yes, and I believe you.
00:10:43.620 The people saying yes, just remember how unusual you are.
00:10:47.960 Very, very unusual.
00:10:51.340 No more than 1% of the country, no more than 1% could name all four indictments.
00:10:57.520 So how much are the voters going to take that into account?
00:11:01.660 So don't fall into your little bubble where you think anybody knows anything about politics.
00:11:07.980 They just don't.
00:11:08.800 The country has no idea what's going on.
00:11:12.380 But at least it's obvious to the ones paying attention.
00:11:16.300 I like...
00:11:16.780 The one thing I like about it is that we don't have to wonder if this system is rigged.
00:11:22.360 It's about as rigged as it could possibly be.
00:11:26.660 Newt Gingrich thinks that Biden is cognitively impaired,
00:11:31.360 and that who's really running the country is some combination of Obama and Clinton.
00:11:39.580 Or, you know, usually it's their people, but, you know, acting on behalf of the big powers.
00:11:45.020 True or false?
00:11:46.880 I don't know.
00:11:48.800 I forget who said this, but somebody said that if you could dig into the Clinton,
00:11:55.140 you know, the Clinton Global Initiative thing, it would be so dirty.
00:12:00.140 Was this Newt?
00:12:01.880 Maybe it was Newt.
00:12:02.900 Somebody said it.
00:12:04.500 That if you could actually dig in and see what they did,
00:12:06.780 it would make the Biden crime family look like, you know, littering.
00:12:12.500 It would be so obvious and so big.
00:12:15.600 I believe that.
00:12:17.040 I believe it because it's a big entity with not enough transparency,
00:12:22.220 and they are guilty until proven innocent.
00:12:26.000 All right.
00:12:26.640 Are you following the account on X called Trump History?
00:12:35.420 And once a day, or maybe more than once a day,
00:12:38.920 they publish a fake parody picture created by AI that looks real,
00:12:45.180 but it's Trump in a variety of historical situations, you know, like he's inventing the light bulb,
00:12:51.520 and, you know, he's doing insanely funny things.
00:12:55.180 But the one that I just, I wish I could show it to you.
00:12:59.280 I don't have an extra screen.
00:13:00.560 But you have to go to my X thread, or what do you call it, the feed,
00:13:07.560 and you have to see the one that is titled,
00:13:11.340 Donald Trump tells a young Vivek Ramaswamy that he will choose him to be his VP in the 2024 election.
00:13:18.620 And you see this little Indian kid who's, like, six years old,
00:13:22.240 and you see Trump, like, leaning in, talking to him intently.
00:13:25.380 It's the funniest thing.
00:13:27.680 I was just thinking about it all morning.
00:13:30.000 It is just so funny that these AI pictures are great, these fake historical things.
00:13:37.900 But it's not just that they're fake and the pictures are good.
00:13:41.480 It's that this one was really chosen with comedic, it's just comedic perfection.
00:13:46.820 I don't know who's behind it.
00:13:48.620 All right, let's call this the spunky 25.
00:13:54.320 Do you know what I'm referring to?
00:13:56.840 The spunky 25?
00:14:00.280 The 25% of the country that gets every poll wrong.
00:14:05.340 Meaning if there's one really stupid answer for a poll,
00:14:09.280 25% of the country is going to be right on that point.
00:14:13.120 All right, here's an example.
00:14:14.600 Rasmussen poll.
00:14:15.500 65% of voters think the current situation at the border with the migrants is a crisis.
00:14:22.640 So 65% of the country says the border situation is a crisis.
00:14:28.760 But interestingly, how many could watch that situation and say it's not?
00:14:35.540 24%.
00:14:36.060 24%.
00:14:37.200 Yep.
00:14:38.120 24% of the country says that doesn't look like a problem to me.
00:14:42.280 And the findings haven't changed since May.
00:14:45.940 60% of the likely voters in the U.S.
00:14:51.800 think that the migrant crisis is more important to national security than supporting Ukraine.
00:14:59.620 30% say supporting Ukraine.
00:15:02.960 30%, well, it's not too far from 25%, say supporting Ukraine is more important than the border.
00:15:10.420 Now, I don't know if anybody's done this before, or if it would work.
00:15:15.480 I'll just run this by you as an idea.
00:15:17.500 If Trump were to reorient his campaign toward giving Americans what the polls say they want and just tell you that, would you have a problem with it?
00:15:32.440 Suppose you said, look, 60% of you want the border to be taken care of, so I'm going to do that.
00:15:40.560 And then hold up another thing that says, all right, 70% of you say you want this, so I'm going to do that.
00:15:48.940 Now, it wouldn't be leadership.
00:15:51.100 It wouldn't be leadership.
00:15:52.680 It would be following the public.
00:15:55.180 But are there sometimes you should follow and sometimes you should lead?
00:15:59.780 Well, national defense is one of those things you should probably lead.
00:16:03.300 Would you agree?
00:16:05.520 Let me give you a reframe.
00:16:06.940 Your leader should lead during the fog of war.
00:16:13.820 Your leader should lead during the fog of war when you don't know what's what.
00:16:18.720 Somebody's got to make a decision.
00:16:20.200 Somebody's got to act fast.
00:16:22.820 Somebody's got to take the responsibility.
00:16:24.620 They're going to take the pain.
00:16:25.620 They're going to take the credit.
00:16:29.080 You've got to have a leader.
00:16:30.540 And that leader, if that leader is doing something you don't agree with, or even the majority of the country,
00:16:35.140 well, it's because you hired him, you hired him to do this, to make the fast decision before the public even knows what the situation is.
00:16:45.060 But here's my reframe.
00:16:47.420 Does that stay that way?
00:16:50.020 Once the public becomes informed and once the truth or the facts become hardened,
00:16:55.560 so we're kind of looking at the same situation, and then the public decides, well, you know what?
00:17:01.240 You know, I'm glad we had a leader to act fast, but now that we have better information, maybe we should pull back from that position.
00:17:08.180 At that point, once the public is reasonably informed, if they say, no bueno, we're not going to do this anymore,
00:17:17.280 do you think a good leader says, no, I still disagree with the majority,
00:17:22.460 or does a good leader say, thanks for trusting me when it was ambiguous, because somebody had to lead,
00:17:28.840 and now that we all see this situation, you know what?
00:17:31.600 I think I'd rather give the public what they want.
00:17:33.840 What's wrong with that?
00:17:37.780 What's wrong with being a leader when you need a leader, and being a, let's say, a populist,
00:17:44.000 when we have a better understanding of the situation?
00:17:46.720 There's nothing wrong with that.
00:17:48.060 Is that flip-flopping?
00:17:50.180 Sound like flip-flopping to you?
00:17:52.560 We'll talk about flip-flopping.
00:17:54.180 No, that would just be a reasonable person doing reasonable things.
00:17:57.140 So, as much as I think that you need your leadership when things are ambiguous,
00:18:04.080 I'm not sure things are as ambiguous as they were.
00:18:10.900 So, even Bernie Sanders is not giving his, let's say, unambiguous support to Biden.
00:18:18.600 He obviously prefers him over Trump or somebody.
00:18:21.020 But he's choosing his words carefully about Biden's age, trying to tiptoe around it without actually lying about it.
00:18:32.200 But he is kind of signaling his concern without ever saying anything of that nature.
00:18:36.880 It's just the way he words it.
00:18:38.200 You say, hmm, sounds like you're wording it in a way to protect him.
00:18:41.880 We get it.
00:18:42.460 And I guess Bernie expressed some bewilderment, that's the word being used, bewilderment,
00:18:52.660 that the Republicans have more support from working-class voters than Democrats do.
00:18:59.200 Are you, like, shocked that working-class people think the Republicans might have a better idea?
00:19:06.960 By a little bit.
00:19:07.980 It's not a gigantic difference.
00:19:09.360 But that they're more than competitive.
00:19:12.460 With the Democrats.
00:19:15.040 Now, I think Bernie's point would be, look at all these things Biden did for you.
00:19:21.520 And he would say, infrastructure plan, and, you know, I don't know.
00:19:26.640 What else?
00:19:28.220 I can't think of anything else.
00:19:30.440 What else did he do?
00:19:32.800 I'm not sure what.
00:19:34.340 The infrastructure plan?
00:19:35.600 I haven't really seen anything happen from that.
00:19:38.480 Yeah, the other things that he's doing are, you know, inflation.
00:19:41.900 Your gas prices are going up.
00:19:44.140 There's an unnecessary war.
00:19:45.920 The border's completely uncontrolled.
00:19:47.840 People coming for your jobs.
00:19:49.460 AI is going to destroy you.
00:19:50.820 There's no plan for that.
00:19:52.120 You know, I don't think it will, actually.
00:19:54.120 But why is that a mystery?
00:19:57.480 Are you bewildered?
00:19:58.920 Is anybody like, I don't know what's going on here?
00:20:01.240 I can't figure it out.
00:20:04.000 I saw Tucker Carlson in some interview say that he thought Trump would be the most consequential president of our lifetime.
00:20:12.300 And I completely agree with that.
00:20:14.580 But he pointed out that there were three, I hope I remember them, three Trump truths that now we just accept as true when they seemed a little crazy.
00:20:25.200 So he started with, got to lock up that border.
00:20:29.300 And I think even reasonable people said, you know, I get it that there are people coming across illegally, but there always have been.
00:20:37.280 And, you know, we're doing okay.
00:20:39.840 Right?
00:20:40.200 That wasn't crazy.
00:20:41.880 That was not my view, but it wasn't crazy to say, yeah, the border's not secure, but we're also doing fine.
00:20:48.760 You know, let's just leave them alone.
00:20:50.400 Maybe they get a better life out of it.
00:20:51.780 Right?
00:20:52.100 You can see that.
00:20:52.820 But at the moment, is anybody saying that?
00:20:56.480 I don't think anybody is.
00:20:58.520 Nobody who knows the actual situation, which is that it's not even just Central and South Americans coming over.
00:21:04.580 At this point, it's all Europeans and Asians and Africans coming in.
00:21:10.780 So, and they're coming in with the cartels, you know, huge business model.
00:21:16.520 They're making billions of dollars or whatever.
00:21:18.800 So, so Trump was clearly right about the border.
00:21:23.460 We should have secured it.
00:21:25.640 He was clearly right about China, you know, hollowing out the middle class and we had to, you know, get tougher with China.
00:21:32.460 And I, I would argue he was definitely right about energy.
00:21:37.040 But I think, I think Tucker had a third, he had a third example.
00:21:43.920 Forget what it was.
00:21:45.100 But there were three things that, when you heard them, you're like, you know, he was really right about the biggest, the biggest things.
00:21:51.600 It wasn't about the, it wasn't about the fake news, although he was right about that.
00:21:56.740 I mean, Trump is the one that taught us that the news was fake.
00:22:01.520 Think about that.
00:22:03.320 Think about the fact that Trump is the one who taught us the news was fake in a way that you didn't really understand before.
00:22:10.220 I mean, you always thought some of the news was fake.
00:22:12.340 But did you know it was all fake?
00:22:17.140 Basically, it was basically all fake.
00:22:19.820 Now, they might sometimes get one right, but I think it's a coincidence.
00:22:25.600 I don't think it's because they tried.
00:22:28.060 I think, I think if it's anything about politics or, you know, and that would include anything with science.
00:22:34.260 So, so science is all political now, right?
00:22:37.180 Because if science shows something, then you got to do something different politically.
00:22:40.440 So science is just politics at this point.
00:22:43.920 Am I wrong?
00:22:45.420 That science and politics merged?
00:22:48.380 I'm not wrong.
00:22:50.000 By the way, has anybody said that before?
00:22:52.320 I like that reframe.
00:22:54.900 Science is great, but what we now have is a hybrid of science plus politics.
00:23:01.060 If you add politics to science, you get shit.
00:23:05.260 Science by itself, pretty terrific in the long run.
00:23:09.180 In the short run, science is just a coin flip.
00:23:12.580 It's just a coin flip in the beginning.
00:23:15.660 And I mean even after you've done a study.
00:23:18.480 Because the number of studies that are peer-reviewed that end up later being not supportable is about half.
00:23:25.300 That's actually a coin flip.
00:23:27.660 So science starts as no more dependable than a coin flip, you know, in the early hypothesis stage.
00:23:33.340 Well, a hypothesis is lower than a coin flip.
00:23:35.460 You know, by the time you've done one published study, you're up to a coin flip, right?
00:23:41.640 So you start in, well, 10% chance maybe.
00:23:44.600 Worth a shot.
00:23:45.460 Let's study it.
00:23:46.680 Up to 50% if your study says yes.
00:23:49.760 But you're only 50%.
00:23:51.060 That's just a coin flip.
00:23:52.220 And then maybe after 30 years, lots of studies and lots of arguments, then we kind of solidify on something.
00:24:01.680 We say, yes, the Big Bang definitely happened just the way we said.
00:24:07.100 And then you wait 30 years and find some evidence that says the Big Bang couldn't have been what you said,
00:24:12.820 but you were happy for 30 years.
00:24:14.200 That's science.
00:24:14.740 It turns out it wasn't right after all, in some substantial ways.
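As an aside, the coin-flip arithmetic above can be sanity-checked with a quick Bayes calculation. Here is a minimal sketch in Python: the 10% prior comes from the monologue, while the 90% study power and 10% false-positive rate are assumptions picked for illustration, not figures from the episode.

    # Bayes' rule sketch: how one positive study moves a 10% prior to ~50%.
    # Power and false-positive numbers are illustrative assumptions.
    def posterior(prior, power, false_positive_rate):
        # P(true | positive study) = P(pos | true) * P(true) / P(pos)
        p_pos = power * prior + false_positive_rate * (1 - prior)
        return power * prior / p_pos

    # 10% prior, 90% power, 10% false-positive rate -> exactly 0.5, a coin flip.
    print(posterior(0.10, 0.90, 0.10))  # 0.5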
00:24:22.200 All right.
00:24:24.760 As I tweeted the other day and actually saw a number of agreements, which I wasn't expecting,
00:24:30.800 the trajectory of at least cities and certainly some other things in politics, certainly the border,
00:24:38.920 is that it looks like it couldn't get any worse, doesn't it?
00:24:42.580 It's like, you know, things are just going to hell.
00:24:46.000 So the border security could not possibly be worse than it is right now.
00:24:50.440 The cities are just lost at this point.
00:24:54.020 But here's the optimism.
00:24:56.860 You ready?
00:24:58.200 Here's the optimism.
00:24:59.800 It has to hit bottom because you're dealing with addicts.
00:25:05.040 Right?
00:25:05.420 The people who are supporting the current failed system are addicts.
00:25:09.160 I don't know what they're addicted to.
00:25:10.500 They're either addicted to maybe the public approval of doing woke, you know, liberal things.
00:25:19.140 It might be they're addicted to the feeling of being the person who's fighting the big power.
00:25:25.680 Maybe.
00:25:26.060 Maybe they're addicted to the power.
00:25:28.360 Maybe they're addicted to the money, the prestige.
00:25:31.320 Maybe they're addicted to supporting their team.
00:25:34.260 But whatever it is, it's not based on a reason.
00:25:37.100 You can't look at any of our cities and say, well, that's what we planned.
00:25:43.460 And it's working fine, so let's keep doing more of it.
00:25:46.360 You can't do that.
00:25:47.780 But here's the problem.
00:25:49.620 As long as the cities limp along and, you know, they still have traffic and some business and stuff,
00:25:57.240 maybe it just keeps going, you know, slightly worse every year.
00:26:00.480 Our best case scenario is that we hit bottom.
00:26:05.880 Do you know what hitting bottom would look like?
00:26:09.400 A Democrat saying, fuck it, I'm a Republican now.
00:26:13.120 That's what it looks like.
00:26:15.400 As long as there's still Democrats all the way down, you're not at the bottom.
00:26:19.520 You hit the bottom when somebody says, whoa, everything I thought was wrong, like an addict, right?
00:26:28.680 The drinker is thinking, well, I could quit, but I like it.
00:26:32.020 And then when you hit the bottom, they're like, okay, I quit because no choice now.
00:26:38.340 So we're not at the bottom.
00:26:40.220 But don't look at our rapid decline as necessarily a one-way street.
00:26:47.540 We have to hit bottom before it gets fixed.
00:26:50.660 So once the cities are a little bit more unlivable, maybe a lot more, then something will happen.
00:26:58.080 I don't know what will happen, but it'll be some correcting force.
00:27:01.300 But don't worry that the cities are getting worse unless you live in them.
00:27:04.640 And if you do live in a city, why?
00:27:08.440 Unless it's a Republican city that's running well, why would you do that?
00:27:16.020 Get out of those places.
00:27:17.720 Get the hell out.
00:27:18.500 Let it fail as quickly as possible.
00:27:21.500 Then you've got a chance.
00:27:24.820 All right, here's my section I call Biden Dementia Takes.
00:27:30.140 Biden Dementia Takes.
00:27:31.920 He said that the U.S. intelligence community has determined that domestic terrorism, rooted in white supremacy, is the greatest terrorist threat we face in the homeland.
00:27:47.120 So he got that from the U.S. intelligence community.
00:27:50.140 The U.S. intelligence community.
00:27:52.700 Huh.
00:27:53.860 The U.S. intelligence community.
00:27:55.660 Have they ever been wrong?
00:27:56.640 Well, let's see.
00:27:59.660 So the people who were sure that the Hunter laptop was Russian disinformation are also sure that our greatest threat is this white supremacy.
00:28:12.900 Well, it's almost like you're saying that the organization that we trust the least is saying something that on its surface sounds ridiculous.
00:28:21.480 How often has that been true?
00:28:26.540 U.S. intelligence people saying something that the rest of us think is sort of ridiculous on the surface.
00:28:34.520 Hmm.
00:28:35.780 It might be the least credible thing anybody ever said.
00:28:39.940 Now, of course, he's parsing his words carefully.
00:28:43.040 So he's saying it might be the greatest terrorist threat.
00:28:48.640 What are the other terrorist threats and how do you measure them?
00:28:52.060 How do you measure the terrorist threat that hasn't happened?
00:28:55.900 Isn't the whole point of terrorism, you don't know when it's going to happen, and when it does, it could be a big deal?
00:29:01.380 How does he know that the biggest, you know, Islamic terrorist threat of all time, you know, radical Islamic terrorists, let's say, isn't tomorrow?
00:29:14.780 If you want the ultimate, I'm going to give you the ultimate conspiracy theory.
00:29:20.460 You ready?
00:29:21.480 The ultimate conspiracy theory.
00:29:25.580 I can see no reason that we haven't had ongoing terrorism in the United States from foreign sources.
00:29:32.340 I don't see any way that could be possible unless they were never real in the first place, meaning that things probably blew up and people really died.
00:29:45.700 But I mean, who is backing them, exactly?
00:29:49.600 Exactly who is backing them that they can't do the simplest thing in the world, which is blow up something or destroy something in the United States?
00:30:00.520 I don't like to brag, but if I ever decided to become a terrorist in the United States, I think I could take down the whole country in about a week.
00:30:12.540 I don't even feel like it's that difficult.
00:30:16.140 So, I mean, if you called your shots correctly and planned right, right, it wouldn't be that hard.
00:30:22.880 And yet the total terrorist threat at the moment from foreign sources appears to be basically zero.
00:30:29.820 Like, I don't even think about it, do you?
00:30:34.920 And can you point to anything that would have caused that to happen?
00:30:38.020 Well, I can't.
00:30:41.300 Is it because we were so nice to people in the Middle East, you know, while fighting ISIS, we did it so professionally and politely that ISIS, when we're done, said, you know what, good fight, guys.
00:30:53.360 Yeah, you win.
00:30:54.800 Yeah, we're going to take our beating and go home.
00:30:57.060 Yeah, you win this one.
00:30:58.580 What can we do?
00:31:00.280 Now, there's something that terribly doesn't make sense.
00:31:03.680 It terribly, terribly doesn't make sense, what we're not seeing.
00:31:08.020 And I was also suspicious about why they have to do grandiose exploding operations.
00:31:15.380 Really, that's the only thing you could do to hurt a country is grandiose exploding things.
00:31:21.320 That's it.
00:31:23.100 Now, that was always such a tell for something not being what it looks like.
00:31:27.280 If we wanted to destroy a country, we wouldn't limit it to one kind of specific attack that's easy to stop or easier to stop than some other things.
00:31:38.020 So, I don't know who to blame or what's going on.
00:31:43.840 I'm just saying that the whole terrorism, let's say, narrative, couldn't possibly be the one we have.
00:31:51.700 In other words, what's explained to us as Americans couldn't possibly be true.
00:31:56.500 Not even a little bit.
00:31:58.880 Right?
00:31:59.220 Would you agree?
00:31:59.720 Now, we're watching refinery fires, food processing fires.
00:32:06.100 And the question is, how many of those people that came across the border from other countries are actually just terrorists?
00:32:13.540 And they're doing the smart way to destroy a country, which is a little bit at a time.
00:32:20.120 Just keep biting little edges off.
00:32:22.180 I don't know.
00:32:25.340 I don't know what to believe because there's no way to know what's true anymore.
00:32:29.360 All right.
00:32:29.620 So, Biden, dementia take number two.
00:32:33.920 He actually said this.
00:32:36.280 And he's still in office.
00:32:39.180 All right.
00:32:39.580 I want you to hold in your head how insane this is, this next story.
00:32:45.280 And that Biden's still in office.
00:32:50.040 Just look at that.
00:32:50.680 He's still in office.
00:32:51.960 And here's something he said out loud, clearly and intentionally in public.
00:32:58.160 He said, quote, I was able to literally, not figuratively, talk Strom Thurmond into voting for the Civil Rights Act.
00:33:09.620 Problem number one.
00:33:10.880 Well, Biden was 22 years old when the Civil Rights Act was voted on.
00:33:18.680 Do you think that when he was 22 that he personally talked Strom Thurmond into voting for it?
00:33:27.420 The answer is no, because Strom Thurmond famously voted against it.
00:33:34.500 Famously against it.
00:33:36.580 Not only did he not talk to him, not only was it not literally, but he didn't change anybody's mind because Strom voted against it.
00:33:46.300 Now, he said this in public.
00:33:50.380 Right?
00:33:52.100 The news reported it, so it's not like anybody missed it.
00:33:56.340 Nobody's claiming he was joking.
00:33:58.680 Right?
00:33:59.280 Nobody's claiming he was joking.
00:34:01.460 Nobody's claiming he misspoke.
00:34:03.140 Nobody's claiming he was taking that out of context.
00:34:10.500 Nobody's even claiming he lied.
00:34:16.740 Just hold this in your head for a moment.
00:34:19.040 He's the commander in chief.
00:34:20.160 Now, how much more broken could at least the Democrats, I mean, you could argue, all right, let me broaden this.
00:34:32.260 And you're telling me that the Republicans have not started an impeachment process.
00:34:36.860 The impeachment should be driving him toward the 25th, you know, replacement.
00:34:42.160 Is it only because Kamala's worse?
00:34:44.080 Are there even, are even the Republicans afraid of Kamala?
00:34:49.980 Maybe so, huh?
00:34:51.360 They don't want to give her a little boost because then she might run for president from that boost.
00:34:59.660 Somebody says he wasn't lying.
00:35:01.180 He was talking about the Civil Rights Act of 1991.
00:35:10.040 There were two Civil Rights Acts.
00:35:11.720 All right, we're getting a fact check here.
00:35:14.640 Fact check.
00:35:15.580 Give me a fact check.
00:35:17.800 There were two things with the same name.
00:35:25.280 So they're saying he was referring to a different bill.
00:35:31.460 Oh, this is interesting.
00:35:32.660 Because if you're, if you watch, you know, right-leaning X, I didn't see anybody fact check that.
00:35:42.020 But I'm seeing people, all right, let me, let me pivot.
00:35:46.500 Allow me to pivot.
00:35:47.560 I'm going to, I'm going to, I'm only going to mock Snoopy Boobs here, who says I'm getting burned.
00:35:57.020 Here's a little lesson for you.
00:35:58.420 If your brand is that you're always right, in these situations you would be triggered into cognitive dissonance, and you would argue that it couldn't possibly be true that the story is wrong, because that would embarrass me in public.
00:36:11.960 If you can't be embarrassed, you won't get cognitive dissonance, right?
00:36:19.120 So my brand is, I will change instantly when the information does, and that probably happens a lot.
00:36:28.040 So it looks like it's happening right now.
00:36:29.700 Now, it appears that in public, I'm being humiliated by my critics for having a wrong fact, which looks like they might be right, by the way.
00:36:41.180 I'm not going to doubt them.
00:36:42.560 I think I'll look into that.
00:36:44.320 So watch me not experience cognitive dissonance.
00:36:50.140 I just said, oh, that looks like you might have a good point there, because that's exactly what the news does.
00:36:55.380 In fact, the next things I'm going to be talking about are that the news does exactly that.
00:37:01.720 So if it's true that I got got, and it looks like it is, that's a good story.
00:37:09.440 And I'll just tell you that I fell for it.
00:37:14.320 Everybody good?
00:37:16.800 All right.
00:37:17.940 So, all right, let's look into that.
00:37:19.480 It doesn't make him any less demented, but maybe that wasn't true.
00:37:25.380 Did you hear the story about Tucker Carlson explaining how he was called by some spook-type person who knew that he was negotiating to do an interview with Putin?
00:37:40.280 So I think this is back when he worked at Fox News.
00:37:42.380 And he was actually told that his Signal account, his encrypted app, wasn't secure, and the NSA was just reading his messages.
00:37:55.380 Now, can I remind you again?
00:37:58.860 Everything you put in a digital form is discoverable.
00:38:02.880 No matter what they tell you, it's discoverable.
00:38:06.680 One way or the other.
00:38:08.420 One way to discover it is they just have access to the recipient's phone.
00:38:12.960 So maybe the message got sent, you know, all encrypted, just like you should.
00:38:19.700 But once it reaches the other person's phone, you know, it's being sent to a screen.
00:38:24.080 So presumably, you could pick it up between the, you know, the encrypted app and what it presents on the screen.
00:38:30.340 Because at that point, it's unencrypted, right?
00:38:32.740 So if you own the phone of the recipient, it doesn't matter if you're encrypted or not.
00:38:39.140 And that's just one way to do it.
00:38:40.620 The other way to do it would be if they had a backdoor and we don't know about it.
00:38:44.560 Could have a backdoor.
00:38:46.280 No, no.
00:38:49.120 So you should assume that all your digital communications are public.
00:38:56.380 Just public.
00:38:57.820 Never say anything that you wouldn't say in public.
00:39:00.280 All right.
00:39:04.560 That's the best advice I'll ever give you.
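For the technically curious, here is a minimal sketch of the endpoint problem described above, using Python's cryptography package, with Fernet standing in for any encrypted messenger. The message text and the scenario are invented for illustration; this is not a claim about Signal's actual design.

    # Encryption protects the message on the wire, not on the device.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # shared by sender and recipient
    channel = Fernet(key)

    token = channel.encrypt(b"hypothetical message")
    print(token)                       # what a wiretap sees: opaque bytes

    # The recipient's phone has to decrypt the message to display it, so
    # anyone who controls that device reads the plaintext, backdoor or not.
    print(channel.decrypt(token))      # b'hypothetical message'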
00:39:07.800 So there's this weird thing going on with the coverage of Vivek.
00:39:13.140 And it goes like this.
00:39:15.720 He'll be on an interview, podcast or the news.
00:39:19.100 Somebody will take something he said and a context.
00:39:23.040 And they'll ask him to defend it.
00:39:25.500 Why did you say X?
00:39:27.320 And then he'll say, well, I didn't.
00:39:29.380 I didn't say that.
00:39:30.780 Here's what I did say.
00:39:32.380 And then they'll say, well, oh, you say you didn't say that.
00:39:35.660 Well, here's the source.
00:39:38.020 And then the source will support Vivek.
00:39:41.000 And yet the person pointing to the source won't be able to see it or know it or acknowledge it.
00:39:53.740 As if they're looking at a different movie and saying, oh, I'm looking at The Sound of Music, but it's really Schindler's List.
00:39:53.740 I mean, and I've seen it several times now.
00:39:57.260 And then what do they do after it looks like he's been taken out of context, but they won't admit it?
00:40:04.920 Then they say, what's the next thing they say?
00:40:08.240 Then they say he's a flip-flopper.
00:40:10.580 They say he's a flip-flopper because they say the new thing you're saying, that's different than the thing you said before.
00:40:18.000 But it isn't.
00:40:20.260 And it never was.
00:40:21.980 It's the same.
00:40:22.820 And so now they call him a flip-flopper.
00:40:26.800 And then there's a step after the flip-flopper accusation.
00:40:30.620 They take all the times that they've done this to him, and they put it in a list, and then they post it on X, and they say, look at all these times he's flip-flopped.
00:40:41.780 And in fact, every one of those is him being taken out of context.
00:40:46.060 He tells you what he really thinks and how it was taken out of context.
00:40:49.820 They refuse to acknowledge that he ever said that.
00:40:52.820 That he ever, you know, gave them the accurate story.
00:40:56.120 And then they call him a flip-flopper, and then they put it in a list.
00:41:00.980 So here's another one.
00:41:03.360 And here's the test to know when they're doing it, okay?
00:41:07.280 Remember the really test?
00:41:09.360 Where there'll be a claim in the news, and here's a way you can tell it's fake.
00:41:13.580 Just say, really?
00:41:16.160 Really?
00:41:18.440 Seriously?
00:41:18.800 If you can't get past the really test, probably not true.
00:41:23.960 All right?
00:41:24.180 Let me give you one.
00:41:25.560 The claim is that Vivek says if he were president, he would end funding to Israel in 2028.
00:41:35.020 Now, that would be just, you know, four years after taking office.
00:41:38.140 And Israel, one of our most important allies, certainly politically, you could argue, most important.
00:41:46.800 So do you think that a Republican running for a major office, let me ask this, do you think a Republican running for a major office suggested something that Israel and all supporters of Israel would immediately go, ugh, and never be able to recover from it?
00:42:05.360 Really?
00:42:06.840 Really?
00:42:08.920 Really?
00:42:10.160 Do you really think that someone as smart and as good with his messaging as Vivek would have really said that?
00:42:19.800 You think you really would have thought, oh, here's a good idea.
00:42:24.300 Does anybody want to say they believe it?
00:42:27.380 Does anybody want to say they believe it right now, before I give you any more information?
00:42:31.940 Oh, somebody says they believe it.
00:42:33.380 Really?
00:42:33.600 You really believe that a major candidate said he was going to screw Israel, which is what it would sound like.
00:42:41.760 A major candidate who says he's going to screw Israel while running for president.
00:42:46.680 That actually sounds believable to you.
00:42:49.440 Somebody who went to Harvard is so capable that he's rising up the rankings, that's all we're talking about,
00:42:56.460 and that that guy was so smart in every way, but somehow didn't realize that a clear message of non-support for Israel would somehow hurt him and make him unelectable.
00:43:09.780 He wouldn't realize that, right?
00:43:11.940 Really?
00:43:13.880 All right, you want to hear what the real story is?
00:43:15.600 Of course, of course, it was conditional.
00:43:20.800 Of course.
00:43:22.640 Do you think he said that if everything's the same as it is now, I'm just going to take their funding away?
00:43:27.980 Really?
00:43:29.540 You don't think it was a conditional statement?
00:43:32.300 It was a conditional statement.
00:43:33.980 And here's the condition.
00:43:36.120 That he would expand the Abraham Accords so that the entire region would be in a safer, more stable situation.
00:43:45.560 And then, once it's stable, they wouldn't need our help.
00:43:51.380 So, once the situation changes from the current situation where they probably do need our help to a situation where you and I would all agree,
00:44:02.640 oh, it looks like they're much safer now, you know, not totally safe, but much safer, and they could maybe handle it on their own.
00:44:11.700 Does that sound crazy?
00:44:13.560 It might be too optimistic if you say, oh, that's way optimistic.
00:44:18.080 Yes.
00:44:18.880 Yes, that's way optimistic.
00:44:20.200 But shouldn't he be aiming for it?
00:44:24.660 Is it wrong to aim for that?
00:44:27.580 Because in four years, you know, four years after 2024, he could say, you know, we didn't get there.
00:44:34.480 But if we had gotten here, we'd be talking about cutting funding.
00:44:38.480 But we didn't get there.
00:44:41.020 Now, would that be crazy?
00:44:42.740 Now, when I explain what he really said, do you really believe that he just said,
00:44:47.400 if everything stays the same, I'll just cut funding in four years to Israel?
00:44:50.880 Does anybody believe that now?
00:44:54.060 It would be crazy.
00:44:55.940 Now, I should not have had to explain that it was a conditional statement.
00:45:00.740 You should have seen that from the start.
00:45:02.380 And the start is, nobody would have said that out loud.
00:45:07.120 All right, here's another one.
00:45:08.840 Do you believe that a president, an actually sitting president, went on TV, thought about
00:45:15.400 what he was going to say, and then said that neo-Nazis are fine people?
00:45:21.000 Really?
00:45:22.220 Do you really believe that somebody actually did that?
00:45:25.020 And of course, the answer is they didn't.
00:45:27.080 Of course they didn't.
00:45:28.460 It was, you know, it was a Rupar edit.
00:45:30.840 He never said that.
00:45:32.340 He said the opposite of it literally and directly.
00:45:36.440 Here's another one.
00:45:37.800 Do you think that a sitting president of the United States once stood in public with a bunch
00:45:43.260 of science doctor people and suggested that you should, or at least that it was worth
00:45:48.940 looking into, drinking disinfectant?
00:45:54.500 Really?
00:45:57.040 Really?
00:45:58.720 That's possible?
00:45:59.940 That maybe that actually happened?
00:46:01.380 No.
00:46:02.020 No, you didn't need to know the whole background story that there was, in fact, a test about
00:46:07.480 putting light into at least the trachea, and they were maybe thinking about the lungs later.
00:46:13.260 And that the light was a disinfectant, and that the news took light as disinfectant and
00:46:19.480 misinterpreted it as chemical disinfectant.
00:46:23.040 Now, I shouldn't have had to give you the explanation, because the moment you heard it,
00:46:27.880 okay, that didn't happen.
00:46:30.200 Now, here's where it's tricky with Biden.
00:46:33.520 So I looked at this news, which I think I probably got wrong, based on your fact checks.
00:46:37.500 And he said something that would have been, like, so crazy that they would have, you know,
00:46:44.260 25th Amendmented him that day.
00:46:47.020 So probably I should have said, do you really think that he said that he helped Strom Thurmond
00:46:54.680 vote for something he didn't vote for?
00:46:57.320 Except that Biden has a history of saying wildly ridiculous things that you can't tell if it's
00:47:03.760 dementia or not.
00:47:05.520 So under that specific situation, Biden doesn't fit into the really, because Biden actually
00:47:11.280 does say things that seem disconnected from, you know, reality.
00:47:17.120 So he has a special case where the really test doesn't work.
00:47:21.100 But if you've got somebody who's a functioning person, like all of the candidates, all of the
00:47:26.120 candidates except Biden are functioning.
00:47:27.680 If you heard that, and RFK Jr. is the same thing, by the way.
00:47:33.620 If you look at the RFK Jr. attacks, just try this.
00:47:39.180 Really?
00:47:40.640 Really.
00:47:41.440 A guy as smart as RFK Jr., you're telling me that he said whatever it is that he said,
00:47:47.140 that, like, all vaccinations are bad or something.
00:47:50.980 I think that's one.
00:47:51.680 One of them is, do you really think he said all vaccinations are bad, despite being vaccinated
00:47:58.140 and his kids are vaccinated?
00:48:00.360 Probably not.
00:48:02.180 More likely, he had a problem with how well they're tested, maybe something about the
00:48:07.260 liability.
00:48:08.360 You know, that would be reasonable.
00:48:09.540 But no.
00:48:11.120 If it sounds ridiculous, it probably is.
00:48:14.140 All right.
00:48:14.340 Here's a little, I'm going to give you some persuasion takes on DeSantis and Vivek.
00:48:23.420 Vivek has, in my opinion, the best persuasion game on top of communication.
00:48:31.000 Communication is just, you know, saying what you want to say.
00:48:34.580 Persuasion is what Trump does.
00:48:37.260 Now, I told you that Trump was the best visual persuader, but he'd also make you think past
00:48:43.340 the sale.
00:48:44.520 So you weren't just thinking, does he want to build a wall, yes or no?
00:48:48.480 You were thinking the actual structure of the wall.
00:48:51.440 You're thinking, well, what do you make it of?
00:48:53.760 Right?
00:48:54.000 That's what Trump does.
00:48:54.860 He makes you think past the decision, wall or no wall, all the way to, well, what's that
00:48:59.680 wall going to be made of?
00:49:01.300 Basic, good persuasion.
00:49:03.740 So when Vivek says the FBI is corrupt, which other people have said, he doesn't leave it there.
00:49:14.140 He doesn't just tell you that he wants to, you know, change the FBI.
00:49:18.260 He says, I've published a detailed plan of where those employees would take their functions
00:49:23.860 so you don't lose the function.
00:49:25.540 You just lose the nature of the group.
00:49:27.880 Same with the Department of Education.
00:49:31.360 He doesn't just say he wants to get rid of it, which I never found convincing.
00:49:36.240 He says, I want to get rid of it and take that funding and do block grants to the state.
00:49:41.240 So now I'm wondering about block grants to the state.
00:49:44.360 So I'm already thinking past he got elected, and I'm thinking past, you know, basically he's making me think.
00:49:50.940 Now, when he talks about Taiwan, he talks about the short-term protecting him, and he talks
00:49:58.560 about the long-term, you know, it might be a different ballgame if we don't have a strategic
00:50:03.920 interest.
00:50:05.160 Let's say we're doing our microchips over here.
00:50:07.280 Again, every time that Vivek talks, he makes you imagine him president and that he's already
00:50:16.660 doing the job, and you're actually evaluating the details of how he's doing the job, not
00:50:22.400 the question of whether he got elected.
00:50:24.760 That is pure Trump technique, and it's A+.
00:50:28.980 So if you're judging Vivek only on his communication ability, which is A+++, you would miss that
00:50:37.860 embedded in it is a layer of persuasion skill that nobody else is demonstrating.
00:50:43.300 You know, Trump does.
00:50:44.760 Nobody but Trump.
00:50:46.480 So if you catch it, then you understand why people like Mike Cernovich are, you know, giving
00:50:53.440 him a strong look.
00:50:55.380 Because Cernovich can see the layers, right?
00:50:57.580 You would have to have some experience to see the persuasion layer.
00:51:02.640 But wow, it's there.
00:51:04.880 The one thing I would say to Vivek is he needs to do a better job of saying his answer first
00:51:13.580 and then his explanation for the answer.
00:51:16.640 So he waits to give the definitive answer sometimes, gives a little too much context first, and that
00:51:22.900 makes it look evasive.
00:51:24.400 When in fact, he doesn't need to evade anything, because he's never run from any of his opinions.
00:51:30.360 So he's not trying to evade anything.
00:51:32.780 He's just giving, you know, a good, complete answer.
00:51:36.120 But if the complete answer doesn't start with the conclusion, it's a persuasion mistake.
00:51:42.860 And I don't know that Trump ever does that, by the way.
00:51:47.580 This would be interesting.
00:51:48.560 I've never noticed him do it.
00:51:50.220 But if Trump is asked a question, he'll give you the answer first, and then he'll tell you why.
00:51:56.720 Am I wrong?
00:51:58.200 Watch for that.
00:51:58.960 I think you're going to see that, that he answers first.
00:52:01.500 That's a very strong technique.
00:52:02.660 The other thing I'd love to see Vivek do, that I've been doing, and it works, is that
00:52:09.840 you, before you give your direct answer, you give just a little bit of context reframing.
00:52:17.960 All right?
00:52:18.540 So when I'm asked about my controversial comments, the way that I do that is say, well, you know,
00:52:25.380 I'm not sure that most of your viewers understand that news about public figures is never real.
00:52:33.920 Now, do you know who did that reframe first?
00:52:37.400 That's the Steve Jobs reframe.
00:52:39.520 I borrowed that from him.
00:52:41.760 When he had the problem with Antennagate, instead of saying, well, our phones have this problem,
00:52:47.280 we'll do what we can, that would be weak.
00:52:50.440 He started by saying, all cell phones have problems, all smartphones have problems.
00:52:55.380 And then everybody said, oh, well, that's actually true.
00:52:59.960 So the context is, now I understand the context, they all have problems.
00:53:03.980 So when we talk about yours, it won't seem special to me.
00:53:07.420 It'll just be in the context of, yeah, they all have problems.
00:53:10.220 And then the news reported the next day after Steve Jobs said that, yeah, all smartphones have problems.
00:53:15.120 And they even mentioned them.
00:53:16.520 They mentioned the other company's problems.
00:53:19.620 You can't get a better persuasion than that.
00:53:21.920 By the way, do you know why Steve Jobs was known for having a distortion, a reality distortion field around him?
00:53:33.360 And yet you never heard that about Bill Gates, did you?
00:53:37.760 Did you ever hear, oh, Bill Gates, he's got that reality distortion field around him?
00:53:43.020 Do you know why it seemed that Jobs, unique among people, had a reality distortion field around him?
00:53:53.860 I will answer that question with one word.
00:53:59.780 Reframes.
00:54:01.100 If you listen to Steve Jobs talk, he reframes first, like he did with Antennagate,
00:54:08.400 and then, you know, he gives his argument.
00:54:12.400 The most famous example was when he was trying to get John Sculley to leave Pepsi.
00:54:19.680 Sculley was president of Pepsi.
00:54:21.020 Jobs wanted him to leave that job and be the head of Apple.
00:54:23.760 And he famously said to him, after Sculley had said no, no, no, no, no, toward the end of the meeting,
00:54:29.460 he said, well, the question is this.
00:54:31.680 It comes down to this question.
00:54:32.920 Do you want to sell sugar water for the rest of your life or change the world?
00:54:39.060 Now, what's that?
00:54:41.400 That's a reframe.
00:54:43.460 That's a reframe.
00:54:45.560 Jobs spoke in reframes.
00:54:48.520 And when you speak in reframes, reframes are basically a form of hypnosis, though probably nobody but me would call it that.
00:54:59.220 But it's a form of persuasion that's so strong and can happen so quickly, like the reframes I just mentioned.
00:55:05.520 They happen so quickly that your brain goes from, I'm over here, to what?
00:55:11.260 And that's the reality distortion field.
00:55:14.000 It's the feeling that you were sure of this until he said that, and he reframed it, and the reframe was so good,
00:55:21.640 your brain just said, I give up.
00:55:23.480 I'm taking your reframe.
00:55:24.760 And then you have this feeling like reality is loose, like he can move it around.
00:55:32.500 It's not an illusion.
00:55:35.920 He's actually moving reality around.
00:55:38.400 That's what a reframe does.
00:55:40.680 He's literally changing reality because your reality is subjective.
00:55:47.160 You could argue that there's a base reality.
00:55:49.660 I don't know.
00:55:50.420 But your reality is subjective.
00:55:51.920 And when he reframes you as quickly as he does, and he did it, almost everything he talked about was a reframe.
00:55:59.000 Do you remember the slogan for the early Macintosh?
00:56:04.000 Think different.
00:56:06.680 That's a reframe.
00:56:09.040 It assumes that everybody using these old boring IBMs were like drones.
00:56:14.100 I am a drone.
00:56:15.300 I'm just typing.
00:56:16.200 I'm using Word and maybe a little bit of Excel, but I do not have any creativity.
00:56:21.320 No, I'm like everybody else.
00:56:23.540 Who in the world likes to admit they're like everybody else?
00:56:27.180 Nobody.
00:56:28.660 Internally, everybody thinks they're different, right?
00:56:32.060 Everybody thinks they're different internally.
00:56:35.720 Externally, we look at people and go, you're just like that other person.
00:56:38.840 But internally, we all think we're special snowflakes, right?
00:56:43.000 So that was a reframe.
00:56:45.920 It reframed the question from, is this computer better than this one, to, do you think differently?
00:56:53.100 Or are you one of these drones?
00:56:57.360 That was a hell of a reframe.
00:56:59.700 Hell of a reframe.
00:57:01.100 All right.
00:57:01.320 So I would like to see Vivek do a reframe for some of these challenging questions.
00:57:10.000 So he could say, you know, I'm going to answer this question directly, but just some context.
00:57:17.720 You know that there's this weird thing happening where people are taking everything I say out of context
00:57:22.780 and then challenging me and then saying that I'm flip-flopping if I simply explain what the original context was.
00:57:29.020 So we won't do that today.
00:57:31.240 But to answer your question, the answer is no, and here's why.
00:57:36.280 So that's the form that just wins every game.
00:57:39.740 If you say the playing field is this, and you describe a reasonable playing field,
00:57:46.540 so my playing field is that this fits into a pattern of stories.
00:57:51.920 Here's my playing field.
00:57:53.280 Playing field is the pattern.
00:57:54.480 Now you're going to have to explain why this isn't another one of those.
00:57:59.600 You know, why isn't this a Rupar?
00:58:01.600 When I explain what a Rupar is, before I explain that something is a case of it, it goes over way better.
00:58:10.660 If you start by saying, okay, this one thing is a Rupar, and what they did was they changed the edit,
00:58:15.980 you might convince some people.
00:58:20.600 You might.
00:58:21.620 But if you do it this way, okay, you know that this is a widespread, completely normal technique.
00:58:28.360 And they've done it in this case, in this case, in this case.
00:58:31.160 And you can see each of the cases where they took something out of the quote,
00:58:35.200 which has the weird effect of actually reversing its meaning.
00:58:38.720 It doesn't seem like it's possible, but you can see it a number of times.
00:58:43.900 And then, when you talk about your next situation, people are all primed.
00:58:48.920 It's like, oh, that's a Rupar.
00:58:50.900 I get it.
00:58:51.860 I get it.
00:58:52.440 It's one of those.
00:58:54.120 So that's the technique.
00:58:55.380 Let's talk about DeSantis.
00:58:57.220 DeSantis is a very capable person, but his body language doesn't match what I think is happening on the inside of his head.
00:59:07.680 Now, I can't read his mind, but I have the following, let's say, beliefs about him.
00:59:14.020 I believe that all observation and his personal history suggests that he is confident and capable.
00:59:25.460 Would you say he's confident and capable, based on observation?
00:59:30.820 I think so.
00:59:31.700 I think he's both confident and capable.
00:59:33.660 However, his body language screams the opposite.
00:59:39.260 Screams it.
00:59:40.880 Right?
00:59:42.180 Problem number one, he has a head-shaking problem.
00:59:46.400 Look at me talking.
00:59:48.800 All right, I'm talking about Ron DeSantis, and I'm telling you that he has a head-shaking problem.
00:59:55.480 Notice my head isn't moving too much.
00:59:57.320 Now, I'm going to talk about, let's say, inflation, and I'm going to do a Ron DeSantis.
01:00:02.760 We've had inflation here since the beginning.
01:00:05.820 The inflation rate has been, you know, it's been higher, but now it's a little bit lower, but they're not counting it right.
01:00:12.720 Right?
01:00:13.160 The head shaking is refuting his own words.
01:00:18.420 That's number one.
01:00:19.580 Number two, he has liar eyes.
01:00:22.880 Liar eyes.
01:00:23.900 Liar eyes are the ones that are too wide, because when you open your eyes wide, you're trying to get somebody to believe something that you don't believe.
01:00:35.880 All right?
01:00:36.640 Let me give you the difference.
01:00:38.820 This is not liar eyes.
01:00:41.380 So I'll say something that's true.
01:00:44.280 Yesterday was a really nice day.
01:00:47.720 Okay?
01:00:48.540 Now I'm going to give you liar eyes.
01:00:50.380 I'll say the same thing, but I don't believe it was a really good day.
01:00:54.500 I just want you to believe it.
01:00:57.580 Yesterday was a really good day.
01:01:01.620 Liar eyes.
01:01:02.540 They're way wide.
01:01:04.220 He has liar eyes.
01:01:05.980 Now, I only see him in public, but I can't believe... yeah, somebody says Schiff. He has Adam Schiff liar eyes.
01:01:13.100 You're right.
01:01:13.720 But I can't believe that his eyes look like that when he's talking to his family.
01:01:17.120 Do you think his eyes are all the way open when he talks to his family?
01:01:21.900 No.
01:01:22.520 The all-the-way-open eyes mean you don't believe what you're saying, but you think if you change your face, you might be able to sell it a little better.
01:01:31.180 It's a recognition that there's a weakness in his argument.
01:01:35.120 Now, he also has what I call pleading voice.
01:01:39.460 Pleading voice.
01:01:40.400 If you want to see contrast, here are some people who don't have pleading voice.
01:01:49.560 Pence.
01:01:51.340 When Pence talks, whether he's right or wrong, his voice says he believes it, and this is a fact.
01:02:00.400 He'll say, we've got to do this because of this.
01:02:03.500 We have to do that because of that.
01:02:05.740 If we do this, we'll get this effect.
01:02:07.860 If we do this, we'll get that effect.
01:02:09.440 That's Pence.
01:02:11.240 Trump talks that way.
01:02:13.040 Vivek talks that way.
01:02:14.880 You know, most politicians have command voice.
01:02:19.360 That's command voice.
01:02:21.220 Pleading voice sounds like we've got to do something at the border because there's lots of people coming over.
01:02:28.480 I'll kill people dead if they come over the border.
01:02:31.620 It's not quite uptalk.
01:02:33.120 But there's something that speaks to a lack of relaxation in the chest.
01:02:42.160 Maybe that's what's happening.
01:02:43.940 But the raised voice says, I don't believe what I'm saying, but I hope you will.
01:02:49.600 Now, I started by saying that internally, I think he's actually a confident, capable guy.
01:03:00.480 But his body language is screaming the opposite.
01:03:05.100 And I feel like people are picking up on it.
01:03:07.200 And then on top of that, he has sort of a corporate choice of words.
01:03:12.900 You know, he's not quite as friendly-talking and familiar as somebody like a Trump or even a Christie.
01:03:20.440 You know, Christie has the common touch, you know, but also a big vocabulary.
01:03:25.700 So, anyway, so I think that that's the big problem with DeSantis.
01:03:32.060 Policies aside, you know, you can find policy problems if you want.
01:03:36.360 But I think his body language is an absolute train wreck.
01:03:40.460 And I also think that that doesn't matter as much for a governor.
01:03:45.820 It's just that when you go from governor to president, it's everything.
01:03:50.760 Right?
01:03:51.240 At the governor level, you're looking at his track record, his capability, his stated policies, you know, maybe his history.
01:04:00.140 And if he has a funny voice or he blinks too much or whatever it is, you're going to say, well, like, he's not my commander in chief.
01:04:09.760 Right?
01:04:10.420 He's not the commander in chief.
01:04:12.140 He's sort of a lawmaking, legislation, you know, fix the potholes kind of guy.
01:04:17.720 I don't care how he talks.
01:04:18.780 I don't care that his eyes are buggy.
01:04:21.180 Just fix the potholes.
01:04:23.020 Now you talk about somebody who's got to save the world in case there's a nuclear confrontation.
01:04:29.180 Do you want the guy with the bug eyes and the shaky head and the voice that sounds like he's pleading?
01:04:35.340 Nope.
01:04:36.520 Nope.
01:04:37.020 Nope.
01:04:37.380 Nope.
01:04:38.000 Nope.
01:04:38.720 Hard no.
01:04:40.040 Can't put him in that position.
01:04:41.120 Or do you want somebody like Trump, who's just Trump, which is sort of perfect if you're in a nuclear confrontation?
01:04:50.500 I'm going to pick Trump every time.
01:04:53.420 Every time.
01:04:54.920 Right?
01:04:55.140 You give me a nuclear confrontation?
01:04:57.360 Trump.
01:04:58.760 Right?
01:04:59.420 Now, could Vivek do it?
01:05:00.940 Probably.
01:05:02.160 I would have a lot of confidence in Vivek.
01:05:04.040 I would also have a lot of confidence in Ron DeSantis, because I think the internal Ron DeSantis doesn't have the problems that his body language is projecting.
01:05:14.560 I think on the inside, he's a solid guy.
01:05:19.000 All right.
01:05:19.380 Here's the latest from what I call black and white news.
01:05:28.000 Black and white news is where the news tries to divide us by race.
01:05:32.620 Chris Cuomo says to Vivek he shouldn't compare any black person to the KKK.
01:05:38.960 My take on this is I don't give a fuck.
01:05:42.880 I don't care.
01:05:44.240 I don't care who he compared to anything.
01:05:47.120 Because the person that was the subject of this was a person.
01:05:53.260 It was a human one person.
01:05:55.660 It wasn't somebody representing a whole race.
01:05:58.540 It was one asshole who he said sounded like the KKK and then gave a specific example to which I said, it's hyperbole.
01:06:08.960 I mean, it's obviously hyperbole.
01:06:11.080 It's not a literal thing.
01:06:13.400 He's saying that, you know, if you talk like this, you know, you're more associated with moving in that direction than the direction of good and rightness, I guess.
01:06:23.480 So everybody understood what he said, right?
01:06:26.240 When he compared somebody to the KKK, that's literally the entire business model of the Democrat Party.
01:06:35.140 Just comparing people to Hitler, comparing them to Goebbels, Goebbels, whoever the hell he is, comparing them to the KKK.
01:06:43.000 That's all it is.
01:06:46.560 The business model, literally the business model of the Democrats, is to do this.
01:06:52.760 So do I care that Vivek this one time did the same thing to shove it back in their fucking faces?
01:07:00.040 No.
01:07:00.780 Thank you, Vivek.
01:07:02.000 And thank you for not apologizing.
01:07:04.300 The best part was he didn't apologize.
01:07:06.660 He said, no, this is why he said it.
01:07:08.080 She said this.
01:07:09.700 Nobody agreed with what she said.
01:07:09.700 I mean, basically, was it Pressley who said that they didn't need brown people who weren't supporting the brown point of view, or something about Vivek?
01:07:25.920 So, yeah, she had it coming.
01:07:28.360 You say shit like that, somebody's going to compare you to the KKK.
01:07:32.580 That's the way it works.
01:07:34.440 He didn't make the rules.
01:07:36.700 So, but do I care that this is a black-white issue?
01:07:39.500 Nope.
01:07:40.260 Nope.
01:07:40.500 This is about Vivek and one person he talked about.
01:07:44.620 That's it.
01:07:49.240 I'm no longer buying into the idiotic idea
01:07:53.120 that the average of one race should be compared to the average of the other race.
01:07:53.120 That's just stupid.
01:07:54.820 And I refuse.
01:07:56.200 I refuse to buy into the model.
01:07:58.220 It's about a person.
01:07:59.620 It's about a crime.
01:08:01.220 And that's it.
01:08:03.700 All right.
01:08:06.140 Let's see what else is going on here.
01:08:07.740 So, I don't know how to say this without sounding egotistical or being too much about me, so I'm not going to worry about it.
01:08:21.340 I'll just do it.
01:08:21.920 You know that when an author writes a book, it then is part of the job to promote it and do the marketing for it.
01:08:31.940 And I'm not super comfortable with anything that sounds like marketing or selling because I think the product has to do that, right?
01:08:41.480 If the product isn't selling itself, then you should have tried harder, right?
01:08:46.320 You missed it when you made the product.
01:08:48.700 That's where everything went wrong.
01:08:50.040 But I've taught you before that there's a tell for knowing when something is going to be big.
01:08:58.180 Now, here's a rare situation where the tell has formed, and you can really see it strongly, but the success has not happened yet, right?
01:09:09.780 It's happened at a small level.
01:09:11.900 And that's what's happening with this book.
01:09:13.540 So, I have, I don't know, somewhere in the 40-to-50-book range, if you count the Dilbert reprint books.
01:09:20.740 I forget how many regular books I've written, a dozen or so.
01:09:25.660 But, so I've seen what happens when a book is launched and what it looks like.
01:09:31.840 Usually, it's me talking about it, and some people who like things I do buy the book, and then sometimes they write good reviews, most of the time, I'm lucky to say.
01:09:40.900 And, you know, and then maybe some other people look at it and they buy the book.
01:09:45.020 So, that would be a normal, you get a big sales bump when you're doing the marketing, and then, you know, if people like it, maybe it lasts a little while.
01:09:55.780 That's normal.
01:09:57.100 That does not necessarily predict a big hit, because that's just the normal cycle that every book goes through if there's any promotional push.
01:10:06.240 However, there is a tell that you don't see often that is just screaming about this book.
01:10:15.760 And the tell is people extending the model.
01:10:20.840 So, today, there is yet another pirated book on Amazon.
01:10:25.300 I call it a pirated book.
01:10:27.040 It's listed as a workbook.
01:10:28.640 So, there are now three people who have ripped off my book in the same week it was published.
01:10:37.880 Have you ever seen that before?
01:10:40.140 Has anybody ever seen that before?
01:10:42.300 Even once.
01:10:43.340 It's happened three times in a week.
01:10:45.420 I've never seen it.
01:10:46.260 I've got already two offers from other countries.
01:10:53.180 I won't mention the countries, but they're notable countries.
01:10:57.300 Notable countries for translation rights.
01:11:00.760 Now, the way those offers work is that those publishers in other countries, if you do a deal with them, they do all the work.
01:11:07.600 They literally just take the book and they put their own cover on it, do the translation, market it, and sell it in their own country.
01:11:16.600 And the book's only been out a few days, you know, a week or so, and already other countries are asking for it.
01:11:24.840 That's unusual.
01:11:27.060 That's unusual.
01:11:27.660 But, I don't know if you're following my Twitter feed, but have you seen how many people are taking a picture of family members reading the book and reporting that they're going to have to buy more than one of them for just their family that lives in one house?
01:11:43.200 How often do you see people buy more than one copy of a book for a family of four?
01:11:49.620 It's happening massively.
01:11:53.340 All day long, people are telling me, I got two, I got three.
01:11:56.760 My wife wanted to read it.
01:11:57.960 I have to buy another one.
01:11:59.300 I just got my 12-year-old son's reading it, so we had to get another one.
01:12:03.300 I've also gotten, massively:
01:12:06.100 I read the Kindle, but I wanted the hardcover.
01:12:10.180 I bought 10 for a gift.
01:12:11.860 I've never seen anything like it.
01:12:13.120 And then the people who are reporting that it's already changed their lives.
01:12:17.960 It's crazy.
01:12:19.620 If you look at the book, there are still a few books above it in some of the categories.
01:12:24.540 Read the reviews of the books that are still above it, which I don't think is going to last.
01:12:32.760 People love those books.
01:12:37.120 Like, oh, this is the greatest book.
01:12:38.860 I enjoyed it all.
01:12:40.240 But look at the language they use as to whether it changed anything about their life.
01:12:46.020 And then look at the reviews for this book.
01:12:48.160 And this book is, like, off the hook.
01:12:52.200 And then also look for a real review about this book that's also negative.
01:12:57.740 There are a couple of one-star reviews where it's really obvious they didn't read the book.
01:13:01.980 It's just some critic of mine.
01:13:03.220 So if you're famous from some other domain, you always get the critic who comes over and gives you the one-star review.
01:13:11.500 And then in the review, they make sure that, well, judge for yourself.
01:13:17.220 So one of the one-star reviews was, there's nothing new in the book.
01:13:21.420 Does that sound like somebody who read it?
01:13:25.480 There's nothing new in there.
01:13:28.440 For those of you who read it, right, there's no way.
01:13:32.060 There's no way you would say that.
01:13:34.340 Even if you did have a complaint, it wouldn't be that.
01:13:38.320 It definitely wouldn't be that.
01:13:39.960 So you know that's not a real review.
01:13:41.320 So I've never had a book whose only bad reviews were obviously just trolls.
01:13:48.420 So there's something happening.
01:13:51.180 And I think it has to do with just being in the right place at the right time.
01:13:55.940 That at the same time, AI showed us that words, and word combinations, are how intelligence is formed.
01:14:04.040 And that's what a reframe is.
01:14:06.300 A reframe is putting words into your head that replaced the words that were there that weren't helping you.
01:14:12.700 So maybe it's just a time when people's own minds connected all the things that were happening in other places.
01:14:19.580 And they said, yes, I get that.
01:14:22.200 That connects all the dots for me.
01:14:24.200 That might be what's happening.
01:14:25.200 I don't know.
01:14:25.480 All right, so here's a little test for you on, well, here's a perfect example.
01:14:31.300 Bow-tied Kong on X asked ChatGPT to write 200 words on me and my new book, Reframe Your Brain, and posted the samples.
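If you wanted to reproduce that kind of experiment yourself, a minimal sketch might look like this, assuming the current OpenAI Python SDK; the model choice and prompt wording here are my guesses, not whatever Kong actually used:

# Minimal sketch of the experiment described above: ask a chat model to
# write a 200-word blurb about the book. Model name and prompt wording
# are assumptions for illustration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: any chat-capable model works here
    messages=[{
        "role": "user",
        "content": "Write a 200-word review of 'Reframe Your Brain' by Scott Adams.",
    }],
)

print(response.choices[0].message.content)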
01:14:46.040 Now here's that tell for success.
01:14:48.800 Why did somebody even think of creating content for X in which they used AI to write reviews for my book?
01:15:01.140 This is the tell for something that's going to be huge.
01:15:04.260 When people want to extend your product, or build something around it, and I would call this one of those.
01:15:09.700 Like, I want to make some content that's around your book.
01:15:13.300 Look, that does not happen for something that's not going to be enormous.
01:15:19.300 So, I mean, every single tell is just glaring.
01:15:22.100 I've never seen this.
01:15:22.960 It's really weird.
01:15:23.720 So let me give you this test.
01:15:25.320 I'm going to read two paragraphs, short ones, and you're going to tell me which one I took from an actual review of my book from a human being, and which one was AI.
01:15:36.460 Okay?
01:15:36.700 So I'll read one and then the other, but let's see if you can guess.
01:15:42.520 All right, here's the first one.
01:15:44.240 AI or human.
01:15:46.560 The reframes shared in this book are like a toolbox for the mind, shifting from managing time to managing energy.
01:15:53.880 Or seeing critics as mascots instead of monsters.
01:15:56.720 It has had an immediate impact on my outlook.
01:15:59.760 It's incredible how these simple changes can rewire our thoughts and emotions in such a profound way.
01:16:05.460 All right, so that's number one.
01:16:07.780 Here's number two.
01:16:09.160 Is it AI or human?
01:16:10.820 Imagine converting roadblocks into opportunities and setbacks into launch pads.
01:16:15.740 This book arms you with a treasure trove of strategies to revolutionize your outlook.
01:16:21.380 Adam's witty and engaging style makes complex concepts accessible, leaving you empowered to seize control of your mind.
01:16:29.380 So, let's talk about the second one.
01:16:36.380 So, the second one, human or AI?
01:16:43.160 It's funny, you're all over the map on this.
01:16:46.800 All right, the answer is that the second one was AI.
01:16:49.620 And to me, it's obvious, because real people don't say things like, makes complex concepts accessible.
01:16:58.720 Well, sometimes they do.
01:17:00.620 But, leaving you empowered to seize control of your mind.
01:17:05.760 Is that how your friends talk?
01:17:07.520 How'd you like the book?
01:17:08.900 Well, I'll tell you.
01:17:10.500 It left me empowered to seize control of my mind.
01:17:14.200 So, that was good.
01:17:17.740 Yeah, so that's AI.
01:17:19.220 Now, look at the tells for the human.
01:17:22.520 It's incredible how these simple changes can rewire your thoughts.
01:17:26.960 Would AI say, it's incredible?
01:17:32.140 No.
01:17:32.860 Do you know why?
01:17:34.940 Why would AI not say, this is incredible?
01:17:38.760 How do you know it's not AI by that one word, incredible?
01:17:42.880 It's an opinion.
01:17:44.940 It's an internal feeling.
01:17:47.660 So, the author who said, it's incredible, is saying, I have a feeling.
01:17:53.820 Because incredible is not an objective standard.
01:17:56.860 You can't say there's six of them, so it's incredible.
01:17:59.280 It's just a feeling.
01:18:00.900 That's your tell.
01:18:02.160 AI never writes about its feelings.
01:18:04.460 It would if you gave it a super prompt, but it doesn't do it automatically.
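As a toy illustration of that tell, here is a little Python sketch of the heuristic just described: feeling words suggest a human, polished blurb phrases suggest AI. The word lists are invented for the example; this is a sketch of the idea, not a real detector:

# Toy sketch of the "feelings" tell: subjective feeling words point to a
# human author, stock blurb phrases point to AI. Both lists are made up
# for illustration.
FEELING_WORDS = {"incredible", "amazing", "love", "wow"}
BLURB_PHRASES = ["complex concepts accessible", "treasure trove",
                 "empowered to seize control"]

def guess_author(text: str) -> str:
    lowered = text.lower()
    tokens = lowered.split()
    feeling_hits = sum(word in tokens for word in FEELING_WORDS)
    blurb_hits = sum(phrase in lowered for phrase in BLURB_PHRASES)
    if feeling_hits > blurb_hits:
        return "probably human"
    if blurb_hits > feeling_hits:
        return "probably AI"
    return "can't tell"

print(guess_author("It's incredible how these simple changes rewire our thoughts."))
# -> probably human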
01:18:07.960 All right.
01:18:08.180 But, let's say, toolbox for the mind.
01:18:15.140 Toolbox for the mind.
01:18:16.500 AI or human?
01:18:18.180 That's somebody who's a good writer, you can tell, but human.
01:18:27.280 Yeah, human.
01:18:28.220 Because toolbox is simple, as opposed to an engaging style making complex concepts accessible.
01:18:35.920 All right.
01:18:37.320 A good writer would not write this sentence.
01:18:39.780 Engaging style makes complex concepts accessible.
01:18:43.140 No good writer would write that sentence.
01:18:47.080 All right.
01:18:47.400 So, let's say, seeing critics as mascots instead of monsters.
01:18:57.580 That's another tell for a human.
01:19:00.160 Because out of the whole book, the AI would sort of randomly pick something.
01:19:06.020 But a human would very much have gone to that example.
01:19:09.680 So, that was, you know, I won't talk about what it's about.
01:19:13.760 But in the book, it was a human-ish story.
01:19:16.960 So, a human would go to that.
01:19:19.000 Whereas an AI might say, well, they all look the same to me.
01:19:21.620 I'll pick one.
01:19:22.460 All right.
01:19:22.680 So, here's my take on AI.
01:19:28.000 AI learns how to write by looking at all the writing that people have written that's available for it to study.
01:19:35.740 Right?
01:19:36.580 How in the world can AI learn to write well?
01:19:40.580 If it looks at 99.9% bad writing, which is what I would estimate most writing is, how does it become good?
01:19:51.360 How would it know what the good writing was?
01:19:54.500 Now, suppose it went to a list of great works.
01:19:59.160 Because, you know, it doesn't do that.
01:20:00.560 So, AI doesn't use reason and say, oh, let me see what the smart people say about this book.
01:20:06.200 It doesn't do that.
01:20:07.000 It's just looking at the words and looking at patterns.
01:20:10.660 So, how in the world would it know what good writing was?
01:20:14.680 Because all of the examples, 99%, would be bad writing.
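To make the words-and-patterns point concrete, here is a toy Python sketch of the idea: a bigram model that learns nothing but next-word counts from whatever text it is fed, with no notion of whether that text is good or bad. The tiny training text is obviously a stand-in:

# Toy bigram model: its entire "knowledge" is counts of which word
# follows which in the training text. Feed it bad writing and those are
# the patterns it learns.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the rat".split()

follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

print(follows["the"].most_common())
# -> [('cat', 2), ('mat', 1), ('rat', 1)]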
01:20:18.800 Now, somebody said, well, it would be easy to fix that.
01:20:21.960 You know, you could point it at some good writing, and then it would know what the good stuff is.
01:20:25.560 So, I said, well, what would be an example of good writing that you'd point it to?
01:20:31.340 And somebody said Tolkien would be an example.
01:20:34.420 Tolkien, Lord of the Rings.
01:20:36.280 Another one would be Shakespeare.
01:20:38.580 Shakespeare.
01:20:38.980 And then I said, because my ego knows no limits, those are two of the shittiest writers I've ever experienced.
01:20:48.680 Shakespeare is the worst.
01:20:50.540 That's a total disaster of writing.
01:20:53.200 And Tolkien?
01:20:54.140 My God.
01:20:55.560 He was criminally bad at writing.
01:20:58.820 Now, Tolkien was great at stories and characters.
01:21:02.420 But his writing was just exhausting.
01:21:07.020 Nobody would teach anybody to write like Tolkien.
01:21:09.660 Do you think there's a class, write like Tolkien?
01:21:12.260 No.
01:21:13.260 Because he was a terrible writer.
01:21:15.740 He was just great at storytelling.
01:21:17.900 How about Shakespeare?
01:21:19.860 Well, that's not even writing.
01:21:21.840 I don't even know what that is.
01:21:23.140 It's more like feeling the words or, you know, acting like, well, I'll be more brutal.
01:21:30.660 I believe that Shakespeare is a mass hallucination where smart people said it was good, and then all the other smart people had to say it was good.
01:21:40.140 There is no fucking way that Shakespeare would be good if it dropped today.
01:21:45.400 Right?
01:21:46.160 If somebody published the first ever works of Shakespeare and you'd never heard of them, you would not be picking that book up and say, whoa, whoa, this is the greatest writing I've ever seen.
01:21:56.280 No, it is a complete hallucination based on the fact that smart people seem to have liked it before, so you're going to say it was smart, too.
01:22:06.000 Now, does it have no redeeming features?
01:22:09.940 No, I'm not saying that.
01:22:12.100 No, Shakespeare has an entertainment quality to it, but it's not the writing.
01:22:17.000 The writing is just too hard to read.
01:22:21.060 It's the feeling of the words, perhaps.
01:22:24.340 So it's the, you know, maybe the emotion that they portray without being in the same form that would be easy to understand if it were common English.
01:22:32.900 So it has value.
01:22:35.000 More like poetry.
01:22:36.320 More like poetry.
01:22:37.180 That would be a better way to say it.
01:22:38.800 So it's not good writing, but it might be good poetry.
01:22:41.780 I'd be willing to accept that reframe.
01:22:44.580 Okay, yeah.
01:22:46.260 So suppose you teach AI, look at some Shakespeare, look at some Tolkien.
01:22:52.760 Now you know what good writers are.
01:22:54.840 But you say, Scott, those are, you know, those are, first of all, they're a little older, right?
01:23:00.580 Tolkien's a little bit older.
01:23:01.940 So we don't write like that.
01:23:03.520 So give us a more modern example of a good writer.
01:23:06.100 All right.
01:23:06.960 Stephen King.
01:23:11.080 Okay.
01:23:12.360 Stephen King.
01:23:13.680 Great writer, right?
01:23:14.580 Well, it's debatable.
01:23:22.020 Debatable.
01:23:23.020 I would say he spends a lot of time describing things I don't want to hear about.
01:23:27.960 But on the other hand, he also wrote one of the best books on how to be a writer.
01:23:33.620 It's called On Writing.
01:23:35.800 So while I can disagree with Stephen King on politics, his one book that wasn't fiction, the one about how to be a writer, I highly recommend.
01:23:44.180 If you want to be a writer, start there.
01:23:47.260 I would recommend it, absolutely.
01:23:48.860 But his fiction, I wouldn't say he's a good writer as a writer, but he might be a good story thinker.
01:24:00.080 He might be good at organizing stories and characters or something.
01:24:03.740 I don't know.
01:24:08.260 So that's what...
01:24:09.540 Yeah, I would say Trump is an amazing writer.
01:24:12.320 You know, even with his occasional typos, he's an amazing writer.
01:24:17.120 And he will never get credit for that.
01:24:19.680 Nobody will ever understand that Trump is one of the best writers in the modern world.
01:24:25.120 It's insane, but he is.
01:24:31.040 All right.
01:24:34.800 He's an average speller.
01:24:36.280 Yeah.
01:24:40.260 All right.
01:24:41.040 So I don't think AI will ever become a good writer unless they train it to write like one particular good writer.
01:24:53.400 That could work.
01:24:55.140 So if you said write in the style of, oh, you know, but it still can't do that.
01:25:00.140 So far, it really can't write in the style of.
01:25:02.540 It's sort of like somebody doing an impression of a famous person.
01:25:08.300 You can tell it's not the famous person.
01:25:10.500 It's obvious.
01:25:11.660 But you still laugh at the impression because it reminds you of it.
01:25:15.520 So I think AI is more like an impression, doing an impression of a human.
01:25:20.320 It reminds you of what humans can do,
01:25:23.140 but it's not what they can do.
01:25:27.020 Is Vivek America Great Again a clever slogan?
01:25:32.540 Oh, Vivek America Great Again.
01:25:34.620 I get it.
01:25:35.200 Vivek America Great Again.
01:25:36.560 No, I wouldn't put the America Great Again in there again.
01:25:40.880 Do you know what I think Trump's slogan should be?
01:25:45.300 I'm going to make this a dramatic reveal.
01:25:49.740 All right.
01:25:50.340 Because I want you to wait, have a little pause.
01:25:53.680 And then when I say it, I'm going to be quiet for a moment.
01:25:56.740 So you just soak it in.
01:25:58.280 All right.
01:25:58.560 So here's what Trump's slogan should be.
01:26:06.960 Find out.
01:26:18.280 And then just stop talking.
01:26:22.080 Stop talking.
01:26:23.020 Just take a vow of silence until the election.
01:26:28.840 My new slogan?
01:26:31.260 Find out.
01:26:34.520 Because that's where we're at.
01:26:36.380 Right?
01:26:38.020 Well, when I said it, let me do a test.
01:26:42.320 When I said it, could you feel it?
01:26:45.540 Can you see the goosebumps?
01:26:48.060 That's feeling, feeling it.
01:26:50.600 I'm literally feeling it.
01:26:51.760 I can feel it in my body.
01:26:54.560 Right?
01:26:55.640 Am I the only one?
01:26:57.580 Tell me that you can feel that in your body.
01:27:03.300 Some yes, some no.
01:27:05.460 Yeah, some feel it.
01:27:06.960 So obviously, you know, a slogan isn't going to work the same on everybody.
01:27:10.920 But, yeah, everybody's going to have a different reaction.
01:27:16.180 I don't know.
01:27:16.620 I think it'd be funny.
01:27:17.660 He's not going to do that, obviously.
01:27:18.920 But it would be funny.
01:27:21.760 You know, we've been saying it's the find out phase.
01:27:27.400 But it'd be funny if he said it.
01:27:29.360 Because what I think about when I hear that, find out,
01:27:33.500 I think about one day in jail.
01:27:36.920 You know, don't you think the Democrats are wondering, well, I wonder what would happen if we actually put him in jail?
01:27:46.800 To which I say, go ahead, find out.
01:27:51.200 Do you really want to know?
01:27:52.380 Well, you're not going to like it.
01:27:55.540 But find out.
01:27:57.740 I think if you reveal too much, you're giving away too much.
01:28:03.080 I think we're in the find out phase.
01:28:05.460 And the find out phase does not require me to signal what it will be before it is.
01:28:10.760 Because that's what makes the finding out so much fun.
01:28:14.340 You find out when it's too late to turn back.
01:28:18.080 So wait until it's too late to turn back, and we'll be happy to let you find out.
01:28:22.560 Now, I, of course, do not promote violence in any way.
01:28:26.400 So in non-violent ways, I say, let's find out.
01:28:33.340 If that's where they want to take it, they will find out.
01:28:37.780 So that's all I have for you today.
01:28:40.160 The greatest live stream ever.
01:28:43.460 Ran late because I think it was worth every moment.
01:28:46.740 And I think you'd agree.
01:28:49.240 And for those watching on X for the first time, this is the second day we've done this.
01:28:53.820 I hope you're enjoying the show.
01:28:57.340 And we will talk to you tomorrow.
01:29:01.500 Thanks for joining, YouTube.
01:29:04.460 Bye.