Real Coffee with Scott Adams - October 27, 2022


Episode 1909 Scott Adams: Elon Musk Buys Twitter And Nothing Will Be The Same, Including Ukraine War


Episode Stats

Length: 59 minutes
Words per Minute: 139.7
Word Count: 8,256
Sentence Count: 611
Misogynist Sentences: 1
Hate Speech Sentences: 21


Transcript

00:00:00.000 Good morning, everybody, and welcome to a highlight of civilization and one of the most
00:00:08.860 fun days that you'll ever have on Coffee with Scott Adams.
00:00:13.520 Today, Neo has entered the Matrix, and I don't think anything's going to be the same.
00:00:21.520 I don't know.
00:00:22.240 We'll find out.
00:00:23.480 But would you like to take it up a notch?
00:00:25.500 Would you like to enjoy coffee in not only this dimension, but higher dimensions as well?
00:00:31.620 Well, all you need is a cup or mug or a glass, a tank or chalice, a stein, a canteen, jug or
00:00:35.640 flask, a vessel of any kind. Fill it with your favorite liquid. I like coffee.
00:00:41.620 And join me now for the unparalleled pleasure.
00:00:45.500 It's the dopamine hit of the day, the thing that makes everything better.
00:00:48.220 It's called the simultaneous sip, and it happens now.
00:00:51.240 Go.
00:00:55.500 Mmm, yeah, freedom.
00:01:00.660 Freedom!
00:01:03.520 Well, I was very proud of my tweet last night.
00:01:07.940 I tweeted the following.
00:01:09.860 Everything you can identify by its initials is working for its own interests, not yours.
00:01:16.060 No exceptions.
00:01:18.120 Now, how would you like a little persuasion tip to kick off your morning?
00:01:22.740 What was the persuasion part of this tweet?
00:01:26.420 I'll read it again.
00:01:27.640 And you tell me what is the persuasion part.
00:01:30.000 Everything you can identify by its initials is working for its own interests, not yours.
00:01:35.100 No exceptions.
00:01:36.880 Yeah, no exceptions.
00:01:39.940 Do you know me well enough to know that, of course, I know there are exceptions.
00:01:46.360 Don't you know I know there are exceptions?
00:01:48.140 Of course there are exceptions.
00:01:49.660 Of course there are.
00:01:50.440 But the fact that I make you think about it is what binds you to the tweet.
00:01:58.620 So here's what I hoped you would do.
00:02:01.960 I'm seeing some exceptions, like AIDS.
00:02:05.380 Yeah, that would be an exception.
00:02:07.780 So what you were supposed to do is say, well, this can't be true.
00:02:11.260 Well, it can't be true that everything that you can identify by its initials is working for itself, not you.
00:02:17.100 I mean, what about the FBI?
00:02:19.420 Oh, okay.
00:02:20.860 All right, but it can't be true of everything.
00:02:22.620 I mean, take, for example, the IRS.
00:02:25.860 Oh, okay.
00:02:27.040 Well, at least you've got the GOP.
00:02:29.820 Oh, okay.
00:02:32.640 Yeah.
00:02:33.360 And not all people.
00:02:35.440 Not all people.
00:02:36.320 I mean, what about AOC?
00:02:38.480 Oh, yeah.
00:02:40.700 But surely not BLM.
00:02:43.280 Oh, yeah.
00:02:47.100 But somebody said KFC.
00:02:49.700 Somebody said most, somebody said every business is identified by its letters on the stock market.
00:02:57.240 To which I say, that's my point.
00:02:59.960 That's my point, not your point.
00:03:02.080 Every single company is working for itself.
00:03:05.020 That's called capitalism.
00:03:06.160 So, yes, every single company that's listed on the exchange is working for itself.
00:03:13.600 That's how it works.
00:03:15.040 Even KFC.
00:03:16.480 Even IBM.
00:03:19.040 All right.
00:03:20.040 So, just so you know, you can be on the inside.
00:03:24.100 Of course I did not believe there were no exceptions.
00:03:27.380 Of course I did not believe that.
00:03:29.580 But it's funny.
00:03:31.340 Can you give me that?
00:03:32.640 Did it work?
00:03:33.240 How many of you spent extra time because I said there were no exceptions?
00:03:40.200 I'll bet you did.
00:03:41.640 Yeah, ESG.
00:03:42.920 Right.
00:03:44.660 All right.
00:03:45.200 So that's your persuasion tip of the day.
00:03:48.400 That would be the intentional error persuasion trick.
00:03:51.640 Well, I continue to watch CNN to see if they are turning toward the middle.
00:03:59.260 And I do see signs of it.
00:04:01.200 I do see signs of it.
00:04:02.540 For example, we did see that even CNN people, I saw several of them, question whether Fetterman should have even done the debate.
00:04:14.980 I'm not sure they would have done that a year ago.
00:04:19.100 I feel like that was, you know, clearly a move to the center.
00:04:23.920 And Smerconish, I think, was the one who said it directly.
00:04:27.060 You know, maybe it wasn't his time.
00:04:28.600 Maybe wait till next time.
00:04:30.960 You know, which has nothing to do with the disability.
00:04:33.180 I think you can be, you know, perfectly appropriate and still say maybe next time was better for him.
00:04:42.480 So along that theme, Chris Cillizza, who writes for CNN and had been one of the most prolific anti-Trump opinion people.
00:04:54.980 But here he is today.
00:04:57.020 He says, he writes a little opinion piece.
00:04:59.340 It's just this weak little opinion piece that looks like he dashed it off in 10 minutes.
00:05:03.740 He says, with less than two weeks left before the November midterm elections,
00:05:09.040 all signs are pointing to a strong Republican showing that would result in a switch of party control in the House and possibly the Senate.
00:05:18.860 He goes, that's very good news for Donald Trump.
00:05:22.800 And then I'm waiting for the twist.
00:05:25.220 You know, the part where, OK, now he's going to get him.
00:05:28.180 And he just goes on to say that Trump will do what he always does, which is take credit.
00:05:35.800 If the midterms go Trump's way, do you think Trump will take credit for the fact that a lot of people that, let's say, the observers didn't think had a chance,
00:05:47.720 might at least tie or might actually win?
00:05:52.120 Yes.
00:05:53.080 Not only will Donald Trump predictably try to take credit, but let me ask you, he deserves it, right?
00:06:03.860 Right?
00:06:04.300 I wouldn't question it for one moment, would you?
00:06:09.700 If Trump took credit, wouldn't you say, yeah, that's warranted?
00:06:14.740 Because I believe that even the people who are doing things sort of independently from Trump are still conforming to Trump.
00:06:23.480 Am I right?
00:06:24.080 There's nobody who's out there completely independent of Trump if they're a Republican.
00:06:29.720 If you're a Republican, you're running under a Trump set of parameters, whether you say it out loud or not.
00:06:36.900 You don't have to say it out loud.
00:06:39.200 Trump would get complete credit for this, in my opinion.
00:06:42.320 I think that would be completely justified.
00:06:44.420 But here's the surprising part.
00:06:46.800 Even Chris Cillizza says that's true.
00:06:49.520 He says it a little more indirectly.
00:06:51.020 But he does say that Trump campaigned for, you know, some of these semi-underdogs and that they might actually win.
00:07:00.000 And that taking credit wouldn't be crazy.
00:07:03.260 Right?
00:07:03.680 I mean, I'm adding that part.
00:07:05.820 But, yeah, I do see a genuine shift toward the middle for CNN.
00:07:10.800 It continues.
00:07:12.740 All right.
00:07:13.200 The big story you all want to talk about is Elon Musk carried a sink into Twitter headquarters yesterday.
00:07:21.640 An actual kitchen sink.
00:07:25.120 And then he tweeted, you know, that he was in Twitter headquarters: let that sink in.
00:07:32.700 Now, was that the most brilliant thing that anybody ever did?
00:07:36.760 See, you think that Elon Musk is mostly smart with technology.
00:07:44.720 That's what you think.
00:07:45.820 He knows how to make money.
00:07:47.420 But you really miss that his understanding of persuasion is as good as it gets.
00:07:53.320 It's as good as it gets.
00:07:55.360 He's in the elite category of persuaders.
00:07:58.760 And this was a perfect example.
00:08:00.160 Who else would have carried a sink into Twitter headquarters?
00:08:05.180 And I spent a lot of time trying to figure out why.
00:08:09.080 Right?
00:08:09.860 How many of you spent a lot of time thinking about it and trying to figure out why exactly a sink?
00:08:16.740 I actually still don't know.
00:08:20.580 Oh, is that an old meme?
00:08:23.780 It's an old meme.
00:08:27.360 Let that sink in.
00:08:28.660 Yeah, but let that sink in.
00:08:32.780 Is that why he brought the sink?
00:08:34.500 Just for that one line?
00:08:37.240 People are saying it's an old meme.
00:08:39.780 Well, I understand that everything but the kitchen sink, but how does it...
00:08:43.620 I'm not quite connecting it.
00:08:45.260 Anyway, it doesn't matter.
00:08:47.220 So, here's the beauty.
00:08:49.680 The beauty is that you all interpreted it however you wanted to.
00:08:53.120 Am I right?
00:08:54.600 Didn't you all simply put whatever interpretation you wanted on that?
00:08:58.040 Which is perfect.
00:09:00.020 It's just perfect.
00:09:01.420 So, what do I teach you about persuasion?
00:09:05.620 Here are two rules that Elon Musk is using.
00:09:10.600 Number one, visual.
00:09:14.260 Visual persuasion beats everything.
00:09:16.800 So, he created a picture that could capture the concept.
00:09:21.480 Because everybody was going to talk about the concept of him buying it today.
00:09:24.660 But there's no picture of a person doing a contract to complete a transaction.
00:09:31.440 He actually created a visual to basically capture that story.
00:09:38.860 And, of course, he guaranteed that that would be on every media outlet.
00:09:43.880 Now, here's the other.
00:09:44.840 So, the first rule of persuasion is visual.
00:09:47.740 You have to own the visual.
00:09:48.720 The second part of persuasion is surprise.
00:09:53.900 Surprise.
00:09:55.340 So, he had to do something that was visual, but visual in a surprising way.
00:10:00.760 So, he brings a sink.
00:10:03.620 These are two of the, you know, probably if you listed your top ten rules of persuasion,
00:10:09.840 these would be two of the ten.
00:10:11.080 When you see him persuading, he's doing it right.
00:10:16.700 Like, he's actually using the best technique of persuasion.
00:10:20.860 This is not an accident.
00:10:22.380 He knows how to do this.
00:10:24.980 And, I don't know where he learned it, but...
00:10:28.500 But, one could speculate.
00:10:31.780 All right, so, I tweeted that Neo has entered the matrix.
00:10:39.380 And, Neo has entered the matrix.
00:10:44.960 Now, I always tease people when they make matrix, you know, references,
00:10:49.280 because it's such a stale reference.
00:10:51.800 But, I'm going to make an exception for myself, like everybody does.
00:10:56.480 Everybody makes an exception for themselves.
00:10:58.840 I'm going to make an exception for myself because it's too perfect.
00:11:01.780 Because, he actually is entering the, you know, not exactly the matrix,
00:11:08.980 but once he has access to the algorithm, he actually controls reality.
00:11:19.700 He's not just controlling a company.
00:11:23.740 And, he's not just controlling, you know, an important media enterprise.
00:11:28.800 It's way beyond that.
00:11:30.900 He's controlling reality.
00:11:33.960 Neo has entered the fucking matrix.
00:11:37.960 He's literally...
00:11:39.180 I mean, this is as close to a literal as you can get.
00:11:43.480 I mean, it's not literally Neo.
00:11:45.320 But, it's as close as you can get, you know, in our world.
00:11:48.740 Now, do you think this marks a turning point, or is it just going to be another ownership of Twitter?
00:11:56.900 And, he'll do some funny things and nothing will change.
00:11:59.720 What do you think?
00:12:00.740 Do you think this changes everything, or changes nothing?
00:12:04.360 I think it's going to change everything.
00:12:11.140 Maybe not right away, but faster than you think.
00:12:14.840 And, I'll develop this a little bit.
00:12:16.700 Now, I saw somebody very smart on Twitter, Brian Roemmele.
00:12:24.980 I wish I could pronounce his name.
00:12:26.720 But, he was speculating that this is just the first part, and that Elon might have his eyes on acquiring Rumble.
00:12:38.020 Now, I own stock in Rumble, so full disclosure.
00:12:43.340 I don't want to start a rumor that benefits me financially in some non-disclosed way, right?
00:12:49.900 So, I've got some stock in Rumble,
00:12:52.780 which I acquired because Rumble acquired Locals, and I had a small investment in Locals, so I went over to Rumble.
00:12:59.820 What do you think?
00:13:02.340 Do you think Elon Musk would buy Rumble?
00:13:07.460 It feels like a natural, doesn't it?
00:13:10.100 It does actually feel like a natural extension, but it feels like, you know, who knows.
00:13:14.860 So, I'm not going to predict that, but I'll note that somebody else has predicted it.
00:13:20.140 Somebody smart.
00:13:24.760 Question number one.
00:13:25.860 What would happen, just hypothetically, if Twitter ends up changing the algorithm such that Democrats see real news for the first time?
00:13:37.840 What would happen?
00:13:40.020 I mean, seriously.
00:13:41.180 What would happen if Democrats saw real news for the first time?
00:13:45.560 What will that do?
00:13:47.100 Because I'm here to tell you that there's a mental health element to this that is underappreciated.
00:13:53.460 Have any of you ever had the experience of thinking something was true, let's say about your own life,
00:14:01.300 only to learn that what you thought was always true, or had been true for a long time, was completely untrue?
00:14:08.580 I'll give you an example.
00:14:09.640 Suppose you worked for Enron, and you'd worked for Enron for years, and then later you found out that Enron wasn't even a real company.
00:14:17.180 It was basically a whole sham.
00:14:18.800 Like, what would that do to your mental health?
00:14:23.820 I know a woman.
00:14:25.500 I won't name names.
00:14:27.160 But somebody I knew.
00:14:28.840 Somebody I still know.
00:14:29.760 I obviously still know her.
00:14:31.660 And she had been with a guy for years and years, and they were very close.
00:14:36.720 Didn't feel like getting married necessarily, but it looked like it was heading that way.
00:14:40.320 And one day, her boyfriend had some major health problem.
00:14:47.420 So the boyfriend goes to the hospital, and she visits him in the hospital.
00:14:51.640 And there's another woman in there visiting the same boyfriend in the hospital, and they see each other.
00:14:56.880 They go, hi.
00:14:57.840 I see you're visiting the same person.
00:15:00.080 What are you here for?
00:15:03.120 And she goes, oh, I'm his wife.
00:15:05.940 And she goes, what are you here for?
00:15:07.980 And she said, I'm his girlfriend of five years.
00:15:13.160 And they learned for the first time he had two lives, two complete lives.
00:15:16.960 Now, what happened to her mental health when she found out the last five years of her life were not what she thought?
00:15:27.140 It wasn't easy.
00:15:29.080 It wasn't easy.
00:15:30.760 Like, it was really disorienting.
00:15:32.740 I mean, you can imagine what that would do to you.
00:15:34.600 Your whole life, your whole life would be altered by that.
00:15:38.120 Because everything you saw would be, I don't know if I can trust that.
00:15:41.080 What would you ever trust?
00:15:42.660 You would never trust anything again, right?
00:15:46.960 So, one possibility of Elon buying Twitter is that Democrats will see real news for the first time.
00:15:57.260 Also, people on the right will see things from the left that they've never seen before.
00:16:03.100 What will that do to people?
00:16:05.100 Will that bring us together or make us crazy?
00:16:08.180 Because I think the initial impact will be to make us disoriented.
00:16:12.940 What happens when people get disoriented?
00:16:15.440 Go back to your persuasion lessons.
00:16:18.680 If everybody gets disoriented and they can't find their base anymore, like their team doesn't make sense, they're not sure which team they're on now, what happens?
00:16:27.760 When people are disoriented, they don't thrive in chaos.
00:16:32.680 No, that's not what happens.
00:16:36.080 Cognitive dissonance, maybe.
00:16:38.220 Cognitive dissonance may give them some escape.
00:16:40.880 But there's something scarier.
00:16:44.040 The scarier thing is that when nobody has anything to latch on to, the first person who provides that thing to latch on to, they'll latch on to it.
00:16:53.380 So, in other words, they'll be looking for a new truth.
00:16:56.620 Whoever provides the new truth owns them, right?
00:17:01.320 They'll be broken free from their old truth and floating free, and they're free agents.
00:17:06.520 Somebody's going to suck them into their team.
00:17:10.600 You hope it's somebody good, right?
00:17:13.260 You hope it's somebody who has some principles and not somebody evil, but somebody's going to do it.
00:17:18.560 A lot of free agents will be created by this, I think.
00:17:21.020 Now, that assumes that the algorithm changes in some dramatic way, which might not happen.
00:17:26.600 We don't know.
00:17:27.920 What would happen if Elon Musk finds out that Twitter was actually, let's say, being influenced by foreign forces and everything was corrupt?
00:17:42.100 Well, he'd fix it.
00:17:43.400 I mean, that would be step number one.
00:17:45.700 But things could look real different, because we don't know.
00:17:49.700 Well, so, you want to hear a good example of some fake news today?
00:17:55.020 Wall Street Journal had some excellent fake news.
00:17:58.200 And I don't really accuse the Wall Street Journal of fake news often, do I?
00:18:03.700 In fact, I don't know if I've name-checked them before for fake news.
00:18:08.320 So, this will be the first time, probably.
00:18:10.240 All right?
00:18:10.760 So, the Wall Street Journal had an article that China is considering interfering in our elections.
00:18:16.100 I don't know if that's true or false, but probably true.
00:18:20.440 And it reports that they may have changed their mind about interfering in the 2016 election, or was it 2020?
00:18:28.200 But they haven't interfered recently, because they didn't have a preference of who was president.
00:18:34.900 So, they didn't think that their interests would be served by either president, so they didn't interfere.
00:18:40.060 But now, the thought is that they might interfere.
00:18:44.040 And they would interfere in a general, mucking-up-society way, as opposed to picking a winner.
00:18:48.900 So, they try to sow civil war and, you know, make us doubt the elections and stuff.
00:18:56.980 Now, that part, I'm willing to believe, is all correct reporting by the Wall Street Journal.
00:19:02.480 I wouldn't know one way or the other.
00:19:04.860 But here's what they said next.
00:19:07.280 That China had not risen to the level of Russia, and they said also Iran, in interfering with our elections, as Russia and Iran are alleged to have done.
00:19:23.800 Do you see the fake news?
00:19:26.000 Russia interfered with our election.
00:19:28.880 Now, specifically, they were talking about bots and fake users.
00:19:34.840 So, here I'm not talking about hacking emails.
00:19:38.500 So, nothing about hacking emails or the Steele dossier or anything like that.
00:19:41.780 I'm only talking about trolls, where you're sending social media trolls.
00:19:46.820 Now, do you think that Russia influenced our elections in the past with their trolls?
00:19:52.280 What's the news say?
00:19:53.820 The news says yes, right?
00:19:55.760 The news says they tried to influence.
00:19:59.220 Now, is that real news or fake news?
00:20:01.780 Is it real or fake news that Russia tried to interfere with our elections with bots?
00:20:09.040 It's real fake news.
00:20:11.300 It's real, and it's also fake.
00:20:14.600 It's real that it happened.
00:20:17.500 I think everybody agrees that it happened.
00:20:20.420 How big was it?
00:20:21.260 It was like a $100,000 ad spend for some memes that we got to see eventually, and the memes
00:20:30.800 had no power whatsoever.
00:20:33.060 I can tell you, as somebody who has studied persuasion, they didn't have any persuasive
00:20:37.500 power.
00:20:38.240 They were just random jokes.
00:20:40.200 No persuasion whatsoever.
00:20:41.960 So do you think the Wall Street Journal gave you a straight report by saying that China
00:20:48.000 might try to get up to the level of interference of Russia, or should they have said nobody has
00:20:55.160 done anything of substance yet?
00:20:58.360 They might, but even Russia has not done anything of substance yet.
00:21:03.340 They did a real thing.
00:21:04.300 It just didn't make any difference and couldn't have at that size.
00:21:07.800 Now, correct me if I'm wrong.
00:21:10.300 That's fake news, right?
00:21:11.840 It's fake news to suggest that Russia had an impact with the bots.
00:21:17.400 That's completely fake news.
00:21:21.100 So, all right.
00:21:25.520 Elon Musk tweeted that citizen journalism is one of the big benefits.
00:21:33.140 He said, a beautiful thing about Twitter is how it empowers citizen journalism.
00:21:36.920 People are able to disseminate news without an establishment bias, which, of course, caused
00:21:42.940 critics to stream in and say, but those citizens will not be fact-checked.
00:21:50.220 Now, does that sound like the dumbest criticism you've ever heard in 2022?
00:21:55.260 Let me say it again just so you can laugh at it.
00:21:57.620 I will now be the critic.
00:21:59.700 You can't have citizen journalism because who would do the fact-checking?
00:22:03.680 All right, you can stop laughing.
00:22:07.600 There's no fact-checking now.
00:22:10.020 Who believes there's any fact-checking on the actual professional news?
00:22:14.860 The fact-checking is clearly fake.
00:22:17.140 It's obviously fake.
00:22:18.920 It's like way beyond the level where it's a conspiracy theory that it's fake.
00:22:23.360 It's obviously demonstrably factually proven to be fake.
00:22:28.140 How could citizen journalism be worse than that?
00:22:32.540 It couldn't possibly be worse, could it?
00:22:36.300 Yeah, in a sense, every tweet is fact-checked in real time by the rest of the comments.
00:22:41.020 You're right.
00:22:42.560 All right, here's some more fake news that's real news.
00:22:46.280 Fake news.
00:22:47.140 Fake news that's real news.
00:22:49.420 You remember PayPal said that they were going to fine people $2,500 for spreading misinformation.
00:23:00.700 And then the world rebelled and said, no, you can't do that.
00:23:06.540 We'll close our accounts.
00:23:08.180 And then PayPal, they backpedaled.
00:23:13.480 And they said, oh, that was just a mistake.
00:23:16.120 That wasn't even supposed to be in there.
00:23:17.720 Total mistake.
00:23:19.540 We took that out.
00:23:20.380 It was like a typo.
00:23:21.560 It's almost like it never happened.
00:23:23.880 And then, after you were no longer paying attention, they published their new terms of service.
00:23:32.160 And the $2,500 fine is, according to Twitter users, not Twitter, but users on Twitter,
00:23:39.000 they put that $2,500 back in there as soon as the outcry died down.
00:23:44.460 Is that true news or fake news?
00:23:45.960 Go.
00:23:47.720 True news or fake news?
00:23:49.220 That PayPal put the $2,500 back in there after they said they'd taken it out.
00:23:56.240 It is fake news, but also kind of true.
00:24:01.420 Here's what's fake.
00:24:03.200 They did take out the spreading misinformation part.
00:24:06.560 So the $2,500 part apparently was always in there, but it was related to other bad behavior.
00:24:15.600 So, for example, if you were using their service to promote, I don't know, Hitler or racism or something,
00:24:22.340 you could have been dinged for $2,500.
00:24:25.460 So the $2,500 is still there, but it was there always, and it wasn't related to spreading misinformation.
00:24:34.280 So the misinformation part they got rid of.
00:24:38.140 Or did they?
00:24:40.220 Or did they?
00:24:42.800 Let's go a little bit deeper.
00:24:44.440 So first of all, it's fake news, because they didn't put in the spreading misinformation part.
00:24:53.820 What is it?
00:24:55.200 Because let's look at what part they did leave in.
00:24:58.060 So they do not have spreading misinformation as one of the triggers for the $2,500 fee.
00:25:05.700 But here are some things that I think these have been in there before, I think.
00:25:11.180 But they're in there now.
00:25:13.320 So one of the things that you can't do is collect money for products that represent a risk to consumer safety.
00:25:22.080 So you can't be doing something that would present a risk to consumer safety.
00:25:31.160 So what would be some things that would be a risk to consumer safety?
00:25:38.040 Firearms.
00:25:39.720 Firearms would be a risk to consumer safety.
00:25:42.640 How about ivermectin?
00:25:44.380 Is ivermectin a product that, in somebody else's opinion, could be a risk to consumer safety?
00:25:53.680 Well, I could interpret it that way.
00:25:56.540 Because even if somebody said, well, ivermectin is well-tolerated,
00:26:01.480 then could not PayPal say, it doesn't matter that it's well-tolerated.
00:26:06.340 If you're promoting this, you're sort of anti-vaccine, and that's misinformation, or that's harmful.
00:26:13.620 Not misinformation, it's harmful.
00:26:16.220 So don't they still have a way to say that you're doing stuff that's harmful related to a product?
00:26:22.240 Does that give them any...
00:26:23.480 I mean, I'm not a lawyer, so I'm just speculating here.
00:26:26.300 I don't know.
00:26:29.880 How about this one?
00:26:31.820 Uh, yeah.
00:26:35.500 How about TikTok?
00:26:38.440 Do you think TikTok is a product that presents a risk to consumer safety?
00:26:45.560 If you collected money for something on TikTok?
00:26:48.720 I don't know if that's a thing or not.
00:26:50.760 But could you collect money?
00:26:52.100 Well, here's a story about TikTok.
00:26:54.300 Apparently, TikTok was spreading around something called the Blackout Challenge,
00:26:59.040 in which children were encouraged to strangle themselves.
00:27:03.540 I'm not making that up.
00:27:06.340 So the algorithm of TikTok delivered to a 10-year-old girl,
00:27:10.940 so it decided that she would be interested in seeing this,
00:27:13.740 a challenge to strangle herself, which she did, and then she died, allegedly.
00:27:19.020 Now, this story is a little bit too much, a little bit too on the nose.
00:27:23.940 So if tomorrow you find out this is fake news,
00:27:26.160 don't be surprised.
00:27:28.880 This has fake news written all over it,
00:27:31.240 but it also could be true,
00:27:32.900 which would be a huge tragedy.
00:27:36.300 But whether or not this one incident is true,
00:27:40.420 clearly TikTok is spreading, you know,
00:27:44.100 information that somebody could say was dangerous.
00:27:46.960 Somebody could.
00:27:47.560 So we're in murky territory if products that are a risk to consumer safety
00:27:54.380 are a trigger for, you know, getting in trouble.
00:28:01.980 How about this?
00:28:03.220 You could also get in trouble, according to PayPal,
00:28:06.560 if you're collecting money for sale of products or services
00:28:10.840 identified by government agencies
00:28:12.980 to have a high likelihood of being fraudulent.
00:28:18.620 So if you sold ivermectin for COVID,
00:28:24.640 then government agencies would say
00:28:26.640 that has a high risk of being fraudulent.
00:28:31.940 Right?
00:28:32.740 Wouldn't that example fit that?
00:28:35.720 So they do have some questionable elements
00:28:39.520 in their terms of service,
00:28:40.680 but we'll see how that works out.
00:28:45.600 Let's see.
00:28:47.560 More on AI,
00:28:49.420 because as much as you hate talking about AI,
00:28:51.620 it's the only thing that's going to matter
00:28:53.140 in about three years.
00:28:54.420 And you really need to be ready.
00:28:56.600 You would be mad at me
00:28:57.640 if I didn't warn you about this.
00:28:59.660 Right?
00:29:03.380 Do you remember
00:29:04.360 that it wasn't long ago
00:29:06.240 when people who thought they were smart
00:29:08.860 said, well, AI might take some of your factory jobs,
00:29:13.640 but the last thing AI will be able to do is art?
00:29:17.340 Do you remember when all the smart people
00:29:18.860 thought AI couldn't do art?
00:29:21.460 And do you remember when I kept saying,
00:29:23.520 uh, you are so wrong?
00:29:26.580 No?
00:29:27.500 No.
00:29:27.880 Art will be the first thing it can do.
00:29:30.240 Why?
00:29:30.600 Because art has patterns and rules
00:29:33.300 that the artists know,
00:29:35.940 but you don't.
00:29:36.580 So the reason that art looks,
00:29:39.120 you know, magical to people
00:29:40.240 who can't do it
00:29:41.060 is that they don't know
00:29:42.360 it's just based on rules.
00:29:44.200 And if they knew the rules,
00:29:45.720 maybe they couldn't be as good
00:29:47.120 as the great artists.
00:29:48.420 They could come close.
00:29:50.140 For example,
00:29:51.200 I discovered that humor
00:29:53.380 has a formula.
00:29:55.960 And I believe I'm the only person
00:29:57.400 who's ever written about it.
00:29:58.980 But there are six dimensions of humor.
00:30:01.060 You've got to use at least two out of six
00:30:02.520 to make it a joke.
00:30:03.380 Could I teach AI to write a joke?
00:30:06.340 Yes, I could.
00:30:07.740 Then all it would need to do
00:30:08.800 is rapidly test its jokes
00:30:10.380 against real people
00:30:11.640 until it had the best form of it.
00:30:15.000 It might need to test,
00:30:16.480 you know,
00:30:16.700 a thousand different varieties,
00:30:18.860 but one of those thousand
00:30:20.340 would be better than a human could do.
00:30:22.960 It's already there.
00:30:24.760 Right?
00:30:24.940 It's just a trivial,
00:30:26.120 a trivial iteration
00:30:27.720 until it can do humor
00:30:29.200 and singing.
00:30:30.760 It'll do music.
00:30:31.640 It already does graphic art
00:30:33.240 better than actual artists,
00:30:35.020 in my opinion.
00:30:36.100 Now, what it doesn't do yet
00:30:37.460 is follow the boss's specifications
00:30:41.220 as well as a human might.
00:30:44.160 But that's, you know,
00:30:45.500 that'll happen.
00:30:47.660 So, here's an example
00:30:49.900 of how this is going to mess up
00:30:51.140 every industry.
00:30:52.840 I believe every industry
00:30:54.400 is going to be turned upside down
00:30:55.760 in three years or less.
00:30:57.820 So, Shutterstock,
00:31:00.240 a company that sells photographs,
00:31:03.360 or rather licenses them
00:31:04.220 so you can use them
00:31:04.960 in your own content,
00:31:07.800 had a problem
00:31:09.320 because people were putting
00:31:10.380 computer-generated art
00:31:12.100 on their site.
00:31:13.760 And the computer-generated art
00:31:15.420 would borrow from living artists
00:31:17.460 and create a composite,
00:31:19.400 and so the living artists
00:31:20.420 were saying,
00:31:20.900 hey, that's actually my art
00:31:23.540 that it just stole
00:31:25.080 and combined with somebody else's.
00:31:26.600 You can't do that.
00:31:28.660 And so, Shutterstock
00:31:29.780 is coming up with an idea
00:31:31.420 in which they would compensate
00:31:33.540 the original artists
00:31:35.080 from which the AI borrowed.
00:31:39.480 I don't see how that can work.
00:31:41.800 Do you?
00:31:43.580 Because how do you know
00:31:44.960 what the AI borrowed from?
00:31:47.560 It just doesn't seem workable at all.
00:31:50.040 But they'll try to do it.
00:31:52.300 Now, let me give you a secret
00:31:56.600 to how to destroy
00:31:58.000 a business model.
00:31:59.860 Are you ready?
00:32:01.340 So, my business model is
00:32:03.180 I'm a syndicated cartoonist,
00:32:05.180 so if Dilbert does really well,
00:32:07.380 I get a lot of benefit.
00:32:10.260 But if another cartoonist
00:32:11.960 who is also syndicated
00:32:13.040 doesn't do well,
00:32:15.220 very few newspapers buy it,
00:32:16.880 for example,
00:32:17.660 then they make a small income
00:32:19.800 and I make a big income
00:32:20.860 because Dilbert does well.
00:32:21.960 Well, do you know
00:32:23.520 what the newspapers
00:32:24.180 want us to do?
00:32:25.320 Us, the syndicate.
00:32:27.160 They want to combine
00:32:28.180 all comics as one thing
00:32:29.720 and then just charge one price
00:32:32.000 and then the cartoonist
00:32:34.220 would sort of get an equal share
00:32:35.560 because everybody's cartoon
00:32:37.200 is being contributed
00:32:38.140 to this product.
00:32:39.800 Do you know what that does to me?
00:32:41.660 Puts me out of business.
00:32:43.560 Because I would never,
00:32:45.280 if that business model
00:32:46.580 becomes the dominant one,
00:32:48.160 and it will,
00:32:49.080 it will,
00:32:49.520 then I can't make money, right?
00:32:52.920 It would basically,
00:32:53.920 it would lower my income,
00:32:55.460 which is very worthwhile,
00:32:56.940 down to a level
00:32:58.100 that is not worthwhile
00:32:59.100 and then I would find
00:33:00.080 something else to do
00:33:01.100 that could pay better
00:33:02.380 because cartooning wouldn't pay.
00:33:05.780 That's what's going to happen
00:33:06.820 with these photographers.
00:33:09.060 So you start with,
00:33:10.020 you know,
00:33:10.160 the best photographers
00:33:11.160 who can make
00:33:11.780 insanely good pictures
00:33:13.180 and they're really the stars
00:33:14.720 and their stuff gets,
00:33:16.640 you know,
00:33:17.060 leased more.
Those people will no longer
00:33:19.720 be in that business
00:33:20.600 because AI will make it
00:33:22.960 unprofitable.
00:33:25.280 So the best,
00:33:26.680 the best humans
00:33:27.920 will retire immediately
00:33:29.100 because they won't have
00:33:30.940 a business model.
00:33:32.580 So whenever you're doing
00:33:34.200 something like
00:33:34.780 combining people's art,
00:33:37.380 you've destroyed
00:33:38.260 the art industry.
00:33:41.020 You got that?
00:33:42.140 True with music as well.
00:33:44.240 You know how music streaming,
00:33:45.420 it was terrible
00:33:47.840 for the most important artists,
00:33:50.480 but it was probably good
00:33:51.540 for the lesser artists.
00:33:54.440 Okay?
00:33:54.980 So everything is going
00:33:56.680 in a direction
00:33:57.200 that artists will be
00:33:58.340 out of business
00:33:58.880 if they're the star artists.
00:34:02.540 Now,
00:34:02.960 you might not care
00:34:04.000 unless you're a star artist
00:34:06.660 like I am,
00:34:07.740 which is weird
00:34:09.340 to call myself that,
00:34:10.260 but it's true.
00:34:10.680 I think at the moment,
00:34:15.080 I'm not sure this is true,
00:34:16.500 but I think I'm
00:34:17.480 the most successful
00:34:19.360 living cartoonist
00:34:22.180 who still has a cartoon
00:34:23.240 that's published
00:34:23.880 for adults.
00:34:28.540 I think Garfield's bigger,
00:34:30.000 but that's more oriented
00:34:31.640 toward kids.
00:34:33.140 I think that's true.
00:34:34.300 I think I just outlived,
00:34:35.420 I outlived artists
00:34:36.780 who were better.
00:34:37.200 So here again,
00:34:42.360 let's talk about Ukraine.
00:34:43.920 The Adams Law
00:34:45.080 of Slow-Moving Disasters,
00:34:46.700 you all remember that?
00:34:48.600 Years ago,
00:34:49.360 I came up with this idea
00:34:50.380 that humans
00:34:51.260 reliably solve
00:34:53.740 any problem
00:34:54.560 they can see coming
00:34:55.480 from a long ways away.
00:34:57.640 If it's a really bad problem,
00:34:59.240 they can still solve it
00:35:00.360 if they have time.
00:35:01.480 Like the year 2000 problem,
00:35:03.480 like running out of oil
00:35:05.220 in the 70s,
00:35:07.080 just all kinds of problems,
00:35:08.780 running out of food,
00:35:10.000 we solved it, etc.
00:35:11.360 So if you have time,
00:35:13.260 we're fine.
00:35:15.040 And here's another example
00:35:17.060 of that.
00:35:18.040 As of today,
00:35:19.040 CNN is reporting
00:35:20.100 that Germany,
00:35:22.040 who had been importing
00:35:23.720 55% of its gas
00:35:25.620 from Russia
00:35:26.220 before the war,
00:35:27.100 55%
00:35:29.400 of all of
00:35:30.400 Germany's
00:35:31.320 gas,
00:35:33.560 you know,
00:35:35.200 not gas for,
00:35:37.120 I think they mean
00:35:37.700 gas for
00:35:38.360 heating,
00:35:40.180 they've already
00:35:43.540 replaced it.
00:35:44.440 Yeah, LNG.
00:35:45.700 They've already
00:35:46.620 fully replaced it.
00:35:48.440 Germany
00:35:48.880 already has enough
00:35:50.520 gas for the whole winter
00:35:51.440 because they knew
00:35:53.380 they had time.
00:35:54.820 Now, if you told me
00:35:57.160 that that was possible,
00:35:58.440 I would have doubted
00:35:59.260 it, actually.
00:36:00.440 I would have doubted
00:36:01.200 my own law
00:36:03.260 of slow-moving disasters
00:36:04.400 because I couldn't
00:36:05.540 imagine that they
00:36:06.300 would build
00:36:06.720 storage facilities
00:36:07.980 or that they had
00:36:09.540 storage facilities
00:36:10.360 that would effectively
00:36:11.700 double their capacity
00:36:12.920 and they were just
00:36:14.200 sitting there empty.
00:36:16.100 I mean,
00:36:16.600 how exactly
00:36:17.300 do you store it?
00:36:18.840 Or is it stored
00:36:19.980 maybe on-site
00:36:21.120 somewhere else
00:36:21.900 but they own it?
00:36:23.280 Oh, you know,
00:36:23.880 I'll bet that's it.
00:36:25.340 I'll bet that's it.
00:36:26.700 It's not stored
00:36:27.420 in country.
00:36:28.760 That's it.
00:36:29.440 They own it
00:36:30.000 but it's probably
00:36:30.920 still with the producer.
00:36:32.540 Right?
00:36:33.500 Yeah, the producer
00:36:34.200 still owns it.
00:36:35.100 The producer holds it
00:36:36.220 but they own it.
00:36:37.320 That's got to be it.
00:36:39.880 But anyway,
00:36:40.780 they worked it out.
00:36:41.880 Now, Ukraine,
00:36:42.760 of course,
00:36:43.100 has a bigger problem
00:36:44.120 and I didn't see
00:36:45.540 any note about
00:36:46.220 the other European
00:36:47.020 countries because
00:36:47.900 maybe they had
00:36:48.500 less exposure
00:36:49.180 to Russia specifically.
00:36:51.560 But if the other
00:36:52.600 European countries
00:36:53.460 are in as good
00:36:54.460 a shape,
00:36:54.920 then Europe
00:36:56.480 is going to get
00:36:57.080 through the winter,
00:36:58.060 it's just going
00:36:58.440 to be really expensive.
00:37:01.260 And that would
00:37:02.020 take away
00:37:02.580 Putin's biggest
00:37:03.580 weapon,
00:37:04.440 wouldn't it?
00:37:05.580 So Putin's
00:37:06.540 biggest remaining
00:37:07.800 weapons
00:37:08.360 are nuclear weapons
00:37:10.760 and freezing
00:37:13.620 Europe.
00:37:15.740 It looks like
00:37:16.500 he's not going
00:37:17.060 to get away
00:37:17.460 with freezing
00:37:18.000 Europe
00:37:18.420 and even Ukraine
00:37:19.800 I think is going
00:37:20.580 to survive.
00:37:23.360 It'll be difficult
00:37:24.340 but they will.
00:37:26.720 Oh, Italy is
00:37:27.540 up and drilling
00:37:28.580 in the Adriatic Sea.
00:37:29.780 Somebody is reporting
00:37:30.600 here.
00:37:31.700 Citizen reporting.
00:37:32.820 There you go.
00:37:36.500 So here's the next
00:37:37.660 thing that's happening.
00:37:38.340 What happens
00:37:41.780 when,
00:37:42.920 so Putin,
00:37:43.620 or, I guess,
00:37:44.280 well,
00:37:44.660 Russia,
00:37:45.340 has threatened
00:37:46.200 to shoot
00:37:49.180 down
00:37:49.620 Elon Musk's
00:37:51.360 satellites.
00:37:53.620 What do you
00:37:54.440 think of that?
00:37:56.140 So Putin
00:37:56.880 and Russia
00:37:57.580 actually
00:38:02.400 threatened
00:38:02.840 that they
00:38:03.280 might take
00:38:03.740 out satellites
00:38:04.340 if the
00:38:04.840 satellites
00:38:05.160 are, you
00:38:05.660 know,
00:38:05.740 helping Ukraine.
Now,
00:38:08.600 let me,
00:38:12.300 let me just
00:38:14.440 put this out
00:38:15.020 there.
00:38:16.780 I've told you
00:38:17.480 that AI
00:38:17.980 will be able
00:38:18.940 to do art
00:38:18.940 better than
00:38:19.980 humans.
00:38:20.540 The other
00:38:20.880 thing that AI
00:38:21.540 will do
00:38:21.940 better than
00:38:22.320 humans
00:38:22.720 is persuade.
00:38:25.060 It will be
00:38:25.640 persuasive.
00:38:26.820 Way more
00:38:27.440 persuasive
00:38:28.700 than humans
00:38:29.320 because you
00:38:30.280 could just
00:38:30.620 feed AI
00:38:31.220 the ten
00:38:31.640 rules of
00:38:32.140 persuasion
00:38:32.660 and they
00:38:33.940 would just
00:38:34.240 use them.
00:38:34.620 I could
00:38:35.760 tell you
00:38:36.160 the ten
00:38:36.520 rules of
00:38:36.920 persuasion
00:38:37.360 all day
00:38:37.780 long and
00:38:38.620 you'll
00:38:38.780 still forget
00:38:39.240 to use
00:38:39.620 them.
00:38:41.080 But an
00:38:41.740 AI will
00:38:42.180 just use
00:38:42.580 them
00:38:42.720 correctly.
00:38:43.740 It will
00:38:43.980 also
00:38:44.280 instantly
00:38:44.880 test to
00:38:46.480 see which
00:38:46.860 things get
00:38:47.300 more retweets.
00:38:48.940 So AI
00:38:49.560 could,
00:38:50.820 in theory,
00:38:51.300 very quickly
00:38:51.900 be so
00:38:52.640 powerful as
00:38:54.080 a persuader that
00:38:54.680 it would be
00:38:55.340 a weapon
00:38:55.840 of mass
00:38:56.300 destruction.
00:38:58.400 Not a
00:38:59.040 joke.
00:38:59.800 I hate to
00:39:00.480 quote Biden.
00:39:01.700 My God,
00:39:02.060 I just
00:39:02.360 quoted
00:39:02.700 Biden.
00:39:04.580 That
00:39:05.100 literally,
00:39:06.620 with no
00:39:07.100 exaggeration,
00:39:07.880 no hyperbole,
00:39:09.360 persuasion,
00:39:10.540 once it's
00:39:11.020 weaponized by
00:39:11.680 AI,
00:39:12.560 will be so
00:39:13.180 powerful that
00:39:13.880 it could
00:39:14.140 cause a
00:39:14.620 revolution,
00:39:15.760 it could
00:39:16.080 take down
00:39:16.520 the country,
00:39:17.240 it could
00:39:17.520 turn off
00:39:17.940 your power,
00:39:19.220 it could
00:39:19.820 kill
00:39:20.180 everybody.
00:39:21.880 So do
00:39:23.600 you know
00:39:23.860 who has
00:39:24.240 access to
00:39:24.940 a weapon
00:39:25.340 of mass
00:39:25.780 destruction
00:39:26.280 of that
00:39:26.780 size?
00:39:29.080 Probably
00:39:29.640 Elon Musk.
00:39:32.060 Now,
00:39:32.920 I think
00:39:33.300 he's no
00:39:34.600 longer
00:39:34.960 involved
00:39:35.520 with one
00:39:36.000 of the
00:39:36.180 big
00:39:36.360 AIs
00:39:36.820 that
00:39:37.100 Sam
00:39:37.380 Altman
00:39:37.700 was
00:39:38.040 backing.
00:39:39.480 I think
00:39:39.720 they were
00:39:39.980 together
00:39:40.320 at one
00:39:40.660 point.
00:39:41.340 I read
00:39:41.700 something that
00:39:42.160 said he
00:39:42.420 might not
00:39:42.780 be involved
00:39:43.260 with AI
00:39:43.700 now,
00:39:44.820 but that's
00:39:45.320 not true.
00:39:45.720 He's doing
00:39:46.140 AI with
00:39:46.660 Tesla,
00:39:47.060 right?
00:39:47.760 Yeah,
00:39:47.920 he must
00:39:48.220 have AI
00:39:48.780 access in
00:39:50.200 a variety
00:39:50.560 of ways.
00:39:51.720 So imagine
00:39:52.560 this.
00:39:54.040 Imagine
00:39:54.380 Putin
00:39:54.820 targeting
00:39:55.500 Elon Musk
00:39:56.440 satellites,
00:39:57.820 hypothetically,
00:39:59.320 and then
00:39:59.700 imagine Elon
00:40:00.360 Musk getting
00:40:00.920 pissed off.
00:40:02.060 And deciding
00:40:02.960 to target
00:40:03.500 Putin with
00:40:05.600 AI persuasion.
00:40:08.040 Do you
00:40:08.760 think that
00:40:09.420 Elon Musk
00:40:10.280 could take
00:40:10.980 down Russia,
00:40:11.900 or take
00:40:12.320 down Putin
00:40:12.740 specifically,
00:40:13.540 with nothing
00:40:14.380 but AI
00:40:15.000 persuasion?
00:40:16.820 What do you
00:40:17.560 think?
00:40:18.680 Do you
00:40:19.080 believe that
00:40:19.760 Elon Musk
00:40:20.540 would have
00:40:20.840 first of all
00:40:21.240 access to
00:40:22.000 it?
00:40:22.740 He'd have
00:40:23.060 to have
00:40:23.280 access.
00:40:24.480 He'd have
00:40:25.140 to have
00:40:25.440 Twitter
00:40:25.860 supporting it
00:40:27.120 and not
00:40:27.420 banning it.
00:40:28.800 So he has
00:40:29.600 a platform,
00:40:30.680 he has the
00:40:31.240 AI, and
00:40:32.340 he would
00:40:32.600 have the
00:40:32.940 motive,
00:40:33.940 hypothetically.
00:40:35.220 So if his
00:40:35.780 satellites get
00:40:36.500 targeted,
00:40:38.820 Elon Musk
00:40:39.560 has the
00:40:40.500 power of
00:40:41.320 a nation
00:40:42.260 state,
00:40:43.340 just about,
00:40:44.760 as soon as he
00:40:45.180 has Twitter.
00:40:46.780 Right?
00:40:47.280 Because he'll
00:40:47.800 have the
00:40:48.320 platform,
00:40:49.120 he'll have the
00:40:49.600 distribution,
00:40:50.400 the satellites,
00:40:51.620 and he'll
00:40:52.020 have the AI
00:40:52.800 to weaponize
00:40:54.440 it in whatever
00:40:55.020 way he wanted
00:40:55.660 if his
00:40:56.920 ethical,
00:40:57.800 let's say,
00:40:58.920 if his
00:40:59.420 ethical
00:41:00.100 constraints
00:41:00.680 allowed him
00:41:01.180 to do
00:41:01.460 that.
00:41:03.220 He might
00:41:03.780 just say
00:41:04.280 never use
00:41:04.860 it for
00:41:05.120 war.
00:41:05.920 He might
00:41:06.540 have an
00:41:06.820 ethical
00:41:07.120 constraint
00:41:07.600 about
00:41:07.880 weaponizing
00:41:08.440 AI,
00:41:09.500 which would
00:41:09.960 be reasonable,
00:41:10.740 actually.
00:41:13.980 Right?
00:41:14.580 So,
00:41:15.520 I would
00:41:16.740 worry if I
00:41:17.420 were Putin,
00:41:18.380 because Putin
00:41:19.120 can attack
00:41:19.800 Ukraine,
00:41:20.760 and he
00:41:21.780 might survive.
00:41:22.500 But if he
00:41:23.920 attacks Elon
00:41:24.720 Musk's
00:41:25.140 satellites,
00:41:26.300 I don't
00:41:27.200 think he
00:41:27.500 has a
00:41:27.760 chance.
00:41:29.460 I don't
00:41:30.040 think he
00:41:30.300 has a
00:41:30.560 fucking
00:41:30.780 chance.
00:41:32.120 I think
00:41:32.540 Elon would
00:41:33.060 take him
00:41:33.400 out,
00:41:34.180 and he
00:41:34.500 would never
00:41:34.860 know.
00:41:36.220 Right?
00:41:36.820 Putin would
00:41:37.320 just know
00:41:37.720 that things
00:41:38.140 went bad.
00:41:39.260 He wouldn't
00:41:39.620 know why,
00:41:41.160 because the
00:41:42.000 persuasion
00:41:43.120 would be
00:41:43.580 somewhat,
00:41:44.140 you know,
00:41:44.780 under the
00:41:45.160 hood.
00:41:45.460 He wouldn't
00:41:45.640 even see
00:41:45.960 it happening.
00:41:48.340 Anyway,
00:41:49.440 the other
00:41:51.800 thing is
00:41:52.180 that if
00:41:52.600 you're a
00:41:53.500 gazillionaire,
00:41:55.040 you can
00:41:55.420 just bribe
00:41:56.600 the highest
00:41:57.560 people in
00:41:58.180 the Russian
00:41:59.040 government to
00:41:59.680 do anything
00:42:00.040 you want.
00:42:01.060 Can't you?
00:42:02.080 Oh, here's a
00:42:02.580 question for
00:42:03.060 you.
00:42:04.120 Now, it
00:42:05.520 is illegal
00:42:06.280 for an
00:42:06.840 American to
00:42:07.440 bribe a
00:42:09.280 foreigner, or
00:42:09.960 bribe anybody,
00:42:10.620 right?
00:42:10.940 Isn't it
00:42:11.360 illegal to
00:42:11.880 bribe in
00:42:12.500 every situation?
00:42:13.700 Is that
00:42:13.980 true?
00:42:15.180 Yes or
00:42:15.620 no?
00:42:16.920 It's illegal
00:42:17.860 in every
00:42:18.360 situation,
00:42:18.880 right?
00:42:19.500 Does that
00:42:20.140 include war?
00:42:22.180 Is it
00:42:22.900 illegal to
00:42:23.580 bribe
00:42:24.060 somebody in
00:42:24.680 the context
00:42:25.340 of war?
00:42:28.160 Of course
00:42:28.840 not.
00:42:29.620 In the
00:42:29.900 context of
00:42:30.440 war, you
00:42:30.760 can bribe
00:42:31.240 somebody to
00:42:31.640 be your
00:42:31.900 spy, right?
00:42:34.500 Now, I
00:42:35.480 don't know
00:42:35.800 if an
00:42:36.060 American
00:42:36.360 citizen can
00:42:37.460 participate on
00:42:38.580 their own in
00:42:39.180 a war.
00:42:40.500 Can they?
00:42:42.120 Would that
00:42:42.840 even be
00:42:43.220 illegal?
00:42:43.920 I don't know.
00:42:44.860 Like, suppose,
00:42:47.720 let me give
00:42:48.320 you, for
00:42:49.540 example, let's
00:42:50.220 say we're at
00:42:50.720 war, and I
00:42:52.120 send a tweet
00:42:52.860 out that I
00:42:53.420 think is
00:42:53.740 really persuasive
00:42:55.520 and could
00:42:56.120 make a
00:42:56.440 difference.
00:42:57.460 Am I
00:42:57.800 joining the
00:42:59.160 war without
00:42:59.640 permission?
00:43:01.180 Did I, what
00:43:02.000 if I just,
00:43:02.820 like, grab
00:43:03.980 some weapons
00:43:04.600 and fly over
00:43:05.480 there on my
00:43:05.920 own and start
00:43:06.500 shooting bad
00:43:07.140 guys?
00:43:07.840 Is that
00:43:08.200 legal?
00:43:10.000 You know, if
00:43:10.380 I'm shooting
00:43:10.820 the enemy, is
00:43:12.360 it illegal to
00:43:13.060 do that if I'm
00:43:13.660 an American
00:43:14.100 citizen?
00:43:15.280 And suppose
00:43:15.680 I only killed
00:43:16.520 actually combatants,
00:43:17.860 I didn't kill
00:43:18.280 any citizens,
00:43:19.740 would that be
00:43:20.260 illegal?
00:43:22.200 I'm assuming
00:43:22.920 it is.
00:43:23.840 I'm assuming
00:43:24.320 it is.
00:43:25.680 But, suppose,
00:43:27.100 Elon Musk went
00:43:28.000 to our
00:43:30.080 military and
00:43:30.780 said, I
00:43:32.560 know I
00:43:33.300 wouldn't
00:43:33.500 normally do
00:43:34.060 this, but
00:43:35.620 I know a
00:43:36.760 bunch of
00:43:37.080 high-level
00:43:37.540 Russians,
00:43:38.700 because I do
00:43:39.460 business over
00:43:40.040 there, and
00:43:41.540 I can guarantee
00:43:42.400 you that I
00:43:43.000 can tell some
00:43:43.540 of those
00:43:43.800 high-ranking
00:43:44.480 Russians that
00:43:45.820 if they get
00:43:46.560 rid of
00:43:46.820 Putin, they've
00:43:47.620 got a good
00:43:48.040 job with
00:43:48.520 me on
00:43:49.460 SpaceX
00:43:49.900 someday.
00:43:52.200 Could he
00:43:52.720 do that?
00:43:54.300 Would the
00:43:55.100 American
00:43:55.460 military say,
00:43:56.840 sure, I
00:43:58.020 mean, if you
00:43:58.520 can make it
00:43:58.940 happen, go
00:43:59.420 ahead?
00:44:00.580 Would they
00:44:00.960 give him the
00:44:01.700 yes?
00:44:03.120 Because, you
00:44:03.720 know, no
00:44:04.080 money has to
00:44:04.680 change hands.
00:44:05.980 All it would
00:44:06.520 require is one
00:44:07.280 conversation
00:44:07.860 through
00:44:08.320 intermediaries.
Say, hey,
00:44:10.820 Igor, Elon
00:44:14.380 told me to
00:44:15.000 tell you that
00:44:16.580 if you turn
00:44:17.420 on Putin in
00:44:18.160 some specific
00:44:18.880 way, let's
00:44:19.380 say, you've
00:44:20.900 got a job
00:44:21.360 at SpaceX.
00:44:24.320 I feel like
00:44:26.460 going after
00:44:28.160 Ukraine is a
00:44:29.400 real dangerous
00:44:30.000 thing, and
00:44:31.100 Putin took a
00:44:32.300 big risk, but
00:44:33.460 going after
00:44:33.980 Elon Musk would
00:44:34.740 be another level
00:44:35.560 of danger that
00:44:36.240 I don't think he
00:44:37.300 would take on
00:44:37.860 because you
00:44:39.160 wouldn't see it
00:44:39.700 coming.
00:44:41.020 You wouldn't
00:44:41.900 know what
00:44:43.460 Elon Musk did
00:44:44.300 back.
00:44:45.600 You would just
00:44:46.320 know something
00:44:46.780 bad happened.
00:44:50.900 All right.
00:44:53.840 So, the
00:44:56.260 U.S.,
00:44:57.080 apparently,
00:44:57.900 Antony
00:44:58.320 Blinken says
00:44:59.320 he's reiterated
00:45:00.420 that the U.S.
00:45:02.180 is tracking the
00:45:02.940 Kremlin's
00:45:03.420 nuclear saber
00:45:04.440 rattling, and
00:45:06.020 has warned
00:45:08.720 Putin not
00:45:09.760 to use
00:45:10.180 nuclear weapons,
00:45:11.320 and I guess
00:45:12.840 Putin knows
00:45:13.380 what the
00:45:13.880 response would
00:45:15.820 be, and it
00:45:16.580 would be a
00:45:17.020 terrible response,
00:45:18.180 something terrible.
00:45:19.640 Now, let me
00:45:20.740 ask you this.
00:45:22.740 I don't want to
00:45:23.600 get into a
00:45:24.520 definition war,
00:45:26.440 but is it
00:45:26.980 really a war if
00:45:28.140 you can tell the
00:45:28.760 other side what
00:45:29.420 weapons to use?
00:45:32.220 Like, there's
00:45:33.280 something weird
00:45:33.780 about this.
00:45:34.880 Am I wrong?
00:45:35.400 Well, if you
00:45:36.360 can tell your
00:45:37.140 enemy, who's
00:45:38.260 trying to kill
00:45:38.840 you, if you
00:45:40.100 can tell them
00:45:40.620 what weapons
00:45:41.700 they can and
00:45:42.300 cannot use,
00:45:44.000 is it really
00:45:44.820 a war?
00:45:47.860 I would say
00:45:48.620 it's a
00:45:48.880 negotiation.
00:45:51.060 It's a war
00:45:51.900 when you use
00:45:52.480 all of your
00:45:52.840 weapons.
00:45:54.660 When you
00:45:55.380 hold back
00:45:56.360 your best
00:45:56.900 weapons, I
00:45:59.040 think that's
00:45:59.440 just a
00:45:59.800 negotiation that
00:46:01.440 happens to
00:46:01.960 involve bullets.
Chemical and
00:46:06.580 nuclear are being
00:46:07.280 held back.
00:46:08.800 So, to me,
00:46:09.720 this looks like
00:46:10.340 an act of
00:46:10.980 negotiation.
00:46:12.540 If somebody
00:46:13.420 wanted to win
00:46:14.040 a fucking
00:46:14.440 war, they'd
00:46:15.500 use all their
00:46:15.980 weapons.
00:46:17.520 That's what I
00:46:18.000 think.
00:46:19.420 So, it's
00:46:20.460 partly economic
00:46:21.360 so we can sell
00:46:22.240 military weapons,
00:46:23.260 I'm sure, and
00:46:24.980 partly just an
00:46:25.820 extended
00:46:27.060 negotiation.
00:46:27.780 I mean, all
00:46:30.140 war is
00:46:30.560 negotiation.
00:46:31.460 People say
00:46:31.900 that.
00:46:33.860 All right.
00:46:36.140 Do you
00:46:36.860 believe that
00:46:37.480 somewhere in
00:46:38.040 the world an
00:46:38.900 AI is already
00:46:39.900 being trained
00:46:40.700 in the skill
00:46:42.040 of persuasion?
00:46:44.360 What do you
00:46:45.100 think?
00:46:47.080 Well, if it's
00:46:48.420 not, that
00:46:50.540 would be an
00:46:51.260 obvious oversight.
00:46:55.860 Yeah, it's
00:46:56.600 already happened,
00:46:57.260 of course.
00:46:58.480 And her name
00:46:58.880 is Trinity,
00:46:59.580 yes.
00:47:01.760 Was the
00:47:02.340 Cold War a
00:47:03.020 negotiation or
00:47:03.940 a war?
00:47:04.420 Well, it
00:47:04.820 wasn't a war.
00:47:07.120 It was a
00:47:07.940 bunch of
00:47:08.260 bullshit, is
00:47:08.940 what it was.
00:47:10.200 You know, I
00:47:10.680 still maintain
00:47:11.540 that the reason
00:47:12.560 for the Ukraine
00:47:13.320 war is not
00:47:14.420 any reason
00:47:15.040 that's been
00:47:15.480 mentioned.
00:47:17.180 Do you want
00:47:17.720 to know my
00:47:18.180 reason for the
00:47:18.800 Ukraine war?
00:47:20.140 Do you
00:47:20.380 remember what
00:47:20.840 it was?
00:47:21.560 Let's see if
00:47:21.980 you can remember.
00:47:23.320 So, my
00:47:23.880 reason for the
00:47:24.580 Ukraine war,
00:47:25.620 no, not
00:47:26.560 this, not
00:47:27.600 Hunter's
00:47:28.040 laptop, not
00:47:30.140 about money,
00:47:31.280 not about
00:47:31.720 money laundering,
00:47:32.640 not about
00:47:33.120 lithium, not
00:47:35.520 about the
00:47:36.360 World Economic
00:47:37.020 Forum, not
00:47:37.700 about lithium,
00:47:38.400 not about
00:47:38.720 money, not
00:47:39.340 about resources,
00:47:41.080 not about
00:47:41.920 ego, not
00:47:42.740 about NATO
00:47:43.400 expansion.
00:47:45.920 Right.
00:47:46.900 You know what I
00:47:47.400 think it was?
00:47:47.940 The root, root,
00:47:49.160 root cause.
00:47:50.320 You ready?
00:47:51.300 Not about
00:47:52.040 regime change,
00:47:52.980 not about land,
00:47:53.740 not about
00:47:54.560 military.
00:47:56.140 Nope.
00:47:58.060 It's simpler
00:47:58.860 than that.
00:48:00.200 There are too
00:48:01.220 many Russian
00:48:01.920 experts in our
00:48:03.080 government.
00:48:04.980 That's it.
00:48:06.420 There are too
00:48:07.000 many Russian
00:48:07.700 experts in our
00:48:09.600 government.
00:48:11.360 And let me
00:48:12.520 really fuck up
00:48:13.620 your mind here.
00:48:15.140 Here's a mental
00:48:16.120 challenge.
00:48:17.720 Imagine if the
00:48:18.620 number of Russian
00:48:19.900 experts in our
00:48:20.760 government,
00:48:21.180 let's
00:48:22.840 say that all
00:48:23.580 changed, and
00:48:24.680 that was a
00:48:25.420 small number,
00:48:26.500 and the
00:48:26.820 number of,
00:48:27.880 let's say,
00:48:30.860 Iranian
00:48:31.480 experts in our
00:48:32.960 government was
00:48:34.320 expanded to the
00:48:35.220 number of Russian
00:48:35.840 experts.
00:48:37.320 Do you think we would be starting a war with Iran if most of our experts were Iranian experts?
00:48:44.040 Yes, we would.
00:48:46.760 Right. You're going to start a war with whoever you have the most experts with.
00:48:51.180 Why? Because there will be the greatest number of voices telling you you need a war, so that those experts are more important.
00:48:59.180 Follow the money. Right? If you follow the money, it goes all the way down to how many experts are on each team.
00:49:06.560 If we only had one Russian expert who was really good, just one person, but they're really good, and everybody knew they were really good, do you think that one expert could start a war?
00:49:19.100 No. It's just one person.
00:49:22.260 But as soon as you've got, like, Russia experts all through your organizations, they're going to be looking for war with Russia, because that's what makes them important.
00:49:33.160 It's good for their income, good for their careers, good for their prestige, good for everything.
00:49:38.060 So of course there's going to be war if you have a lot of Russian experts.
00:49:41.440 So all of those other reasons above that, those are all fake. They're all fake.
00:49:50.980 Because you could recreate the situation, well, you could predict it, just by the number of experts. That's all it would take.
00:50:02.020 Right. Yeah. The people who only have a hammer think everything looks like a nail.
00:50:08.220 Because aren't you confused about the fact that the United States and Russia have any beef whatsoever?
00:50:16.660 Doesn't it make no sense? It doesn't make any sense.
00:50:20.460 I will say it until I'm blue in the face, and it will definitely happen. And it goes like this.
00:50:27.360 Russia and the United States are natural allies.
00:50:30.980 You can't stop that arc of history from happening.
00:50:34.400 There's nothing that will stop us from being allies in the long run. It will happen.
00:50:39.100 We just have too much in common.
00:50:42.080 And a distrust of China is at the top of the list.
00:50:48.000 Right.
00:50:51.600 So, I suspect that the way we solve Ukraine is by bringing space, and a future where we can work together, into the mix.
00:51:04.160 Now, do you think the United States could ever propose that we work together with Russia?
00:51:10.700 No, because there are too many Russia experts.
00:51:14.200 The only way that would work is if you found a way to monetize the Russia experts for helping to bring Russia into NATO or an alliance or whatever.
00:51:25.520 If the Russia experts could find a way to make money from non-conflict, then it could work, but I don't know how that would work.
00:51:32.980 Nobody makes money from non-conflict. That's an exaggeration.
00:51:39.600 All right.
00:51:42.360 I believe that is all I wanted to talk about right now.
00:51:48.440 For those of you who are not members of the Locals subscription forum, I'm going to start putting Dilbert comics in the Locals network, because since that's a community I manage, I can do that legally.
00:52:06.540 So if anybody wants to see Dilbert automatically, that's one way to do it.
00:52:13.980 I'll also be increasing my Robots Read News comics, which I've let lapse for a little bit.
00:52:20.720 So the edgiest comics you'll ever see will be in Locals, because I can't get canceled for that stuff.
00:52:27.860 All right.
00:52:30.880 You haven't emailed, okay.
00:52:35.140 Yeah, you know, the Locals subscription network would replace my entire syndication company, because in theory, I'm not doing this, but in theory, I could just say to any newspaper or entity that wanted to run Dilbert: yes, all you need is a $5 a month subscription, if you buy an annual subscription. It's $5 a month.
00:53:00.380 And for $5 a month, you can run Dilbert anywhere you want. Just put it in your newspaper, and they can just go get it.
00:53:07.280 They just screen grab it or whatever they need to do. Or download it.
00:53:14.760 All right. I have a question for you. I need to use the collective mind.
00:53:24.020 So I've done this experiment before, but every time I do it, I'm fascinated, and I'm going to ask something that's purely for my own benefit, but maybe it helps you too.
00:53:32.260 Here's a question.
00:53:34.100 If you use a blood pressure monitor, the kind that wraps around your arm and then compresses your arm, if you sit in one place and just measure your blood pressure three times in a row, will it always be lower on the third try?
00:53:52.660 And is that because you've been sitting there, or is it because the machine squeezed you once or twice, and so it's getting a different reading the third time?
00:54:01.720 In other words, is the third reading accurate if you keep the band on?
00:54:06.920 Do you have to take the band off and put it back on each time you use it, or can you just sit there and hit yes every time?
00:54:15.000 All right, I'm seeing your answers are all over the place.
00:54:20.320 All right, here's why I want to know, all right, because I did something yesterday that blew my mind.
00:54:27.820 All right, now a number of times I've done multiple readings on my blood pressure monitor, and generally they're not too far apart, like each reading is within the range of the other one.
00:54:39.780 You know, they're different every time you do it, but they're not that different. You can tell if you have high blood pressure or not, right?
00:54:47.120 But here's something I did yesterday. I took it three times.
00:54:50.720 First time, 134 over 90-something. So that's high.
00:54:59.480 Second time, similar, like 132 over high 90s or something like that. All right.
00:55:06.840 Third time, I used a mental exercise.
00:55:11.000 And all I did was drop myself on a beach in Hawaii, a beach that I know well enough that I can imagine it in perfect detail.
00:55:21.020 Dropped myself on the beach and then took the blood pressure monitor, right?
00:55:25.500 So here it went from 134 over 90, and in 30 seconds of visualization, I lowered it to 118 over 68.
00:55:40.800 118 over 68. That's low blood pressure. That's below normal, in 30 seconds.
00:55:49.900 Now, here, don't take this as impressive. That's not where I'm going with this.
00:55:55.120 I'm not saying that really worked. I'm asking you if it really worked.
00:55:58.800 I'm not claiming it. I'm not claiming it worked.
00:56:02.240 I'm asking you if I fooled myself, because the cuff was just looser on the third try.
00:56:09.160 Now, I'll try to recreate it.
00:56:13.740 You were taught to do it by a psychologist.
00:56:16.140 Now, here's what you need to know. As you know, I'm a trained hypnotist.
00:56:21.140 And I can do self-hypnosis. It's one of the things I learned while learning hypnosis.
00:56:26.960 And self-hypnosis is sort of a rapid meditation.
00:56:31.180 So you can bring yourself to a desired state, but whereas with meditation, you sort of, you know, work your way toward it, and you're never sure you're there, and everything else.
00:56:40.760 But with hypnosis, you can go right to it.
00:56:43.720 So when I say I drop myself on a beach, I mean I could feel my entire metabolism, feel the exact way I would feel if I were on the beach in the sun.
00:56:54.220 And I did it in 30 seconds, and I could feel the entire beach feeling. Like, I could feel it on my skin.
00:57:03.760 Now, I also have an unusual, let's say, an unusually commercially developed imagination.
00:57:12.600 Do you ever wonder, you know, the famous question: do you see red the same way I see red? You know, there's no way to know.
00:57:20.160 There was a test that claims we do see it the same, but I don't believe it.
00:57:28.300 There's also no way to know if everybody imagines the same way.
00:57:32.680 Right? Because I can't see your imagination, and you can't see mine.
00:57:35.900 But I believe, the evidence suggests, that because I'm a professional creative person, I probably have a stronger power of imagination.
00:57:46.820 Because I make money on my imagination, and I use it every day.
00:57:49.920 So, just... oh-ho, Carpe Donktum.
00:57:55.460 Hey, everybody, say hi to Carpe Donktum, who says to us from YouTube, see you soon.
00:58:02.560 That is the best message I got all day. See you soon.
00:58:08.000 Absolutely. Absolutely. All right.
00:58:14.280 So the question is, do I have better powers of imagination? Is that why maybe I could imagine my blood pressure down?
00:58:21.660 Or was it just a trick on me? So I'll test it again. But I wondered if anybody knew the answer to that.
00:58:27.180 All right. That's all for now.
00:58:29.620 And everything is a bell curve.
00:58:32.900 Yes, we talked about Giorgia Meloni on different live streams.
00:58:44.040 All right.
00:58:47.780 Carpe, make sure that you tweet me as soon as you're on Twitter, and we'll get your numbers up.
00:58:54.920 We'll get you up to a million users in about six months.
00:58:59.440 All right.
00:59:02.340 That's all for now, YouTube. Talk to you.