Real Coffee with Scott Adams - April 10, 2022


Episode 1709 Scott Adams: Today I Will Explain How To Persuade Putin And, Separately, Cure Your Laziness


Episode Stats

Length

57 minutes

Words per Minute

153.6

Word Count

8,831

Sentence Count

630

Misogynist Sentences

6

Hate Speech Sentences

20


Summary

In this episode, I talk about a weird thing I've been noticing in my life, and how it could be a result of a simulation, and why it's so common in reality TV shows. I also talk about the dopamine hit of the day, and what it means to be a robot.


Transcript

00:00:00.000 Good morning everybody and welcome to once again the highlight of human civilization
00:00:14.360 and possibly some aliens who came before. We don't know, we don't know, can't rule it out.
00:00:20.740 And today I'm going to blow your mind even more than the mushroom trip without the mushrooms
00:00:27.380 that I took you on the last time. Can I do it? Can I? Can I move my whiteboard so it doesn't look like
00:00:35.660 there's a weird point coming out of my head? I'll bet I can. I'll bet I can. I did. Yeah, today's a
00:00:41.920 double whiteboard day. How lucky are you? That's how lucky. Yeah, two whiteboards. Now if you're
00:00:50.940 listening and not watching, I will describe them. They're not that complicated. But I know you get
00:00:56.420 excited when you see the double whiteboard. And you should. But if you want to take that excitement
00:01:01.160 up to a new level and a heretofore unknown level, all you need is a cup or mug or a glass, a tank or
00:01:09.480 chalice or stein, a canteen, jug or flask, a vessel of any kind. Fill it with your favorite liquid.
00:01:15.880 I like coffee. And join me now for the unparalleled pleasure. It's the dopamine hit of the day.
00:01:22.600 And it's called the simultaneous sip. It's going to happen now. Go.
00:01:31.180 Oh, yeah.
00:01:36.320 So for the past two years or so, I started noticing a weird pattern in my life. And I want to see if
00:01:45.720 anyone else has had any similar pattern. It goes like this. I wake up in the morning and I say to
00:01:53.560 myself, oh, what's the plot twist going to be today? And there is one every day. Every day I wake up and
00:02:02.900 there's a plot twist. I'm talking about something so unusual that it looks like it came out of a reality
00:02:09.740 TV show. You know how in a reality TV show, there's always a plot twist? Oh, somebody did something,
00:02:16.920 the thing that you didn't know about. Somebody talked to somebody. Somebody's mad about a thing.
00:02:23.460 And it was every day for years, to the point where I'd go all day long and it'd be, say, it's
00:02:31.240 five o'clock in the evening, and I'd say to myself, no plot twist. But then it would come
00:02:39.220 before I went to bed. Plot twist every time. Now, is it possible that that could happen so many times
00:02:47.440 on its own? And I'm talking about things that are as mind-blowing as, you know,
00:02:55.640 learning that you're an android. I mean, I'm talking about really
00:03:00.620 mind-blowing stuff. Has anybody had that experience? Where you just get a plot twist
00:03:08.500 every day? Or is this somehow limited to me or confirmation bias? Nope, nobody else, huh?
00:03:15.880 So, okay. Some people are saying yes. All right. Actually, we're getting some yeses here.
00:03:26.740 More than I thought. All right. Interesting. I don't know what that means. But when
00:03:31.840 I ask myself why we would be in a simulated environment, assuming we are, because I think we are a simulation,
00:03:39.340 not an original species, why would we exist? Why would anybody make us? And the answer is to
00:03:44.420 test stuff. Or for entertainment. I feel as though my life has become somebody else's entertainment.
00:03:54.160 Like, it looks like, I feel like I'm in a reality TV show, but I'm not aware of it. That's what the
00:04:00.180 plot twists look like. I'm like, I don't know. These plot twists look exactly like a writer introduced
00:04:07.300 them into the plot. They don't look naturally occurring. So I feel like I'm in some kind of reality
00:04:13.620 TV show. Somebody's watching. All right. I'll get a little bit weirder now. Have you noticed that a lot
00:04:22.760 of self-help advice and books have one thing in common, which is positive thinking? Have you ever
00:04:29.880 noticed that? The most famous original self-help book of all time was called The Power of Positive
00:04:35.540 Thinking, which I read and which was very influential in my development. And President Trump, coincidentally,
00:04:44.420 went to a church where the minister, if that's the right name, or pastor, I forget what you call the
00:04:53.180 leader of each church, but I think it was the minister, was actually the author of that book,
00:04:58.480 The Power of Positive Thinking. And Trump talks about it having an impact on him. But other books,
00:05:06.940 including my own, How to Fail at Almost Everything and Still Win Big, have some element of that at their
00:05:13.120 core: that you have to think right to get a good outcome. Now, my version, I talk about shelf space,
00:05:20.500 your mental shelf space. And specifically, I talk about you can't stop thinking about something negative.
00:05:28.480 You don't have the ability to turn off a thought. People want to, but you can't. It's not a thing.
00:05:34.200 What you can do is crowd it out with other thoughts. Just be busy, keep your mind somewhere
00:05:40.760 else, and you just will take up the time, the shelf space, so to speak. So there's just less room for
00:05:45.620 the negative, until you kill it by atrophy. So the less you think about something, the less power it will
00:05:51.300 have over time. So there are all these different, you know, fields of self-help that all have this core.
00:05:58.480 And I found out just this week, why that might actually work. Because haven't you always wondered
00:06:09.220 why it works? Or does it work? I mean, the first question is, does it work? I don't want, I'm not
00:06:14.820 trying to make you think past the sale. Does it work? And if it does work, why does it work? Is it just,
00:06:21.820 do we imagine it works? Is it purely some form of placebo, where you just, you think it worked, but
00:06:29.860 nothing happened? Well, here's something I just found out.
00:06:33.820 Whiteboard number one, coming at you. Yeah. Good times. All right. Here's why it might work.
00:06:46.740 Turns out that if you are exposed to a positive, feel-good thought or experience, that your dopamine
00:06:54.460 will improve. So for example, if you love kittens, and I handed you a little kitten,
00:07:01.660 probably your dopamine would go up a little bit. And your dopamine has a big impact on you,
00:07:07.460 your happiness, right? But here's the part that I hadn't quite appreciated. I knew that dopamine feels
00:07:13.920 good, and that it's hard to be happy without it. And I knew that if you had the right kind of,
00:07:19.280 you know, stimulation, you could produce more of it. But here's the part I didn't know,
00:07:25.060 that dopamine has two primary purposes, if you want to think of it that way.
00:07:31.160 One is a reward, the thing that makes you feel good. But the other is it's implicated in movement.
00:07:37.480 You don't actually do anything until your dopamine is in the right state. And if your dopamine is low,
00:07:44.540 you end up with Parkinson's. So the reason they treat Parkinson's with L-DOPA is to give you a little
00:07:50.800 more of a dopamine boost, to restore controlled motion. If you want to move in a particular direction
00:07:59.140 or toward a particular thing, you need your dopamine to be in line. And so we have a logical,
00:08:08.700 science-based, completely non-controversial explanation of why positive thoughts might
00:08:17.860 get you to the place you want. So imagine, if you will, that you spend your time thinking of a
00:08:23.520 positive thing that's going to happen to you. Like, oh, I'm going to have this promotion,
00:08:27.260 or I'm going to live in a big house, have a nice family, fall in love, whatever it is that is your
00:08:32.620 thing. And you're thinking all these positive thoughts about the future. And those things
00:08:38.520 create dopamine. And then what kind of action do you think is going to happen? The only thing that
00:08:43.640 we do are the things we imagine we can do. That's it. The things you imagine are the things you do.
00:08:50.160 You never do something except, I guess, like accidentally, that you didn't imagine. You know,
00:08:56.900 even driving to work in the morning is you sort of imagine it first, and then you can do it.
00:09:01.660 You got to imagine it. So imagination does seem directly related to producing the chemical that
00:09:09.120 creates the motion toward the thing. It's actually just that simple. The things you think about,
00:09:16.480 the things you imagine, are the only things that activate you. And if you think about positive
00:09:21.340 things, it activates you in exactly the right way. And if you spend your time thinking about
00:09:26.680 negative things, it would activate you in a different way. And so maybe it is this simple.
00:09:33.980 Maybe it's that simple. Now, here's the part where I blow your head off.
00:09:40.920 I didn't even get to the good part yet. What if, what if, laziness is a habit of thinking about the cost of
00:09:53.020 things, or the effort, instead of thinking about the payoff? I'm going to say
00:10:00.480 it again to get a few more heads to explode. What if laziness is nothing but a habit of thinking
00:10:10.480 about the effort instead of thinking about the outcome? What if you could reverse laziness by simply
00:10:19.020 developing a habit of thinking more about, let's say, the delicious food that you would like to enjoy
00:10:25.840 instead of how long it would take you to get up and go get it?
00:10:32.080 Holy shit is right.
00:10:35.160 Holy shit is right.
00:10:36.960 It might be that easy.
00:10:39.120 So this is one of the reframes or the type of reframe that I'm going to include in the book I'm working on.
00:10:44.480 Because I can't prove that this works, but do I have to?
00:10:51.240 I mean, it seems so logically connected that it's hard to imagine it doesn't work.
00:10:56.240 It'd be hard to come up with an argument that this chain of events doesn't reliably work.
00:11:02.740 So, uh, but the, but the real question is if you were to test this at home
00:11:06.920 and try to see if you can think more about the good outcome and less about the work,
00:11:12.840 would you get it done?
00:11:14.420 Why is it that people have a second child?
00:11:17.540 You know, why does a woman who goes through this awful, awful childbirth
00:11:21.820 have a second child?
00:11:25.820 Don't they always say the same thing?
00:11:27.240 If I remembered how bad this was, I wouldn't do it again.
00:11:31.240 Right?
00:11:31.680 So not thinking about the effort is actually vital to the survival of humanity.
00:11:39.720 If we focused on how hard it was to have a baby, you just wouldn't do it.
00:11:43.380 Or you'd do too little of it.
00:11:45.800 But if you focus on how awesome it would be to have a family, well, there you go.
00:11:51.120 You're going to go through the pain because you've already, you've already committed.
00:11:55.340 So, I have this experience with writing books.
00:11:59.160 Intellectually, I know that writing a book ruins almost a year and a half of my life.
00:12:05.140 I mean, it really, really puts a wet blanket on all your free time if you're writing a book.
00:12:10.280 And you have to think about it all the time.
00:12:12.260 So it's, it's really a big expense on top of your normal career if you have a regular career, as I do.
00:12:18.140 It's not as bad if all you do is write.
00:12:20.200 So, the way that I can write a book is I have to forget about how hard it is.
00:12:26.300 And just think about how cool it is to have a book if it does well.
00:12:29.680 You know, I imagine it doing well, and people talking about it, maybe making a difference.
00:12:34.980 So I think about all these positive things and then I can write it.
00:12:37.940 So, am I, am I ambitious or do I simply have a thinking habit which produces dopamine because I'm thinking about the positive outcome?
00:12:50.680 And is the dopamine the thing that gets me up and moving?
00:12:53.920 And when you're observing me, you say, how the hell do you get so much done?
00:12:58.480 Right?
00:12:58.720 Well, probably the single most common question I'm asked in my entire career is: how do you get all that done?
00:13:07.220 Because, you know, I have a variety of things going, and that's just the things you see.
00:13:11.740 Now, imagine the things you see multiplied by the things you don't know anything about.
00:13:17.300 You know, because, because you see me cartooning, you know, I have to write a cartoon every day on average.
00:13:23.640 I don't work on it every day.
00:13:24.980 I do this live stream every day, sometimes twice, trying to write a book.
00:13:30.040 You can imagine how much administrative effort there is in just keeping the operation running.
00:13:34.620 Then on top of that, you know, a personal life, trying to keep up with my house that's being remodeled, the construction project.
00:13:41.500 That's just a sample.
00:13:43.380 It's like, like you have no idea.
00:13:46.660 If you could imagine how complicated my life is, you wouldn't even believe it.
00:13:52.320 Right?
00:13:52.580 And how do I do it?
00:13:54.980 I think it's just this.
00:13:57.020 I think it's just this.
00:13:58.500 When I think of all the things I do, I think about them in terms of their benefits.
00:14:03.640 I love doing this.
00:14:05.720 Like right now, like at this moment, I'm about as happy as I'll be all day.
00:14:11.600 To me, this is excellent.
00:14:13.120 This is a great, great experience.
00:14:15.560 Because, I don't know, I'm not even sure exactly why I like it.
00:14:19.100 I guess it's a variety of things that makes it a good time for me.
00:14:24.820 But I didn't really think about the fact that I had to spend two hours preparing.
00:14:31.560 You know, I was up at, I don't know, 2.30 this morning, couldn't sleep.
00:14:38.000 So, you know, I spent maybe two hours preparing for today.
00:14:41.200 But I never once thought about the two hours.
00:14:44.080 I only thought that, oh, this would be fun.
00:14:45.880 I can't wait to, you know, turn it on and be doing this thing.
00:14:48.640 So, here's your tip.
00:14:50.280 It's going to change some of your lives.
00:14:52.820 Think about the positive, not about the work, and see what that does to you.
00:14:57.060 Well, in the news, Boris Johnson visited Kiev and walked around the city, seemingly safe, with Zelensky.
00:15:06.440 Now, so the Russian troops have, they have pulled away from Kiev, right?
00:15:13.400 That's a done deal, isn't it?
00:15:16.820 Didn't Ukraine win?
00:15:19.460 Well, wasn't the objective of Russia clearly, you know, to take Kiev and put in a puppet?
00:15:28.520 I mean, I feel as though, if you look at the primary goal, that Russia lost, and they've already admitted it.
00:15:37.380 Now, I know what you're saying, that Russia's clever.
00:15:40.300 What they're going to do is consolidate things in the south, and then they'll start moving toward the capital again,
00:15:45.680 or they'll squeeze them out, or they'll blackmail them, or something.
00:15:49.380 Maybe. Maybe.
00:15:51.600 But it kind of looks like they lost and they know it.
00:15:54.840 So they're consolidating their forces in the region, Donbass and whatever, the regions where they've got victories and they seem to have control.
00:16:05.080 But here's the thing that I don't quite get.
00:16:12.260 Has Putin figured out how he could occupy Ukraine in the long run?
00:16:18.220 Because I don't see how you can do it.
00:16:20.620 And this is what I mean, in this specific case.
00:16:22.880 Isn't it just a question of the number of drones?
00:16:29.180 Let's go to the second whiteboard.
00:16:31.660 Are you ready for this?
00:16:33.800 I know.
00:16:34.360 One whiteboard is a lot to absorb, but...
00:16:37.660 Two?
00:16:38.620 Wow.
00:16:39.620 I know.
00:16:40.240 I think you could do it.
00:16:42.260 Second concept of the day.
00:16:47.020 Persuading Putin.
00:16:49.780 What would it take to persuade Putin?
00:16:51.700 First answer?
00:16:53.340 Nothing.
00:16:54.420 Nothing would persuade Putin.
00:16:56.200 Putin cannot be persuaded.
00:16:58.240 He's crazy.
00:16:59.080 He's Hitler.
00:17:00.160 He's made up his mind.
00:17:01.680 Maybe.
00:17:03.060 Good chance of that.
00:17:04.380 Good chance of that.
00:17:05.520 I would, however, submit to you that there's literally one piece of data, and you don't even have to be accurate,
00:17:13.740 that would persuade Putin that he already lost.
00:17:17.720 And that he needs to figure out how to manage the defeat.
00:17:21.700 Right now, he's still managing the war.
00:17:25.160 The best persuasion would get him to manage the withdrawal or manage what the defeat looks like.
00:17:31.080 So here's the one piece of data that would persuade Putin to stop now.
00:17:38.940 Stop focusing on the fight, because Putin might win the fight.
00:17:44.280 Am I right?
00:17:45.120 If I said to you, who's going to win, you know, in that area of Ukraine, and maybe even the rest of Ukraine,
00:17:51.760 if I said, who's going to win militarily, what would all of you say?
00:17:56.940 It depends how determined he is.
00:17:59.460 If he's decided to take it, he's going to take it.
00:18:03.760 And it looks like he has.
00:18:06.560 It looks like he's decided to take it, and it looks like he will.
00:18:11.600 But then what?
00:18:13.900 But then what?
00:18:15.840 He has to hold it, right?
00:18:17.180 The whole point is to occupy it.
00:18:19.620 He also took, you know, all that area around the capital.
00:18:23.600 But he didn't hold it, so what was the point of taking it?
00:18:28.860 So here's the thing that I think the news needs to report, or maybe not.
00:18:34.600 Maybe not.
00:18:35.300 I'll tell you why.
00:18:36.400 It's how many of these portable drones, the switchblade type, can be delivered to Ukraine, and in what time frame.
00:18:45.460 Because there's a theoretical number of them, which we could reach, and I just threw in a month here.
00:18:51.460 This is a random month, maybe by the summer, maybe by the end of the year.
00:18:57.220 If Ukraine held on, how many of these could they get?
00:19:00.740 Now, the switchblade drones, let me give you a demonstration that is so amazing.
00:19:07.480 I think you'll be impressed.
00:19:08.940 You will.
00:19:10.280 Watch this.
00:19:11.680 If you're listening on audio, this won't be impressive at all, but if you're watching, watch how well I do this.
00:19:18.920 So you've got this little backpack, and in your backpack, you take out one of these switchblade drones.
00:19:27.240 And I think it fires out of a tube, and it comes out like a missile.
00:19:32.360 But in the air, once it's launched, its wings flip out like a switchblade.
00:19:39.400 No, like this.
00:19:40.080 So that it becomes a little plane.
00:19:43.860 I think it has two wings, actually, from the back.
00:19:46.100 And then it can fly several kilometers.
00:19:51.280 So you can launch these things from far away.
00:19:54.100 The military person who fires it then has a little monitor.
00:19:58.400 I saw a picture of it.
00:19:59.300 It looks like, you know, you can shield it from outside distractions, like light.
00:20:03.480 And he just puts it up to his face, and then he can see what the drone sees.
00:20:07.540 Then the drone can fly for many miles, looking at the terrain.
00:20:12.240 And then it gets to a place, and it can just kind of hang around for, I think, 10 minutes or something.
00:20:18.180 How many minutes?
00:20:19.600 Somebody will tell me.
00:20:20.900 But it can hang around for a long time, just looking down, as the soldier in the remote place watches it.
00:20:27.600 And then it can just pick out a target and just destroy it.
00:20:32.620 Now, somebody says 15 minutes, it can hover.
00:20:36.520 Now, here's my question.
00:20:38.740 How do you defend against that?
00:20:41.440 Now, quick answer is, the way you defend is with electronic means, right?
00:20:46.400 You jam them.
00:20:48.420 Has the jamming worked so far?
00:20:51.520 Have you heard any story that says, you know, all these drones aren't working at all,
00:20:55.640 because those Russian jammers are just jamming them up?
00:21:00.720 No.
00:21:01.560 No.
00:21:02.140 In fact, you're not hearing that at all.
00:21:03.780 So either Russia doesn't have good jammers,
00:21:06.540 or the workaround for the jammers is simple enough that it's not slowing them down.
00:21:12.600 I imagine that the jamming equipment itself is fairly identifiable, isn't it?
00:21:18.120 Couldn't you see the jamming equipment, or at least know where it is, or locate it?
00:21:23.300 I mean, there's probably anti-jamming munitions of some kind.
00:21:28.720 But they also have to turn it off now and then.
00:21:31.300 Did you know that?
00:21:32.700 Just turn it off.
00:21:34.380 And the reason they have to turn it off is they often have to launch their own drones.
00:21:39.140 So the Ukrainians would wait until the Russian military launched their own drones,
00:21:43.760 which required turning off their own anti-drone jamming equipment,
00:21:47.860 and then they'd attack.
00:21:48.860 So that's just a sample of the military back and forth of adjusting to each other's tactics.
00:21:57.240 But as far as I can tell, there doesn't seem to be any permanent way
00:22:01.560 to stop these switchblade attacks.
00:22:05.340 And so, since they can be launched from a distance,
00:22:08.840 it takes one person to carry them and one person to launch them,
00:22:11.780 isn't the only question how many there are
00:22:16.820 and how quickly we can get them there?
00:22:19.120 I think even training is not a huge deal.
00:22:22.900 I think you could probably watch a YouTube video on how to use them.
00:22:26.960 They're built to be used in the field, not by an engineer or a scientist.
00:22:32.620 Now, of course, I need a fact check on that.
00:22:35.940 But my belief is it's probably not a lot harder than operating a commercial drone
00:22:42.700 that you would buy online and buzz your neighbor.
00:22:46.300 I'll bet it's not that much harder
00:22:47.640 because they wouldn't have designed it to be that hard.
00:22:51.320 They would have to design it to be simple.
00:22:57.480 There are many defenses against jamming, yes.
00:23:00.340 One of the defenses against jamming is to have your signal vary in real time.
00:23:07.700 So if your signal is varying in real time,
00:23:09.600 the jamming may not be widespread enough to hit the right frequency.
00:23:14.700 So you've got your jamming, your anti-jamming, et cetera.
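As an aside for the technically curious: the "signal varying in real time" defense described here is essentially frequency hopping, and the idea can be sketched in a few lines of Python. This is a toy illustration under made-up assumptions (100 channels, a shared seed, a jammer stuck on five fixed channels), not real radio code:

```python
import random

# Toy sketch of frequency hopping (illustrative only, not real radio code).
# Transmitter and receiver share a secret seed and hop to a new channel
# every time slot; a jammer covering a few fixed channels rarely lands a hit.

NUM_CHANNELS = 100

def hop_sequence(seed: int, slots: int) -> list[int]:
    """Pseudorandom channel per time slot, reproducible from the shared seed."""
    rng = random.Random(seed)
    return [rng.randrange(NUM_CHANNELS) for _ in range(slots)]

def jammed_fraction(seed: int, slots: int, jammed_channels: set[int]) -> float:
    """Fraction of slots where the hopper lands on a jammed channel."""
    hops = hop_sequence(seed, slots)
    return sum(ch in jammed_channels for ch in hops) / slots

# Both ends derive the same sequence independently, so they stay in sync
# without ever broadcasting the hop plan for the jammer to hear.
tx = hop_sequence(seed=42, slots=10)
rx = hop_sequence(seed=42, slots=10)
assert tx == rx

# A jammer covering 5 of 100 channels blocks only about 5% of slots.
loss = jammed_fraction(seed=42, slots=10_000, jammed_channels={0, 1, 2, 3, 4})
print(f"fraction of slots jammed: {loss:.3f}")
```

The point of the sketch: as long as the jammer can't predict or blanket the whole hop set, it only degrades a small fraction of transmissions.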
00:23:18.640 All right, so my question is this.
00:23:21.200 If we wanted to persuade Putin,
00:23:23.640 could we persuade him that he will lose the military battle?
00:23:27.260 Answer is probably not.
00:23:28.720 Probably not.
00:23:30.340 Because he just has to decide how much pain he wants to take.
00:23:34.120 And he probably can take as much as he needs to to get it done.
00:23:37.660 So instead, persuade him that occupation is impossible.
00:23:41.780 Not improbable and not hard.
00:23:44.340 Actually impossible.
00:23:46.740 Impossible.
00:23:48.200 Right?
00:23:48.900 So you can't say it's going to be really, really hard.
00:23:51.480 That gets you nothing.
00:23:52.680 Because he's already committed to really, really hard.
00:23:55.840 It has to be impossible.
00:23:56.780 And I do think that he could be convinced
00:23:59.680 that at some number of switchblades,
00:24:03.320 switchblade drones and the like,
00:24:06.560 you know, similar equipment,
00:24:08.360 at some level, you can't hold the country.
00:24:11.540 Here's a question for you.
00:24:13.260 Where would the occupying troops sleep,
00:24:15.160 and how would they sleep?
00:24:18.600 Would they have to sleep one person, you know,
00:24:22.040 and a mile away another person?
00:24:23.920 Wouldn't they be in a barracks?
00:24:27.560 I mean, they're not going to be underground, are they?
00:24:29.840 Is the occupying force going to be underground?
00:24:33.280 Yeah, I don't think you can hide all of the military stuff.
00:24:36.140 I believe that, you know, once a night,
00:24:39.480 a military barracks would blow up under occupation.
00:24:42.980 Probably once a night.
00:24:44.780 How do you keep your military motivated to keep the occupation up
00:24:49.360 if one of their barracks blows up every night?
00:24:53.360 Right?
00:24:54.360 And that's just the barracks.
00:24:56.000 Imagine how many other things they could blow up.
00:24:59.120 So I don't think there's any way
00:25:01.060 to hold or to occupy a country militarily.
00:25:06.460 Well, there are two ways.
00:25:07.980 If Putin had taken Kiev instantly
00:25:11.160 and put in a puppet,
00:25:13.940 it might have worked.
00:25:15.740 Do you think?
00:25:17.280 If Putin's original plan,
00:25:19.120 as we imagine it was, we don't know,
00:25:21.220 but if his original plan was just
00:25:22.960 to quickly put in a puppet,
00:25:24.220 I think that's worked before.
00:25:26.480 Partly because the public says,
00:25:28.100 ah, you know, the last one was corrupt too.
00:25:31.440 Ah, it doesn't make that much difference.
00:25:33.500 I'd rather just go to work today.
00:25:35.340 You know, one puppet gets replaced by another puppet.
00:25:38.560 That might have worked.
00:25:40.260 It might have worked, or at least for a while.
00:25:42.380 But that didn't happen.
00:25:44.240 So now that it's a destroy the country down to rubble,
00:25:48.200 I think the Ukrainians are a little bit angry
00:25:50.540 and motivated to do whatever it takes
00:25:52.400 to change the situation.
00:25:54.740 And if they have unlimited drones,
00:25:57.060 or at least an unlimited stream of them coming in,
00:26:00.700 how could anybody occupy the country?
00:26:03.940 I don't think it could be done.
00:26:05.480 Because you have enough people to use the drones
00:26:07.300 and enough drones.
00:26:08.820 So, how do you persuade Putin?
00:26:11.040 The question is,
00:26:12.340 if we're here, hypothetically,
00:26:15.080 meaning that not enough drones
00:26:16.820 have been delivered to Ukraine
00:26:18.320 to stop an occupation,
00:26:19.900 just enough to, you know,
00:26:22.200 maybe give them a little pause,
00:26:24.620 but how long would it take us,
00:26:28.060 and could we,
00:26:29.040 is it even possible,
00:26:30.300 to get to the number where even Putin would say,
00:26:32.900 okay, that's impossible now.
00:26:34.640 That's actually impossible.
00:26:35.880 There's no way to occupy this country.
00:26:37.600 Do you think Putin knows how many drones are being shipped?
00:26:44.580 And do you think he knows where the crossover point is?
00:26:47.880 He does not.
00:26:49.460 He does not know how many we're shipping,
00:26:51.340 how many we can ship,
00:26:52.380 and he doesn't know what the crossover point is.
00:26:55.860 That's why we should make one up.
00:27:01.020 Just start saying there is one.
00:27:03.380 Because whoever goes first defines the talking point.
00:27:08.300 So if none of us have any idea in our mind
00:27:10.800 how many switchblades it would take
00:27:12.320 to make Ukraine impossible to occupy,
00:27:15.680 let's create that number.
00:27:17.540 Is it 10,000?
00:27:18.720 Imagine seeing headlines starting to come out
00:27:22.500 that says 10,000 switchblades
00:27:26.100 make Ukraine unoccupiable.
00:27:30.680 They have 4,000 already.
00:27:34.500 By summer, they'll have 15,000.
00:27:38.660 It only takes 10,000 to be unoccupiable.
00:27:42.980 Now, is that true?
00:27:45.360 Does it matter?
00:27:46.500 It might not matter.
00:27:49.300 Because we're talking about persuasion,
00:27:50.920 we're not talking about truth.
00:27:53.240 And if you can convince Putin, in Putin's mind,
00:27:56.300 that that number, 10,000, actually makes a difference,
00:27:59.020 it would have to come from some kind of military source,
00:28:02.020 some kind of logic to it.
00:28:03.340 You'd need somebody who's a military, I don't know,
00:28:06.360 strategist or something,
00:28:07.880 some West Point person to say it.
00:28:09.740 So you get somebody to say it,
00:28:11.680 and then you get the media to repeat it.
00:28:13.780 Now, that shouldn't be hard
00:28:16.020 because it's pretty clear
00:28:17.160 that the corporate media
00:28:18.360 is working for our intelligence agencies.
00:28:23.260 So if the intelligence agencies say,
00:28:25.660 you know, wouldn't it be nice
00:28:27.140 if you guys start talking about
00:28:28.880 the country being unoccupiable
00:28:32.580 and that the reason is drones,
00:28:35.240 there is a certain number
00:28:36.220 that would make them unoccupiable.
00:28:38.220 Putin doesn't know what that number is,
00:28:40.080 but wink, wink,
00:28:40.840 we're getting there really fast.
00:28:42.760 Wink, wink.
00:28:43.460 We think it's about 10,000.
00:28:45.260 That expert said 10,000.
00:28:46.960 We got 4,000 now.
00:28:48.500 We're going to have 15,000 by August.
00:28:51.320 Doesn't matter if any of it's true.
00:28:53.500 Does it?
00:28:54.940 It only matters if Putin can't tell the difference.
00:28:58.000 That's all that matters.
00:28:58.800 So, this is persuading Putin.
00:29:06.320 You should take this as a lesson in persuasion,
00:29:09.380 not a prediction.
00:29:10.960 I don't think, necessarily,
00:29:12.300 that anything like that will happen.
00:29:14.140 All right.
00:29:15.960 What would happen?
00:29:17.820 So I think a lot of people were surprised
00:29:19.360 to see the Russians retreat
00:29:21.680 from the capital of Ukraine.
00:29:23.700 The one person I know
00:29:28.280 who was not surprised was me.
00:29:31.180 I believe I was the one person who said it.
00:29:33.240 Now, by the way,
00:29:33.760 I was totally wrong on the prediction
00:29:35.220 that Putin wouldn't invade.
00:29:37.420 And the reason was,
00:29:38.480 it would be suicide.
00:29:40.260 And I thought he could tell.
00:29:42.460 Apparently, I could tell it was suicide,
00:29:44.520 and he couldn't.
00:29:45.860 Now, that I did not see coming,
00:29:47.560 so that's on me.
00:29:49.740 Being smarter than Putin
00:29:50.760 was not something I foresaw
00:29:52.360 in any of this.
00:29:54.140 But I was, apparently.
00:29:56.080 Or better informed, or something.
00:29:58.720 I mean, I got it right,
00:29:59.680 and he got it wrong.
00:30:00.920 That's all I know for sure.
00:30:03.040 But I got it wrong
00:30:04.920 that he wouldn't invade,
00:30:06.160 because I thought he was more rational,
00:30:07.640 or at least as smart as I am.
00:30:12.860 But, what would happen?
00:30:15.340 So you might have been surprised,
00:30:16.960 but I wasn't,
00:30:17.700 that he didn't quickly take over Ukraine.
00:30:20.740 And I said the reason would be
00:30:21.880 the modern weapons that they would have,
00:30:24.980 and that the modern weaponry
00:30:26.380 would be surprisingly effective.
00:30:28.640 And sure enough, it is.
00:30:29.980 So my exact prediction,
00:30:32.140 at least on that part, was right.
00:30:33.840 Here's the next part.
00:30:34.940 What would happen, hypothetically,
00:30:39.320 if it looked like Ukraine
00:30:41.800 was starting to block
00:30:43.500 the retreat routes
00:30:45.680 for the Russian army?
00:30:46.840 Just imagine it.
00:30:51.600 Suppose it looked like
00:30:52.700 their military strategy
00:30:53.860 had changed
00:30:54.660 to preventing Russia
00:30:57.300 from leaving
00:30:58.040 with their military.
00:31:03.700 Because that would say
00:31:04.700 they're going to starve them
00:31:05.620 and kill them,
00:31:06.960 and that they think
00:31:07.660 they can get it done.
00:31:08.880 I'm not saying they can.
00:31:10.720 I'm saying,
00:31:11.440 what would that do to you mentally
00:31:12.620 if you heard that your retreat
00:31:14.840 was being destroyed
00:31:16.200 when they have limited weaponry
00:31:18.580 in Ukraine,
00:31:19.360 and they should be attacking
00:31:20.880 things like, you know,
00:31:22.060 tanks and weapons depots.
00:31:24.400 But what if they destroyed
00:31:25.460 the retreat?
00:31:27.440 Because that would also destroy,
00:31:29.160 presumably,
00:31:30.840 the supply routes.
00:31:32.480 So destroying the supply routes
00:31:33.900 would destroy the retreat,
00:31:35.640 wouldn't it,
00:31:36.060 if there was just literally
00:31:36.960 no way to get in and get out?
00:31:38.500 Somebody says
00:31:43.660 it's a bad strategy,
00:31:44.780 it makes the other side
00:31:45.680 fight harder.
00:31:46.640 No, it makes the other side
00:31:47.760 starve to death.
00:31:49.760 They might fight harder,
00:31:50.900 but they've got a week to do it.
00:31:52.600 Because they're out of food
00:31:53.500 in a week.
00:31:54.940 So fight hard,
00:31:56.460 but you've got a week.
00:31:58.380 Could they do everything
00:31:59.460 they need to do in a week?
00:32:01.180 I don't know.
00:32:03.460 So,
00:32:05.120 I just wonder about this
00:32:07.460 as a persuasion play
00:32:08.800 less than a military play,
00:32:10.680 which, you know,
00:32:11.300 becomes military, of course.
00:32:13.280 But,
00:32:13.820 wouldn't that get into
00:32:15.320 the Russians' heads
00:32:16.120 if you acted like
00:32:18.380 you were going to trap them
00:32:19.300 and starve them,
00:32:20.400 and you look like
00:32:21.060 you might actually
00:32:21.700 pull it off?
00:32:23.300 It would get in my head.
00:32:24.780 Well, I saw a video
00:32:25.420 on the internet,
00:32:26.600 Carl Icahn,
00:32:27.380 the famous corporate raider
00:32:29.600 who buys companies
00:32:30.660 and, you know,
00:32:32.680 fires a lot of people
00:32:33.640 and turns them around
00:32:35.020 and then sells them
00:32:35.880 and does stuff like that.
00:32:37.460 And he was talking
00:32:38.220 about this one company
00:32:39.000 he bought years ago,
00:32:39.980 30 years ago or so,
00:32:41.380 in which,
00:32:42.240 after he bought it,
00:32:42.980 he was trying to figure out
00:32:43.780 what all the people did.
00:32:45.840 And there were 12 floors
00:32:47.640 of employees there,
00:32:49.340 and when he tried
00:32:50.260 to go in and figure out,
00:32:51.300 okay,
00:32:51.420 what do all you do,
00:32:53.100 when they explained it to him,
00:32:54.180 he still didn't know
00:32:54.700 what they did.
00:32:56.020 And they hired consultants
00:32:58.140 to come in
00:32:58.740 and figure it out for him,
00:33:00.120 and he paid them
00:33:00.860 a quarter million dollars,
00:33:01.940 and they came back
00:33:02.520 and they said,
00:33:03.800 you know,
00:33:04.240 being honest,
00:33:04.920 we have no idea
00:33:06.240 what they do.
00:33:07.520 So he fired all 12 floors
00:33:09.600 with no precision.
00:33:12.260 He just fired all 12 floors,
00:33:14.040 and he said that years had passed
00:33:16.580 and nobody had complained yet.
00:33:17.780 In other words,
00:33:20.240 he expected somebody to say,
00:33:22.880 okay,
00:33:23.080 we can't deliver that product
00:33:24.420 or we can't do this thing
00:33:25.760 because you fired
00:33:26.740 the department in charge,
00:33:28.240 and nobody ever complained.
00:33:30.500 Everything just went fine.
00:33:32.300 Operated smoothly.
00:33:33.140 Now,
00:33:34.700 that's not the real story.
00:33:36.960 The real story
00:33:38.080 is that Elon Musk
00:33:39.400 interacted with that video
00:33:41.820 in a positive way,
00:33:44.460 and I think he said
00:33:45.200 just one word exactly.
00:33:48.340 And this is right after
00:33:49.800 he bought Twitter
00:33:50.880 and maybe wondering
00:33:53.180 what all of those people
00:33:54.180 at Twitter do for a living.
00:33:57.000 Now,
00:33:57.720 that's not the only thing
00:33:59.120 he's doing.
00:33:59.540 Here are some of the things
00:34:02.840 he's done
00:34:03.280 just in the last,
00:34:04.020 like,
00:34:04.220 24 hours.
00:34:06.340 He tweeted,
00:34:07.400 should we delete
00:34:08.160 the W in Twitter?
00:34:10.780 Which,
00:34:11.300 of course,
00:34:11.680 would turn it into
00:34:12.420 a naughty word,
00:34:13.520 titter.
00:34:15.420 Now,
00:34:17.280 let's contrast this
00:34:20.980 with what I just talked about,
00:34:22.720 Carl Icahn.
00:34:23.420 Tesla famously
00:34:26.540 has no marketing department
00:34:27.980 because Elon Musk
00:34:30.740 can do the entire
00:34:31.700 marketing job
00:34:32.640 just by making
00:34:33.900 interesting products
00:34:35.080 and tweeting.
00:34:37.360 And that's it.
00:34:38.140 He doesn't even need
00:34:38.760 marketing.
00:34:40.060 And so,
00:34:41.240 Elon buys 9%
00:34:42.480 of Twitter.
00:34:43.620 He tweets,
00:34:44.380 should we delete
00:34:45.040 the W in Twitter?
00:34:45.940 and he did
00:34:47.260 more work
00:34:47.960 than the entire
00:34:48.780 marketing department
00:34:49.780 of Twitter
00:34:50.500 with one,
00:34:52.580 two,
00:34:52.880 three,
00:34:53.320 four,
00:34:53.580 five words.
00:34:54.800 Five words.
00:34:56.260 The whole job
00:34:57.460 of the marketing
00:34:57.960 department of Twitter.
00:34:59.880 So,
00:35:00.400 I guess that's one floor
00:35:01.240 he can get rid of.
00:35:03.860 He also tweeted,
00:35:05.620 and this is all
00:35:06.260 in 24 hours.
00:35:07.240 It's just mind-boggling.
00:35:08.420 He asked why
00:35:09.160 the top Twitter users
00:35:10.400 in terms of
00:35:10.980 number of followers
00:35:11.740 no longer tweet.
00:35:13.560 He says,
00:35:14.040 is Twitter dying?
00:35:14.860 And he points to,
00:35:16.440 like
00:35:17.380 Taylor Swift
00:35:18.080 and Justin Bieber
00:35:19.820 and stuff.
00:35:20.240 And I guess they
00:35:20.620 haven't tweeted
00:35:21.100 in months.
00:35:21.960 I didn't know that.
00:35:23.680 And so he says,
00:35:24.280 is Twitter dying?
00:35:25.300 It's his own
00:35:25.940 investment.
00:35:27.900 It's his own
00:35:28.660 investment.
00:35:29.940 And he asks
00:35:30.620 if it's dying.
00:35:33.020 Who does that?
00:35:34.700 Somebody who doesn't
00:35:35.540 have a marketing
00:35:36.120 department.
00:35:37.280 That's who.
00:35:38.660 Because if you don't
00:35:39.400 have a marketing
00:35:40.020 department,
00:35:40.880 I don't know a
00:35:41.660 better way to get
00:35:42.340 attention than to
00:35:43.120 ask if your
00:35:43.660 product is
00:35:44.740 dying.
00:35:45.940 Do you remember
00:35:46.460 he did something
00:35:47.220 similar with
00:35:47.900 Tesla stock?
00:35:49.420 Do you remember
00:35:50.120 when he said he
00:35:50.840 thought Tesla stock
00:35:52.020 was too high?
00:35:53.280 It was a few
00:35:54.100 years ago.
00:35:55.620 Who does that?
00:35:57.540 Who says their
00:35:58.380 own stock is too
00:35:59.140 high?
00:35:59.480 I believe it drove
00:36:00.720 it down.
00:36:02.220 Somebody,
00:36:02.780 give me a fact
00:36:03.480 check on that.
00:36:04.040 I think he drove
00:36:04.640 his own stock
00:36:05.640 price down.
00:36:06.620 Did he not?
00:36:07.900 That actually
00:36:08.540 happened.
00:36:09.980 Didn't it?
00:36:10.800 Now, one of the
00:36:11.440 things that that
00:36:12.220 did, and also this
00:36:14.260 thing about
00:36:14.680 Twitter, is
00:36:15.380 Twitter dying, is
00:36:16.760 it builds
00:36:17.140 credibility, which
00:36:19.480 is, aside from
00:36:20.860 money, and aside
00:36:22.100 from being super
00:36:23.480 smart, his
00:36:24.800 credibility is
00:36:26.560 his big asset
00:36:27.340 that is hard to
00:36:28.360 match.
00:36:29.140 He's neither
00:36:29.940 Republican nor
00:36:30.880 Democrat.
00:36:31.420 You don't know
00:36:31.760 what the hell he
00:36:32.220 is.
00:36:32.780 He just wants
00:36:33.280 things to work.
00:36:35.080 And so when he
00:36:36.440 questions his own
00:36:37.600 product, the thing
00:36:38.400 he invested in,
00:36:39.600 questions his own
00:36:40.440 stock price, it's
00:36:42.480 impossible not to
00:36:43.820 give him more
00:36:44.320 credibility, isn't
00:36:45.260 it?
00:36:46.240 It's impossible not
00:36:47.520 to say, okay, that's
00:36:48.340 credible.
00:36:49.320 Because if you're
00:36:50.560 criticizing your own
00:36:52.020 product and going to
00:36:53.820 the moon, okay, we're
00:36:55.380 going to listen to
00:36:55.900 that.
00:36:56.660 That sounds pretty
00:36:57.520 credible.
00:36:58.580 So he's got that
00:36:59.480 going for him.
00:37:02.300 And he also asked
00:37:03.480 should the Twitter
00:37:05.040 Blue, apparently
00:37:05.960 Twitter has a
00:37:06.780 service called
00:37:08.140 Twitter Blue that
00:37:09.380 you can pay $3 a
00:37:10.500 month.
00:37:11.440 And I guess you
00:37:12.880 have limited
00:37:13.660 ability to edit
00:37:14.520 your own tweets
00:37:15.300 and you don't see
00:37:16.820 advertisements or
00:37:17.680 something.
00:37:18.060 I don't know what
00:37:18.540 else it's about.
00:37:19.780 But I didn't even
00:37:20.440 know Twitter Blue
00:37:21.020 was a thing.
00:37:22.700 I mean, I had
00:37:24.380 slight awareness of
00:37:25.880 it, but I didn't
00:37:26.340 know what its
00:37:26.940 features were.
00:37:28.680 But now I do.
00:37:32.140 Yesterday, I didn't
00:37:34.360 know that Twitter
00:37:34.960 had this new
00:37:36.240 feature, which I
00:37:37.700 actually would
00:37:38.380 consider paying
00:37:39.100 for.
00:37:40.360 I don't know.
00:37:41.120 I might have
00:37:41.760 something I want.
00:37:43.260 And again,
00:37:44.980 Twitter's entire
00:37:45.700 marketing department
00:37:46.580 didn't penetrate me
00:37:47.800 with that
00:37:48.160 information.
00:37:49.740 But Elon Musk
00:37:51.100 just did.
00:37:52.660 And what he
00:37:53.420 asked is, should
00:37:54.180 all the Twitter
00:37:54.740 Blue users get a
00:37:55.780 blue check?
00:37:57.000 And the answer
00:37:57.800 is, of course
00:37:58.420 not.
00:37:59.380 Of course not.
00:38:00.640 That would defeat
00:38:01.360 the entire purpose
00:38:02.240 of the blue check.
00:38:03.400 But it's a perfect
00:38:04.480 question if you want
00:38:06.900 to get rid of the
00:38:07.440 marketing department.
00:38:09.200 It's just a great
00:38:09.860 question.
00:38:10.480 Because it just
00:38:11.140 makes you think
00:38:11.660 about it.
00:38:12.000 It's not like he's
00:38:13.220 leading you to some
00:38:14.420 specific answer.
00:38:16.200 So no, Twitter
00:38:17.400 Blue users should
00:38:18.160 not get a blue
00:38:18.780 check, because that's
00:38:19.660 not what the blue
00:38:20.240 check is used for.
00:38:21.940 But now we all
00:38:23.400 know there's this
00:38:24.120 feature.
00:38:26.220 But it would give
00:38:27.160 you, and let's say
00:38:29.480 to his point, if you
00:38:31.280 gave everybody a blue
00:38:32.200 check who paid for
00:38:33.120 it, it would kind of
00:38:34.540 democratize the
00:38:35.560 influencers, right?
00:38:36.880 Because right now if
00:38:37.660 you have a blue
00:38:38.140 check like I do, you
00:38:39.840 get more attention.
00:38:41.320 Is that fair?
00:38:42.940 Should I get more
00:38:44.060 attention because people
00:38:46.920 try to impersonate me
00:38:49.280 online?
00:38:51.240 Which, by the way, has
00:38:52.100 been a big problem.
00:38:54.460 I don't know.
00:38:55.040 Maybe.
00:38:55.680 But it gives me too
00:38:56.740 much influence for the
00:38:58.340 wrong reason.
00:38:58.940 If the reason I get a
00:38:59.920 blue check is so that
00:39:00.840 somebody doesn't imitate
00:39:02.940 me, that's good.
00:39:04.920 But if then that blue
00:39:05.820 check gives me more
00:39:06.660 influence than my
00:39:07.820 opinions or voice
00:39:08.920 deserve, well, then
00:39:11.220 it's bad.
00:39:11.960 And I think maybe Elon
00:39:13.160 might have noticed
00:39:14.000 that.
00:39:14.480 Who knows?
00:39:17.280 But Elon also
00:39:18.280 mentioned, again, in
00:39:19.400 the last 24 hours, that
00:39:21.760 if you get Twitter
00:39:23.140 Blue, you wouldn't get
00:39:24.740 ads and maybe Twitter
00:39:26.820 should be a service
00:39:27.580 that doesn't serve up
00:39:28.580 corporate influence.
00:39:31.980 So if you have a
00:39:33.440 subscription service,
00:39:35.180 you're not as
00:39:36.060 influenced by keeping
00:39:38.220 the sponsors happy.
00:39:41.240 That's a good point.
00:39:42.700 And it gets to freedom
00:39:43.720 of speech and bias and
00:39:45.160 censorship and
00:39:46.340 everything.
00:39:46.780 If you just make a
00:39:47.540 subscription.
00:39:48.740 But would enough
00:39:50.300 people buy a
00:39:51.080 subscription to
00:39:51.960 Twitter?
00:39:52.940 And if you're not
00:39:53.800 willing to pay $3 a
00:39:55.060 month for Twitter,
00:39:56.820 should you be
00:39:57.580 tweeting?
00:39:59.440 Should you?
00:40:00.860 $3 a month?
00:40:01.980 If you're not
00:40:02.540 willing to spend
00:40:03.080 $3 a month, how
00:40:05.280 important do you
00:40:05.860 think your opinion
00:40:06.440 is that other
00:40:08.060 people should hear
00:40:08.760 it?
00:40:09.560 I don't know.
00:40:10.700 Now, it looks like
00:40:13.100 we're having some
00:40:13.600 network interruptions on
00:40:14.800 one of the
00:40:15.360 platforms.
00:40:20.860 I don't know what
00:40:21.600 that's about.
00:40:22.100 I don't see a problem
00:40:23.000 on my end.
00:40:24.140 My Wi-Fi is
00:40:25.200 screamingly good
00:40:26.100 today.
00:40:26.360 All right.
00:40:28.980 So watching Elon
00:40:30.660 Musk disassemble a
00:40:32.340 company in real
00:40:33.840 time, right in
00:40:35.000 front of you, is
00:40:36.340 probably the most
00:40:37.180 interesting thing a
00:40:38.680 business model nerd
00:40:40.160 could ever see.
00:40:41.240 And that's what I
00:40:41.780 am.
00:40:42.340 I'm really nerding
00:40:43.520 out on business
00:40:44.200 models.
00:40:45.100 Like, how do you
00:40:45.700 make a business
00:40:46.360 where all the parts
00:40:47.440 work just right?
00:40:48.260 So I'm just sort
00:40:49.860 of a student of
00:40:51.520 business models.
00:40:53.060 And watching him,
00:40:54.580 he's effectively
00:40:55.500 dismantling Twitter
00:40:57.160 psychologically first.
00:41:00.520 Right?
00:41:00.920 He's already
00:41:01.580 redesigned a new
00:41:02.660 Twitter in your
00:41:03.260 mind.
00:41:04.900 He's engineered
00:41:06.020 the damn thing
00:41:06.800 in your mind.
00:41:08.500 You know, it
00:41:08.800 doesn't have
00:41:09.140 corporate influences.
00:41:10.800 The blue checks
00:41:11.600 are handled
00:41:11.980 differently.
00:41:12.980 You can edit
00:41:13.620 maybe.
00:41:15.760 So he does it
00:41:16.860 first in your
00:41:17.460 mind.
00:41:17.860 Then he can sort
00:41:18.680 of A-B test it
00:41:19.580 based on people's
00:41:20.420 influence.
00:41:21.420 And then probably
00:41:22.360 can get it done
00:41:23.160 in reality if people
00:41:25.420 are on board with
00:41:26.100 it.
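That "A-B test it" instinct is a standard product technique: run two variants past live users and compare the engagement numbers. Purely as a toy illustration, with made-up numbers and nothing from Twitter's actual metrics, a minimal two-proportion comparison in Python might look like:

```python
# Toy A/B comparison of two feature variants (hypothetical data,
# not real Twitter metrics). Computes a two-proportion z statistic
# from first principles using only the standard library.
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Return the z statistic comparing engagement rates of variants A and B."""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    # Pooled rate under the null hypothesis that both variants perform alike.
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical poll-style numbers: variant B engages 5.4% vs A's 5.0%.
z = two_proportion_z(500, 10_000, 540, 10_000)
print(round(z, 2))
```

A z value near or above roughly 2 would suggest the difference between variants is unlikely to be noise; the hypothetical numbers here fall short of that.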
00:41:27.320 And I love watching
00:41:28.420 him do things that
00:41:29.220 only he can do.
00:41:31.020 That's really
00:41:31.820 special.
00:41:33.240 When I watch
00:41:33.920 basketball on TV,
00:41:35.760 I don't really care
00:41:36.620 who wins.
00:41:38.020 I want to see what
00:41:38.760 Stephen Curry does.
00:41:40.640 Or LeBron James.
00:41:42.200 Because they're just
00:41:43.080 not like other
00:41:43.680 people.
00:41:44.540 It's like watching
00:41:45.320 a superhuman do
00:41:46.360 something.
00:41:46.660 And watching
00:41:49.120 Elon do what
00:41:49.840 only Elon could
00:41:50.600 do, because he's
00:41:51.300 the only one rich
00:41:52.260 enough, influential
00:41:53.580 enough, knows how
00:41:55.980 to manage the
00:41:57.800 psychological elements
00:41:59.120 of it, cares enough
00:42:00.640 about it.
00:42:01.040 I mean, he's the
00:42:01.400 only one.
00:42:02.460 I don't think there
00:42:03.340 was anybody else who
00:42:04.060 could do what he's
00:42:04.580 doing right now with
00:42:05.440 Twitter.
00:42:06.000 And it's super
00:42:07.100 important.
00:42:08.520 It's like he's
00:42:10.180 rewriting the source
00:42:11.160 code, maybe the
00:42:13.440 operating system, of
00:42:14.780 America.
00:42:16.680 Because we went
00:42:17.960 from, you know,
00:42:18.580 okay, free speech
00:42:19.600 and it's in
00:42:20.500 newspapers and
00:42:21.300 randomness.
00:42:22.220 But once power
00:42:23.140 got consolidated
00:42:24.040 into the social
00:42:25.080 platforms, with
00:42:26.300 Twitter, I think
00:42:26.900 is the important
00:42:27.460 one.
00:42:27.960 Because Twitter is
00:42:28.820 the one that
00:42:29.240 influences journalists.
00:42:31.240 And then journalists
00:42:31.960 influence everybody
00:42:32.740 else.
00:42:33.040 So Twitter is like
00:42:33.660 the leverage.
00:42:34.600 So of course, Elon
00:42:35.280 finds the most
00:42:36.320 important lever,
00:42:37.440 Twitter, finds a
00:42:38.960 way to have the
00:42:39.740 most leverage on it
00:42:40.660 without buying the
00:42:41.300 whole company,
00:42:42.280 9%.
00:42:43.040 And now he's
00:42:44.840 digging in and
00:42:45.500 he's going to
00:42:45.900 re-engineer the
00:42:46.580 thing, clearly.
00:42:48.560 I don't think
00:42:49.340 there's any doubt
00:42:49.940 he is actually
00:42:50.600 going to re-engineer
00:42:51.460 the product.
00:42:52.520 And he might
00:42:53.100 actually fix the
00:42:54.000 biggest problem in
00:42:54.840 the country, which
00:42:55.540 is we can't trust
00:42:57.020 anything we see or
00:42:57.940 read.
00:42:59.420 Think about that.
00:43:00.720 He might actually
00:43:01.280 fix that.
00:43:02.900 And I'll say
00:43:04.000 again, there's
00:43:04.640 nobody else who
00:43:05.200 could have done
00:43:05.620 it.
00:43:06.800 The ultimate
00:43:07.820 patriot, if I may
00:43:10.320 say that, the
00:43:11.620 ultimate patriot is
00:43:13.500 the one who does
00:43:14.140 it because nobody
00:43:14.900 else can do it.
00:43:16.440 Now that might
00:43:17.260 include just going
00:43:18.300 to war, right?
00:43:19.180 You know, picking
00:43:19.540 up your rifle.
00:43:20.800 It needs lots of
00:43:21.620 people to say, well,
00:43:22.580 if I don't pick up
00:43:23.200 my rifle, you know,
00:43:25.560 I can't depend on
00:43:26.540 everybody else to do
00:43:27.320 it for me.
00:43:28.460 So the fact that he
00:43:30.340 just jumped into the
00:43:31.880 biggest problem in
00:43:32.760 the country, at
00:43:33.600 least the lever that
00:43:34.820 controls the biggest
00:43:35.780 set of problems, and
00:43:37.760 that he might actually
00:43:38.460 get control of this
00:43:39.400 thing, and he
00:43:40.300 is letting you
00:43:40.940 watch it while it
00:43:41.840 happens.
00:43:43.060 This is thrilling
00:43:44.020 stuff.
00:43:44.700 This is just
00:43:45.180 thrilling.
00:43:46.220 You get to watch
00:43:47.280 as he does it
00:43:47.960 right in front of
00:43:48.520 you.
00:43:49.460 This could not be
00:43:50.300 cooler for a nerd
00:43:51.340 like me.
00:43:52.880 All right, wait.
00:43:53.520 Here's the thing I
00:43:54.480 used to hear as a
00:43:55.100 kid.
00:43:56.200 Have you ever heard
00:43:56.780 this saying, only
00:43:57.600 God can make a
00:43:58.540 tree?
00:44:00.080 Only God can make
00:44:01.160 a tree.
00:44:02.520 I don't know why,
00:44:03.180 but you used to hear
00:44:03.660 that all the time
00:44:04.340 when I was a kid.
00:44:06.040 But today, Mark
00:44:08.920 Zuckerberg is
00:44:10.960 creating a
00:44:12.180 virtual world in
00:44:12.180 which Mark
00:44:13.520 Zuckerberg is
00:44:14.540 going to make
00:44:14.960 about a trillion
00:44:15.520 trees.
00:44:17.240 So it turns out
00:44:18.260 that Mark
00:44:19.160 Zuckerberg can
00:44:19.800 make a tree.
00:44:21.140 Now you're saying
00:44:21.700 to yourself,
00:44:22.320 that's not a tree.
00:44:26.160 That's like a
00:44:26.980 picture of a tree.
00:44:28.540 That's like a
00:44:29.100 digital tree.
00:44:30.660 Nobody's going to
00:44:31.360 be confused about
00:44:33.120 a digital tree
00:44:34.300 versus a real
00:44:35.780 tree.
00:44:36.060 So take your
00:44:37.780 stupid analogy
00:44:38.540 and shove it
00:44:39.840 up your stupid
00:44:40.580 whatever hole
00:44:42.180 and get out of
00:44:44.100 my face with
00:44:44.880 your dumb
00:44:45.460 thoughts.
00:44:46.920 That's what you
00:44:47.440 say.
00:44:48.020 That's what you
00:44:48.480 say to me.
00:44:49.560 And I deserve
00:44:50.360 every bit of it.
00:44:51.820 But here's my
00:44:52.440 counter.
00:44:53.340 If we're a
00:44:54.000 simulation, it is
00:44:54.880 almost certain
00:44:55.560 that we've been
00:44:56.200 programmed not to
00:44:57.320 notice that our
00:44:58.740 simulation is
00:45:00.460 simulated.
00:45:00.980 That would be
00:45:03.740 sort of basic
00:45:04.620 to the program
00:45:05.540 that the people
00:45:06.900 in it are
00:45:07.460 programmed not
00:45:08.440 to notice.
00:45:10.300 Code, don't
00:45:11.240 ever notice that
00:45:12.320 you're simulated
00:45:13.000 because it will
00:45:13.640 make the whole
00:45:14.100 thing not work.
00:45:16.580 So let me give
00:45:19.760 you a little
00:45:20.040 preview of what's
00:45:20.880 to come.
00:45:21.900 So let's say
00:45:22.540 Meta becomes a
00:45:23.620 big hit.
00:45:24.120 I think it
00:45:24.520 will.
00:45:25.740 You will have
00:45:26.520 an avatar.
00:45:27.580 There will be
00:45:27.960 some representation
00:45:28.920 of you.
00:45:29.620 It might be an
00:45:30.380 animal or
00:45:30.840 something.
00:45:31.220 But it will
00:45:31.740 be the one
00:45:32.140 you select.
00:45:33.740 The avatar
00:45:34.480 will be doing
00:45:36.040 stuff in Meta
00:45:36.840 whenever you're
00:45:37.960 online.
00:45:39.160 But only when
00:45:40.020 you're online,
00:45:40.640 right?
00:45:41.020 So if you're
00:45:41.580 not online,
00:45:42.640 your character
00:45:43.300 probably won't
00:45:44.220 exist.
00:45:46.740 But it seems
00:45:48.420 inevitable that
00:45:49.920 with artificial
00:45:50.560 intelligence and
00:45:51.980 improvements in
00:45:52.840 the artificial
00:45:53.640 environment, you
00:45:54.960 will want an
00:45:55.660 option where
00:45:56.480 your character
00:45:57.100 does some
00:45:57.720 things while
00:45:58.180 you're gone.
00:45:59.500 For example,
00:46:00.380 it might go
00:46:01.340 to work.
00:46:02.340 There might be
00:46:03.100 an economy
00:46:03.600 there.
00:46:04.300 You might have
00:46:04.760 to send your
00:46:05.400 avatar to work
00:46:06.220 when you're
00:46:06.560 doing something
00:46:07.000 else.
00:46:07.900 Maybe your
00:46:08.420 avatar will
00:46:09.040 mow the lawn,
00:46:10.040 go to bed.
00:46:11.440 Who knows?
00:46:12.560 But over time,
00:46:14.440 the amount of
00:46:15.720 things that your
00:46:16.380 avatar can do
00:46:17.160 when you're not
00:46:17.780 there should
00:46:19.100 increase.
00:46:20.100 Just because
00:46:20.660 everything gets
00:46:21.740 better and more
00:46:22.600 and developed to
00:46:24.680 a higher level
00:46:25.320 and more features.
00:46:26.080 So what
00:46:27.360 happens when
00:46:27.940 most of what
00:46:28.740 your avatar
00:46:29.260 does is when
00:46:30.240 you're not
00:46:30.600 there, and
00:46:31.680 a little bit
00:46:32.140 of it's when
00:46:32.580 you're there,
00:46:33.120 and then, and
00:46:34.600 then you die.
00:46:38.300 What happens
00:46:39.220 then?
00:46:40.400 You're going to
00:46:40.720 have this avatar
00:46:41.520 that could
00:46:42.020 actually live
00:46:42.880 autonomously
00:46:43.800 forever.
00:46:45.360 Because if
00:46:47.760 Facebook Meta
00:46:49.940 doesn't find
00:46:51.240 out you died,
00:46:52.760 doesn't your
00:46:53.360 avatar just
00:46:54.040 keep on going
00:46:54.820 like an
00:46:56.220 afterlife?
00:46:57.640 And couldn't
00:46:58.720 your relatives
00:46:59.420 visit you in
00:47:00.420 the AI version
00:47:02.440 of you in
00:47:03.180 the afterlife?
00:47:04.060 Could they not
00:47:04.740 enter the
00:47:05.280 simulation
00:47:05.800 through the
00:47:07.160 goggles or
00:47:08.920 whatever and
00:47:09.840 go visit
00:47:10.560 their deceased
00:47:11.800 relative who
00:47:12.980 lives on in
00:47:14.980 an artificial
00:47:15.380 form?
00:47:16.460 Now, for
00:47:17.300 most of you,
00:47:18.340 that artificial
00:47:18.920 form wouldn't
00:47:19.640 be a good
00:47:20.240 representation of
00:47:21.360 you.
00:47:22.300 It'd be, you
00:47:23.220 know, in all
00:47:24.460 the most
00:47:24.800 surfacy ways,
00:47:26.160 it would be
00:47:26.440 sort of like
00:47:26.920 you.
00:47:27.580 But what
00:47:28.080 about somebody
00:47:28.540 like me,
00:47:30.060 where my
00:47:30.640 entire mind
00:47:31.720 has basically
00:47:32.560 been sprayed
00:47:33.440 into the
00:47:33.900 universe in
00:47:34.600 ways that are
00:47:35.120 easy to
00:47:35.560 collect,
00:47:36.480 these videos?
00:47:37.900 I mean, if
00:47:38.280 you collected
00:47:38.860 all of my
00:47:39.340 live streams,
00:47:40.800 you'd have a
00:47:41.460 pretty good
00:47:42.120 idea of what
00:47:42.780 my opinion's
00:47:43.420 going to be
00:47:43.740 on the next
00:47:44.300 thing.
00:47:44.980 Because you
00:47:46.060 know how I
00:47:46.540 think, you
00:47:47.020 know how I
00:47:47.320 approach things.
00:47:48.220 So an AI
00:47:48.980 could create
00:47:49.720 me, put it
00:47:51.640 in an avatar
00:47:52.180 and make it
00:47:52.880 live forever
00:47:53.440 in Meta,
00:47:54.740 in the
00:47:55.340 metaverse.
00:47:56.700 You know
00:47:57.360 that can
00:47:57.780 happen.
00:47:58.780 And that
00:47:59.100 little version
00:47:59.760 of me,
00:48:00.240 suppose, just
00:48:01.620 suppose, that
00:48:02.780 it's some
00:48:03.140 future version
00:48:04.380 of technology,
00:48:05.980 suppose it
00:48:06.840 optionally
00:48:07.520 learns or
00:48:09.140 is programmed
00:48:10.360 to not
00:48:12.360 notice that
00:48:13.020 it lives in
00:48:13.440 a simulation.
00:48:16.100 Oh, you
00:48:16.680 can do
00:48:16.940 that.
00:48:17.880 You can
00:48:18.280 make your
00:48:18.680 software always
00:48:19.660 not notice it
00:48:20.580 lives in a
00:48:21.000 simulation.
00:48:21.560 Whenever there's
00:48:22.060 a glitch, it
00:48:22.580 just doesn't
00:48:23.060 record it.
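That "software that never notices its own glitches" idea can be sketched in a few lines. This is purely a toy illustration with hypothetical names, not how any real engine works: the perception layer drops glitch-tagged observations before they ever reach memory, so no evidence of the simulation gets recorded.

```python
# Toy sketch: an agent whose perception filter silently discards any
# observation tagged as a glitch, so its memory never contains
# evidence of the simulation. All names here are hypothetical.

class FilteredAgent:
    def __init__(self):
        self.memory = []

    def perceive(self, observation, is_glitch=False):
        # Glitches are dropped before they ever reach memory.
        if is_glitch:
            return
        self.memory.append(observation)

agent = FilteredAgent()
agent.perceive("sunrise")
agent.perceive("texture failed to load", is_glitch=True)  # never recorded
agent.perceive("coffee")
print(agent.memory)  # prints ['sunrise', 'coffee']
```

The design point is that the filtering happens upstream of memory, so the agent cannot even detect that something was removed.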
00:48:24.940 So am I
00:48:26.740 reborn?
00:48:28.080 Because it's
00:48:28.720 going to have
00:48:29.020 my personality,
00:48:30.220 it will be
00:48:30.600 based on me,
00:48:32.120 and just as I,
00:48:33.680 sitting in this
00:48:34.380 chair, don't
00:48:34.880 have probably
00:48:36.060 any cells of
00:48:37.180 my body in
00:48:38.360 common with the
00:48:39.240 baby I used to
00:48:40.320 be, all my
00:48:40.940 cells are
00:48:41.300 different.
00:48:42.720 It's all
00:48:43.020 different cells.
00:48:43.800 I mean, same
00:48:44.360 DNA pattern,
00:48:46.780 but I think all
00:48:47.280 the parts are
00:48:47.860 different by now,
00:48:48.560 right?
00:48:48.800 They've all worn
00:48:49.400 out and been
00:48:49.820 replaced.
00:48:50.620 So this would
00:48:51.140 just be a new
00:48:51.740 replacement part.
00:48:53.340 It would just
00:48:53.700 be a digital
00:48:54.280 replacement for a
00:48:55.260 physical entity,
00:48:56.120 and it would
00:48:56.660 live forever,
00:48:57.780 and it would
00:48:58.180 just have a
00:48:58.580 life.
00:48:59.240 It might learn
00:48:59.860 things that
00:49:00.580 would have been
00:49:01.520 interesting to
00:49:02.140 me in life.
00:49:05.800 Well, that's
00:49:06.780 what's coming.
00:49:07.200 So I do
00:49:08.500 believe that
00:49:09.240 Mark Zuckerberg,
00:49:11.040 after becoming
00:49:12.980 the founder of
00:49:13.600 Facebook, said
00:49:14.480 to himself,
00:49:15.020 well, how do I
00:49:15.460 top that?
00:49:16.880 Like, the only
00:49:17.380 job that's better
00:49:18.220 than being the
00:49:18.800 founder of
00:49:19.380 Facebook would
00:49:20.240 be what?
00:49:21.640 God?
00:49:24.120 And then he
00:49:25.000 became God
00:49:25.800 of his own
00:49:28.380 simulation.
00:49:29.720 Not of this
00:49:30.300 simulation.
00:49:31.480 He's not God
00:49:32.480 of you, but
00:49:34.220 he's going to
00:49:34.740 be the God
00:49:35.280 of Meta.
00:49:35.740 And that
00:49:36.860 means that
00:49:37.660 what you can
00:49:38.320 and cannot
00:49:38.760 do in his
00:49:39.740 simulated universe
00:49:41.000 will be decided
00:49:41.800 by him.
00:49:42.840 What's that
00:49:43.260 sound like?
00:49:45.600 God.
00:49:47.100 And it's not
00:49:47.880 even like a
00:49:48.420 dictator.
00:49:49.460 If it were a
00:49:50.360 dictator, you
00:49:51.560 could still do
00:49:52.260 the things you're
00:49:52.820 not supposed to
00:49:53.380 do, but you
00:49:54.080 would be
00:49:54.620 punished.
00:49:55.900 In Meta, the
00:49:58.580 God, or
00:49:59.340 whoever controls
00:50:00.100 the programming,
00:50:01.520 can just say
00:50:02.080 these things
00:50:02.600 can't happen.
00:50:04.000 Just completely
00:50:04.660 stop you from
00:50:05.360 doing things
00:50:05.880 or give you
00:50:07.020 powers or
00:50:07.720 heal people or
00:50:08.620 anything else
00:50:09.520 within Meta.
00:50:10.900 So yes,
00:50:11.620 Zuckerberg is
00:50:12.460 becoming God
00:50:13.400 of his own
00:50:13.980 simulation.
00:50:15.360 The individuals
00:50:16.200 in it will
00:50:16.760 still have
00:50:17.180 something like
00:50:17.840 free will or
00:50:18.580 the impression
00:50:19.120 of free will,
00:50:20.120 but he will
00:50:20.820 determine what
00:50:21.420 they can or
00:50:21.960 cannot do.
00:50:23.840 Like God.
00:50:27.400 He actually
00:50:28.280 found a way.
00:50:29.820 He created
00:50:30.540 Facebook and
00:50:32.180 still found a
00:50:32.960 way to get
00:50:33.340 promoted.
00:50:33.780 If you're
00:50:35.520 not impressed
00:50:36.060 by Mark
00:50:36.540 Zuckerberg now,
00:50:38.320 I mean, I
00:50:38.700 was impressed
00:50:39.280 when he
00:50:39.760 started Facebook.
00:50:41.420 I was about
00:50:42.440 ten times more
00:50:43.480 impressed when
00:50:45.160 it turned out
00:50:45.660 he was a
00:50:46.140 great CEO of
00:50:47.520 a mature
00:50:48.380 company.
00:50:49.900 Didn't see that
00:50:50.580 coming.
00:50:51.380 I mean, that's
00:50:52.020 really rare,
00:50:53.300 isn't it?
00:50:54.280 You could say
00:50:54.920 Bill Gates did
00:50:55.620 it, but it's
00:50:56.460 rare.
00:50:56.720 And now
00:50:59.900 he's promoted
00:51:01.080 himself to
00:51:01.760 God of a
00:51:02.940 simulated reality.
00:51:05.180 Not bad.
00:51:06.280 Not bad, Mark.
00:51:07.740 Well, the
00:51:08.300 China lockdowns
00:51:09.260 are bad, and
00:51:10.840 I was reading a
00:51:11.440 tweet thread by
00:51:12.100 Naomi Wu, and
00:51:16.140 she says,
00:51:17.120 things are quite
00:51:17.600 bad in Shanghai.
00:51:18.660 China has a
00:51:19.220 long and ugly
00:51:19.800 history of
00:51:20.380 famine,
00:51:21.140 displacement,
00:51:21.760 and civil
00:51:22.340 unrest.
00:51:22.780 The social
00:51:23.980 contract we
00:51:24.760 have is
00:51:25.300 basically no
00:51:26.980 more of that
00:51:27.780 and a steady
00:51:28.740 improvement in
00:51:30.340 quality of life,
00:51:31.140 and we won't
00:51:31.720 quibble too
00:51:32.360 much.
00:51:32.760 So I'll say
00:51:33.140 that in my
00:51:33.500 own words.
00:51:34.880 So she's
00:51:35.500 saying that
00:51:36.960 with her
00:51:37.680 knowledge of
00:51:38.500 the local
00:51:39.260 culture, she
00:51:40.600 says that
00:51:41.240 there's a
00:51:42.340 social contract,
00:51:43.260 basically an
00:51:44.180 understanding
00:51:44.780 that the
00:51:47.520 civilians won't
00:51:48.600 rebel against
00:51:49.280 the government,
00:51:50.520 even though the
00:51:51.080 government does a
00:51:51.680 lot of things
00:51:52.080 they don't like,
00:51:52.780 and limits their
00:51:54.060 freedom in a
00:51:54.680 number of ways,
00:51:55.320 they're not
00:51:55.560 going to fight
00:51:56.000 it so long as
00:51:57.620 the government
00:51:58.080 keeps them fed
00:51:59.540 and sort of
00:52:01.000 moving forward.
00:52:02.500 And they have.
00:52:03.820 Since the
00:52:04.240 government has
00:52:04.880 kept them fed,
00:52:06.300 and has kept
00:52:06.980 them moving
00:52:07.380 forward, and
00:52:07.940 hasn't done
00:52:08.520 forced displacements
00:52:09.920 except the
00:52:11.180 Uyghurs, I
00:52:11.660 guess, then
00:52:13.540 Naomi's view is
00:52:14.660 that that's all
00:52:15.300 it takes.
00:52:16.640 So as long as
00:52:17.320 they keep those
00:52:18.420 just basic social
00:52:19.560 contract things in
00:52:20.560 place, you're not
00:52:21.460 going to see
00:52:21.740 anything like a
00:52:22.320 revolution.
00:52:23.400 It's just not
00:52:23.880 going to happen.
00:52:24.880 But here's where
00:52:25.960 she warns there's
00:52:27.240 trouble.
00:52:28.580 The lockdowns
00:52:29.740 pose a risk of
00:52:32.380 starvation.
00:52:34.600 And if the
00:52:35.900 Chinese feel
00:52:37.980 that that trigger
00:52:38.860 gets pulled in
00:52:40.800 their minds, now
00:52:42.040 this would be
00:52:42.580 Naomi Wu's
00:52:43.580 point of view,
00:52:44.180 I'm not the
00:52:44.920 expert here, but
00:52:46.120 in her point of
00:52:46.740 view, if famine
00:52:48.000 kicks in, it's a
00:52:49.200 whole new game.
00:52:49.840 Because that
00:52:51.060 means the social
00:52:52.040 contract has been
00:52:52.940 violated.
00:52:54.780 What's that
00:52:55.440 mean?
00:52:56.360 Like, could
00:52:57.620 things just turn
00:52:58.680 on a dime?
00:53:00.220 Is it that
00:53:02.040 sensitive to a
00:53:04.280 violation of the
00:53:05.040 social contract,
00:53:06.060 specifically famine,
00:53:07.800 you know, within the
00:53:08.540 lockdown areas?
00:53:09.800 If the government
00:53:10.660 can get them
00:53:11.420 food and, you
00:53:13.620 know, get it to
00:53:14.160 them without
00:53:14.520 anybody dying of
00:53:16.620 famine, they'd
00:53:18.940 better do it.
00:53:20.200 Because probably
00:53:20.720 the whole system
00:53:21.440 depends on it.
00:53:22.680 Now, according to
00:53:23.700 Naomi Wu's
00:53:24.580 reading of her
00:53:25.700 own culture.
00:53:30.160 The AP is
00:53:31.140 reporting that
00:53:31.660 U.S. intel
00:53:32.260 officials think
00:53:33.120 Putin might use
00:53:35.140 Ukraine as a
00:53:36.280 pretext to
00:53:37.180 interfere in our
00:53:38.080 next election.
00:53:42.100 Russia collusion
00:53:43.220 two.
00:53:44.160 Here we come.
00:53:46.080 Do we ever have
00:53:46.980 new news?
00:53:47.800 Or is it just
00:53:48.260 the same news
00:53:49.820 all over again?
00:53:51.440 Yes, of course,
00:53:52.440 there will be
00:53:52.860 reports of
00:53:53.620 Russian interference
00:53:55.100 to help whoever
00:53:56.520 is the Republican
00:53:57.520 candidate.
00:53:59.180 And do we
00:53:59.960 believe the AP
00:54:00.660 story that the
00:54:01.440 U.S. intel
00:54:02.000 officials think
00:54:02.840 that?
00:54:03.640 I don't believe
00:54:04.360 any of it.
00:54:05.680 I don't believe
00:54:06.280 any of it.
00:54:07.400 All right.
00:54:09.560 Interestingly,
00:54:10.820 CNN has an
00:54:11.900 opinion piece on
00:54:12.640 their site by
00:54:14.480 S.E.
00:54:14.960 Cupp.
00:54:15.920 I think she
00:54:16.400 used to be
00:54:16.800 Republican.
00:54:17.620 Can somebody
00:54:18.180 give me a
00:54:19.620 fact check on
00:54:20.180 that?
00:54:20.800 But she's been
00:54:21.480 sort of anti-Trump
00:54:22.580 for a long time
00:54:23.400 and a staple
00:54:24.720 on CNN.
00:54:26.240 And she wrote
00:54:28.020 that Obama was
00:54:28.720 wrong about
00:54:30.080 Russia being the
00:54:30.860 JV team.
00:54:31.740 He was wrong
00:54:32.580 to mock Romney
00:54:33.540 when Romney in
00:54:34.420 the debate
00:54:34.880 against Obama
00:54:37.080 said that Russia
00:54:38.640 was our biggest
00:54:39.260 geopolitical foe
00:54:40.120 and Obama
00:54:41.100 mocked him
00:54:41.700 because he said
00:54:42.260 it was China.
00:54:42.780 And now Romney's
00:54:44.200 right, it looks
00:54:45.520 like.
00:54:47.560 So the content
00:54:48.800 of the story
00:54:49.260 isn't so interesting
00:54:50.100 as the fact
00:54:50.820 that it's on
00:54:51.400 CNN's website.
00:54:53.520 So CNN has
00:54:54.580 a major
00:54:56.020 critical story
00:54:57.800 about Obama
00:54:58.400 on their website.
00:55:01.580 What do you
00:55:02.220 think of that?
00:55:03.560 Does it feel
00:55:04.440 like something's
00:55:05.020 changing over there?
00:55:06.460 Because they've
00:55:07.160 changed management,
00:55:08.020 right?
00:55:08.700 And I believe
00:55:09.620 that their new
00:55:10.200 management said
00:55:10.860 something about
00:55:11.480 reporting real
00:55:12.200 news for a change.
00:55:13.820 Am I right
00:55:14.500 about that?
00:55:15.320 Now this isn't
00:55:16.060 news, it's
00:55:16.500 opinion.
00:55:17.440 But I don't
00:55:19.480 know that I
00:55:20.020 would have seen
00:55:20.440 this opinion
00:55:21.020 before.
00:55:22.260 Or do you
00:55:23.020 think that
00:55:23.340 they're just
00:55:23.660 trying to get
00:55:24.240 Romney to be
00:55:25.080 the candidate?
00:55:26.840 Oh, is that
00:55:27.260 what they're
00:55:27.500 doing?
00:55:28.220 Do you think
00:55:28.820 they're just
00:55:29.140 trying to
00:55:29.460 support Romney
00:55:30.180 to mess
00:55:31.160 with the
00:55:31.600 Republicans?
00:55:32.380 Because part
00:55:33.280 of the
00:55:33.620 criticizing Obama
00:55:34.880 was to say
00:55:35.660 Romney was
00:55:36.620 right.
00:55:39.020 But
00:55:39.420 still,
00:55:40.180 I can't see
00:55:40.840 CNN throwing
00:55:42.920 Obama under
00:55:44.120 the bus
00:55:44.500 even for
00:55:45.200 some clever
00:55:45.800 plan like
00:55:46.400 that.
00:55:47.360 I don't
00:55:47.740 see it.
00:55:48.440 That's too
00:55:49.060 clever, I
00:55:49.580 think.
00:55:50.400 I don't
00:55:50.660 see it.
00:55:52.620 All right.
00:55:53.620 That, ladies
00:55:54.380 and gentlemen,
00:55:55.460 is a conclusion
00:55:56.500 of my prepared
00:55:57.400 remarks.
00:55:58.540 I think I've
00:55:59.380 cured many of
00:56:00.180 you of your
00:56:00.800 laziness.
00:56:01.860 I have
00:56:02.160 rewritten your
00:56:02.940 operating code,
00:56:05.020 taught you the
00:56:05.600 secret of how
00:56:07.140 to succeed.
00:56:07.640 It's all
00:56:08.340 about
00:56:08.820 imagining the
00:56:09.520 good outcomes
00:56:10.120 and the
00:56:10.680 dopamine.
00:56:11.980 And I've
00:56:12.920 given you a
00:56:13.580 single data
00:56:14.220 point that
00:56:14.740 can convince
00:56:15.400 Putin to
00:56:16.040 stop the
00:56:16.560 war in
00:56:16.960 Ukraine,
00:56:17.600 and we
00:56:18.200 can even
00:56:18.560 make up
00:56:18.940 the number.
00:56:20.080 We don't
00:56:20.480 even have
00:56:20.820 to have a
00:56:21.120 real number.
00:56:22.280 We just
00:56:22.800 have to have
00:56:23.260 the news
00:56:23.680 talk about
00:56:24.200 it.
00:56:25.780 And so
00:56:27.100 now that I've
00:56:28.200 cured most of
00:56:28.940 your personal
00:56:29.600 problems as
00:56:30.400 well as war
00:56:31.700 in Ukraine,
00:56:32.680 I think I've
00:56:33.660 done enough
00:56:34.060 for now.
00:56:34.520 But because
00:56:37.340 I like
00:56:38.000 extra, I'm
00:56:39.440 going to keep
00:56:39.860 working today
00:56:40.480 and make
00:56:40.900 some comics
00:56:41.980 and maybe
00:56:42.560 work on my
00:56:43.080 book a little
00:56:43.440 bit.
00:56:43.680 We'll see
00:56:43.920 what goes.
00:56:44.860 We'll see
00:56:45.280 what happens.
00:56:46.480 But I
00:56:48.340 think you can
00:56:48.820 all agree
00:56:49.260 this live
00:56:50.480 stream was
00:56:51.180 a highlight
00:56:52.680 of civilization
00:56:53.420 probably.
00:56:54.960 100,000
00:56:55.600 years of
00:56:56.300 humans in
00:56:57.840 our form
00:56:58.360 and nothing
00:57:00.280 better than
00:57:00.740 this so far.
00:57:02.040 This was
00:57:02.540 about the
00:57:03.460 best.
00:57:03.780 And I
00:57:04.600 think you
00:57:04.880 agree.
00:57:05.980 Now,
00:57:07.380 join me for
00:57:08.220 a little
00:57:08.560 bit of a
00:57:09.340 boost in
00:57:11.300 your dopamine.
00:57:13.280 Think about
00:57:13.780 something awesome
00:57:14.600 that's going to
00:57:15.060 happen today.
00:57:16.740 Or something
00:57:17.480 awesome that'll
00:57:18.100 happen in the
00:57:18.460 next few years,
00:57:19.120 even better,
00:57:19.660 because you
00:57:20.180 can keep
00:57:20.560 that one
00:57:20.880 for a few
00:57:21.260 years.
00:57:22.180 Just think
00:57:22.540 about something
00:57:22.980 awesome and
00:57:24.120 watch how much
00:57:24.620 better your
00:57:25.020 day is.
00:57:25.860 And that is
00:57:26.980 the end of
00:57:27.840 my program.
00:57:28.440 I'll see you
00:57:28.740 tomorrow.
00:57:29.640 YouTube.