Real Coffee with Scott Adams - April 07, 2022


Episode 1706 Scott Adams: Watch Me Connect Politics, AI, The Simulation and Twitter Into One Story


Episode Stats

Length

1 hour and 3 minutes

Words per Minute

151.67

Word Count

9,559

Sentence Count

671

Misogynist Sentences

7

Hate Speech Sentences

20


Summary

In this episode of Coffee with Scott Adams, host Scott Adams talks about psychedelics: why they might be the next big thing in drugs and mental health, and why they might not be as bad as you think.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of the entire civilization.
00:00:12.500 It's called Coffee with Scott Adams, and if you didn't think it could get any better, surprise, it's whiteboard day.
00:00:20.300 Yes, we will have a whiteboard in which I'll connect the seemingly different fields of politics, artificial intelligence,
00:00:28.480 the simulation, and Twitter.
00:00:31.200 Yeah, I'll do all that today.
00:00:32.840 And in order for you to be primed and ready for that, this mind-blowing experience that is the simultaneous sip and coffee with Scott Adams,
00:00:41.640 you're going to need to get ready, and all you need to be ready for this amazing, amazing experience is a cup or mug,
00:00:47.680 a glass, a tank or chalice or stein, a canteen, jug or flask, a vessel of any kind, filled with your favorite liquid.
00:00:52.720 I like coffee.
00:00:53.920 And join me now for the unparalleled pleasure.
00:01:00.540 It's the dopamine of the day.
00:01:02.220 It's the thing that makes everything better.
00:01:05.320 It's called the simultaneous sip, and it happens now.
00:01:07.960 Go.
00:01:12.500 Oh.
00:01:15.620 That, ladies and gentlemen, is amazing.
00:01:19.180 I'd like to start with a helpful tip.
00:01:22.100 Have you ever bought anything on Amazon?
00:01:26.360 Well, if you have, and you've bought more than one thing, you may have run into a situation I run into often.
00:01:32.720 It's called the scale problem.
00:01:36.200 As in, I think I'm buying a big old bag of something, and it shows up like it's a free sample.
00:01:44.120 How many times has that happened to you?
00:01:48.100 You know, you buy a chair for your living room, and it shows up, and it's like a, it's a Barbie chair.
00:01:52.760 You're like, you know, that looked like a real chair.
00:01:55.600 In my defense, I did not check the specs.
00:01:58.100 It looked like a chair.
00:01:59.120 It said a chair.
00:02:00.000 I bought the chair.
00:02:01.500 It just happened to be two inches tall.
00:02:03.200 Well, this brings me to my recent purchase, which should have been about this tall, about this wide, the big one.
00:02:14.160 But when you look at the little picture, it looks exactly the same.
00:02:18.340 And so I suggest the following human interface improvement for Amazon.
00:02:22.740 Jeff Bezos, if you're listening, I suggest this.
00:02:26.960 In any situation in which there might be any potential ambiguity about the size and scale of an object,
00:02:35.960 it is not good enough to include it only in the description, which you must click.
00:02:41.540 You must also have a human hand in the picture, preferably the same human hand.
00:02:46.860 Because if this had a human hand in it, I would know exactly how big it was every single time.
00:02:51.000 And how hard is it to put a hand in a picture?
00:02:53.640 Not very hard.
00:02:54.600 You can even digitally add it, just nearby.
00:02:57.660 Hand.
00:02:58.680 Picture.
00:03:00.220 So, please, user interface developers at Amazon, who are, by the way, some of the best in the world.
00:03:08.860 Amazon has some of the best user interfaces.
00:03:10.600 But that one thing, that one thing bites me in the ass about one time in five, probably, literally.
00:03:18.240 I just get some weird size.
00:03:21.000 All right.
00:03:24.420 You know, lately, if you've been watching my live streams, you know that I've been adding quite a bit to civilization.
00:03:32.260 I've had insightful comments ranging from, oh, I don't know, geopolitics.
00:03:37.600 You know, one of my many sort of fields of expertise.
00:03:42.020 The supply chain, where I've, you know, wasn't an expert until just a few weeks ago.
00:03:47.080 Now I am.
00:03:48.240 And, of course, the global economy, something that, you know, people like me know everything about.
00:03:53.440 So, while I've made these tremendous contributions to society, it seems that the only thing that got picked up by the media that I did in the last two weeks was the following tweet.
00:04:03.900 In which I tweeted, Madonna is transforming into Jar Jar Binks, and no one is talking about it.
00:04:11.320 Yes, of all the things I offer to this world, my many nuggets of wisdom, only one left the little bubble, which is this live stream, into the larger world to make a dent.
00:04:25.220 And it was that tweet.
00:04:27.980 Madonna is transforming into Jar Jar Binks, and no one is talking about it.
00:04:32.840 So, that'll keep me humble for a while.
00:04:35.760 All right.
00:04:37.600 In an ongoing trend, which you should watch very carefully, you've heard me say this before, but the more it happens, the bigger the story gets, right?
00:04:48.520 Which is that mushrooms are becoming mainstream almost instantly.
00:04:53.360 There's something about 2022 that's happening that is hard to explain.
00:04:58.680 But now the Washington Post has a story whose headline says, psychedelics may ease cancer patients' depression and anxieties.
00:05:06.660 You know, quote, these drugs were banned decades ago.
00:05:09.320 My clinical trial suggests they might have a meaningful positive effect in treating mood issues.
00:05:17.300 It's happening.
00:05:18.240 I don't think those of you who have had no, let's say, experience with mushrooms, I don't think you know how big this is.
00:05:30.420 This is just about the biggest story in the world.
00:05:34.060 You know, obviously the economy and war and viruses are big stories.
00:05:38.280 But in terms of our, let's say, our subjective experience of life, this is the biggest story.
00:05:46.440 This is deeply transforming of humanity.
00:05:50.180 I mean, I'm not sure anything's going to be the same after this.
00:05:53.340 And you don't have to get everybody on mushrooms.
00:05:56.160 That's not necessary.
00:05:57.600 You just have to get the right people on mushrooms.
00:06:01.740 Right?
00:06:02.300 Not everybody.
00:06:03.340 You just got to get the right people on them.
00:06:05.800 You know what I mean?
00:06:06.360 Let me ask you this.
00:06:08.240 Do you think Putin has done mushrooms?
00:06:11.340 Serious question.
00:06:12.820 Serious question.
00:06:14.120 How many of you think Putin has done mushrooms?
00:06:17.620 I would bet a very large sum of money that he has not.
00:06:22.200 Because you know what?
00:06:22.980 You don't find yourself in this situation if you had.
00:06:27.160 And again, people who have experience with this are saying, oh yeah, I get what you're saying.
00:06:32.080 And those who don't have experience are saying, I don't even understand what that means.
00:06:36.360 Like, how do you know he hasn't done mushrooms?
00:06:39.540 Do you know how I know Putin hasn't done mushrooms?
00:06:43.180 Because he wouldn't be Putin if he had.
00:06:46.360 It would have fundamentally rewired him.
00:06:49.180 Like, he'd be playing the, yeah, exactly, he's an ego killer.
00:06:54.140 Russia, especially with Putin at the helm, is suffering from almost a personality disorder.
00:07:03.740 That they have to own Ukraine, and they have to subjugate people, and they have to be awesome.
00:07:08.660 They have to protect their egos, and Russia's history, and, you know, the Russian people, blah, blah, blah, blah, blah.
00:07:14.260 You take mushrooms once, that all goes away.
00:07:18.380 It all goes away.
00:07:19.820 And then you start thinking, well, what would make everybody happy?
00:07:24.380 And if I made everybody happy, would that work out for me too?
00:07:27.920 Probably.
00:07:29.600 Probably.
00:07:30.040 So, suddenly, everything is different.
00:07:33.600 Yeah, mushrooms change everything.
00:07:35.540 Might be world peace.
00:07:38.800 And by the way, I don't recommend that you try mushrooms.
00:07:41.440 Let me be clear.
00:07:42.760 Because this is a public forum, and who knows who's watching.
00:07:46.980 I do not recommend any illegal drugs.
00:07:50.320 Talk to your doctors only.
00:07:51.580 I'm going to give you a little tip that may completely change the lives of some of you.
00:08:01.460 This is one of my favorite things to do.
00:08:04.600 And I'm going to make my claim really small, so that it's not some ridiculous over-claim.
00:08:10.680 My claim is I'm going to give you a reframe that might solve, I don't know, 1%, maybe less, maybe fewer,
00:08:21.580 of your mental problems.
00:08:25.840 Maybe.
00:08:26.740 I think about 1 out of 100 of you might just walk away from this live stream saying,
00:08:31.360 okay, that just changed my life.
00:08:33.780 All right?
00:08:34.640 So, I'm setting myself up for a pretty high bar.
00:08:39.720 It goes like this.
00:08:41.380 I'm going to start with this assumption, and then I'll give you the idea.
00:08:44.360 The assumption is this, and I want to see if you'll agree with the starting premise.
00:08:47.720 That we think other people think like us.
00:08:53.140 Do you buy the first assumption?
00:08:55.640 That all of us are biased by the belief that other people's brains are processing things somewhat similarly to ours.
00:09:04.740 All right, so that's the first assumption.
00:09:06.260 I think you largely believe that we think they process like us.
00:09:11.360 Now, the no's I'm seeing, I think you're maybe interpreting the question a little bit differently.
00:09:18.320 Because we do know that people are different at the same time we think they're not, right?
00:09:22.240 So, we do hold two beliefs that are opposites at the same time.
00:09:26.100 We hold the belief that people think like us, and we act on it.
00:09:31.500 At the same time, we know it's not true.
00:09:33.540 We know people don't think the way we think, right?
00:09:36.500 But we act like it's true, even though we know it's not.
00:09:40.540 So, that's the first part.
00:09:43.300 That's just that.
00:09:45.240 And I'm going to be talking to people who feel they have low self-esteem,
00:09:50.320 and that other people judge them poorly.
00:09:54.480 Is there anybody watching who would fall into that category?
00:09:57.560 You have low self-esteem, and you believe other people are judging you.
00:10:02.920 Quite a few, all right?
00:10:04.400 That's a lot of yeses.
00:10:05.400 The yeses just start popping up on the locals' platform.
00:10:10.900 Now, not everybody, all right?
00:10:13.800 All right, now here's the reframe,
00:10:16.380 specifically for people who live in, let's say, their reality
00:10:20.780 is that they feel they have low self-esteem,
00:10:24.800 or they feel worthless, and they feel other people judging them.
00:10:29.540 Here's the reframe that fixes it.
00:10:33.180 Stop judging other people.
00:10:36.060 That's it.
00:10:38.620 That's it.
00:10:39.180 Stop judging other people.
00:10:42.220 Do you know why?
00:10:44.000 If you can train yourself to stop judging other people,
00:10:47.200 which would take a while,
00:10:48.320 you have to just keep reminding yourself not to do it.
00:10:50.660 The trick that I use is that I literally believe that everybody has the same value
00:10:57.020 because there's no such thing as some metric for judging your value.
00:11:03.440 You could say, who's more valuable for having a baby?
00:11:07.680 Well, women, you know, of a certain age and certain health situation.
00:11:14.180 But are they more valuable than everybody else?
00:11:17.280 No, they fit a certain requirement, a need.
00:11:21.540 You know, they're important in that way.
00:11:23.380 But I don't fundamentally believe that anybody's worth more than anybody else.
00:11:28.600 The law doesn't judge you that way unless you break the law.
00:11:32.040 So if you could learn to simply talk yourself into not judging other people,
00:11:39.140 do you know what would happen?
00:11:40.020 In theory, in theory, if you stop judging other people reflexively
00:11:47.440 and just train yourself, just don't think that way.
00:11:50.340 Just everybody's equal.
00:11:51.660 We're all the same.
00:11:53.580 As soon as you do that,
00:11:55.420 you're going to stop worrying about what they think of you.
00:11:58.320 You know why?
00:11:59.920 Because you can hold in your head two thoughts that don't make sense together.
00:12:03.940 One is that, you know, everybody acts the same
00:12:07.060 and the other is that everybody thinks differently.
00:12:08.780 We kind of hold them at the same time.
00:12:10.780 But as soon as you see other people thinking like you do,
00:12:14.220 if your view is that everybody is the same
00:12:17.100 and then you imagine that they have the same view as you,
00:12:20.340 which they don't,
00:12:21.640 but this is the trick, right?
00:12:23.720 If you have a bias,
00:12:26.040 and you know you do,
00:12:27.280 if you have a bias toward thinking people do or should think the way you think,
00:12:31.880 then change the way you think.
00:12:35.080 Are you following it?
00:12:36.220 If you change the way you think,
00:12:38.000 it should change your subjective impression
00:12:42.020 of what other people are thinking of you.
00:12:47.400 I think 1% of you just got cured.
00:12:52.520 I'm watching the comments come in on locals
00:12:54.480 because they come in faster,
00:12:56.180 and people are having a good reaction to it.
00:12:59.780 Now, what would be the downside of trying it?
00:13:04.060 Do you see any?
00:13:05.300 Is there any downside of simply trying to judge other people less
00:13:10.560 as your own solution to how you feel about what other people think of you?
00:13:14.300 There's no downside.
00:13:15.780 Can you think of anybody who would say,
00:13:18.440 no, you should be more judgmental about people's worth?
00:13:22.140 Now, remember, I'm not telling you that you should embrace their choices.
00:13:27.400 Not that.
00:13:28.660 You can still disagree with their choices, of course.
00:13:31.420 Just don't judge them.
00:13:34.080 They're just people.
00:13:35.520 We're all the same, value-wise.
00:13:40.260 All right.
00:13:41.600 And I would say that as awesome as I often feel I am in some narrow areas,
00:13:48.020 because I have good self-esteem about some specific things that I do or have done,
00:13:54.700 of all the things you could do in the world,
00:13:57.680 the things that you could be good at,
00:14:00.040 what percentage of all the things that a person can be good at am I good at?
00:14:06.620 Like, less than 0.001%?
00:14:11.480 That's about as good as you can get.
00:14:13.180 Like, if you take the most awesome person,
00:14:17.320 just think of your friend or maybe even somebody famous,
00:14:20.900 whoever you think hits all the notes.
00:14:23.680 Like, oh, that's the person who's really got it together.
00:14:25.700 They can do this, and they can do that, and they're this and that.
00:14:30.140 Of all the things that people can do,
00:14:33.160 what percentage can they do, really?
00:14:36.720 About 0.0001.
00:14:39.800 Almost exactly the same as you.
00:14:41.720 Like, if you look at it mathematically, nobody can do anything.
00:14:46.000 I mean, it all rounds to zero.
00:14:48.140 So to imagine that somebody else's 0.001 effectiveness in life
00:14:52.540 is so much better than your 0.001.
00:14:56.840 I mean, I guess it is mathematically, but not in a real way.
00:15:01.840 So that should help some of you.
00:15:06.660 Good news.
00:15:08.000 The good guys win sometimes.
00:15:09.860 Do you remember the story of author Alex Epstein,
00:15:14.140 who found out that the Wall Street Journal
00:15:16.280 was planning a hit piece on him
00:15:17.880 before his book even came out?
00:15:22.080 And so he actually organized a...
00:15:27.520 His book is called Fossil Future.
00:15:29.780 And I guess the Washington Post had come up with some kind of angle,
00:15:32.760 maybe because they don't like a book that is, let's say,
00:15:37.240 not as anti-fossil fuel as they would like.
00:15:40.180 You know what I mean?
00:15:41.480 That they might try to suppress that kind of a book,
00:15:43.660 especially if it's good.
00:15:45.380 They don't try to suppress a book that's bad, do they?
00:15:48.440 Bad in the sense that you won't want to read it.
00:15:51.220 Nobody's going to write a whole article about a book nobody's going to read.
00:15:54.440 So they're afraid of this book, clearly.
00:15:58.000 There wouldn't be a hit piece on it unless they were afraid of it,
00:16:01.200 in terms of it threatening the worldview which they present.
00:16:06.500 So what followed was Alex Epstein quite brilliantly
00:16:11.820 organized a number of Blue Check and other people on Twitter,
00:16:16.840 and maybe other social media, I'm not sure,
00:16:18.680 but on Twitter at least.
00:16:20.980 And many of us had a dog in the fight.
00:16:26.120 Right?
00:16:26.520 I certainly did.
00:16:27.840 To me, this was personal.
00:16:29.320 This was absolutely personal.
00:16:31.160 Right?
00:16:31.540 I mean, I know Alex Epstein just from, you know, digital contacts.
00:16:36.620 But to me, it was personal.
00:16:38.700 Because I have been the subject of hit pieces,
00:16:41.900 and there's not much you can do about it.
00:16:44.060 It's a pretty helpless feeling.
00:16:45.220 So the fact that he was taking the fight to the Washington Post
00:16:49.780 and found a way to actually get traction,
00:16:52.240 which is organizing enough people to embarrass them,
00:16:56.840 preemptively embarrass them.
00:16:58.340 The result was the article did come out.
00:17:00.760 It was delayed, how long?
00:17:03.800 Delayed a week, I think.
00:17:05.520 And when it did come out,
00:17:07.240 they had removed 90% of the bad content,
00:17:11.180 you know, the hit piece content,
00:17:13.020 including any allegation that when he was 18 years old,
00:17:20.340 he wrote something that somebody thought was racist.
00:17:23.000 If you looked at it, it wouldn't be.
00:17:24.540 Right?
00:17:24.740 It isn't.
00:17:25.660 But that's the way things get framed.
00:17:27.700 Completely gone.
00:17:29.300 The main thing that he was concerned about
00:17:31.420 was the, I would say, inappropriate allegation
00:17:35.460 that just didn't make sense in a good world.
00:17:38.120 Like, you would have to live in a bad world
00:17:41.240 for that kind of thing to exist, those accusations.
00:17:46.080 And he did what I've never seen anybody do.
00:17:50.260 He actually preemptively took out a hit piece.
00:17:54.720 That's pretty hard to do.
00:17:56.300 I mean, maybe some billionaire's done it
00:17:57.980 by money and threats or something,
00:17:59.900 but I've never seen a regular person,
00:18:03.140 you know, an author do it before.
00:18:04.720 That's a first.
00:18:05.360 And I wonder if there's any kind of,
00:18:07.540 is this telling us there's any kind of shift in power?
00:18:11.500 Because I've been telling you
00:18:13.500 that the power of the internet dads
00:18:17.920 or just the, you know,
00:18:19.320 the people who have some credibility on Twitter
00:18:22.380 is growing.
00:18:24.360 And it's pretty important.
00:18:26.800 I think this really is sort of a turning point
00:18:29.300 in understanding where power lies in society.
00:18:35.360 I'm going to tie together a few stories
00:18:39.280 in ways that will amaze you.
00:18:41.860 So keep in mind that story
00:18:43.220 about the press and about Alex Epstein.
00:18:48.200 So, because we'll circle back to some things.
00:18:51.180 So I've talked about this before,
00:18:52.580 but it's more relevant today.
00:18:53.860 Do you remember when PayPal was started?
00:18:57.720 PayPal had a little group of people
00:18:59.140 who went on, several of them,
00:19:01.200 to do bigger things than PayPal.
00:19:02.700 So, for example, Peter Thiel, I believe,
00:19:06.820 was one of the early financiers of Facebook.
00:19:11.180 So Peter Thiel saw the potential of Facebook really early.
00:19:15.640 Became a billionaire, in part because of that.
00:19:19.420 Reid Hoffman, who was part of that PayPal group,
00:19:22.540 he founded LinkedIn,
00:19:24.220 which is effectively, you know, an online resume,
00:19:27.740 but it's a social network as well.
00:19:29.140 So kind of weird that two of the PayPal people
00:19:32.160 would later go on to have, you know,
00:19:34.580 major influence on what became social media networks.
00:19:39.000 And now Elon Musk, one of the PayPal originals,
00:19:42.660 is buying into Twitter,
00:19:45.100 which, again, shows at least his appreciation
00:19:48.180 and understanding of social media
00:19:50.820 in a way other people don't.
00:19:52.780 Now, what are the odds
00:19:54.400 that three of the people from this one company,
00:19:57.440 which is often talked about as being special in some way,
00:20:01.760 have all had major influence
00:20:03.880 on social media platforms specifically?
00:20:07.280 Not just going on to do other unicorns,
00:20:10.900 but social media platforms.
00:20:12.140 And I ask, what is it that made PayPal work?
00:20:16.040 Because I've never understood
00:20:17.040 how the original digital money products
00:20:20.540 ever got anybody to trust them.
00:20:22.680 They weren't a bank.
00:20:23.840 They were a startup.
00:20:24.420 How does, like, a startup of nobodies
00:20:27.200 get you to, you know,
00:20:29.080 trust them with their money?
00:20:30.800 How did that ever happen?
00:20:32.440 Because the hard part wasn't the technology,
00:20:34.400 I'm guessing.
00:20:35.400 I'm guessing the hard part
00:20:36.340 was convincing people to use it.
00:20:38.580 How the hell did they do that?
00:20:40.540 I mean, really?
00:20:41.800 That's one of the most impressive untold stories,
00:20:45.700 or maybe it's in some book or something,
00:20:47.140 but I've never heard it.
00:20:47.840 So I say that because there's something extra
00:20:52.100 going on with all of those PayPal founders.
00:20:55.400 And what I mean is,
00:20:56.740 here's what it looks like from the outside, right?
00:20:59.600 So this is just my speculative outsider's view
00:21:02.780 of what it looks like.
00:21:04.300 It looks like they all learned
00:21:06.340 to engineer comprehensively.
00:21:10.640 And what I mean by that is,
00:21:12.160 people sometimes engineer a product,
00:21:14.120 but they don't engineer the human
00:21:16.540 who's using the product.
00:21:18.380 There's something about the interface
00:21:19.940 or the way it's used
00:21:21.020 or how it touches our minds
00:21:22.540 that's incomplete.
00:21:25.180 What's different about PayPal
00:21:27.120 and then Facebook, LinkedIn,
00:21:29.780 and now Twitter,
00:21:31.560 is that all of them have found an interface
00:21:33.560 that connects the product right to your brain.
00:21:37.280 Product directly connected to your brain.
00:21:39.500 What does that make you think of?
00:21:42.120 Elon Musk's other product, right?
00:21:44.260 Neuralink,
00:21:45.160 where you'll have actually chips in your head,
00:21:47.780 potentially, someday.
00:21:49.700 So connecting the product
00:21:51.200 directly to the brain
00:21:53.160 and treating it as though it's one system,
00:21:56.300 you're going to say,
00:21:57.660 well, everybody does that.
00:21:58.600 They all consider the human user
00:22:01.360 and they all consider the product.
00:22:03.420 But nobody does it like they do it.
00:22:06.180 Nobody has done it
00:22:07.300 in a way that you can't stop using the product.
00:22:10.020 Nobody's come close
00:22:13.560 to the effectiveness
00:22:14.480 that these folks
00:22:16.440 have in understanding
00:22:18.060 the brain-product interface.
00:22:21.140 Keep that in mind
00:22:22.300 because it's going to come back today.
00:22:29.360 Here's an interesting question
00:22:30.600 about Elon Musk
00:22:31.480 since he's in the news a lot.
00:22:32.600 Why do conservatives
00:22:33.820 like him so much?
00:22:35.480 Do you ever wonder about that?
00:22:37.120 Because I'm pretty sure
00:22:38.100 he's never labeled himself
00:22:39.180 a conservative.
00:22:41.060 I'm pretty sure
00:22:42.120 he's never labeled himself
00:22:43.720 a Republican.
00:22:45.860 What is it that makes people like him?
00:22:47.540 And if I told you
00:22:51.100 that there's somebody
00:22:51.800 who started the biggest
00:22:53.280 electric car company,
00:22:56.280 would you say,
00:22:57.180 well, there's somebody
00:22:57.760 conservatives are going to love
00:22:59.100 because he's all about
00:23:01.540 the green stuff
00:23:02.540 and climate change.
00:23:04.060 It doesn't really make sense,
00:23:05.140 does it?
00:23:06.540 But he's very much embraced.
00:23:09.280 Let me tell you
00:23:09.880 what I think it is.
00:23:11.820 And you can take this
00:23:13.360 just as a compliment
00:23:14.620 to conservatives
00:23:15.400 because it's one that I feel
00:23:16.820 because I feel the same thing.
00:23:18.680 I do not identify
00:23:19.860 as conservative,
00:23:21.540 don't identify as Republican,
00:23:23.480 and I feel that I'm
00:23:25.300 fully embraced
00:23:26.960 by conservatives
00:23:28.740 all the time.
00:23:30.400 And I think it's the same reason,
00:23:32.200 or at least there's
00:23:32.960 some similar reasons.
00:23:34.400 And it goes like this.
00:23:36.780 Let's just take
00:23:37.580 Elon, for example.
00:23:40.900 Oh, and also
00:23:41.400 he's an immigrant.
00:23:43.020 So every stereotype
00:23:45.060 you should imagine
00:23:46.200 about conservatives
00:23:47.660 or Republicans.
00:23:48.740 He sort of violates,
00:23:50.060 I don't know,
00:23:50.380 some of the biggest ones,
00:23:51.340 right?
00:23:52.240 And yet,
00:23:53.760 he's widely liked.
00:23:55.620 So here's what I think it is.
00:23:56.880 Number one,
00:23:58.400 I've never seen him
00:23:59.540 disrespect Republicans
00:24:01.320 or conservatives.
00:24:03.360 Agree or disagree?
00:24:05.300 I've never seen it.
00:24:07.140 Have you ever seen him
00:24:07.940 disrespect Republicans
00:24:10.080 or conservatives?
00:24:12.080 And you could say
00:24:13.020 almost everybody's done that.
00:24:15.360 Almost everybody's done that.
00:24:17.500 If they're on the left.
00:24:20.140 And if they're on the right,
00:24:21.440 even the people on the right
00:24:22.580 insult themselves.
00:24:25.020 Conservatives fight
00:24:26.080 with each other.
00:24:27.140 Right?
00:24:27.560 But I've never seen him
00:24:28.640 disrespect anybody
00:24:29.460 on the right.
00:24:30.560 Now,
00:24:30.880 has he ever disrespected
00:24:31.920 anybody on the left?
00:24:33.800 Well,
00:24:34.080 he's going after
00:24:34.720 wokeism a little bit,
00:24:36.020 hasn't he?
00:24:37.300 Right?
00:24:37.940 Yeah,
00:24:38.180 I feel like he has.
00:24:39.940 Now,
00:24:40.400 that doesn't mean
00:24:41.080 he's associated
00:24:41.840 with the right.
00:24:43.700 It could mean that
00:24:44.960 he's more bothered
00:24:46.040 by something he sees
00:24:47.020 somewhere else.
00:24:47.660 It doesn't mean
00:24:48.040 he's associated
00:24:48.660 with one side.
00:24:50.700 Right?
00:24:51.060 Bill Maher
00:24:51.740 talks more about
00:24:53.320 the problems
00:24:53.840 on the left lately,
00:24:55.020 but it doesn't make him
00:24:55.940 on the right.
00:24:58.720 Here's some other things.
00:25:01.460 He,
00:25:01.940 Musk,
00:25:02.720 as I just talked about
00:25:03.720 with the PayPal people,
00:25:05.040 he incorporates
00:25:05.880 human motivation
00:25:06.900 and psychology
00:25:07.680 into his systems.
00:25:09.540 Who does that sound like?
00:25:11.420 Somebody builds
00:25:12.180 a system,
00:25:13.340 and in his case
00:25:14.100 it's more like
00:25:14.560 a product that's
00:25:15.380 supported by
00:25:15.960 various systems,
00:25:17.380 but who is most
00:25:19.360 like that?
00:25:20.100 That's conservatives,
00:25:22.000 Republicans.
00:25:23.160 They consider
00:25:24.200 human motivation
00:25:25.180 when they design
00:25:25.980 a system,
00:25:26.540 which is,
00:25:26.980 well,
00:25:27.740 you know,
00:25:28.200 if we build a system
00:25:29.560 where we just
00:25:30.040 give you stuff,
00:25:31.160 what's the human
00:25:31.860 motivation?
00:25:32.900 Take the stuff
00:25:33.540 and not work.
00:25:34.720 All right?
00:25:34.900 So I think
00:25:36.640 conservatives
00:25:37.100 appreciate him
00:25:38.040 for understanding
00:25:39.200 that if you forget
00:25:40.220 about the human
00:25:40.840 motivation,
00:25:41.540 you get everything
00:25:42.040 wrong,
00:25:42.520 like Democrats
00:25:43.440 often do
00:25:43.980 with their systems.
00:25:47.840 The next thing
00:25:48.700 is that he's
00:25:49.120 transparent
00:25:49.760 about both
00:25:51.480 his thinking
00:25:52.360 process
00:25:52.840 and his
00:25:53.420 motivations.
00:25:55.120 Wouldn't you say?
00:25:56.540 Do you ever
00:25:57.140 think to yourself,
00:25:57.860 I think he has
00:25:58.340 a hidden agenda?
00:26:00.060 I guess people
00:26:00.660 say that about
00:26:01.200 all billionaires,
00:26:02.280 but it doesn't
00:26:02.720 feel like it.
00:26:03.800 It doesn't feel
00:26:04.660 like he has
00:26:05.020 some hidden agenda
00:26:05.780 because I feel
00:26:06.620 like he tells
00:26:07.080 you exactly
00:26:07.560 what he wants
00:26:08.280 and then he
00:26:09.320 does it in public.
00:26:10.920 It's pretty
00:26:11.740 clear,
00:26:12.880 right?
00:26:13.440 So people
00:26:14.000 like that,
00:26:14.620 just in general,
00:26:15.400 people like
00:26:15.820 transparency.
00:26:17.200 He shows
00:26:17.880 his work,
00:26:18.400 right?
00:26:18.960 And obviously
00:26:19.720 he likes
00:26:20.080 freedom,
00:26:20.940 so that
00:26:21.800 binds him
00:26:22.840 to the right,
00:26:23.720 even if he's
00:26:24.420 not associated
00:26:25.060 with the right,
00:26:25.460 he just likes
00:26:25.880 freedom
00:26:26.240 and free speech.
00:26:28.780 And he also
00:26:29.780 works harder
00:26:30.360 than most
00:26:30.740 people.
00:26:32.420 Have you
00:26:32.900 ever seen
00:26:33.320 a conservative
00:26:34.120 dislike somebody
00:26:36.280 who works
00:26:36.980 as hard
00:26:37.520 as Elon Musk?
00:26:39.260 You could do
00:26:40.000 almost everything
00:26:40.700 else wrong
00:26:41.420 from a
00:26:42.140 conservative's
00:26:42.840 point of view,
00:26:43.500 as long as
00:26:43.960 you're obeying
00:26:44.560 the law.
00:26:45.640 You have to
00:26:46.120 get that right.
00:26:47.020 But if you're
00:26:47.400 obeying the law
00:26:48.160 and you're
00:26:49.080 working 14
00:26:50.240 hours a day
00:26:50.940 or whatever
00:26:51.260 the hell
00:26:51.540 he was doing,
00:26:52.080 18 hours a day,
00:26:53.640 conservatives
00:26:54.120 kind of like you
00:26:54.940 because you
00:26:56.040 work hard.
00:26:57.200 It's pretty
00:26:57.700 basic.
00:26:58.540 It's not hard
00:26:59.200 to be liked
00:26:59.700 by the right.
00:27:01.200 And he has
00:27:02.400 lots of kids.
00:27:03.900 What does he have,
00:27:04.360 six or seven kids?
00:27:06.000 Conservatives like
00:27:06.680 that,
00:27:07.400 family-oriented
00:27:08.300 in his own way.
00:27:10.000 And I think
00:27:10.660 his family
00:27:11.480 situation is
00:27:12.220 completely
00:27:12.740 non-standard.
00:27:14.280 And still,
00:27:15.340 still,
00:27:15.940 conservatives
00:27:16.580 embrace him.
00:27:18.020 So here's the
00:27:18.620 message from this.
00:27:21.580 We're locked
00:27:22.380 into a world
00:27:23.140 in which we
00:27:23.680 think the only
00:27:24.200 way you can
00:27:24.660 run for office
00:27:25.520 is to be
00:27:26.620 totally one thing
00:27:27.600 or totally
00:27:28.040 the other thing.
00:27:28.660 You're either
00:27:28.940 a Republican,
00:27:29.700 or you're
00:27:30.180 a Democrat.
00:27:30.940 I think
00:27:32.780 Elon Musk
00:27:33.560 proves that
00:27:35.040 if you were
00:27:35.560 smart about it,
00:27:36.280 you wouldn't
00:27:36.540 have to be
00:27:37.080 on a team.
00:27:38.460 That you
00:27:38.860 could get
00:27:39.220 the other
00:27:39.560 team to
00:27:40.000 like you
00:27:40.540 easily.
00:27:43.160 Respect
00:27:43.600 them.
00:27:45.680 Give
00:27:46.200 some,
00:27:47.020 you know,
00:27:47.360 appreciation
00:27:47.840 to what it
00:27:48.540 is that
00:27:48.840 they want.
00:27:49.840 Like freedom.
00:27:51.280 Be transparent.
00:27:52.900 Outwork
00:27:53.340 them.
00:27:54.640 Right?
00:27:55.220 How hard
00:27:55.780 would it be?
00:27:56.860 You
00:27:57.060 can have
00:27:57.580 very different
00:27:58.140 opinions
00:27:58.560 from conservatives
00:27:59.440 and still
00:28:00.000 have a lot
00:28:00.520 of them like you.
00:28:00.940 A lot
00:28:01.360 of them
00:28:01.600 say,
00:28:01.860 you know,
00:28:02.640 damn it,
00:28:03.620 I don't
00:28:04.060 like all
00:28:04.440 your policies,
00:28:05.300 but I love
00:28:05.760 the way
00:28:06.000 you're
00:28:06.160 treating
00:28:06.420 everything.
00:28:07.420 Like,
00:28:07.640 I love
00:28:07.940 the way
00:28:08.140 you approach
00:28:08.620 it,
00:28:09.440 even if
00:28:09.820 I'm not
00:28:10.340 on the
00:28:10.520 same page
00:28:10.960 with your
00:28:11.260 solution.
00:28:12.340 You could
00:28:12.940 totally get
00:28:14.280 people to
00:28:14.820 switch sides
00:28:15.920 if you
00:28:16.360 played it
00:28:16.740 right.
00:28:16.980 All right.
00:28:20.080 Now,
00:28:20.640 I tweeted
00:28:20.960 the other
00:28:21.300 day,
00:28:21.720 and I
00:28:22.340 said I
00:28:22.860 didn't know
00:28:23.240 if Twitter
00:28:24.420 was already
00:28:25.060 changing its
00:28:25.760 algorithm to,
00:28:27.360 you know,
00:28:27.760 maybe sweep
00:28:28.360 things under
00:28:28.800 the rug
00:28:29.140 before Elon
00:28:29.900 Musk gets
00:28:30.540 a look at
00:28:30.920 the algorithm.
00:28:31.940 And I
00:28:32.380 wondered if
00:28:32.820 my engagement
00:28:34.060 and number
00:28:35.060 of followers
00:28:35.520 per day
00:28:35.960 was going
00:28:36.340 up because
00:28:36.980 of that.
00:28:37.880 But then
00:28:38.260 somebody
00:28:38.580 pointed out
00:28:39.080 that Elon
00:28:39.620 Musk had
00:28:40.780 liked one
00:28:41.440 of my
00:28:41.680 tweets,
00:28:42.500 which would
00:28:43.220 also explain
00:28:43.900 why my
00:28:44.440 number of
00:28:44.960 new users
00:28:45.420 went up.
00:28:46.080 So today
00:28:46.580 it's still
00:28:46.960 up over
00:28:47.500 a thousand
00:28:45.420 new users,
00:28:50.060 which is
00:28:50.360 about,
00:28:51.000 I don't know,
00:28:51.620 five or
00:28:52.420 ten times
00:28:52.980 normal.
00:28:54.820 But that
00:28:55.260 could still
00:28:55.880 be the
00:28:56.300 spillover
00:28:56.840 from the
00:28:57.220 one tweet.
00:28:57.960 So the
00:28:58.280 tweet he
00:28:58.620 liked was
00:28:59.300 this one.
00:29:00.400 And this
00:29:02.660 is really
00:29:03.060 telling.
00:29:04.080 This actually
00:29:04.640 should be the
00:29:05.180 biggest story
00:29:05.700 in the news,
00:29:06.880 but it's not.
00:29:07.980 The biggest
00:29:08.480 story in the
00:29:08.960 news should
00:29:09.300 be that Musk
00:29:10.460 liked this
00:29:11.760 following tweet
00:29:12.640 because it
00:29:13.660 suggests that
00:29:14.440 his mind
00:29:15.460 is at least open;
00:29:16.740 if he
00:29:17.000 hasn't
00:29:17.780 decided this,
00:29:17.780 at least
00:29:18.240 it's
00:29:18.960 compatible
00:29:19.420 with how
00:29:19.920 he's
00:29:20.100 thinking.
00:29:20.860 So here's
00:29:21.200 the tweet
00:29:21.540 he liked.
00:29:22.840 I said,
00:29:23.240 wait until
00:29:23.600 Elon Musk
00:29:24.160 starts looking
00:29:24.760 under the
00:29:25.160 hood at
00:29:25.620 Twitter and
00:29:26.540 finds out
00:29:27.100 how the
00:29:27.440 algorithm
00:29:27.800 works.
00:29:28.800 I said,
00:29:29.200 that's coming
00:29:29.860 and it is
00:29:31.000 going to be
00:29:31.480 glorious.
00:29:33.240 Now he
00:29:33.520 liked that.
00:29:34.980 Now,
00:29:35.580 does that
00:29:36.960 not suggest
00:29:37.860 that he
00:29:39.440 plans to
00:29:40.160 or would
00:29:40.600 like to
00:29:41.140 get access
00:29:42.820 to the
00:29:43.240 algorithm?
00:29:45.100 That's
00:29:45.620 everything.
00:29:46.540 That's like
00:29:46.960 the whole
00:29:47.240 game.
00:29:48.700 Civilization
00:29:49.180 will be
00:29:49.640 completely
00:29:50.100 changed if
00:29:51.280 he gets
00:29:51.620 access to
00:29:52.220 that
00:29:52.420 algorithm,
00:29:53.080 I think.
00:29:54.600 I think it's
00:29:55.240 that big.
00:29:56.520 And it
00:29:56.760 looks like,
00:29:57.840 by the fact
00:29:58.880 that he
00:29:59.120 liked the
00:29:59.540 tweet,
00:30:00.640 it looks
00:30:01.120 like he
00:30:01.520 at least
00:30:02.040 has some
00:30:03.080 impulse in
00:30:04.540 that direction.
00:30:05.680 Don't know
00:30:06.000 if he can
00:30:06.320 do it,
00:30:06.880 don't know
00:30:07.220 what will
00:30:07.520 come of
00:30:07.860 it,
00:30:08.840 but he
00:30:09.660 liked it.
00:30:11.180 So he's
00:30:11.940 not disagreeing
00:30:12.860 with the
00:30:13.140 notion that
00:30:14.360 him looking
00:30:14.900 at the
00:30:15.420 algorithm
00:30:16.360 would be
00:30:17.360 maybe
00:30:17.880 interesting.
00:30:20.100 We're also
00:30:20.740 hearing that
00:30:21.120 Twitter is
00:30:21.540 working on
00:30:22.000 an edit
00:30:22.400 button.
00:30:23.920 That makes
00:30:24.840 sense to
00:30:25.320 me in
00:30:25.900 the Elon
00:30:27.100 Musk era
00:30:28.220 of Twitter
00:30:29.040 because the
00:30:30.320 edit button
00:30:30.900 does add
00:30:32.500 some,
00:30:33.220 let's say,
00:30:33.620 clarity to
00:30:34.240 things,
00:30:34.600 doesn't it?
00:30:35.600 Now I do
00:30:36.280 think the
00:30:36.680 edit button
00:30:37.100 should be
00:30:37.500 constructed
00:30:38.000 this way.
00:30:39.400 If you
00:30:39.960 edit something,
00:30:40.820 it should
00:30:41.100 be shown
00:30:41.520 as an
00:30:41.940 edit and
00:30:43.100 you should
00:30:43.400 be able to
00:30:43.820 swipe that
00:30:44.380 edit and
00:30:44.960 see the
00:30:45.300 original.
00:30:46.100 It should
00:30:46.360 be that
00:30:46.700 easy.
00:30:47.520 It should
00:30:47.720 just be a
00:30:48.240 swipe if
00:30:49.460 you can
00:30:50.620 engineer it
00:30:51.140 easily.
00:30:51.880 Now I
00:30:52.100 don't even
00:30:52.440 want to
00:30:52.820 have it
00:30:53.220 be a link
00:30:53.740 because you
00:30:55.160 know how
00:30:55.480 Instagram
00:30:55.880 has the
00:30:56.560 swipe
00:30:57.040 model,
00:30:57.620 you can
00:30:57.820 tell
00:30:58.000 Amazon
00:30:59.020 does it
00:30:59.500 too,
00:31:00.060 where there's
00:31:00.520 little dots
00:31:01.220 at the
00:31:01.460 bottom of
00:31:01.820 the picture
00:31:02.200 so you
00:31:02.520 know there's
00:31:02.840 some more
00:31:03.140 pictures if
00:31:03.680 you just
00:31:03.960 swipe in
00:31:04.320 that
00:31:04.440 direction.
00:31:04.760 If you
00:31:07.200 could swipe
00:31:07.700 it and
00:31:07.980 see the
00:31:08.280 original,
00:31:09.240 that's a
00:31:09.680 good edit
00:31:10.160 to me.
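The versioned-edit idea he describes (an edit never overwrites; a swipe walks back to the original) amounts to a simple append-only data model. Here is a minimal sketch; all names (Revision, Tweet, swipe_back) are invented for illustration and are not Twitter's actual design:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class Revision:
    """One version of a tweet's text, kept so readers can swipe back."""
    text: str
    edited_at: datetime

@dataclass
class Tweet:
    revisions: List[Revision] = field(default_factory=list)

    def post(self, text: str) -> None:
        self.revisions.append(Revision(text, datetime.now(timezone.utc)))

    def edit(self, new_text: str) -> None:
        # An edit appends a new revision; the original is always preserved.
        self.revisions.append(Revision(new_text, datetime.now(timezone.utc)))

    @property
    def current(self) -> str:
        return self.revisions[-1].text

    def swipe_back(self, steps: int = 1) -> str:
        """Return an earlier revision, like swiping to see the original."""
        index = max(0, len(self.revisions) - 1 - steps)
        return self.revisions[index].text
```

Because edits only append, the "show it as an edit" part falls out for free: any tweet with more than one revision is visibly edited, and the swipe gesture is just an index into the list.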
00:31:11.720 I would
00:31:12.040 also like
00:31:14.400 a notification
00:31:15.240 sent to
00:31:17.160 everyone who
00:31:18.100 interacted with
00:31:19.020 it if it
00:31:19.680 gets edited
00:31:20.160 with one
00:31:22.000 exception.
00:31:22.940 If you go
00:31:23.680 to edit
00:31:24.020 your own
00:31:24.360 tweet, it
00:31:24.880 should give
00:31:25.200 you a
00:31:25.420 choice to
00:31:26.080 say are
00:31:26.420 you editing
00:31:26.920 for let's
00:31:27.760 say grammar,
00:31:29.100 might be a
00:31:29.720 better way to
00:31:30.440 express it,
00:31:31.040 or content.
00:31:33.320 In other
00:31:33.560 words, are
00:31:33.880 you changing
00:31:34.420 the content?
00:31:35.860 If you
00:31:36.220 say you
00:31:36.620 are changing
00:31:38.720 the content
00:31:39.280 for whatever
00:31:39.840 reason,
00:31:40.480 doesn't matter
00:31:40.900 the reason,
00:31:41.660 then everybody
00:31:42.240 who interacted
00:31:42.920 with it gets
00:31:43.480 a notice with
00:31:44.840 the new
00:31:45.100 content and
00:31:45.840 this is an
00:31:46.400 edit.
00:31:47.640 How about
00:31:47.960 that?
00:31:49.080 You can do
00:31:49.580 that, right?
00:31:51.300 Because the
00:31:52.040 problem is that
00:31:52.640 people see the
00:31:53.240 fake news and
00:31:54.340 they don't see
00:31:54.820 the correction.
00:31:56.160 It's one of
00:31:56.560 the biggest
00:31:56.840 problems on
00:31:57.360 the internet,
00:31:57.780 right?
00:31:58.320 But you
00:31:58.680 could fix
00:31:59.160 that by
00:32:00.120 making sure
00:32:00.620 that every
00:32:01.100 person who
00:32:01.640 saw bad
00:32:02.220 news that
00:32:03.480 got corrected
00:32:04.460 sees the
00:32:05.560 correction.
00:32:06.320 Just the
00:32:06.740 system does
00:32:07.260 it.
00:32:07.600 You interacted
00:32:08.260 with it,
00:32:08.620 you get the
00:32:09.000 correction.
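The notification rule he proposes (the editor declares whether the change is grammar or content; content edits fan out a notice to everyone who interacted) could be sketched like this. The names and signatures are assumptions for illustration, not a real Twitter API:

```python
from enum import Enum
from typing import Callable, Dict, Set

class EditKind(Enum):
    GRAMMAR = "grammar"   # wording cleanup: no notification needed
    CONTENT = "content"   # meaning changed: notify everyone who interacted

def apply_edit(
    tweet_id: str,
    new_text: str,
    kind: EditKind,
    interactions: Dict[str, Set[str]],   # tweet_id -> ids of users who liked/replied/retweeted
    notify: Callable[[str, str], None],  # (user_id, message) -> None
) -> None:
    """Store the edit, then fan out a correction notice for content edits."""
    # (persisting new_text is elided here)
    if kind is EditKind.CONTENT:
        for user_id in interactions.get(tweet_id, set()):
            notify(user_id, f"A tweet you interacted with was edited: {new_text}")
```

The point of the sketch is the asymmetry: grammar edits are silent, while a content edit guarantees that everyone who saw the original also sees the correction.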
00:32:11.860 Now,
00:32:13.180 immediately you
00:32:15.200 can see that
00:32:16.620 the edit
00:32:17.020 button is way
00:32:17.860 more than a
00:32:18.800 user interface.
00:32:20.520 Am I right?
00:32:21.680 As soon as I
00:32:22.340 said this,
00:32:22.980 did you suddenly
00:32:23.600 connect that
00:32:24.700 this isn't just
00:32:25.440 about how easy
00:32:26.760 it is to edit
00:32:27.580 things?
00:32:28.540 It's about
00:32:28.980 free speech.
00:32:29.700 It's about
00:32:31.000 mind control
00:32:31.840 because the
00:32:33.400 edit process
00:32:34.280 could allow people
00:32:35.740 to rethink
00:32:37.100 what they may
00:32:37.840 have believed
00:32:38.220 on first
00:32:38.780 look,
00:32:39.400 which the
00:32:39.980 current process
00:32:40.600 doesn't.
00:32:41.380 Right now,
00:32:41.840 you look at
00:32:42.220 something and
00:32:42.720 you never know
00:32:43.380 it's corrected.
00:32:44.180 You just
00:32:44.420 move on.
00:32:45.720 So this
00:32:46.140 gets to
00:32:47.320 transparency,
00:32:48.440 it gets to
00:32:49.640 brainwashing,
00:32:51.640 it gets to
00:32:52.060 fake news.
00:32:53.440 This is
00:32:54.140 deeply,
00:32:54.920 deeply
00:32:55.380 important.
00:32:55.840 And it's
00:32:57.580 happening
00:32:57.900 coincidentally
00:32:58.580 the same
00:32:59.000 time Elon
00:32:59.500 Musk buys
00:33:00.600 9.2%
00:33:01.740 of Twitter.
00:33:02.920 On April
00:33:03.720 1st,
00:33:04.400 Twitter said
00:33:05.820 it was working
00:33:06.400 on an edit
00:33:06.960 button literally
00:33:08.280 as an April
00:33:09.060 Fool's joke.
00:33:11.400 April 1st.
00:33:13.260 Was that not
00:33:14.760 too long ago,
00:33:15.540 right?
00:33:16.520 And suddenly,
00:33:18.140 Elon Musk
00:33:19.200 buys 9.2%
00:33:20.440 of it and now
00:33:20.940 it's real.
00:33:21.700 Now it's
00:33:22.040 suddenly real.
00:33:22.780 They're really
00:33:23.140 working on an edit
00:33:23.800 button.
00:33:25.040 I feel like
00:33:25.560 that might
00:33:26.020 have been
00:33:26.220 a recent
00:33:26.680 change,
00:33:27.160 right?
00:33:27.940 Who knows?
00:33:29.560 It's also
00:33:30.140 possible that
00:33:30.800 part of
00:33:31.140 Twitter thought
00:33:32.640 it was a joke
00:33:33.160 and part of
00:33:33.760 it was actually
00:33:35.120 working on it
00:33:35.940 and the other
00:33:37.220 part didn't
00:33:37.620 know about it.
00:33:38.240 That's possible.
00:33:42.200 All right.
00:33:44.660 So here's
00:33:45.620 how you fix
00:33:46.180 everything in
00:33:46.760 the world.
00:33:47.920 Let me tie
00:33:48.420 everything together
00:33:49.080 for you.
00:33:50.060 You buy
00:33:50.620 9.2% of
00:33:51.800 Twitter to gain
00:33:52.460 influence over
00:33:53.260 it.
00:33:53.800 You don't
00:33:54.380 need to
00:33:54.680 buy the
00:33:54.940 whole company.
00:33:55.620 You just
00:33:55.880 have to
00:33:56.120 buy enough
00:33:56.460 to get a
00:33:56.800 board seat
00:33:57.260 and influence.
00:33:58.400 Then you
00:33:58.840 use that
00:34:00.540 influence to
00:34:01.100 introduce
00:34:01.500 algorithm
00:34:02.120 transparency.
00:34:03.900 Let everybody
00:34:04.460 see the
00:34:04.920 algorithm and
00:34:05.640 maybe even
00:34:06.260 Jack Dorsey's
00:34:07.420 plan of
00:34:08.460 choosing your
00:34:09.020 own algorithm
00:34:09.840 based on
00:34:10.580 what you
00:34:11.060 prefer.
00:34:12.940 Once the
00:34:15.020 algorithm is
00:34:15.800 transparent,
00:34:17.580 you've created
00:34:18.500 the first
00:34:19.080 unbiased
00:34:19.800 platform.
00:34:21.860 Right?
00:34:22.420 It wouldn't
00:34:23.600 be biased
00:34:24.080 by the
00:34:24.420 algorithm.
00:34:25.360 People would
00:34:25.940 just be
00:34:26.220 getting what
00:34:26.580 they wanted.
00:34:28.140 Of course,
00:34:28.760 that could be
00:34:29.080 its own
00:34:29.360 bias, but
00:34:29.980 in theory,
00:34:30.680 you could
00:34:30.860 have a
00:34:31.180 more
00:34:31.460 unbiased
00:34:31.980 platform.
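The "choose your own algorithm" idea attributed here to Jack Dorsey amounts to making each ranking function a piece of pluggable, publicly inspectable code and letting the user pick one. A toy sketch, with all names and scoring formulas invented for illustration:

```python
from typing import Callable, Dict, List

TweetStats = Dict[str, float]  # e.g. {"likes": ..., "age_hours": ..., "follows_author": ...}
Ranker = Callable[[TweetStats], float]

# A public registry: each algorithm is a small, readable scoring function.
ALGORITHMS: Dict[str, Ranker] = {
    "chronological": lambda t: -t["age_hours"],
    "engagement": lambda t: t["likes"] / (1.0 + t["age_hours"]),
    "following_first": lambda t: t["follows_author"] * 1000 + t["likes"],
}

def build_timeline(tweets: List[TweetStats], choice: str) -> List[TweetStats]:
    """Rank a user's timeline with whichever algorithm they selected."""
    return sorted(tweets, key=ALGORITHMS[choice], reverse=True)
```

Transparency here is just the registry being open: anyone can read the scoring function their feed uses, and switching feeds is switching a key.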
00:34:33.500 And here's
00:34:34.460 why that
00:34:35.300 changes
00:34:35.640 everything.
00:34:37.120 Whiteboard
00:34:37.680 time.
00:34:39.580 It's
00:34:40.080 whiteboard
00:34:40.540 time.
00:34:42.880 Here's how
00:34:43.520 everything in
00:34:44.220 the world
00:34:44.540 works.
00:34:45.040 Twitter is
00:34:51.640 like the
00:34:51.980 user
00:34:52.240 interface to
00:34:53.100 the
00:34:53.280 simulation.
00:34:55.480 In other
00:34:55.920 words,
00:34:56.640 Twitter is
00:34:57.380 the lever
00:34:58.360 that ultimately
00:35:00.020 through its
00:35:00.820 connections
00:35:01.420 makes us
00:35:02.560 think the
00:35:02.980 way we
00:35:03.300 think and
00:35:03.940 imagine the
00:35:04.660 reality the
00:35:05.180 way we
00:35:05.520 imagine it.
00:35:06.100 And it
00:35:06.280 works this
00:35:06.680 way.
00:35:07.800 Journalists
00:35:08.300 are all
00:35:08.800 on Twitter.
00:35:10.480 When I
00:35:10.840 speak in
00:35:11.200 absolutes,
00:35:11.840 you can
00:35:12.080 adjust it
00:35:12.900 in your
00:35:13.160 head to,
00:35:13.620 well,
00:35:13.840 he means
00:35:14.140 most of
00:35:14.560 them.
00:35:15.040 So most
00:35:15.800 journalists
00:35:16.380 are on
00:35:16.720 Twitter.
00:35:18.120 And
00:35:18.340 journalists
00:35:18.920 are the
00:35:19.260 ones who
00:35:19.560 create the
00:35:19.940 narrative,
00:35:20.440 and the
00:35:20.660 narrative is
00:35:21.160 what programs
00:35:21.860 the citizens.
00:35:22.940 In other
00:35:23.300 words,
00:35:24.080 changes
00:35:24.460 everything.
00:35:25.820 If you
00:35:26.180 can program
00:35:27.080 the citizens
00:35:27.640 differently,
00:35:28.940 or better,
00:35:30.300 more effectively,
00:35:31.300 more honestly,
00:35:32.020 perhaps, you
00:35:33.040 get a whole
00:35:33.360 different outcome.
00:35:34.960 So now,
00:35:36.140 Elon Musk
00:35:36.920 now has
00:35:38.900 some kind
00:35:40.340 of control,
00:35:41.340 we don't
00:35:41.700 know how
00:35:42.020 much yet,
00:35:43.260 on Twitter.
00:35:43.920 That means
00:35:45.500 that he
00:35:45.960 will be
00:35:46.300 feeding
00:35:46.600 journalists
00:35:47.180 potentially.
00:35:49.080 Potentially.
00:35:49.800 We don't
00:35:50.120 know this
00:35:50.460 yet.
00:35:51.040 But he
00:35:51.800 could be
00:35:52.200 feeding them
00:35:52.720 for the
00:35:53.040 first time
00:35:53.620 something like
00:35:54.660 accurate news.
00:35:56.060 Imagine a
00:35:56.860 journalist who
00:35:58.100 is reporting
00:35:58.760 fake news on
00:35:59.660 television while
00:36:01.840 Twitter is
00:36:02.780 accurately allowing
00:36:04.080 people to see
00:36:04.820 what's true.
00:36:05.800 Because that's
00:36:06.540 not the
00:36:06.840 case.
00:36:07.160 The case
00:36:08.460 right now is
00:36:09.100 that people
00:36:09.420 watch
00:36:09.800 Democrat
00:36:10.620 news and
00:36:11.880 then they
00:36:12.180 go to
00:36:12.440 Twitter and
00:36:12.900 they see
00:36:13.220 Democrat
00:36:13.780 tweets.
00:36:15.200 So there's
00:36:15.860 nothing to
00:36:16.300 check their
00:36:16.720 work.
00:36:18.000 And it's
00:36:18.260 the same the
00:36:18.640 other way,
00:36:19.060 right?
00:36:19.320 Republicans
00:36:19.740 watch Republican
00:36:20.560 news and
00:36:21.140 then they go
00:36:21.540 watch Republican
00:36:22.220 tweets.
00:36:22.620 But what if
00:36:24.240 the algorithm
00:36:25.020 could be
00:36:25.480 tweaked so
00:36:26.700 that people
00:36:27.100 could actually
00:36:27.560 see reality
00:36:28.340 across bubbles?
00:36:30.660 Can Elon
00:36:31.400 Musk,
00:36:33.140 a member of
00:36:33.760 the PayPal
00:36:34.140 originals,
00:36:35.720 one of the
00:36:36.700 three people
00:36:37.420 we know
00:36:38.140 understands
00:36:39.320 human motivation
00:36:40.600 and how the
00:36:42.160 wiring of the
00:36:42.920 brain and the
00:36:43.660 wiring of the
00:36:44.280 device have to
00:36:45.380 be considered
00:36:46.440 one system in
00:36:47.640 a way nobody
00:36:48.200 else ever has
00:36:48.960 as effectively.
00:36:50.180 Can he
00:36:51.060 create a
00:36:52.380 situation where
00:36:53.100 journalists would
00:36:53.820 be, and here's
00:36:54.480 the key,
00:36:55.860 embarrassed,
00:36:57.840 embarrassed to
00:36:59.900 tell biased
00:37:00.480 stories on
00:37:01.060 television?
00:37:01.440 I think he
00:37:04.100 can.
00:37:04.920 I think that's
00:37:05.860 within the
00:37:06.460 doable range.
00:37:08.720 And once you
00:37:09.280 get a situation
00:37:09.920 where a
00:37:10.380 journalist can't
00:37:11.360 go on another
00:37:12.100 platform and
00:37:12.860 lie, because
00:37:13.840 they will be
00:37:14.340 devoured on
00:37:15.440 Twitter for
00:37:16.560 lying.
00:37:18.720 By their own
00:37:19.600 team, by the
00:37:20.280 way.
00:37:20.740 I'm not talking
00:37:21.320 about Republicans
00:37:22.540 yelling at,
00:37:23.560 you know,
00:37:24.740 Jim Acosta,
00:37:26.380 right?
00:37:27.180 I don't think
00:37:27.840 Acosta cares how
00:37:28.740 many Republicans
00:37:29.380 are mad at
00:37:29.960 him, but I'll
00:37:31.060 bet he cares
00:37:31.640 how many
00:37:32.000 Democrats hear
00:37:32.840 that his
00:37:33.220 story is
00:37:33.640 fake.
00:37:34.600 Am I
00:37:34.940 right?
00:37:35.980 Because that
00:37:36.340 would hurt.
00:37:37.600 So if you
00:37:38.000 can create a
00:37:38.460 situation where
00:37:39.440 the journalists
00:37:39.900 are embarrassed
00:37:40.540 into telling
00:37:41.160 the truth, and
00:37:42.580 you would only
00:37:43.000 need one major
00:37:44.120 platform to do
00:37:45.020 that, and
00:37:46.320 Twitter is the
00:37:46.800 one that
00:37:47.140 journalists are
00:37:47.760 pretty much
00:37:48.620 stuck on.
00:37:49.580 They're not
00:37:49.840 going to leave
00:37:50.160 Twitter.
00:37:51.160 That changes
00:37:51.860 the narrative,
00:37:52.500 that changes
00:37:52.900 the citizens,
00:37:53.660 the citizens
00:37:54.080 can change
00:37:54.940 anything.
00:37:56.680 And so this
00:37:57.360 story is way
00:37:58.600 bigger.
00:37:58.960 That Elon
00:38:00.960 Musk buys
00:38:01.720 9.2% of
00:38:03.180 Twitter is
00:38:04.340 way, way
00:38:05.980 bigger than
00:38:06.460 you think it
00:38:06.900 is.
00:38:07.840 And here's
00:38:08.380 the cool
00:38:08.800 thing about
00:38:09.280 it.
00:38:11.500 Musk believes
00:38:12.440 in the
00:38:12.740 simulation, or
00:38:14.500 at least he
00:38:15.520 talks about it a
00:38:16.120 lot.
00:38:16.380 We don't know
00:38:17.000 his internal
00:38:17.640 thoughts.
00:38:18.600 But he
00:38:19.080 talks about
00:38:19.520 the simulation
00:38:20.080 being the
00:38:21.200 most likely
00:38:21.840 explanation of
00:38:23.120 reality.
00:38:24.500 And I
00:38:24.940 also embrace
00:38:25.980 that same
00:38:26.400 idea.
00:38:27.440 And I
00:38:28.380 have this
00:38:28.900 impression that
00:38:30.920 when you
00:38:31.340 embrace the
00:38:32.040 idea that you
00:38:32.720 are living in
00:38:33.340 a simulation,
00:38:34.260 and that we're
00:38:34.680 literally software,
00:38:36.240 you can start
00:38:37.440 to see the
00:38:37.980 machinery.
00:38:40.220 And I
00:38:41.120 don't know how
00:38:41.620 much of it is
00:38:42.120 an illusion,
00:38:43.520 probably all of
00:38:44.680 it, but you
00:38:45.840 get the sense
00:38:46.600 that you can
00:38:47.280 start seeing
00:38:48.020 how to
00:38:49.480 reprogram it
00:38:50.220 from the
00:38:50.560 inside.
00:38:50.980 And that
00:38:52.380 feels like
00:38:52.880 what all
00:38:53.300 the PayPal
00:38:54.200 people can
00:38:54.840 see.
00:38:55.900 I feel like
00:38:56.460 they see
00:38:56.900 themselves in
00:38:57.540 the simulation,
00:38:58.280 but they can
00:38:58.640 also see the
00:38:59.320 code, and
00:39:00.340 they can reach
00:39:00.820 in and tweak
00:39:01.320 it.
00:39:02.500 Musk is
00:39:03.100 doing that
00:39:03.480 with Twitter.
00:39:05.100 The code
00:39:06.320 that holds
00:39:07.000 our illusion
00:39:08.100 of reality
00:39:08.620 together is
00:39:09.220 what we
00:39:09.620 collectively see
00:39:10.800 in the
00:39:11.020 media.
00:39:12.000 He figured
00:39:12.580 out how
00:39:13.800 to control
00:39:14.320 it through
00:39:15.480 Twitter.
00:39:16.660 Now, Jeff
00:39:17.340 Bezos took
00:39:18.740 over the
00:39:19.180 Washington
00:39:19.500 Post and
00:39:20.360 got a
00:39:20.960 big voice.
00:39:22.480 But the
00:39:22.740 Washington
00:39:23.060 Post is
00:39:23.740 lower in
00:39:29.540 the chain
00:39:30.340 of influence
00:39:30.940 than
00:39:31.200 Twitter.
00:39:32.180 Twitter
00:39:32.640 affects all
00:39:33.280 journalists.
00:39:35.020 Washington
00:39:35.340 Post just
00:39:35.820 has a few.
00:39:37.060 So whoever
00:39:37.780 controls Twitter
00:39:38.620 controls the
00:39:39.320 Washington
00:39:39.640 Post.
00:39:41.500 So if
00:39:42.200 you're watching
00:39:42.580 your billionaire
00:39:43.200 chess,
00:39:45.960 Elon Musk
00:39:47.300 just took
00:39:48.080 one of
00:39:48.540 Bezos'
00:39:49.180 pieces off
00:39:49.820 the board.
00:39:50.960 Because he's
00:39:51.760 now at a
00:39:52.200 higher level
00:39:52.740 of influence
00:39:53.440 than any
00:39:54.060 of the
00:39:54.320 organs below
00:39:54.960 it,
00:39:56.080 including
00:39:56.460 the
00:39:56.900 Washington
00:39:57.140 Post.
00:39:59.960 In other
00:40:00.720 news,
00:40:02.280 I asked on
00:40:03.500 Twitter if I
00:40:04.420 could get a
00:40:05.020 black American
00:40:06.480 cartoonist to
00:40:08.340 help me design a
00:40:09.740 black cast
00:40:10.800 member for the
00:40:12.040 Dilbert comic.
00:40:13.120 Now, I have
00:40:13.820 forever wanted to
00:40:14.620 have a more
00:40:15.200 diverse cast because
00:40:17.480 all the usual
00:40:18.240 reasons, right?
00:40:19.040 You want to
00:40:19.860 attract other,
00:40:21.980 you want more
00:40:22.840 of the public
00:40:23.780 to like your
00:40:24.300 product.
00:40:25.320 So why
00:40:25.740 wouldn't I put
00:40:26.420 people in
00:40:27.280 there that
00:40:27.620 would attract
00:40:28.140 more customers?
00:40:29.420 So of course I've
00:40:30.060 always wanted to
00:40:30.640 have more
00:40:31.080 diversity in the
00:40:32.220 cast and the
00:40:32.680 reason I didn't
00:40:33.160 do it is I
00:40:33.680 couldn't figure
00:40:34.040 out how.
00:40:35.680 I couldn't
00:40:36.000 figure out how.
00:40:37.100 Because if I
00:40:37.820 put diverse
00:40:39.260 characters in a
00:40:40.140 strip and give
00:40:41.520 them character
00:40:42.200 flaws, which is
00:40:43.580 you have to
00:40:44.160 because it's
00:40:44.660 a comic, if
00:40:45.720 the characters
00:40:46.200 don't have
00:40:46.600 flaws, they're
00:40:47.200 not funny.
00:40:48.640 Imagine me, I
00:40:50.000 have a character
00:40:50.660 right now,
00:40:51.180 Wally.
00:40:51.920 He's a white
00:40:53.380 character with
00:40:54.140 six strands of
00:40:55.020 hair.
00:40:56.000 Now his
00:40:56.320 defining characteristic
00:40:57.400 is that he's
00:40:58.120 lazy.
00:40:58.740 He doesn't do
00:40:59.240 work.
00:41:00.580 Could I have
00:41:01.200 introduced a
00:41:02.160 minority character
00:41:03.260 and given that
00:41:04.460 character any
00:41:05.660 kind of a
00:41:06.840 defect like the
00:41:09.440 Wally character?
00:41:10.940 Nope.
00:41:12.200 No way I could
00:41:12.840 get away with
00:41:13.260 that.
00:41:14.160 How about my
00:41:15.040 Alice character?
00:41:16.440 She's defined by
00:41:17.500 her easily
00:41:19.680 angered, kind
00:41:21.680 of tough
00:41:22.060 attitude.
00:41:23.080 Imagine if I
00:41:23.840 put that exact
00:41:24.680 attitude into a
00:41:25.960 minority character.
00:41:28.140 Suddenly I'm in
00:41:28.900 trouble, right?
00:41:29.820 But these are just
00:41:30.520 universal qualities
00:41:31.740 that everybody has
00:41:32.800 in every group.
00:41:34.200 There's no group
00:41:35.040 who doesn't have
00:41:35.600 an angry person,
00:41:36.420 a lazy person,
00:41:37.240 etc.
00:41:37.460 But because of
00:41:39.980 my situation and
00:41:40.960 the way the world
00:41:41.460 works, I can't do
00:41:42.940 the thing everybody
00:41:43.580 wants.
00:41:44.160 Like, you know,
00:41:46.140 not everybody, but
00:41:46.880 the world wants me
00:41:48.080 to be more diverse,
00:41:49.480 I think.
00:41:50.580 And I appreciate
00:41:52.080 the impulse.
00:41:53.600 Makes sense.
00:41:57.040 I'm seeing some
00:41:57.920 super racist things
00:41:58.920 in the comments,
00:41:59.720 but because they're
00:42:00.720 funny, they don't
00:42:01.840 bother me as much.
00:42:03.640 I've told you that
00:42:04.560 rule, right?
00:42:05.840 That you can be
00:42:06.520 pretty offensive if
00:42:07.520 you're also funny.
00:42:08.420 Like, people will
00:42:10.320 accept that balance.
00:42:11.940 But if it's not
00:42:12.700 funny, it's just
00:42:13.960 racist.
00:42:15.140 It's just racist
00:42:16.300 if it's not funny.
00:42:19.540 So, anyway, I guess
00:42:21.340 that was done well
00:42:22.020 in this case.
00:42:23.240 So, I started
00:42:25.980 writing for this
00:42:26.940 character because I
00:42:30.800 hit on an angle that
00:42:33.040 I think I can make
00:42:33.980 work.
00:42:35.100 And I'm going to
00:42:35.560 preview it for you.
00:42:36.780 So, before it
00:42:37.420 appears in Dilbert,
00:42:38.220 and I'm not sure it
00:42:38.920 will, I've already
00:42:40.160 written them, and
00:42:41.020 they will get drawn,
00:42:42.700 but they might not
00:42:43.620 run.
00:42:44.960 You know, by the
00:42:45.720 time my editor takes
00:42:46.720 a look at it and
00:42:47.480 somebody has a
00:42:48.160 conversation with me,
00:42:49.340 you know, they might
00:42:50.640 not run.
00:42:52.040 But here's what I'm
00:42:53.240 going to do.
00:42:53.600 I'm going to introduce
00:42:54.220 the black character,
00:42:55.040 and the character
00:42:57.740 is going to have
00:42:58.860 one interesting,
00:43:00.760 let's say,
00:43:01.700 personality
00:43:02.360 characteristic.
00:43:04.580 The black
00:43:05.460 character will
00:43:07.540 identify as white.
00:43:10.820 And it's going to
00:43:11.800 cause a huge
00:43:12.360 problem for
00:43:13.080 everybody.
00:43:14.480 Do you know
00:43:14.720 why it's a
00:43:15.020 problem?
00:43:15.860 Because he's
00:43:16.600 hired for the
00:43:17.880 diversity targets,
00:43:19.580 but he refuses to
00:43:20.920 identify as black.
00:43:22.680 He is black,
00:43:23.380 but he refuses to
00:43:25.500 identify.
00:43:26.340 And then here's the
00:43:26.980 second level.
00:43:28.580 You'll never know if
00:43:29.640 he's joking.
00:43:31.240 That's the second
00:43:32.080 level.
00:43:32.940 You don't know if
00:43:33.540 he's just fucking
00:43:34.080 with the boss,
00:43:35.120 because it makes
00:43:35.700 everybody uncomfortable.
00:43:37.260 So I think the
00:43:37.960 character will be
00:43:38.700 somebody who is
00:43:40.840 having fun with the
00:43:42.920 fact that people are
00:43:44.200 getting uncomfortable
00:43:45.000 with the way he
00:43:46.520 chooses to identify,
00:43:47.820 because it just
00:43:48.260 screws everything up.
00:43:49.320 Nothing works.
00:43:50.920 Right?
00:43:51.620 Now, I think I can
00:43:52.820 make that work.
00:43:55.800 Somebody says
00:43:56.460 cringe.
00:43:57.220 Does anybody else
00:43:57.880 think that?
00:43:59.400 So one comment
00:44:00.480 that says cringe,
00:44:02.020 I think it's
00:44:02.500 actually a good
00:44:03.060 angle.
00:44:05.920 Right?
00:44:06.380 Because here's the
00:44:07.140 thing.
00:44:09.920 You haven't seen
00:44:10.820 it before, as far
00:44:11.760 as I know.
00:44:12.440 It would be fresh,
00:44:13.520 which is weird.
00:44:15.420 Yeah.
00:44:17.820 All right.
00:44:19.260 Yeah, it's hard
00:44:20.060 to not pander,
00:44:21.540 isn't it?
00:44:21.900 That's the thing
00:44:22.800 I wanted to avoid.
00:44:23.520 From a creative
00:44:24.640 perspective, it would
00:44:25.700 be easy to pander,
00:44:27.220 and it would make
00:44:27.640 the audience happy,
00:44:28.640 but it's just too
00:44:30.900 icky.
00:44:32.320 I want to treat
00:44:34.340 the character
00:44:35.040 respectfully while
00:44:38.020 highlighting
00:44:40.040 something irregular.
00:44:42.360 I've told you before
00:44:46.560 that the sign
00:44:48.560 of a good
00:44:49.040 comedy approach
00:44:50.320 is if the
00:44:51.740 setup makes
00:44:52.980 you laugh.
00:44:54.080 You haven't even
00:44:54.780 heard the punchlines,
00:44:56.020 so the joke
00:44:56.840 isn't even present,
00:44:58.340 but you just hear
00:44:58.960 the situation,
00:45:00.260 and you're already
00:45:00.820 laughing.
00:45:01.980 So that's one of
00:45:02.660 these, where it's
00:45:04.120 easy to write to
00:45:04.920 because the situation
00:45:05.780 itself is fun.
00:45:07.620 That's your humor
00:45:08.940 tip of the day.
00:45:09.980 All right, here's a
00:45:11.340 question for you.
00:45:12.900 Given that we know
00:45:13.720 Democrats believe they
00:45:15.120 will lose in the
00:45:16.280 upcoming elections,
00:45:17.460 2022 and probably
00:45:18.860 2024, what are they
00:45:21.320 going to do about it?
00:45:23.020 It seems like they need
00:45:24.240 a nuclear option.
00:45:26.080 Am I right?
00:45:27.480 Because all of the
00:45:28.160 normal things that
00:45:29.220 Democrats do, they
00:45:30.320 don't really look like
00:45:31.220 they're going to work.
00:45:32.000 And I think the
00:45:33.600 nuclear option,
00:45:36.420 I'm not talking about
00:45:38.340 cheating in the
00:45:38.920 elections, I think the
00:45:40.320 nuclear option is going
00:45:41.380 to be a new hoax, and
00:45:43.220 my God, it's going to
00:45:44.020 be a big one.
00:45:45.320 So I'm going to
00:45:46.100 predict that the
00:45:48.160 next hoax will be
00:45:49.740 bigger than anything
00:45:50.380 we've seen, and just
00:45:52.440 crazier.
00:45:53.360 Because they've got to
00:45:54.200 get bigger than
00:45:54.780 Russia collusion,
00:45:56.280 January 6th
00:45:57.060 insurrection hoax,
00:45:58.100 the fine people
00:45:58.680 hoax, drinking bleach
00:45:59.700 hoax, the Russian
00:46:01.000 bounty in Afghanistan
00:46:02.540 hoax.
00:46:03.500 I mean, they have a
00:46:03.980 lot of hoaxes that
00:46:06.140 sort of set the
00:46:07.060 quality expectation
00:46:09.680 for hoaxes.
00:46:10.920 I feel like they've
00:46:11.880 got to take it up a
00:46:13.760 level.
00:46:14.440 And what the hell is
00:46:15.400 that going to look
00:46:15.940 like?
00:46:17.360 I mean, this is stuff
00:46:19.360 that Democrats
00:46:20.220 actually believed.
00:46:22.100 Democrats believed
00:46:23.040 all of this.
00:46:24.460 Like everything I
00:46:25.260 just listed, they
00:46:25.940 actually believed it
00:46:26.740 all.
00:46:29.120 Yeah, Democrats believed
00:46:30.980 that Black Lives
00:46:31.700 Matter was an
00:46:32.440 organic movement, and
00:46:34.380 it wasn't about
00:46:34.940 somebody funding
00:46:35.700 people.
00:46:37.660 They believed
00:46:38.340 everything.
00:46:40.140 Now, I'm not
00:46:40.920 saying that Republicans
00:46:42.280 don't also believe
00:46:43.240 things that aren't
00:46:43.840 true, but it's less
00:46:47.180 relevant to this
00:46:48.020 question, because the
00:46:49.700 Democrats need a
00:46:50.800 hoax, whereas the
00:46:52.580 Republicans don't.
00:46:53.740 Do you know what the
00:46:54.280 Republicans need to
00:46:55.320 win?
00:46:56.740 Play it as straight
00:46:58.040 down the center as
00:46:59.200 you can.
00:46:59.520 Just talk about
00:47:01.380 reality.
00:47:02.700 Reality is all the
00:47:03.740 Republicans need to
00:47:04.820 get elected, but the
00:47:06.340 Democrats can't get
00:47:07.260 elected based on
00:47:08.360 reality.
00:47:09.460 That option is gone.
00:47:12.280 I guess this is the
00:47:13.320 best way to say it.
00:47:14.560 Republicans can
00:47:15.440 definitely get elected
00:47:16.600 based on reality.
00:47:18.920 Right?
00:47:20.040 Anybody disagree?
00:47:21.420 That the polls and
00:47:23.040 just the facts we all
00:47:24.580 observe are just so
00:47:26.100 clear at this point
00:47:27.840 that all you need to
00:47:29.600 do is tell the truth.
00:47:31.220 Just be straight with
00:47:32.020 the American people.
00:47:32.900 How could you fail to
00:47:33.940 get elected at this
00:47:35.120 point?
00:47:36.480 Right?
00:47:36.780 But the Democrats
00:47:37.720 can't use transparency
00:47:39.720 because that would work
00:47:40.700 against them.
00:47:41.780 The only thing they have
00:47:43.600 is a bigger hoax.
00:47:45.320 It's all they have.
00:47:46.820 And ordinarily, it would
00:47:48.500 be hard to predict they
00:47:49.320 could outdo the Russia
00:47:50.300 collusion hoax, but they
00:47:52.060 have to.
00:47:52.920 They have to.
00:47:54.400 They don't have any
00:47:55.520 other play except losing,
00:47:57.420 I suppose.
00:47:58.900 So I predict that the
00:48:02.740 next hoax is going to be
00:48:05.000 magnificent.
00:48:06.680 Just magnificent.
00:48:08.280 China collusion hoax.
00:48:09.600 Yeah, maybe.
00:48:10.720 Maybe.
00:48:13.620 All right.
00:48:15.080 China's in lockdown.
00:48:16.360 23 cities.
00:46:19.020 So over 193 million people,
00:46:22.180 representing 22%
00:46:24.700 of China's GDP,
00:46:27.280 are in some form of lockdown,
00:46:28.340 according to brokerage
00:46:29.960 firm Nomura.
00:48:32.360 However, I love something
00:48:35.300 about this story, which is
00:48:36.460 the resilience of humans.
00:48:40.000 People are really damn
00:48:41.280 resilient.
00:48:42.620 Maybe not every one of us,
00:48:44.520 but collectively, my God,
00:48:46.740 we can adjust to stuff
00:48:47.900 fast.
00:48:48.320 Apparently, in China,
00:48:50.480 since there's lockdowns,
00:48:51.700 but they don't want the
00:48:52.440 banking system and
00:48:53.360 everything to collapse,
00:48:54.700 there are a bunch of
00:48:55.380 people who have put tents
00:48:57.180 next to their cubicles to
00:48:59.220 keep the banking system
00:49:00.260 open.
00:49:00.640 So they literally live on the
00:49:02.840 floor next to their cubicle
00:49:04.140 because they have to to keep
00:49:06.740 the country running.
00:49:07.440 And now, I think we would have
00:49:09.280 done that here.
00:49:10.020 I think in America, if you had
00:49:11.820 to, you know, if you didn't
00:49:13.200 have any other choice, I think
00:49:14.940 we would have pitched tents
00:49:16.540 next to cubicles, too.
00:49:18.220 But I just love how, I
00:49:21.540 don't know, spunky people are.
00:49:23.740 They'll do anything to keep
00:49:25.520 the system working.
00:49:28.160 Or work from home.
00:49:29.580 Yeah, I don't know that banks
00:49:30.820 always have a work-from-home
00:49:32.060 option because of security.
00:49:34.080 That may have changed,
00:49:35.420 though.
00:49:36.480 Somebody do a fact check on
00:49:37.820 me.
00:49:38.420 Back in the old, old days,
00:49:40.440 you sort of had to be on
00:49:41.740 site for some of the highest
00:49:43.240 security stuff.
00:49:44.940 Has that changed?
00:49:46.700 Maybe it's changed with better
00:49:47.940 security systems.
00:49:49.220 I don't know.
00:49:51.020 Banks work from home now.
00:49:52.420 But every job in the bank?
00:49:55.400 I feel as if there are secure ways
00:50:03.680 of logging in now, remotely.
00:50:06.180 There should be.
00:50:08.480 Yeah.
00:50:08.940 Okay.
00:50:09.540 Well, let's assume they could
00:50:10.440 have worked from home, so I
00:50:11.340 have no idea why they're
00:50:12.140 sleeping in tents.
00:50:14.600 So we learned today that the
00:50:16.220 Biden administration is giving
00:50:17.760 free phones to illegal
00:50:19.360 immigrants.
00:50:19.860 What's your first reaction to
00:50:23.420 that?
00:50:25.080 Free phones to illegal
00:50:26.580 immigrants.
00:50:29.020 My first reaction was I
00:50:30.520 thought it was a joke.
00:50:32.360 Did you?
00:50:33.280 The first time you saw it,
00:50:34.580 did you think it was real?
00:50:36.060 Well, they're not giving
00:50:36.720 phones to illegal immigrants.
00:50:38.700 Nobody would do that.
00:50:40.620 It's real.
00:50:41.760 It's real.
00:50:42.240 Now, does anybody remember an
00:50:45.140 idea I had about giving free
00:50:47.000 phones to immigrants?
00:50:49.120 Let's see if anybody can
00:50:50.780 remember that.
00:50:52.500 So if you've been following me
00:50:53.860 for a while, do you remember my
00:50:55.860 idea for giving free phones to
00:50:57.840 immigrants?
00:51:00.340 Yeah.
00:51:00.900 And what did Jen Psaki say was the
00:51:05.280 reason for the free phones?
00:51:07.380 To track them and also to allow
00:51:09.880 them to check back in.
00:51:12.280 Is that good enough?
00:51:15.340 Is that a good enough reason for
00:51:16.740 you that they can track them?
00:51:18.760 Well, they don't plan to deport
00:51:20.200 them, so what's the point?
00:51:22.300 Right?
00:51:22.640 If you're not going to deport
00:51:23.880 them, why do you even need to
00:51:26.580 track them?
00:51:27.720 Just to call them and say,
00:51:29.020 hey, really, I'll turn
00:51:30.820 off your phone if you don't show
00:51:31.860 up to the court appointment? I
00:51:33.760 think they would let their phone
00:51:34.740 be turned off, right?
00:51:35.940 So I don't know how it works the
00:51:37.340 way they've designed it.
00:51:38.660 Let me tell you how I suggested
00:51:40.780 designing it.
00:51:42.380 The way I suggested it is that the
00:51:45.240 only way an illegal alien could be
00:51:47.360 paid was if they had the phone
00:51:49.960 and it had tracking on and the
00:51:53.360 payments came digitally through that
00:51:55.380 phone, like it would have to be
00:51:56.700 that phone, no other phone.
00:51:57.920 Because if you said the only way
00:52:00.200 you could get paid for your work
00:52:01.560 is through the phone, maybe they'd
00:52:05.440 keep the phones.
00:52:06.960 And maybe that would be a way for
00:52:08.520 people to get temporary workers and
00:52:10.720 that the phone would act like a work
00:52:12.300 visa.
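The phone-as-work-visa idea described above could be sketched in code. This is a hypothetical toy model, not anything the administration proposed: the `Phone`, `Worker`, and `authorize_payment` names are all made up for illustration. The one rule it encodes is the one stated in the transcript: a worker can only be paid through their single registered phone, and only while tracking is on.

```python
from dataclasses import dataclass

@dataclass
class Phone:
    phone_id: str
    tracking_on: bool

@dataclass
class Worker:
    name: str
    registered_phone: Phone

def authorize_payment(worker: Worker, paying_phone_id: str, amount: float) -> bool:
    """Allow a payment only through the worker's registered, tracked phone."""
    phone = worker.registered_phone
    return (
        amount > 0
        and paying_phone_id == phone.phone_id  # must be *that* phone, no other
        and phone.tracking_on                  # tracking has to stay enabled
    )

alice = Worker("Alice", Phone("PH-001", tracking_on=True))
print(authorize_payment(alice, "PH-001", 120.0))  # True: right phone, tracking on
print(authorize_payment(alice, "PH-002", 120.0))  # False: a different phone

alice.registered_phone.tracking_on = False
print(authorize_payment(alice, "PH-001", 120.0))  # False: tracking turned off
```

The point of the design is the incentive: since turning off tracking (or losing the phone) cuts off pay, the worker keeps the phone on voluntarily, which is what makes it function like a visa.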
00:52:14.540 Right?
00:52:15.480 So now you need a Republican
00:52:17.420 administration to make this work.
00:52:19.340 So if there was somebody who had the
00:52:20.540 phone and they were just living here
00:52:21.840 illegally and they weren't working or
00:52:23.620 adding anything to the system, you
00:52:25.660 know, then the Republicans could say,
00:52:26.980 yeah, well, we know where you are and
00:52:28.260 they'd go get you because they're
00:52:29.260 tracking your phone.
00:52:30.740 Wouldn't be much of that, I suppose.
00:52:33.860 But I do think, so here's what I
00:52:35.620 think.
00:52:37.560 There's one thing I agree with AOC
00:52:39.580 about.
00:52:40.720 And I know you're going to hate this.
00:52:42.840 If there's anything that would trigger
00:52:44.220 you more than that, I don't know what
00:52:45.400 it would be.
00:52:45.740 There's one thing I agree with AOC
00:52:48.100 about when it comes to defund the
00:52:51.600 police and also the economy in general
00:52:54.700 and capitalism.
00:52:55.940 And you're not going to like it.
00:52:59.060 She thinks that maybe we should just
00:53:01.040 rethink the whole system from scratch.
00:53:03.420 Generally speaking.
00:53:04.320 I don't want to speak for her.
00:53:05.840 But I think she says we should rethink
00:53:07.300 capitalism, rethink the criminal justice
00:53:10.360 system.
00:53:10.720 Now, when she says that, what do we do
00:53:13.680 reflexively?
00:53:16.060 Her critics then say, well, if we're
00:53:18.400 going to rethink it, let's think of the
00:53:20.480 worst possible way to do it and then
00:53:22.400 blame her for it.
00:53:24.480 Which is, you know, what critics do.
00:53:26.740 Okay, yeah, we'll rethink capitalism in
00:53:28.700 the worst possible way that could never
00:53:30.460 work and we'll say AOC came up with it.
00:53:32.700 Or we'll just get rid of the police and
00:53:35.200 blame AOC for whatever happens.
00:53:37.060 That's not exactly re-engineering the
00:53:41.000 justice system.
00:53:41.800 That's just getting rid of police.
00:53:43.880 That's no plan.
00:53:46.160 So here's what I like about the phones
00:53:48.380 as a, let's say, the seed of something
00:53:51.240 that would be better than border control.
00:53:55.040 Maybe the phones are your immigration
00:53:56.760 control.
00:53:58.540 Could you design a system where the phone
00:54:01.120 became indispensable to the illegal,
00:54:05.060 let's call them, immigrants, which would
00:54:07.720 actually be legal because we would
00:54:09.940 develop the system for them to come into
00:54:11.660 the country and add to our system.
00:54:17.160 So if money is coming through the phone,
00:54:19.440 it means there's some employer that wants
00:54:21.000 them.
00:54:21.860 That means they're adding to the system,
00:54:24.300 at least in that sense.
00:54:26.000 And maybe there's some way to have, you
00:54:28.400 know, massive inflows and outflows of
00:54:31.500 people coming in seasonally.
00:54:32.660 For example, for farm work.
00:54:34.620 They come in, they help us, we help them.
00:54:37.620 They take their money back to Mexico.
00:54:39.580 Mexico gets more money.
00:54:41.100 Good for them, right?
00:54:43.080 So if you were going to rethink
00:54:45.480 immigration completely, could there be a
00:54:49.400 way that the phone idea sort of binds
00:54:54.400 them to the country in a productive way?
00:54:56.440 I don't know.
00:55:00.080 But I think it's such a provocative idea
00:55:03.100 that I'm sort of attracted to it, even
00:55:05.680 though I don't know how you do it
00:55:07.060 exactly.
00:55:07.800 It just feels like there's something
00:55:08.980 there.
00:55:09.540 I don't know.
00:55:10.060 Maybe not for every immigrant, but for
00:55:12.180 some who are coming in for work
00:55:14.540 specifically, not the criminals,
00:55:16.480 obviously.
00:55:18.860 Have you ever thought of this, that the
00:55:20.640 only thing that holds our country
00:55:21.740 together is racism?
00:55:22.780 How do you feel the first time you
00:55:28.480 hear that?
00:55:29.040 The only thing holding America
00:55:30.640 together is racism.
00:55:32.300 Do you know why?
00:55:34.140 There's a reason.
00:55:35.980 See if you can come up with it before
00:55:37.800 I tell you.
00:55:38.780 Racism is the only thing that holds the
00:55:40.380 country together.
00:55:42.140 Here's the reason.
00:55:44.120 If you didn't have racism, we would
00:55:46.460 realize that it's a case of the rich
00:55:49.080 against the poor, and that all the poor
00:55:51.260 people are on the same team, and
00:55:53.200 there's a shit ton of them.
00:55:54.920 There's a lot of them, and they're
00:55:56.500 all on the same team.
00:55:58.140 Do you know why they don't know
00:55:59.240 they're on the same team?
00:56:01.820 Because of racism.
00:56:03.240 They think they're on a race team.
00:56:05.340 They're not.
00:56:06.860 Not really.
00:56:08.400 If all the poor people said, hey,
00:56:10.040 you know, I get that there's, you
00:56:12.340 know, differences and discrimination.
00:56:14.360 I get all that.
00:56:15.120 That's real.
00:56:16.500 But what if we poor people just all
00:56:18.320 banded together and said, give us some
00:56:20.320 shit, you rich people, we want to take
00:56:22.360 your stuff, because we have enough
00:56:24.540 votes.
00:56:25.860 The only thing that keeps the country
00:56:27.320 together is that the rich have found a
00:56:29.280 way, through the media, to convince
00:56:32.300 people that their main filter should be
00:56:35.260 race.
00:56:37.060 As long as their main filter on life is
00:56:39.120 race, they're oblivious, just like a
00:56:42.080 magic trick.
00:56:43.020 It's a distraction, so you don't see
00:56:44.460 how the trick is done.
00:56:45.600 The magic trick is it's always been
00:56:47.000 about money.
00:56:48.320 Dave Chappelle says exactly what I'm
00:56:50.000 saying, except not exactly what I'm
00:56:52.680 saying.
00:56:53.580 He uses different words.
00:56:55.080 But the same message, that racism is a
00:56:58.080 distraction from the actual, you know,
00:57:01.180 the actual inequality of our time, which
00:57:03.160 is income or wealth.
00:57:08.060 All right.
00:57:08.880 There's a weird story about some two
00:57:11.780 people posing as federal agents and
00:57:14.740 giving gifts to a number of secret
00:57:16.280 service officers, including like
00:57:19.160 apartments and high-powered weapons
00:57:21.860 and spy technology and stuff.
00:57:25.160 And I'll just read the names of the two
00:57:29.680 individuals who were captured, pretending
00:57:32.880 to be members of Homeland Security.
00:57:35.900 You can make your own racist judgments about
00:57:40.720 who they might be working for.
00:57:42.360 One of them is named Arian
00:57:45.940 Teherzadi, and the other is
00:57:48.760 Hader Ali.
00:57:51.120 Now, if these two are working for any
00:57:55.600 foreign country, I think that would
00:57:57.900 narrow it down to probably a Middle East
00:57:59.440 country.
00:58:02.900 Iran, I'm looking at you, but I don't
00:58:04.620 know for sure.
00:58:06.300 I'm not even sure if those names specify
00:58:09.020 the region, do they?
00:58:09.940 Could you tell the region from the last
00:58:12.080 names?
00:58:13.980 I don't know if you could do that.
00:58:16.600 And we don't know if they're, I mean,
00:58:18.420 they could just be Americans.
00:58:19.840 So, you know, don't make any racist
00:58:22.000 assumptions about their allegiances or
00:58:24.780 their nationalities by their last names.
00:58:28.300 Can we all agree with that?
00:58:31.020 I think we're all adult enough to know
00:58:33.800 that the last names are a red flag, but
00:58:36.700 we're also adult enough not to just
00:58:38.800 assume that, you know, race is telling
00:58:41.860 you something, right?
00:58:42.660 Race doesn't tell you something.
00:58:44.340 It might tell you where to look, and I
00:58:47.000 guess that's racist enough, but it doesn't
00:58:49.780 tell you what their motivations are or
00:58:51.720 where they came from.
00:58:53.100 It doesn't even tell you they're Islamic.
00:58:54.640 All right.
00:58:58.560 So we'll wait and see what's what on that.
00:59:01.740 All right.
00:59:01.900 That, ladies and gentlemen, brings us to
00:59:05.080 the conclusion of the best live stream
00:59:07.240 that's ever happened.
00:59:08.860 And I hope you enjoyed this breathtaking
00:59:11.620 romp from artificial intelligence, which I
00:59:15.320 haven't mentioned yet, but I'm going to
00:59:16.720 now, through the simulation, through politics,
00:59:20.380 through it all.
00:59:21.620 Let's talk about AI.
00:59:24.360 Have you seen the new AI art that's being
00:59:27.820 created?
00:59:29.260 Apparently, there is AI that can make you a
00:59:31.860 perfect image just by describing it in words.
00:59:36.440 So if you say, for example, a monkey doing
00:59:39.240 his taxes, which was one of the actual
00:59:41.240 examples, the AI will draw you a picture
00:59:44.600 of a monkey, not a picture, but like it
00:59:47.820 looks real, like an actual monkey, over a
00:59:50.740 piece of paper, like struggling over his
00:59:52.480 paperwork.
00:59:54.100 You just have to describe it, and it draws
00:59:56.080 the picture.
00:59:57.180 And some of them are just like crazy, like
01:00:01.720 your mind is blown, and they're actually
01:00:03.280 pleasant to look at.
01:00:04.760 So now imagine the AI can do art, because
01:00:07.820 it is.
01:00:08.440 It is doing art.
01:00:09.740 And now imagine that the AI can rapidly
01:00:12.120 test which of these two pictures is better.
01:00:15.840 It just has a website, or maybe it's got some
01:00:18.740 people who do this for it, for money or
01:00:20.600 whatever.
01:00:21.260 And they just respond.
01:00:22.680 They'll get a notice, say, oh, do you like
01:00:24.320 the one on the left or the right?
01:00:25.940 They go, ah, the left.
01:00:27.460 That's all they do that day.
01:00:28.580 Just one person, one click.
01:00:30.400 But there are lots of people doing lots of
01:00:31.920 clicks.
01:00:32.940 So AI immediately goes through a bunch of
01:00:36.480 iterations and finds out the best piece of
01:00:38.660 art.
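The left-or-right voting loop described above is just a pairwise-preference tournament, and it can be sketched in a few lines. Everything here is a stand-in: `generate_candidates` fakes the image generator with labeled strings, and `human_vote` fakes the one-click voter with a fixed shared taste (higher-numbered image wins), purely so the loop is runnable.

```python
import random

def generate_candidates(n: int) -> list[str]:
    # Stand-in for an AI image generator producing n candidate images.
    return [f"image_{i}" for i in range(n)]

def human_vote(left: str, right: str) -> str:
    # Stand-in for one person clicking left or right. Here voters all
    # prefer the higher-numbered image, so the winner is deterministic.
    return max(left, right, key=lambda img: int(img.split("_")[1]))

def tournament(candidates: list[str]) -> str:
    # Pair candidates off and keep each pairwise winner, iterating
    # until a single best candidate survives.
    pool = list(candidates)
    while len(pool) > 1:
        random.shuffle(pool)
        winners = [human_vote(pool[i], pool[i + 1])
                   for i in range(0, len(pool) - 1, 2)]
        if len(pool) % 2:          # odd one out gets a bye this round
            winners.append(pool[-1])
        pool = winners
    return pool[0]

best = tournament(generate_candidates(8))
print(best)  # image_7 under this fake preference
```

With real voters the preferences are noisy and inconsistent, so production systems tend to aggregate many clicks per pair rather than trust a single comparison, but the shape of the loop is the same: generate, compare, keep the winner, repeat.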
01:00:39.420 You don't think artists are in trouble?
01:00:41.220 As an artist, let me tell you that art is
01:00:45.540 mostly a formula.
01:00:47.660 It doesn't seem like it, because you don't
01:00:49.500 know how to do it.
01:00:51.880 I hate to tell you that.
01:00:53.860 But the reason you think art is mysterious, and
01:00:56.700 there's some magic to the creative process, is
01:01:00.720 because you don't do it.
01:01:02.400 If you do it, you see the machinery.
01:01:06.560 Right?
01:01:06.600 Do a Google search on the six dimensions of
01:01:10.620 humor.
01:01:11.200 It's literally a formula for knowing that the
01:01:14.760 humor is there or not.
01:01:16.160 Like, I've just turned it into a formula, and I
01:01:18.260 use it.
01:01:19.420 So, and everything else that I've done that has
01:01:22.900 any creative element to it, eventually, you
01:01:26.100 know, sometimes I start out intuitively, but
01:01:29.220 eventually I can see the machinery.
01:01:31.720 And I go, oh, I see what's happening now.
01:01:33.680 When I follow this formula, it works.
01:01:35.360 When I follow this formula, it doesn't.
01:01:36.840 So that's the machinery.
01:01:38.760 So, yeah.
01:01:39.740 AI is going to take over art.
01:01:42.560 And it's going to influence your social media
01:01:45.980 algorithms and everything else.
01:01:48.260 It's all connected.
01:01:49.760 And if you had to have somebody who sort of
01:01:52.840 has their finger on the user interface for
01:01:55.700 reality, somebody who knows AI, somebody who
01:01:59.800 knows social networks, somebody who knows how
01:02:02.180 the human mind is interfacing with products, somebody
01:02:06.340 who's so clever about these things, he or she might
01:02:10.020 not even need a marketing department.
01:02:13.800 That's kind of who I want to have their finger on
01:02:15.900 that lever.
01:02:16.340 So I think we might be in good shape thanks to good people
01:02:21.640 trying to make the world better and maybe making some
01:02:24.400 money at the same time.
01:02:25.660 Maybe making some money at the same time.
01:02:27.940 Perfectly acceptable as long as it's transparent and as
01:02:31.760 long as, you know, he has the country's best interests in
01:02:36.400 mind and it looks like he does.
01:02:37.720 So, on that note, on that note, I'd like to say goodbye to
01:02:47.640 the YouTube people.
01:02:48.400 I'm going to talk to the locals people a few more minutes and
01:02:50.860 I think you'd agree.
01:02:55.900 I think you'd agree.
01:02:57.680 This is the best live stream you've ever seen.
01:03:00.860 Bye for now.