Real Coffee with Scott Adams - April 20, 2022


Episode 1719 Scott Adams: Practical Solutions, Climate Change, Fake News, Poverty, Systemic Racism


Episode Stats

Length

59 minutes

Words per Minute

154.1

Word Count

9,158

Sentence Count

675

Misogynist Sentences

19

Hate Speech Sentences

16


Summary

Solutions for nuclear power, climate change, and nuclear meltdowns, and the future of the nuclear industry. Today's episode is a mashup of a few of my favorite solutions to some of the world's most thorny problems.


Transcript

00:00:00.000 Good morning, everybody, and welcome to not only the highlight of civilization and the
00:00:08.380 best day of your life, but today I'm going to solve some of civilization's most thorny
00:00:14.700 problems.
00:00:16.320 Does that sound like hyperbole?
00:00:18.940 Well, interestingly, it's not.
00:00:23.220 Now, I'm not saying that I'm making up all these solutions myself.
00:00:26.560 In one case, I'm just going to tell you about a solution maybe you didn't know about, but
00:00:32.400 in a few other cases, I'm actually going to solve the biggest problems in the world.
00:00:38.920 You don't think I can do it?
00:00:41.660 Well, doubters, stay around, but wouldn't you like to take it up a notch for this special
00:00:48.100 holiday?
00:00:49.360 It's April 20th, and on April 20th, do we do the simultaneous sip?
00:00:54.340 Well, optionally, you could, but for this day only, in celebration of the specialness
00:01:02.060 of it, we will be doing the simultaneous whatever, today only.
00:01:08.880 So when we get to that part, whatever.
00:01:11.900 And all you need is a cup or mug or a glass, or whatever, a tank or chalice or stein, a canteen,
00:01:18.680 jug or flask, a vessel of any kind, fill it with your favorite liquid, or whatever.
00:01:25.300 And join me now for the unparalleled pleasure.
00:01:28.220 It's the dopamine hit of the day.
00:01:30.500 It's the thing that makes everything better.
00:01:33.140 It's called the simultaneous whatever, and it happens now.
00:01:39.620 Go.
00:01:44.800 Oh, wow.
00:01:47.600 So good.
00:01:49.860 Now, are you ready?
00:01:51.600 Are you ready for me to solve or tell you the solution?
00:01:55.940 I take no credit for this whatsoever.
00:01:59.060 This is just something that's in the news.
00:02:02.500 Climate change.
00:02:03.960 We're going to knock that one out first.
00:02:05.520 You ready?
00:02:06.060 Solution to climate change.
00:02:08.480 The company Rolls-Royce does more than make cars.
00:02:13.840 They have a nuclear division, and their nuclear division says that by 2024, they expect that,
00:02:20.220 at least in the UK, they'll have regulatory approval for factory-built mini-nuclear power
00:02:28.640 plants.
00:02:30.300 And what this means is they will have solved the economics of safe nuclear power by 2024.
00:02:42.060 Now, of course, it would take a while for the United States to do something like it.
00:02:46.600 But what they will have done is proven the model.
00:02:50.880 Once they prove the model, then other companies presumably will follow the model.
00:02:55.920 And the model is this.
00:02:57.220 The problem with building a nuclear power plant is that they're all one-off.
00:03:01.820 Every design uses stuff that exists, but the total design is sort of new every time
00:03:08.640 because there's always an element added and that sort of thing.
00:03:12.200 But suppose every nuclear power plant were exactly the same, and they were small enough
00:03:19.780 so that the components of them could be built in the factory, and every one would be the
00:03:25.260 same.
00:03:26.100 So once you've approved this kit, if you will, these parts that one factory makes so they
00:03:33.100 can be put together in the field, assembled and then transported because they'd be small
00:03:38.320 enough, you can get your economics of mass production down, you can get your approval
00:03:44.380 cycle, which is sometimes the biggest problem, way down, you can move them to places that are
00:03:49.760 underserved because they're small, you could probably put them in areas that would be, let's
00:03:56.840 say, less of an exclusionary zone just because people are still afraid of anything nuclear.
00:04:01.660 But if it's a small one, I don't know, I'm just guessing, this is just speculation, but
00:04:08.040 wouldn't that be less scary?
00:04:10.600 If you said to somebody, here's this giant domed building that looks dangerous just from
00:04:17.980 a distance with a big stack and there's some kind of steam coming out, you see one of those
00:04:23.020 things and you think, if that baby blows, that's going to take out the whole state.
00:04:27.180 It's not true, but when you look at it, it looks like it is a bomb, right?
00:04:33.340 That big dome, doesn't it look like it sort of looks like a bomb?
00:04:38.600 I mean, it doesn't look like a building, does it?
00:04:41.240 But when you make the small ones, you can actually make them architecturally cute.
00:04:47.640 They're actually fun to look at.
00:04:49.100 They don't look like anything.
00:04:50.780 You know, they're just their own little thing.
00:04:52.060 And imagine, if you will, the psychology of the people who have to, let's say, live with
00:04:59.600 these dotted around the landscape, if they see these cute little ones and you say to
00:05:05.040 them, hey, this is the new technology, it's a technology that has never melted down, which
00:05:10.300 would be true of the current versions.
00:05:12.700 The current version of nuclear has never melted down.
00:05:15.600 It's only the earlier versions that have ever had an incident.
00:05:19.040 So, if you told the story as if it's new, hey, people, do you remember that old, dangerous
00:05:27.720 nuclear stuff we had?
00:05:29.400 There were these big power plants, they even looked like bombs.
00:05:32.540 Most of them, or a lot of them, not most of them, a lot of them were the old technology
00:05:36.380 and some of them had some problems.
00:05:38.380 Well, did you know that the new technology that they're putting in these little cute ones
00:05:42.440 has never had a problem?
00:05:44.140 Well, I mean, I guess theoretically they could, but never happened.
00:05:49.920 Hasn't happened yet, hasn't happened in any country, hasn't happened for Michael Shellenberger,
00:05:55.400 give me help here, decades, I don't know.
00:05:59.020 They've been around a long time, just not in these tiny forms.
00:06:02.260 So, if you could solve the economics, which it looks like if they get approval, it's kind
00:06:10.040 of solved, at least on the concept level, meaning that that model could be cloned.
00:06:17.100 And then if you get the regulatory thing, which is part of the economics, and then if you get
00:06:22.880 the psychology right by making them smaller and say, oh, this is the new stuff, you're
00:06:29.040 thinking of the old stuff.
00:06:30.120 Man, we wouldn't put that anywhere.
00:06:32.300 Yeah, nobody wants to touch the old stuff.
00:06:34.200 But the new stuff, and the little cute buildings, oh, you definitely want those.
00:06:39.140 Everybody wants those.
00:06:40.120 It's good for climate change.
00:06:41.400 By the way, it's bipartisan.
00:06:43.260 Oh, it's not even an issue of Democrat versus Republican, because it actually wouldn't be.
00:06:49.240 It wouldn't be.
00:06:50.420 It would actually be bipartisan.
00:06:52.980 Even Congress is bipartisan on nuclear.
00:06:55.460 So, if they can really pull this off by 2024, and maybe they're off by a year or two, it's
00:07:04.640 in plenty of time to get us out of a catastrophic situation.
00:07:11.180 And that's not even counting the fact that these could be hooked up to scrubbers.
00:07:18.980 You know, once you get the cost of nuclear down, let's say you take it down by a factor
00:07:24.840 of 90%.
00:07:25.180 Could you get down to the point where you would have your dedicated scrubber that would suck
00:07:32.000 the CO2 out of the air, have it just connected to one tiny nuclear plant, and then suddenly
00:07:40.120 the economics of scrubbing the air turn positive?
00:07:43.960 Because the biggest cost is the energy.
00:07:47.460 So, if you solve the cost of producing the clean energy, tiny nuclear plant, you can suck
00:07:53.320 the hell out of the CO2.
00:07:54.820 You can suck it right out of the air.
00:07:56.700 So, you could have tons of suckers, like you could build a whole desert full of suckers
00:08:02.840 with one nuclear power plant and just suck the entire CO2 out of the country.
00:08:08.380 I don't know if it works that way.
00:08:09.820 If you sucked all the CO2 out of your zip code, would it attract CO2 from other zip codes?
00:08:18.380 Because it's not like air.
00:08:19.760 It's not like a vacuum.
00:08:21.720 Would that even work?
00:08:22.860 Or would you have to put all of your suckers in geographically dispersed places?
00:08:28.380 It would work a little bit, of course.
00:08:30.660 I mean, because the CO2 isn't going to know to stop at the border of the zip code.
00:08:35.540 But would it work well enough?
00:08:36.940 Or would you really have to spread them out?
00:08:38.400 I don't know.
00:08:39.040 Maybe we don't know that.
00:08:41.660 All right.
00:08:41.980 So, the first claim I believe I've delivered on, and by the way, the design for these little
00:08:47.780 nuclear plants is based on military engines.
00:08:52.240 Military nuclear engines have existed for a long time.
00:08:59.780 It's such a well-known technology.
00:09:03.840 So, have I delivered on my first claim?
00:09:08.600 And again, it's not me doing something.
00:09:10.520 I didn't do anything.
00:09:11.860 But is my first claim not persuasive?
00:09:15.960 That climate change looks a little bit solved, doesn't it?
00:09:20.740 Not as in solved in the past, but we now have a very clear, practical way to get everything
00:09:28.260 we want.
00:09:30.800 We did it.
00:09:32.520 Now, there's a lot more work, but there's nothing in the way.
00:09:36.400 Nothing.
00:09:37.080 There's literally nothing in the way.
00:09:38.900 Because follow the money.
00:09:40.100 If they can make these economical, how many can they sell?
00:09:45.480 All of them.
00:09:46.660 All of them, right?
00:09:48.080 Rolls-Royce is going to be like one of the biggest companies in the world if this works.
00:09:55.340 And it looks like it probably would.
00:09:56.980 Because there's nothing really that would stop it from working.
00:09:59.540 That's what's different about this.
00:10:01.580 Normally, you can look at the situation and say, all right, I can see how they can do a lot
00:10:05.780 of this, but how are they going to get past this big obstacle?
00:10:10.120 There aren't any.
00:10:12.000 There aren't any big obstacles.
00:10:14.100 This is just a big truck driving right down a highway toward a solution.
00:10:19.680 That's what it looks like, unless there's something I'm missing here.
00:10:22.200 All right, let's talk about Netflix.
00:10:25.380 Netflix is actually losing subscribers.
00:10:28.760 And the stock was down 40%.
00:10:31.580 And what they were talking about is maybe the problem is that people are sharing their
00:10:37.740 subscription, and so they would have more growth except for all the sharing.
00:10:43.200 And maybe it's because the pandemic is winding down, and people don't need to be just indoors
00:10:48.540 all the time.
00:10:49.660 But you know what I think it is?
00:10:52.200 I think it's because a whole bunch of people signed up to Netflix during the pandemic, consumed
00:10:58.900 everything on Netflix that they wanted to watch, and then nobody could create enough good content
00:11:05.920 going forward to ever keep them satisfied.
00:11:09.140 So I don't think after you eat a whole bunch of it, there's not really something to keep
00:11:16.940 you going.
00:11:18.540 So I guess Netflix understands that.
00:11:20.320 But it has more to do with the fact that the whole genre of movies and scripted TV just
00:11:25.840 doesn't work anymore.
00:11:27.960 It's just the same movie over and over again.
00:11:30.420 If it's a drama, or like an action film, if it's an action film, there's going to be a
00:11:34.740 car chase, there's going to be somebody strapped to a chair and tortured, there's going to be
00:11:39.420 three acts, and it's going to take two and a half hours, and you don't have that kind
00:11:42.980 of time.
00:11:43.320 So I think the problem with Netflix is what I've been saying for now for a while, that
00:11:48.700 movies are dead, or dying.
00:11:51.960 They're like dead men.
00:11:53.380 I just can't imagine movies being a legitimate form of entertainment ten years from now.
00:12:00.620 I think movies as an art form are just going to slide into oblivion.
00:12:04.700 Now, one of the biggest problems is the wokeness.
00:12:10.160 Have I ever told you this story?
00:12:12.220 For a couple of years, I worked pretty hard on trying to write a script for a really good
00:12:18.200 Dilbert movie.
00:12:20.040 And I got the whole thing, you know, storyboarded out, and I just needed to put it on paper.
00:12:23.940 And when I looked at it, I threw away all my two years of work, and I never want to see
00:12:31.780 it again, because I realized that I'd written a woke movie.
00:12:38.220 Let me tell you how bad it was.
00:12:40.760 Now, this was still several years ago, so it was before the wokeness reached the level it
00:12:46.080 is now.
00:12:47.280 So I could already see it coming pretty clearly, maybe, what, five years ago, whatever it was.
00:12:53.940 No, longer than that.
00:12:56.520 Seven years ago?
00:12:57.440 I can't remember.
00:12:58.860 But, so the script was going to be this.
00:13:02.280 So Dilbert would have a number of big problems in the world that his company would be involved
00:13:07.420 in, and that he would be involved in.
00:13:09.300 You don't need to know that part.
00:13:10.500 But there would be some kind of master hacker, or there would be like some kind of technological
00:13:16.660 presence or entity that would be guiding things through the movie so that Dilbert would
00:13:23.700 be suddenly saved from things.
00:13:26.280 And it would be a mystery throughout the movie who this unknown technical genius was.
00:13:33.460 Now, you would have a hint, though, early in the movie there would be some foreshadowing,
00:13:38.660 because you would know that Dilbert's father had gone to the all-you-could-eat buffet and
00:13:44.200 never returned.
00:13:45.620 So that Dilbert grew up without knowing his father, except for maybe the first few years
00:13:50.700 or something.
00:13:51.060 So you would be led to believe that his father was probably some kind of a super genius.
00:13:55.280 You'd meet his mother, too, who raised Dilbert.
00:13:57.900 But she was the stay-at-home mom.
00:14:02.100 And you would presume that whatever Dilbert got that makes him such a good engineer, that
00:14:08.840 he probably got it from his father.
00:14:10.940 And that you're going to find that his father's been guiding him the whole time as sort of
00:14:16.120 the secret entity who's like a super engineer, and that Dilbert's like the son of the super
00:14:21.880 engineer.
00:14:22.600 And that would be the big reveal at the end.
00:14:25.760 Except it wasn't.
00:14:28.560 Here was the actual big reveal.
00:14:31.160 It was going to be his mom.
00:14:32.760 And that his mom had to pretend not to be technically proficient all her life because of gender
00:14:41.140 stereotypes.
00:14:42.860 And that, in the end, you would see her bat cave, and that she was the one running the
00:14:48.140 whole show all the time.
00:14:50.420 And then I threw it away.
00:14:51.980 Do you know why?
00:14:54.320 You see it, right?
00:14:55.880 You see it, don't you?
00:14:57.680 You see why I had to throw the whole fucking thing away?
00:14:59.880 Because it wasn't art.
00:15:03.560 It wasn't even close to art.
00:15:05.640 All I did was take society's expectations and put them in a form and then give it back
00:15:11.540 to them.
00:15:12.380 I basically took what people wanted to see, put a bow on it, and handed it back to them.
00:15:18.220 It was so creatively empty that I hated myself for it.
00:15:24.440 Like, I actually felt dirty.
00:15:27.160 I felt I'd become the enemy.
00:15:29.880 In a way.
00:15:31.560 Now, let me be clear.
00:15:34.380 If we had not already entered the wokeness era, I think it would have been a great movie.
00:15:39.800 Am I right?
00:15:41.060 Can you imagine that movie in 1950?
00:15:46.380 It would have killed.
00:15:48.180 It would have been amazing.
00:15:49.560 In 1950.
00:15:51.600 Right?
00:15:51.900 Because it would have been a surprise.
00:15:55.820 You wouldn't see it coming at all.
00:15:57.760 And then that's why it would be good.
00:15:59.500 And it would, you know, break gender stereotypes and stuff.
00:16:02.540 But now it would just be too much.
00:16:04.080 It would just be too much piling on.
00:16:06.760 So if I...
00:16:07.840 And I don't think you could get any other kind of a movie funded.
00:16:11.640 I think your movie would have to have some woke element to it or climate change element
00:16:17.140 or something just to even get funded.
00:16:19.680 So movies, I think, are a dead art.
00:16:21.960 And I'm glad I bailed out.
00:16:24.000 Although I think it would have made a lot of money.
00:16:25.980 But I'm glad I bailed out.
00:16:26.860 So I'm not going to make a big deal about this because you hate it.
00:16:34.960 But CNN is now saying there's a study that I don't believe that says that regular masks protect the wearer.
00:16:41.720 I thought we went through the whole pandemic thinking that the masks were for the benefit of the other people.
00:16:47.340 It was going to stop the infected from pluming.
00:16:49.480 But now we're saying that, of course, the N95s, we always imagined, might be useful.
00:16:55.120 So nobody's going to argue about the N95s.
00:16:57.800 I think everybody says, oh, if you had a well-fitted N95, you know, that you could change out regularly, that'd probably help you.
00:17:04.960 So everybody agrees with that.
00:17:06.360 It's just these regular, like, weak surgical masks.
00:17:09.600 And CNN is saying that there was some tiny little study that said there was a big difference.
00:17:14.740 And the people who wore the surgical masks were way less likely to get infected.
00:17:21.140 Would you believe a study like that?
00:17:24.380 Would you believe that just comparing people who chose to wear masks to people who chose not to is going to get you some kind of a useful result?
00:17:35.300 Because don't you think that the person who chooses to wear a mask is also choosing a whole range of behaviors that would be compatible with that mindset, such as, do they shake hands?
00:17:48.740 Don't you think that the people who don't wear masks are more likely to actually, like, shake hands, get up close to you, talk to another person up close who doesn't have a mask on?
00:17:57.900 I mean, basically, their entire lifestyle would be different.
00:18:00.960 So to imagine that this has been proven in some little study looks to me like, well, I'm not sure if it's intentional disinformation, but it's bullshit.
00:18:12.940 I mean, it's fake news for sure.
00:18:15.220 It's fake news in the sense that you shouldn't rely on this study to tell you anything useful.
00:18:19.740 It might be true.
00:18:21.100 That part I don't know.
00:18:22.920 But I wouldn't rely on it.
00:18:24.660 And the fact that they're selling this as a fact, yuck.
00:18:31.300 I wouldn't buy that.
00:18:33.540 All right.
00:18:34.720 I'm now going to...
00:18:36.520 So we've taken care of climate change.
00:18:38.840 To keep my promise, I will now handle systemic racism.
00:18:43.000 Systemic racism.
00:18:44.680 With a story.
00:18:46.600 Recently, someone in my social circle asked me for some business advice.
00:18:53.280 Now, it turns out that because I was a banker for a long time, I made loans to small businesses, I've got an MBA, I've started some small businesses of my own.
00:19:04.760 They didn't do well, but I learned a lot.
00:19:07.140 I'm sort of a good person to ask for general advice about what to do first if you're starting a business.
00:19:13.640 And so I gave my advice, and, you know, I got a nice thank you for it.
00:19:21.480 And I thought to myself, you know, it wasn't like a specific piece of advice.
00:19:26.620 It was like a range of advice.
00:19:28.420 So it wasn't one topic.
00:19:30.280 And I thought to myself, why did this one person get my advice?
00:19:35.740 And why does anybody get anybody's advice?
00:19:41.200 Because I'm pretty sure that my success was dependent on other people's advice.
00:19:47.240 You know, I literally asked somebody who knew how to be a cartoonist, how do you get started being a cartoonist?
00:19:53.860 And I followed that advice.
00:19:55.700 And here I am.
00:19:56.440 So how did I get that advice that changed the entire course of my life?
00:20:03.300 And I got the advice from somebody I don't know, a complete stranger.
00:20:07.340 Well, I got the advice by just approaching the person by a letter and asking for advice.
00:20:13.340 And I offered nothing in return.
00:20:15.800 There was no quid pro quo.
00:20:17.600 I didn't say, I'll buy your advice.
00:20:19.560 I didn't say, I'll make you feel good if you give me some advice.
00:20:21.920 I offered nothing.
00:20:23.320 I literally just asked for something for nothing.
00:20:25.240 And I got it.
00:20:28.100 Not only did I get the advice, but I got follow-up advice I didn't even ask for.
00:20:32.000 And it was the follow-up advice that I followed that got me to here, where I'm talking to you.
00:20:39.760 And so whenever I get a chance and somebody asks me for advice, I give it.
00:20:46.120 Have you ever heard of a comic called Pearls Before Swine?
00:20:50.080 It's one of the biggest comics out there.
00:20:53.220 Now, if you've read the creator's story, you'll know that his origin story involves me.
00:21:01.740 So I gave him a good recommendation early on.
00:21:04.880 But he also met with me, and we're friends at this point.
00:21:08.860 And he basically just reached out and said, can you spend some time with me?
00:21:13.580 Because he lived locally.
00:21:15.260 And give me some advice.
00:21:16.800 And I liked his comic so far, what I'd seen.
00:21:19.800 So I did.
00:21:20.900 And he did something weird.
00:21:23.140 He followed my advice.
00:21:25.620 Which is weird.
00:21:26.900 Because I'm actually not used to it.
00:21:28.260 I've given a lot of people, especially artists, advice.
00:21:31.400 They almost never follow it.
00:21:33.000 Usually they say something like, oh, thank you, but, you know, I have a reason to go a different direction.
00:21:40.960 Okay.
00:21:41.960 But the people who followed my advice do pretty well.
00:21:45.800 And I would say that the only reason I'm where I am is that I followed somebody else's advice.
00:21:51.140 Somebody knew a hell of a lot more than I did.
00:21:53.660 So I just followed their advice.
00:21:54.920 So, here's the problem with systemic racism.
00:21:59.380 We keep solving the wrong problem.
00:22:01.420 Or, to say it better, one part of the problem is already solved.
00:22:05.680 Here's the problem that's already solved for systemic racism.
00:22:09.780 If you're, I'll just pick an example, a young black man with an education, do you think you'll do okay?
00:22:18.100 Yeah, because every corporation is trying to fix their, or improve their diversity.
00:22:27.640 I mean, it's a very stated, important goal.
00:22:30.900 I mean, you could argue about the wokeness of it and whether, you know, I'm not going to argue whether they should do it.
00:22:36.580 I think they should.
00:22:37.840 But that's not the argument.
00:22:39.680 The argument is, do they have opportunities?
00:22:42.860 The answer is, yeah, better than white people.
00:22:45.180 That's just the truth.
00:22:47.060 It's just the truth.
00:22:47.820 I'm sorry, if it offends you.
00:22:50.580 If you're a black, educated young man, you have a way better chance of getting a job than the identical white man in the same situation, with the same education.
00:23:01.000 It's not even close.
00:23:02.680 It's not even close.
00:23:04.380 And likewise for getting a college scholarship or even being accepted in a good college.
00:23:09.600 Two equally qualified young men, one black and one white, same grades, same SAT scores.
00:23:16.000 It's not even close.
00:23:18.860 The young black man wins that contest every time.
00:23:22.100 So, if that's so good for the people who are starting in the hole, aren't we done?
00:23:30.980 Because I used to kind of think that.
00:23:33.220 That was my point of view.
00:23:34.800 My point of view is, if all these opportunities exist, and you just have to take advantage of them, exactly as I did.
00:23:41.920 I went to work for a big corporation, because when I went there, I was still in demand.
00:23:48.280 They were hiring people like me.
00:23:50.640 So, I went there, and they trained me, and I was better off for it.
00:23:54.440 So, I keep telling myself, well, I started with basically nothing, and I went exactly where they can go.
00:24:02.920 I worked in school exactly like anybody else can.
00:24:06.340 Why can't everybody else just do what I did?
00:24:08.300 And one of the reasons is, that some people get advice.
00:24:15.200 Some people have mentors.
00:24:17.100 Now, I sought out mentors, but I don't think that's normal.
00:24:22.460 And I can tell you that people have sought me out a number of times.
00:24:26.380 Do you know who I say yes to almost every time?
00:24:31.160 Guess.
00:24:32.640 What demographic groups do I say yes to almost every time?
00:24:36.820 Black, female, Hispanic, every time?
00:24:46.060 And would I more likely give them advice than somebody I knew to be white and male?
00:24:53.420 I don't know.
00:24:54.200 Probably not.
00:24:55.520 Probably not.
00:24:56.000 I'd probably give anybody advice if they were local, and they had a good reason, and I could help them.
00:25:00.420 The local part's the important part.
00:25:02.440 Now, I realize I'm opening myself up here to be attacked by people who want advice.
00:25:07.480 And I have to tell you, I just don't have the time to do most of it.
00:25:11.820 You know, I do it as I can.
00:25:14.120 But there are people who succeed, and if there's anybody on here who wants to confirm this,
00:25:20.620 if you have succeeded, let's say financially, if you've succeeded, aren't you really happy to give advice?
00:25:27.460 Like, you give it for free.
00:25:30.220 You're happy to do it.
00:25:32.920 Look at all the yeses on the locals platform.
00:25:35.500 People are just, yes, yes, yes, yes, yes.
00:25:37.140 You want advice?
00:25:38.460 You want to get mentored?
00:25:40.120 Absolutely.
00:25:40.520 So here's my solution to systemic racism.
00:25:46.700 Black people, and anybody, any person of color, anybody who's, you know, starting in a hole, for whatever reason,
00:25:54.180 there are plenty of people who will give you advice, and it will change your life if you follow it.
00:25:58.520 But you've just got to reach out, you've just got to ask, and then follow the advice.
00:26:05.460 Now, I do get that that's harder culturally across some boundaries.
00:26:11.900 It's also harder across gender boundaries.
00:26:15.400 You know, it's easier for a woman to ask a woman.
00:26:18.060 But let me tell you, you're really missing out, because men will give you all the advice you want.
00:26:21.500 They'll mansplain you to death.
00:26:23.580 They'll give you more advice than you ever wanted.
00:26:26.240 It's all free.
00:26:28.000 It's all free.
00:26:30.140 And, you know, so I think the mentoring and advice connection is the only thing that needs to be fixed.
00:26:39.400 Because if you can catch young black families, and I'm going to say the parents and especially the mother,
00:26:45.680 if you can catch them early, with the right kind of advice, just connect them to some kind of mentoring situation,
00:26:55.140 then they can get to the second part, which is the corporations really want to hire them.
00:27:00.560 And other people too, not just corporations.
00:27:03.680 So the second part of the plan for giving everybody equal opportunity is done.
00:27:10.780 The second part's done.
00:27:12.800 It's the first part that doesn't exist.
00:27:15.680 It's being connected to people who can just, like, nudge you in the right direction.
00:27:19.560 It's the only thing that made me successful.
00:27:22.180 If you take away the advice that I got at key points along the way, I'm not here.
00:27:29.240 Now, what I had going for me is I was going to go get that advice.
00:27:33.540 Right?
00:27:34.260 If I had to steal it, I probably would have gone and gotten it.
00:27:39.080 Right?
00:27:39.580 Because I wasn't going to not succeed.
00:27:41.320 I sort of had a mindset that there wasn't any obstacle that was going to stop me.
00:27:47.900 But that's also not normal, and I don't think you can teach it.
00:27:52.240 I think it's like a genetic flaw that has some weird advantages to it.
00:27:57.820 Right?
00:27:57.920 My genetic flaw is that I can't be satisfied with what I have.
00:28:03.120 Like, I have to push through the next thing.
00:28:05.680 So it's like a, it's like this, you know, it's almost like an illness.
00:28:10.660 And a lot of successful people will tell you the same thing.
00:28:13.400 You don't become successful and then quit.
00:28:16.300 Say, oh, I got successful.
00:28:18.220 I guess I'm done.
00:28:19.520 I'm done working now.
00:28:21.180 I wish it worked like that.
00:28:23.360 But whatever it was that got you there doesn't turn off.
00:28:26.600 You're like, ah, damn it.
00:28:28.580 So what about everybody who doesn't have that flaw that they just have to keep going?
00:28:34.100 I think you have to find a way that the mentors find them to make it easy to get that advice.
00:28:39.140 I don't think you can count on a young black kid to go find some successful white adult or any successful adult of any kind and go ask for advice.
00:28:53.300 That's a lot to ask for a young person.
00:28:56.100 I was willing to do it, but it's rare.
00:29:01.220 All right.
00:29:03.140 Let's fix free speech.
00:29:05.360 So let's see, systemic racism fixed, climate change fixed.
00:29:09.780 We're doing pretty well so far.
00:29:11.600 We're on target.
00:29:13.700 I'm going to tell you a little very interesting story that's happening right now.
00:29:18.060 And then I'll tell you how to solve all of fake news and disinformation.
00:29:24.060 So watching Jack Dorsey, I'm going to say off the leash, meaning he's going full free speech about free speech.
00:29:35.360 It's so meta, like it's making my brain hurt.
00:29:38.840 So Jack Dorsey, you know, founder and ex-CEO of Twitter, until now I think he's said things that a CEO can say for the most part.
00:29:50.120 You know, things that are sort of, you know, as non-controversial as a CEO should be.
00:29:56.420 But now he's not CEO anymore.
00:29:58.140 And I've told you his other opinions, and he's very free speech.
00:30:04.360 And that's going to be important to something I'm getting to in a bit.
00:30:08.320 But he had this little exchange with Mark Benioff, founder of Salesforce.
00:30:12.640 So Benioff is a pretty awesome guy, actually.
00:30:18.580 I've talked about him before.
00:30:19.780 So I spent some time with Mark Benioff before giving a talk at Salesforce.
00:30:25.200 So we got to chat for a little while.
00:30:27.220 And I got sort of a feel for his vibe.
00:30:29.760 And I came away thinking that he's the real deal, meaning that he's somehow plugged into some extra dimension in a way that's hard to explain.
00:30:39.280 That he's just not operating like normal people.
00:30:42.780 What you feel is that he's operating at a higher level of awareness in which he has found that capitalism and being good for people seems to be compatible.
00:30:55.500 And it's sort of the holy grail.
00:30:57.960 How can you be a soulless capitalist at the same time you're trying to take care of people?
00:31:04.000 And somehow, somehow, he's kind of making it work, at least within his world.
00:31:09.660 Because Salesforce has this, I may be mischaracterizing this, but it's like a 1%, 1%, 1% rule about everybody should do something for charity.
00:31:20.300 And he's really serious about it.
00:31:21.860 And I watched him be serious about it in person.
00:31:24.600 Like I watched him chastise, that's too strong a word, but let's say continuously correct one of his lieutenants
00:31:32.660 for not featuring the people part of it before the business part of it.
00:31:40.920 So I know that he's serious about it, or I feel that he's serious about it.
00:31:44.920 So I don't think he's cynical at all, Benioff.
00:31:47.660 I think he really wants to help the world and get rich, and it's working.
00:31:50.460 But, so he tweeted this.
00:32:02.280 So he tweeted a picture, so Mark Benioff tweets a picture of himself on the cover of CEO magazine, right?
00:32:09.320 Which is interesting, that he tweets a cover of himself on CEO magazine.
00:32:12.660 But I guess he's quoting himself saying,
00:32:15.360 Capitalism, as we have known it, is dead, and the obsession that we have with maximizing profits for shareholders alone
00:32:21.460 has led to incredible inequality and a planetary emergency.
00:32:25.880 When we serve all stakeholders, business is the greatest platform for change.
00:32:30.780 So there's Benioff making a call to helping people, but also having a business be robust and part of the solution.
00:32:39.960 And then Jack tweeted in reply,
00:32:48.520 You bought this magazine too?
00:32:51.240 Now, I didn't realize until I looked it up
00:32:54.800 that Benioff had recently, in 2018 I guess, purchased Time magazine.
00:33:01.440 So did you know that Benioff owned Time magazine?
00:33:05.260 I didn't know that.
00:33:05.780 So there he is on the cover of CEO magazine.
00:33:10.320 So Jack says, You bought this magazine too?
00:33:13.020 Now, what I love about this is you have to know about the context
00:33:17.240 and sort of read between the lines to know what this is about.
00:33:21.780 And I think this is Jack, I think.
00:33:25.100 You know, I can't read his mind, so this is always dangerous.
00:33:27.420 But the way I interpret it is that Jack's new, not new,
00:33:34.940 but let's say his outspokenness about freedom of speech
00:33:38.240 is really what this is about.
00:33:41.500 That here's Benioff saying the good things he's doing
00:33:45.180 and showing himself on a magazine.
00:33:47.060 But at the same time, he does own a magazine.
00:33:49.900 And when the rich people own the communication channels,
00:33:55.800 then you've got something you need to look at.
00:33:59.760 But they weren't done with this exchange.
00:34:03.800 So Benioff, having a good sense of humor too,
00:34:06.500 he responds on Twitter, he goes,
00:34:08.320 Nope, but Jack, I can get you a subscription to Time.
00:34:12.420 And then he shows a cover of Elon Musk on the cover of Time
00:34:15.820 because Elon Musk is about to buy Twitter.
00:34:17.780 So you have to know all these stories and how everything connects
00:34:20.640 to know how interesting this exchange is.
00:34:24.000 And then it gets better.
00:34:26.000 So, you know, you'd think that it would be done.
00:34:29.120 And then Jack tweets back,
00:34:33.780 Nah, I'm good, man.
00:34:35.220 With a photo of, it's a meme,
00:34:38.080 of a photo that looks like Obama
00:34:39.880 putting a Medal of Honor around Obama himself.
00:34:45.320 And like now my brain is starting to split.
00:34:47.340 And I'm like, okay, I have to decipher that.
00:34:48.980 What does that mean?
00:34:50.360 Obama putting a Medal of Freedom or something,
00:34:53.360 whatever it is, on himself
00:34:54.820 would be like people congratulating themselves.
00:34:59.280 So it's sort of,
00:35:02.000 which is just wonderfully subtle.
00:35:05.160 And then some other user tweeted in,
00:35:09.880 you know, got in the middle,
00:35:11.400 some mushroom and mar,
00:35:12.780 and tweets,
00:35:13.760 this took 10 hours
00:35:14.800 because it took 10 hours for Jack to respond
00:35:17.960 with his Obama,
00:35:19.040 giving Obama a medal thing.
00:35:21.720 And then Jack responds to that by saying,
00:35:24.000 sleep and touching grass took 10 hours.
00:35:26.900 And I think, okay,
00:35:29.580 I understand the sleep part.
00:35:32.940 Touching grass.
00:35:36.900 I don't know.
00:35:38.420 It was 419.
00:35:39.960 I don't know what he meant by touching grass.
00:35:42.100 But I assume he meant going outdoors
00:35:43.600 and enjoying nature.
00:35:45.100 But so here's my take
00:35:51.480 on solving disinformation and fake news.
00:35:55.080 And it's going to dovetail from this story.
00:35:58.140 So this is just an interesting exchange
00:36:00.020 between two super interesting people
00:36:01.940 on a super interesting topic,
00:36:04.160 in my opinion,
00:36:04.800 which is freedom of speech
00:36:06.120 and, you know,
00:36:07.300 who controls it.
00:36:08.560 But here's my solution
00:36:10.720 to solve disinformation.
00:36:16.060 Elon Musk buys Twitter.
00:36:18.100 Let's say that if he gets rejected,
00:36:20.160 there's some way for him
00:36:22.240 to Power Ranger up
00:36:23.620 and just get enough money
00:36:25.540 and super rich people to say,
00:36:27.220 okay, we're done playing around.
00:36:29.660 Elon Musk is going to buy Twitter
00:36:31.240 with help or without help
00:36:33.980 or one way or another.
00:36:35.940 So let's say he gets it done.
00:36:37.400 Now, this is the big if
00:36:39.640 and it depends on this.
00:36:41.060 The next part is the board of directors.
00:36:45.940 Now, even as a private company,
00:36:47.400 you need a board of directors.
00:36:48.600 Who could you have
00:36:49.500 as a board of directors at Twitter
00:36:52.380 that would make you feel
00:36:54.060 that Twitter had become
00:36:55.940 the first legitimate platform
00:36:57.740 in terms of getting rid of bias
00:36:59.460 and, therefore,
00:37:01.280 as a lever that controls
00:37:02.840 the other platforms?
00:37:04.400 Because if things are correct on Twitter,
00:37:06.400 it's hard to do your fake news
00:37:09.400 somewhere else
00:37:10.060 because Twitter is going to call you out
00:37:11.900 and all the journalists are there,
00:37:13.240 all the politicians are on Twitter.
00:37:15.020 So you have to get Twitter right
00:37:16.760 and then you can be fake news
00:37:18.880 and all the rest.
00:37:20.060 But if Twitter is calling out
00:37:21.520 fake news everywhere,
00:37:23.100 left and right,
00:37:24.060 then you just can't get away with it.
00:37:26.560 It's like the lever
00:37:27.340 that controls everything.
00:37:28.320 So the board of directors,
00:37:30.620 in theory,
00:37:31.860 under a Musk ownership,
00:37:33.960 also in theory,
00:37:35.520 would include a board of directors
00:37:37.000 that anybody would look at
00:37:38.400 and say,
00:37:39.080 eh, okay,
00:37:40.300 that's a pretty good board of directors.
00:37:43.080 So I'm going to suggest some names,
00:37:46.120 but I don't mean it.
00:37:48.020 All right?
00:37:48.800 The names I'm going to suggest
00:37:49.920 are just to get you thinking.
00:37:52.120 They're not necessarily
00:37:53.440 the greatest ideas.
00:37:55.460 All right?
00:37:56.140 Now, yes,
00:37:56.800 I thought about putting myself
00:37:57.940 on the board
00:37:58.500 and rejected it.
00:38:00.480 Because?
00:38:01.140 Why did I reject myself?
00:38:03.700 Because I'm not credible
00:38:04.960 to a big part of the country.
00:38:08.900 So you don't want a board of directors
00:38:10.320 that isn't credible.
00:38:11.740 That's just recreating the problem.
00:38:14.320 And as much as I think
00:38:15.780 I'd be awesome,
00:38:17.120 you know,
00:38:17.380 in my own head,
00:38:18.640 I'd be,
00:38:18.860 oh, I'd be awesome for that job.
00:38:20.020 Like, actually,
00:38:21.120 I think I could be useful.
00:38:22.540 But the public would not see it that way.
00:38:25.360 I'm a little too controversial.
00:38:28.380 So I'm not a good choice.
00:38:30.560 Here's something that might be.
00:38:32.460 And before you react,
00:38:36.340 keep in mind that the only thing
00:38:37.760 I'm looking for is credibility
00:38:39.420 for free speech.
00:38:42.180 So these are people
00:38:43.160 you don't have to like
00:38:44.360 in any other context.
00:38:46.160 The only thing you have to say
00:38:47.400 is true about this group of people
00:38:48.980 is that, oh yeah,
00:38:50.760 they do like free speech.
00:38:52.820 Like, that's the part
00:38:53.580 you don't doubt.
00:38:54.660 Okay?
00:38:55.540 First one,
00:38:56.380 Bill Maher.
00:38:58.180 Bill Maher.
00:38:59.840 Say what you will.
00:39:02.500 Say what you will
00:39:03.500 about his politics
00:39:04.980 if you don't like him.
00:39:07.240 Can you say for sure
00:39:08.620 that he likes,
00:39:09.800 he's ride or die
00:39:10.640 on free speech?
00:39:11.380 I know,
00:39:13.880 you hate it,
00:39:14.340 don't you?
00:39:14.660 Wait for the full list.
00:39:16.940 Bill Maher is ride or die
00:39:18.580 on free speech,
00:39:19.440 I think.
00:39:21.980 Glenn Greenwald
00:39:22.900 lives in a different country.
00:39:25.440 So his credibility
00:39:26.420 is hurt because of that.
00:39:28.020 I thought of him.
00:39:30.260 Here's another one.
00:39:31.160 And again,
00:39:31.860 don't get hung up
00:39:32.680 on the specific names.
00:39:34.060 Get hung up on
00:39:35.220 the concept
00:39:36.260 of finding people
00:39:37.220 that would fit
00:39:37.800 this description
00:39:38.560 even if you don't
00:39:39.340 like these names.
00:39:40.580 Second one,
00:39:41.640 Naval Ravikant.
00:39:43.960 Why?
00:39:45.220 Nobody knows
00:39:45.860 his politics.
00:39:47.980 I don't know.
00:39:49.120 But he's one of
00:39:49.700 the most credible people
00:39:50.880 in all of,
00:39:52.760 you know,
00:39:53.020 high-tech world.
00:39:55.320 And other people
00:39:56.060 see him as credible.
00:39:57.340 And anybody
00:39:57.880 on the board
00:39:59.880 would see him
00:40:00.380 that way too.
00:40:00.880 But there's
00:40:01.800 a better reason.
00:40:03.260 You know what
00:40:03.620 the better reason is?
00:40:05.360 He'd be the canary
00:40:06.240 in the coal mine.
00:40:06.960 All you'd have
00:40:09.440 to do
00:40:09.960 is wait
00:40:11.400 to see if he quit.
00:40:13.760 If he doesn't quit,
00:40:15.700 nobody's getting
00:40:16.420 away with anything
00:40:17.140 because he'd see it.
00:40:19.920 Nobody's getting
00:40:20.560 away with anything
00:40:21.180 if he's still
00:40:21.620 on the board.
00:40:22.440 If he quits,
00:40:24.100 all bets are off.
00:40:26.060 The canary is dead.
00:40:28.200 So you need
00:40:28.940 one canary.
00:40:30.280 Now,
00:40:30.520 I'm not saying
00:40:30.900 it's him.
00:40:32.340 So I'm not saying
00:40:33.520 he'd be the right one.
00:40:34.340 I'm not saying
00:40:34.760 he'd do it.
00:40:35.300 I imagine
00:40:36.700 he wouldn't,
00:40:37.280 actually.
00:40:37.840 I can't imagine
00:40:38.580 him doing it,
00:40:39.020 actually.
00:40:39.760 But you need
00:40:40.780 somebody to be
00:40:41.400 the canary.
00:40:43.020 Somebody who's
00:40:43.740 rich enough
00:40:44.420 and strong-willed
00:40:46.520 enough
00:40:46.960 and independent
00:40:47.940 enough
00:40:48.380 that if that person
00:40:49.820 quits the board,
00:40:51.100 you've got to say,
00:40:51.780 all right,
00:40:51.980 we have to see
00:40:52.460 what's going on now.
00:40:53.780 We have to reopen
00:40:54.740 the hood,
00:40:56.060 right?
00:40:56.700 Now,
00:40:57.000 we don't trust
00:40:57.520 anybody anymore
00:40:58.960 because that one,
00:41:00.980 our canary, died.
00:41:00.980 Who's the canary?
00:41:02.660 Somebody said
00:41:03.220 Joe Rogan.
00:41:03.860 I thought of him.
00:41:05.620 He'd be excellent
00:41:06.300 except he's not
00:41:08.200 considered credible
00:41:09.000 by a lot of people.
00:41:12.420 Unfortunately.
00:41:14.140 You'd add
00:41:14.660 Tulsi Gabbard?
00:41:15.400 She's on my list.
00:41:17.140 Is she the perfect one?
00:41:18.300 I don't know
00:41:18.960 because you can find
00:41:20.980 reasons to disagree
00:41:21.980 with her.
00:41:22.660 But has she ever
00:41:23.820 been against
00:41:24.360 freedom of speech?
00:41:26.460 No.
00:41:27.680 Oh,
00:41:28.320 there's a suggestion
00:41:29.520 I like.
00:41:30.060 Chris Rock.
00:41:31.920 Chris Rock.
00:41:32.580 Say what you will
00:41:34.640 about Chris Rock.
00:41:35.840 Do you like his humor?
00:41:36.900 Don't like his humor?
00:41:37.800 I love it.
00:41:39.180 But
00:41:39.380 do you think
00:41:41.540 Chris Rock
00:41:42.100 would ever be
00:41:42.700 against
00:41:43.260 free speech?
00:41:46.000 Doubt it.
00:41:47.360 And then you've got
00:41:48.120 some,
00:41:48.600 you know,
00:41:48.820 add some diversity.
00:41:49.940 I think that'd be good.
00:41:51.720 Here's the one
00:41:52.340 that's going to fool you.
00:41:54.140 I would add
00:41:54.720 to the board
00:41:55.380 Jack Dorsey.
00:41:57.720 You didn't see
00:42:01.800 that one coming,
00:42:02.500 did you?
00:42:03.500 He's perfect.
00:42:04.840 Nobody knows
00:42:05.460 more about the company.
00:42:07.300 And coming in
00:42:08.160 as a board member
00:42:08.940 would just scare
00:42:09.620 the shit out of him.
00:42:10.960 He's already
00:42:11.600 criticized the board.
00:42:13.100 He's the last person
00:42:14.020 that the current board
00:42:15.260 would want on the board.
00:42:16.520 And do you know
00:42:17.040 who I'd want
00:42:17.640 on the board
00:42:18.260 of a new,
00:42:19.960 of a new board
00:42:20.900 if, let's say,
00:42:21.620 Musk owned it?
00:42:22.520 I'd want somebody
00:42:23.460 that the old board
00:42:24.320 definitely didn't want
00:42:25.080 on there.
00:42:25.460 I mean,
00:42:26.460 that would seem
00:42:27.340 like a feature to me.
00:42:29.020 I don't know
00:42:29.560 if they would want him
00:42:30.240 or not,
00:42:30.560 but, yeah,
00:42:32.640 so Jack Dorsey
00:42:33.480 is unambiguously
00:42:35.420 in favor of free speech.
00:42:38.060 He's just able
00:42:38.880 to say it now
00:42:39.580 because he's not a CEO,
00:42:40.740 so he can say it
00:43:41.680 in a more clear way,
00:42:43.920 and you can tell
00:42:44.640 that it's,
00:42:45.580 you can tell
00:42:46.140 that he's emotionally
00:42:47.160 attached to it.
00:42:48.240 It's not just talk.
00:42:50.000 The way he's reacting
00:42:51.060 to it shows
00:42:52.420 that he's in the fight.
00:42:53.960 You know,
00:42:54.080 he's not a bystander.
00:42:55.460 I will throw in
00:42:58.420 Mark Andreessen.
00:43:01.640 If you're not familiar
00:43:02.680 with his name,
00:43:03.680 you should Google him.
00:43:05.180 I don't need
00:43:05.620 to explain to everybody.
00:43:07.120 Mark Cuban.
00:43:08.100 There's,
00:43:08.340 that's interesting.
00:43:09.860 Mark Cuban.
00:43:11.980 Yeah,
00:43:12.540 I could see that
00:43:13.300 because Mark Cuban
00:43:15.720 has that same,
00:43:17.360 I mean,
00:43:17.680 I would imagine
00:43:18.620 he's a free speech
00:43:19.820 absolutist.
00:43:21.080 I haven't asked him.
00:43:22.420 I haven't seen him
00:43:23.100 talk about it,
00:43:23.720 but I would imagine.
00:43:26.760 And
00:43:27.160 he would be
00:43:29.200 a canary too.
00:43:31.340 I think,
00:43:31.740 I think he would be
00:43:32.700 a canary
00:43:33.040 because I think
00:43:33.660 he would quit
00:43:34.160 the board
00:43:34.600 if they did
00:43:35.740 something that was
00:43:36.420 unambiguously
00:43:37.160 anti-free speech.
00:43:38.680 You don't think
00:43:39.280 he would quit
00:43:39.700 the board?
00:43:40.240 Or if he couldn't
00:43:40.980 see what he was
00:43:41.800 on the board of,
00:43:42.900 if he didn't have
00:43:43.460 visibility that
00:43:44.420 made him feel
00:43:45.420 comfortable that
00:43:46.100 things were okay.
00:43:47.040 Yeah,
00:43:47.860 I think he would
00:43:48.360 quit the board.
00:43:50.460 Good name.
00:43:51.540 All right.
00:43:52.960 Molly Hemingway.
00:43:54.620 Too,
00:43:54.960 too controversial.
00:43:56.340 Too political.
00:43:57.380 I mean,
00:43:57.720 people would see
00:43:58.340 her as political,
00:43:59.240 although she would
00:43:59.900 be great.
00:44:01.420 Yeah.
00:44:02.020 So you have to,
00:44:02.680 you have to,
00:44:03.460 you have to separate
00:44:05.080 who would be
00:44:05.760 great in terms
00:44:06.780 of qualified
00:44:07.480 with who would
00:44:08.840 be great and
00:44:09.460 qualified,
00:44:10.120 but also
00:44:10.720 would look like
00:44:12.320 it from the
00:44:12.740 outside.
00:44:13.120 It's Dave Chappelle.
00:44:15.060 Good suggestion.
00:44:17.440 You know,
00:44:18.040 Chris Rock has
00:44:19.200 the same quality,
00:44:20.000 which is you want
00:44:20.500 a comedian in there
00:44:21.460 because they like
00:44:23.600 free speech,
00:44:24.580 and the left
00:44:25.420 likes comedians,
00:44:27.100 so,
00:44:27.800 and the right
00:44:28.240 does as well.
00:44:29.380 Chappelle is a
00:44:30.000 really good name.
00:44:31.480 Yeah.
00:44:32.400 Dave Rubin.
00:44:34.200 Dave Rubin.
00:44:35.300 Dave Rubin,
00:44:36.160 unambiguously
00:44:36.860 free speech.
00:44:38.780 Unambiguously,
00:44:39.340 you can see both
00:44:40.160 sides of the left
00:44:41.660 and the right.
00:44:43.260 Good name.
00:44:45.060 How about,
00:44:45.740 all right,
00:44:46.500 I'm going to end
00:44:47.780 with the most
00:44:49.920 controversial one.
00:44:52.140 I want one that
00:44:53.160 just makes your
00:44:53.760 head explode,
00:44:54.600 okay?
00:44:54.900 You ready?
00:44:55.520 You ready for this
00:44:56.040 one?
00:44:57.120 AOC.
00:45:00.040 Go on.
00:45:01.200 Go on.
00:45:01.800 Let your heads
00:45:02.400 explode.
00:45:03.880 Now,
00:45:04.240 is it because I
00:45:04.960 agree with AOC's
00:45:06.240 policies?
00:45:06.980 Nope.
00:45:07.480 Nope.
00:45:07.860 Nope.
00:45:08.220 Nope.
00:45:08.420 Nope.
00:45:08.600 Nope.
00:45:08.760 Nope.
00:45:09.580 Is it because I
00:45:10.580 think she is a
00:45:11.540 keen observer
00:45:13.000 and knows how
00:45:14.580 to get things
00:45:15.440 done now?
00:45:19.460 But,
00:45:20.680 has she ever
00:45:21.220 spoken out
00:45:22.900 against free
00:45:23.560 speech?
00:45:24.660 Do you know
00:45:25.280 what is
00:45:25.580 interesting about
00:45:26.260 AOC?
00:45:27.760 She's like a
00:45:28.600 huge victim of
00:45:29.800 free speech,
00:45:31.540 but I'll bet she
00:45:32.420 backs it.
00:45:34.620 Now,
00:45:35.100 I don't know
00:45:35.480 this,
00:45:36.220 right?
00:45:36.540 Now,
00:45:36.800 remember,
00:45:37.640 remember,
00:45:38.200 you don't have to
00:45:38.680 agree with any
00:45:39.180 of the names
00:45:39.580 because they're
00:45:40.000 not real.
00:45:41.140 They're not real
00:45:41.880 suggestions.
00:45:43.000 They're not
00:45:43.360 my suggestion.
00:45:44.800 I'm just
00:45:45.180 throwing out
00:45:45.620 controversial
00:45:46.180 ideas,
00:45:47.300 right?
00:45:48.020 The positive
00:45:49.360 way to have
00:45:50.120 AOC would
00:45:52.140 be to say,
00:45:52.840 you know,
00:45:53.760 if AOC is
00:45:54.920 okay with
00:45:55.420 this and
00:45:56.680 the other
00:45:57.420 people are
00:45:57.860 okay with
00:45:58.420 it,
00:45:58.580 it must be
00:45:59.000 okay,
00:46:00.220 right?
00:46:00.680 Because you'd
00:46:01.180 want at least
00:46:01.680 one person
00:46:02.400 on there
00:46:02.880 who's just
00:46:05.840 going to
00:46:06.600 shake the
00:46:07.640 box every
00:46:08.240 now and
00:46:08.560 then.
00:46:09.240 So it
00:46:09.720 doesn't have
00:46:10.060 to be her.
00:46:11.600 Somebody said
00:46:11.600 Geraldo.
00:46:12.960 Geraldo.
00:46:14.200 I would
00:46:14.840 take
00:46:15.040 Geraldo.
00:46:16.360 Yes,
00:46:16.860 I would
00:46:17.320 trust
00:46:17.640 Geraldo
00:46:18.080 to be
00:46:19.160 an honest
00:46:20.000 broker of
00:46:20.980 what makes
00:46:21.460 sense on
00:46:21.900 the left
00:46:22.220 or the
00:46:22.460 right.
00:46:23.680 Again,
00:46:24.360 I don't
00:46:24.660 agree with
00:46:25.340 Geraldo
00:46:25.840 on every
00:46:26.540 one of
00:46:27.180 his
00:46:27.320 opinions,
00:46:28.020 but that's
00:46:28.420 not the
00:46:28.740 issue.
00:46:30.540 I think
00:46:31.140 he's an
00:46:31.480 absolutist
00:46:31.980 on free
00:46:32.320 speech
00:46:32.660 and would
00:46:33.940 seem like
00:46:34.520 it to a
00:46:35.900 lot of
00:46:36.100 people.
00:46:38.020 All right,
00:46:38.400 so that
00:46:39.640 is my
00:46:39.940 solution to
00:46:40.440 free
00:46:40.640 speech.
00:46:41.520 Fix
00:46:41.860 Twitter,
00:46:42.240 you fix
00:46:42.620 everything.
00:46:43.620 If Musk
00:46:44.080 buys
00:46:44.400 Twitter,
00:46:44.860 he could
00:46:45.120 put together
00:46:46.820 a board
00:46:47.320 that the
00:46:48.160 public
00:46:48.460 would
00:46:48.860 respect.
00:46:51.480 That
00:46:51.960 fixes
00:46:52.280 everything.
00:46:54.420 Am I
00:46:54.820 right?
00:46:56.360 Or at
00:46:56.700 least it's
00:46:57.020 worth trying.
00:46:57.960 If you're
00:46:58.260 going to
00:46:58.380 A-B test
00:46:58.940 a solution
00:46:59.940 to
00:47:00.200 disinformation,
00:47:01.060 this is
00:47:01.340 the way
00:47:01.560 I'd do
00:47:01.860 it.
00:47:02.660 Compare
00:47:03.040 that to
00:47:03.600 Obama's
00:47:04.420 foundation
00:47:06.040 that he's
00:47:06.480 created to
00:47:07.000 battle
00:47:07.380 disinformation.
00:47:08.400 Would you
00:47:09.280 trust
00:47:09.760 Barack
00:47:10.280 Obama's
00:47:11.080 disinformation
00:47:12.380 battling
00:47:12.980 foundation?
00:47:15.340 It's the
00:47:16.200 Obama
00:47:16.520 foundation.
00:47:17.280 We're
00:47:17.320 working to
00:47:17.780 empower and
00:47:18.320 equip emerging
00:47:19.060 leaders to
00:47:20.240 tackle issues
00:47:21.080 like the
00:47:21.520 spread of
00:47:21.940 disinformation.
00:47:23.220 No,
00:47:23.540 they're not.
00:47:25.220 He's
00:47:25.960 literally
00:47:26.340 the most
00:47:27.860 political
00:47:28.640 person in
00:47:30.180 the world.
00:47:31.860 He's the
00:47:32.740 last person
00:47:33.340 who should
00:47:33.640 be touching
00:47:34.120 disinformation.
00:47:36.580 He's
00:47:37.320 literally the
00:47:37.920 provider
00:47:38.400 of it.
00:47:39.280 He's the
00:47:40.380 most famous
00:47:41.620 symbol of
00:47:42.260 the people
00:47:42.840 who provide
00:47:43.360 it.
00:47:44.420 Now,
00:47:44.860 I'm not
00:47:45.080 saying the
00:47:45.440 right doesn't
00:47:45.960 have any
00:47:46.360 disinformation,
00:47:47.840 but nobody
00:47:48.540 is more
00:47:49.080 associated with
00:47:50.080 it than
00:47:50.360 this guy.
00:47:52.120 That's not
00:47:52.760 even a
00:47:53.080 criticism of
00:47:53.780 Obama as
00:47:54.300 a president.
00:47:55.380 It's just
00:47:55.720 talk about
00:47:58.000 the wrong
00:47:58.400 guy for
00:47:58.800 the job.
00:48:00.440 Just as I
00:48:01.280 was saying
00:48:01.580 that these
00:48:02.320 people for
00:48:02.860 the imaginary
00:48:04.640 board of
00:48:05.240 Twitter would
00:48:05.660 be the
00:48:05.900 right people,
00:48:06.360 sometimes
00:48:09.460 that's
00:48:09.840 really the
00:48:10.220 wrong
00:48:10.380 person.
00:48:10.700 All right.
00:48:12.600 There's a
00:48:13.440 story that
00:48:13.940 Russians are
00:48:15.840 making
00:48:17.080 Ukrainians in
00:48:19.280 conquered Ukrainian
00:48:19.280 territory fight
00:48:20.800 other Ukrainians.
00:48:21.860 In other words,
00:48:22.280 they're forcing
00:48:23.000 the Ukrainians
00:48:25.100 under their
00:48:25.560 control to
00:48:26.760 join the
00:48:27.120 military and
00:48:27.780 go fight
00:48:28.640 other Ukrainians.
00:48:29.460 I'm not so
00:48:30.820 sure that's
00:48:31.460 exactly correct.
00:48:34.600 That sounds
00:48:35.560 a little
00:48:36.060 propaganda-ish,
00:48:38.360 but it does
00:48:39.320 seem true
00:48:39.880 that there
00:48:41.520 are examples
00:48:42.360 in which
00:48:42.860 Russia has
00:48:43.720 in the past
00:48:44.580 used Ukrainians
00:48:46.960 in their
00:48:47.260 military force
00:48:48.060 fighting Ukrainians,
00:48:49.240 but it was in
00:48:50.080 territory that
00:48:50.760 they'd held
00:48:51.180 a while,
00:48:52.560 meaning that
00:48:53.140 they probably
00:48:53.720 had a better
00:48:54.220 handle on
00:48:54.820 who they
00:48:55.100 could trust
00:48:55.600 and who
00:48:55.880 they couldn't.
00:48:56.380 Well, I don't
00:48:57.780 know about
00:48:58.160 giving a gun
00:48:58.840 to somebody
00:48:59.320 you conquered
00:48:59.860 yesterday.
00:49:02.100 All right, yesterday, yesterday we were shooting at you, really tried hard to kill you, killed a lot of people in your unit, but I'll tell you what, we're going to give you a gun, it's going to have real ammo, and you're going to be on our side now, and if you don't, we'll shoot you.
00:49:18.360 And you're like, cluck, cluck, and you're holding this gun, and you're like, okay, you're the guys who murdered all the people in my troop, and we're going to go out in that firefight where nobody knows where any bullet is coming from, and you think I'm going to be shooting at my team?
00:49:36.680 Explain to me how that works, because you were shooting at me yesterday.
00:49:42.860 One of those bullets went right by my head.
00:49:45.760 I don't think I'm going to be shooting in that direction.
00:49:49.200 I mean, I might point my gun there until the shooting starts, but as soon as the shooting starts, I'm going to be like...
00:49:57.980 Now, to me, it looks like fake news.
00:50:02.700 Fake news that's close enough to real news, in that in places they've controlled long enough, maybe they can get people they trust into the military.
00:50:12.740 It just feels propaganda-ish to me.
00:50:15.720 I wouldn't buy that at face value.
00:50:20.100 All right.
00:50:21.840 Rasmussen has a poll asking people if women should fight on the front lines and perform all combat duties.
00:50:27.620 59% now say yes, that women should fight on the front lines.
00:50:32.260 Now, some of you will interpret this as men becoming more woke, or maybe women too, and that there's much more equality, and this is, you know, we're not there yet, not to 100% of people saying it would be fine no matter what your gender.
00:50:52.780 But you can see that there's a big push toward much more wokeness and open-mindedness.
00:50:59.820 No, that's not what's happening.
00:51:01.100 I think people just hate women.
00:51:03.900 And I think there are just more people who, when you say to them in a poll, do you think women should go to the front lines and be torn to pieces by shrapnel?
00:51:12.020 I think just more people are willing to say, yeah, fuck them.
00:51:18.040 I don't know. I just don't know.
00:51:20.620 I just don't know that this survey is picking up exactly what you think.
00:51:26.480 I'm not saying that's my opinion.
00:51:28.720 I'm just saying we're living in such a contentious world that I don't know that equality is what's driving this number.
00:51:36.020 I think it's kind of, yeah, I don't care.
00:51:40.260 What about all the incels?
00:51:42.080 All the men who have given up women forever because they can't get dates, and it's only the top 2% of men on Tinder who are getting all the action?
00:51:49.760 You don't think some of those men, who have been completely abandoned by women in the sense that it's useless for them to even try to get one, you don't think that they're saying, yeah, I don't care?
00:52:00.860 You can send them all to the front line, I don't care.
00:52:03.740 It wouldn't affect my life one way or the other.
00:52:08.300 All I'm saying is you have to be careful about how you interpret these things.
00:52:14.380 There's a story about Taylor Lorenz, a reporter for the Washington Post, and I don't care.
00:52:19.760 So I'm going to skip that one.
00:52:22.200 But I want you to know I heard about it.
00:52:24.360 It just isn't interesting.
00:52:27.080 Once again, I seem to be in every story.
00:52:31.680 There's a story in the Evening Standard, a British publication, about MPs that are going to vote on whether Boris Johnson should be investigated for misleading Parliament, about I don't know what.
00:52:43.620 And the story leads off with a story about me.
00:52:46.460 But what the hell? What the hell?
00:52:49.860 How do I get in this story about Boris Johnson and some damn thing?
00:52:53.860 But they were quoting some past thing that I talked about.
00:53:00.360 Student debt.
00:53:01.460 52% of likely U.S. voters support Biden's plan to cancel student debt.
00:53:06.740 So that makes it a good idea, right?
00:53:09.780 Politically, this would be good for Biden because the majority of people agree with it, right?
00:53:14.600 Is that right? Because the majority agree, so therefore it would be a good thing to do.
00:53:18.860 Because that's how politics works. If the majority like it, you want to do it, right?
00:53:25.660 No, not in this case.
00:53:27.900 Do you know why?
00:53:29.880 Because there are too many people who will really, really be mad about this.
00:53:34.340 You don't want to do something that's good for people if it makes even a minority of people so mad that they'll change their vote.
00:53:42.560 I don't think too many people are going to vote for Biden just to save money.
00:53:48.000 I don't think too many people are going to say, well, if I get Biden, I'll save some money.
00:53:51.580 But I'll bet there'll be a lot of people who vote against him if he does it.
00:53:54.960 I would be kind of mad.
00:53:57.640 Yeah. I've paid off somebody else's student loans, actually.
00:54:02.940 So I helped a, let's say, a loved one pay off student loans.
00:54:07.980 And I'd be pretty pissed if I wasted my money. Pretty pissed.
00:54:12.640 All right.
00:54:16.360 I've got to imagine that there are some Democrats who have paid off their own student loans who are not going to like this at all.
00:54:23.860 I feel like there was something I missed.
00:54:26.100 Now, have I solved all the problems?
00:54:28.520 Let's see. What did I promise you?
00:54:30.940 I promised you that I would solve climate change. Okay.
00:54:36.840 Fake news I solved with a Twitter board, if Musk buys them.
00:54:41.020 Systemic racism through mentoring.
00:54:42.960 Oh, and then poverty.
00:54:44.160 Well, poverty will be solved by the small nuclear plants making energy costs go down.
00:54:50.260 If you fix the fake news, then you can get real solutions instead of fake ones.
00:54:54.820 And if you fix systemic racism with the mentoring, that eventually fixes poverty.
00:55:00.080 So I believe I got all four.
00:55:01.600 Climate change, fake news, poverty, systemic racism, in one live stream.
00:55:06.580 And I did that on 420.
00:55:08.300 And let me tell you, I couldn't have done it without the simultaneous whatever.
00:55:16.140 So how many people think I delivered? Did I deliver?
00:55:22.900 Have I delivered four solutions, not that I made up myself, but four solutions that actually would work?
00:55:30.860 That's right. Nailed it.
00:55:32.920 Now, that doesn't mean these will all be implemented just the way I said, but we do have the solutions.
00:55:38.300 Isn't it fun to know that even if they don't get solved, we do know how?
00:55:45.120 We do know how to solve all this stuff now. We just have to do it.
00:55:50.620 All right.
00:55:51.760 No doubt this is the highlight of all live streams ever. The best thing that anybody's ever done anywhere.
00:55:59.280 And probably you would like to close with one more simultaneous whatever.
00:56:08.280 And those of you watching asynchronously, as in recorded, you want to go with this one.
00:56:15.560 You want to join us. It's going to make you feel better.
00:56:19.320 You ready? Let's do the simultaneous whatever.
00:56:28.340 I'm being criticized for not curing wokeness.
00:56:30.960 I think wokeness is curing itself.
00:56:35.720 Wokeness seems self-curing.
00:56:39.300 Wokeness simply has to go too far.
00:56:42.180 Can you count on wokeness to go too far?
00:56:45.380 Oh, yeah. Oh, yes, you can.
00:56:49.820 And when it goes too far, and it has, then it will bounce back.
00:56:57.780 Yeah, the thing that will kill wokeness is this realization: that wokeness is hurting the people it's meant to help.
00:57:10.840 Let me give you a general concept that is really important.
00:57:15.560 In the early days of the greatest discrimination, you wanted to use the harshest tool, like a civil war to end slavery, for example.
00:57:24.460 And then civil rights laws were a pretty big tool.
00:57:27.280 You could put people in jail for violating people's civil rights and sue people for stuff. That's a pretty big tool.
00:57:33.620 And that was important because the problem was that size.
00:57:37.720 But as you get down to more subtle things, like who has a mentor, and systemic racism and stuff, at that point you're talking about a strategy difference, because that's really what we're down to, a strategy difference.
00:57:54.340 Can you find a mentor?
00:57:56.200 Because if you can, everything else is in place. It's just a strategy: just get that mentor.
00:58:02.920 So once you've got it down to a strategy, the wokeness is counterproductive.
00:58:08.940 Do you want to be a young black man who gets a good job, and everybody around you thinks it was affirmative action?
00:58:17.780 Messed up, right? That's messed up.
00:58:20.320 That works against you. That works against you.
00:58:23.120 So at what point do you get past the discrimination part? Although discrimination and bigotry will always exist, right? There's going to be a baseline of that.
00:58:33.340 But the actual problem part, economically, just economically, is all just down to a strategy now, a really simple one.
00:58:41.740 Connect mentors to the people who need them, and then take their advice. That's it.
00:58:45.500 So I would say that this is the time when wokeness will overshoot, because it's just making a mockery of things that now have solutions, solutions that are more like strategy and less like, oh, there's a gigantic injustice that somebody is perpetrating, specifically and consciously.
00:59:08.280 It's not that anymore.
00:59:09.780 And that, ladies and gentlemen, is the end of the show for YouTube.
00:59:13.060 You're all wonderful, and man, I can't help but mention this.
00:59:18.100 You're sexier than I've ever seen you before.
00:59:21.260 My goodness, you look good.
00:59:23.260 You must have gotten a good night's sleep last night.
00:59:25.620 Bye for now.