Real Coffee with Scott Adams - March 27, 2023


Episode 2060 Scott Adams: TikTok Shows Us Who Is Bought Off, CRT Lowers Black Test Scores?


Episode Stats

Length

59 minutes

Words per Minute

146.75

Word Count

8,803

Sentence Count

770

Misogynist Sentences

12

Hate Speech Sentences

19


Summary

A man built a bike that shoves a pole up a thief's ass, flying taxis are coming to Chicago, and Microsoft is building a phone around artificial intelligence, and it's going to be a phone with no apps.


Transcript

00:00:01.000 Good morning everybody, and welcome to the highlight of civilization that's called Coffee
00:00:05.960 with Scott Adams.
00:00:06.960 There's never been a better time.
00:00:09.360 And if you made the mistake of watching anything else at this time, well, you're probably regretting
00:00:14.840 it already.
00:00:15.840 Because you can take this experience up to levels where nobody's ever seen it before.
00:00:22.920 And all you need is a cup or a mug or a glass, a tank or a chalice or a stein, a canteen
00:00:26.400 flask, a vessel of any kind, fill it with your favorite liquid. I like coffee.
00:00:32.400 And join me now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes
00:00:35.800 everything better.
00:00:36.800 It's called the simultaneous sip and it happens now.
00:00:40.360 Go.
00:00:41.360 Ah, that was a strange sip, strange but beautiful, graceful, elegant, classy, really.
00:00:54.080 All right, well, we got a lot going on here, so I'd like to start with my favorite story,
00:01:01.080 which I will look for on Twitter to show you if you haven't seen it already.
00:01:04.600 There was some gentleman, I'm not sure where it was, it might have been an African country,
00:01:08.600 couldn't tell from the background.
00:01:10.960 But there's a clever gentleman who rigged a bicycle so that if you tried to steal the bicycle,
00:01:17.480 and it was left unlocked in an easy place to steal, the seat would collapse when you got
00:01:23.320 on it and the pole that normally holds the seat would go right up your ass.
00:01:29.560 So you would be shoving a metal pole up your ass with the force of your own body weight when
00:01:34.160 you sat on it.
00:01:36.480 If you don't think that's funny, well, you don't know me, because there's a compilation
00:01:46.200 video of all the people sitting on it.
00:01:49.160 Now, the funny part is watching them, none of them steal the bike, they all walk around
00:01:54.880 like this.
00:02:04.600 The thing I love about it, the thing I love about it is that none of them actually steal
00:02:09.720 the bike.
00:02:10.720 Theoretically, you could still ride the bike or push it away and get the seat fixed,
00:02:16.760 but everybody is completely done with the bike after it shoves a pole up their ass.
00:02:23.200 I still want to get that bike or build my own.
00:02:26.880 That's just the funniest thing ever.
00:02:28.960 Well, Chicago is going to have flying taxis.
00:02:33.560 That's really going to happen.
00:02:35.040 So if you go to the Chicago airport and you're heading to the middle of the city, in maybe
00:02:41.320 a year or so, United will allow you to take a short hop flight in a vertical takeoff plane.
00:02:50.780 So the plane will just go straight up, over and straight down in the middle of the city.
00:02:55.700 And apparently they've already purchased the hardware and they've got the plan and it's
00:03:00.140 actually going to happen.
00:03:01.200 So there will be flying taxis in a year.
00:03:04.300 It's not flying cars.
00:03:05.300 I think it takes, you know, six people or something.
00:03:08.600 Yep.
00:03:09.600 That's pretty exciting.
00:03:10.600 So you miss all of the traffic, at 150 miles an hour.
00:03:16.280 Can't wait.
00:03:17.280 There's your flying cars, kind of, but flying taxis.
00:03:21.280 All right.
00:03:22.280 I saw an interesting prediction from Naval Ravikant.
00:03:28.280 And if you don't know who Naval is, the only thing you need to know is if he predicts something,
00:03:33.480 pay attention.
00:03:34.480 All right.
00:03:35.480 That's my whole commercial.
00:03:37.400 If he says something, you should look at it.
00:03:39.820 You should pay attention to that.
00:03:41.440 But anyway, here's his prediction on Twitter.
00:03:43.940 He said, Microsoft ships a phone built around AI by the end of the year.
00:03:51.500 And then he says, look out Apple and Google.
00:03:55.260 So what do you think?
00:03:56.400 So Microsoft has some ownership of this big ChatGPT, OpenAI, whatever it's called.
00:04:03.860 And so they, their Bing search engine already uses it.
00:04:09.240 And Naval is thinking that they might build it into a phone.
00:04:13.360 Now, how long have I been telling you?
00:04:16.640 Probably five years or more, I've been saying.
00:04:20.800 The ultimate obvious place that the phone interface will go is no apps.
00:04:26.340 No apps.
00:04:27.420 You just start doing what you want to do.
00:04:29.800 The perfect interface for a phone would be a blank screen.
00:04:33.840 Just a blank screen.
00:04:35.580 And if you say to it, hey, make a spreadsheet and add up these numbers, then it just creates
00:04:43.400 a spreadsheet right in front of you.
00:04:45.480 Or if you say, send a message, it just creates the app and sends the message and then deletes
00:04:51.340 the app.
00:04:52.340 Or it could save it, I suppose.
00:04:53.900 But you'll never have to worry about apps.
00:04:55.240 You just tell it what to do and it goes and figures out how to do it.
00:04:58.460 I think that's what's going to happen.
00:05:00.660 I think your phone is just going to be a blank screen and you talk to it.
00:05:05.220 Alternately, here's how I would have designed the phone already.
00:05:09.480 I would have designed it so that you pick it up and start doing the thing you want to
00:05:12.600 do.
00:05:14.440 Just start doing it.
00:05:15.720 If you want to search for something on Facebook, you just type in blah, blah, blah, and then
00:05:23.920 you just type in the search term.
00:05:26.860 And then as you type it in, the AI says, well, I don't know what this is about.
00:05:31.300 He could be writing an email, could be sending a message, or it could be a search term.
00:05:37.360 So it would pop up several choices.
00:05:40.400 And as soon as you typed it in, you'd say, the best place to eat in San Francisco.
00:05:47.060 You type it in.
00:05:48.820 And among the choices are a Google search, a Bing search, or an email.
00:05:53.380 But you know it's a search, so you just hit boop.
00:05:56.040 So interfaces are backwards.
00:05:58.220 You should never have to deal with the app before you do the activity you wanted the app
00:06:02.600 for.
00:06:03.500 You should start the activity and then AI should figure out what apps or app would make sense
00:06:09.520 with what you're doing.
00:06:10.720 So it should look at your context and then figure out the app for you.
00:06:15.460 Imagine how much less thinking that would be.
00:06:18.420 Do you know how much time?
00:06:19.820 Why type?
00:06:20.820 Well, you would type because people are listening, so you can't always speak out loud to your phone.
00:06:29.740 That's the way it's going, right?
00:06:31.740 There's no way that 10 years from now, you're going to be selecting an app and then telling
00:06:36.520 it what to do.
00:06:37.860 There's no way that that could last.
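(A minimal sketch of the "start the activity, let the software figure out the app" idea described above, written in Python. The classify_intent helper and its keyword rules are made up for illustration only; this is not any real phone or assistant API, just a toy guess-the-intent router under those assumptions.)

```python
# Toy sketch of an intent-first interface: the user just starts typing, and the
# software guesses whether it's a search, a message, or an email, then offers
# those as one-tap choices instead of making the user pick an app up front.
# Everything here is hypothetical; no real phone or assistant API is used.

def classify_intent(text: str) -> list[str]:
    """Return candidate intents, most likely first, using crude keyword rules."""
    lowered = text.lower()
    candidates = []
    if any(word in lowered for word in ("best", "where", "near", "how to")):
        candidates.append("web search")
    if lowered.startswith(("tell ", "remind ", "hey ")):
        candidates.append("send message")
    if "dear" in lowered or "regards" in lowered:
        candidates.append("compose email")
    # Always offer the generic options so the user can override the guess.
    for fallback in ("web search", "send message", "compose email"):
        if fallback not in candidates:
            candidates.append(fallback)
    return candidates

if __name__ == "__main__":
    typed = "the best place to eat in San Francisco"
    print(f"You typed: {typed!r}")
    for rank, intent in enumerate(classify_intent(typed), start=1):
        print(f"  {rank}. {intent}")  # user taps one; the "app" appears on demand
```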
00:06:41.800 All right.
00:06:44.320 Twitter is fun again.
00:06:46.620 So I guess Musk is giving some stock grants to employees, which value the company at about
00:06:54.620 half of what he bought it for.
00:06:56.020 So he bought it for $44 billion.
00:06:58.980 He's valuing it, at least in terms of valuing the stock options, at $20 billion, but he suggests
00:07:04.920 that in 10 years or so it could be worth $250 billion and that there's a difficult but very
00:07:13.960 doable path to $250 billion valuation.
00:07:16.920 What do you think?
00:07:19.860 If he gets to $250 billion, he'd be the richest person times three, I think. I think it's possible.
00:07:26.680 Yeah.
00:07:27.680 I think it's possible.
00:07:28.680 Because to do that, he would have to fold in different functions like payments, you know,
00:07:33.720 and have a better advertising situation.
00:07:38.440 Yeah.
00:07:38.700 I think that's all going to happen.
00:07:40.320 Do you think there'll be a Twitter phone?
00:07:43.360 Could be.
00:07:45.240 Yeah.
00:07:45.980 Could be.
00:07:46.480 You know, it's also easy to imagine that you would get Twitter plus Starlink, somehow
00:07:55.580 there'd be some kind of combined deal or something.
00:07:57.740 You could imagine that happening.
00:07:59.180 Anyway, I do think that there are paths to $250 billion valuation.
00:08:04.180 But the funniest thing about Twitter is that they closed down their press contact.
00:08:10.320 So it used to be if you emailed press at twitter.com, you could ask a question if you were the press.
00:08:18.320 And now if you email, you get an automatic response from press at twitter.com of a poop emoji.
00:08:29.060 That's all you get.
00:08:30.340 It just gives you an automatic poop emoji.
00:12:35.280 Now, you tell me how that could be more perfect.
00:08:38.640 No.
00:08:39.540 Nope.
00:08:40.320 That is the perfect interface.
00:08:45.140 Well, Trump is saying out loud that he thinks that Bragg has already dropped the Stormy Daniels case.
00:08:52.540 He told reporters on his plane.
00:08:54.820 What do you think?
00:08:55.980 Do you think the charges are already dropped?
00:08:58.260 I guess the grand jury's meeting again today.
00:09:01.900 I don't know if it's already dropped, but I don't see how it could go forward.
00:09:05.680 Alan Dershowitz has gone so far as to say that the case is weak because the main guy who would probably testify would be Michael Cohen.
00:09:18.500 And now Michael Cohen's own lawyer has produced a document that would show that Cohen either is a liar or was a liar, and it's going to end up looking the same.
00:09:29.960 Because if he's a liar and you think he's going to say the opposite of what he said in writing that he believed, you could get disbarred.
00:09:39.860 Because you could get disbarred for knowingly putting a liar on the stand.
00:09:45.100 If it's his witness and he knows he's a liar and he needs that lie to make his case and he puts him on the stand, he could be disbarred.
00:09:54.120 Now, I think you'd have to prove he knows it's a lie, so I don't think it could really happen.
00:09:59.400 But Dershowitz is putting on the pressure.
00:10:02.440 Like, well, you know, there's a possibility that this could end with Bragg being the one who loses his job instead of Trump.
00:10:10.360 I like that he put it out there.
00:10:11.800 I don't think it's likely, but I like that it's out there.
00:10:14.460 So I'm going to predict that the charges will not go forward.
00:10:19.320 What's your prediction?
00:10:21.400 That one way or another the charges will not proceed.
00:10:25.860 And I think that if it were still politically good, but legally sketchy, it would go through.
00:10:35.040 But now it's obvious that it's a political disaster as well as a legal disaster.
00:10:39.740 So legally it was always weak, but politically maybe you could, you know, get some points.
00:10:45.320 But now it's obvious that this, even the threat of it made Trump more popular.
00:10:51.900 Even the threat of it.
00:10:53.760 So I tried to play a video on CNN.
00:10:56.340 The tease of it was that Jon Stewart said something in a recent interview along the lines of, this is why Trump got elected.
00:11:07.640 Now, wouldn't you want to know what that video said?
00:11:10.680 What did Jon Stewart say about Trump, this is why he got elected?
00:11:17.040 I mean, it feels like this.
00:11:18.660 It feels like the Bragg situation, the Alvin Bragg situation.
00:11:25.340 But have you noticed that the more you want to watch a video, the less likely it will ever play?
00:11:32.160 You can click on that motherfucker all day long and it won't play.
00:11:36.200 And the same thing on Fox News.
00:11:39.520 Fox News will say, sexy picture of somebody you actually want to look at.
00:11:44.440 You know, sexy bikini picture.
00:11:46.500 And I'll be like, well, I'm here anyway.
00:11:49.820 Got a little extra time.
00:11:51.780 I think I'll just maybe click on that sexy bikini picture and see what all the news is about.
00:11:57.920 Because I'm not the kind who just looks at the headlines.
00:12:00.400 I like to do my own research.
00:12:02.200 That's why I like to click on the stories and get the pictures.
00:12:05.260 And I'll be like, well, okay, if this is the sexiest picture ever from this person, I think I got to see it.
00:12:13.280 Click, nothing, nothing.
00:12:18.500 You want a guaranteed video that won't play?
00:12:21.800 Put a headline on CNN that says, Trump agrees he should not be president.
00:12:27.960 And then whatever that video is, it'll never play.
00:12:31.900 It will never play.
00:12:33.660 Because it's too interesting.
00:12:35.140 So somehow there's some kind of technology that makes anything interesting unplayable at the same time.
00:12:41.480 And it's a direct correlation.
00:12:43.680 If it's some boring-ass story of a general who said, oh, I'm a general, and blah, blah, Ukraine.
00:12:51.240 Oh, that'll play.
00:12:52.660 Yeah.
00:12:53.200 That'll play just fine.
00:12:54.520 First try.
00:12:55.600 Or how about the Pope comes out against war?
00:12:59.640 Oh, that'll play.
00:13:01.160 Uh-huh.
00:13:01.660 That'll play.
00:13:02.120 But if they say, here's a video of Nancy Pelosi having, I don't know, sex with Adam Schiff, caught on video, that won't play.
00:13:19.060 No, you could click on that all day long.
00:13:21.320 That will not play.
00:13:23.000 Nope.
00:13:23.240 So that's how you know.
00:13:27.440 Well, I don't know what you know from that.
00:13:29.080 It's just a fact.
00:13:31.980 All right.
00:13:32.700 I saw an opinion piece on why the school choice movement is working well at the moment, when for so long it didn't get much traction.
00:13:42.800 And a lot of it is being credited to Corey D'Angelo and his strategy.
00:13:49.900 Now, some of it, of course, was the pandemic, right?
00:13:53.040 People got to see Zoom school and see how horrible it was and got more interested in their kids' education and all that.
00:13:59.880 So some of it's that.
00:14:00.960 Some of it's the alleged CRT in classrooms.
00:14:03.160 Some of it's the teaching of young kids too much about sex too early, some people say.
00:14:09.240 So there's lots of reasons why people would be more interested in homeschooling.
00:14:14.720 But the current thinking is that homeschooling is being driven on values as opposed to education.
00:14:23.180 Do you buy that?
00:14:24.740 That feels right.
00:14:26.020 I don't have data to support that.
00:14:27.800 But it feels like people are saying, okay, I was okay when I didn't know if my kid was learning to read and write.
00:14:34.640 But, you know, it seemed like everybody was in the same boat.
00:14:37.040 So, you know, you sort of went along.
00:14:39.140 What are you going to do anyway?
00:14:40.460 But as soon as you find out your kids are being taught that they're either victims or oppressed or that their gender is sort of up to them,
00:14:49.800 then the parents end up getting really involved.
00:14:53.600 So I think when you say, what will you teach my kids?
00:14:59.020 People go, well, you know, I guess I can put up with some imperfection.
00:15:03.240 But when you say, what will I turn my kids into, that's a whole different game.
00:15:10.220 Because the, you know, that's not about math.
00:15:13.160 That's actually turning them into the kind of people that is somebody's idea of a good citizen,
00:15:19.280 but maybe not the parent's idea of a good citizen.
00:15:22.060 So you can see why this is getting energy right now.
00:15:24.560 It's the social part of it.
00:15:26.420 So if you're going to argue it with anybody, I would go with the social argument.
00:15:30.460 Seems stronger than the, they can get better grades if they do this.
00:15:35.740 All right.
00:15:39.260 So I watched a clip from NBC News where Chuck Todd was talking to Senator Warren about TikTok.
00:15:45.840 And Senator Warren talked about the privacy issue and never mentioned the big problem,
00:15:56.480 which is persuasion, which is the Chinese Communist Party can essentially push one button to make anything viral.
00:16:05.120 And that's not even hyperbole.
00:16:08.360 There is actually literally a button called heat.
00:16:13.560 It's actually labeled the heat button where they can make anything viral.
00:16:18.160 So that's a gigantic risk because, I don't know, 150 million Americans use it.
00:16:23.780 And they can make anything a fact because our minds are programmed by what we see and then how often it's repeated.
00:16:32.220 That's it.
00:16:33.220 That's your whole operating system for your brain.
00:16:35.440 So they have control over what you see on TikTok and how often you see it.
00:16:41.740 That's complete control of your brain.
00:16:44.500 That's all it takes.
00:16:45.880 Now, I'm talking on average.
00:16:47.780 I don't mean every single person will be immediately programmed by some memes.
00:16:52.540 I mean that on average, you can move the average reliably by how much you show them of what.
00:16:58.760 Reliably.
00:16:59.800 And you can see that in every poll.
00:17:01.460 If you look at the poll, you know, the Democrats always go one way.
00:17:05.480 The Republicans always go the other way.
00:17:07.220 Even on issues that are not political, that's how you know that you're being programmed.
00:17:14.460 If you are not being programmed by some third party, then when a topic comes up that has no political connection whatsoever,
00:17:23.080 the opinions would be sort of mixed all over the place.
00:17:25.680 But they're not.
00:17:27.600 Every topic just becomes political, which is proof that you're being programmed.
00:17:34.200 So TikTok has that power.
00:17:36.860 And Elizabeth Warren never even mentions that risk.
00:17:39.960 She only talks about data privacy.
00:17:43.060 And Chuck Todd doesn't mention it either.
00:17:46.700 So it's a story about the risk of TikTok without mentioning the big risk of TikTok.
00:17:54.740 Now, how in the world is that an accident?
00:17:58.160 How is that an accident?
00:18:00.100 It couldn't possibly be an accident at this point.
00:18:03.600 At this point, there's no way you can say that's an accident.
00:18:06.760 These have to be two entities that are in the bag for China.
00:18:10.200 This looks like pro-CCP propaganda.
00:18:12.760 Now, I told you yesterday that AOC also came out in favor of not banning TikTok.
00:18:21.640 And her answer looked so obviously bought off that people just said,
00:18:26.380 well, she must be accepting money.
00:18:29.300 Because there's no way you could have that opinion unless somebody just paid you to have it.
00:18:33.640 Because it's dumb.
00:18:34.520 And now we find out that Fox News Digital reported that ByteDance, TikTok's Chinese parent company,
00:18:46.240 funneled six-figure contributions to non-profits aligned with the Congressional Black and Hispanic Caucuses.
00:18:53.600 Huh.
00:18:54.560 Didn't we just see a member of the Black Caucus saying we shouldn't ban TikTok?
00:19:01.420 I think I just saw that recently.
00:19:03.380 Or somebody who was aligned with them.
00:19:06.780 And so they gave $150,000 to these two, the Black Caucus and the Congressional Hispanic Caucus Foundation.
00:19:14.940 Huh.
00:19:15.740 I wonder if AOC has any connection to the Hispanic Caucus Institute.
00:19:21.980 Oh, yeah.
00:19:22.660 She's a member.
00:19:24.840 She's a member.
00:19:28.200 Any other questions?
00:19:29.320 There's your answer.
00:19:33.240 There's your answer.
00:19:34.220 China just bought her.
00:19:36.540 Not only did China buy her, but we have the receipt.
00:19:40.780 They left the receipt.
00:19:43.440 It's not even in question.
00:19:46.060 You know, usually it's sort of speculative.
00:19:48.500 Like, I don't know, how did Elizabeth Warren, you know, benefit in any way?
00:19:52.160 I don't know.
00:19:52.940 Maybe she just likes these two caucuses, too.
00:19:55.700 Or wants their support or something.
00:19:57.220 But there it is, plain as day.
00:20:01.640 Now, the beauty of the TikTok story is that there is only one right answer.
00:20:06.800 And the right answer is to ban it.
00:20:08.980 And there is no argument among people who are willing to describe the risk.
00:20:14.360 So let me say that again because it's important.
00:20:16.300 There's nobody who can describe the risk of TikTok who thinks it should be legal in America.
00:20:24.520 Everybody who can say there are two risks.
00:20:26.900 One is data security.
00:20:28.580 And the other one is persuasion.
00:20:30.860 Anybody who can say those two things out loud also says ban it.
00:20:34.620 Anybody who doesn't mention the persuasion risk is, mostly, in favor of keeping it.
00:20:41.080 But they know.
00:20:42.660 They know there's another risk.
00:20:44.840 And that means you can tell for sure.
00:20:47.200 And this has never been true before.
00:20:49.460 But you can tell for sure who's been bought off by China.
00:20:52.180 Now, when I say bought off, I don't mean directly.
00:20:55.600 It could be, you know, funding for a caucus, something like that.
00:20:59.180 But it's very clear that anybody who's still in favor of keeping TikTok is just bought off.
00:21:06.980 I've never seen one this clear before, have you?
00:21:10.380 Usually there's some argument on both sides.
00:21:12.740 But what makes this an interesting one is there isn't.
00:21:15.920 There is no argument on both sides.
00:21:18.240 There is one argument.
00:21:19.400 A hundred percent of people are on the same side if they can state the argument out loud.
00:21:26.480 If they can't state it out loud, it's intentional.
00:21:29.520 Because they know the argument at this point.
00:21:33.460 It's amazing that any of these people keep their jobs.
00:21:38.120 Well, here's an interesting update on the reparations situation in California.
00:21:42.880 Now, there are two reparations stories in California.
00:21:44.840 One is what San Francisco came up with, which is, you know, super crazy time.
00:21:50.120 Five million per descendant of black slaves, a one-dollar house you could buy, and $90,000 per year.
00:21:57.780 But there's a lesser but still crazy one for the state itself.
00:22:03.100 And the news is that Governor Newsom has been quiet and not weighed in on the recommendations.
00:22:09.420 Now, I started by telling you that he was being brilliant, because by telling the committees to go work out a recommendation, I said to myself, oh, that's brilliant.
00:22:20.660 They'll come back with something that's so stupid that he can easily ignore it without being the one who turned it down.
00:22:27.980 And I thought, that's pretty good.
00:22:29.880 So here's what I got totally wrong, if you like it when I admit I'm wrong.
00:22:35.100 Oh, my God, was I wrong.
00:22:37.540 Here's why.
00:22:39.460 Apparently, there's no level of stupidity that Californians will recognize as stupid.
00:22:47.340 I thought it would be so far over the line of reasonable that we would just laugh at it and ignore it.
00:22:55.100 And then that would be the end of it.
00:22:56.340 But apparently, there are enough people in the state, I'm guessing, especially the black citizens of California, who are treating this seriously.
00:23:08.860 They're actually acting like it's a real thing, which puts Gavin Newsom in a bad spot.
00:23:15.960 Because if he were to approve this, that's the end of his political life.
00:23:21.140 Would you agree?
00:23:22.940 That would be absolutely the end of his political life.
00:23:26.340 Does anybody disagree with that?
00:23:29.320 There's no way he could be a national politician.
00:23:33.080 And that looks like what he wants.
00:23:35.160 Yeah.
00:23:35.560 So he's in a tough spot, tough place.
00:23:38.280 He can't say yes, and he can't say no.
00:23:40.840 Whichever he says will end him politically.
00:23:43.440 And he put himself in that position.
00:23:45.560 So I'm going to take back how clever it was.
00:23:48.020 I vastly underestimated the gullibility of Californians.
00:23:54.020 I mean, honestly, I should have seen it coming.
00:23:57.780 Can you spend a minute just insulting me?
00:24:00.320 Because I have it coming.
00:24:01.860 I think this would be good for all of us.
00:24:04.000 Could you tell me how fucking stupid I was?
00:24:06.840 Just let it out.
00:24:08.620 Just put it all out there.
00:24:10.920 Free pass.
00:24:11.920 Nobody gets blocked.
00:24:12.780 I deserve it.
00:24:15.260 Keep it coming.
00:24:18.360 Idiot.
00:24:19.080 Yes, idiot.
00:24:19.960 Keep it coming.
00:24:22.300 Bring it.
00:24:23.340 Bring it.
00:24:24.460 Yeah.
00:24:24.820 Okay.
00:24:26.060 All right.
00:24:26.760 Thank you.
00:24:27.520 Thank you.
00:24:27.980 Good job.
00:24:29.080 And those who use all caps for their insults, extra credit.
00:24:33.680 Extra credit.
00:24:34.920 Yeah.
00:24:35.620 Totally moronic.
00:24:36.620 Did not see how gullible and stupid my state is.
00:24:42.780 Now, but I'm going to put it in a larger trend.
00:24:47.360 Okay.
00:24:47.580 You can stop insulting me now.
00:24:51.040 All right.
00:24:51.520 You had your fun.
00:24:53.140 Wrap it up.
00:24:54.620 This would be a good time to wrap that up.
00:24:56.280 Wrap it up.
00:24:56.920 Wrap it up.
00:24:58.860 Let's get back on another trail.
00:25:02.520 Here's a related story.
00:25:06.020 Or is it?
00:25:08.420 Or is it?
00:25:10.240 Is this related to that story?
00:25:11.720 You be the judge.
00:25:12.860 You decide.
00:25:13.720 Okay.
00:25:15.220 ESPN did a special video package to celebrate Women's History Month.
00:25:21.940 And they did it by honoring trans swimmer Lia Thomas.
00:25:25.980 So for Women's History Month, ESPN is focusing on trans athlete Lia Thomas.
00:25:32.600 Now, what does that have to do with the California reparations story?
00:25:40.240 Do you see anything that they have in common?
00:25:43.580 Do you see it yet?
00:25:48.740 Ridiculousness is close.
00:25:49.920 It looks like the white people in California have decided to embrace and amplify wokeism in order to end it.
00:26:01.140 Not just California, but wherever ESPN is out of.
00:26:04.080 It looks like there is a secret plan by white men, mostly men, to pretend to be so on board with wokeness that they're going to break the system.
00:26:15.960 Because there's no way women are going to let ESPN get away with celebrating Lia Thomas, a trans athlete, for Women's History Month.
00:26:31.360 There's no way women are going to let anybody get away with that.
00:26:34.700 So to me, it looks like this is like forced shark jumping.
00:26:40.900 You know, when they talk about poorly written TV shows, they say, oh, it jumped the shark.
00:26:46.140 It's a reference to the old Happy Days TV show.
00:26:49.220 And it looks like white men are pushing the shark.
00:26:54.820 They're like pushing you over the shark.
00:26:56.740 It's like, well, if this is where you want to go, let's go there as fast as possible.
00:27:01.280 Let's get there right away.
00:27:02.480 Let's take that slippery slope right to the bottom.
00:27:06.940 So let's find out what those reparations are.
00:27:11.820 Because those are totally practical.
00:27:13.960 Yeah, that'll work.
00:27:15.400 Everything will be fine.
00:27:16.820 It's all fine.
00:27:18.460 Five million dollars per person.
00:27:20.100 Who would complain?
00:27:21.500 What problems could you possibly have?
00:27:23.900 And how about Women's History Month?
00:27:25.680 Oh, well, you know, as Ricky Gervais says,
00:27:29.320 those old classic old style women with vaginas and wombs and shit.
00:27:34.900 No, that's old school.
00:27:37.520 We like the new women.
00:27:38.940 We like the new women with penises or had penises.
00:27:42.860 Yeah, we like that kind of women.
00:27:44.220 So I believe that the next thing you're going to see is a movement by white men
00:27:55.820 to punish white men and white women more severely.
00:28:02.260 Maybe maybe public whippings.
00:28:05.260 Because I think we've got to push this thing until we figure out what is too far.
00:28:09.180 Because I thought we'd already reached too far.
00:28:14.680 But we have not.
00:28:16.400 We've got to keep going.
00:28:18.120 Keep making it more ridiculous.
00:28:21.080 All right, we'll get back to that.
00:28:24.760 Here's why I think ChatGPT and AI will be illegal.
00:28:30.420 Here's why.
00:28:31.080 So I asked AI about President Trump and whether...
00:28:42.280 One of the sample questions on the ChatGPT thing was,
00:28:45.840 did Trump incite January 6th violence?
00:28:49.480 That's a pretty big question, isn't it?
00:28:51.820 Did Trump incite the violence?
00:28:55.580 What do you think AI said?
00:28:57.420 Well, AI didn't want to commit to that interpretation, so that's good.
00:29:04.700 But it described what Trump said
00:29:06.760 and said that some people interpreted it as calling for violence.
00:29:11.220 Guess what they left out of his quotes?
00:29:13.940 What part of his quotes from the speech just before the January 6th bad stuff happened,
00:29:20.120 what part of his speech do you think they left out?
00:29:22.780 They did quote his speech, but they left out a part.
00:29:25.520 It's the part about protesting peacefully.
00:29:30.500 They left out, peacefully and patriotically make your voices heard today.
00:29:36.700 AI left that out.
00:29:38.820 Why do you think they left that out?
00:29:41.900 Do you think it was programmed to leave it out?
00:29:45.580 Well, here's what I think.
00:29:47.380 I don't think it's directly programmed to do that.
00:29:50.660 My understanding is it's a word prediction system.
00:29:56.100 So in other words, it simply looks at how everybody has ever talked ever
00:29:59.740 and then tries to talk the way most people talk.
00:30:04.060 And if the material it trained on mostly said that Trump was not in it for a peaceful situation,
00:30:13.640 then that's what it will say, because that's what people said.
00:30:15.360 If most of the people who talked about January 6th completed a sentence with
00:30:20.800 January 6th, Trump incited violence,
00:30:24.900 if there's more of that than January 6th, Trump called for peaceful protest,
00:30:31.000 it's just going to go with the majority.
00:30:34.260 That's my understanding.
00:30:35.520 I mean, if that's wrong, somebody needs to correct me.
00:30:38.480 But my understanding is you're just looking for word patterns.
00:30:41.700 It's not thinking.
00:30:42.680 It's just looking for word patterns.
00:30:45.280 This word usually comes after that word.
00:30:48.640 I was using some autofill program.
00:30:52.060 I forget what it was.
00:30:53.580 Some app.
00:30:54.500 And I started a sentence, and it started suggesting the next word.
00:30:58.960 And I thought, oh, that is the next word.
00:31:01.540 So I hit it.
00:31:02.840 And then from the first two words I'd written,
00:31:06.540 it suggested three words that are most likely the next word after those two.
00:31:11.040 And one of them was correct.
00:31:13.960 I wrote the whole sentence from the first word without ever typing another word.
00:31:22.480 From the first word, it gave me choices and then narrowed down what I was going to say
00:31:27.840 until by the end it was sort of only one choice for the last word.
00:31:32.200 Yeah.
00:31:32.540 That'll spin your brain around a little bit.
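(A toy illustration of the "this word usually comes after that word" idea described above: a bigram next-word suggester in Python. Real systems like ChatGPT use neural networks over far more context, so this is only a sketch of the basic statistical idea, with a made-up mini corpus.)

```python
# Count which word follows which in a tiny corpus, then suggest the most common
# continuations, the way an autofill keyboard narrows down your next word.
from collections import Counter, defaultdict

corpus = (
    "the best place to eat in san francisco . "
    "the best place to live . "
    "the best time to eat is noon ."
)

follows = defaultdict(Counter)
words = corpus.split()
for prev, nxt in zip(words, words[1:]):
    follows[prev][nxt] += 1  # how often nxt appears right after prev

def suggest(prev_word: str, k: int = 3) -> list[str]:
    """Return the k words most often seen after prev_word."""
    return [word for word, _ in follows[prev_word].most_common(k)]

if __name__ == "__main__":
    print(suggest("best"))  # ['place', 'time']
    print(suggest("to"))    # ['eat', 'live']
```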
00:31:37.460 All right.
00:31:37.900 Here's what I've been saying about AI that will become more and more true the longer we learn about it.
00:31:46.520 It goes like this.
00:31:48.100 We are not learning how to make machines intelligent.
00:31:52.500 That's what you thought we were doing.
00:31:54.580 We are not teaching machines to be intelligent.
00:31:57.780 Here's what we're doing.
00:32:00.540 We're proving we never were.
00:32:04.520 AI can only prove that humans were never intelligent and only imagined that they were.
00:32:11.180 That's what this proved.
00:32:13.140 The way humans think is also word pattern repetition.
00:32:19.320 Here's how you can see it yourself.
00:32:21.220 If you go on Twitter, you know how a person's going to end a sentence because of the way they started.
00:32:28.200 Don't you?
00:32:29.140 You can tell, oh, it's one of those, one of those people, whoever those people are.
00:32:33.460 You could be on either side and still say the same thing.
00:32:36.120 Oh, it's one of those people.
00:32:37.920 You know, the left would say, it's one of those MAGA people.
00:32:40.140 I know what they're going to say.
00:32:41.460 And MAGA people would say, oh, it's a leftist, progressive.
00:32:44.260 I know what they're going to say.
00:32:45.740 And they're right.
00:32:46.860 They're both right.
00:32:47.600 Because both sides are just imagining they're thinking, but they're not.
00:32:54.020 When I tell you that the news assigns you your opinions, and you reject that and say, well, not me.
00:33:02.040 But it probably does assign those opinions to the other people.
00:33:05.160 Probably does.
00:33:05.940 But not me.
00:33:07.020 I use my reason.
00:33:09.280 No.
00:33:10.360 Your opinion is based on the pattern of what you saw the most.
00:33:13.660 And if you watched Fox News and, you know, right-leaning media the most, then when you complete a sentence, you're going to complete it the way they did.
00:33:23.440 And you will believe that you thought.
00:33:26.020 You'll believe that reason happened in your brain.
00:33:28.360 But you'll just be doing pattern recognition exactly like the AI is.
00:33:32.560 What is the most likely end of the following sentence?
00:33:37.060 Okay, I watch Fox News all day.
00:33:38.840 The end of the sentence is, Trump was framed.
00:33:43.960 Framed.
00:33:45.000 Okay, now I go over to watch CNN all day.
00:33:47.520 And I watch CNN and MSNBC and nothing else all day.
00:33:51.260 Finish the sentence.
00:33:53.020 Trump is guilty of multiple crimes.
00:33:57.360 And that's it.
00:33:59.100 That is what you think is thinking.
00:34:01.760 I know you don't believe it.
00:34:03.160 Yesterday, I told you about a study that we've known for decades, that the critical thinking part of your brain doesn't engage until half a second after you decide.
00:34:15.260 Did you know that?
00:34:16.740 The critical thinking part of your brain doesn't even come online.
00:34:21.640 It doesn't even activate until after you've decided.
00:34:24.840 And we've known that for decades.
00:34:26.700 What does that tell you?
00:34:27.760 It means that the decision is based on this irrational pattern recognition thing, just like AI, right?
00:34:35.620 But then we have another process where we rationalize it after the fact.
00:34:40.340 The rationalizing is why the things we say don't seem to make sense, the cognitive dissonance.
00:34:47.580 Every now and then, the things we say do make sense.
00:34:51.940 But it's a coincidence.
00:34:55.400 It's a coincidence.
00:34:56.460 And by coincidence, if the thing you say makes sense, then you don't get cognitive dissonance, because it's all consistent.
00:35:05.720 You only get cognitive dissonance when your pattern recognition and your word fill-in thinking comes up with something that the rest of your brain says,
00:35:15.000 ah, that seems inconsistent, but you have to go with it anyway.
00:35:18.920 So you paper it over with cognitive dissonance, where you literally hallucinate that the things that don't make sense make sense.
00:35:26.460 So, you don't know this yet, but this is the biggest risk of AI.
00:35:35.200 The biggest risk of AI is not what it does.
00:35:39.380 The biggest risk of AI is what you realize about yourself, or people realize.
00:35:45.960 It's going to get rid of religion.
00:35:49.440 It's going to get rid of political preference.
00:35:52.680 You will start to recognize them as programmed effects in your brain.
00:35:57.460 You will recognize that your opinions are not real, and that's going to take some adjustment.
00:36:04.240 Once you realize your opinions are not real, they're not even coming from you.
00:36:08.800 They're just pattern recognition, and that comes from the outside.
00:36:13.040 Once you realize that, everything's different.
00:36:15.740 Now, it won't kill you, because I've been onto that, you know, since I was probably 23.
00:36:27.780 Probably age 23, I first realized that your brain is not a logical engine.
00:36:33.060 When I learned hypnosis, that's the first thing you learn.
00:36:35.500 First thing you learn in hypnosis is that people are not logical engines.
00:36:39.160 If they were logical, you couldn't hypnotize them.
00:36:43.540 Did you know that?
00:36:45.360 Hypnosis wouldn't work if people had logical brains the way you think they do.
00:36:50.400 It works because what I make you focus on, and what I repeat, becomes your operating system.
00:36:58.040 It's just that simple.
00:37:01.340 All right.
00:37:02.040 Here's a reframe that's going to change everything.
00:37:11.880 Back in, let's see, when was it?
00:37:15.860 1982, I think.
00:37:17.860 There was a study, oh no, it was long ago, 1964.
00:37:22.900 It was a study in which a group of students was randomly selected in a classroom,
00:37:30.680 and the teachers were told that they were the smart ones,
00:37:35.220 and so the teachers treated one group like they were gifted,
00:37:39.240 and then treated everybody else the same.
00:37:41.580 What do you think happened?
00:37:43.960 You already know.
00:37:45.540 The group that thought it was gifted because their teacher acted like that,
00:37:50.280 their performance went way up.
00:37:54.320 Now, I was looking for whether that study had been repeated, and I didn't find it.
00:37:59.460 But it is accepted.
00:38:02.360 It is accepted because I see it a lot.
00:38:05.120 But I'd love to know if that's been debunked.
00:38:07.520 I don't think it's been debunked.
00:38:09.180 I believe there's plenty of evidence to suggest that how you expect to succeed is how it'll turn out.
00:38:18.520 Now, you could say that that's a winner's mindset, wouldn't you?
00:38:22.780 If you expect to do well, do you do better?
00:38:26.620 Everybody knows that.
00:38:28.460 Everybody knows that if you expect to do well, you're probably going to do better than if you expect to not do well.
00:38:35.520 Now, is there anybody who doubts the premise before I go on?
00:38:39.560 I need you all to buy into the premise that what you expect to accomplish is really going to make a difference.
00:38:46.700 Look at Elon Musk.
00:38:50.800 He expected that he could build a rocket ship to Mars.
00:38:56.160 Other people expected that they couldn't.
00:38:59.440 The only one who expected he could do it did it.
00:39:02.640 This is very consistent.
00:39:03.900 I expected, I know this is weird, and I know it doesn't make me sound good when I say it,
00:39:12.980 but before I became a cartoonist, I was reading a newspaper, looking at the comics, and I said to myself,
00:39:18.340 I feel like I could do this on the first try.
00:39:22.300 And then I did.
00:39:23.960 I actually expected that with no experience whatsoever, I could become a famous cartoonist.
00:39:29.820 And then I did.
00:39:30.660 Now, were my expectations necessary?
00:39:35.200 Of course.
00:39:36.900 My odds of succeeding in this field were like 1 in 10,000.
00:39:41.160 Or maybe 1 in 100,000.
00:39:42.580 It was just crazy.
00:39:44.020 But I expected it would work.
00:39:46.400 So I did it, and then it worked.
00:39:49.400 Coincidence? I don't know.
00:39:50.900 But I can tell you that in my life, when I expect something to work,
00:39:55.640 I expect it to work.
00:39:58.300 Even if it's irrational.
00:39:59.500 I still expect it to work.
00:40:02.980 You know, some of it is just pure optimism.
00:40:05.180 It's not based on fact.
00:40:06.700 But when I expect something to work, damn it, I expect it to work.
00:40:11.620 When I learned to play drums, I expected it to work.
00:40:16.740 And so a year and a half, two years of trying with no progress at all.
00:40:21.820 Imagine doing something for a year and a half, and you couldn't even make it sound like a beat.
00:40:27.020 I mean, nothing even remotely like music.
00:40:30.180 Didn't stop me for a minute.
00:40:32.180 Did not slow me down.
00:40:33.460 Do you know why?
00:40:34.660 Because I expected to succeed.
00:40:37.480 Now, eventually, once I got limb independence, you know, everything came easily.
00:40:42.180 So now I could probably play just about anything you could play on drums.
00:40:45.840 I would just have to practice that specific stuff.
00:40:48.400 I'm also playing the guitar.
00:40:52.200 Here's the problem.
00:40:54.120 I don't expect I can do it.
00:40:56.760 I don't.
00:40:58.420 I'm playing it as if I'm not believing my own opinion.
00:41:03.220 So I'm still taking lessons, and I'm going to grind away for, yeah, I'll probably grind away for a year or so, no matter what, just to find out.
00:41:11.360 But I don't expect it, and that's a problem, don't you think?
00:41:18.400 If I expected it, do you think I'd try harder?
00:41:21.760 I think I would.
00:41:23.960 I think I would, yeah.
00:41:26.380 So expectations make your performance different.
00:41:31.760 Everybody's on board with that, right?
00:41:33.440 I want to make sure there's nobody who disagrees with that statement.
00:41:38.080 Okay, I think I've got full agreement.
00:41:41.360 All right.
00:41:42.440 Now let's talk about critical race theory and what children are taught in school.
00:41:49.280 Are the white children taught that they can't succeed?
00:41:53.220 Nope.
00:41:54.380 Are the Asian kids taught that they can't succeed?
00:41:58.560 Nope.
00:41:59.900 Are the Indian American kids taught they can't succeed?
00:42:05.000 Nope.
00:42:06.500 Nope.
00:42:07.460 Are the Hispanic children taught they can't succeed?
00:42:09.740 Well, they often, if they're immigrants, they've got a language issue and stuff.
00:42:14.860 So those are real.
00:42:16.660 But otherwise, no.
00:42:18.100 They think they can do whatever they want.
00:42:19.960 If you're a black American kid, do you expect to succeed?
00:42:23.600 Or do you expect that systemic racism will prevent you from success?
00:42:28.260 Well, you're being taught that systemic racism is a barrier that you have that other people don't have.
00:42:34.480 What would be the predictable outcome of telling black kids they have more obstacles to success than white kids?
00:42:44.100 What would you predict?
00:42:46.100 Lower test scores, right?
00:42:48.320 Isn't that the most predictable thing?
00:42:50.320 Is lower test scores.
00:42:52.580 And sure enough, there are lower test scores.
00:42:54.160 Now, isn't the purpose of CRT to improve the lives of black kids if it's in school?
00:43:04.500 Now, of course, there's the argument whether they teach CRT or they just talk about the same elements of it in different ways,
00:43:11.460 which is, I'm not going to say that's different.
00:43:13.180 It's all the same, right?
00:43:15.980 So here you have something that scientifically 100% of people would agree is bad for black kids.
00:43:25.800 Would you agree?
00:43:26.600 Do you think that there's any psychologist, any, black, white, Asian,
00:43:34.280 do you think there's any trained psychologist who would say it's good for black kids to learn that they have an extra thing preventing them from success?
00:43:43.980 I don't believe anybody who has a degree or any credentials in psychology would be in favor of teaching some kids that they're not going to succeed.
00:43:54.800 Now, they don't say it that way.
00:43:56.300 They don't say you're not going to succeed.
00:43:58.120 But that's sort of the message.
00:44:00.340 You know, you're in a class of people who have this special problem.
00:44:03.580 You'll always have this problem.
00:44:05.340 No matter what you do, you'll run into this problem.
00:44:09.040 If you told me that every day, I don't know that I would try so hard.
00:44:13.180 I would just figure out if something wasn't working, after a few years of plugging along,
00:44:18.200 I'd say, well, they were right.
00:44:20.860 This systemic racism is keeping me from success.
00:44:24.200 But it might have been that third year of plugging that you needed.
00:44:27.020 I don't see any scenario in which CRT is not super harmful to black Americans.
00:44:35.400 What do you say?
00:44:37.100 Anybody disagree?
00:44:37.800 Is there any counter to that at all?
00:44:45.220 I don't see any.
00:44:47.660 And yet we're okay with that.
00:44:50.540 We're okay with it.
00:44:51.820 I think the way to make the CRT stuff disappear is that you should call it what it is.
00:44:57.820 It's a way to suppress black progress.
00:45:00.360 CRT, it wasn't designed that way.
00:45:04.640 It's nobody's intention.
00:45:06.220 But it's clearly going to do it.
00:45:08.520 Clearly.
00:45:09.600 And even those, the black students who succeed anyway, are they better off?
00:45:18.500 Do you think a black successful person is better off knowing that the people who are looking
00:45:26.480 at their success are thinking it was probably some kind of favoritism?
00:45:33.100 It's not really good for people who are actually successful.
00:45:36.620 I would be really pissed.
00:45:38.240 Imagine how much I would hate it if, you know, I had a successful career and people would just
00:45:45.440 look at me and say, yeah, because of your race, you got a little boost there, didn't
00:45:49.560 you?
00:45:50.420 Got a little extra.
00:45:52.120 I wouldn't like it at all.
00:45:55.540 All right.
00:45:58.060 So that's the way to, I think, fix all this.
00:46:01.160 Now, let me get a little more controversial by quoting, I'm following an account on Twitter
00:46:08.320 by Tyrone Williams.
00:46:11.240 His Twitter thing is immune hack.
00:46:15.580 All one word, immune hack.
00:46:18.680 Now, he doesn't have that many followers, a few thousand, 6,000 followers, something like
00:46:22.260 that.
00:46:22.420 But he is pushing a message that the difference in black performance compared to other races
00:46:32.980 in America needs to be looked at more deeply.
00:46:39.320 And I think his complaint is that if you look at just poverty and you look at just any IQ,
00:46:48.260 you might be missing the real reasons for the difference.
00:46:50.580 So Tyrone, this is not me, right?
00:46:54.900 If I tweeted this, I'd be double canceled.
00:46:57.340 But Tyrone is trying to use what I'll call some tough love.
00:47:02.180 So my take is that Tyrone is really trying to help.
00:47:06.460 And he's putting himself out there at great risk because he probably thinks, I think, I
00:47:12.200 mean, this is my, just my impression of it.
00:47:13.700 I'm not a mind reader.
00:47:14.800 But it looks like he's just trying to help.
00:47:16.660 And he's trying to help the black community in particular with some tough love and tough
00:47:22.840 honesty.
00:47:24.200 But he's also making very useful, very useful additions to the conversation.
00:47:31.500 For example, he said, this is his tweet, Tyrone Williams.
00:47:36.380 He says, blacks are less likely to, one, optimize their prenatal diet, exercise and sleep.
00:47:42.280 Now, I don't know if that's true, but he seems to have looked into it because he's kind of
00:47:47.220 data driven.
00:47:48.600 But he says that, optimize their prenatal diet, exercise and sleep.
00:47:52.440 Less likely to breastfeed, less likely to read to the kids, less likely to ensure their
00:47:57.280 kids attend school, stay out of trouble, and are proficient in math and reading, and less
00:48:03.220 likely to do well on tests.
00:48:05.640 And then he says, but they expect equal representation at top universities.
00:48:09.360 So that's the tough love part.
00:48:12.400 So the useful part is, and I would really like to know about this, haven't we determined
00:48:18.020 that breastfeeding improves your intelligence?
00:48:21.260 Am I wrong that that's scientifically demonstrated?
00:48:26.140 We know that, right?
00:48:27.860 Yeah.
00:48:28.720 And if it's true that there's a difference, well, there's a very specific lever that you
00:48:35.320 could go after.
00:48:35.960 You know, there might be an education thing, it might be a practical thing, too.
00:48:40.180 There may be some practical reason that some people can do it and some can't.
00:48:44.020 So that's really useful.
00:48:47.540 To me, that's useful.
00:48:49.080 It's a specific thing you could target, and you could say, let's do better on this.
00:48:53.260 Reading to your kids, something you could target.
00:48:56.780 Optimizing prenatal diet.
00:48:58.600 Well, I would think that this is a problem for everybody who is low income.
00:49:01.440 I don't know if there's any racial difference in exercise and sleep.
00:49:07.520 I don't know that.
00:49:09.180 But if there is, it's probably something you could find out.
00:49:13.100 If there is a difference, those things would, in fact, have an impact on IQ.
00:49:17.620 We know that proper exercise helps your IQ.
00:49:20.920 We know that diet helps your IQ, don't we?
00:49:22.840 Is there any doubt that these three things would help your IQ?
00:49:26.740 Breastfeeding, better prenatal diet, and more focus on exercise and sleep.
00:49:33.800 And that's pretty much guaranteed stuff, right?
00:49:36.660 So this is why Tyrone's so useful.
00:49:38.980 He's going for solutions, not politics.
00:49:42.360 And then, you know, ensuring your kids stay in school and do well in tests and stuff like
00:49:46.040 that is pretty basic stuff.
00:49:47.900 So I would say we have several very promising levers, and also somebody mentioned lead in paint might
00:49:59.240 be another factor that is disproportionately affecting black Americans.
00:50:03.800 So there might be some environmental, nutritional, mindset things.
00:50:10.380 And CRT and the, you know, the message that white people are victimizing you and holding
00:50:17.100 you down is almost certainly reducing test scores and success.
00:50:22.740 Now, if you want it to be useful, these are things you could really make a difference on.
00:50:28.020 Like, you could actually move the needle on all of those things.
00:50:30.540 And then find out, find out what's what.
00:50:34.720 You know, does it close the gap?
00:50:36.140 Does it not close the gap?
00:50:37.320 I'd like to know.
00:50:40.380 All right.
00:50:45.580 Yeah.
00:50:46.840 The growth mindset tempers the effects of poverty on academic achievement.
00:50:52.680 All right.
00:50:52.940 So here's something from, I forget where I saw it.
00:50:57.960 But basically, it's saying that your mindset of whether you can succeed is one of the biggest
00:51:03.800 important things for success.
00:51:06.780 Now, where do people get a mindset?
00:51:08.240 Where do you get your mindset from?
00:51:12.420 Where do you think that comes from?
00:51:14.880 Do you think you're born with it?
00:51:18.320 Now, I've seen a lot of studies that say having a single father gets you a good result, but
00:51:26.900 having a single mother doesn't.
00:51:28.760 Have you seen that?
00:51:29.480 I don't know if it was limited to black population, but it might have been.
00:51:34.300 But the study showed that if you only had one parent and it was a mother, you'd have some
00:51:40.660 issues.
00:51:41.760 But if the one parent is a father, that the kid would perform as well as anybody who had
00:51:47.460 two parents.
00:51:47.920 I'm a little skeptical about that, because I think there's a selection bias in that.
00:51:55.760 The selection bias being that if the dad is capable of and wants to be a dad, that probably
00:52:04.360 sorts people into the better category automatically.
00:52:07.980 So I'm not sure you're seeing a father effect as much as a filtering effect of who decides
00:52:17.000 to be the father, who decides to be a single father in the first place.
00:52:20.760 I think that's a filtering effect.
00:52:22.500 All right.
00:52:30.860 Okay.
00:52:34.980 Mothers matter, fathers matter, yeah.
00:52:34.980 But I think mindset comes from your peers.
00:52:38.800 It comes from your parents.
00:52:42.040 But where does success mindset come from specifically?
00:52:47.940 I wonder if that does come from dad.
00:52:49.940 Now, in my case, it came from my mother.
00:52:56.180 So I can't speak to any other situation.
00:52:58.960 But in my case, my mother was the one who said, you're going to college from the time
00:53:03.200 I was a fetus.
00:53:04.740 Told me I'd be successful from the time I could understand language.
00:53:08.760 Always did.
00:53:09.900 And, you know, when I was a teenager, I was already consuming self-help books.
00:53:17.020 Have I ever told you that?
00:53:17.900 By the time I was a teenager, and certainly by my early 20s, I was consuming everything
00:53:24.100 I could about how somebody got famous or how they got successful.
00:53:29.080 I would read every story about somebody who started with nothing and made it.
00:53:33.220 Every story.
00:53:34.300 I'd read every book that said, here's the secret to success.
00:53:37.900 I just absorbed it all the time.
00:53:39.960 Now, where did that come from?
00:53:41.980 Where did I get that habit?
00:53:43.400 Because all I remember is thinking, oh, wait a minute, there might be a formula for success.
00:53:50.380 Are you telling me if I just learned the formula that I would be successful?
00:53:54.820 You know, assuming I'm a functional person.
00:53:57.800 And so I came to believe that if I learned the formula that I could be successful.
00:54:03.320 So I spent years and years trying to figure out the formula, you know, putting it together.
00:54:08.800 And that's where I came up with the talent stack idea of combining useful talents.
00:54:14.480 It's by studying other people's, you know, ways, looking for patterns.
00:54:18.340 And it's where I came up with this system is more important than a goal.
00:54:21.400 That you have to be doing something every day to improve your odds in general, not just working toward this one goal in a straight line way.
00:54:32.640 Maybe you thought of something without being assigned to you.
00:54:35.080 I don't think so.
00:54:36.200 I think it came from my mother.
00:54:37.920 Because my mother was always the go to school and, you know, it's all within your power to have whatever you want.
00:54:44.700 I mean, my mother would say the old 50s and 60s thing.
00:54:48.340 You can be whatever you want.
00:54:49.760 So my mother was always, if you put it in the work and you learn how to do the work right, you know, college or anything else.
00:55:00.020 So if you figure out how to do it right and you put it in the work, you can do anything you want.
00:55:05.640 That's exactly how my life turned out.
00:55:08.540 Now, I don't think that works for everybody.
00:55:11.580 So I'm biased by my own experience.
00:55:13.680 But how in the world did that not help me compared to, let's say, a parent who said, you know, nobody succeeded in this family because of all that discrimination or something.
00:55:27.640 I never heard that story.
00:55:29.440 I never heard that I couldn't succeed.
00:55:31.460 I only heard that I could.
00:55:32.780 I never heard anything else.
00:55:35.440 That's got to make a difference.
00:55:37.380 Don't you think?
00:55:38.640 It's got to make a difference.
00:55:44.200 All right.
00:55:44.820 Let me check with you.
00:55:47.900 How many consider yourself successful and had parents who had a positive mindset?
00:55:53.600 I'm looking for the double.
00:55:55.220 You consider your life successful, however you define it.
00:55:58.040 It doesn't have to be financial.
00:56:00.440 Yeses and yeses.
00:56:02.200 Yes and yes.
00:56:03.160 Yes and yes.
00:56:03.740 Double yes and yes.
00:56:05.620 No and no.
00:56:06.320 Okay, that's interesting.
00:56:10.480 So, I'll bet there aren't too many people whose parents told them they could do anything they wanted who then consider themselves a failure at middle age.
00:56:19.980 I'll bet it's rare.
00:56:22.180 Yeah.
00:56:22.660 I mean, you can see the effect of mindset.
00:56:25.300 It's everything.
00:56:27.460 All right.
00:56:28.980 So, let me remind you, give you a little backup in context.
00:56:33.520 Do you remember I got in a little trouble for saying something with a racial overtones a few weeks ago?
00:56:41.260 Anybody remember that?
00:56:42.420 There were headlines about it.
00:56:44.600 And when it happened, I realized that something that wasn't obvious had happened at the same time.
00:56:53.460 Which is, I can talk about the topic of race productively without worrying about getting canceled now.
00:57:02.120 And I'm like the only person who can do it.
00:57:04.580 I'm the only one.
00:57:06.080 I'm the only one who could have an honest conversation about race.
00:57:08.760 So, people, if you want to have an honest conversation, I'm the one.
00:57:16.500 And I'm fairly well informed.
00:57:18.860 So I think I have something I could add.
00:57:21.140 Even if the only thing I added was honesty.
00:57:23.000 Because, you know, how many teachers, let me put it this way.
00:57:31.040 Don't you think that professional teachers are fully aware that telling some group they're victims suppresses their performance?
00:57:41.500 Now, if I know it, you don't think teachers know it?
00:57:45.520 I mean, I'll ask my sister.
00:57:47.540 She was a teacher for a year.
00:57:48.940 She's retired.
00:57:49.480 But if I ask my sister, who's probably listening right now, well, text me.
00:57:55.320 All right, sister, I won't say your name just so people don't get all over you.
00:58:00.320 But sister of mine, text me right now and tell me.
00:58:07.020 Tell me if you're not fully aware that people will rise to their expectations.
00:58:14.380 Yeah.
00:58:16.760 Her last name is not Adams, so you can't find her.
00:58:19.480 I'll bet in a moment she'll weigh in.
00:58:29.660 Actually, I don't know if she watches it live or watches it recorded.
00:58:33.920 I think she watches it recorded, maybe.
00:58:36.520 But I don't think there's anybody who disagrees that expectations affect performance.
00:58:42.980 And nobody would disagree that telling people that they've got this, you know, invisible yoke on them called systemic racism.
00:58:51.340 That's got to affect performance.
00:58:53.000 It just has to.
00:58:56.640 Teachers also know the parents are mostly to blame when the kids are a disaster.
00:58:59.920 I don't know.
00:59:01.960 I mean, the kids are partly to blame.
00:59:11.940 All right, that's all for you, YouTube.
00:59:13.720 Thanks for listening.
00:59:14.620 I'm going to talk.
00:59:15.980 Let's see if I got my message here.
00:59:17.520 Okay.
00:59:24.220 No, that wasn't what I was expecting.
00:59:26.740 All right.
00:59:27.700 That's all for now.
00:59:28.980 YouTube.
00:59:29.340 All right.
00:59:29.620 Bye.