Episode 2060 Scott Adams: TikTok Shows Us Who Is Bought Off, CRT Lowers Black Test Scores?
Episode Stats
Words per Minute
146.75
Summary
A man built a bike that shoves a pole up his ass, flying taxis are coming to Chicago, and Microsoft is building a phone around artificial intelligence, and it's going to be a phone with no apps.
Transcript
00:00:01.000
Good morning everybody, and welcome to the highlight of civilization that's called Coffee
00:00:09.360
And if you made the mistake of watching anything else at this time, well, you're probably regretting
00:00:15.840
Because you can take this experience up to levels where nobody's ever seen it before.
00:00:22.920
And all you need is a cup or a mug or a glass, a tank or a chalice or a stein, a canteen
00:00:26.400
jug or flask, a vessel of any kind, fill it with your favorite liquid, I like coffee.
00:00:32.400
And join me now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better.
00:00:36.800
It's called the simultaneous sip and it happens now.
00:00:41.360
Ah, that was a strange sip, strange but beautiful, graceful, elegant, classy, really.
00:00:54.080
All right, well, we got a lot going on here, so I'd like to start with my favorite story,
00:01:01.080
which I will look for on Twitter to show you if you haven't seen it already.
00:01:04.600
There was some gentleman, I'm not sure where it was, it might have been an African country,
00:01:10.960
But there's a clever gentleman who rigged a bicycle so that if you tried to steal the bicycle,
00:01:17.480
and it was left unlocked in an easy place to steal, the seat would collapse when you got
00:01:23.320
on it and the pole that normally holds the seat would go right up your ass.
00:01:29.560
So you would be shoving a metal pole up your ass with the force of your own body weight when
00:01:36.480
If you don't think that's funny, well, you don't know me, because there's a compilation
00:01:49.160
Now, the funny part is watching them, none of them steal the bike, they all walk around
00:02:04.600
The thing I love about it, the thing I love about it is that none of them actually steal
00:02:10.720
Theoretically, you could still ride the bike or push it away and get the seat fixed,
00:02:16.760
but everybody is completely done with the bike after it shoves a pole up their ass.
00:02:35.040
So if you go to the Chicago airport and you're heading to the middle of the city, in maybe
00:02:41.320
a year or so, United will allow you to take a short hop flight in a vertical takeoff plane.
00:02:50.780
So the plane will just go straight up, over and straight down in the middle of the city.
00:02:55.700
And apparently they've already purchased the hardware and they've got the plan and it's
00:03:05.300
I think it takes, you know, six people or something.
00:03:10.600
So you miss all of the traffic, at 150 miles an hour.
00:03:17.280
There's your flying cars, kind of, but flying taxis.
00:03:22.280
I saw an interesting prediction from Naval Ravikant.
00:03:28.280
And if you don't know who Naval is, the only thing you need to know is if he predicts something,
00:03:43.940
He said, Microsoft ships a phone built around AI by the end of the year.
00:03:56.400
So Microsoft has some ownership of this big ChatGPT, OpenAI, whatever it's called.
00:04:03.860
And so they, their Bing search engine already uses it.
00:04:09.240
And Naval is thinking that they might build it into a phone.
00:04:20.800
The ultimate obvious place that the phone interface will go is no apps.
00:04:29.800
The perfect interface for a phone would be a blank screen.
00:04:35.580
And if you say to it, hey, make a spreadsheet and add up these numbers, then it just creates
00:04:45.480
Or if you say, send a message, it just creates the app and sends the message and then deletes
00:04:55.240
You just tell it what to do and it goes and figures out how to do it.
00:05:00.660
I think your phone is just going to be a blank screen and you talk to it.
00:05:05.220
Alternately, here's how I would have designed the phone already.
00:05:09.480
I would have designed it so that you pick it up and start doing the thing you want to
00:05:15.720
If you want to search for something on Facebook, you just type in blah, blah, blah, and then
00:05:26.860
And then as you type it in, the AI says, well, I don't know what this is about.
00:05:31.300
He could be writing an email, could be sending a message, or it could be a search term.
00:05:40.400
And as soon as you typed it in, you'd say, the best place to eat in San Francisco.
00:05:48.820
And among the choices are a Google search, a Bing search, or an email.
00:05:53.380
But you know it's a search, so you just hit boop.
00:05:58.220
You should never have to deal with the app before you do the activity you wanted the app for.
00:06:03.500
You should start the activity and then AI should figure out what apps or app would make sense
00:06:10.720
So it should look at your context and then figure out the app for you.
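The context-based, app-less interface described here can be illustrated with a toy intent router. This is purely a hypothetical sketch of the idea, not any real phone API; all the rules and names below are made up for illustration.

```python
# Toy sketch of the "no apps" idea: guess the intended action from what
# the user starts typing, instead of making them pick an app first.
# Every rule here is a hypothetical illustration, not a real phone API.

def guess_intent(text: str) -> str:
    """Return a best-guess action for a fragment of user input."""
    lowered = text.lower()
    # Greeting-like openers or an address suggest an email.
    if lowered.startswith(("hey", "hi", "dear")) or "@" in lowered:
        return "email"
    # Query-like wording suggests a search.
    if any(word in lowered for word in ("best", "near me", "where")):
        return "search"
    # Imperatives aimed at a person suggest a message.
    if lowered.startswith(("tell", "send", "let")):
        return "message"
    return "search"  # default: treat unknown fragments as a search

print(guess_intent("the best place to eat in San Francisco"))  # → search
```

A real system would use a learned classifier rather than keyword rules, but the shape is the same: the activity comes first, and the "app" is inferred from context.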
00:06:20.820
Well, you would type because people are listening, so you can't always speak out loud to your phone.
00:06:31.740
There's no way that 10 years from now, you're going to be selecting an app and then telling
00:06:46.620
So I guess Musk is giving some stock grants to employees, which value the company at about
00:06:58.980
He's valuing it, at least in terms of valuing the stock options, at $20 billion, but he suggests
00:07:04.920
that in 10 years or so it could be worth $250 billion and that there's a difficult but very
00:07:19.860
If he gets to $250 billion and he's the richest person times three, I think it's possible.
00:07:28.680
Because to do that, he would have to fold in different functions like payments, you know,
00:07:33.720
have a better advertising, you know, situation, have payments in there.
00:07:46.480
You know, it's also easy to imagine that you would get Twitter plus Starlink, somehow
00:07:55.580
there'd be some kind of combined deal or something.
00:07:59.180
Anyway, I do think that there are paths to $250 billion valuation.
00:08:04.180
But the funniest thing about Twitter is that they closed down their press contact.
00:08:10.320
So it used to be if you emailed press at twitter.com, you could ask a question if you were the press.
00:08:18.320
And now if you email, you get an automatic response from press at twitter.com of a poop emoji.
00:08:35.280
Now, you tell me how that could be more perfect.
00:08:45.140
Well, Trump is saying out loud that he thinks that Bragg has already dropped the Stormy Daniels case.
00:09:01.900
I don't know if it's already dropped, but I don't see how it could go forward.
00:09:05.680
Alan Dershowitz has gone so far as to say that the case is so weak and the main guy who would probably testify would be Michael Cohen.
00:09:18.500
And now that Michael Cohen's own lawyer produced a document that would show that Cohen is a liar, either is a liar or was a liar, but it's going to end up looking the same.
00:09:29.960
Because if he's a liar and you think he's going to say the opposite of what he said in writing he believed, you could get disbarred.
00:09:39.860
Because you could get disbarred for knowingly putting a liar on the stand.
00:09:45.100
If it's his witness and he knows he's a liar and he needs that lie to make his case and he puts him on the stand, he could be disbarred.
00:09:54.120
Now, I think you'd have to prove he knows it's a lie, so I don't think it could really happen.
00:10:02.440
Like, well, you know, there's a possibility that this could end with Bragg being the one who loses his job instead of Trump.
00:10:11.800
I don't think it's likely, but I like that it's out there.
00:10:14.460
So I'm going to predict that the charges will not go forward.
00:10:21.400
That one way or another the charges will not proceed.
00:10:25.860
And I think that if it were still politically good, but legally sketchy, it would go through.
00:10:35.040
But now it's obvious that it's a political disaster as well as a legal disaster.
00:10:39.740
So legally it was always weak, but politically maybe you could, you know, get some points.
00:10:45.320
But now it's obvious that this, even the threat of it made Trump more popular.
00:10:56.340
The tease of it was that Jon Stewart said something along the lines in a recent interview that this is why Trump got elected.
00:11:07.640
Now, wouldn't you want to know what that video said?
00:11:10.680
What did Jon Stewart say about Trump, this is why he got elected?
00:11:18.660
It feels like the Bragg situation, the Alvin Bragg situation.
00:11:25.340
But have you noticed that the more you want to watch a video, the less likely it will ever play?
00:11:32.160
You can click on that motherfucker all day long and it won't play.
00:11:39.520
Fox News will say, sexy picture of somebody you actually want to look at.
00:11:51.780
I think I'll just maybe click on that sexy bikini picture and see what all the news is about.
00:11:57.920
Because I'm not the kind who just looks at the headlines.
00:12:02.200
That's why I like to click on the stories and get the pictures.
00:12:05.260
And I'll be like, well, okay, if this is the sexiest picture ever from this person, I think I got to see it.
00:12:21.800
Put a headline on CNN that says, Trump agrees he should not be president.
00:12:27.960
And then whatever that video is, it'll never play.
00:12:35.140
So somehow there's some kind of technology that makes anything interesting unplayable at the same time.
00:12:43.680
If it's some boring-ass story where a general said, oh, I'm a general, and blah, blah, Ukraine.
00:13:02.120
But if they say, here's a video of Nancy Pelosi having, I don't know, sex with Adam Schiff, caught on video, that won't play.
00:13:32.700
I saw an opinion piece on why the school choice movement is working well at the moment, when for so long it didn't get much traction.
00:13:42.800
And a lot of it is being credited to Corey DeAngelis and his strategy.
00:13:49.900
Now, some of it, of course, was the pandemic, right?
00:13:53.040
People got to see Zoom school and see how horrible it was and got more interested in their kids' education and all that.
00:14:03.160
Some of it's the teaching of young kids too much about sex too early, some people say.
00:14:09.240
So there's lots of reasons why people would be more interested in homeschooling.
00:14:14.720
But the current thinking is that homeschooling is being driven on values as opposed to education.
00:14:27.800
But it feels like people are saying, okay, I was okay when I didn't know if my kid was learning to read and write.
00:14:34.640
But, you know, it seemed like everybody was in the same boat.
00:14:40.460
But as soon as you find out your kids are being taught that they're either victims or oppressed or that their gender is sort of up to them,
00:14:49.800
then the parents end up getting really involved.
00:14:53.600
So I think when you say, what will you teach my kids?
00:14:59.020
People go, well, you know, I guess I can put up with some imperfection.
00:15:03.240
But when you say, what will I turn my kids into, that's a whole different game.
00:15:13.160
That's actually turning them into the kind of people that is somebody's idea of a good citizen,
00:15:19.280
but maybe not the parent's idea of a good citizen.
00:15:22.060
So you can see why this is getting energy right now.
00:15:26.420
So if you're going to argue it with anybody, I would go with the social argument.
00:15:30.460
Seems stronger than the, they can get better grades if they do this.
00:15:39.260
So I watched a clip from NBC News where Chuck Todd was talking to Senator Warren about TikTok.
00:15:45.840
And Senator Warren talked about the privacy issue and never mentioned the big problem,
00:15:56.480
which is persuasion, which is the Chinese Communist Party can essentially push one button to make anything viral.
00:16:08.360
There is actually literally a button called heat.
00:16:13.560
It's actually labeled the heat button where they can make anything viral.
00:16:18.160
So that's a gigantic risk because, I don't know, 150 million Americans use it.
00:16:23.780
And they can make anything a fact because our minds are programmed by what we see and then how often it's repeated.
00:16:33.220
That's your whole operating system for your brain.
00:16:35.440
So they have control over what you see on TikTok and how often you see it.
00:16:47.780
I don't mean every single person will be immediately programmed by some memes.
00:16:52.540
I mean that on average, you can move the average reliably by how much you show them of what.
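The claim that repetition "moves the average" can be shown with a toy simulation. The nudge size, population, and starting distribution below are arbitrary assumptions for illustration, not measurements of any real platform.

```python
import random
random.seed(0)

# Toy simulation of "moving the average": each simulated viewer starts
# with a random opinion in [-1, 1]; every exposure to a promoted meme
# nudges them a small step toward +1. Individuals stay noisy, but the
# population average shifts reliably with the exposure count.
# All numbers here are arbitrary illustrations.

def average_opinion(exposures: int, n: int = 10_000) -> float:
    total = 0.0
    for _ in range(n):
        opinion = random.uniform(-1, 1)
        for _ in range(exposures):
            opinion += 0.05 * (1 - opinion)  # small nudge toward +1
        total += opinion
    return total / n

print(round(average_opinion(0), 2))   # near 0.0
print(round(average_opinion(10), 2))  # clearly positive
```

The point of the sketch: no single viewer is "immediately programmed," but the group average moves in whichever direction the amplified content points.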
00:17:01.460
If you look at the poll, you know, the Democrats always go one way.
00:17:07.220
Even on issues that are not political, that's how you know that you're being programmed.
00:17:14.460
If you are not being programmed by some third party, then when a topic comes up that has no political connection whatsoever,
00:17:23.080
the opinions would be sort of mixed all over the place.
00:17:27.600
Every topic just becomes political, which is proof that you're being programmed.
00:17:36.860
And Elizabeth Warren never even mentions that risk.
00:17:46.700
So it's a story about the risk of TikTok without mentioning the big risk of TikTok.
00:18:00.100
It couldn't possibly be an accident at this point.
00:18:03.600
At this point, there's no way you can say that's an accident.
00:18:06.760
These have to be two entities that are in the bag for China.
00:18:12.760
Now, I told you yesterday that AOC also came out in favor of not banning TikTok.
00:18:21.640
And her answer looked so obviously bought off that people just said,
00:18:29.300
Because there's no way you could have that opinion unless somebody just paid you to have it.
00:18:34.520
And now we find out that Fox News Digital reported that ByteDance, TikTok's Chinese parent company,
00:18:46.240
funneled six-figure contributions to non-profits aligned with the Congressional Black and Hispanic Caucuses.
00:18:54.560
Didn't we just see a member of the Black Caucus saying we shouldn't ban TikTok?
00:19:06.780
And so they gave $150,000 to these two, the Black Caucus and the Congressional Hispanic Caucus Foundation.
00:19:15.740
I wonder if AOC has any connection to the Hispanic Caucus Institute.
00:19:36.540
Not only did China buy her, but we have the receipt.
00:19:48.500
Like, I don't know, how did Elizabeth Warren, you know, benefit in any way?
00:20:01.640
Now, the beauty of the TikTok story is that there is only one right answer.
00:20:08.980
And there is no argument among people who are willing to describe the risk.
00:20:14.360
So let me say that again because it's important.
00:20:16.300
There's nobody who can describe the risk of TikTok who thinks it should be legal in America.
00:20:30.860
Anybody who can say those two things out loud also says ban it.
00:20:34.620
Anybody who doesn't say it's a persuasion risk is, mostly, in favor of keeping it.
00:20:49.460
But you can tell for sure who's been bought off by China.
00:20:52.180
Now, when I say bought off, I don't mean directly.
00:20:55.600
It could be, you know, funding for a caucus, something like that.
00:20:59.180
But it's very clear that anybody who's still in favor of keeping TikTok is just bought off.
00:21:06.980
I've never seen one this clear before, have you?
00:21:12.740
But what makes this an interesting one is there isn't.
00:21:19.400
A hundred percent of people are on the same side if they can state the argument out loud.
00:21:26.480
If they can't state it out loud, it's intentional.
00:21:33.460
It's amazing that any of these people keep their jobs.
00:21:38.120
Well, here's an interesting update on the reparations situation in California.
00:21:42.880
Now, there are two reparations stories in California.
00:21:44.840
One is what San Francisco came up with, which is, you know, super crazy time.
00:21:50.120
Five million dollars per descendant of black slaves, a one-dollar house you could buy, and $90,000 per year.
00:21:57.780
But there's a lesser but still crazy one for the state itself.
00:22:03.100
And the news is that Governor Newsom has been quiet and not weighed in on the recommendations.
00:22:09.420
Now, I started by telling you that he was being brilliant, because by telling the committees to go work out a recommendation, I said to myself, oh, that's brilliant.
00:22:20.660
They'll come back with something that's so stupid that he can easily ignore it without being the one who turned it down.
00:22:29.880
So here's what I got totally wrong, if you like it when I admit I'm wrong.
00:22:39.460
Apparently, there's no level of stupidity that Californians will recognize as stupid.
00:22:47.340
I thought it would be so far over the line of reasonable that we would just laugh at it and ignore it.
00:22:56.340
But apparently, there are enough people in the state, I'm guessing, especially the black citizens of California, who are treating this seriously.
00:23:08.860
They're actually acting like it's a real thing, which puts Gavin Newsom in a bad spot.
00:23:15.960
Because if he were to approve this, that's the end of his political life.
00:23:22.940
That would be absolutely the end of his political life.
00:23:29.320
There's no way he could be a national politician.
00:23:48.020
I vastly underestimated the gullibility of Californians.
00:23:54.020
I mean, honestly, I should have seen it coming.
00:24:29.080
And those who use all caps for their insults, extra credit.
00:24:36.620
Did not see how gullible and stupid my state is.
00:24:42.780
Now, but I'm going to put it in a larger trend.
00:25:15.220
ESPN is, they did a special video package to celebrate Women's History Month.
00:25:21.940
And they did it by honoring trans swimmer Lia Thomas.
00:25:25.980
So for Women's History Month, ESPN is focusing on trans athlete Lia Thomas.
00:25:32.600
Now, what does that have to do with the California reparations story?
00:25:49.920
It looks like the white people in California have decided to embrace and amplify to end wokeism.
00:26:01.140
Not just California, but wherever ESPN is out of.
00:26:04.080
It looks like there is a secret plan by white men, mostly men, to pretend to be so on board with wokeness that they're going to break the system.
00:26:15.960
Because there's no way women are going to let ESPN get away with celebrating Lia Thomas, a trans athlete, during Women's History Month.
00:26:31.360
There's no way women are going to let anybody get away with that.
00:26:34.700
So to me, it looks like this is like forced shark jumping.
00:26:40.900
You know, when they talk about poorly written TV shows, they say, oh, it jumped the shark.
00:26:46.140
It's a reference to the old Happy Days TV show.
00:26:49.220
And it looks like white men are pushing the shark.
00:26:56.740
It's like, well, if this is where you want to go, let's go there as fast as possible.
00:27:02.480
Let's take that slippery slope right to the bottom.
00:27:06.940
So let's find out what those reparations are.
00:27:29.320
those old classic old style women with vaginas and wombs and shit.
00:27:38.940
We like the new women with penises or had penises.
00:27:44.220
So I believe that the next thing you're going to see is a movement by white men
00:27:55.820
to punish white men and white women more severely.
00:28:05.260
Because I think we've got to push this thing until we figure out what is too far.
00:28:09.180
Because I thought we'd already reached too far.
00:28:24.760
Here's why I think ChatGPT and AI will be illegal.
00:28:31.080
So I asked AI about President Trump and whether...
00:28:42.280
One of the sample questions on the ChatGPT thing was,
00:28:57.420
Well, AI didn't want to commit to that interpretation, so that's good.
00:29:06.760
and said that some people interpreted it as calling for violence.
00:29:13.940
What part of his quotes from the speech just before the January 6th bad stuff happened,
00:29:20.120
what part of his speech do you think they left out?
00:29:22.780
They did quote his speech, but they left out a part.
00:29:30.500
They left out peacefully and patriotically, make your voices heard today.
00:29:41.900
Do you think it was programmed to leave it out?
00:29:47.380
I don't think it's directly programmed to do that.
00:29:50.660
My understanding is it's a word prediction system.
00:29:56.100
So in other words, it simply looks at how everybody has ever talked ever
00:29:59.740
and then tries to talk the way most people talk.
00:30:04.060
And if the thing it trained on mostly said that Trump was not in it for a peaceful situation,
00:30:15.360
If most of the people who talked about January 6th completed a sentence with
00:30:24.900
if there's more of that than January 6th, Trump called for peaceful protest,
00:30:35.520
I mean, if that's wrong, somebody needs to correct me.
00:30:38.480
But my understanding is you're just looking for word patterns.
00:30:54.500
And I started a sentence, and it started suggesting the next word.
00:31:06.540
it suggested three words that are most likely the next word after those two.
00:31:13.960
I wrote the whole sentence from the first word without ever typing another word.
00:31:22.480
From the first sentence, it gave me choices and then narrowed down what I was going to say
00:31:27.840
until by the end it was sort of only one choice for the last word.
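That autocomplete experiment can be reproduced with a minimal bigram model: count which word follows which in a corpus, then always accept the top suggestion. The tiny corpus below is made up for illustration; real systems train on vastly more text, but the mechanism is the same.

```python
from collections import Counter, defaultdict

# Minimal sketch of next-word suggestion: count which word follows
# which in a training corpus, then always take the most frequent
# follower. The corpus is a made-up illustration.

corpus = ("the cat sat on the mat . the cat sat on the rug . "
          "the dog ate the fish .").split()

followers = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    followers[current][nxt] += 1

def suggest(word: str) -> str:
    """Most frequent word seen after `word` in the corpus."""
    return followers[word].most_common(1)[0][0]

# Build a whole sentence by always accepting the top suggestion,
# the experiment described above: the choices narrow on their own.
word, sentence = "the", ["the"]
for _ in range(4):
    word = suggest(word)
    sentence.append(word)
print(" ".join(sentence))  # → the cat sat on the
```

Modern models predict from far longer contexts than one word, but this is the core of "just looking for word patterns."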
00:31:37.900
Here's what I've been saying about AI that will become more and more true the longer we learn about it.
00:31:48.100
We are not learning how to make machines intelligent.
00:31:54.580
We are not teaching machines to be intelligent.
00:32:04.520
AI can only prove that humans were never intelligent and only imagined that they were.
00:32:13.140
The way humans think is also word pattern repetition.
00:32:21.220
If you go on Twitter, you know how a person's going to end a sentence because of the way they started.
00:32:29.140
You can tell, oh, it's one of those, one of those people, whoever those people is.
00:32:33.460
You could be on either side and still say the same thing.
00:32:37.920
You know, the left would say, it's one of those MAGA people.
00:32:41.460
And MAGA people would say, oh, it's a leftist, progressive.
00:32:47.600
Because both sides are just imagining their thinking, but they're not.
00:32:54.020
When I tell you that the news assigns you your opinions, and you reject that and say, well, not me.
00:33:02.040
But it probably does assign those opinions to the other people.
00:33:10.360
Your opinion is based on the pattern of what you saw the most.
00:33:13.660
And if you watched Fox News and, you know, right-leaning media the most, then when you complete a sentence, you're going to complete it the way they did.
00:33:26.020
You'll believe that reason happened in your brain.
00:33:28.360
But you'll just be doing pattern recognition exactly like the AI is.
00:33:32.560
What is the most likely end of the following sentence?
00:33:47.520
And I watch CNN and MSNBC and nothing else all day.
00:34:03.160
Yesterday, I told you about a study that we've known for decades, that the critical thinking part of your brain doesn't engage until half a second after you decide.
00:34:16.740
The critical thinking part of your brain doesn't even come online.
00:34:21.640
It doesn't even activate until after you've decided.
00:34:27.760
It means that the decision is based on this irrational pattern recognition thing, just like AI, right?
00:34:35.620
But then we have another process where we rationalize it after the fact.
00:34:40.340
The rationalizing is why the things we say don't seem to make sense, the cognitive dissonance.
00:34:47.580
Every now and then, the things we say do make sense.
00:34:56.460
And by coincidence, if the thing you say makes sense, then you don't get cognitive dissonance, because it's all consistent.
00:35:05.720
You only get cognitive dissonance when your pattern recognition and your word fill-in thinking comes up with something that the rest of your brain says,
00:35:15.000
ah, that seems inconsistent, but you have to go with it anyway.
00:35:18.920
So you paper it over with cognitive dissonance, where you literally hallucinate that the things that don't make sense make sense.
00:35:26.460
So, you don't know this yet, but this is the biggest risk of AI.
00:35:39.380
The biggest risk of AI is what you realize about yourself, or people realize.
00:35:52.680
You will start to recognize them as programmed effects in your brain.
00:35:57.460
You will recognize that your opinions are not real, and that's going to take some adjustment.
00:36:04.240
Once you realize your opinions are not real, they're not even coming from you.
00:36:08.800
They're just pattern recognition, and that comes from the outside.
00:36:15.740
Now, it won't kill you, because I've been on that, you know, I've been on that menu since, I was probably 23.
00:36:27.780
Probably age 23, I first realized that your brain is not a logical engine.
00:36:33.060
When I learned hypnosis, that's the first thing you learn.
00:36:35.500
First thing you learn in hypnosis is that people are not logical engines.
00:36:39.160
If they were logical, you couldn't hypnotize them.
00:36:45.360
Hypnosis wouldn't work if people had logical brains the way you think they do.
00:36:50.400
It works because what I make you focus on, and what I repeat, becomes your operating system.
00:37:02.040
Here's a reframe that's going to change everything.
00:37:17.860
There was a study, oh no, it was long ago, 1964.
00:37:22.900
It was a study in which teachers were randomly, there was a group that were randomly selected in a classroom,
00:37:30.680
and the teachers were told that they were the smart ones,
00:37:35.220
and so the teachers treated one group like they were gifted,
00:37:45.540
The group that thought it was gifted because their teacher acted like that,
00:37:54.320
Now, I was looking for whether that study had been repeated, and I didn't find it.
00:38:09.180
I believe there's plenty of evidence to suggest that how you expect to succeed is how it'll turn out.
00:38:18.520
Now, you could say that that's a winner's mindset, wouldn't you?
00:38:28.460
Everybody knows that if you expect to do well, you're probably going to do better than if you expect to not do well.
00:38:35.520
Now, is there anybody who doubts the premise before I go on?
00:38:39.560
I need you all to buy into the premise that what you expect to accomplish is really going to make a difference.
00:38:50.800
He expected that he could build a rocket ship to Mars.
00:38:59.440
The only one who expected he could do it did it.
00:39:03.900
I expected, I know this is weird, and I know it doesn't make me sound good when I say it,
00:39:12.980
but before I became a cartoonist, I was reading a newspaper, looking at the comics, and I said to myself,
00:39:23.960
I actually expected that with no experience whatsoever, I could become a famous cartoonist.
00:39:36.900
That my odds of succeeding in this field were like 1 in 10,000.
00:39:50.900
But I can tell you that in my life, when I expect something to work,
00:40:06.700
But when I expect something to work, damn it, I expect it to work.
00:40:11.620
When I learned to play drums, I expected it to work.
00:40:16.740
And so a year and a half, two years of trying with no progress at all.
00:40:21.820
Imagine doing something for a year and a half, and you couldn't even make it sound like a beat.
00:40:37.480
Now, eventually, once I got limb independence, you know, everything came easily.
00:40:42.180
So now I could probably play just about anything you could play on drums.
00:40:45.840
I would just have to practice that specific stuff.
00:40:58.420
I'm playing it as if I'm not believing my own opinion.
00:41:03.220
So I'm still taking lessons, and I'm going to grind away for, yeah, I'll probably grind away for a year or so, no matter what, just to find out.
00:41:11.360
But I don't expect it, and that's a problem, don't you think?
00:41:26.380
So expectations make your performance different.
00:41:33.440
I want to make sure there's nobody who disagrees with that statement.
00:41:42.440
Now let's talk about critical race theory and what children are taught in school.
00:41:49.280
Are the white children taught that they can't succeed?
00:41:54.380
Are the Asian kids taught that they can't succeed?
00:41:59.900
Are the Indian American kids taught they can't succeed?
00:42:07.460
Are the Hispanic children taught they can't succeed?
00:42:09.740
Well, they often, if they're immigrants, they've got a language issue and stuff.
00:42:19.960
If you're a black American kid, do you expect to succeed?
00:42:23.600
Or do you expect that systemic racism will prevent you from success?
00:42:28.260
Well, you're being taught that systemic racism is a barrier that you have that other people don't have.
00:42:34.480
What would be the predictable outcome of telling black kids they have more obstacles to success than white kids?
00:42:54.160
Now, isn't the purpose of CRT to improve the lives of black kids if it's in school?
00:43:04.500
Now, of course, there's the argument whether they teach CRT or they just talk about the same elements of it in different ways,
00:43:11.460
which is, I'm not going to say that's different.
00:43:15.980
So here you have something that scientifically 100% of people would agree is bad for black kids.
00:43:26.600
Do you think that there's any psychologist, any, black, white, Asian,
00:43:34.280
do you think there's any trained psychologist who would say it's good for black kids to learn that they have an extra thing preventing them from success?
00:43:43.980
I don't believe anybody who has a degree or any credentials in psychology would be in favor of teaching some kids that they're not going to succeed.
00:44:00.340
You know, you're in a class of people who have this special problem.
00:44:05.340
No matter what you do, you'll run into this problem.
00:44:09.040
If you told me that every day, I don't know that I would try so hard.
00:44:13.180
I would just figure out if something wasn't working, after a few years of plugging along,
00:44:20.860
This systemic racism is keeping me from success.
00:44:24.200
But it might have been that third year of plugging that you needed.
00:44:27.020
I don't see any scenario in which CRT is not super harmful to black Americans.
00:44:51.820
I think the way to make the CRT stuff disappear is that you should call it what it is.
00:45:09.600
And even those, the black students who succeed anyway, are they better off?
00:45:18.500
Do you think a black successful person is better off knowing that the people who are looking
00:45:26.480
at their success are thinking it was probably some kind of favoritism?
00:45:33.100
It's not really good for people who are actually successful.
00:45:38.240
Imagine how much I would hate it if, you know, I had a successful career and people would just
00:45:45.440
look at me and say, yeah, because of your race, you got a little boost there, didn't
00:46:01.160
Now, let me get a little more controversial by quoting, I'm following an account on Twitter
00:46:18.680
Now, he doesn't have that many followers, a few thousand, 6,000 followers, something like
00:46:22.420
But he is pushing a message that the difference in black performance compared to other races
00:46:39.320
And I think his complaint is that if you look at just poverty and you look at just any IQ,
00:46:48.260
you might be missing the real reasons for the difference.
00:46:57.340
But Tyrone is trying to use what I'll call some tough love.
00:47:02.180
So my take is that Tyrone is really trying to help.
00:47:06.460
And he's putting himself out there at great risk because he probably thinks, I think, I
00:47:16.660
And he's trying to help the black community in particular with some tough love and tough
00:47:24.200
But he's also making very useful, very useful additions to the conversation.
00:47:31.500
For example, he said, this is his tweet, Tyrone Williams.
00:47:36.380
He says, blacks are less likely to, one, optimize their prenatal diet, exercise and sleep.
00:47:42.280
Now, I don't know if that's true, but he seems to have looked into it because he's kind of
00:47:48.600
But he says that, optimize their prenatal diet, exercise and sleep.
00:47:52.440
Less likely to breastfeed, less likely to read to the kids, less likely to ensure their
00:47:57.280
kids attend school, stay out of trouble, and are proficient in math and reading, and less
00:48:05.640
And then he says, but they expect equal representation at top universities.
00:48:12.400
So the useful part is, and I would really like to know about this, haven't we determined
00:48:21.260
Am I wrong that that's scientifically demonstrated?
00:48:28.720
And if it's true that there's a difference, well, there's a very specific lever that you
00:48:35.960
You know, there might be an education thing, it might be a practical thing, too.
00:48:40.180
There may be some practical reason that some people can do it and some can't.
00:48:49.080
It's a specific thing you could target, and you could say, let's do better on this.
00:48:53.260
Reading to your kids, something you could target.
00:48:58.600
Well, I would think that this is a problem for everybody who is low income.
00:49:01.440
I don't know if there's any racial difference in exercise and sleep.
00:49:09.180
But if there is, it's probably something you could find out.
00:49:13.100
If there is a difference, those things would, in fact, have an impact on IQ.
00:49:22.840
Is there any doubt that these three things would help your IQ?
00:49:26.740
Breastfeeding, better prenatal diet, and more focus on exercise and sleep.
00:49:33.800
And that's pretty much guaranteed stuff, right?
00:49:42.360
And then, you know, ensuring your kids stay in school and do well in tests and stuff like
00:49:47.900
So I would say we have several very promising levers, and also somebody mentioned lead in paint might
00:49:59.240
be another factor that is disproportionately affecting black Americans.
00:50:03.800
So there might be some environmental, nutritional, mindset things.
00:50:10.380
And CRT and the, you know, the message that white people are victimizing you and holding
00:50:17.100
you down is almost certainly reducing test scores and success.
00:50:22.740
Now, if you want it to be useful, these are things you could really make a difference on.
00:50:28.020
Like, you could actually move the needle on all of those things.
00:50:46.840
The growth mindset tempers the effects of poverty on academic achievement.
00:50:52.940
So here's something from, I forget where I saw it.
00:50:57.960
But basically, it's saying that your mindset of whether you can succeed is one of the biggest
00:51:18.320
Now, I've seen a lot of studies that say having a single father gets you a good result, but
00:51:29.480
I don't know if it was limited to black population, but it might have been.
00:51:34.300
But the study showed that if you only had one parent and it was a mother, you'd have some
00:51:41.760
But if the one parent is a father, that the kid would perform as well as anybody who had
00:51:47.920
I'm a little skeptical about that, because I think there's a selection bias in that.
00:51:55.760
The selection bias being that if the dad is capable of and wants to be a dad, that probably
00:52:04.360
sorts people into the better category automatically.
00:52:07.980
So I'm not sure you're seeing a father effect as much as a filtering effect of who decides
00:52:17.000
to be the father, who decides to be a single father in the first place.
00:52:42.040
But where does success mindset come from specifically?
00:52:58.960
But in my case, my mother was the one who said, you're going to college from the time
00:53:04.740
Told me I'd be successful from the time I could understand language.
00:53:09.900
And, you know, when I was a teenager, I was already consuming self-help books.
00:53:17.900
By the time I was a teenager, and certainly by my early 20s, I was consuming everything
00:53:24.100
I could about how somebody got famous or how they got successful.
00:53:29.080
I would read every story about somebody who started with nothing and made it.
00:53:34.300
I'd read every book that said, here's the secret to success.
00:53:43.400
Because all I remember is thinking, oh, wait a minute, there might be a formula for success.
00:53:50.380
Are you telling me if I just learned the formula that I would be successful?
00:53:57.800
And so I came to believe that if I learned the formula that I could be successful.
00:54:03.320
So I spent years and years trying to figure out the formula, you know, putting it together.
00:54:08.800
And that's where I came up with the talent stack idea of combining useful talents.
00:54:14.480
It's by studying other people's, you know, ways, looking for patterns.
00:54:18.340
And it's where I came up with this system is more important than a goal.
00:54:21.400
That you have to be doing something every day to improve your odds in general, not just working toward this one goal in a straight line way.
00:54:32.640
Maybe you thought of something without being assigned to you.
00:54:37.920
Because my mother was always the one saying go to school and, you know, it's all within your power to have whatever you want.
00:54:44.700
I mean, my mother would say the old 50s and 60s thing.
00:54:49.760
So my mother was always, if you put in the work and you learn how to do the work right, you know, college or anything else.
00:55:00.020
So if you figure out how to do it right and you put in the work, you can do anything you want.
00:55:13.680
But how in the world did that not help me compared to, let's say, a parent who said, you know, nobody succeeded in this family because of all that discrimination or something.
00:55:47.900
How many consider yourself successful and had parents who had a positive mindset?
00:55:55.220
You consider your life successful, however you define it.
00:56:10.480
So the one, I'll bet there aren't too many people whose parents told them they could do anything they wanted and then they consider themselves a failure at middle age.
00:56:28.980
So, let me remind you, give you a little backup in context.
00:56:33.520
Do you remember I got in a little trouble for saying something with racial overtones a few weeks ago?
00:56:44.600
And when it happened, I realized that something that wasn't obvious had happened at the same time.
00:56:53.460
Which is, I can talk about the topic of race productively without worrying about getting canceled now.
00:57:06.080
I'm the only one who could have an honest conversation about race.
00:57:08.760
So, people, if you want to have an honest conversation, I'm the one.
00:57:23.000
Because, you know, how many teachers, let me put it this way.
00:57:31.040
Don't you think that professional teachers are fully aware that telling some group they're victims suppresses their performance?
00:57:41.500
Now, if I know it, you don't think teachers know it?
00:57:49.480
But if I ask my sister, who's probably listening right now, well, text me.
00:57:55.320
All right, sister, I won't say your name just so people don't get all over you.
00:58:00.320
But sister of mine, text me right now and tell me.
00:58:07.020
Tell me if you're not fully aware that people will rise to their expectations.
00:58:16.760
Her last name is not Adams, so you can't find her.
00:58:29.660
Actually, I don't know if she watches it live or watches it recorded.
00:58:36.520
But I don't think there's anybody who disagrees that expectations affect performance.
00:58:42.980
And nobody would disagree that telling people that they've got this, you know, invisible yoke on them called systemic racism.
00:58:56.640
Teachers also know the parents are mostly to blame when the kids are a disaster.