TRIGGERnometry - June 16, 2024


Harvard Professor: The Facts About Police Brutality - Roland Fryer


Episode Stats

Length

1 hour and 13 minutes

Words per Minute

168.041

Word Count

12,295

Sentence Count

733

Misogynist Sentences

7

Hate Speech Sentences

10


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode, economist Roland Fryer talks about how he set out to use data to show that the police are biased against black suspects, and how he embedded himself in ride-alongs with police to understand their side. He also talks about his own biases, his study's finding of racial differences in lower-level uses of force but not in lethal force, and the backlash that followed.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.700 Broadway's smash hit, The Neil Diamond Musical, A Beautiful Noise, is coming to Toronto.
00:00:06.520 The true story of a kid from Brooklyn destined for something more, featuring all the songs you love,
00:00:11.780 including America, Forever in Blue Jeans, and Sweet Caroline.
00:00:15.780 Like Jersey Boys and Beautiful, the next musical mega hit is here, The Neil Diamond Musical, A Beautiful Noise.
00:00:22.660 April 28th through June 7th, 2026, The Princess of Wales Theatre.
00:00:27.120 Get tickets at Mirvish.com.
00:01:00.000 I thought I was going to be able to show that the police were biased very easily.
00:01:05.860 And then we gathered literally millions and millions of data points on lethal use of force.
00:01:12.020 What we found was zero racial differences.
00:01:14.220 And that is the part that made people really, really upset, really angry.
00:01:19.180 Lack of bias made people upset and angry.
00:01:21.300 Yes, yes, the world was turning upside down.
00:01:24.840 Yes, there were threats against me and my family.
00:01:28.220 Police are rationally responding to what they view as threats, right?
00:01:32.440 When I embedded myself in the Houston Police Department, I noticed that independent of the skin color,
00:01:37.020 the police officer was really scared when he pulled someone over.
00:01:39.940 I asked him why.
00:01:41.240 He says, man, this is Texas.
00:01:42.800 We assume everyone's got a gun.
00:01:44.440 Roland, it's awesome to have you on the show.
00:01:48.500 I've been tracking your work for a very long time.
00:01:51.680 The conversation you had with our good friend Bari Weiss was one of the most interesting and funny things that I've ever seen.
00:01:57.220 As it turns out, you're a former stand-up, or you did stand-up for a while.
00:02:00.040 But before we get into your work and your opinions, there was one particular incident that obviously changed the course of your life quite profoundly.
00:02:08.560 You're this super promising economist.
00:02:10.900 You've done incredibly well from a very difficult background getting to that position.
00:02:15.720 You're very promising.
00:02:16.820 You do this one study about police brutality and police violence against black suspects in particular.
00:02:22.860 And then you're suddenly under police protection for over a month and all sorts of other crazy stuff happens.
00:02:31.600 Tell us about that.
00:02:32.500 What happened?
00:02:33.460 What happened in the study or what happened to me?
00:02:35.680 Both.
00:02:36.120 Both, yeah.
00:02:37.040 Well, in the study, it started back in 2014 when we all were kind of mesmerized by what was going on with Michael Brown, etc.
00:02:47.360 And there were protests happening around the country, and I wanted to do something, but protesting's not my thing.
00:02:54.120 I mean, other folks, go for it, but it's not my thing.
00:02:56.980 But I wanted to do something that I thought would help.
00:02:58.840 So I thought this was going to be the easiest thing I'd ever done.
00:03:01.500 So I went and tried to collect some data, and I thought I was going to be able to show that the police were biased very easily.
00:03:07.940 I grew up in the South, partly in Florida, partly in Texas, and I grew up not liking the police.
00:03:15.580 So I just thought this was literally the easiest thing I'd ever done.
00:03:19.660 And a colleague came to me and said, what are you working on these days?
00:03:23.480 Oh, I'm just about to write this great paper showing the police are biased.
00:03:27.020 And he says, interesting.
00:03:29.380 What do you think the police are maximizing?
00:03:33.000 What do you think their side of the story is?
00:03:34.780 What does maximizing mean, just for people who are listening?
00:03:38.440 Yeah, so what are they solving for?
00:03:39.200 When they go to work every day, what are their set of incentives?
00:03:43.300 What are they after?
00:03:44.500 What are they trying to achieve at work?
00:03:47.600 And it's something that I had not thought about at all.
00:03:50.940 It was embarrassing, right?
00:03:52.940 And so I made arrangements to figure out how to actually do ride-alongs with the police.
00:03:59.800 And, in fact, you guys should do ride-alongs.
00:04:02.140 You should be good at it.
00:04:02.660 We're planning to.
00:04:03.380 Yeah, you should do it.
00:04:04.220 And bring a weapon.
00:04:06.380 And it's, I don't recommend doing it in Camden.
00:04:10.820 That was the first city I had done it in.
00:04:12.360 Where is Camden, by the way?
00:04:13.340 Camden, New Jersey.
00:04:14.300 Okay.
00:04:14.520 It's, you know, a city that has at least had a lot of crime, and especially given it's relatively small.
00:04:23.920 It's close to Philadelphia.
00:04:24.720 And so here we are on the ride-alongs, and I'm embarrassed to tell you.
00:04:29.760 Like, after a few hours, I became the worst police officer you could ever imagine, right?
00:04:34.800 Everyone I saw looked like a criminal, right?
00:04:36.680 Like, I was going around, and the cops were like, you really should not be a cop.
00:04:39.720 Because I would go around, and I'd say, what's that guy doing, you know, dribbling that basketball?
00:04:43.880 Looks suspicious to me.
00:04:44.660 Let's pull him over.
00:04:45.360 What do you mean, let's pull him over?
00:04:46.260 Anyway, I did this, and then we gathered literally millions and millions of data points on police use of force.
00:04:56.420 And I, again, thought that this was going to show all sorts of bias.
00:05:00.280 What we found was that in the kind of non-lethal uses of force, so pushing someone up against a car or drawing your weapon on them but not arresting them or pushing them on the ground,
00:05:14.000 those types of uses of force, there are large racial differences.
00:05:17.820 There's real racial bias in those uses of force.
00:05:21.380 In fact, just here in New York, a black person during that time was 50% more likely to have force used on them in any given interaction with the police officers.
00:05:31.840 Crazy, right?
00:05:33.560 That was lower-level uses of force.
00:05:35.180 But then we also collected data from 16 different cities across the U.S. on lethal use of force, the kind of force used on Michael Brown and others.
00:05:43.780 And that's what the protesting really was about.
00:05:46.680 And in those uses of, in those situations, what we found was zero racial differences in police use of force.
00:05:53.380 And that is the part that made people really, really upset, really angry.
00:05:58.580 Lack of bias made people upset and angry.
00:06:01.180 Yeah.
00:06:01.980 And in fact, one of the, you know, I talked to a newspaper reporter before the paper was even public.
00:06:07.300 And he says, I wouldn't write about this.
00:06:10.260 The first part's obvious and the second part's wrong.
00:06:14.840 What do you do about that?
00:06:16.260 And so we released those results.
00:06:19.480 And look, I'm skipping over a lot of stuff here.
00:06:21.660 I mean, not only did we think we did a rigorous job in the actual analysis, but then we literally, when we got these very surprising results, hired new research assistants, had them do it all over again just to make sure that they were robust.
00:06:34.260 Right?
00:06:34.380 Like, I was a little worried about, not worried, but I wanted to be sensitive about putting results like this out into the world because they were also counter to my own beliefs.
00:06:44.840 But we did, and yeah, I'd say that, you know, for a while, I thought, yes, the world was turning upside down.
00:06:56.320 Yes, there were threats against me and my family.
00:06:59.080 But the interesting part was there were thousands and thousands of emails that came from places like Kansas and Colorado and everywhere across the U.S. who said, wow, thanks for actually using data to shed at least some light on this.
00:07:16.800 And I've got the following 30,000 questions, but at least you're bringing real analysis to this question because our cities are literally burning and no one's talking about what the actual facts are.
00:07:27.240 So thank you.
00:07:28.440 And I engaged with lots of people on that dimension.
00:07:31.940 But yes, I was very, very surprised at how upset, you know, fellow academics got.
00:07:41.520 You know, as I've said before, I was taken aside and told, you're going to ruin your career.
00:07:45.920 And I naively thought we were in this to actually develop a set of facts.
00:07:52.300 I actually thought that's what tenure was for.
00:07:53.900 I didn't. I later realized it was for drinking Chardonnay at 10:30 in the morning.
00:07:58.300 But at that point, I thought tenure was for going after the truth, even if it was unpopular, and that the university was going to protect you from whatever may come of you being a real social scientist, or at least following the facts as you saw fit.
00:08:15.740 So, you know, we produced this result.
00:08:20.920 I'm still very, very proud of the work we did.
00:08:25.180 But I highly underestimated the response.
00:08:29.780 Roland, there's something before we get into all of that.
00:08:32.280 You said, I think, in the study, and correct me if I'm wrong, black and Hispanic people are more likely to be pushed around by the police, suffer physical force, et cetera, et cetera.
00:08:47.960 Do we know why that is particularly?
00:08:50.060 Is it that people from those types of communities are more likely to see the police as a threat and therefore they're more likely to resist arrest?
00:08:57.480 Is it that the police go in more aggressively?
00:08:59.840 Do we know what is actually happening?
00:09:01.560 It's a great question.
00:09:03.060 We don't have a perfect answer on that, but we have some threads.
00:09:07.420 So one of the most interesting coefficients in that study to me is that when the police report that a suspect is fully compliant.
00:09:18.780 They were not arrested and no contraband was found.
00:09:23.260 Nothing went wrong in this traffic stop as reported by the police.
00:09:27.020 Blacks are 25 percent more likely to have force used on them, even in those interactions.
00:09:33.880 Wow.
00:09:34.020 And so, you know, yes, there's some police departments who have written to me and says, you know, you're saying use of force, but what you really mean is response to resistance.
00:09:44.440 And no, that's not what I mean.
00:09:45.980 I mean use of force here because even in situations where, again, I can't emphasize this enough.
00:09:52.340 The police report they are fully compliant, right?
00:09:55.060 I took a lot of flack for people saying, how dumb can you be?
00:09:57.440 You use police reports.
00:09:59.000 Well, stop and think for a second.
00:10:01.000 If the police are reporting it and they still, you can still find bias in that data.
00:10:06.760 You can really believe it.
00:10:08.100 See what I'm saying?
00:10:08.940 And so the police report them being fully compliant.
00:10:11.900 And yet there's still a large racial difference in the use of force.
00:10:15.420 So I don't know exactly why that is.
00:10:17.680 We did a set of pretty rigorous bias tests.
00:10:23.600 So we actually think part of it is racial discrimination, not all of it.
00:10:29.840 Some of it can be explained also by the behavior of civilians.
00:10:34.020 That also differs, but it can't explain all of it.
00:10:36.400 So I think there's enough kind of, not fault, but there's enough, both sides of that equation that you described, civilians acting differently, police potentially being biased.
00:10:49.640 We find evidence for both of those.
00:10:51.880 And now the key question is, what can we actually do about it, right?
00:10:56.520 And, you know, what we try to propose is a way to work with police departments to eliminate this type of bias, because I think it's at the root of a lot of the discontent in black neighborhoods between the police and civilians.
00:11:16.960 Look, if we admit that some of the racial differences in police use of force are due to bias, and then there's a controversial shooting, it's almost irrational for someone in the community not to believe that the shooting, before you saw this data, right?
00:11:33.860 You say, well, on the things I do know, I know there's discrimination.
00:11:38.540 On the thing I can't really observe, I wasn't at the shooting, probably was discrimination, right?
00:11:44.820 And so I think this is a point that we don't make enough, but if it's hard to negotiate with the police directly on shootings, right?
00:11:55.160 Hold your weapon, that's hard.
00:11:57.300 You know, their lives are at stake too, but on the lower level uses of force, right?
00:12:04.240 Every department I've talked to privately knows that there are things that they can do better on the lower level uses of force that would then garner community support so that we could actually have productive conversations when a controversial shooting happens.
00:12:21.420 Roland, sorry, Francis, I feel like we're getting into the meat of the conversation, which is great, but I want to take one quick detour before we carry on on that part, which is you said something that I think in the moment that we're sitting here is a very interesting conversation to at least explore, which is you said, protesting isn't for me.
00:12:39.220 Why?
00:12:41.020 There are many ways of making a difference, right?
00:12:43.880 And, you know, I don't begrudge anybody who wants to have free expression and to protest.
00:12:51.540 Great.
00:12:53.940 It's just an economic lingo.
00:12:56.620 It's not my comparative advantage.
00:12:58.280 Okay.
00:12:58.980 Right?
00:12:59.360 I am a data nerd, so when I see a problem, my first inclination is not to stand up and protest.
00:13:07.280 My first inclination is let me get a bunch of data and prove to people why this is, what's actually going on here.
00:13:12.900 And, you know, to make real social progress, maybe you need both.
00:13:17.260 And I'm sure some of the people who are out protesting are saying, well, numbers aren't my thing.
00:13:22.960 Great.
00:13:23.320 You do your thing and I'll do mine.
00:13:25.480 But it's not that I have anything against it in any way, shape or form.
00:13:28.820 It's not just for me.
00:13:30.520 And the other quick detour I wanted to take, and I'm posing this in a very kind of devil's advocate kind of way.
00:13:36.940 Me too.
00:13:38.000 My read of you is you're someone who cares about the truth, who's prepared to be the outlier, who's prepared to say the unpopular thing from various conversations that I've seen you have.
00:13:46.980 So how true is it that you did this study out of a desire to prove that the police are biased?
00:13:53.740 Was there a part of you that was kind of, I'm going to, like, find something out here that's going to be different to what everyone else is saying?
00:13:59.560 The exact opposite.
00:14:00.640 It was, wow, I'm going to finally find something people are going to like me.
00:14:05.580 Oh, boy, were you wrong.
00:14:07.240 Yeah, man.
00:14:09.560 Yeah.
00:14:10.140 No, it is not, I promise, I do not get up in the morning thinking, let me try to stir the Kool-Aid today, right?
00:14:17.060 Like, that is not my view.
00:14:19.060 On the other hand, you know, sorry, my fellow academics.
00:14:23.060 They don't, I'm not afraid of them, right?
00:14:25.020 Like, they don't, you know, they don't intimidate me in any way, shape or form.
00:14:28.500 And so I'm not scared to say what I believe is the truth.
00:14:34.760 But I'm not an instigator.
00:14:37.000 I'm not a, people, I hate when people call me controversialist.
00:14:39.980 I don't want to be, right?
00:14:41.580 I think people who stick to bullshit, even though it doesn't fit the facts, that's controversial.
00:14:46.820 People who follow the data wherever they lead, I mean, isn't that social science?
00:14:50.860 And so, no, I literally thought this is going to be the easiest paper I'd ever written because I'd just come off of paying kids to learn.
00:15:02.740 Oh, my gosh.
00:15:04.320 You know, now people pay kids to learn.
00:15:06.400 They're like, this is interesting.
00:15:07.440 But back in 2008, oh, this was horrible.
00:15:10.240 I mean, I was compared to the Tuskegee experiments, literally, right?
00:15:14.060 Can you imagine that?
00:15:15.380 Get up one day, you go to work, and there's someone with a picket, speaking about protesting, someone with a picket sign who said,
00:15:21.040 Roland Fryer is the worst thing for black people since the Tuskegee experiments.
00:15:24.660 What are they?
00:15:25.660 Because I don't know what they are.
00:15:26.640 Oh, okay.
00:15:27.360 Well, let me give you the dichotomy here.
00:15:29.500 They're paying kids, on the one hand, to do the things that will help them later in life, unknowingly injecting people with syphilis.
00:15:38.340 You do the math, right?
00:15:40.380 Crazy.
00:15:41.060 Like, just crazy.
00:15:42.700 Yeah.
00:15:43.240 Yeah, yeah.
00:15:43.940 No, that's what happened.
00:15:45.120 Why are you surprised?
00:15:46.440 Anytime anyone has the wrong opinion now, they're literally Hitler.
00:15:49.340 This is basically it.
00:15:50.260 Yeah.
00:15:50.800 It's the same thing.
00:15:51.620 It's crazy, right?
00:15:52.180 So I had just come off of that, and then we had done this work in Houston where we took the worst schools in Houston
00:15:56.800 and did all we could to reform them.
00:16:01.580 And so through that education journey, and we'd written some papers on charter schools that people didn't love,
00:16:09.020 I thought, this is my chance.
00:16:10.880 People are going to really, finally, like something that I do.
00:16:16.160 The thing that I found disappointing from the reaction and the inevitable fallout,
00:16:20.820 where it's inevitable now with the benefit of hindsight,
00:16:22.720 is your report, as we've touched on, raised some really interesting points,
00:16:28.420 which actually, if we look to the results dispassionately,
00:16:32.080 there are things that we all need to work on, particularly the police.
00:16:35.980 And actually, if we address them, we could make relations between the African-American community
00:16:42.280 and the police so much better.
00:16:45.120 And actually, we could have, as a result of that, a more cohesive society.
00:16:49.920 Look, where were you in 2016?
00:16:54.280 Keeping my head down, Mike.
00:16:55.640 That's where I was.
00:16:56.700 That guy's controversial.
00:16:59.920 Right.
00:17:00.580 Of course, yes.
00:17:02.260 A hundred percent.
00:17:03.420 But that's the reason I think there was so much pushback,
00:17:07.220 because it was coming from different directions.
00:17:09.100 Literally, the emails I was getting, the communication I was getting.
00:17:11.880 On the one hand, people thought, oh, my gosh, what are you doing?
00:17:15.680 This is not true.
00:17:16.560 There's, you know, police are out there murdering innocent black people.
00:17:22.280 And on the other hand, I had police telling me, what are you talking about?
00:17:25.080 We're not biased.
00:17:26.920 Right.
00:17:27.400 And so, and in fact, right, when we published this in the New York Times,
00:17:34.000 the folks that were interviewing me at the time,
00:17:37.920 they were more concerned with calling NYPD biased
00:17:40.560 than they were saying that shootings were unbiased at the time.
00:17:47.160 Very, very different now.
00:17:48.760 But at the time, that's where the nervousness was coming from when they published it.
00:17:52.760 The world has changed.
00:17:53.680 The world has changed.
00:17:55.880 He's apparently remained the same.
00:17:58.580 I'm consistent.
00:17:59.620 How much do you think the reaction to your report and the fallout
00:18:05.880 and the general narratives are to do with history,
00:18:09.500 the way that the police has behaved in the past,
00:18:11.280 if we think about things like Rodney King, for instance,
00:18:13.460 and all of those types of incidents?
00:18:15.520 It has a lot to do with that.
00:18:16.700 Again, the relationship between black communities and the police are broken
00:18:19.960 in most communities.
00:18:21.680 And that's why I believe, just like you just described,
00:18:25.500 working together on these lower level uses of force,
00:18:28.780 when the stakes are lower,
00:18:32.160 that is the place to really have reform.
00:18:36.240 It's hard when the police go to a scene,
00:18:40.060 there's weapons involved,
00:18:41.640 and you say, well, now is the time to compromise.
00:18:43.780 No, I'm not sure.
00:18:45.920 And so, yeah, that is the source.
00:18:49.500 And so there have been years and years,
00:18:52.980 decades of contentious relationships
00:18:58.800 between black people and the police.
00:19:00.280 So, you know, look, if you go to the black neighborhoods,
00:19:02.360 none of this surprised anybody.
00:19:05.180 And it has just come into the mainstream in the last few years.
00:19:09.560 But this has been boiling over for a while.
00:19:11.740 And again, the frustrating part to me is,
00:19:15.280 if we could all look clear-eyed at the data,
00:19:17.300 there are positive things we can do for change.
00:19:20.080 These are not marks on a chalkboard.
00:19:22.680 This isn't academic research anymore.
00:19:24.480 That's done.
00:19:25.320 The question is, how are we going to work together
00:19:27.740 to actually create better relationships
00:19:31.360 between communities and police
00:19:33.280 in a way that's productive,
00:19:34.800 in a way that we can actually
00:19:35.960 talk through some of the difficult issues?
00:19:39.320 It seems like the challenge there,
00:19:41.380 from what I understand,
00:19:42.460 is there's a kind of mutual bias going on
00:19:44.580 because from somebody who's a civilian,
00:19:48.500 as you say,
00:19:49.460 the perspective is they're being mistreated
00:19:52.040 because of their skin color and so on.
00:19:54.040 And from the police perspective,
00:19:55.180 from people that I've spoken to as well,
00:19:56.960 there's the thing they'll say is,
00:19:58.960 well, different groups of people
00:20:00.560 commit crimes at different rates.
00:20:02.400 So if I'm a black police officer,
00:20:04.880 white, it doesn't really matter.
00:20:05.720 If I turn up to the scene of a crime
00:20:07.000 and half the time it's a black perpetrator,
00:20:09.820 that's going to build in certain narratives
00:20:11.600 in my head.
00:20:12.240 So next time I pull someone over
00:20:13.480 and they are black,
00:20:15.420 how am I not going to be
00:20:16.840 a little bit more wary of that?
00:20:18.900 And then you've got that mutual bias kicking in.
00:20:21.100 People are mutually suspicious of each other.
00:20:23.200 How does that get untangled?
00:20:24.220 Well, I think we've got to look for better clues
00:20:25.860 than just black or white.
00:20:27.060 Right.
00:20:27.420 Right.
00:20:27.780 I mean, when I was in college,
00:20:29.140 we used to call it the usual suspects.
00:20:31.160 Like some crime would be committed
00:20:32.700 and they'd say,
00:20:33.900 well, the suspect is between 5'8 and 6'10.
00:20:38.000 He's between 150 and 300 pounds.
00:20:40.920 Black male, black hair and brown eyes.
00:20:43.460 And we're like, shit, none of us can go outside.
00:20:44.860 Right?
00:20:46.460 And so we need to do better than that.
00:20:49.320 I joked with Loretta Lynch once
00:20:54.380 that we need to have,
00:20:57.200 is there something,
00:20:57.800 can we have TSA pre?
00:20:59.100 Is there something I can do
00:21:00.360 that when the police pull me over,
00:21:02.400 they can look beyond just race, right?
00:21:06.340 Is it kind of,
00:21:07.100 maybe if I wear, you know,
00:21:08.160 salmon colored pants,
00:21:09.380 I mean, that's something
00:21:09.900 that not a lot of criminals do.
00:21:11.300 Right.
00:21:11.680 I don't want to do it either.
00:21:13.520 Might be worth being roughed up.
00:21:15.520 But you get the idea.
00:21:16.880 We have to figure out
00:21:18.100 other context clues beyond that, right?
00:21:20.900 But you're right.
00:21:21.620 The police are rationally responding
00:21:24.020 to what they view as threats, right?
00:21:26.140 When I embedded myself
00:21:27.800 in the Houston Police Department,
00:21:29.000 I noticed that independent of the skin color,
00:21:31.240 the police officer was really scared
00:21:33.020 when he pulled someone over.
00:21:34.180 I asked him why.
00:21:35.520 He says, man, this is Texas.
00:21:37.040 We assume everyone's got a gun.
00:21:39.100 Yeah.
00:21:39.520 Right?
00:21:39.940 And so there,
00:21:40.800 all of these layers of complexity are here.
00:21:43.620 I just want it to be
00:21:45.300 several layers deeper
00:21:47.820 than just saying,
00:21:49.180 look at that person of their race.
00:21:52.680 And, you know,
00:21:53.180 it's the other extreme
00:21:57.200 also frustrates a lot of us.
00:21:59.240 Have you ever been in a TSA pre-line
00:22:00.780 where they're patting down the 90-year-old?
00:22:02.660 And you're like, I don't know.
00:22:04.160 Could we get this line moving?
00:22:06.000 Right?
00:22:06.120 We start to stereotype
00:22:07.800 from the other direction.
00:22:08.940 So I just think we need better ways
00:22:11.440 of understanding who's a threat
00:22:13.740 and who's not.
00:22:44.160 I love shopping for new jackets
00:22:46.840 and boots this season.
00:22:48.260 And when I do,
00:22:49.180 I always make sure I get cash back
00:22:51.020 with Rakuten.
00:22:52.140 And it's not just fashion.
00:22:53.540 You can earn cash back on electronics,
00:22:55.480 beauty, travel, and more
00:22:56.680 at stores like Sephora,
00:22:58.260 Old Navy, and Expedia.
00:22:59.880 It's so easy to save
00:23:00.960 that I always shop through Rakuten.
00:23:02.920 Join for free at rakuten.ca
00:23:04.700 and get your cash back
00:23:05.940 by Interac e-Transfer, PayPal, or check.
00:23:08.500 Download the Rakuten app
00:23:09.620 or sign up at rakuten.ca.
00:23:11.660 That's R-A-K-U-T-E-N dot C-A.
00:23:15.280 And so you presented your findings,
00:23:18.940 you presented your paper.
00:23:20.580 What happened then at that point?
00:23:24.480 You know, in the beginning,
00:23:28.000 I thought,
00:23:29.440 all this is going really well.
00:23:32.520 I did.
00:23:33.600 I got Interacted.
00:23:34.820 I'm not sure I've ever admitted it.
00:23:35.720 I really thought,
00:23:36.520 I said, this is like,
00:23:38.000 I really crushed this one.
00:23:39.800 I hit the spot.
00:23:40.480 We hit the spot, right?
00:23:42.720 Because, you know,
00:23:44.000 we have this place in economics
00:23:45.900 where we put our working papers
00:23:47.620 and one of the people
00:23:49.460 that worked there
00:23:50.200 sent me a note
00:23:51.800 and the first weekend
00:23:52.700 it was published
00:23:53.520 and says,
00:23:53.920 man, you just broke
00:23:54.580 all the download records.
00:23:55.920 You know,
00:23:56.760 this paper's hot.
00:23:58.280 And it was in the New York Times
00:23:59.580 and Upshot or what have you.
00:24:01.100 My emails were exploding.
00:24:03.560 President Obama at the time
00:24:04.780 invited me to the White House
00:24:05.860 for a very long meeting
00:24:06.800 about police use of force.
00:24:07.860 And I thought,
00:24:08.100 I am making a difference.
00:24:11.320 But as the narrative
00:24:12.920 started to change over time,
00:24:15.820 and yes, there was,
00:24:17.700 let me back up,
00:24:18.260 yes, there were threats
00:24:20.600 of physical violence
00:24:21.600 and we had police protection
00:24:23.220 and all that stuff.
00:24:24.340 And that was not good,
00:24:26.500 but I was focused on all
00:24:27.720 the good things
00:24:28.720 I thought could come out of this.
00:24:29.760 And, but over time,
00:24:32.840 it has become something where,
00:24:35.940 I don't know,
00:24:38.440 it's, people have used it
00:24:44.180 to kind of,
00:24:47.140 not just write me off,
00:24:48.240 but write off research
00:24:49.360 in this way,
00:24:50.620 kind of writ large.
00:24:52.020 And that has been very surprising.
00:24:53.360 You know,
00:24:55.520 it has
00:24:56.480 contributed to,
00:24:59.160 I think,
00:25:00.460 other social scientists
00:25:01.780 being less willing
00:25:03.860 to,
00:25:04.740 to follow the facts
00:25:06.300 wherever they lead.
00:25:07.420 And I think that is really,
00:25:09.640 if I'm right about that,
00:25:10.860 and I hope I'm not,
00:25:11.640 actually,
00:25:12.180 that's really,
00:25:13.360 really bad.
00:25:14.060 For sure.
00:25:14.940 You know,
00:25:15.200 I've got students
00:25:16.740 that,
00:25:17.660 you know,
00:25:18.160 I meet with
00:25:19.340 at Harvard Square
00:25:20.320 who say things like,
00:25:21.160 I can't say this in public.
00:25:22.700 And I'm like,
00:25:23.820 who are you?
00:25:24.820 Like,
00:25:25.040 this is just you and I
00:25:26.780 over a cup of coffee.
00:25:28.340 And people have become
00:25:29.340 so leery of it.
00:25:30.600 And,
00:25:30.980 and,
00:25:31.460 and the way that
00:25:34.380 the paper has been covered
00:25:36.020 by both sides,
00:25:37.220 right?
00:25:37.500 Both sides.
00:25:38.800 I,
00:25:39.460 I saw once
00:25:40.960 or read once
00:25:41.640 that
00:25:43.040 people were saying
00:25:44.740 football players
00:25:46.540 shouldn't kneel
00:25:47.240 at the national anthem
00:25:48.660 because
00:25:49.700 my paper demonstrates
00:25:51.160 they shouldn't kneel.
00:25:52.000 I thought,
00:25:52.360 my paper has nothing
00:25:53.160 to say about whether
00:25:53.900 or not they kneel
00:25:54.500 at the national anthem
00:25:55.220 or not.
00:25:57.420 And,
00:25:58.080 on the other hand,
00:25:59.700 I've seen
00:26:00.440 researchers find
00:26:01.760 the exact same results
00:26:02.860 but bury them,
00:26:04.240 either bury them
00:26:05.000 in appendix tables
00:26:05.840 or refuse to publish
00:26:06.840 them at all.
00:26:08.700 And so,
00:26:09.240 it has been
00:26:09.960 really quite chaotic
00:26:11.980 since then.
00:26:12.840 I would say that,
00:26:13.620 again,
00:26:13.840 the first month or two
00:26:14.800 I thought,
00:26:15.240 wow,
00:26:15.520 this is great.
00:26:16.240 I can't wait
00:26:16.660 to do the next thing.
00:26:17.500 but my life
00:26:19.500 really got turned
00:26:20.100 upside down
00:26:20.880 because of it.
00:26:22.860 And,
00:26:23.280 one last thing,
00:26:24.080 I know I'm rambling
00:26:24.580 but I do want to say this.
00:26:25.540 Take your time.
00:26:26.440 And I want,
00:26:28.080 the most important thing is
00:26:30.020 I'd do it all over again.
00:26:32.720 A hundred percent.
00:26:35.320 And almost in the exact same way.
00:26:37.840 Not because I didn't make
00:26:39.520 any mistakes along the way
00:26:40.600 and all that,
00:26:41.180 but because
00:26:43.100 I just refuse
00:26:45.960 to lie
00:26:46.540 to people,
00:26:48.020 especially
00:26:48.500 the constituents
00:26:52.780 that I've been working
00:26:53.900 for my entire life.
00:26:54.920 I've been doing this
00:26:55.420 for 21 years,
00:26:56.660 80 plus hours a week.
00:26:59.540 It's hard enough
00:27:00.800 empirically
00:27:01.580 to figure out
00:27:02.160 what the truth is.
00:27:03.060 I'm not gonna
00:27:03.620 constrain myself
00:27:05.400 for only those truths
00:27:06.640 that are in a nice box
00:27:07.740 with a particular bow.
00:27:09.260 I can't do that.
00:27:09.960 This is after
00:27:11.320 we're trying
00:27:11.800 to solve problems,
00:27:13.180 right?
00:27:13.740 And,
00:27:14.940 yes,
00:27:15.200 my life was turned
00:27:16.160 upside down.
00:27:17.660 Who cares?
00:27:19.300 Right?
00:27:19.720 This is about
00:27:20.740 whether or not
00:27:21.580 we can actually
00:27:22.760 get police to change
00:27:25.020 and actually get
00:27:26.220 folks in the neighborhood
00:27:27.340 to understand
00:27:28.220 that,
00:27:29.640 yes,
00:27:29.840 there are some issues
00:27:30.820 but,
00:27:32.000 you know,
00:27:33.300 the biggest threat
00:27:34.020 in the world
00:27:34.480 is not their police officers.
00:27:36.720 And can we come together
00:27:38.120 to make the country better
00:27:40.360 as regards public safety?
00:27:42.960 For me,
00:27:43.320 that's what this is about.
00:27:44.420 So,
00:27:44.980 who cares what happened to me?
00:27:47.800 You know,
00:27:48.300 if I discover
00:27:49.340 a result like this tomorrow,
00:27:51.720 I'm gonna put it out
00:27:52.440 in the same exact way
00:27:53.480 and if I suffer
00:27:54.180 the same consequences,
00:27:55.360 so be it.
00:27:56.120 I respect that
00:27:57.000 and I get that.
00:27:58.220 The question I was gonna ask you is,
00:28:00.000 you mentioned
00:28:00.600 the narrative changing.
00:28:03.320 I can't imagine
00:28:04.000 what it's like.
00:28:04.660 You do a paper,
00:28:05.460 you get called in
00:28:06.040 by the president
00:28:06.780 for a lengthy conversation
00:28:09.140 about your research
00:28:10.360 and then,
00:28:12.240 quote-unquote,
00:28:12.740 the narrative changes.
00:28:14.080 I don't even know
00:28:14.700 what that means.
00:28:15.440 Like,
00:28:15.960 who changed it?
00:28:17.660 I don't know.
00:28:18.820 I don't know
00:28:19.540 who gets to change narratives.
00:28:21.080 It is mind-bending.
00:28:23.800 Truly.
00:28:24.620 I can imagine.
00:28:25.860 Or maybe I can't.
00:28:26.900 A few years ago,
00:28:27.780 you weren't allowed
00:28:28.240 to say a joke on campus
00:28:29.380 and now,
00:28:29.960 you can call
00:28:30.420 for the genocide
00:28:31.280 of whole populations
00:28:32.260 and that's free speech.
00:28:33.540 Exactly.
00:28:34.160 Progress.
00:28:40.180 And let me guess,
00:28:41.080 you were there all along.
00:28:42.200 Yeah.
00:28:42.860 I was consistent.
00:28:46.540 So,
00:28:47.140 I don't know
00:28:47.780 who decides that.
00:28:49.880 But who was coming,
00:28:50.900 I guess part of what
00:28:52.000 I'm asking is,
00:28:53.260 who was coming after you?
00:28:54.460 Was it black people
00:28:56.900 in the neighborhood
00:28:57.620 who were saying,
00:28:58.400 your research
00:28:59.180 is not sufficiently
00:29:00.280 reflective of our reality?
00:29:02.320 Was it white academics
00:29:04.160 who were like,
00:29:05.060 this is a toxic subject,
00:29:06.660 you've got the wrong conclusions?
00:29:08.460 Was it the media?
00:29:09.900 Was it all of those at once?
00:29:11.520 Like,
00:29:12.220 I get,
00:29:12.800 I'm genuinely asking,
00:29:13.700 like,
00:29:14.020 who gets to change
00:29:15.320 the narrative
00:29:15.880 from you're sitting
00:29:17.080 in the White House
00:29:17.900 to suddenly you're,
00:29:18.960 you're some kind
00:29:19.620 of demon monkey?
00:29:21.020 D,
00:29:21.300 all of the above.
00:29:22.680 Okay.
00:29:24.360 It was,
00:29:25.840 but I'd put
00:29:26.540 the least amount
00:29:27.280 of emphasis
00:29:27.700 on people
00:29:28.280 in the communities.
00:29:29.060 Okay.
00:29:30.840 You know,
00:29:31.980 I think people
00:29:33.060 in the communities
00:29:33.580 had questions
00:29:34.300 and,
00:29:34.620 and,
00:29:35.020 you know,
00:29:35.560 civil rights folks
00:29:36.280 in the communities
00:29:36.820 had questions,
00:29:38.280 which ranged from,
00:29:40.480 are you sure
00:29:42.300 that Houston's representative
00:29:43.400 of this or that,
00:29:44.340 which is a fair question.
00:29:45.540 Totally.
00:29:45.980 Bring that all day long.
00:29:47.040 Totally.
00:29:47.520 To,
00:29:48.840 did you have to put it out here?
00:29:49.920 We had them on the ropes,
00:29:50.820 brother.
00:29:51.160 You know?
00:29:51.320 I'm not going to say
00:29:54.920 who that was,
00:29:55.720 but I was like,
00:29:56.780 my bad.
00:29:57.280 I didn't,
00:29:57.500 I didn't know
00:29:57.920 you had them
00:29:58.320 on the ropes.
00:29:58.900 I did.
00:30:01.980 To,
00:30:02.680 you know,
00:30:03.260 academics who,
00:30:04.340 who,
00:30:04.640 yeah,
00:30:04.940 this is,
00:30:05.480 you know,
00:30:05.740 the paper was labeled
00:30:08.960 hate speech
00:30:09.600 on Twitter
00:30:10.060 at the time.
00:30:11.420 Right?
00:30:11.920 I'm not on social media,
00:30:13.020 but someone posted it
00:30:14.280 and the account
00:30:14.840 was suspended
00:30:15.380 for hate speech,
00:30:16.220 right?
00:30:16.940 That's ridiculous.
00:30:18.460 That's ridiculous.
00:30:20.740 To other academics
00:30:22.120 who are out
00:30:22.980 lying,
00:30:24.240 saying,
00:30:24.680 oh,
00:30:24.980 he's retracted
00:30:25.800 those results.
00:30:26.420 They aren't true.
00:30:28.160 Silliness,
00:30:28.940 right?
00:30:31.080 To
00:30:31.560 police officers
00:30:33.640 who,
00:30:34.480 some of which
00:30:34.940 who say,
00:30:35.300 yes,
00:30:35.520 you may have
00:30:36.720 a point here to,
00:30:37.540 no,
00:30:38.160 I don't think
00:30:39.000 there's any bias
00:30:39.860 in lower level
00:30:40.880 uses of force.
00:30:41.560 So again,
00:30:41.940 there was something
00:30:42.560 there for
00:30:44.260 everyone not to like.
00:30:47.260 And so maybe
00:30:48.280 there's truth there
00:30:48.880 if it ended up
00:30:51.180 pissing off everyone.
00:30:52.840 So I don't know
00:30:54.120 your question
00:30:55.040 of who gets
00:30:55.560 to change the narrative,
00:30:56.400 but I will say
00:30:57.120 that different people
00:30:58.560 had different angles
00:30:59.420 on these results.
00:31:00.300 Yes.
00:31:00.600 Sure.
00:31:01.040 I guess what I'm
00:31:01.760 trying to get at is
00:31:02.920 that seems to me
00:31:04.880 like an extraordinary
00:31:06.020 thing to happen
00:31:07.140 where you go from
00:31:08.740 being celebrated
00:31:09.900 for this study
00:31:10.880 all the way
00:31:11.940 up to the White House
00:31:13.140 to then people,
00:31:14.980 something must have
00:31:16.000 happened in the
00:31:16.700 social consciousness
00:31:17.580 around that time
00:31:19.080 for people to just
00:31:19.780 see this issue
00:31:20.620 differently,
00:31:21.480 including one of the
00:31:22.480 things that strikes
00:31:23.120 me about it
00:31:23.660 is like,
00:31:24.760 why would people
00:31:25.860 be upset
00:31:26.640 about information
00:31:27.720 that's helpful
00:31:28.440 to taking
00:31:29.120 the conversation forward?
00:31:30.380 like that to me
00:31:31.740 is very strange.
00:31:32.780 Do you know
00:31:33.420 what I'm saying?
00:31:33.560 No, I do know
00:31:34.160 what you mean
00:31:34.440 and I've asked
00:31:35.300 people I respect
00:31:38.880 something similar,
00:31:39.920 not quite as elegant
00:31:40.600 as you just did,
00:31:41.420 but what their view
00:31:43.080 is, is look,
00:31:44.540 if you've got
00:31:45.440 your mind made up
00:31:46.380 that the police
00:31:48.040 are really
00:31:49.020 discriminatory,
00:31:50.380 then this is just
00:31:51.480 annoying to you,
00:31:52.540 right?
00:31:53.460 That because the
00:31:55.420 paper's not perfect,
00:31:57.280 right?
00:31:57.560 I don't have
00:31:58.520 every police
00:31:59.140 department.
00:31:59.960 I don't know
00:32:00.620 if Chicago's
00:32:01.300 police department
00:32:01.840 is discriminatory
00:32:03.040 in shootings
00:32:03.480 versus not.
00:32:04.100 I don't have
00:32:04.400 Chicago's data.
00:32:05.200 I'd love to.
00:32:07.020 I don't know
00:32:08.100 what's going on
00:32:08.920 in a lot of
00:32:09.300 police departments
00:32:09.880 and the data's
00:32:10.440 not perfect.
00:32:11.180 And so in their
00:32:11.760 view, I've put
00:32:13.500 something out there
00:32:14.380 irresponsibly
00:32:15.380 that shows
00:32:17.900 something that
00:32:18.400 they just know
00:32:19.160 can't be true.
00:32:20.880 And so it's
00:32:21.600 inconvenient for them
00:32:22.420 instead of
00:32:23.320 saying,
00:32:24.320 huh, how can
00:32:25.960 we work together
00:32:26.600 to collect more
00:32:27.620 data?
00:32:28.300 This is the best
00:32:28.900 evidence we have.
00:32:30.860 Let's collect more
00:32:31.900 data.
00:32:32.260 Let's see if it's
00:32:32.840 real robust.
00:32:33.500 That's how social
00:32:34.280 science typically
00:32:35.100 makes progress.
00:32:36.180 But in this case,
00:32:38.220 it's such a hot
00:32:40.080 topic, or at least
00:32:41.040 has been, that
00:32:42.440 people are just
00:32:43.400 jumping to
00:32:43.920 conclusions and
00:32:44.820 taking sides.
00:32:45.540 It's become much
00:32:46.260 more tribal than
00:32:48.000 social scientific.
00:32:50.000 And I'm not
00:32:51.040 equipped for that.
00:32:51.920 That's not what I
00:32:52.520 do for a living.
00:32:53.280 I don't join
00:32:53.880 tribes, unless the
00:32:56.400 nerds have one.
00:32:57.820 But you get the
00:32:59.020 idea.
00:32:59.460 So I don't, that's
00:33:00.600 what they're upset
00:33:01.280 about, I believe,
00:33:02.500 though I don't know.
00:33:03.520 It's about
00:33:04.120 narratives, isn't
00:33:04.900 it?
00:33:05.020 Every tribe,
00:33:06.000 they have their
00:33:06.600 narrative.
00:33:07.380 If you are, for
00:33:08.680 want of a better
00:33:09.200 term, conservative,
00:33:10.400 pro-police, you
00:33:11.660 have this narrative.
00:33:13.060 If you are a
00:33:13.980 liberal progressive,
00:33:14.880 you have this
00:33:15.540 narrative.
00:33:16.180 And when someone
00:33:16.900 who comes along
00:33:17.880 not only challenges
00:33:19.580 your narrative, but
00:33:20.460 also provides
00:33:21.300 evidence, which is
00:33:22.940 very strong,
00:33:24.520 borderline irrefutable,
00:33:26.000 then you're faced
00:33:28.020 with two options,
00:33:28.800 really.
00:33:29.360 You either change
00:33:31.360 the narrative, which
00:33:32.120 is incredibly painful
00:33:33.040 because that means
00:33:33.600 that you get
00:33:34.000 isolated from your
00:33:35.020 tribe, or you
00:33:37.100 defend your tribe,
00:33:37.980 which means you go
00:33:38.660 on the attack.
00:33:39.420 Yeah, and you
00:33:40.220 discredit the person
00:33:41.280 or the study or
00:33:42.020 what have you.
00:33:42.900 And I think part
00:33:44.320 of that's it.
00:33:44.940 This is a, there
00:33:47.260 is a political game
00:33:48.540 going on or a
00:33:49.720 political fight going
00:33:50.900 on, or a culture
00:33:52.020 war, whatever you
00:33:52.720 want to call it.
00:33:53.900 And frankly, for
00:33:55.500 both sides, this
00:33:56.160 was a little
00:33:56.520 inconvenient.
00:33:57.760 Right.
00:33:58.920 And, but, I
00:34:02.660 thought that's
00:34:03.120 what tenure was
00:34:03.740 for.
00:34:04.160 I thought that's
00:34:04.740 what we were
00:34:05.120 supposed to be
00:34:05.680 doing.
00:34:06.060 I'm pretty
00:34:06.400 thrilled by the
00:34:08.340 idea that you
00:34:10.720 can get data, you
00:34:12.080 can do the
00:34:13.140 analysis to the
00:34:13.800 best of your
00:34:14.200 ability, and you
00:34:14.740 can actually be
00:34:15.400 relevant, right?
00:34:16.780 Many of my
00:34:17.260 economist friends
00:34:17.880 are working on
00:34:18.340 the optimal cake
00:34:19.020 eating problem,
00:34:19.960 right?
00:34:20.200 That's not
00:34:21.080 what this is.
00:34:22.020 So, for me, it
00:34:22.860 was a real, truly,
00:34:24.220 a real opportunity
00:34:25.060 to actually feel
00:34:25.980 like you were
00:34:26.680 relevant in a
00:34:27.400 discussion that
00:34:27.980 was important.
00:34:28.480 I've never
00:34:28.840 written a paper
00:34:29.520 that spoke to
00:34:31.920 one of the
00:34:32.580 exact points that
00:34:34.300 people were
00:34:35.780 protesting in that
00:34:37.280 moment, right?
00:34:39.340 Usually, academics
00:34:40.160 are like, oh,
00:34:40.620 something happened
00:34:41.160 in the world,
00:34:41.800 seven years later,
00:34:42.680 I got it!
00:34:45.040 The world has
00:34:45.960 moved on, and
00:34:47.140 it becomes part of
00:34:47.920 history.
00:34:48.340 This, I got
00:34:49.980 really lucky, I
00:34:51.580 believe, that
00:34:52.420 this paper was
00:34:55.040 relevant in that
00:34:55.920 moment, unfortunately,
00:34:57.120 and continues to be
00:34:57.940 relevant, but I
00:35:00.200 was not expecting,
00:35:01.440 because I had not
00:35:01.860 seen that, I had not
00:35:02.920 had that experience
00:35:03.580 before, I was not
00:35:04.760 expecting what would
00:35:05.600 come.
00:35:06.580 It's also, and
00:35:07.900 maybe this is me
00:35:08.980 projecting a little
00:35:09.680 bit and push back if
00:35:10.580 you feel this is
00:35:11.220 incorrect, I imagine
00:35:13.120 it put a few noses
00:35:14.520 out of joint with
00:35:16.040 fellow academics,
00:35:16.920 you know, because a
00:35:17.920 lot of them toil
00:35:18.600 away, they produce
00:35:20.020 their papers, they
00:35:21.020 don't get, you
00:35:22.220 know, they're not
00:35:22.640 shared a lot, they're
00:35:23.860 not talked about a
00:35:24.860 lot, and then they
00:35:25.600 move on to the next
00:35:26.320 one.
00:35:26.740 Here you have this
00:35:27.680 young guy who comes
00:35:29.060 in, produces his
00:35:30.400 paper, gets to meet
00:35:32.020 with the president,
00:35:33.080 breaks the download
00:35:33.980 records, and all of
00:35:35.280 a sudden they're
00:35:35.640 going, him?
00:35:38.800 Why him?
00:35:39.400 Why not me?
00:35:40.400 Was that part of it,
00:35:41.340 do you think?
00:35:41.640 Yeah, that's
00:35:41.980 interesting.
00:35:43.300 I haven't thought
00:35:43.900 much about that it
00:35:44.660 could be, and I'd
00:35:45.540 also add to that,
00:35:46.520 that I don't, I
00:35:51.140 don't carry myself
00:35:51.960 as the typical
00:35:53.300 Harvard, maybe I
00:35:54.680 do now, the
00:35:55.120 sweater looks pretty
00:35:55.720 typical.
00:35:55.940 I mean, your
00:35:56.600 shirt is, my
00:35:58.820 wife bought me
00:35:59.480 this.
00:36:02.420 But, but in all
00:36:03.780 seriousness, you
00:36:04.360 know, when I was
00:36:04.740 younger in my
00:36:05.240 career, I'd wear
00:36:06.020 jerseys to class, and
00:36:07.360 you know, I was,
00:36:07.860 you know, just being
00:36:08.440 myself, right?
00:36:09.240 And so, and you
00:36:11.120 know, I'm a video
00:36:12.160 game aficionado, so I
00:36:14.140 had a video game
00:36:14.760 console in my
00:36:15.420 office, things like
00:36:16.200 that, that I think,
00:36:18.380 you know, they might
00:36:19.360 have not been, not
00:36:20.020 just saying, I can't
00:36:20.960 believe that, but
00:36:21.420 that guy, like the
00:36:22.880 guy who's playing
00:36:23.400 video games during
00:36:24.300 the day?
00:36:26.080 So some of that
00:36:27.140 could have been it.
00:36:27.680 There's, you know,
00:36:28.200 it's no secret, there's
00:36:29.300 a lot of jealousy in
00:36:30.780 the academy.
00:36:31.700 But I think it was,
00:36:32.740 it was more than
00:36:33.900 that.
00:36:34.140 I think it was
00:36:35.320 that people wanted
00:36:39.120 to be seen as being
00:36:40.320 on the right side of
00:36:41.380 this issue, back to
00:36:42.280 the tribes.
00:36:44.240 I mean, my own
00:36:45.080 department put out a
00:36:46.080 statement about police
00:36:46.900 use of force that was
00:36:47.880 completely opposite of
00:36:48.880 what the paper actually
00:36:50.600 showed, right?
00:36:52.040 That was one of the
00:36:52.580 hardest things to see.
00:36:54.080 I was like, it's right
00:36:55.300 here, I'm in the same
00:36:56.360 department, and their
00:36:57.860 statement started off,
00:36:59.040 we know how much bias
00:37:00.040 and race and
00:37:00.740 discrimination, I was
00:37:01.280 like, that's not
00:37:02.160 exactly what we found.
00:37:04.140 So not even my own
00:37:05.120 department could be
00:37:05.760 subtle in that way.
00:37:07.240 So I think people
00:37:08.160 were running scared.
00:37:11.340 There was a sense of
00:37:13.060 are you with the
00:37:14.420 good guys or the
00:37:15.140 not?
00:37:15.740 And so you couldn't
00:37:16.780 in 2017 or 18 in
00:37:19.100 an Ivy League
00:37:21.460 university stand up
00:37:22.580 and say, I don't
00:37:23.840 know, it seems
00:37:24.760 complicated, let's take
00:37:25.880 a look at the
00:37:26.300 numbers.
00:37:26.680 It was, nope,
00:37:27.420 clearly it's biased.
00:37:28.200 We've seen the
00:37:28.620 videos, we've seen
00:37:29.200 all, you know, 12,
00:37:30.620 15, 20 of them,
00:37:31.460 whatever the number
00:37:31.900 is, we've seen them
00:37:32.540 and that's bias.
00:37:33.100 And that's the, I
00:37:34.680 felt like that was
00:37:35.460 the knee-jerk reaction
00:37:36.320 that was happening.
00:37:37.360 I think the other
00:37:37.960 disappointing thing that
00:37:39.260 happened, in my opinion,
00:37:40.100 was that the standards
00:37:43.960 of evidence start to
00:37:44.880 change when you don't
00:37:45.500 like the result.
00:37:48.260 And so there were
00:37:50.360 people I really
00:37:50.840 respected, their
00:37:51.740 methods, their
00:37:53.140 analysis over the
00:37:54.040 years, followed their
00:37:55.220 work for literally
00:37:56.040 more than a decade,
00:37:57.340 who said things like,
00:38:00.040 well, I don't, this, I
00:38:01.260 don't like the way you've
00:38:02.060 done this paper.
00:38:02.740 And I was like, it's
00:38:03.280 the same as you do,
00:38:04.540 right?
00:38:05.460 So what has changed?
00:38:08.020 And so this cherry
00:38:09.460 picking of what's good
00:38:12.280 and what's bad, not
00:38:13.400 based on the
00:38:14.100 fundamentals, but based
00:38:15.920 on the output, that
00:38:17.800 is, that's the part
00:38:18.860 that's most concerning
00:38:19.680 to me.
00:38:20.280 You see what I'm
00:38:20.620 saying?
00:38:21.000 Yeah, yeah.
00:38:21.500 I see exactly what
00:38:21.660 you're saying.
00:38:22.120 I mean, what you're
00:38:23.300 talking about is you're
00:38:24.760 someone who's
00:38:25.360 interested in pursuing
00:38:26.220 the truth, who found
00:38:28.400 himself in a
00:38:29.040 politicized environment,
00:38:30.740 which, this will sound
00:38:33.100 perhaps like a strange
00:38:33.860 comparison, but as
00:38:34.600 someone who grew up in
00:38:35.260 the Soviet Union, that's
00:38:36.620 exactly the environment
00:38:38.100 that people operated
00:38:40.000 under, where there was
00:38:41.960 certain things, certain
00:38:42.940 areas.
00:38:43.320 If you research physics,
00:38:44.500 physics was non-political
00:38:45.920 until it became political
00:38:47.320 for whatever reason, and
00:38:48.440 suddenly your physics
00:38:49.820 paper would be
00:38:50.800 politicized, and people
00:38:52.160 would have opinions about
00:38:53.180 it, and they would
00:38:53.700 criticize it, not on
00:38:55.080 the merits, but because
00:38:56.320 it was the tribal
00:38:58.020 thing at that
00:38:58.840 particular moment in
00:38:59.740 time to do.
00:39:00.560 So I guess the
00:39:01.360 question I wanted to
00:39:02.100 ask you, we've had
00:39:02.760 obviously lots of
00:39:03.520 academics on the
00:39:04.280 show, and anyone
00:39:05.920 looking from the
00:39:06.740 outside into what we
00:39:08.820 see as the
00:39:09.380 manifestations of the
00:39:10.580 academic environment
00:39:11.840 now, people have
00:39:14.640 questions, and the
00:39:16.580 pursuit of truth seems
00:39:17.560 to be definitely, from
00:39:18.940 an outsider's
00:39:19.580 perspective, one of the
00:39:20.420 things that's really
00:39:20.920 taken a beating in
00:39:21.940 recent years.
00:39:22.780 Would you, on the
00:39:24.200 campuses around the
00:39:25.760 world, actually, the
00:39:26.580 Western world, would
00:39:27.320 you agree with that?
00:39:28.560 Do you think that's
00:39:29.040 true?
00:39:29.600 Certainly on, you
00:39:32.840 know, social topics and
00:39:35.540 sensitive subjects.
00:39:36.600 You know, I don't have a
00:39:38.500 sense that the math
00:39:39.240 guys are doing
00:39:39.860 functional theory any
00:39:40.840 differently than they
00:39:41.500 were before, but I do
00:39:43.620 believe that, you
00:39:46.200 know, research into
00:39:48.440 genetics, research into
00:39:50.120 gender identity or other
00:39:55.980 identities, research into
00:39:57.260 the police and
00:39:58.220 education, I think those
00:39:59.460 things have been heavily
00:40:00.500 skewed, yes, I do
00:40:01.660 believe that, and it's
00:40:03.580 exactly as you
00:40:04.700 described.
00:40:05.660 I really, you know, I
00:40:06.960 showed up at Harvard at
00:40:08.280 age, I don't know, maybe
00:40:10.240 I was 26 or something, 25,
00:40:12.540 and the job I had before
00:40:16.440 that, I was like
00:40:16.940 McDonald's, I mean, I
00:40:17.760 don't know anything
00:40:18.540 better.
00:40:20.400 I really believed that, I
00:40:22.960 mean, we sit in these
00:40:23.680 seminars once a week, twice
00:40:25.680 a week, and someone
00:40:27.860 presents something, and
00:40:28.920 it's just crushing them
00:40:31.100 about exactly how you do
00:40:32.760 economics.
00:40:33.280 Let's get to the
00:40:33.920 fundamentals of what,
00:40:35.180 whether this paper is
00:40:36.040 correct or not.
00:40:37.520 I love that process,
00:40:39.380 because it didn't really
00:40:40.900 matter what the subject
00:40:42.420 was, the question is,
00:40:43.860 is this a valid research
00:40:45.800 design or not?
00:40:47.200 And those things get
00:40:48.380 spirited, and I did the
00:40:51.060 last three years of my
00:40:51.800 PhD at the University of
00:40:52.580 Chicago, which is where
00:40:53.480 they do it really in a
00:40:54.920 spirited way.
00:40:56.180 I mean, this is like, it
00:40:57.340 was a full contact sport.
00:40:58.640 I love the place, really
00:41:00.160 love the place.
00:41:00.800 I've given seminars there
00:41:01.740 that have spilled into
00:41:02.640 the middle of the night.
00:41:03.640 And so I fundamentally
00:41:06.120 believed that.
00:41:08.340 And so it literally
00:41:10.500 changed my whole worldview,
00:41:12.160 because I thought, oh,
00:41:14.580 crap, they were playing
00:41:17.120 a different, we're playing
00:41:18.060 different games here.
00:41:19.980 Right?
00:41:20.580 And I didn't know that.
00:41:22.640 And obviously it doesn't
00:41:24.200 apply everywhere.
00:41:25.340 There's, you know, there's
00:41:26.980 really, really great
00:41:28.500 academics everywhere, in
00:41:30.000 every place.
00:41:30.500 But I have been surprised
00:41:33.280 because when I came to
00:41:34.600 Harvard as a postdoc, I
00:41:37.300 went to the Harvard
00:41:38.180 Society of Fellows.
00:41:41.820 Academics, I viewed
00:41:42.820 academics as the place
00:41:44.020 where smart, quirky people
00:41:47.980 went because they had no
00:41:50.040 political skills.
00:41:51.740 And so they weren't going
00:41:52.780 to go to corporations
00:41:53.500 because those were too
00:41:54.340 political.
00:41:55.280 But in a department, it was
00:41:58.560 really about meritocracy and
00:41:59.960 how smart you were and the
00:42:01.080 best ideas would rise to
00:42:02.060 the top.
00:42:04.000 I just don't think that's
00:42:05.040 true anymore.
00:42:06.120 I think that we're
00:42:08.000 selecting for people who
00:42:08.880 are much more political
00:42:09.760 and much more sophisticated
00:42:11.880 on that dimension because
00:42:13.400 the universities are, you
00:42:16.240 know, to be successful.
00:42:20.860 people, if you're a young
00:42:24.280 scholar, I do believe political
00:42:25.820 sophistication and savvy are
00:42:27.180 very important.
00:42:28.840 Just on that point, not to
00:42:30.540 personalize this too much,
00:42:31.800 Roland, but one of the people
00:42:32.900 involved in the campaign
00:42:34.660 against you was, I nearly
00:42:38.160 said the late Claudine
00:42:39.240 Gay.
00:42:42.020 And it seems to me, without,
00:42:44.100 you know, taking a particularly
00:42:45.560 strong stance either way on it,
00:42:46.920 that just that, her recent
00:42:48.920 departure kind of symbolized a
00:42:50.660 lot of the simmerings of the
00:42:51.980 stuff that we've been talking
00:42:53.680 about.
00:42:54.140 Now, I know you probably feel
00:42:56.180 quite strongly on a personal
00:42:57.420 level about her, but what did
00:42:59.080 you make of that whole saga,
00:43:00.760 including in relation to you,
00:43:02.340 but also just in terms of where
00:43:03.640 we are in the way that people
00:43:06.360 are appointed to positions within
00:43:08.000 academia, people are promoted
00:43:09.400 within academia, people are
00:43:10.980 required to have certain
00:43:12.880 opinions, people are treated
00:43:14.160 differently depending on their
00:43:15.320 background and ethnic background,
00:43:16.640 et cetera.
00:43:18.060 I found that whole situation
00:43:19.560 pretty sad.
00:43:23.960 It was just not a good day for
00:43:29.320 higher ed, for Harvard, or it
00:43:32.140 wasn't a day, weeks and weeks.
00:43:34.720 And so I just thought that
00:43:37.620 particularly because much of the
00:43:42.640 discussion was about her own
00:43:43.940 scholarship, which I know nothing
00:43:45.100 about, I've never read a paper by
00:43:46.380 Claudine Gay, but that people
00:43:51.420 seemingly around the world were
00:43:54.220 questioning the quality of that
00:43:55.740 at an institution like Harvard was
00:43:59.100 really hard.
00:44:00.280 You know, that was just really
00:44:03.820 sad.
00:44:05.220 So my personal involvement with the
00:44:11.180 administration aside, it's just, you
00:44:14.780 know, hard to watch every single
00:44:16.760 day people questioning the
00:44:18.680 integrity, the academic rigor of the
00:44:24.460 person who was supposed to be
00:44:25.600 leading the university.
00:44:26.760 And when you say it's hard, just to
00:44:28.620 be clear, you mean because it
00:44:32.380 possibly is the case?
00:44:34.640 Or do you mean because it's just
00:44:36.860 it kind of affects the reputation of
00:44:39.500 Harvard or why is it sad?
00:44:41.980 Both.
00:44:42.560 Because, you know, this is what we're
00:44:44.600 there for, rigor.
00:44:47.840 And, you know, I am a, I don't know a lot
00:44:51.040 about university administration.
00:44:52.280 I just don't, clearly.
00:44:53.420 Um, so, um, but, you know, I, I came to
00:45:01.140 the university when Larry Summers was
00:45:03.100 president and, um, I'll never forget.
00:45:08.600 Uh, this is not dodging your question.
00:45:10.940 This is giving context for your
00:45:12.000 question.
00:45:13.080 I got a call when I was, uh, a fourth
00:45:15.840 year graduate student and I never had a
00:45:17.880 call like this.
00:45:18.440 Maybe you, you're a fancy guy.
00:45:19.620 Maybe you've had one.
00:45:20.820 Uh, it's, you know, you pick up the
00:45:22.160 phone and it's, hello, hold for the
00:45:24.240 president of Harvard University.
00:45:26.520 Okay.
00:45:28.080 And so Larry gets on the phone.
00:45:29.420 We started talking and he tells me, if
00:45:32.760 you come to Harvard, because I had
00:45:34.020 offers at other places, we'll go to
00:45:36.840 lunch once a month and we'll talk about
00:45:38.300 your research.
00:45:40.080 What?
00:45:41.300 That would be unthinkable.
00:45:42.560 Now, I just couldn't imagine a
00:45:43.960 university president doing that.
00:45:45.120 Okay.
00:45:45.360 So he does it.
00:45:46.260 I don't believe him, but he says it.
00:45:49.140 So I get off.
00:45:49.940 I choose Harvard, go there.
00:45:52.160 Within four weeks, I get the call
00:45:53.740 again.
00:45:54.740 I'd like to schedule lunch with you and
00:45:55.800 the president.
00:45:57.800 Okay.
00:45:58.600 So I go and we go to lunch and I had
00:46:01.000 just written a paper about how, um,
00:46:04.720 black students who go to historically
00:46:05.980 black colleges may pay, pay a wage
00:46:07.740 price.
00:46:09.780 And I was calling the paper at that
00:46:11.180 time, the price of identity.
00:46:12.960 Okay.
00:46:14.020 So he, he calls me in cause he wants to
00:46:16.200 discuss that paper.
00:46:16.940 And I go, Oh, I must be in trouble.
00:46:18.180 And he starts eating and he gets so
00:46:21.740 excited about the research that
00:46:24.760 he's talking while he's eating and his
00:46:26.800 food is splattering all over him.
00:46:28.340 And in that moment, that's the first
00:46:29.560 time I ever really felt comfortable at
00:46:30.920 Harvard.
00:46:31.500 I thought this guy could be president.
00:46:33.080 He's a mess.
00:46:34.840 If he can be president, then there's a
00:46:36.400 chance because in that moment, it felt
00:46:38.400 all about ideas.
00:46:39.640 Right.
00:46:40.020 Yeah.
00:46:40.760 Right.
00:46:41.040 He was sloppily eating.
00:46:44.060 I probably was using the wrong fork, but
00:46:46.180 we were talking about statistical
00:46:47.740 identification on a particular paper
00:46:49.360 that he thought was interesting.
00:46:51.880 And so that's what I prize in
00:46:54.560 administration.
00:46:55.460 I want someone who's been through it to
00:46:57.420 help guide you through.
00:46:59.560 It would have been nice to have, um,
00:47:02.520 Larry in 2016, when I was navigating this
00:47:05.340 stuff with Washington and with the paper
00:47:07.240 to provide guidance.
00:47:08.920 Right.
00:47:09.040 I, my budget, make sure I'm, that's
00:47:12.360 silly.
00:47:12.680 Anyone can do that.
00:47:13.620 The reason it was hard is
00:47:16.640 because my preference, and the way I
00:47:18.740 started at Harvard, was to have a scholar,
00:47:21.000 a true scholar president.
00:47:23.200 And, again, I don't know about
00:47:27.240 former President Gay's scholarship.
00:47:29.200 I've never read one of her papers.
00:47:30.760 Maybe they're phenomenal.
00:47:31.600 I don't know.
00:47:32.440 But to see the media attacking
00:47:35.340 those papers, questioning scholarship,
00:47:37.100 and questioning what it meant to be at
00:47:38.640 Harvard University, as someone who's been
00:47:40.700 there for 20 years, that part was hard.
00:47:43.120 Sure.
00:47:43.460 I understand the experience you described.
00:47:45.600 It sounds like the way universities should
00:47:48.240 operate, the pursuit of truth, the discovery.
00:47:51.140 I'm going to sound like a wide-eyed romantic
00:47:53.880 here, but that's kind of what I thought
00:47:55.420 academia was for.
00:47:57.500 That's what I thought.
00:47:58.020 Me too.
00:47:58.260 You know, I had an idea once
00:48:03.000 about segregation and how to measure it differently,
00:48:06.180 because the old measures of segregation have been around
00:48:08.240 for decades and decades. But could we use our new
00:48:11.700 understanding of network theory to find a kind of
00:48:15.000 network-based model of segregation?
00:48:17.320 It's, it will sound weird, but it's the truth.
00:48:19.160 I was flying into Chicago and if you fly into Chicago at
00:48:21.580 night, the whole city is massive, but it looks like
00:48:24.360 this grid and the lights kind of make the edges of that
00:48:28.180 grid.
00:48:28.500 You can kind of see a matrix.
00:48:30.140 And I looked out the window on the plane and I thought,
00:48:32.080 geez, that's how we should think about segregation.
00:48:34.400 Because I knew Chicago, and I knew basically where the black
00:48:38.540 people lived in that grid, and where the Polish people lived.
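The grid intuition he describes, neighbourhoods as cells in a network, with segregation showing up as same-group clusters, can be sketched in a few lines. This is only a toy illustration of that intuition, not the measure from his actual paper; the function names and the index definition here are my own.

```python
from collections import deque

def same_group_component_sizes(grid):
    """grid: dict mapping (row, col) -> group label.
    For each cell, compute the size of its same-group connected
    component (4-neighbour adjacency). Bigger components mean a
    group's members are more clustered together."""
    sizes = {}
    for start in grid:
        if start in sizes:
            continue
        group = grid[start]
        component, queue, visited = [], deque([start]), {start}
        while queue:  # BFS over same-group neighbours
            r, c = queue.popleft()
            component.append((r, c))
            for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if nb in grid and nb not in visited and grid[nb] == group:
                    visited.add(nb)
                    queue.append(nb)
        for cell in component:
            sizes[cell] = len(component)
    return sizes

def segregation_index(grid):
    """Average same-group component size, normalised by group size.
    1.0 = every group fully clustered; lower values = more mixing."""
    sizes = same_group_component_sizes(grid)
    counts = {}
    for g in grid.values():
        counts[g] = counts.get(g, 0) + 1
    return sum(sizes[c] / counts[grid[c]] for c in grid) / len(grid)
```

A fully split 2x2 city scores 1.0, while a checkerboard of the same two groups scores 0.5, which is the kind of contrast a network-based measure picks up and a simple citywide count would miss.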
00:48:40.640 And so I eventually wrote that paper and then gave it
00:48:47.220 in one of the flagship seminars, the one Gary Becker
00:48:51.180 ran at Chicago, and Gary Becker had a Nobel Prize.
00:48:53.980 He was a phenomenal economist, one of my
00:48:57.060 absolute heroes in the profession.
00:48:59.660 And that seminar went for an hour and a half.
00:49:02.580 Then it spilled over to the Q Club, where I'm
00:49:07.480 sitting there with multiple Nobel Prize winners around
00:49:10.620 the table, and assistant professors and other people, and
00:49:13.000 me, as an assistant professor or a postdoc.
00:49:17.380 And I just have this image in my head of literally Gary
00:49:21.080 Becker, this famous economist, with a chicken wing hanging out
00:49:24.180 of his mouth, pointing to a
00:49:29.080 napkin where he was going to scribble some notes
00:49:31.480 about this paper while eating the chicken wing.
00:49:33.560 We can both romanticize, but those are the ways I grew
00:49:36.980 up in academia.
00:49:37.720 So to now have it be a question of, can we say that?
00:49:42.060 Can we say this?
00:49:42.780 Can we do that?
00:49:43.420 We're not the world's spokespeople, right?
00:49:46.800 In the same way comedians aren't political ambassadors.
00:49:50.840 I mean, some think they are, right?
00:49:52.360 I'm going to be honest with you.
00:49:53.820 But we are supposed to be the standard-bearers of truth
00:49:57.460 and then let everyone else figure out the politics.
00:50:00.840 We are supposed to just inject truth into the world.
00:50:03.660 Maybe that's naive.
00:50:04.380 Maybe I'm stupid, et cetera.
00:50:05.720 But that is the way I grew up.
00:50:07.820 That's what Glenn Loury and Gary Becker and some of the
00:50:12.100 greats taught me.
00:50:14.460 And we're way, way, way different from that now.
00:50:17.080 Roland, you know, when you were talking about getting
00:50:21.160 your research published, breaking download records, et cetera,
00:50:24.060 et cetera.
00:50:25.260 My thought to myself is, in a post-George Floyd world,
00:50:30.420 there's part of me that thinks you wouldn't even get published now.
00:50:34.520 I think there would be some academic institutions,
00:50:37.580 maybe not all of them, but certainly some of them, that would say,
00:50:40.220 we can't publish this.
00:50:41.580 Yeah.
00:50:42.100 We, we, what are you trying to do?
00:50:44.560 Yeah.
00:50:45.520 You might be right.
00:50:46.800 One of the projects I'm working on right now
00:50:51.440 is to understand academic freedom.
00:50:52.960 So I'm looking into all the published papers
00:50:57.700 in the last two decades, and I'm using AI to help me understand
00:51:02.740 the political slant of the results.
00:51:06.100 Would these results be considered centrist or conservative or liberal?
00:51:10.240 And what I'm showing, I'm not sure I'm right just yet,
00:51:14.000 but the preliminary results are exactly what you just
00:51:16.920 described, which is that there are far fewer published results
00:51:21.640 that are either center or right,
00:51:26.560 leading up to and after 2020.
00:51:28.340 Now the question is, is it because the journals are not publishing them,
00:51:31.980 which is your inclination, or because people aren't writing them?
00:51:34.380 I can't tell the difference.
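The quantity he is tracking, the share of published results that lean center or right, before versus after 2020, reduces to a simple tally once each paper has a slant label. The labels themselves would come from the AI classifier he mentions; this sketch (function name and toy data are mine, not from his project) just shows the counting step.

```python
def slant_share_by_period(papers, cutoff_year=2020):
    """papers: list of (year, slant) pairs, slant in {'left','center','right'}.
    Returns the fraction of papers classified center-or-right before
    and from the cutoff year on. Assumes each period is non-empty."""
    shares = {}
    for period, keep in (("pre", lambda y: y < cutoff_year),
                         ("post", lambda y: y >= cutoff_year)):
        subset = [s for y, s in papers if keep(y)]
        shares[period] = sum(s in ("center", "right") for s in subset) / len(subset)
    return shares
```

Note that this tally alone cannot separate his two hypotheses, journals rejecting such papers versus scholars never writing them, which is exactly the identification problem he says he can't yet resolve.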
00:51:35.480 Maybe progressivism is just true.
00:51:37.760 There was
00:51:41.480 an article about that in the Harvard student newspaper.
00:51:45.680 They said, what do we need diversity of thought for
00:51:48.100 when progressivism is just right?
00:51:51.640 It's true, yeah.
00:51:54.100 Why research?
00:51:55.100 We know, yeah, we know the answer to everything.
00:51:57.220 Exactly, precisely.
00:51:58.560 But that is terrifying because,
00:52:05.320 as is said all the time, the greatest form of censorship
00:52:08.160 is self-censorship.
00:52:09.280 Yeah.
00:52:09.820 So if you've got this idea and you think,
00:52:11.800 this could be a dynamite paper and then go,
00:52:15.040 but I can't write it because it won't get past review.
00:52:18.780 I've got the wrong results.
00:52:19.760 Yeah, because it will get the wrong results,
00:52:21.520 or what you most likely think about is self-preservation.
00:52:25.080 Yes.
00:52:25.380 What is this going to expose me to?
00:52:26.900 Because not everybody's like you temperamentally.
00:52:29.420 Then what we've got is essentially
00:52:33.540 a system that isn't fit for purpose.
00:52:36.740 A hundred percent.
00:52:37.340 And you're going to be making policies
00:52:38.960 based on the research that comes out,
00:52:40.780 which is only going to be one way
00:52:42.420 because no one wanted to show the other.
00:52:44.720 That's
00:52:45.700 fundamentally what
00:52:47.220 I'm losing sleep at night about.
00:52:48.820 That's exactly it.
00:52:50.220 So I guess the question is,
00:52:51.980 what do we do about this?
00:52:53.420 How can we challenge this?
00:52:54.780 How can we make a better academy?
00:52:58.480 Well, I don't know the answer to that,
00:53:01.180 because it's very difficult.
00:53:05.640 I used to think,
00:53:07.400 well, let's have grant money
00:53:09.820 for controversial ideas.
00:53:11.700 That's not going to work
00:53:12.480 unless people can get tenure
00:53:16.600 and rise through the ranks
00:53:18.360 in their profession
00:53:19.180 by doing what they know is right.
00:53:24.200 And I'm skeptical
00:53:25.420 that we can change that part.
00:53:27.020 Unless you change
00:53:27.980 that overall set of incentives,
00:53:29.680 well, you know,
00:53:30.240 all the rest of the stuff
00:53:31.280 is window dressing.
00:53:32.240 That makes sense to me.
00:53:33.220 So until you make truth seeking
00:53:35.020 a high status thing
00:53:35.940 to do within academia,
00:53:37.940 you can't win.
00:53:39.160 You can't win.
00:53:39.640 But that's the purpose of academia.
00:53:41.940 So how, how, how do,
00:53:43.460 I mean, there's got to be a way
00:53:44.600 to make it,
00:53:45.840 to coin a phrase,
00:53:46.900 to make academia great again.
00:53:49.200 But there's got to be a way, right?
00:53:52.180 I'm trying to think of the
00:53:53.520 acronym. It's not MAGA.
00:53:58.020 Yeah, I think, you know,
00:53:59.720 all it takes is a couple of universities
00:54:01.200 to show us the way.
00:54:02.540 So I think, you know,
00:54:03.700 obviously that's what University of Austin
00:54:05.240 is trying to do.
00:54:05.980 You know, there are universities,
00:54:09.260 like the University of Chicago,
00:54:10.420 that have tried to keep it that way.
00:54:12.420 I just talked to the
00:54:14.980 general counsel
00:54:15.820 of a couple of other universities,
00:54:16.940 who asked me the exact same question.
00:54:18.420 How do we do this?
00:54:19.500 And the issue is,
00:54:20.560 even if you do it
00:54:21.320 within your own university,
00:54:22.680 if that person doesn't get tenure there,
00:54:25.280 they fall potentially.
00:54:29.060 You know, imagine I had a research program
00:54:30.900 that said, you know what?
00:54:33.380 A lot of people have been looking
00:54:35.180 into government programs
00:54:37.680 as a way of helping black Americans.
00:54:41.220 I'm going to look into
00:54:42.340 how families can do a better job.
00:54:47.560 I'm not saying that's a good idea or not,
00:54:48.920 but just imagine someone said that.
00:54:50.960 And they decided on,
00:54:51.980 they went on a very rigorous research course
00:54:53.720 over five or eight years
00:54:55.300 on that side of the equation.
00:55:00.040 And even if their university supported that
00:55:03.540 and said, this is great,
00:55:04.560 but it just is beneath the line,
00:55:06.820 where are you going to go
00:55:07.520 if that doesn't work out?
00:55:09.400 Right?
00:55:09.600 There might be a much steeper fall
00:55:12.680 because other universities
00:55:13.920 might not like that line of inquiry.
00:55:16.380 If that's true, again,
00:55:18.260 the incentives for a person to do that
00:55:20.160 are very small.
00:55:22.320 So,
00:55:23.380 what am I saying with all this?
00:55:24.600 You don't just need to fix
00:55:25.820 one university or two,
00:55:27.200 you have to fix a lot.
00:55:29.380 There's got to be a way,
00:55:31.120 there needs to be a way of saying,
00:55:32.820 if the research design
00:55:35.220 and the methods are rigorous enough,
00:55:37.000 we don't care what the answer actually is.
00:55:39.880 If the data are high quality
00:55:41.640 and the research design is solid,
00:55:45.260 we don't care if the solution points
00:55:47.460 right or left or in the middle.
00:55:49.280 And if there were a way
00:55:50.240 to provide insurance for those people,
00:55:53.580 then we could solve the problem.
00:55:56.760 But right now,
00:55:57.620 what I am hearing
00:55:59.280 from young scholars,
00:56:03.440 from graduate students,
00:56:04.600 is that they are scared
00:56:06.360 to do these things.
00:56:08.480 Right?
00:56:09.140 I've had people literally ask me,
00:56:11.100 what do you do
00:56:12.000 when you come up with something
00:56:13.140 that's taboo?
00:56:15.400 Right?
00:56:17.080 I've got these questions,
00:56:18.460 but I'm not sure I can answer them.
00:56:21.080 Do you think I can?
00:56:23.020 I, of course, encourage them.
00:56:24.800 But I also know
00:56:25.880 that there are potential costs to this.
00:56:28.500 And so,
00:56:29.440 we've got to find a way
00:56:31.860 to provide insurance.
00:56:33.000 Isn't that crazy?
00:56:33.940 To provide insurance
00:56:34.840 for telling the truth.
00:56:36.000 Roland, yeah.
00:56:39.220 Truth insurance, man.
00:56:41.640 I mean,
00:56:42.180 I don't think a lot of people
00:56:43.480 would give you that insurance,
00:56:44.740 I'm going to be honest with you.
00:56:45.700 A lot of insurance companies...
00:56:46.780 The premiums on that, baby,
00:56:48.000 are going to be...
00:56:48.700 It's skyrocketing.
00:56:49.500 It's a new idea for Geico.
00:56:52.440 It seems to me,
00:56:53.540 when I talk to academics
00:56:54.540 or we talk to academics
00:56:55.740 and we interview them,
00:56:57.480 they broadly fall into two camps.
00:57:00.700 There's the camp that says
00:57:01.920 the academy can be saved.
00:57:04.840 There's the other camp
00:57:05.880 which says
00:57:06.540 the academy is beyond saving.
00:57:08.600 We need to build new academies.
00:57:10.600 I realize that's essentially
00:57:12.680 a binary choice
00:57:13.740 I'm presenting you with.
00:57:14.840 Are you in one camp or another
00:57:17.120 or is there a third way?
00:57:19.860 I don't think you left
00:57:20.780 any room for a third way.
00:57:23.120 I'll tell you a third way, actually,
00:57:24.640 just to add some nuance to this.
00:57:26.100 So I guess
00:57:26.400 what we were posing
00:57:27.700 is like Ralston College
00:57:29.000 in Savannah, Georgia.
00:57:29.820 They're building a new thing
00:57:30.820 from the ground up.
00:57:31.480 Yes.
00:57:32.000 University of Austin, similar.
00:57:33.960 There are other people
00:57:34.620 who work within institutions.
00:57:35.840 I was in Grand Junction
00:57:38.320 in Colorado
00:57:38.880 and they have
00:57:40.840 a relatively small college there
00:57:42.920 and they don't do either.
00:57:45.720 They're not like massive,
00:57:49.080 obsessed with DEI
00:57:50.460 or anything like that,
00:57:51.060 but they're also not
00:57:51.940 aggressively against any of it.
00:57:53.680 They just do things
00:57:54.800 their own way.
00:57:55.840 Yeah.
00:57:56.200 Maybe that's the third way.
00:57:57.680 Yeah, I consider that
00:57:58.400 the first way,
00:57:58.980 which is reforming
00:57:59.700 the institutions
00:58:00.540 we already have.
00:58:01.860 Right?
00:58:02.420 So I think there's a question
00:58:03.800 of, you know,
00:58:05.040 as if this were a house,
00:58:07.080 do we refurbish what we have
00:58:09.720 or do we start
00:58:10.640 with new construction?
00:58:12.360 And, you know,
00:58:13.320 maybe the market's
00:58:14.260 going to tell us
00:58:14.720 a little bit of both
00:58:15.380 and what the optimal mix is,
00:58:16.560 I don't know.
00:58:17.380 My inclination is
00:58:18.540 to try to refurbish
00:58:19.960 what we have,
00:58:20.700 although I'm very supportive
00:58:21.960 of what University of Austin
00:58:23.100 is doing, trying to do.
00:58:26.520 And here's a,
00:58:28.020 but here is a third way.
00:58:31.580 The people who are doing
00:58:32.900 the research
00:58:33.640 could just stop caring
00:58:35.280 about what other people think.
00:58:36.360 In every big movement,
00:58:39.520 people had to
00:58:42.920 sacrifice.
00:58:42.920 So either you believe in truth
00:58:45.420 and you're willing
00:58:46.780 to pay some personal cost
00:58:48.540 or you don't.
00:58:51.680 Right?
00:58:52.000 If we were arguing about
00:58:58.580 some social issue,
00:59:01.000 we would say,
00:59:02.280 well, yes,
00:59:02.820 we can have this policy
00:59:03.560 or that policy,
00:59:04.100 or they could just work harder.
00:59:05.180 Right?
00:59:06.660 And so in this case,
00:59:08.040 why are we letting
00:59:08.920 the scholars off the hook?
00:59:11.640 So maybe
00:59:12.640 the third choice
00:59:13.420 is you can go
00:59:15.000 into this realizing,
00:59:15.960 maybe I won't win
00:59:18.460 all the awards
00:59:19.280 that I could
00:59:21.640 if I made the results
00:59:23.120 seem more coherent
00:59:24.780 with the day's politics.
00:59:26.520 Maybe that's okay.
00:59:27.700 And maybe if there's
00:59:28.520 enough of those people
00:59:30.100 that that becomes
00:59:31.560 a new thing
00:59:32.100 to actually tell the truth.
00:59:33.420 Right?
00:59:33.620 Someone said
00:59:35.060 the other day,
00:59:36.040 I got a note saying,
00:59:37.400 I really liked
00:59:38.040 your interview on XYZ.
00:59:41.340 Authenticity is in.
00:59:42.300 And I thought,
00:59:42.640 what does that even mean?
00:59:44.360 Right?
00:59:45.420 I can tell you
00:59:46.340 exactly what it means.
00:59:47.260 We are in the authenticity moment.
00:59:48.900 This is why new media
00:59:50.140 is taking off
00:59:50.780 the way it is
00:59:51.320 because,
00:59:51.960 you know,
00:59:52.820 if you think about
00:59:53.280 what TV is,
00:59:54.020 it's a bunch of fake people
00:59:55.020 having a fake conversation
00:59:56.280 in a fake room.
00:59:57.380 This is the exact opposite.
00:59:58.800 Right?
00:59:58.980 We're just talking straight.
01:00:00.720 And I think
01:00:01.220 this is the moment
01:00:02.140 for authenticity.
01:00:03.360 The flaw in your argument...
01:00:05.000 Why can't the academics
01:00:06.000 be authentic?
01:00:06.220 Well,
01:00:06.400 this is the flaw
01:00:07.080 in your argument,
01:00:07.700 right?
01:00:07.920 Because I agree
01:00:09.400 with you 100%
01:00:10.680 about people
01:00:11.340 just need to man up
01:00:12.340 and tell the truth
01:00:13.360 and pursue the truth
01:00:14.500 and do the work
01:00:15.360 that they're actually
01:00:15.860 supposed to do.
01:00:16.760 On the other hand,
01:00:17.380 you're an economist
01:00:17.920 and you know
01:00:18.500 that human beings
01:00:19.200 respond to incentives
01:00:20.060 first and foremost.
01:00:21.420 And if it's not
01:00:21.940 a high status thing to do,
01:00:23.220 if it's not a money making
01:00:24.320 thing to do,
01:00:25.140 if it's not a blah, blah, blah.
01:00:26.220 And look at universities.
01:00:27.360 I mean,
01:00:27.500 I mentioned DEI.
01:00:28.440 They're not even hiring
01:00:31.060 or recruiting people
01:00:32.400 based on merit.
01:00:34.040 So if you have
01:00:34.820 an institution
01:00:35.300 that operates
01:00:36.180 in that way,
01:00:36.700 why would the people
01:00:37.640 in it pursue the truth
01:00:38.700 at any cost?
01:00:39.420 I mean,
01:00:39.720 that's not reflective
01:00:41.960 of how people work.
01:00:43.080 Let me push back.
01:00:44.560 This is fun.
01:00:47.280 Well,
01:00:47.700 incentives matter,
01:00:48.580 but what economists
01:00:49.700 really believe
01:00:50.200 is that people
01:00:50.520 maximize utility.
01:00:52.100 So then the question is,
01:00:53.280 why isn't it
01:00:53.920 that the academics
01:00:54.640 care enough
01:00:55.400 about pursuing the truth?
01:00:57.200 Right?
01:00:59.180 Yes,
01:00:59.620 incentives on the margin
01:01:00.620 matter here or there.
01:01:01.580 And yes,
01:01:01.920 I agree the incentives
01:01:02.720 are pointing
01:01:03.040 in the opposite direction.
01:01:05.560 But if you look
01:01:06.720 at some of the things
01:01:08.960 that people are saying,
01:01:09.720 how wild it is,
01:01:10.480 this is where we started.
01:01:11.520 Why do things change?
01:01:12.080 Who gets to change
01:01:12.820 the narrative?
01:01:13.400 Why do things swing
01:01:14.300 wildly one way
01:01:15.620 and wildly the other way?
01:01:17.700 Maybe,
01:01:18.960 you called it man up,
01:01:20.660 I'm going to call it,
01:01:21.540 maybe people ought
01:01:22.160 to have a North Star.
01:01:23.600 Sure.
01:01:23.880 Maybe we've lost
01:01:26.500 that compass,
01:01:28.840 that driving force
01:01:30.860 that said,
01:01:31.500 it doesn't matter
01:01:33.980 if I find
01:01:35.680 that this group
01:01:38.460 is to blame
01:01:39.080 for something,
01:01:40.240 I'm going to say it.
01:01:41.020 If I find another group
01:01:41.920 is to blame for that
01:01:42.720 or that's what
01:01:43.100 the empirical evidence says,
01:01:44.160 I'm going to say it.
01:01:45.620 Right?
01:01:45.840 And then,
01:01:46.240 yes,
01:01:46.540 let's get the universities
01:01:47.500 to back you up,
01:01:48.420 to provide support
01:01:50.020 for that.
01:01:50.660 Can we go this far?
01:01:52.780 Encourage that.
01:01:54.980 Right?
01:01:55.240 That's what I had
01:01:56.360 when I first got to Harvard.
01:01:58.020 I had a president
01:01:59.340 who not only said,
01:02:03.300 this is okay,
01:02:04.300 but actually encouraged
01:02:06.380 the work.
01:02:07.760 Right?
01:02:08.500 And so,
01:02:09.420 I think encouragement
01:02:11.080 within the university
01:02:12.720 could go a long way,
01:02:14.360 but I also believe
01:02:15.760 that,
01:02:16.960 you know,
01:02:19.080 maybe we should be
01:02:19.940 selecting different people
01:02:20.860 to go into the academy.
01:02:21.740 Do you think
01:02:23.360 it's...
01:02:24.360 I love it.
01:02:28.220 Is that a wrap?
01:02:29.260 I mean,
01:02:30.180 it works for me.
01:02:31.600 Francis has got
01:02:32.200 more and more questions.
01:02:33.040 One more question,
01:02:33.740 which is,
01:02:34.600 how much do you think
01:02:35.940 this is,
01:02:37.740 and I've seen it
01:02:38.560 because I used to work
01:02:39.740 in teaching,
01:02:42.720 and people being afraid
01:02:45.400 of their own students,
01:02:47.540 how big of a problem
01:02:49.060 is that?
01:02:49.480 Again,
01:02:54.560 I don't know.
01:02:58.820 And I'm not even sure
01:02:59.900 people are afraid
01:03:00.820 of their own students.
01:03:01.780 Maybe they are.
01:03:03.640 It could be that
01:03:05.140 the incentives
01:03:06.240 to do the right thing
01:03:10.040 when it comes to grading
01:03:11.500 or putting pressure
01:03:12.760 on students
01:03:15.100 to study
01:03:16.560 for this or that
01:03:17.280 are just not there.
01:03:20.420 And,
01:03:21.020 you know,
01:03:24.040 earlier in my career,
01:03:27.080 I was a hardliner
01:03:30.460 on this.
01:03:31.780 I said,
01:03:32.700 the papers
01:03:34.180 are due at five o'clock.
01:03:35.980 After five o'clock,
01:03:36.600 I don't take them.
01:03:38.080 And I didn't say
01:03:38.780 you had to turn it in at five.
01:03:39.520 You could turn it in
01:03:40.020 three weeks ago
01:03:40.500 if you wanted to,
01:03:41.220 but five o'clock
01:03:41.840 is the deadline.
01:03:43.760 I'm softer
01:03:45.660 on that now.
01:03:46.560 Not because I'm
01:03:48.060 scared of the students,
01:03:49.600 but because
01:03:50.980 it's five o'clock.
01:03:52.580 Do I want the headache
01:03:53.500 that this is going to cause?
01:03:55.100 Right?
01:03:55.360 Because the incentives
01:03:56.180 are very different.
01:03:58.400 So I think,
01:03:58.720 I think distinguishing
01:03:59.540 between those two
01:04:00.300 is really important.
01:04:01.120 You know,
01:04:04.940 I think we actually
01:04:06.940 oftentimes
01:04:08.300 can bend over backwards
01:04:10.480 so much
01:04:10.940 that we actually
01:04:11.420 do the students
01:04:12.260 a disservice.
01:04:13.840 Let me give you
01:04:14.540 a tiny,
01:04:15.140 tiny example
01:04:15.780 that means nothing.
01:04:17.800 You teach a course
01:04:18.800 and you say,
01:04:19.420 hey,
01:04:19.680 you should read
01:04:20.460 Gary Becker's dissertation
01:04:22.940 on the economics
01:04:23.620 of discrimination
01:04:24.280 published in 1957.
01:04:25.480 See you next class,
01:04:26.640 Peter.
01:04:28.240 Professor,
01:04:28.760 could you provide
01:04:29.420 a link to that?
01:04:30.580 Could you send us
01:04:31.340 around a link?
01:04:33.400 No,
01:04:33.740 just go find it.
01:04:35.800 But students are now
01:04:37.820 becoming used to
01:04:38.600 a level of customer
01:04:39.380 service
01:04:39.880 such that
01:04:43.400 everything is just
01:04:44.060 really laid out
01:04:44.960 for them
01:04:45.380 very,
01:04:45.860 very carefully,
01:04:46.380 and it's more
01:04:48.220 like problem
01:04:48.760 set solving
01:04:49.380 than deep thinking.
01:04:50.660 You have a paper,
01:04:51.480 how exactly,
01:04:52.040 how many words
01:04:52.480 should it be?
01:04:52.840 What exactly should,
01:04:53.720 it's like,
01:04:54.440 why don't you decide
01:04:55.520 those,
01:04:56.360 the answers
01:04:56.840 to those questions?
01:04:58.200 So I,
01:04:58.720 you know,
01:04:58.960 part of the reason
01:04:59.460 I don't provide
01:05:00.160 links for papers
01:05:00.940 is not because
01:05:01.480 I'm being lazy
01:05:02.140 but because
01:05:02.800 when I was a student,
01:05:03.860 when you went
01:05:04.220 to search for the paper,
01:05:05.380 you actually found
01:05:05.840 three or four
01:05:06.380 more nearby that
01:05:07.260 were actually interesting,
01:05:07.940 and I pulled
01:05:08.340 those off the shelf too.
01:05:10.000 And so I actually think
01:05:10.720 by not providing
01:05:12.240 every single thing
01:05:13.400 they need
01:05:13.820 for their
01:05:14.180 customer service
01:05:15.520 needs,
01:05:16.680 you actually provide,
01:05:17.540 sometimes can provide
01:05:18.300 a better education.
01:05:19.860 Roland,
01:05:20.120 it's been an absolute
01:05:20.760 pleasure having you on.
01:05:21.660 We're going to ask you
01:05:22.200 a bunch of questions
01:05:23.040 from our supporters
01:05:23.840 in a second
01:05:24.380 but we always wrap up
01:05:25.680 with the same question
01:05:26.460 which is,
01:05:27.200 what's the one thing
01:05:27.720 we're not talking about
01:05:28.780 as a society
01:05:29.500 that you think
01:05:30.040 we should be?
01:05:30.500 I think that
01:05:39.560 for me
01:05:42.840 that question
01:05:46.240 boils down to
01:05:47.300 we basically know
01:05:51.000 there are
01:05:52.120 three or five things
01:05:54.540 that could fundamentally
01:05:56.500 change
01:05:57.340 racial inequality
01:05:58.560 in America.
01:06:00.220 We know it,
01:06:01.080 we know what to do.
01:06:02.700 Not to solve
01:06:04.080 but fundamentally
01:06:05.560 change racial inequality
01:06:06.620 in America.
01:06:07.540 Why don't we have
01:06:08.720 the political courage
01:06:09.660 to do it?
01:06:10.160 What other things?
01:06:13.820 Education's huge,
01:06:15.160 number one.
01:06:16.240 So when I did
01:06:16.960 some work on mobility
01:06:18.620 trying to understand
01:06:19.560 for people,
01:06:20.900 you know,
01:06:21.040 you have a set of people
01:06:21.840 who were born
01:06:22.940 into poverty,
01:06:24.020 some got out,
01:06:24.660 some didn't.
01:06:25.120 What are the differences
01:06:26.840 between them?
01:06:27.580 Number one,
01:06:29.720 resoundingly education.
01:06:31.500 So we know
01:06:33.100 how to
01:06:34.560 create better schools
01:06:36.400 for kids.
01:06:36.860 We've got exemplars
01:06:37.620 all over the country
01:06:38.460 and yet politics
01:06:39.740 gets in the way
01:06:40.460 of making those reforms.
01:06:42.720 That'd be number one.
01:06:43.840 Why?
01:06:44.340 Why do we let that happen
01:06:45.620 as a country?
01:06:47.120 Right?
01:06:47.400 We've got places
01:06:48.240 in our country
01:06:48.900 like,
01:06:49.840 you know,
01:06:50.580 D.C. public schools
01:06:51.500 where
01:06:53.760 less than 10%
01:06:55.540 of black kids are
01:06:56.260 proficient
01:06:57.140 in reading and math,
01:06:57.980 and
01:06:59.860 we just go on
01:07:02.360 like nothing happened.
01:07:03.720 We're talking
01:07:04.300 about other stuff
01:07:04.840 and so for me,
01:07:08.080 and we fundamentally
01:07:08.900 know how that's
01:07:09.940 impacting their mobility
01:07:10.840 and
01:07:12.840 it is
01:07:14.220 extraordinarily
01:07:15.520 frustrating
01:07:16.280 to be
01:07:16.680 in a social
01:07:17.760 scientist's shoes
01:07:18.580 and to
01:07:19.580 see,
01:07:20.740 hey,
01:07:21.280 we actually
01:07:21.860 know
01:07:22.560 some things
01:07:23.600 that could
01:07:23.860 fundamentally,
01:07:24.800 I'm not talking
01:07:25.220 about at the margin,
01:07:26.120 fundamentally make
01:07:27.220 their lives better
01:07:28.060 and yet
01:07:29.180 the adults
01:07:30.500 can't get their
01:07:31.180 shit together
01:07:31.780 to help the
01:07:32.680 kids that they
01:07:33.340 say they care
01:07:33.840 so much about.
01:07:35.100 So as a society,
01:07:36.680 why do we
01:07:37.660 continue to let
01:07:38.780 that happen?
01:07:40.380 Roland,
01:07:40.660 what are the things
01:07:41.320 in that particular,
01:07:42.280 I want to go into
01:07:42.880 the other things,
01:07:43.600 but on education,
01:07:45.160 what are the
01:07:45.680 maybe not simple,
01:07:47.200 but the obvious
01:07:48.880 answers to
01:07:49.740 dealing with that
01:07:50.700 issue?
01:07:51.840 You know,
01:07:52.280 at a high level,
01:07:56.060 I'm going to show
01:07:56.400 you where we got
01:07:56.920 them from and
01:07:57.280 exactly what
01:07:57.840 they are.
01:07:58.580 So we took
01:07:59.620 a couple years
01:08:00.160 and went and looked.
01:08:01.120 charter schools
01:08:01.880 on average
01:08:02.620 are no better
01:08:03.020 than public
01:08:03.820 schools in America,
01:08:05.060 but there's some
01:08:06.500 that are amazing
01:08:07.320 and there's some
01:08:08.460 that are pretty
01:08:09.380 awful.
01:08:09.880 Okay.
01:08:10.340 Okay.
01:08:10.960 And so what we
01:08:11.800 did was we went
01:08:12.480 in and tried to
01:08:13.040 understand what
01:08:13.760 makes them good
01:08:15.020 and what makes
01:08:15.380 them awful,
01:08:16.060 right?
01:08:16.280 And what we
01:08:16.700 found
01:08:17.060 was that there
01:08:17.460 were five
01:08:17.940 factors that
01:08:18.620 explained 50%
01:08:19.780 of the variance
01:08:20.400 in what makes
01:08:21.060 some schools
01:08:21.540 great and other
01:08:22.020 schools not so
01:08:22.620 great.
01:08:22.980 And those five
01:08:23.800 factors are the
01:08:25.120 amount of time
01:08:25.720 you spend in
01:08:26.220 school,
01:08:26.880 right?
01:08:27.220 So that's,
01:08:27.860 that's one
01:08:28.360 factor.
01:08:28.780 I call it the
01:08:29.200 basic physics
01:08:29.820 of education.
01:08:30.400 If you're behind
01:08:31.140 and they're ahead,
01:08:32.000 you need to put
01:08:32.480 in more time
01:08:33.220 or ask them,
01:08:33.900 please stop
01:08:34.280 working so hard.
01:08:35.820 Number two,
01:08:36.560 number two
01:08:38.000 was,
01:08:40.020 you know,
01:08:41.180 how schools
01:08:42.100 use data to
01:08:42.880 drive instruction.
01:08:44.240 So,
01:08:45.040 you know,
01:08:46.240 when I got
01:08:47.460 started,
01:08:48.380 data was a
01:08:50.000 real asset
01:08:50.460 in schools.
01:08:50.960 Now it's
01:08:51.220 almost like a
01:08:51.640 liability.
01:08:52.200 There's so much
01:08:52.860 of it.
01:08:53.140 No one knows
01:08:53.580 how to process
01:08:54.040 it.
01:08:54.220 What do you
01:08:54.560 do?
01:08:55.080 The schools
01:08:55.580 that are
01:08:55.780 effective,
01:08:56.220 their students
01:08:57.620 take
01:08:58.040 assessments
01:08:59.220 throughout the
01:09:00.160 year, and
01:09:00.300 when they see
01:09:00.820 the students
01:09:01.260 aren't getting
01:09:01.940 what they're
01:09:02.260 supposed to
01:09:02.760 in classes,
01:09:03.280 they regroup,
01:09:04.820 they figure it
01:09:05.340 out,
01:09:05.580 they come up
01:09:05.940 with a different
01:09:06.320 strategy,
01:09:06.760 and they go
01:09:07.320 back to work.
01:09:08.140 So data-driven
01:09:08.760 instruction.
01:09:09.220 The third
01:09:10.420 is
01:09:12.140 small group
01:09:13.580 tutoring.
01:09:14.160 It is
01:09:14.480 high-dosage
01:09:15.300 tutoring.
01:09:15.780 If you had
01:09:16.200 to do one
01:09:16.820 thing in
01:09:17.260 education,
01:09:18.040 in my
01:09:18.380 opinion,
01:09:18.760 it'd be
01:09:18.960 high-dosage
01:09:19.540 tutoring.
01:09:19.900 You can
01:09:20.100 get incredible
01:09:20.940 gains when
01:09:21.620 you've got a
01:09:22.000 kid around
01:09:22.380 a half-moon
01:09:23.120 table,
01:09:24.240 two kids,
01:09:24.880 one tutor,
01:09:25.900 and with
01:09:26.460 very direct
01:09:27.300 instruction.
01:09:29.240 The fourth
01:09:30.140 is,
01:09:30.840 you know,
01:09:31.340 human capital.
01:09:33.000 Who are
01:09:33.560 the teachers
01:09:34.100 teaching them
01:09:35.400 and how often
01:09:36.160 you give them
01:09:36.660 feedback,
01:09:37.560 right?
01:09:37.740 So the
01:09:38.040 bad schools
01:09:38.680 give you
01:09:39.280 feedback once
01:09:39.900 a year.
01:09:40.360 They come
01:09:40.720 in in the
01:09:41.480 spring,
01:09:42.040 you don't
01:09:42.540 teach very
01:09:42.960 well,
01:09:43.760 most of the
01:09:44.220 kids fail,
01:09:44.740 and they
01:09:44.840 say,
01:09:45.080 see you
01:09:45.380 again in
01:09:45.620 the fall.
01:09:46.980 Schools
01:09:47.420 that are
01:09:47.660 very effective,
01:09:48.620 very often give
01:09:49.980 small bits
01:09:50.460 of feedback
01:09:50.940 to ensure
01:09:51.700 that teachers
01:09:53.280 have the
01:09:53.840 support they
01:09:54.360 need to
01:09:54.780 get better.
01:09:55.820 And the
01:09:56.060 last one
01:09:56.620 is very,
01:09:57.340 very important
01:09:57.780 to me.
01:09:58.920 All of them
01:09:59.520 are important,
01:09:59.920 but this one
01:10:00.300 speaks to me
01:10:00.940 personally,
01:10:01.920 is a culture of
01:10:03.760 high expectations.
01:10:05.240 Probably one
01:10:05.940 of the most
01:10:07.340 eloquent
01:10:08.960 political
01:10:09.580 statements
01:10:09.920 I've heard
01:10:10.240 in my
01:10:10.480 lifetime
01:10:10.880 was the
01:10:11.660 soft bigotry
01:10:12.260 of low
01:10:12.540 expectations.
01:10:13.960 And I
01:10:14.860 believe,
01:10:15.700 and the data
01:10:16.180 support it,
01:10:16.840 that kids
01:10:17.260 live up or
01:10:17.960 down to
01:10:18.320 your expectations.
01:10:19.500 Like the
01:10:19.840 schools that
01:10:20.280 are effective
01:10:20.800 understand that
01:10:21.600 if you go
01:10:21.900 into an
01:10:22.200 inner city
01:10:22.560 you're dealing
01:10:23.040 with very
01:10:25.500 high rates
01:10:26.140 of single
01:10:26.720 female-headed
01:10:27.220 households,
01:10:28.540 poverty,
01:10:29.560 et cetera.
01:10:30.240 They don't
01:10:30.840 use that as
01:10:31.480 an excuse
01:10:32.040 not to educate
01:10:32.860 them.
01:10:33.280 They use that
01:10:33.960 as an excuse
01:10:35.120 to be more
01:10:35.660 efficient inside
01:10:36.640 the school
01:10:37.220 day to
01:10:37.900 say,
01:10:38.340 hey,
01:10:38.640 we only
01:10:39.120 have seven
01:10:39.660 hours with
01:10:40.160 this kid.
01:10:40.700 We've got to
01:10:41.120 make up for
01:10:41.700 single, female-headed
01:10:42.620 households,
01:10:43.220 poverty,
01:10:43.960 et cetera,
01:10:44.200 so we need
01:10:44.700 to get to
01:10:45.080 work.
01:10:45.360 We need
01:10:45.580 to use
01:10:45.840 data smartly.
01:10:46.560 I need
01:10:46.740 to be able
01:10:47.000 to give
01:10:47.200 you feedback
01:10:48.180 on your
01:10:48.480 teaching,
01:10:49.200 et cetera.
01:10:49.900 So those
01:10:50.680 five factors,
01:10:51.360 we took
01:10:51.720 those,
01:10:52.140 we put
01:10:52.440 them in
01:10:52.780 a randomized
01:10:53.220 experiment in
01:10:53.960 Houston,
01:10:54.260 Texas,
01:10:54.600 in regular
01:10:55.000 old public
01:10:55.460 schools,
01:10:56.140 and we
01:10:56.400 showed that
01:10:57.780 in three
01:10:59.500 years you
01:11:00.040 could close
01:11:00.460 the racial
01:11:00.820 achievement gap
01:11:01.460 in math,
01:11:02.260 and five
01:11:03.220 years you
01:11:03.600 could do it
01:11:03.940 in reading.
01:11:04.460 So you
01:11:04.780 can do
01:11:05.460 this stuff.
01:11:06.160 And this
01:11:06.600 is just
01:11:06.860 little old
01:11:07.540 nerdy me.
01:11:08.160 This is not
01:11:08.760 these phenomenal
01:11:11.040 superstars like
01:11:11.940 Geoff Canada
01:11:12.500 who runs
01:11:13.900 the Harlem
01:11:14.180 Children's
01:11:14.500 Zone here
01:11:14.760 in New York.
01:11:15.240 Of course
01:11:15.560 he's amazing,
01:11:16.180 but he's
01:11:16.460 done the
01:11:16.780 same thing.
01:11:17.620 So you
01:11:17.980 can do
01:11:18.380 this at
01:11:19.000 scale.
01:11:19.540 We've
01:11:19.680 done it
01:11:19.920 in Colorado,
01:11:20.840 we've
01:11:21.000 done it
01:11:21.240 in Houston,
01:11:21.840 we've
01:11:21.980 done it
01:11:22.160 in other
01:11:22.400 places.
01:11:23.660 And so
01:11:24.040 it's very
01:11:24.640 frustrating
01:11:25.540 that even
01:11:26.520 just in
01:11:26.880 the simple
01:11:27.300 slice of
01:11:27.760 education,
01:11:28.900 we don't
01:11:31.560 see policy
01:11:32.420 going towards
01:11:33.300 the things
01:11:33.800 that we
01:11:34.180 know
01:11:34.520 actually
01:11:35.240 can be
01:11:36.280 effective.
01:11:36.940 And so
01:11:37.320 the answer
01:11:38.580 to your
01:11:38.840 question for
01:11:39.360 me is
01:11:39.840 how can
01:11:41.220 we as
01:11:41.640 a society
01:11:42.320 both sit
01:11:43.160 around and
01:11:43.940 lament the
01:11:44.720 lack of
01:11:45.120 progress and
01:11:46.260 also not
01:11:47.540 be willing to
01:11:48.640 do whatever
01:11:49.120 it takes to
01:11:50.580 solve these
01:11:51.320 things in our
01:11:51.860 generation?
01:11:52.860 Okay.
01:11:53.280 I want to
01:11:53.900 dig more
01:11:54.300 though,
01:11:54.540 even though
01:11:54.840 it's the
01:11:55.080 last question.
01:11:55.580 You said
01:11:55.820 education,
01:11:56.500 there's
01:11:56.700 others though,
01:11:57.320 right?
01:11:57.820 Sure.
01:11:58.580 We need to
01:11:59.400 really figure
01:11:59.920 out how do
01:12:00.900 we increase
01:12:02.740 the non-cognitive
01:12:04.320 skills like
01:12:05.960 resilience and
01:12:08.000 grit, those
01:12:09.820 things.
01:12:10.300 Really, those
01:12:10.880 were the, in my
01:12:11.480 mobility thing,
01:12:12.140 those were the
01:12:12.600 next ones.
01:12:14.040 We also have a
01:12:15.560 sense of, you
01:12:16.900 know, how to do
01:12:18.400 health care in a
01:12:19.280 more efficient and
01:12:20.180 better way.
01:12:21.040 I mean, I'll come
01:12:22.160 back on the show,
01:12:22.820 health care is a
01:12:23.340 whole different
01:12:23.760 topic, but you
01:12:24.600 get the point.
01:12:25.860 Health care,
01:12:27.020 criminal justice,
01:12:27.740 education, and
01:12:30.720 non-cognitive
01:12:31.760 skills, which is
01:12:32.420 included in
01:12:32.840 education.
01:12:33.800 We have, the
01:12:34.980 social science is
01:12:36.020 not perfect by
01:12:36.780 any means, but
01:12:38.620 we know enough
01:12:39.860 to make
01:12:40.360 fundamental
01:12:41.120 progress and
01:12:42.060 we just can't
01:12:42.800 get past
01:12:43.220 ourselves.
01:12:43.740 And that is
01:12:45.400 extremely, extremely
01:12:46.880 frustrating.
01:12:47.760 Well, Roland, we
01:12:48.460 would be delighted
01:12:49.440 to have you back
01:12:50.100 on to talk about
01:12:50.860 some of those
01:12:51.280 things next time.
01:12:51.980 But for now,
01:12:52.440 follow us over
01:12:53.180 to Locals, where
01:12:53.940 we continue the
01:12:54.580 conversation with
01:12:55.300 your questions.
01:12:57.740 Do you think
01:12:58.360 the narrative of
01:12:59.020 systemic racism
01:12:59.920 towards African
01:13:00.740 Americans by
01:13:01.460 governments and
01:13:02.000 law enforcement
01:13:02.620 can ever be
01:13:03.780 changed?
01:13:04.380 Or has the media
01:13:05.220 now perpetuated
01:13:06.120 this narrative to
01:13:07.440 the point it
01:13:08.040 almost can't be
01:13:08.780 challenged in the
01:13:09.600 mainstream?