#123 — Identity & Honesty
Episode Stats
Words per Minute
186.1
Summary
Sam Harris speaks with Vox's Ezra Klein about the controversy that began a year earlier, when Harris had Charles Murray, co-author of The Bell Curve, on his podcast and Vox published a highly critical response. They walk through the timeline of events: the original episode, the Vox article by Eric Turkheimer, Kathryn Paige Harden, and Richard Nisbett, the contentious email exchange that Harris later made public, and the David Reich op-ed that reignited the dispute. They then debate the underlying disagreements: the science of IQ and group differences, Murray's social-policy agenda, the history of racism in America, identity politics, and what it means to argue in good faith. We don't run ads on the podcast, and therefore it's made possible entirely through the support of our listeners. So if you enjoy what we're doing here, please consider becoming one.
Transcript
00:00:10.880
Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680
feed and will only be hearing the first part of this conversation.
00:00:18.420
In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:24.060
There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:30.260
We don't run ads on the podcast, and therefore it's made possible entirely through the support of our listeners.
00:00:35.900
So if you enjoy what we're doing here, please consider becoming one.
00:00:50.900
I think I'm going to resist the temptation to add any editorial comments here.
00:00:55.640
My previous episode, the Extreme Housekeeping Edition, had my full reaction to all of the
00:01:04.860
At the beginning here, I go through a timeline of events with Ezra.
00:01:12.680
I think the conversation speaks for itself, and if you listen to the whole thing, you will
00:01:17.300
definitely know what I think about it by the end.
00:01:20.680
I think it probably does have educational value.
00:01:27.440
As to what lessons can be drawn from it, I will let you decide.
00:01:32.000
All I can say is that I actually did my best here.
00:01:37.960
This was a sincere effort to communicate, and you can judge the effect.
00:01:53.620
Okay, so for better or worse, we're finally doing a podcast together.
00:01:58.120
So here's what I would suggest, and I wanted to see if this was amenable to you.
00:02:05.140
I thought it would make sense for me to just give a couple minutes, you know, short kind
00:02:10.560
of like opening thing at the beginning, try to sort of frame where I am on this.
00:02:13.520
I think I maybe have a way to frame it a little productively.
00:02:16.760
And then I'm happy to, in return for that, give you the last word on the podcast, if that
00:02:22.920
Actually, I had a couple of ideas, and so let me just put those out there.
00:02:25.980
And so first, I think we should make the ground rules explicit so that our listeners understand
00:02:32.560
So my understanding is that we'll both present the audio unedited, and it's fine to cut mouth
00:02:38.940
noises and other glitches, but, you know, if we take a bathroom break, we'll cut that out.
00:02:44.060
And we can have sidebar conversations that we both agree are off the record, but basically
00:02:47.960
no meaningful words will be cut from the exchange.
00:02:52.780
And I had thought, I'm happy to do this after you start, or it makes some sense, I think,
00:03:00.360
to do it before you add what you just suggested.
00:03:03.220
I thought I should summarize how we got here and just kind of go through the chronology
00:03:07.620
so that people who are just hearing this for the first time understand about the email
00:03:15.360
I assume, I mean, look, we can do this in different ways, but my assumption with you, you tend to have,
00:03:20.720
as I understand it, you know, intros where you do stuff like that.
00:03:27.360
Yeah, but I think it would be good to avoid the perception that our account of how we got here is
00:03:35.720
I think maybe I should give an account, to which you then can say, okay, yeah, that's the sequence of events.
00:03:43.800
Here's my only hesitation on this, and I don't have a huge problem with it.
00:03:48.720
But I would worry about us ending up burning a lot of our time going back and forth on
00:03:57.000
So if we just want to do a very neutral account of it, that's fine with me, but I wouldn't
00:04:00.120
want to end up with like a long chronology argument.
00:04:04.640
So I'll do that, and then you'll jump in at the end of that and give me your current take.
00:04:10.180
And obviously, I'll be describing this account from this chronology from my point of view,
00:04:15.080
but I'll flag the places where I think we have a different interpretation of what happened.
00:04:19.340
But I think the sequence of events is totally objective here.
00:04:27.060
Almost exactly a year ago, I had Charles Murray on my podcast.
00:04:30.700
And Murray, as many of our listeners will know, is the author of the notorious book,
00:04:34.880
The Bell Curve, and it has a chapter on race and IQ and differences between racial measures
00:04:45.220
So Murray's a person who still gets protested on college campuses more than 20 years later.
00:04:50.840
And while I have very little interest in IQ and actually zero interest in racial differences
00:04:57.560
in IQ, I invited Murray on my podcast because he had recently been deplatformed at Middlebury
00:05:03.160
College, and he and his host were actually assaulted as they left the auditorium.
00:05:08.860
And in my view, this seemed yet another instance of a kind of moral panic that we were seeing
00:05:15.860
And it caused me to take an interest in Murray that I hadn't previously had.
00:05:20.100
So I had never read The Bell Curve, because I thought it must be just racist trash, because
00:05:26.700
I assumed that where there was, you know, all that smoke, there must be fire.
00:05:32.340
And so when I did read the book and did some more research on him, I came to think that he
00:05:38.980
was probably the most unfairly maligned person in my lifetime.
00:05:44.500
And that doesn't really run the risk of being much of an exaggeration.
00:05:47.860
And the most controversial passages in the book struck me as utterly mainstream with respect
00:05:58.100
They were mainstream at the time he wrote them, and they're even more mainstream today.
00:06:02.760
So I perceived a real problem here of free speech and a man's shunning.
00:06:12.920
I felt culpable because I had participated in that shunning somewhat.
00:06:21.100
I had declined at least one occasion where I could have joined a project that he was associated
00:06:26.820
And I declined because he was associated with it, because I perceived him to be radioactive.
00:06:32.760
So I felt a moral obligation to have him on my podcast.
00:06:37.560
And in the process of defending him against the charge of racism, and in order to show that
00:06:44.580
he had been mistreated for decades, we had to talk about the science of IQ and the way
00:06:50.700
genes and environment almost certainly contribute to it.
00:06:53.320
And again, IQ is not one of my concerns, and racial differences in IQ is absolutely not one of my concerns.
00:07:03.420
But a person having his reputation destroyed for honestly discussing data, that deeply concerns me.
00:07:09.960
So I did that podcast, again, exactly a year ago, and Vox then published an article that was highly critical of it.
00:07:20.380
And it was written by Eric Turkheimer and Kathryn Paige Harden and Richard Nisbett.
00:07:24.380
And this article, in my view, got more or less everything wrong.
00:07:30.160
OK, it read to me like a piece of political propaganda.
00:07:36.280
So hearing this, I'm totally happy to have you do this on yours.
00:07:42.880
I think this is a long kind of, and I totally get it, like, from-your-perspective thing on it.
00:07:48.140
But just imagine what it will be like for people coming to this podcast not knowing why we're
00:07:54.020
And I just think that if we want to do it that way, let's just do a shorter version of this.
00:07:58.320
You know, just like, you know, like I would suggest something more, you know, not and expand
00:08:03.660
But like, look, you had Murray on your podcast a year ago.
00:08:06.220
You wanted, you know, he had been deplatformed at Middlebury.
00:08:10.580
We published an article that was highly critical of you.
00:08:13.680
You know, I guess you can call it propaganda if you want.
00:08:16.220
But obviously, the more you lean on this, the more this is going to become what we talk about.
00:08:23.400
So it's like we've had a back and forth, published emails.
00:08:25.780
Like, I'm totally happy to have you summarize it.
00:08:27.840
But I don't want to spend, like, I don't want to feel like I'm sitting here for 10 minutes.
00:08:35.680
I think in my mind, I'm setting you up to say what you said you wanted to say,
00:08:40.240
which is what your current take is on the situation.
00:08:48.860
It accused us of peddling junk science and pseudoscience and pseudoscientific racialist
00:08:55.100
speculation and trafficking in dangerous ideas.
00:08:59.900
But at minimum, I'm painted as a total ignoramus.
00:09:03.300
There was one line which said, you know, while I have a Ph.D.
00:09:06.020
in neuroscience, I appear to be totally ignorant of facts that are well known to everyone in the field.
00:09:11.900
And I think that you should quote the line if you want to quote a line.
00:09:19.580
Sam Harris appeared to be ignorant of facts that were well known to everyone in the field
00:09:24.220
Now, that's since been quietly removed from the article, but it was there and it's archived.
00:09:31.120
And I sent you an email where I was pretty pissed because, again, I felt I was treated unfairly.
00:09:41.960
And I was especially pissed that you declined to publish an article that came to us unbidden,
00:09:50.340
It was unbidden by me or Murray, from Richard Haier, who's the editor-in-chief of the journal
00:09:54.620
Intelligence and a far more mainstream voice on this issue than Nisbett or Turkheimer or Harden.
00:10:01.940
And he came to our defense and he, you know, that would have done a lot to correct the record.
00:10:11.440
And I got increasingly exasperated over just how I perceived you in the email exchange.
00:10:18.980
And there was some talk of us doing this podcast together, but then I pulled the plug on that
00:10:23.000
because I felt it would be totally unproductive.
00:10:25.020
And at the end of the email exchange, I said, if you continue to slander me, I will publish
00:10:31.260
this email exchange because I felt that people should understand the actual backstory here
00:10:36.120
and how this happened and why I'm not doing a podcast with you.
00:10:39.800
And you did actually publish one more article from Turkheimer that took a shot at us.
00:10:44.500
But basically, we went radio silent for a year after that, as far as I know.
00:10:49.480
And then what happened is there was an article published in The New York Times by David Reich,
00:10:55.160
a geneticist at Harvard, which made some of the same noises that Murray and I had made.
00:10:59.980
And Murray retweeted it saying, wow, this sounds familiar.
00:11:02.780
And then I retweeted it taking a snide dig at you saying something like, well, I hope Ezra
00:11:11.260
And then you responded writing yet another article about me and Murray.
00:11:16.940
And I felt this article was just as unfair as anything that had preceded it.
00:11:22.280
In particular, I felt that you had summarized our email exchange in a way that was self-serving
00:11:33.520
And I will be the first to admit, and I think you will agree with this, that that backfired
00:11:38.100
on me. The public perception of my publishing those emails was that it was not a good look
00:11:44.440
And most people who came to those emails cold thought I was inexplicably angry and that you
00:11:53.160
And it just, you know, people had to do a lot of work to understand why I was pissed.
00:11:59.660
I'm not saying that everyone who did the work, who listened to the podcast and read all the
00:12:03.560
articles would take my side of it, but anyone who didn't do the work thought that I was somehow
00:12:10.620
In particular, the fact that I was declining to do a podcast with you was held very much against me.
00:12:16.200
And that caused me to change my mind about this whole thing because I realized, okay,
00:12:20.120
this is not, I can't be perceived as someone who won't take on legitimate criticism of his
00:12:26.060
And so I went out on social media just to see if in fact people really wanted us to attempt
00:12:30.900
this, and after 40 or 50,000 people got back to me, I think it was 76% said yes.
00:12:37.280
I decided that I was up for a podcast with you and you had already said you were up for it.
00:12:44.900
And again, much of that's described from my point of view, but I think the timeline is
00:12:51.040
This is not my ideal, but I'm actually, I'd prefer we get into it.
00:12:53.340
The only thing I would say here that you should just change a little bit in there so I don't
00:12:57.100
do it on your behalf is that you didn't email me.
00:13:00.940
What happened is that this piece was published, I tweeted it out.
00:13:05.320
You tweeted a public challenge to me to come on your show.
00:13:09.280
Your producer emailed me to come on your show.
00:13:12.480
I emailed your producer and said, hey, like, can you connect me to Sam?
00:13:30.860
You know, why is your criticism of me and Murray valid?
00:13:38.580
Obviously, and I'm sure we'll get into this stuff.
00:13:41.400
I have disagreements with which articles are fair and which aren't.
00:13:45.880
But I don't think that that is where I want to begin this.
00:13:49.900
I want to try to frame what I want to do here today, because I think people can go through,
00:13:55.140
they can read the original Vox articles, which will all be linked in my show notes.
00:14:04.480
If you would like to be a Sam Harris and Ezra Klein completist, the option is very much there.
00:14:10.420
So I listened to your housekeeping episode the other day.
00:14:14.100
So I think I have some sense, Sam, of where you are coming into this.
00:14:17.440
And I want to give you a sense of where I am in the hopes that it'll be productive.
00:14:23.560
So something you've said over and over and over again to me at this point is that to you
00:14:30.080
from the beginning, I've been here in bad faith.
00:14:32.460
The problem is that I've come to this, coming to slander you, to destroy your reputation,
00:14:38.680
And I really, I take that as a signal failure on my part.
00:14:42.580
I have not been able to persuade you, and maybe I will be today, that I really disagree
00:14:49.460
I think some of the things you're trafficking in are not just wrong, but they're harmful.
00:14:58.140
In your podcast with Murray, the way I see what's going on here from my perspective, and
00:15:04.840
one of the tricky things here is that I was not that involved in the original Vox article.
00:15:09.020
I was editor-in-chief at the time, but I didn't assign or edit that.
00:15:12.420
Things you publish when you're editor-in-chief ultimately are on you, and I actually think
00:15:16.780
But there are times when I can only speak from my perspective, not from the perspective of
00:15:23.400
But the way I read the conversation you had with Murray, and I think you gesture at this
00:15:27.940
in your opening here, you begin that conversation by really framing it around your shared experience
00:15:37.260
You say, and I'm quoting you here, in the intervening years, so the intervening years
00:15:41.480
since Murray published The Bell Curve, that you ventured into, I ventured into my own controversial
00:15:48.560
I experienced many hysterical attacks against me in my work.
00:15:51.640
I started thinking about your case, your case being Murray's case, a little, again, without
00:15:56.240
ever having read you, and I began to suspect that you were one of the canaries in the coal mine.
00:16:03.020
So you say explicitly in the opening to that podcast that in the treatment of Murray, you
00:16:10.660
And I've spent a lot of time thinking about this because something that I've been trying
00:16:18.920
I think you have, you clearly have, a deep empathy for Charles Murray's side of this conversation
00:16:26.560
I don't think you have as deep an empathy for the other side of this conversation, for
00:16:30.980
the people being told, once again, that they are genetically and environmentally and at any
00:16:36.220
rate, immutably less intelligent, and that our social policy should reflect that.
00:16:41.320
And I think part of the absence of that empathy is it doesn't threaten you.
00:16:45.100
I don't think you see a threat to you in that in the way you see a threat to you in what's
00:16:49.960
In some cases, I'm not even quite sure you heard what Murray was saying on social policy,
00:16:54.300
either in The Bell Curve and a lot of his later work or on the podcast.
00:17:03.460
I think you have a big platform and a big audience.
00:17:06.220
And I think it's bad for the world if Murray's take on this gets recast here as political
00:17:14.760
So what I want to do here, it's not really to convince you that I'm right.
00:17:23.100
What I want to convince you of is that there is a side of this you should become more curious about.
00:17:28.380
You should be doing shows with people like Ibram Kendi, who's the author of Stamped from the
00:17:32.180
Beginning, which is a book on racist ideas in America, which won the National Book Award.
00:17:37.560
People who really study how race and these ideas interact with American life and policy.
00:17:42.740
I think the fact that we are two white guys talking about how growing up non-white in America
00:17:46.760
affects your life and cognitive development is a problem here, just as it was a problem
00:17:53.640
And I want to persuade you that some of the things that the so-called social justice warriors
00:17:57.700
are worried about are worth worrying about, and that the excesses of activists, while
00:18:02.340
very real and problematic, they're not as big a deal as the things they're really trying
00:18:10.600
So maybe I'll take a breath there and let you in.
00:18:18.940
I guess the first thing I want to say is that there are two things I regret here, both in
00:18:27.320
And so I should just put those out first, I think.
00:18:30.680
The first is that I was, as you said, very quick to attribute malice and bad faith to you
00:18:38.500
And it's quite possible I did this when it wasn't warranted.
00:18:43.240
The reality is, the background here, which you alluded to, is that I am so battle-scarred
00:18:48.420
And I've dealt with so many people who are willing to consciously lie about my views and
00:18:57.160
I've got people who edit the contents of my podcast to make it sound like I've said the
00:19:04.320
And then people like Glenn Greenwald and Reza Aslan forward these videos consciously knowing
00:19:11.640
There's been so much pushback about this, there's been so much correction, that at this
00:19:15.620
point, the possibility that it's not conscious, the chance of that is zero, right?
00:19:19.940
So I'm dealing with people on a daily basis who are just happy to smear me dishonestly
00:19:30.360
And in fact, when I published our emails, the tipping point for me was to see that Glenn
00:19:34.780
Greenwald, Reza Aslan, and you in a single hour on Twitter had all hit me with stuff that
00:19:46.280
And if I treated you unfairly, attributing bad faith when you were just led by sincere
00:19:52.340
conviction that I had made an error or that you were arguing for something that was so
00:19:57.460
important and that I wasn't seeing it, that's, you know, that is on me.
00:20:01.920
Now, that said, I think your argument, even where it pretends to be factual, wherever
00:20:10.760
you think it is factual, is highly biased by political considerations.
00:20:16.780
And these are political considerations that I share.
00:20:19.760
The fact that you think I don't have empathy for people who suffer just the starkest inequalities
00:20:28.300
of wealth and politics and luck, it's telling and it's untrue.
00:20:35.660
And the fact that you're conflating the social policies he endorses, like the fact that he's
00:20:40.620
against affirmative action and he's for universal basic income.
00:20:45.020
And I know you don't happen to agree with those policies.
00:20:48.640
There's a good faith argument to be had on both sides of that conversation.
00:20:52.600
That conversation is quite distinct from the science.
00:20:55.220
And even that conversation about social policy can be had without any allegation that a person
00:21:03.080
is racist or that a person lacks empathy for people who are at the bottom of society.
00:21:11.420
And the other thing that I regret, which I think is, this is the thing you're taking me
00:21:19.280
But I do regret that in the preface to my podcast with Murray, I didn't add some full
00:21:29.180
And the reason why I didn't, or certainly at least one reason why I didn't, is that I had
00:21:34.620
maybe two months before that done a podcast with Glenn Loury, the economist at Brown, who
00:21:43.760
He's got his own podcast, The Glenn Show, which everyone should watch.
00:21:47.240
But so Glenn was on my podcast, and we were talking about race and violence in America.
00:21:51.840
And I prefaced the conversation with a fairly long statement about the reality of white privilege
00:22:00.600
And when I got to the end of it, Glenn pretty much chastised me for thinking that it was
00:22:05.660
necessary for me to say something like that just because I'm white, right?
00:22:09.220
The fact that any conversation about race and violence, especially coming from a white
00:22:13.900
guy like me, has to be bracketed with some elaborate virtue signaling on that point.
00:22:19.500
So he basically said, I mean, these aren't his words, but this was his attitude, basically
00:22:24.280
said, you know, obviously, since you're not a racist asshole, it can go without saying that
00:22:29.860
you understand that slavery was bad and that Jim Crow was bad and that you totally support
00:22:35.960
And so his take on my saying that, it was not a total surprise given who Glenn is, but
00:22:42.540
the fact that he viewed it as fairly pathetic, that I felt the need to do that, and that it
00:22:48.140
couldn't just go without saying, I remembered that.
00:22:51.180
And I mean, obviously, your point is well taken.
00:22:53.980
I mean, two white guys talking about differences in IQ across races or across populations.
00:23:00.360
I mean, if ever there's a time to signal that you understand that racism is still a problem
00:23:06.920
And while we did say some things that I think should still have been fully exculpatory, I
00:23:12.900
mean, for anyone paying attention, I think it should be obvious with a modicum of charity
00:23:18.260
extended to us that Murray and I are not racist and that what we were saying was not coming
00:23:25.640
But I mean, that is the backstory for why I didn't have some kind of elaborate framing
00:23:33.320
So I want to, I want to be, this is good because I think this gets much closer to the meat of
00:23:38.920
And something I want to be clear about is what I think was wrong in that podcast is not that
00:23:46.040
It's not that you didn't come out and say, hey, listen, just before I start this up, I
00:23:51.680
And by the way, I'm not here to say you're a racist.
00:23:56.500
I actually think that's a different set of things.
00:23:59.520
I think this would actually be a good conversation for us to have about literally just what racism is.
00:24:07.240
But my criticism of your podcast and by the way, my criticism also of Murray, and this
00:24:12.720
is useful because I can work backwards through your answer here, is not that you didn't excuse
00:24:18.560
It's that in a conversation about an outcome of American life, right?
00:24:25.280
How do African-Americans and whites score on IQ tests in America today?
00:24:30.120
What happens when somebody sits down and takes a test today?
00:24:32.820
That is an outcome of the American experiment, an experiment we've been running in this country
00:24:37.840
You did not discuss actually how race and racism act upon that outcome.
00:24:47.080
I mean, amazingly to me, you all didn't talk about slavery or segregation once.
00:24:52.300
And what I'm saying here is not that you lack empathy, although I am saying in a different
00:24:58.000
space, I do think you have a sense of what Murray's going through that
00:25:04.700
is different from your sense of what other people who are hurt in this conversation go through.
00:25:11.640
But as it comes to the way you actually conducted the conversation, I'm arguing that you lacked
00:25:16.400
a sense of history, that you didn't deal in a serious way with the history of this
00:25:21.760
conversation, a conversation that has been going on literally since the dawn of the country,
00:25:26.120
a conversation that has been wrong in virtually every version in every iteration we've had
00:25:32.440
The other thing I want to say about this, and this gets very importantly to Charles Murray's
00:25:38.940
And so I get that you look at Murray and you look at The Bell Curve and what you see are
00:25:44.520
the tables and the appendices and the kind of scientific version of Charles Murray.
00:25:55.560
Charles Murray, not just to me, what he literally is, is what we call a policy entrepreneur.
00:26:00.940
He's somebody who his entire career has been spent at Washington think tanks.
00:26:04.980
He's at the American Enterprise Institute, where I have a lot of friends and I respect that
00:26:09.940
And he argues in different ways and throughout his, again, his entire body of work for policy
00:26:17.620
His book before The Bell Curve is called Losing Ground.
00:26:21.240
It's a book about why we should dissolve the Great Society programs.
00:26:24.420
By the way, when he was selling that book, he said, a lot of whites think they're racist,
00:26:28.300
and this is a book that tells them they aren't.
00:26:31.780
And we'll go through this, and I'll quote this back to you.
00:26:33.960
But The Bell Curve's final chapter, he says, why did I do any of this?
00:26:39.240
Why did he and Richard Herrnstein, obviously the co-author of that book, do this?
00:26:42.620
And he says, the reason I did it is because we in America need to re-embrace a politics
00:26:49.320
We need to understand that we are cognitively different from each other, not just by race,
00:26:58.480
And that understanding that changes what we should do in social policy.
00:27:02.800
He literally says, and again, I can quote this to you if you'd like.
00:27:05.620
He says, for one thing, we have all these low cognitive capacity women giving birth.
00:27:11.800
And by having the social supports for poor children in this country, we are subsidizing
00:27:17.480
And what we need to do is take those subsidies away.
00:27:20.800
So these women who, according to his book, are disproportionately African-American, their
00:27:25.200
poor children do not get as much federal support when they are born.
00:27:29.860
And so they are disincentivized to have as many children.
00:27:32.160
He also says that we have all these folks who are Hispanic coming up over the border,
00:27:36.060
that our immigration policy is letting in too many low IQ people.
00:27:39.560
And while he's not quite as prescriptive in that part, he's pretty clear that he wants
00:27:43.120
us to change our immigration policy in order to resist dysgenic pressure.
00:27:51.920
And this is why I bring this up: the reason I think Charles Murray's work
00:27:55.600
is problematic is that he uses these arguments about IQ and a lot of other arguments he makes
00:28:00.240
about other things to push these points into the public debate, where he is very, very influential.
00:28:06.520
He's not by any means a silenced actor in Washington.
00:28:11.160
He won the Bradley Prize in 2016 and got a $250,000 check for it.
00:28:15.800
His book on UBI, it is completely of a piece with this.
00:28:25.300
According to Murray's own numbers, he says it would cut social spending by a trillion dollars.
00:28:31.720
To give you a sense of scale, Obamacare costs $2 trillion over 10 years.
00:28:36.720
So this is another book in a different way that is a huge argument for cutting social spending,
00:28:42.080
which in part he justifies by saying we are trying to redress racial inequality based on
00:28:46.480
an idea that is a product of American history when in fact it is some combination of innate
00:28:51.960
and environmental, but at any rate, it is not something we're going to be able to change.
00:28:56.540
And so we should stop trying or at least stop trying in the way we have been.
00:29:00.720
Okay, Ezra, again, you can't conflate his views on social policy with an honest discussion of the data.
00:29:11.160
You can agree about the data or disagree in a good faith way about the data and have a
00:29:16.880
separate conversation about what to do in response to the data and then disagree in
00:29:21.720
Now, I'm not defending Murray's view of what the social policy should be.
00:29:29.740
I think there can be a good faith debate about many of these topics.
00:29:36.160
And I totally share your concern about racism and inequality.
00:29:40.420
And again, I have no interest in using science to establish differences between races.
00:29:45.540
But the problem is, and I have publicly criticized people who do have an interest in using science
00:29:51.660
And one of my critical questions of Murray was, why pay attention to any of this stuff?
00:29:56.480
And I've said publicly that I didn't think his answer was great on that.
00:30:00.720
And I'm not interested in paying attention to this stuff.
00:30:03.460
And yet I have to in order to have conversations like this.
00:30:07.380
But the problem is that the data on population differences will continue to emerge whether we're looking for them or not.
00:30:15.380
And the idea that one should lie about these data or pretend to be convinced by bad arguments
00:30:22.960
that are politically correct or worse, that it's OK to malign people or even destroy their
00:30:28.660
reputations if they won't pretend to be convinced by bad arguments.
00:30:35.220
Morally and politically and intellectually, that is a disaster.
00:30:41.460
That's my criticism of what you have done at Vox and what Turkheimer and Nisbett and Harden have done.
00:30:48.180
And the truth is, for whatever reason, OK, however noble it is in your head, you've been
00:30:55.940
extraordinarily unfair to me and Murray, especially to Murray.
00:31:01.280
I just want to give you a couple of examples here.
00:31:03.020
I think we have to go into this issue of, you know, you just claimed you didn't call us
00:31:13.140
Which you know most people will read as racist.
00:31:15.240
But even if, even if that is an adequate way to split the difference, everything else
00:31:20.200
you said imputed, if not an utter racial bias and a commitment to some kind of white
00:31:28.760
superiority, you say again and again that here's a quote from your article is actually the
00:31:36.460
And when I, you know, I called the podcast with Murray Forbidden Knowledge.
00:31:41.600
It's America's most ancient justification for bigotry and racial inequality, right?
00:31:47.000
We're shilling for bigotry and racial inequality.
00:31:52.240
Again, this is a quote: being engaged in a decades-long focus on the intellectual inferiority of African-Americans.
00:32:04.240
I mean, Murray has not been focused on African-Americans.
00:32:07.660
He's been waging a decades-long battle to survive being scapegoated by people who insinuate that
00:32:15.500
And the nature of that battle is that you have to keep touching this issue
00:32:23.540
But as you know, The Bell Curve was not focused on race.
00:32:28.960
And the truth is that, and you almost alluded to this in what you just said, the truth is
00:32:34.720
that Murray is just as worried about unearned privilege as you are.
00:32:40.620
He's just worried about a different kind of privilege.
00:32:45.440
And The Bell Curve is an 800-page lament on this type of privilege.
00:32:50.960
And again, it has nothing in principle to do with race.
00:32:54.020
Murray is just as worried about the white people on the left side of the IQ distribution
00:33:01.760
And you could have said it would be just as true to describe him as having been involved
00:33:08.460
in a decades-long focus on the superiority of Asians over white people, OK?
00:33:16.940
And, you know, you might ask yourself why you didn't do that.
00:33:19.680
But I want to read a quote from Murray on my podcast because this is, again, I'm not
00:33:36.380
If there's one thing that writing The Bell Curve did, it sensitized me to the extent to
00:33:40.420
which high IQ is pure luck, that none of us earn our IQ, whether it's by nature or nurture.
00:33:49.680
Hard work and perseverance and all those other qualities are great, but we can't take credit for them.
00:33:56.320
We live in a society that is tailor-made for people with high IQs.
00:34:01.180
The people who got the short end of the stick in that lottery deserve our admiration and
00:34:10.620
He is worried about a world where success is determined by a narrow range of abilities.
00:34:16.740
And these abilities, whether they come from nature or nurture, are distributed unequally.
00:34:25.100
We just know that they can't possibly be equal, both among individuals and across groups.
00:34:30.800
And when you're talking about the averages in groups.
00:34:32.560
And he's totally committed, as I am, again, I don't know how many times you have to reiterate
00:34:37.600
this in a podcast to make it stick, but the punchline here is that everyone has to be treated as an individual.
00:34:46.940
I mean, there's more variance within a group than between groups.
00:34:50.260
And everyone has to be encountered on their own merits.
00:34:55.060
So to paint him as callous and as racist and as essentially a white supremacist, you're
00:35:01.620
saying he's fixated on the inferiority of blacks, on your account.
00:35:08.120
And that that's the kind of wrong that I was trying to address by giving him a platform
00:35:14.260
And that is what produced so much outrage in me in our email exchange.
00:35:18.980
When I hear this, I actually really wonder how much I want to be careful here.
00:35:27.200
When I wrote my very first piece as a journalist in Washington, it was a piece about poverty.
00:35:46.700
And the quote you read from him about luck, I want to put a pin in that because there's
00:35:50.920
a whole conversation I want to have with you about that quote.
00:35:53.820
If Charles Murray followed what that quote implies, I think things would look very different
00:36:00.880
But I do think I need to go through some of what you said here.
00:36:03.580
So first, I don't know how much you understand Charles Murray's career.
00:36:12.140
In the interest of time and basic human sanity here, Ezra, I'm worried
00:36:16.580
that what you're going to do, all the stuff you're going to cover, is actually
00:36:21.640
Hey, Sam, I've let you, I've let you have your say.
00:36:24.080
I'm going to, I'm just going to, I'm just going to keep going.
00:36:27.200
But I just want to prevent your and the listeners' frustration here, because if you go on for
00:36:31.520
10 minutes for me to only say, well, again, his social policies are not social policies
00:36:38.960
We're going to go we're going to go through all this.
00:36:40.460
And I, I don't mean this to be sharp, but you don't give short answers yourself.
00:36:45.020
So, you know, we're just going to have to indulge one another here.
00:36:53.280
And again, he says about that book, a lot of whites think they're racist.
00:37:00.320
The way Murray often defends The Bell Curve is by saying, hey, look, it only had this one
00:37:05.000
And he's completely right, or actually it's a couple of chapters.
00:37:07.840
The chapters where that is mentioned, they are not the bulk of the book.
00:37:10.880
But I'm actually a publisher of pieces and I work with a lot of authors on book excerpts.
00:37:16.420
The furor around The Bell Curve is not around the book, which is a long book.
00:37:21.980
It's that the part of the book that he had excerpted in The New Republic, on the cover
00:37:28.620
of The New Republic under Andrew Sullivan, the cover of The New Republic, it just says in
00:37:35.380
The reason that is the part people focus on is that they pulled the most controversial
00:37:42.560
I know that authors, when they don't want their most controversial part to define the
00:37:52.420
I don't know if you've ever read or even are that familiar with Human Accomplishment.
00:37:56.300
Just to be on the record here, I've read The Bell Curve and I've read Coming Apart.
00:38:03.380
And Coming Apart just spells out his concern about the cognitive stratification of society.
00:38:08.620
So Human Accomplishment is a book where Murray, and this comes right after The Bell Curve.
00:38:12.720
And when I describe this book, I almost feel like people are not going to believe me.
00:38:17.260
Murray wants to quantify the human achievements of different races.
00:38:21.100
And the way he does that is he looks in a bunch of encyclopedias and he literally counts up the
00:38:26.660
amount of space given to the accomplishments of artists and philosophers and scientists
00:38:33.740
And he uses that to say, European Americans, Europeans, white Europeans have done the most
00:38:41.880
One criticism that I and other people have of Murray is that he often looks at indicators
00:38:47.480
that reflect inequality and uses them to justify inequality.
00:38:53.680
That book is like one of the most massive correlation-causation errors I can possibly imagine.
00:38:58.960
So now, the next thing you say is that in doing this, I am conflating two things.
00:39:03.860
I am conflating just a calm discussion you two had about the science with a social policy
00:39:10.660
I want to read you actually what was said in your discussion with Murray about this, because
00:39:19.200
When you were talking with Murray, one thing I think to your credit is you repeatedly asked
00:39:25.640
Why have this whole discussion about race and IQ?
00:39:30.080
So you say, why seek data on racial differences at all?
00:39:34.880
And Murray responds, and again, I'm quoting, because we now have social policy embedded in
00:39:39.880
employment policy, in academic policy, which is based on the premise that everybody's equal
00:39:44.060
above the neck, whether it is men or women or whether it is ethnicity.
00:39:48.060
And when you have that embedded into law, you have a variety of bad things happen.
00:39:53.240
You say, needless to say, I'm sure we can find white supremacist organizations who love the
00:39:57.920
fact that The Bell Curve was published and admonish their members to read it at the first opportunity.
00:40:03.540
How does this help society give more information about racial difference?
00:40:07.000
And Murray, again, I'm not going to read the whole thing because I think that would be
00:40:09.420
dull, gives a long answer about affirmative action and why it is bad.
00:40:13.400
So I am not the one conflating this, number one.
00:40:21.380
And the reason I care about this stuff is because I care about what the actual social policy
00:40:26.540
Ezra, then you don't know what I mean by conflate.
00:40:30.040
Sam, you can respond to everything when I'm done.
00:40:34.920
The final thing that you did in your answer to me here was you said again and again, people
00:40:40.220
pretending to believe politically correct ideas, people pretending to believe bad evidence.
00:40:46.440
I don't doubt your sincerity in this, but I can assure you that Nisbett and Paige Harden
00:40:51.980
and Eric Turkheimer and me, we actually believe what we believe.
00:40:57.340
And one of the things that has honestly been frustrating to me in dealing with you is you
00:41:01.640
have a kind of a very sensitive ear to where you feel that somebody has insulted you, but
00:41:09.620
During this discussion, you have called me and not through implication, not through something
00:41:14.120
where you're reading in between the lines, you've called me a slanderer, a liar, intellectually
00:41:18.120
dishonest, a bad faith actor, cynically motivated by profit, defamatory, libelous.
00:41:23.020
You've called Turkheimer and Nisbett and Paige Harden, you've called them fringe.
00:41:28.400
You've said just here that they're part of a politically correct moral panic.
00:41:32.080
I do think that you need to do a little bit more here to credit the idea that there just is a disagreement here.
00:41:38.760
And it's a disagreement in part because people are looking at different parts of this with
00:41:41.980
different emphasis, but also disagreement because people look at this issue and see different things.
00:41:47.100
I often hear you on your podcast talk about how it's important to try to extend
00:41:54.740
And one thing that is annoying is that, you know, the one thing that
00:41:59.000
I have not done is assume that you don't believe what you believe.
00:42:02.080
Everybody here is trying to have an argument about something that is important, that in
00:42:05.080
Murray's words is about how we end up that should feed into how we order society, what
00:42:13.160
And that's not just a high stakes conversation.
00:42:21.740
I guess there's two topics here that I should address.
00:42:24.440
I think we have to talk about what it means to insinuate that someone's racist.
00:42:32.920
I get that you see that he thinks his social policies are justified by what he thinks is
00:42:38.960
empirically true in the world of data and facts and human difference.
00:42:45.780
And you're worried that if one takes the data seriously in the way that he takes it
00:42:51.440
seriously, if one endorses his interpretation of the data from psychology or psychometrics or
00:42:58.520
behavioral genetics, that that will lead to social policies that you find abhorrent or that
00:43:04.960
you think will produce a massive amount of inequality or suffering or something wrong.
00:43:11.960
But the conflation is that talking about data is one thing.
00:43:18.160
Talking about what should be done in light of the facts that you acknowledge to be true is another.
00:43:25.940
And there can be good faith disagreements in both of those conversations.
00:43:30.060
Those conversations are not inextricably linked.
00:43:33.380
And what I am noticing here is, and what I've called a moral panic, is that there are people
00:43:39.340
who think that if we don't make certain ideas, certain facts taboo to discuss, if we don't
00:43:47.140
impose a massive reputational cost in discussing these things, then terrible things will happen
00:43:55.500
That the only way to protect our politics is to be, again, this is a loaded term, but
00:44:02.120
this is what is happening from my view scientifically, is to be intellectually dishonest, to be led
00:44:15.400
And everything you've said about the politics and the historical wrongs of racism, which
00:44:20.220
you wrote about a lot in your last piece, I totally agree with, okay?
00:44:24.360
And I'm probably more aligned with you politically than I am with Murray, which is to say that
00:44:32.180
I share the bias that is leading you to frame the matter the way you're framing it.
00:44:36.640
Again, I probably should have spelled this out in the beginning of my podcast with Murray,
00:44:42.940
I don't think it would have made a bit of difference, but I still should have done it.
00:44:46.940
And I think it would have been called anodyne, the way that Nisbett et al. are talking
00:44:54.040
But I think everything you say about the history of racism is true.
00:44:57.600
I think you could well be on the right side of a good debate about social policy, and your
00:45:07.860
So this goes to the charge of bad faith against you, which in this conversation I admitted might be unwarranted.
00:45:16.380
You might not be the Glenn Greenwald character I read you to be at a certain point in that
00:45:23.080
So let's just assume, as you say, that you feel intellectually scrupulous and ethically
00:45:32.680
And you feel this way because you are concerned about racism, you're horrified by the history
00:45:37.200
of racism, and you feel that the kinds of social policies that Murray favors would be disastrous.
00:45:43.320
And again, I'm not arguing for those social policies, but your bias here, your connection
00:45:50.700
to the political outcomes when you're talking about the empirical science is causing you
00:45:57.220
to make journalistic errors, is causing Nisbett and Turkheimer to make errors of scientific
00:46:03.940
I mean, in your last piece, you have this whole section on the Flynn effect and how the Flynn
00:46:07.660
effect should be read as accounting for the black-white differences in purely environmental terms.
00:46:13.440
Well, even Flynn rejects that interpretation of the Flynn effect.
00:46:16.980
I mean, he had originally hoped, he publicly hoped that his effect would account for that,
00:46:21.820
but now he has acknowledged that the data don't suggest that.
00:46:25.560
And there are many other errors of this kind that you and Nisbett and Turkheimer are making
00:46:30.880
when you criticize me and Murray, and you criticize Murray for errors that he didn't make.
00:46:37.300
And in order for you to imagine that I'm equally biased, right?
00:46:45.800
Why am I looking at the same facts that Nisbett and Turkheimer and Harden are looking at, and
00:46:53.120
You have to imagine that I have an equal and opposite passion, that I feel equally righteous,
00:47:02.660
I would have to be a grand dragon of the KKK to feel an equal and opposite bias on these data.
00:47:11.300
And you've already said you don't think I'm a racist, but that's what would have to be true of me,
00:47:16.720
to be as biased as you are, again, understandably, given the history of racism on these data.
00:47:26.120
What you have in me is someone who shares most of your political concerns and yet is unwilling to,
00:47:36.680
again, a loaded word, lie about what is and is not a good reading of empirical data and what is and
00:47:44.800
is not a good argument about genetics and environment and what is reasonable to presume based on what we know.
00:47:52.680
And again, the problem is, is that even if we never look for these things again, even if we follow this
00:47:59.620
taboo and decide that it's just, there's no ethical reason to ever look at population differences,
00:48:09.080
They're just going to spring out of our study of intelligence generically or human genetics generically.
00:48:17.160
It's happened on other topics already and people try to keep quiet about it because, again,
00:48:23.300
the environment journalistically and politically is so charged.
00:48:27.140
And my criticism of you has been from day one that you are contributing to that political charge.
00:48:34.580
And it's totally unnecessary because the political answer is so clear.
00:48:39.760
The political answer is we have to be committed to racial equality and everyone getting all the
00:48:46.640
opportunities in life for happiness and self-actualization that they can use.
00:48:51.680
And we're nowhere near achieving that kind of society.
00:48:55.220
And the real racists are the people who are not committed to those goals.
00:49:02.340
I actually really appreciate that answer because I think it helps open this up.
00:49:10.800
One is, one of my macro, one of the things I've come to think about you that I actually
00:49:16.040
did not come into this believing is you're very quick to see a lot of psychological tendencies,
00:49:25.800
cognitive fallacies, et cetera, in others that you don't see applying to yourself or people
00:49:33.800
So you say words in there like confirmation bias, et cetera, to me about Murray, about how
00:49:40.580
And my whole, the whole thing I just told you is that Charles Murray is a guy who works at
00:49:45.220
conservative think tanks, whose first book was about why we should get rid of the
00:49:48.500
welfare state, whose whole life's work is about breaking down social policy.
00:49:54.420
So to the extent that I have any biases that flow backwards from political commitments,
00:50:04.040
I promise you I will get to your bias very quickly.
00:50:06.580
I do want to note, you mentioned James Flynn here. To prepare for this conversation,
00:50:14.200
His read of the evidence right now, and this is me quoting him, he says, I think it is
00:50:19.320
more probable than not that the IQ difference between black and white Americans is environmental.
00:50:24.880
As a social scientist, I cannot be sure if they have a genetic advantage or disadvantage.
00:50:30.640
So I'm just, that is what James Flynn thinks as of Monday.
00:50:34.020
So then you ask me, and I think this is a great, this is a good question, because I think
00:50:38.100
this gets to the core of this, and it gets to where I tried to open us up into.
00:50:42.000
Your view of this debate is that to say that you have a bias in it is to say, in your
00:50:48.700
terms, that you're like the Grand Dragon of the KKK, that the only version of a bias
00:50:53.220
that could be influencing what you see here is a core form of racism.
00:50:57.540
That's actually not my view of you, but I do think you, I do think you have a bias.
00:51:08.340
And you have a lot of difficulty extending an assumption of good faith
00:51:12.000
to anyone who disagrees with you on an issue that you code as identity politics.
00:51:17.780
And there's a place, actually, where I think you got into this in a pretty interesting way.
00:51:21.720
I went back and I read your discussion with Glenn Loury at the beginning when you're talking
00:51:28.480
You say my goal was to find an African-American intellectual who could really get into the
00:51:33.780
details with me, but whom I also trusted to have a truly rational conversation that wouldn't
00:51:42.020
To you, engaging in identity politics discredits your ability to participate in a rational conversation.
00:51:48.600
And it's something, as far as I can tell, that you do not see yourself as doing.
00:51:53.720
So here's my question for you on that specific quote.
00:51:56.520
What does it mean to you, particularly when you're talking about something like race?
00:52:04.520
If you'd like to continue listening to this conversation, you'll need to subscribe at
00:52:17.000
Once you do, you'll get access to all full-length episodes of the Making Sense podcast, along
00:52:21.560
with other subscriber-only content, including bonus episodes, AMAs, and the conversations
00:52:28.540
The Making Sense podcast is ad-free and relies entirely on listener support, and you can