Making Sense - Sam Harris - March 09, 2016


#31 — Evolving Minds


Episode Stats

Length

42 minutes

Words per Minute

196.5

Word Count

8,286

Sentence Count

433


Summary

Jonathan Haidt, author of The Happiness Hypothesis and The Righteous Mind, joins me to talk about our history of public disagreement and whether two people with a fairly acrimonious past can still make intellectual progress together. We revisit his criticism of the new atheists and my responses to it, and then get into the substance: whether specific religious doctrines explain behavior, the difference between fundamentalism and religion more generally, rationalism versus intuitionism, group (multi-level) selection, and what our evolved, tribal psychology implies for building a global society. It's a bit of a high-wire act, but I think you'll agree it was worth the risk. If you like what we're doing here, please consider becoming a supporter of the podcast. We don't run ads, so it's made possible entirely through the support of our subscribers. -Sam Harris


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.520 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.880 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.420 And so here I reached out to Jonathan Haidt, who is a professor of ethical leadership at
00:00:51.020 NYU's Stern School of Business.
00:00:53.100 He's a very well-known psychologist.
00:00:55.140 Many of you know his work, and he taught at UVA for many years.
00:01:00.860 And he's the author of The Happiness Hypothesis and The Righteous Mind.
00:01:04.880 He and I have collided with one another on a number of occasions, and this conversation
00:01:10.180 could have gone either way.
00:01:11.980 I was not surprised that it was as successful as it was.
00:01:16.520 But it was a risk, like many of these things are, and this one paid off.
00:01:20.780 We come out of a history of strong and even ad hominem criticism of one another, and we
00:01:29.460 make progress.
00:01:30.660 I now give you John Haidt.
00:01:38.540 Well, I'm here with Jonathan Haidt.
00:01:40.440 John, thanks for coming on the podcast.
00:01:42.280 My pleasure, Sam.
00:01:43.000 Looking forward to it.
00:01:43.760 Well, listen, before we get into topics about which we agree, and there are a lot of them,
00:01:48.480 let's start with areas of disagreement, because we've had a few past controversies, which I
00:01:53.320 think our listeners should know about.
00:01:54.860 So many of our listeners will know who you are, because you've done extremely influential
00:01:57.980 work in psychology and have covered many topics that are really just of enormous importance
00:02:02.440 outside psychology.
00:02:03.740 But many might not know the history of our bickering in public.
00:02:07.020 Right.
00:02:07.320 So you've been among the prominent critics of the so-called new atheists who have gone
00:02:12.260 after Richard Dawkins and Christopher Hitchens and Dan Dennett and me for what we've said
00:02:17.660 about religion, principally.
00:02:19.140 And you spent less time on Hitch because he didn't claim to be representing science.
00:02:22.920 So you've criticized me in the past and in ways that I thought were pretty wrongheaded.
00:02:28.040 And I push back fairly hard against this in ways that may have bordered on incivility at
00:02:33.580 times.
00:02:34.320 And so the way things were left, I don't think either of us would have tended to see
00:02:38.360 the other as a natural collaborator.
00:02:40.460 That's right.
00:02:40.760 So I mean, I find this interesting just as a social phenomenon.
00:02:44.040 I find it intellectually quite consequential that people stop talking to one another after
00:02:48.960 they have certain collisions in public.
00:02:51.000 And more and more, I've been attempting to engage people with whom I've had a strong disagreement
00:02:55.500 on important topics just to see if conversation is possible.
00:02:59.720 And I should just point out to you and to our listeners who will know that this doesn't
00:03:03.940 always work out.
00:03:04.740 I had one podcast that I didn't even release because it went so badly.
00:03:08.360 And I had one that I released recently and probably shouldn't have because it just seemed
00:03:12.360 to do nothing more than increase the sum total of frustration in the universe.
00:03:16.300 It was just people found it excruciating to listen to.
00:03:18.760 And I've had very mixed success doing this in writing.
00:03:21.520 The most memorable failure being that I attempted to engage Noam Chomsky and that project just
00:03:26.260 fell apart as fast as I could type.
00:03:28.340 That does not sound promising.
00:03:29.880 So, but I suspect you and I are up to this.
00:03:32.680 So, you know, we don't need to spend a lot of time rehearsing our past skirmishes, but
00:03:36.440 I just want to, and we can just discover what we disagree about now as the topics come up.
00:03:41.360 But I just want our listeners to know that this history exists and it was fairly acrimonious.
00:03:46.480 And they should just appreciate that you and I are doing a bit of a high wire act just having
00:03:50.820 this conversation because most people with our history just actually don't willingly talk to each
00:03:55.780 other in these ways. And so, again, my underlying aim is to demonstrate that two people can
00:04:00.520 have a fairly inauspicious beginning and then successfully communicate and make intellectual
00:04:05.420 progress.
00:04:05.880 Great. I want that too.
00:04:07.740 Cool.
00:04:07.960 And actually, you know, in preparing for this call, I was looking back over our past conflicts
00:04:11.120 and, you know, looking at it as a psychologist who studies morality and moral disagreement,
00:04:15.220 I actually think it's kind of revealing the way this all worked.
00:04:18.760 So, you know, initially, as far as I can tell, the first salvo was when I wrote that essay
00:04:25.080 on edge, very critical of the new atheists.
00:04:28.840 And, you know, I don't think that I was uncivil there, although it was within the bounds of
00:04:32.440 normal edge conversation.
00:04:33.760 It's, you know, edge is not a safe zone.
00:04:35.540 I think you and I both agree that intellectual discourse should not be a safe zone.
00:04:40.860 Um, and then you, uh, you wrote back and, and from my point of view, um, it was when
00:04:46.560 you, you were comparing, you were, uh, saying my ideas would basically, you know, justify
00:04:51.200 or lead to Aztec human sacrifice and, and all these other horrible things.
00:04:54.980 And okay, you know, that too is within, within the bounds.
00:04:57.820 Um, all right.
00:04:58.620 So, so we're sort of up against the edge there, but that's sort of normal.
00:05:01.400 Then if I remember the timing, it was like right after that, that we first met face to
00:05:05.360 face at the Beyond Belief conference.
00:05:07.240 Um, and there too, um, if I remember there, I think you said like my beliefs would lead
00:05:13.360 to either North Korea or something like that.
00:05:16.480 So again, well, not so, I don't, I don't think I would have ever said that it would lead to,
00:05:19.920 but just that you would be hard pressed to say what was wrong with those systems by your
00:05:24.760 lights.
00:05:25.460 Okay.
00:05:26.060 Okay.
00:05:26.620 I mean, so it may be a distinction without a difference in your mind, but it's, it's a
00:05:30.800 pretty important one.
00:05:31.560 That is, that is.
00:05:32.100 But the point is just that from, you know, the way it felt to me was no matter what I say,
00:05:36.060 you will, you will link me somehow to, uh, you know, North Korea or something like that.
00:05:42.300 I don't think you ever did the Nazis.
00:05:43.560 And I thank you for that.
00:05:44.840 Um, but the point is that I felt that you had a particular rhetorical style, which was
00:05:49.480 well suited for what you were doing.
00:05:51.260 You were writing for a popular audience about a very hot topic, but I felt like, Hey, Sam's
00:05:56.760 rhetoric.
00:05:57.080 This is like not academic rhetoric.
00:05:58.960 This is very different and I don't like it.
00:06:01.500 All right.
00:06:01.860 So that's the backstory.
00:06:03.140 So, but you know, I never really responded.
00:06:04.760 I didn't do anything in writing after that.
00:06:06.100 And then when you wrote a book on morality, um, in which again, you were critical of me
00:06:11.140 in the same way.
00:06:11.620 And again, that's fine.
00:06:12.880 Um, and, but it was like, uh, okay, you had another provocation.
00:06:16.580 I didn't do anything.
00:06:17.080 And then when you came out with the moral landscape challenge and you were saying, if anyone can
00:06:22.540 pay me, if anyone can convince me to change my mind, I'll pay him $10,000.
00:06:26.840 And that was like too much.
00:06:27.860 It was like, Oh my God, this is too much.
00:06:29.620 I've got to respond to this.
00:06:31.080 Um, and so then I wrote that, the essay, why Sam Harris is unlikely to change his mind.
00:06:35.260 And for the most part, as I was just reading that over, I think it's, you know, it's a
00:06:38.760 perfectly legitimate statement of my research and how my research leads me to believe that
00:06:43.780 you won't change your mind.
00:06:45.040 So all that's fine.
00:06:45.600 The thing though, is that clearly was a kind of a jerk move on my part, um, was I think
00:06:50.700 throwing, you know, I analyze, so for listeners who don't know this, uh, this story, this debate,
00:06:55.260 um, I analyzed Sam's, uh, your books, I analyzed your books and a bunch of other books.
00:07:00.000 And I found that according to this program, LIWC, that counts words like always and never
00:07:04.660 in a category of certainty, uh, that you came out as the most certain person, uh, even more
00:07:10.720 so than Glenn Beck and Sean Hannity and all those guys.
00:07:13.580 So that was a very jerk thing of me to do.
00:07:16.120 And Sam, I do apologize to you for that.
00:07:17.900 That was inappropriate.
00:07:18.660 Well, but it did give me a chance to respond to that, which I still am to, uh, pat my own
00:07:23.640 back.
00:07:24.000 I still am amused by my response to that, where I used every one of your keywords in a paragraph,
00:07:28.980 which was a, on its face, a statement of total intellectual humility and openness to
00:07:34.660 being wrong.
00:07:35.300 But it in fact used all your certainty terms in the same paragraph.
00:07:38.340 That's right.
00:07:38.780 And so look, Sam, so in fact, that kind of points to a similarity between us, which is that
00:07:42.420 we're both, we both really enjoy being clever.
00:07:45.880 That was a, you know, my thing was a very clever thing.
00:07:48.700 Your response where you used all those, all those words, you know, was, you know, we were
00:07:53.000 sort of smart alecks.
00:07:54.600 Um, and you know, when smart alecks come up against each other, uh, the audience is in for a treat,
00:07:59.060 I suppose.
00:07:59.420 Well, but not to trivialize what we're doing here, because I think these issues are hugely
00:08:04.580 important.
00:08:05.200 And I think our, our disagreements are important.
00:08:08.200 And I think our misunderstanding one another is important.
00:08:10.480 And just to talk a little bit more about the genesis of this thing, which it occurs to
00:08:14.380 me now, I never really thought about this as we were sparring about these issues, but
00:08:19.820 I think this may be part of the problem.
00:08:21.700 And, and, you know, correct me if this seems crazy, but so your field is social psychology,
00:08:26.340 where you've said that upwards of 95% of people are liberal and usually strongly liberal.
00:08:32.020 So you've been surrounded by people who consider political conservatism to be a form of mental
00:08:37.480 illness, essentially.
00:08:38.660 And you've pushed back against this in ways that have been extremely important and really
00:08:43.140 ingenious.
00:08:44.300 And you and I are going to agree about many of the points you've raised against liberals.
00:08:47.920 Oh yeah, the political correctness part.
00:08:48.520 We both have really come up against political correctness.
00:08:50.480 Yeah, yeah.
00:08:51.380 So you've been, so you've been fighting from that trench for a while.
00:08:54.380 And then when you saw the so-called new atheists, which are just a gang of liberal intellectuals,
00:08:59.460 initiate this frontal assault on religion and arguing that it's not only false, but dangerous.
00:09:04.820 And in my case, hearing me say that science will eventually replace religion on questions
00:09:08.820 of morality and human wellbeing.
00:09:10.580 I think you viewed this as yet another example of left-leaning secularists who are totally out
00:09:16.600 of touch with the lived experience of religious people doing what left-leaning academics often
00:09:21.360 do to social conservatives, which is dismiss them as morally and intellectually defective.
00:09:25.420 I think that's right.
00:09:26.300 Oh, good.
00:09:26.840 Well, that's not crazy.
00:09:27.980 So, I mean, so forgive me for psychoanalyzing you, but it's just, it seems to me that this,
00:09:32.340 from my point of view, has caused you to be too hostile to our criticism of religion
00:09:37.140 and to actually misunderstand it in important ways.
00:09:39.760 And I'm sure we'll touch those points.
00:09:41.500 And it's also made you too soft on religion in ways that can't be scientifically justified.
00:09:48.020 And because you believe you're correcting for a harmful bias in the scientific community,
00:09:52.540 and you have been with respect to the political divide between liberals and conservatives.
00:09:57.040 But I would argue that viewing the new atheist attack on religion through that lens has caused
00:10:02.040 you to misread us.
00:10:03.120 And at the very least, I feel like you've misread me.
00:10:05.240 I see.
00:10:05.440 Yeah, I don't think that I perceive you guys as a bunch of far-left people.
00:10:11.140 So, while there is some truth to what you say, I think it's not so much left-right as
00:10:15.640 sort of rationalist intuitionist.
00:10:18.120 That really, I think, is the heart, that sort of the scientific nub of the difference between
00:10:23.600 us is what do we each believe is the nature of human rationality and the reliability of human
00:10:31.000 reason?
00:10:31.440 And you have a much stronger belief that individual reason can lead to reliable conclusions than
00:10:40.960 I do.
00:10:41.540 So, would you agree with me that that is a fundamental, factual difference between us?
00:10:45.640 Yeah.
00:10:46.240 And I want to get into morality second.
00:10:49.480 I think we should deal with religion first.
00:10:51.340 But yes, I think that is a difference between us to some degree, although you'll find me taking
00:10:55.820 most of your points about what people descriptively do under the aegis of reasoning morally or
00:11:03.340 attempting to justify or argue for their moral positions.
00:11:07.840 But let's just focus on religion for a second, and we'll get to the foundations of morality
00:11:11.200 after that.
00:11:12.400 So, religion, as you've pointed out, is more than just a set of beliefs.
00:11:16.560 And you've argued against me as though I have disputed that, which I actually haven't.
00:11:21.940 But you're not alone in this.
00:11:22.980 Many people do that.
00:11:24.200 So, I just wanted to track through a few of the things you say in your book and then talk
00:11:27.980 about them.
00:11:28.860 So, you say in your book, The Righteous Mind, that trying to, this is a quote, trying to
00:11:34.480 understand the persistence and passion of religion by studying beliefs about God is
00:11:39.120 like trying to understand the persistence and passion of college football by studying the
00:11:43.180 movements of the ball.
00:11:44.060 You've got to broaden the inquiry, end quote.
00:11:46.900 So, now, I think that analogy isn't quite right, but I actually agree with your general
00:11:52.320 point.
00:11:52.760 Religion is obviously more than what people believe.
00:11:55.780 And yet, I think it's totally coherent and, in fact, necessary to worry about the specific
00:12:01.580 consequences of specific beliefs.
00:12:03.960 Yes.
00:12:04.580 And so, let me just reform your analogy a little bit and get you to react to it.
00:12:08.740 So, because I think it's somewhat, to stick with your analogy, it's a little bit more like
00:12:12.780 asking the question, why are people on each team always tending to run in one direction?
00:12:18.660 I mean, so, if you see them running sideways or even backwards for a few moments, it's always
00:12:23.080 with the purpose to get the ball to the other end of the field.
00:12:25.760 So, what is so special about the ends of the field that everyone wants to get there?
00:12:30.540 And to explain that, you have to understand the rules of the game.
00:12:34.380 In particular, you have to understand what a touchdown is.
00:12:36.660 But once you know that, more or less everything these people are doing is easy to understand.
00:12:42.640 And again, there's more to, I mean, there's all the people out, you know, having tailgate
00:12:46.920 parties outside the stadium, right?
00:12:48.480 So, that's part of the spectacle.
00:12:50.220 But to understand what the most energized participants are doing in this situation, all you really need
00:12:56.540 to know is what they want and what they believe will get them what they want.
00:13:01.020 And so, I would argue that this is true for the most destructive behavior and moral attitudes
00:13:06.440 we see inspired by religion.
00:13:08.180 So, when you ask yourself, you know, why is ISIS throwing gay people off of rooftops?
00:13:12.220 It's because their scripture tells them to.
00:13:14.360 It's actually written in the rulebook.
00:13:16.580 Now, in this case, the specific injunction is in the Hadith, not in the Quran, but it's part
00:13:21.640 of the larger rulebook of Salafi Islam.
00:13:24.360 And, I mean, you can say anything you want about religion being more than just beliefs and
00:13:28.500 doctrines, and you can talk about doing and belonging, which you do, in addition to belief
00:13:33.000 as being central to religion.
00:13:34.340 And you can talk about the power of ritual and strong communities and the importance
00:13:38.540 of transcendence, which is something that interests me.
00:13:40.940 And I agree about all these things being interesting.
00:13:43.400 But if you want to explain the behavior of ISIS, all you really have to know are the rules
00:13:47.740 of the game as they understand them.
00:13:49.380 And if their rulebook changed slightly, let's say their rulebook on this point said, don't
00:13:54.580 harm homosexuals under any circumstances.
00:13:56.840 Simply force them to recite the Quran for 12 hours a day and actually create a special
00:14:01.880 caste of priests, that is, homosexuals who just chant from the Quran and who are otherwise
00:14:08.280 venerated, right?
00:14:09.540 I think we can be absolutely sure that this is what they would be doing.
00:14:13.460 In fact, there are analogous behaviors in other religions in human history.
00:14:16.880 So this is why I think specific doctrines matter.
00:14:19.860 And that no one, I mean, so you're going to talk about the intuitive roots of many of these
00:14:23.080 things, but no one has an intuition that they should throw gays off of rooftops specifically,
00:14:27.820 or eat a cracker every Sunday and call it the body of Jesus, or oppose embryonic stem cell
00:14:32.740 research.
00:14:33.340 And in fact, ISIS wouldn't even oppose embryonic stem cell research, and the Catholic Church
00:14:37.960 would.
00:14:38.420 And this is why the specific doctrines matter so much.
00:14:41.560 Okay, so I will certainly grant that specific doctrines matter, and that I think your thought
00:14:47.140 experiment is correct.
00:14:48.000 If there was a specific verse, and especially if it appeared in multiple places, that said,
00:14:52.320 here's how you treat homosexuals, you know, then they would treat them differently.
00:14:55.600 So I don't deny that the scripture matters.
00:14:58.360 But first, to understand your analogy, you tell me, what is the end zone?
00:15:02.440 What do you think they're all up to?
00:15:04.480 What is the thing they're all striving for to get when you use this end zone analogy?
00:15:08.000 Well, if you're talking about the real players, the real believers who are devoting their lives
00:15:13.480 fully to this, the end zone is paradise and avoiding hell.
00:15:17.360 So it's what happens after death.
00:15:19.020 It's living by the, playing by this rule book, playing this game, advancing the ball down the
00:15:24.880 field is ensuring that after death, you will spend eternity in paradise and escape hell.
00:15:31.240 Okay, so I think this is one of the differences between us, is that I am opposed to the pursuit
00:15:37.000 of parsimony.
00:15:38.360 I think that human nature and the social sciences are so complex, and
00:15:43.440 especially if you look at morality or religion.
00:15:45.520 So anytime someone says, the goal of religion ultimately is to attain paradise, or the goal
00:15:50.520 of religion ultimately is to have a sense of meaning, or even closer to what I say, if
00:15:55.440 you were to say the goal of religion is ultimately social bonding and connection, well, those are
00:15:59.680 all goals.
00:16:00.060 There are lots of different goals.
00:16:01.420 In this case, I was talking about ISIS.
00:16:02.740 I was talking about the, what we would call the extreme committed, you know, death cult
00:16:07.160 of Islam.
00:16:08.400 Now there's analogous cults in, in other traditions, but I'm not saying that all religious people
00:16:14.640 in every denomination of every level of commitment, that their main goal is paradise.
00:16:19.440 Some, you know, some, you know, Unitarians don't necessarily even believe in heaven, right?
00:16:23.400 So I was speaking about ISIS in this case.
00:16:25.580 Okay, yeah, so I can certainly grant your point that beliefs do matter, and I hope I
00:16:29.660 never said that they don't, but I think I would still claim that your analysis here
00:16:37.780 is too focused on the explicit.
00:16:41.560 And so, and this was, you know, my main criticism, my main concern about your writings on religion
00:16:45.420 was I felt like, sometimes felt like you were writing, you know, your two religion books.
00:16:49.460 I felt like you were writing those mostly with the Bible and the Quran and the New York Times
00:16:54.340 on your desk.
00:16:55.560 And you were sort of saying, okay, well, look at this verse, or look at this event that
00:16:58.960 happened, and then just trying to make sense of it yourself.
00:17:02.140 And I was thinking of it much more both from a kind of a Durkheimian point of view, or a,
00:17:07.460 you know, unconscious motives point of view.
00:17:08.860 I mean, there's just so much going on here.
00:17:10.640 And I have not studied ISIS.
00:17:12.040 I don't know what's going on with them.
00:17:13.200 But I don't believe you could understand them by just by reading the Quran and saying,
00:17:17.580 oh, the Quran says this, that this is why ISIS is doing it.
00:17:20.460 Well, the thing is...
00:17:21.060 The motives of humiliation, geopolitical, I mean, I don't know what's going on, but
00:17:24.980 there's a lot going on.
00:17:26.800 But this is, I mean, the issue is that this is how they understand themselves.
00:17:30.860 And now here, I'm not just speaking about ISIS, I'm just speaking about religious fundamentalists
00:17:34.900 in general.
00:17:35.380 When you ask them how they understand what they're doing, if you ask them why homosexuality is an
00:17:41.240 anathema, for instance, they have a scriptural justification for it.
00:17:46.140 And it does explain the belief and subsequent behavior, and where in certain cases, nothing
00:17:54.420 else does.
00:17:55.320 I mean, so we might, it might be relatively easy to come up with other non-scriptural reasons
00:18:01.080 to be uncomfortable around the phenomenon of homosexuality.
00:18:05.360 And we can talk about that.
00:18:06.720 I mean, it gets into your kind of moral intuitions, the moral foundations theory.
00:18:09.960 But for many of these things, the only way this idea could ever get into someone's head
00:18:15.440 is based on the tradition and the explicit teaching on a specific point.
00:18:20.960 Agreed.
00:18:21.420 I agree with you on that point.
00:18:22.700 And so let's make a distinction that I think will be very helpful here, which is between
00:18:25.540 fundamentalism and religion more generally.
00:18:28.260 So if we're talking about fundamentalist movements, then you and I are going to agree much more,
00:18:32.080 including in the moral evaluation of them.
00:18:34.020 And so if we live in a diverse society, if we live in a society, or if we value progress
00:18:43.340 and open debate of ideas and challenging each other and the things we need for the sciences,
00:18:49.280 then fundamentalism is incompatible with all of those things.
00:18:52.340 Christian fundamentalism, Islamic fundamentalism, I would say politically correct fundamentalism
00:18:56.280 or social justice fundamentalism.
00:18:57.600 I think you and I both personally dislike fundamentalists, the fundamentalist mindset,
00:19:03.260 I should say.
00:19:03.620 I don't mean the people.
00:19:04.400 I mean, the fundamentalist mindset is opposed to values that you and I both hold as individuals
00:19:09.820 and for science.
00:19:11.620 So there, I think there's not as much disagreement between us.
00:19:14.520 But then if you say, what about non-fundamentalist?
00:19:17.060 That's where I think you're much more negative than I am about people who are religious but not
00:19:21.500 fundamentalist.
00:19:22.100 Is that true?
00:19:22.500 Well, yeah, I'm more negative in the sense that I feel like they make, one, honest talk
00:19:28.160 about the problem of fundamentalism much more difficult because they don't want anything
00:19:32.160 too critical said about their holy books or about a tradition of venerating the concept
00:19:37.120 of revelation, right?
00:19:38.340 I think we're, I think revelation is a problem here.
00:19:40.720 The idea that one of our books was not the product of the human mind but the product of
00:19:45.640 omniscience, that already just deranges our intellectual and moral discourse really beyond
00:19:51.780 saving and we have to get out of that part of the religion business.
00:19:56.980 And so that insofar as moderates and liberals do, well, then my only real concern is that,
00:20:02.860 well, I guess there are two more concerns.
00:20:04.240 One is they tend to not be intellectually honest about the process whereby they have become
00:20:09.360 moderate or liberal.
00:20:10.380 So they pretend that there's something in the tradition, in the books, that has been
00:20:15.760 self-purifying.
00:20:16.900 But no, when you go back to the books, they're every bit as theocratic as they always were.
00:20:20.840 What's happened is that they have collided with a wider set of values, secular values and
00:20:26.760 scientific insights and progress, and they have found being doctrinaire and dogmatic is
00:20:32.840 no longer how they want to live.
00:20:34.900 They can no longer justify it, but they're not really honest about just how that winnowing
00:20:39.440 has taken place.
00:20:40.540 And they tend to give credit to the resources of the tradition, whereas really it's the resources
00:20:45.500 of a much larger conversation that human beings have had.
00:20:48.040 Sure. So if you want to say that people are adopting positions and then searching for a
00:20:52.120 justification and looking for some sort of textual justification of what they've decided
00:20:56.000 to do intuitively, yep, I'm down with that.
00:20:58.260 That's the core of my research is that that's what a lot of our moral argument and justification
00:21:03.600 is all about.
00:21:04.700 Yeah. Yeah.
00:21:05.460 And so you do one, I just want to go back to your book briefly. You do one thing in your
00:21:09.700 book, which I, it's pretty clearly an area where we disagree. I don't think we need to
00:21:13.560 go into it any real depth because it may be a little too hard to parse here in a podcast,
00:21:18.340 but I think we should just flag it because it is one, I think it's also one reason why
00:21:23.460 you think I and Richard Dawkins and others have been too hard on religion. And it's this notion
00:21:29.480 that religion has provided an evolutionary benefit to us.
00:21:34.200 Is it an adaptation or a byproduct? You're right. That is the other core factual or scientific
00:21:39.460 issue that we disagree on.
00:21:40.820 Right. So I just want to introduce this, this concept of group selection to those who don't
00:21:45.440 know anything about it, and then we can table it probably. But so you defend this notion of,
00:21:49.500 of group selection and specifically the idea that, that religion has helped certain groups
00:21:55.440 survive and perhaps a lack of religion has caused others to fail. And you think that this mechanism
00:22:00.820 hasn't just been cultural, but that it's also been biological. And so this, this idea of group
00:22:06.640 selection, which obviously relates to much more than just religion, this is very controversial in
00:22:11.240 biology. And, and, you know, its main champion who you do side with here is someone named David Sloan
00:22:16.500 Wilson, who interestingly, he's also attacked the new atheists with a level of energy that I never
00:22:20.740 quite understood. So I should just point out that there are many biologists, and I would think
00:22:24.880 still most, as far as I can take the temperature on the whole field, disagree with, with this idea
00:22:31.020 of group selection. And so if our listeners are interested in it, I think the best summary of
00:22:35.660 the reasons to doubt that group selection occurs was written by, by our mutual friend, Steven Pinker,
00:22:41.600 and the title is The False Allure of Group Selection. And that can be found on edge.org.
00:22:46.140 I know you must be aware of that paper.
00:22:48.260 Oh yeah, no, I responded to it.
00:22:49.720 So, so you weren't persuaded by it.
00:22:51.100 So yeah, so that is what the debate comes down to, is if, you know, is religion a product of
00:22:57.500 evolution? Is it an adaptation? In which case, that doesn't mean it's still adaptive today,
00:23:01.660 but it would mean that it conferred some benefit. The really exciting idea that so captivated me when
00:23:08.080 I first read Dawkins in college was, wow, what if it's like a virus? What if there, it's just,
00:23:14.820 it's taken advantage of the hardware up there and it's exploiting it for its purposes. And of course,
00:23:20.360 Dawkins and Dennett are, you know, are really explicit about that. It's a really cool idea.
00:23:24.800 And I used to believe it. And that was the prevailing wisdom. You know, Dawkins' book,
00:23:29.020 The Selfish Gene, was an incredibly powerful book and a testament to the power of good writing,
00:23:34.160 to be persuasive. So the state of the art in the 70s and 80s was, as you say, that most biologists
00:23:39.220 doubted it. In fact, almost all did. Group selection was dismissed because there wasn't
00:23:43.940 any way to solve the free rider problem. If groups were to cooperate for the benefit of the
00:23:48.040 group, any free rider within the group would get extra benefit and the genes for free riding would
00:23:52.600 spread. And so the topic was put aside and David Sloan Wilson was seen as a lone crazy. But a lot has
00:23:58.480 changed since then. So right around that time, the whole idea of major transitions in evolution was
00:24:03.440 being formulated. And there are many other examples of agents that were functioning at an individual
00:24:10.520 level, competing with each other, coming together to be more effective as a group. And even the cells
00:24:16.280 in our body are an example of that. The mitochondria have their own DNA because it was an example of a
00:24:21.140 major transition where multiple entities got together to act as a group. Let's see, what else was there?
00:24:27.220 I go through in my book, I go through as though there were four new exhibits, four reasons to
00:24:33.380 re-examine the case since the 1970s, gene culture co-evolution, things like that. And while it is
00:24:40.140 still true that biologists mostly seem to side against this, this is actually because I think
00:24:47.300 E.O. Wilson made a big mistake in writing a paper. I love him. I think he's mostly right about things.
00:24:52.720 But I think he made a big mistake in writing a paper saying that kin selection doesn't matter.
00:24:58.780 And I don't understand. I don't think that makes any sense. Kin selection is really powerful.
00:25:02.460 So he took a lot of flack and people are conflating his rejection of kin selection with his endorsement
00:25:08.700 of groups or I should say multi-level selection. So just the final point on this is the whole debate
00:25:14.420 since the 60s with Williams and then Dawkins was always looking at altruism. Can we explain
00:25:20.060 altruism as a product of group selection? We are nice to each other because the benefits then to
00:25:27.120 the group outweigh the cost to me as an individual. And so my response to Steve Pinker was, well, if you
00:25:34.580 just focus on being nice or altruistic, well then, yeah, it's kind of hard to argue that this is from
00:25:39.980 group selection. But if you look at the tribalism, that's what really got me. That's why I'm on this
00:25:44.880 side of the debate. If you look at tribalism, how similar it is, how initiation rites all over the
00:25:50.340 world are actually mimicked in fraternity brothers' initiations. I don't think it's because they
00:25:55.340 studied anthropology. It's because there's something in the human mind that makes people, especially
00:26:00.900 young men, want to do things that involve painting their faces or changing their appearance, exposing
00:26:06.860 each other to extreme risk, doing all sorts of things that bond them together as a group, make them
00:26:11.660 quite dangerous, quite able to be predators of other groups. So I think you and I agree on those
00:26:16.460 external costs. So anyway, that's why I'm saying that if you focus on tribalism, you try to understand
00:26:20.640 that. I don't see how you can explain that from individual selection. And this is why I think that
00:26:26.760 the arguments for limited group selection were overwhelmed. That's why I say we're 90% chimp.
00:26:31.400 We're overwhelmingly evolved by individual level selection in the way that Dawkins describes it.
00:26:35.940 But we have this interesting tribal overlay. And I think that's essential for understanding not just
00:26:40.740 morality and religion, but politics, as we're going to talk about very soon.
00:26:45.040 Right, right. Well, I'd be remiss if I didn't say just a couple of words about why group selection
00:26:49.620 seems spooky from a more traditional evolutionary point of view. And then I'll just get off it because
00:26:55.660 I don't think we'll resolve it here. But I just think, you know, from the point of the criticism,
00:27:00.160 it seems to be a metaphor that gets taken too literally and that blurs the lines between genes
00:27:05.920 and individuals and groups as units of selection. So, you know, as you said, group selection is
00:27:10.840 often called multi-level selection.
00:27:12.500 Yeah, that's the way to think about it.
00:27:13.400 Right. But, you know, as Steve and others have pointed out, there are many problems with saying
00:27:18.280 that selection acts on groups in the same way that it acts on individuals to maximize their
00:27:23.500 inclusive fitness, or that it acts on them in the same way that it acts on genes, increasing
00:27:27.580 numbers of copies that appear in the next generation. So if there's a...these things are operating
00:27:33.040 differently, and I just...again, I'm dogged by the fact that I feel like this is a little
00:27:38.480 too hard to parse in a podcast for people to listen, so...
00:27:41.560 Right. We can skip it. We can just point people. Actually, I think, you know what, chapter nine
00:27:45.800 of my book...so let's do this. I have made chapter nine of The Righteous Mind available for free
00:27:51.780 on my webpage. So if people go to righteousmind.com, they can find my argument for group selection.
00:27:57.640 And if they Google...well, I guess you can direct them to...but if they basically just Google
00:28:01.180 edge, pinker, what, false allure of group selection?
00:28:04.640 Yeah, yeah. That's Steve's article.
00:28:05.840 They can find that. So that's Steve's...Steve makes a strong argument against it. So I think
00:28:08.680 we can just pass it off in that way.
00:28:10.440 Yeah. I mean, so just to crib Steve briefly, the issue is that there's a lot of causality
00:28:16.760 in the world that you don't need natural selection to explain. And so merely having one tribe outcompete
00:28:23.660 another doesn't require natural selection. So like, for instance, if the Nazis had won the war,
00:28:29.900 right, and we were now living in the first century of the thousand-year Reich, this wouldn't be an
00:28:34.460 example of group selection. I mean, and the difference that would make a difference here
00:28:38.340 is almost certainly cultural and not genetic. So if the Germans had won the war, sequencing
00:28:43.360 Hitler's genome wouldn't tell us why. And yet we would still be living in a world where everyone
00:28:48.940 would now be a Nazi and the Nazis have succeeded. But here again, so when in talking about success,
00:28:55.140 the success of a group, in this case, the Nazis, we're using a metaphor here, because this is not
00:28:59.740 analogous to the success we talk about when we talk about genes spreading in a population. Because
00:29:04.600 in here, in this case, the success itself applies to the group, the Nazi party, enduring for
00:29:11.100 centuries, not to some entity at the end of generations of replicators that have been copying
00:29:16.500 themselves with some rate of mutation and then out-competing all others. So Steve argues,
00:29:21.100 I think, very strongly that it's a confusion over a metaphor. The interesting thing for me,
00:29:25.520 though, is with group selection, I think it's actually a red herring for me, because I'm happy
00:29:33.220 to assume it's true for the sake of argument, right? And it won't actually change any of the
00:29:37.060 things you and I disagree about in this space. Because it seems to me that you draw normative
00:29:41.560 claims from the fact that group selection is a fact.
00:29:45.560 And very indirectly, yes.
00:29:47.760 You seem to be saying that even if the tenets of religion are false,
00:29:51.480 right, group selection proves that religion has still been a kind of necessary social glue.
00:29:57.720 Well, hold on. Wait, wait, let me reword that. So I think, look, you and I are both atheists.
00:30:01.560 We're both naturalists. We both believe that religion is out there in the world. It's part of
00:30:08.100 human nature in some way, shape, or form, and that evolution has to do with the explanation for why
00:30:12.120 it's out there. So we're both naturalists. The question at hand is whether it does something
00:30:17.000 or confers some benefit, such that if we could rip it out, we would lose nothing or something.
00:30:22.620 And on Dawkins' view, and I think your view, if we could just get rid of it entirely,
00:30:27.180 we'd be better off. And that might be true. I don't know. But if religion is an adaptation,
00:30:33.520 as I believe it is, then it could still be true that it was necessary for getting us to where we are.
00:30:38.820 And I do believe that religion and the psychology of religion helps explain how we and only we made
00:30:44.580 the transition to living in large-scale societies of non-kin. It could still be the case that it was
00:30:50.660 useful back in tribal days, and now we've supplanted it with law and other things. So I would never say
00:30:55.780 that religion being an adaptation or the truth of multilevel selection would prove anything about how we
00:31:00.980 ought to live today. But what I do draw from it is that seeing it as an adaptation for group
00:31:07.160 solidarity and group coherence makes it easier to see some of the psychological benefits and
00:31:12.720 socio-structural benefits that might be there that are hard to see if you're a secular person on the
00:31:19.400 left. Because that is what I see, is it's really hard to understand what's good about the other side
00:31:24.940 once you're in an argument or debate with them. And from reading scholarship on religion, from reading
00:31:30.920 books, especially the book James Ault has this wonderful book, Spirit and Flesh, that really helps you
00:31:36.240 see the sociology of a small evangelical community. So that's my only point. I wouldn't say I draw
00:31:41.800 normative implications directly, but I do draw implications for what kinds of lives are happy
00:31:47.860 and satisfying, what kinds of social patterns and structures make people less selfish and more
00:31:53.520 inclined to think about others. And there, I think you just have to think twice if you're going to say
00:31:59.180 religion's just bad and it makes people do bad things, get rid of it.
00:32:01.900 Yeah. Well, so obviously I share your concern for human flourishing and us getting in a position
00:32:08.340 to tune all the dials to maximize it. I guess I was detecting in you some version of the naturalistic
00:32:15.760 fallacy, some version of saying that because this thing is natural to us and in fact selected for and
00:32:21.260 did our ancestors good, that is some argument, some weight on the balance to argue that it is in fact
00:32:27.560 good morally speaking. Oh, no, no. I'm only making the argument actually in a way that very much the
00:32:32.480 way you make in the moral landscape. If we're going to talk about human flourishing, we need a full
00:32:37.540 picture of human psychology just straight descriptively. So I think you and I differ a
00:32:42.340 little in our descriptive picture of human psychology. But beyond that, it's pretty much a
00:32:46.600 straight flourishing happiness explanation. So I see what you're saying, but I'm pretty sure I'm not
00:32:52.120 making the naturalistic fallacy by saying if it's evolved, therefore it's right or good. I'm not saying that.
00:32:56.260 Right. So it's just, if it's evolved, you would suggest that it could be harder to get rid of,
00:33:02.180 if bad, because we've all evolved to think in these ways. But one distinction I still think in
00:33:07.720 this area that divides us, at least it changes the way we tend to talk about this, is there is a
00:33:12.300 distinction in thinking about how science can touch this subject and the distinction between how we got
00:33:18.300 here, the evolutionary story of just how we came to have the brains and mental capacities we have,
00:33:24.300 and then there's a question of just what is possible given what we are. And that's, for me,
00:33:30.080 that is a, those are two distinct and totally interesting and justifiable projects, but they're
00:33:35.940 distinct and science has a very different role to play in each. And so if you're just going to do
00:33:41.340 descriptive science and talk about how we got here, yes, that has no necessary normative implications.
00:33:46.840 And many people stop there and say, well, so science can't tell you how to live. Science can just tell
00:33:50.760 you why it is you find certain things disgusting, why we've, you know, evolved to have very strong
00:33:55.800 in-group, out-group thinking. But, you know, we did not evolve to successfully build a global civil
00:34:01.700 society that's committed to human rights and the free exchange of ideas and racial and gender equality,
00:34:06.680 right? So the question is, can we accomplish this? And, you know, I think we can. But the further
00:34:11.520 question is, you know, would it be moral to accomplish it? And would it be a bad thing if we failed?
00:34:16.180 And I think, yes, we can answer yes on both of those questions. And the crucial point, though,
00:34:21.680 is that success on this front will entail overcoming a fair amount of what we've evolved to care about.
00:34:27.800 So you cite a bunch of work, I remember Putnam and Campbell being some of it, that seems to show
00:34:33.520 that religion is good for people. So in this case, it makes them more generous.
00:34:37.440 Yes. In the United States, that's right. It doesn't say globally, but yes, in the United States,
00:34:41.380 there's a lot of evidence that religion makes people happier and better citizens, according to Putnam
00:34:46.520 and Campbell. That's right. And this is the result of their belongingness to a religious community,
00:34:51.800 not their beliefs and doctrines. That's exactly right. And this increased generosity isn't just
00:34:57.460 lavished on their in-group, it actually extends to the rest of society, which would surprise many
00:35:02.240 atheists. Now, I don't actually know whether or not this is true. Let's just assume it is all true.
00:35:07.240 It seems to me that even if we accept that as true, it obviously isn't the whole story. I mean,
00:35:12.920 I think we could design a dozen invidious experiments where we show that religious people
00:35:16.960 are more homophobic, say, or sexist than secular people on average, or have a lesser understanding
00:35:22.900 of science or less respect for science. And this would help complete the picture. But I think
00:35:28.700 the problem I have with this line of thinking is that there seems to be a tacit assumption that if we
00:35:34.960 can show that religion is doing something good for people, there is no better way to get those goods
00:35:41.800 that's compatible with a truly rational worldview. That's a fine point. I agree with that. I agree
00:35:46.260 with that. But let's see. I think you raised a question that I think would be great for us to
00:35:51.300 try to work out here. I think we might come to different views. So you said, I think we both agree
00:35:56.400 that our evolved human nature did not prepare us to live in a giant, global, peaceful, egalitarian
00:36:02.560 society under rule of law, whereas in a sense, we're living above our design constraints. And
00:36:07.200 clearly, to some extent, it's possible because despite the imperfections, we're sort of doing
00:36:11.200 it nowadays. So our evolutionary past, while it makes it puts on some constraints, they're kind
00:36:16.780 of loose constraints, and we can live in all kinds of ways that we weren't designed for. But here's
00:36:20.480 where here's where I think our different views of religion would lead to different prescriptions for
00:36:25.260 how to do that. So I take part in a lot of discussions, I'm invited to all sorts of,
00:36:29.900 of sort of, you know, lefty meetings about a global society. And, and, you know, the left
00:36:36.300 usually wants, they want a global, that global governance, they want more power vest in the UN,
00:36:43.400 they, I hear a lot of talk on the left about how countries or national borders are bad things,
00:36:49.100 they're arbitrary. So the left tends to want more of a universal, I just think about the John
00:36:54.280 Lennon song, this is, you know, what I always go back to, just think about Imagine, imagine,
00:36:57.580 there's no religion, no countries, no private property, nothing to kill or die for, then it
00:37:02.300 would all be peace and harmony. So that is a sort of a far leftist view of what the end state of human
00:37:08.040 evolution, the social evolution could be. Now, is that, is that possible? Or is it, is it consistent
00:37:14.620 with our evolved nature? Now, here's the other side. The other side, the conservative view,
00:37:19.060 is that we are fundamentally groupish, more parochial creatures, and to have global governance,
00:37:27.380 and a, a bit, one giant country, or one giant community of the, of all earth, would be a nightmare,
00:37:34.100 it would be chaos, it just wouldn't work. Far better to have authority at the lowest level possible
00:37:40.960 at all times, and, and build up with nested structures. So a country ends up, for conservatives,
00:37:47.940 a country ends up being a very reasonable, basic building block, and they would not want as much
00:37:53.960 of a global society. They certainly would want international law, they would want treaties,
00:37:58.160 they'd want all sorts of things. So I think this is a case where, if you, if you have a kind of a blank
00:38:02.780 slate view, or a very positive view that our basic nature is love and cooperation, and it's only
00:38:08.280 capitalism that screwed it up, you're going to want a kind of a John Lennon vision of the future,
00:38:13.340 and I don't think that that could really work. Whereas, if you start with Edmund Burke,
00:38:18.620 who talks about the little platoons of society, we start, we, we develop in the family. So
00:38:23.900 conservatives are really, really focused on the family and lower level institutions. And if you
00:38:28.200 focus on making those strong, and then you think about some sort of a legal and social architecture
00:38:33.360 that allows multiple families and communities and states and countries to work together with
00:38:39.960 a minimum of friction, I think that's much more workable. So getting a correct view of
00:38:45.760 our evolutionary heritage and the psychology that resulted from it, I think is very helpful. It
00:38:51.320 doesn't tell us what's right or wrong, but I think it does tell us which way is more likely to work.
00:38:55.440 And if you see us as products of multi-level selection, with a deep, deep tribalism, that
00:39:03.360 suggests that you're probably better off going for more the Burke way, and having groups that are
00:39:09.180 composed of groups and finding ways for them to work together, rather than the John Lennon way,
00:39:13.300 which is let's erase all group boundaries. Let's erase divisions of nation and everything else and
00:39:19.920 just have one giant planet. You know, I just don't think that's likely to work. I think that is,
00:39:26.360 like as with the communist societies, it's making assumptions about human nature that end up, you
00:39:31.780 know, people just refuse to live that way, and it's a disaster. One thing about what you said that
00:39:35.760 I want to pull back to the focus on religion is just that you're essentially exposing some of the
00:39:41.280 holes in secular thinking. And I agree those holes are there. In fact, I've written two books that
00:39:46.460 attempt to shore up some of the weaknesses I see in secularism. And what you just said relates to
00:39:52.860 this very topical example of the recent migrant crisis in Europe, where you have, you know, secular
00:39:58.620 liberals for the most part, and, you know, atheists who really can't find a rationale, morally speaking,
00:40:06.320 for anything less than an open borders policy. And in fact, so there's two reasons here, there's two
00:40:11.880 connections here, because there's this low birth rate in Europe, and many people attribute that to
00:40:17.360 secularism. The loss of religion is really leading to a loss of babies. And that becomes a justification
00:40:25.000 for bringing in immigrants, because they need people to work in these societies. So one could argue
00:40:31.340 that for two reasons, both economically and morally, secularists are now in a position, you know,
00:40:38.360 someone like Angela Merkel, where they're unable to find a reason to keep the borders closed. And
00:40:44.180 let's just say that this happens where you have millions upon millions of Muslims who on balance
00:40:48.960 are deeply religious and disposed to have large families, they flood into Europe over the next few
00:40:53.640 decades. And in 100 years, Europe is predominantly Muslim and deeply religious, right? This is a possible
00:40:59.320 counterfactual or actual history. So what lesson should we draw from this? Many people would conclude
00:41:05.380 that what Europeans needed in the year 2016 is more Christianity, right? That only a belief in Jesus
00:41:13.400 and the associated behavior and belongingness that that confers, and the fertility rates that get
00:41:19.280 associated with a taboo around contraception, that only that could have protected them from the sweeping
00:41:25.480 changes in their society. And I would just argue is that there must be a truly rational way for secular
00:41:31.780 people just to figure out what sort of world they want to live in and simply build it.
00:41:36.560 Yep, I totally agree, Sam. And I think this is a nice example for us to talk about because I think
00:41:41.260 you and I both are wary of...
00:41:43.780 If you'd like to continue listening to this conversation, you'll need to subscribe at
00:41:49.040 samharris.org. Once you do, you'll get access to all full-length episodes of the Making Sense podcast,
00:41:59.480 along with other subscriber-only content, including bonus episodes, AMAs,
00:41:59.480 and the conversations I've been having on the Waking Up app. The Making Sense podcast is ad-free
00:42:04.360 and relies entirely on listener support. And you can subscribe now at samharris.org.