Making Sense - Sam Harris - November 20, 2017


#105 — Richard Dawkins, Sam Harris, and Matt Dillahunty


Episode Stats

Length

38 minutes

Words per Minute

151.1

Word Count

5,779

Sentence Count

318

Misogynist Sentences

1

Hate Speech Sentences

2


Summary

In this episode of the Making Sense Podcast, recorded live in Vancouver, Sam Harris and evolutionary biologist Richard Dawkins are joined by Matt Dillahunty. Matt opens with an account of a recent debate at a high school against a preacher who caricatured evolution as "banging sticks and rocks together and getting a puppy," which leads into a discussion of willful scientific ignorance and the role of randomness in evolution: mutation is random only in the sense that it is not directed toward improvement, while natural selection is quintessentially non-random. The conversation then turns to the still-unsolved problem of the origin of life, the claims of so-called sophisticated theology, the sense of self and whether losing it is desirable, the evolutionary puzzle of consciousness, and the tribal origins of altruism and how the circle of moral concern might be expanded. This recording includes only the first part of the conversation; full episodes are available to subscribers at samharris.org.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.260 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.900 So if you enjoy what we're doing here, please consider becoming one.
00:00:58.360 We clearly waited too long to come to Vancouver.
00:01:19.740 Thank you.
00:01:20.780 Amazing.
00:01:21.080 So a quick rundown on this evening's event.
00:01:25.040 There's a couple microphones set up.
00:01:26.700 We will be getting to questions from you guys a little later on.
00:01:29.320 We're going to chat for, you know, however long we feel like it, but we want to make sure
00:01:33.360 there's time for questions after that.
00:01:35.820 So good to see you both again.
00:01:38.280 Yeah, my pleasure.
00:01:38.980 I'm really enjoying kind of this series of events, and I thought today we'd start off
00:01:43.740 in a different direction that's all about me.
00:01:47.320 No.
00:01:48.880 It's actually a question that I think both of you are going to have really good input on.
00:01:53.440 I did a debate a couple weeks ago against a preacher who seemed to have not only no understanding
00:02:00.600 of science, but no appreciation for it, didn't care, didn't care if he was fairly representing
00:02:07.500 it.
00:02:07.880 As a matter of fact, I think there's a chance you might have stood up and accosted him
00:02:12.760 at some point, because he literally stood in front of me and said, oh, that evolution
00:02:18.100 stuff, it's not like anybody's ever banged sticks and rocks together and got a puppy.
00:02:22.440 He said this twice during the debate.
00:02:29.000 The first time, we're in a debate structure, so I'm trying not to interrupt.
00:02:34.220 You know, I need to follow the rules of debate.
00:02:37.000 And the second time, I just halted and jumped right in, and I was like, you're right, that's
00:02:42.500 never happened, and no scientist has ever portrayed anything like that happening.
00:02:47.220 And luckily, we were in a high school, and the students seemed to get it.
00:02:49.600 But how do we work past not only just this willful scientific ignorance, but this, we
00:02:57.760 seem to have built communities where we haven't instilled any appreciation for it, or any appreciation
00:03:03.000 to treat it reasonably.
00:03:06.260 Let's just throw up a straw man and call it nonsense.
00:03:10.200 I don't often quote Tony Blair.
00:03:12.780 But he said, education, education, education.
00:03:23.480 There is staggering ignorance of what evolution is all about.
00:03:29.540 And, hello?
00:03:30.980 I think we're living in a simulation right now, and it's failing us.
00:03:41.980 So, Richard, what do you do with this underlying misunderstanding of the role of randomness in
00:03:48.000 evolution?
00:03:49.200 Can you inoculate us against that problem?
00:03:51.360 Well, mutation is random, only in one sense, actually.
00:03:54.260 Mutation is random only in the sense that it's not directed towards improvement, specifically.
00:03:58.520 It's non-random in other senses.
00:04:01.420 Natural selection is quintessentially non-random.
00:04:04.760 That's exactly what natural selection is.
00:04:06.840 Anybody who thinks that you could possibly explain the beauty and the elegance of living
00:04:12.420 things by some kind of random process would be stark-raving bonkers.
00:04:17.420 Anybody who thinks that we think that has got to be stark-raving bonkers.
00:04:21.100 Of course it's not random.
00:04:22.480 The whole point of the scientific enterprise in this case is to find an escape from randomness,
00:04:28.220 is to find a solution to the problem of how you get these staggeringly non-random things
00:04:34.420 which are living creatures out of the laws of physics.
00:04:39.360 And that's what we're about.
00:04:41.560 I mean, to explain that by postulating a creator.
00:04:45.440 Now, that is almost resorting to randomness.
00:04:49.340 That's saying that complexity, non-randomness is another word for complexity,
00:04:55.940 comes into being spontaneously by sheer luck.
00:05:01.320 God just happened to be there.
00:05:03.320 What natural selection, what evolution does,
00:05:05.780 is to explain how you get there from simple beginnings, which are easy to understand,
00:05:10.440 and how you work up gradually, gradually, gradually, up a kind of ramp of improvement
00:05:15.140 until you get to complexity.
00:05:17.800 That's the whole point.
00:05:20.940 We're trying to escape from randomness,
00:05:23.280 and natural selection is the only escape that anybody has ever suggested that will work.
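
[Dawkins' description of working "up a kind of ramp of improvement" is the idea behind his well-known "weasel" demonstration from The Blind Watchmaker, which contrasts single-step chance with cumulative selection. A minimal Python sketch of that demonstration follows; the target phrase, mutation rate, and brood size here are illustrative teaching devices, not a model of actual biology, and real selection has no fixed target.]

```python
import random
import string

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = string.ascii_uppercase + " "

def score(candidate: str) -> int:
    """Count positions that match the target phrase."""
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(parent: str, rate: float = 0.05) -> str:
    """Random step: copy the parent, flipping each character with small probability."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in parent)

random.seed(1)
parent = "".join(random.choice(ALPHABET) for _ in TARGET)  # sheer luck: ~1 in 27^28
generation = 0
while parent != TARGET:
    generation += 1
    # Non-random step: keep the fittest of the parent plus 100 mutated offspring.
    parent = max([parent] + [mutate(parent) for _ in range(100)], key=score)
print(f"reached the target in {generation} generations")
```

[Single-step chance would need on the order of 27^28 tries to hit the phrase; cumulative selection reaches it in a few dozen generations, which is the sense in which selection is an "escape from randomness."]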
00:05:29.560 It strikes me, I was thrilled that students,
00:05:32.900 this was in a public high school, although I believe it was a charter school,
00:05:35.600 because it's going to be unlikely that a regular state-sponsored public school
00:05:40.920 is going to invite me in to debate a preacher,
00:05:43.100 although it was a debate class.
00:05:45.980 But I was inspired that the students seemed to catch on to what was going on,
00:05:50.920 so at least I'm a little optimistic that they were reasonably educated on the subject.
00:05:56.960 But how do we deal with adults, this minister?
00:06:02.500 He's not going to go back to school.
00:06:03.820 He's not going to pay any attention to us.
00:06:06.900 What did he actually say?
00:06:08.080 I didn't quite hear the final word of what he said.
00:06:11.380 He portrayed evolution as if scientists were saying
00:06:14.840 that you bang sticks and rocks together and you get a puppy.
00:06:19.420 That's sort of ridiculous, over the top.
00:06:23.740 That's going to be a meme, that face right there.
00:06:31.740 I'm just lost for words.
00:06:33.080 Although, truth be told, the details of procreation are almost that strange.
00:06:40.500 I mean, if you've ever had a child, it could not be more alien.
00:06:47.560 If we watched a horror movie, and this is how the aliens produced their offspring,
00:06:53.660 it could not be made stranger than it is.
00:06:57.000 That was not an anti-sex tirade, by the way.
00:07:03.740 That was just...
00:07:04.740 If anyone thinks that the great majority of scientists are so utterly idiotic and naive
00:07:13.260 that they think that the way you get life is by banging sticks together and stones together,
00:07:18.260 I mean, doesn't it give him pause to think that actually the vast majority of scientists have
00:07:24.020 a fully coherent theory that fills library shelves and volumes of books about it?
00:07:32.100 If it was that simple, if we're just banging sticks together, that's not the way it would work.
00:07:38.560 What do you do with the underlying improbability of the whole process getting started in the first place?
00:07:44.520 The tornado going through a junkyard and assembling a fully working 747 argument?
00:07:48.720 The first step, the origin of the first self-replicating molecule, the origin of the first gene,
00:07:55.840 that was a necessary first step before natural selection could get started.
00:08:00.060 And that is a step that nobody has yet solved.
00:08:02.840 There are quite a lot of theories about it.
00:08:05.500 We may never know for certain, because it happened a very long time ago.
00:08:10.040 We know the kind of thing that must have happened.
00:08:13.460 And that is a big barrier.
00:08:15.280 That is one of the main questions that remains.
00:08:17.860 Once that's happened, that was a fairly simple start.
00:08:22.340 Once that's happened, then the whole panoply of life,
00:08:24.720 the whole branching, complexifying beauty of life then gets going.
00:08:30.500 We do need a theory of the origin of life.
00:08:33.420 But once that starts, then everything else follows with great logic and persuasiveness.
00:08:41.860 And of course, until we get to the point where we have a good understanding,
00:08:44.840 then the answer that we should give is we don't know yet, rather than pretending that we do
00:08:49.780 and that there's some, you know, god-like governing force.
00:08:53.220 Exactly.
00:08:53.880 Scientists are...
00:08:54.840 Yeah.
00:08:56.160 We like to say we don't know, because that gives us something to do.
00:09:03.220 It's incredibly good job security for the curious.
00:09:07.160 One of the things that troubled me is having...
00:09:10.160 All of us have dealt with religious-minded individuals in debate-type formats.
00:09:16.580 Here's a preacher who knew nothing.
00:09:20.120 And it was proudly on display.
00:09:22.940 And there's a part of me that says,
00:09:27.060 should this individual be allowed to speak to children at all?
00:09:33.140 And yet I have to defend this idea of freedom of expression,
00:09:37.580 that people get to share their ideas.
00:09:39.800 And that puts us in a place where we're constantly in a battle of ideas.
00:09:44.880 How badly informed should somebody be before we just stop paying attention to them
00:09:51.940 and work on the people who perhaps are reachable?
00:09:55.740 Well, the problem in that case is that the preacher represents, in the U.S.,
00:10:02.180 what, 35%, 45% of the population, depending on what his convictions are.
00:10:08.640 So it's not...
00:10:09.980 You can ignore the preacher, but you can't ignore the fact that a significant minority,
00:10:15.520 and on some questions a majority of Americans,
00:10:18.200 hold just patently absurd ideas.
00:10:21.980 So it's the ideas that really matter.
00:10:24.980 He knew nothing, but he was proud of knowing nothing, it sounds to me.
00:10:30.160 A lot of us are ignorant of lots of things.
00:10:33.080 I mean, I'm ignorant of very many things, and I'm sure you are as well.
00:10:36.560 But we don't pontificate...
00:10:38.680 I've never heard it put so nicely.
00:10:42.360 But we admit when we're ignorant,
00:10:46.220 and we don't try to pontificate about things of which we know nothing.
00:10:51.120 Whereas he was doing exactly that.
00:10:53.880 In a way, it wasn't so much that...
00:10:55.680 I don't think he thinks he's ignorant.
00:10:57.260 I don't think he's proud of his ignorance.
00:10:59.360 I think he thinks he's convinced he has the right answer,
00:11:02.100 and that we are all engaged in a scientific fairy tale.
00:11:05.360 So there's an extra layer of smug superiority over the top of it,
00:11:09.980 where he gets to dismiss the work of countless scientists
00:11:14.600 that have taught us the best current understanding of the diversity of life,
00:11:20.340 and he gets to shrug it off with sticks and rocks.
00:11:24.320 Well, if we ever have to convene gatherings like this in hell,
00:11:28.360 we'll know we did something wrong.
00:11:30.080 I'm pretty sure a part of that was in hell,
00:11:38.100 but I maintained my composure.
00:11:41.180 I mean, that really is the thing.
00:11:42.460 That's what completely changes the equation.
00:11:45.200 The moment you believe you are certain,
00:11:48.420 or even just have very good reason to believe,
00:11:51.180 that this life is just a way station
00:11:54.300 on the way to some eternity
00:11:56.900 that you could get very, very wrong or very, very right,
00:12:00.240 depending on what you believe.
00:12:01.360 Just that being your master algorithm,
00:12:05.300 that makes a mockery of every pretense to human knowledge,
00:12:11.200 no matter how technologically useful it is.
00:12:14.380 It doesn't matter if we cure cancer with some future biology,
00:12:19.020 and prayer has never worked.
00:12:21.000 If you believe in heaven and hell,
00:12:22.800 that really governs everything, it seems.
00:12:26.140 In a way, I don't think I mind his believing what he believes.
00:12:30.300 What I mind is his thinking we believe what he thinks we believe.
00:12:33.420 Yes.
00:12:34.980 Because how could anybody be so stupid as to think that you could...
00:12:38.320 He simultaneously presented a straw man of evolution
00:12:44.220 and evolutionary scientists,
00:12:46.500 and anybody who fell into that, you know,
00:12:48.620 I accept reality.
00:12:50.320 I'm going to straw man you all with sticks and rocks.
00:12:53.800 Now, we can laugh at it, and, you know,
00:12:56.580 if you feel like laughing at it some more, by all means,
00:12:58.620 there's been lots of discussion about how best to engage on these.
00:13:03.820 How much, for lack of a better phrase,
00:13:06.000 how big of an asshole should you be?
00:13:07.540 How much pushback should there be?
00:13:09.340 How seriously should you take them?
00:13:10.900 And quite frequently, someone will come up
00:13:13.140 and present the idea that there are sophisticated theologians,
00:13:18.200 that this preacher that I had a debate with is in one category,
00:13:23.140 and some other academic erudite theologians
00:13:28.540 are in another category.
00:13:30.300 Is that the case?
00:13:33.140 Well, there are sophisticated theologians
00:13:35.820 who accept evolution, of course,
00:13:39.000 and have no problem with that.
00:13:41.220 And so our argument with them is a quite separate argument.
00:13:45.200 I have met sophisticated theologians who believe pretty astonishing things,
00:13:52.880 like believing literally that Jesus turned water into wine.
00:13:58.560 And I thought sophisticated theologians had written all that stuff off
00:14:01.760 and said, oh, no, that's just metaphor,
00:14:03.360 that's just a nice story.
00:14:06.680 We don't really believe that anymore.
00:14:07.920 But I have spoken to very, very highly qualified, sophisticated theologians,
00:14:13.660 highly educated,
00:14:15.880 that accept evolution totally,
00:14:17.820 but yet they think Jesus turned water into wine
00:14:20.220 and walked on water
00:14:21.500 and rose from the dead
00:14:23.820 and was born of a virgin.
00:14:25.260 All very unscientific ideas.
00:14:31.040 And still they call themselves sophisticated theologians.
00:14:34.780 Well, first we should acknowledge that sophistication is better
00:14:37.600 insofar as it means moderation
00:14:39.800 and less of a commitment to the most dangerous ideas.
00:14:43.500 But my problem with so-called sophisticated theology
00:14:47.380 is that no one ever admits where the sophistication is coming from.
00:14:51.640 It's coming from a loss of faith in specific doctrines.
00:14:56.800 I mean, it's getting hammered into them from the outside.
00:15:02.740 It's coming from science and a modern conception of ethics,
00:15:09.020 a universal conception of human rights,
00:15:11.320 a sense of how unseemly it is to think that anyone,
00:15:15.900 by virtue of being born in the wrong place,
00:15:19.020 is going to spend eternity in hell
00:15:20.420 just because they didn't happen to hear the good word
00:15:22.660 from their parents.
00:15:23.840 So they lose their purchase on those dogmas,
00:15:28.340 and yet they retain this conviction
00:15:29.940 that Jesus was born of a virgin
00:15:31.900 or was resurrected and will be coming back.
00:15:34.660 And those are just the...
00:15:36.020 It's a God of the gaps argument in certain cases,
00:15:40.020 but it's a...
00:15:41.260 There's just certain questions
00:15:42.560 where science hasn't yet closed the door to belief,
00:15:47.560 and so they're putting all of their chips on those questions.
00:15:51.700 We might have slightly different views
00:15:53.760 of what a sophisticated theologian is,
00:15:57.080 which is probably a testament
00:15:58.840 to how it's actually not sophisticated theology,
00:16:02.000 but obfuscated theology.
00:16:04.220 Because when I hear someone say,
00:16:06.020 oh, you know, you take calls on the atheist experience
00:16:09.380 and you get people who couldn't present
00:16:11.840 a reasonable argument at all,
00:16:13.220 why don't you take on real sophisticated theologians?
00:16:16.360 And my answer is I always tell them to call in,
00:16:18.260 here's the phone number,
00:16:19.400 they can call in whatever week they want.
00:16:21.580 And they'll say, well, you know,
00:16:22.820 oh, but here's this, you know,
00:16:24.520 academic who's presented this particular version
00:16:26.800 of the ontological argument,
00:16:28.840 the moral argument.
00:16:29.840 And it's, you know,
00:16:30.640 you've got Ray Comfort,
00:16:31.620 the banana man on one hand,
00:16:33.120 and they pretend that there's something superior
00:16:36.260 with regard to argumentation on the other.
00:16:38.920 And the many years I've been hosting the show
00:16:41.340 and doing debates,
00:16:42.100 what I find is what gets labeled
00:16:43.900 as sophisticated theology
00:16:45.080 is the exact same thing.
00:16:47.380 It's not like the arguments
00:16:48.780 of these sophisticated theologians
00:16:50.280 are any more sound
00:16:51.500 than the arguments of Ray Comfort.
00:16:53.480 It's just that they're better speakers.
00:16:55.540 They have a better way to communicate.
00:16:56.500 They're actually less sound in one way
00:16:57.700 in that they don't,
00:16:58.740 so the belief system is still anchored
00:17:02.140 to a belief in revelation.
00:17:05.280 They're still fixated on the text,
00:17:06.800 but they have ignored much
00:17:10.080 of what seems untenable in the text,
00:17:12.540 and they don't have an argument
00:17:13.940 about why that's okay.
00:17:15.300 Because if God wrote any of these books,
00:17:18.040 and nowhere in the book does God say,
00:17:20.440 well, you could ignore the first half
00:17:21.860 because now I'm getting to the good part.
00:17:26.780 It's all God's words.
00:17:28.340 It's actually a less principled position
00:17:30.320 than fundamentalism.
00:17:32.240 And that's why it's always, in my view,
00:17:34.440 unstable in the face of fundamentalism
00:17:36.720 because the fundamentalist
00:17:37.820 always has the advantage of saying,
00:17:39.600 listen, I'm going to read the whole book.
00:17:41.160 I'm going to take the most plausible
00:17:43.800 interpretation of it.
00:17:44.840 I'm going to read every word
00:17:45.880 as literally as possible.
00:17:47.580 And that always begins to fixate
00:17:51.040 on more divisive, more doctrinaire,
00:17:53.820 more irrational ideas.
00:17:56.000 At least with a fundamentalist,
00:17:57.380 you know what you're arguing against.
00:17:58.840 You're not arguing against a wet sponge.
00:18:01.380 No, there's a, it seems perverse to say it,
00:18:06.040 but there's actually more integrity
00:18:08.060 to the most fundamentalist position
00:18:11.380 because there's simply one irrational move,
00:18:14.940 which is the belief that this book
00:18:17.260 is perfect in every word.
00:18:19.240 But the moment you believe that,
00:18:20.700 well, then it is in fact rational
00:18:22.720 to try to connect all the dots
00:18:24.200 as reasonably as possible.
00:18:25.960 But sometimes they really don't say anything.
00:18:29.780 They say something like,
00:18:30.700 well, God is the ground of all being.
00:18:34.200 Right.
00:18:34.740 Or God is the essence of is-ness or something.
00:18:39.160 Well, I have a soft spot for that kind of...
00:18:42.260 Ah, yes.
00:18:44.420 I mean, I don't like the theistic version of it,
00:18:46.840 but this is perhaps the only argument
00:18:49.860 I can adduce in favor of so-called
00:18:52.340 sophisticated theology,
00:18:53.760 which is there's an experience
00:18:55.780 that people have,
00:18:57.920 you know, Christian contemplatives, say,
00:19:00.280 or really contemplatives in any tradition
00:19:03.820 and have had for millennia,
00:19:05.700 which does provoke those sorts of noises from people.
00:19:10.840 I mean, the problem is you get far enough
00:19:13.200 into any of these contemplative traditions
00:19:14.780 and everyone begins to sound like a Buddhist.
00:19:17.680 And then they, you know,
00:19:18.620 if you're in the 14th century in Christendom,
00:19:21.860 the inquisition shows up at your door,
00:19:24.920 as they did to Meister Eckhart,
00:19:27.540 who happily died of natural causes just in time.
00:19:31.440 But there's an experience that people have
00:19:33.960 of, you know, losing their sense of self, say,
00:19:37.600 and feeling at one with the universe or the world,
00:19:41.700 or having some kind of ethical,
00:19:43.320 just a full ethical reboot of their hard drive
00:19:47.880 where they feel love that they didn't know was possible,
00:19:51.100 right?
00:19:51.360 That kind of self-transcending love.
00:19:53.580 Yeah, I'd enjoy that, I think.
00:19:55.940 Well, yeah, so...
00:19:56.800 I'm not sure I would.
00:19:58.860 We can help.
00:19:59.180 And I'm not sure that it's good.
00:20:00.200 What is it that's a good thing
00:20:01.860 about losing one's sense of self?
00:20:04.880 That's a big question.
00:20:06.060 Well, when you look at
00:20:07.260 just the mechanics of your own suffering,
00:20:10.220 when you look at just what self-concern
00:20:12.420 gets you psychologically,
00:20:14.920 you begin to...
00:20:15.920 you can begin to feel that
00:20:17.600 most of your suffering
00:20:21.660 is not actually...
00:20:23.340 it's not directly tied to bad things happening.
00:20:25.920 It's tied to all this whole machinery of self-concern,
00:20:31.000 you know, anxiety about the future
00:20:32.440 and regret about the past
00:20:33.820 and worries about what people said of you
00:20:36.980 or think of you
00:20:37.720 or will think of you.
00:20:38.860 And so much of our neurosis
00:20:40.860 is taking place
00:20:42.940 just in the conversation
00:20:44.080 we're having with ourselves.
00:20:45.420 And that's all predicated on
00:20:47.820 the legitimacy of this starting point
00:20:49.880 of feeling like there's a self
00:20:51.060 riding around in the head
00:20:52.740 who is carried through
00:20:54.620 from one moment to the next in life.
00:20:56.520 You are the same person
00:20:57.820 you were yesterday.
00:20:59.360 So the thing that embarrassed you yesterday
00:21:01.180 that you're now remembering
00:21:02.400 and now feels terrible
00:21:04.180 is the psychological continuity there
00:21:07.840 and the durable continuity
00:21:09.520 that seems to mandate
00:21:11.060 that you suffer over
00:21:12.580 precisely the thing
00:21:14.120 that you were suffering over yesterday
00:21:16.600 because you are that same self
00:21:18.140 carried through moment to moment.
00:21:19.600 Just have everybody watch Frozen
00:21:20.920 and you can just let it go.
00:21:22.440 Right, well, yes.
00:21:23.680 Because I think the sense of self
00:21:26.040 is actually something
00:21:26.620 that's incredibly valuable
00:21:27.560 that, you know,
00:21:29.220 we have a preservation motivation,
00:21:31.580 we have a desire to understand
00:21:34.220 the world that we inhabit.
00:21:35.540 That is,
00:21:36.680 it may even be the case
00:21:37.920 as I've argued and others have
00:21:40.400 that there's no such thing
00:21:41.680 as altruism in a true sense
00:21:42.980 but that you could have altruism
00:21:44.540 from a purely selfish standpoint
00:21:46.080 and still do good.
00:21:48.460 Yeah, but I wouldn't call it,
00:21:49.460 I mean, that begins to play
00:21:51.080 with the boundaries of the, quote, self.
00:21:53.900 So the moment you begin to feel
00:21:56.560 that your selfishness extends
00:21:59.800 to everyone being happy, right?
00:22:03.680 Because you actually care about everyone, right?
00:22:06.140 And you feel better
00:22:06.920 when you see people smiling
00:22:08.240 rather than, you know, weeping.
00:22:10.700 If you extend the circle
00:22:12.180 of your self-concern to everyone,
00:22:13.840 well, then that's not normal selfishness.
00:22:17.040 That's, you know, sainthood in a religious sense.
00:22:19.020 If I'm doing it because I feel that good
00:22:21.080 when people smile,
00:22:22.100 that doesn't mean I necessarily care about them.
00:22:23.880 It means I might care about
00:22:24.840 that good feeling that I get.
00:22:26.520 Yeah, except the I get part
00:22:29.380 is vulnerable to inspection.
00:22:31.500 I mean, the sense that there's an I
00:22:33.020 who's appropriating that in every moment is,
00:22:35.620 it's just, it's a project
00:22:37.420 which can be accomplished in a moment
00:22:39.940 or you can fail to accomplish it
00:22:42.420 after many years of looking.
00:22:43.860 But there's the sense that there is a,
00:22:47.340 it's useful to define what we mean by self
00:22:49.600 because most people don't feel
00:22:51.240 identical to their bodies.
00:22:53.600 So when I say,
00:22:54.380 when I say the self doesn't exist,
00:22:56.400 I'm not saying that people don't exist.
00:22:58.040 I'm not saying that, you know,
00:22:59.260 nobody's here and, you know,
00:23:00.780 this is all an illusion.
00:23:02.200 And there are contemplative
00:23:04.780 and religious and spiritual traditions
00:23:06.420 that can sound like
00:23:07.560 they're saying something very much like that.
00:23:09.680 I'm saying that the sense that we all have
00:23:12.240 of being a subject in the head,
00:23:15.720 riding around in the body
00:23:17.320 as though it were a kind of vehicle, right?
00:23:19.140 Because this really is most people's starting point.
00:23:21.100 They don't feel truly coterminous with their bodies.
00:23:24.800 They feel like they're in their head
00:23:26.540 and that, you know,
00:23:27.700 their hands are down there in some sense.
00:23:30.320 And that sense of being a subject in the head
00:23:32.300 is vulnerable to inspection.
00:23:33.840 You can lose that sense.
00:23:35.160 And on one level,
00:23:35.780 you can just be identified with your body.
00:23:38.380 I mean, that is actually progress.
00:23:40.600 Simply to feel like a body in the world
00:23:43.080 is different from the way most people feel.
00:23:46.880 Most people are kind of,
00:23:48.060 and this is what we're running into,
00:23:49.540 most people are common sense dualists.
00:23:51.180 They feel like the mind
00:23:52.440 can't possibly be identical to the brain.
00:23:55.720 The mind is something altogether different
00:23:57.520 and it just feels like it's in the head
00:24:00.560 as a kind of,
00:24:01.980 there's a sort of locus of attention
00:24:03.780 that's emanating from the head.
00:24:05.600 But this body is a machine that can malfunction
00:24:08.900 and it's changing over time.
00:24:11.100 It's clearly not what I am.
00:24:13.640 And I am probably a soul then.
00:24:15.540 I'm probably spirit.
00:24:16.780 I can probably drift off the brain at death.
00:24:19.760 And that sense,
00:24:21.860 and all the ways in which that sense
00:24:23.360 can be played with by fasting or prayer
00:24:27.300 or meditation or psychedelics
00:24:29.380 or getting crazy ideas
00:24:31.700 that you find emotionally very animating,
00:24:34.580 sort of adventures you can have in dualism
00:24:36.800 are part of the problem here.
00:24:39.120 Adventures in dualism
00:24:40.240 should be the title of your next book.
00:24:41.540 I don't know if you want to jump in on that at all.
00:24:49.160 I have nothing to contribute, no.
00:24:50.420 Yeah.
00:24:53.280 I sit here and I listen to this
00:24:55.740 and I think there's like a four-hour
00:24:57.960 fascinating-for-me conversation.
00:25:00.560 You might not think so much
00:25:02.240 because I have no problem with the idea
00:25:03.900 that the mind is the brain.
00:25:05.300 But I know that there are people who do.
00:25:08.840 But it doesn't feel that way.
00:25:10.520 I know.
00:25:10.820 I know it doesn't feel that way.
00:25:12.160 And I don't know that it's necessary.
00:25:14.840 And I don't know what the right path is.
00:25:16.340 I don't know that, for example,
00:25:19.340 losing this sense of self
00:25:23.700 could be a great thing.
00:25:25.680 Well, one thing I would add
00:25:27.360 is that you lose it all the time
00:25:29.140 because it actually isn't there.
00:25:31.940 I mean, you are losing it all the time.
00:25:34.800 How can you lose something that's not there?
00:25:36.180 It always seems there retrospectively.
00:25:38.120 But when you're really paying attention to something,
00:25:40.560 you know, when you're so-called lost in your work
00:25:42.920 or you're lost in some athletic task
00:25:45.700 or you're just lost in thought,
00:25:47.620 like you're actually,
00:25:48.480 you're thinking about something
00:25:49.900 and you're not aware that you're thinking.
00:25:52.140 This sense of our own kind of central presence
00:25:56.240 in our heads is constantly being undercut
00:25:59.120 by attention being diverted to something out in the world
00:26:02.620 or to some experience.
00:26:04.660 And you can become increasingly sensitive
00:26:06.720 to how it's being interrupted.
00:26:09.380 I would love to get to the truth
00:26:11.360 and I love the fact we're on the pursuit,
00:26:13.060 but irrespective of what the truth is,
00:26:14.820 Richard, something like consciousness,
00:26:16.160 which we still,
00:26:18.180 some would say we understand
00:26:19.320 and some would say we don't.
00:26:20.700 I think we don't.
00:26:21.500 But what would be the evolutionary advantage
00:26:27.100 and the process by which we get to consciousness
00:26:29.960 as we seem to have it
00:26:32.100 that might distinguish us from other animals?
00:26:35.920 It's a big mystery
00:26:37.300 because you could build an animal
00:26:41.280 which did all the sophisticated things
00:26:43.300 animals have to do,
00:26:44.320 hunting for food,
00:26:45.980 avoiding predators,
00:26:47.400 looking for mates,
00:26:48.300 doing everything that an animal has to do
00:26:49.720 in order to survive and propagate its genes.
00:26:52.980 And I don't think it would have to be conscious at all.
00:26:56.180 I think it could all be done
00:26:57.740 in the way that a computer would do it.
00:27:00.320 I mean, when you talk to Siri or Alexa,
00:27:03.720 they sound conscious,
00:27:04.740 but you know they're not.
00:27:06.740 And for an animal to survive with a nervous system,
00:27:11.760 it doesn't, it seems to me,
00:27:13.180 need to be conscious.
00:27:14.880 And I'm very glad I'm conscious
00:27:17.840 and I'm pretty sure you are as well.
00:27:20.260 I think other people.
00:27:23.280 I'm not a solipsist.
00:27:26.300 But I do find it a bit of a mystery
00:27:30.080 why we have consciousness at all.
00:27:32.500 Yeah, I would agree.
00:27:33.880 I think it's, as I wrote somewhere,
00:27:36.600 it's the one thing in this universe
00:27:38.460 that can't be an illusion,
00:27:40.020 including the universe.
00:27:41.620 I mean, this universe could be a simulation
00:27:44.520 on some alien hard drive.
00:27:46.560 I think Descartes said something similar.
00:27:48.360 Although I got 40 emails last week
00:27:50.220 that says we've proved that that's not true,
00:27:52.260 and I don't necessarily buy those emails either.
00:27:55.320 That's exactly what the simulators would say
00:27:57.360 if you were...
00:28:00.620 But consciousness as just the felt sense
00:28:08.800 that something is going on,
00:28:10.360 the fact that there's an experiential quality,
00:28:13.340 whatever this is,
00:28:14.500 whether you're a brain in a vat,
00:28:15.920 whether you're in the matrix,
00:28:16.920 whether consciousness is being produced
00:28:18.520 by information processing in your head,
00:28:22.060 as seems reasonable to believe,
00:28:24.720 consciousness is always the first fact
00:28:27.120 before any other facts can be discussed.
00:28:30.640 And it's also the most important thing
00:28:32.820 in the universe.
00:28:33.560 It's the only thing that makes,
00:28:35.120 at least in my view,
00:28:36.120 it's the only thing that makes
00:28:36.940 the universe important,
00:28:38.580 the fact that the lights are on,
00:28:40.940 the fact that it's possible for the lights to be on.
00:28:42.840 If you told me there's a universe somewhere
00:28:44.340 that's got stars and planets,
00:28:47.020 but the constants of nature are tuned
00:28:49.400 just a little awry
00:28:50.840 so that conscious life is impossible,
00:28:53.700 that is a deeply uninteresting universe.
00:28:58.400 And consciousness is the only ground
00:29:00.600 for any moral dimension to our lives, too.
00:29:02.820 And yet, I'm with you in feeling that
00:29:05.560 it's not clear that it does anything.
00:29:07.800 It's not clear how it would be selected for.
00:29:10.540 Because if you just look at your own experience,
00:29:14.400 everything that you're conscious of,
00:29:16.700 anything that you seem to be consciously deciding,
00:29:19.320 or any place where it seems that consciousness
00:29:21.460 is necessary to integrate information behaviorally,
00:29:25.260 you know, to have a complex goal,
00:29:26.660 say, for someone to say to you,
00:29:28.320 well, you should really get to the Orpheum Theater
00:29:30.420 at 8 o'clock to hear this talk,
00:29:32.580 for that to become a behavioral plan,
00:29:35.120 let's just say that that is, in fact,
00:29:37.240 something that can only be done consciously
00:29:39.180 in apes like ourselves.
00:29:40.740 Still, it's not clear why,
00:29:44.400 well, as you said,
00:29:45.380 it's not clear that that should be the only way
00:29:47.440 that it gets accomplished,
00:29:48.900 and we could easily build robots,
00:29:50.720 one presumes,
00:29:51.440 that could do these things
00:29:53.420 without it being something that it's like
00:29:55.640 to be those robots.
00:29:57.200 But even in our own case,
00:29:59.280 if consciousness really is just what it is
00:30:02.540 at the level of our neurophysiology,
00:30:05.180 it's only effective in virtue of what it is
00:30:09.540 at the level of neurophysiology.
00:30:11.020 So the fact that there's a subjective side to it
00:30:14.120 doesn't matter.
00:30:16.180 And the fact that you're having this experience now,
00:30:20.020 which, again, is the most important thing
00:30:22.340 in anyone's life,
00:30:25.000 the experience side of it
00:30:26.840 is not what is actually behaviorally effective
00:30:30.960 if, in fact, consciousness has the other face,
00:30:36.460 which is its neurophysiology
00:30:37.840 and its information processing dynamics.
00:30:40.060 Nicholas Humphrey suggested
00:30:41.440 that one of the most important things
00:30:45.880 we have to do
00:30:46.720 is to second-guess other people.
00:30:49.720 We swim through a sea of,
00:30:52.360 a social sea.
00:30:54.580 We have to make our way
00:30:56.800 through very, very complicated relationships
00:30:59.440 with other people,
00:31:00.740 and we have to second-guess
00:31:02.400 what they're going to do all the time
00:31:04.320 where we're having to predict
00:31:05.600 how this other person is going to react.
00:31:08.700 And so he postulates what he calls the inner eye,
00:31:11.520 which is looking inwards to yourself
00:31:14.560 as an aid to second-guessing
00:31:18.020 what the other person might do.
00:31:20.480 You need this extra sense organ
00:31:22.140 to help you
00:31:24.600 to predict the behavior
00:31:27.960 of the other person.
00:31:30.820 I still don't think that does it.
00:31:33.700 Somehow I wouldn't have been as surprised
00:31:35.420 by the last presidential election
00:31:37.080 if I was doing that correctly.
00:31:39.820 Yeah, and I was here before the election
00:31:41.880 and predicted that there was no way Trump could win,
00:31:44.540 and as somebody who occasionally pretends
00:31:46.720 to read minds and make predictions for a living,
00:31:49.140 boy, was that a mistake.
00:31:50.200 I think what Richard's talking about
00:31:53.380 is something that I've heard elsewhere
00:31:55.300 is the intentional stance.
00:31:57.820 What if it's the case that consciousness,
00:32:00.940 which gives rise to this sense of self
00:32:03.480 in a way that goes beyond
00:32:06.700 mere self-reflection and consideration,
00:32:09.940 leads us to connections with other people,
00:32:13.580 and this is what provides
00:32:15.160 the evolutionary benefit.
00:32:17.660 But it also leads to something else
00:32:19.560 that I thought we'd talk about,
00:32:21.300 which is tribalism.
00:32:23.940 This, our lives as individuals
00:32:27.700 become merged, obviously, with our family.
00:32:30.600 We have this immediate connection
00:32:32.920 to our family,
00:32:33.560 and then we extend this,
00:32:34.680 and we extend the definition of family,
00:32:36.540 and we begin to form tribes.
00:32:40.080 And there was a time and a place
00:32:41.320 where that may have been the best thing.
00:32:44.580 Is it the case that,
00:32:46.900 I mean, obviously,
00:32:49.100 these could all be side effects
00:32:50.420 of just what happened,
00:32:52.120 and I would think that I'd be okay
00:32:54.440 with the idea that consciousness
00:32:55.980 and tribalism and everything
00:32:56.920 are side effects of what happened.
00:32:58.960 You know, there doesn't have to be
00:32:59.920 a guiding hand.
00:33:01.460 But in the process,
00:33:02.780 viewing it from natural selection,
00:33:05.020 what were the benefits of tribalism,
00:33:06.980 and have we actually outgrown them,
00:33:09.080 or are we maybe taking a step back?
00:33:10.820 Well, if I'm not mistaken, Richard,
00:33:13.420 I think altruism,
00:33:16.760 the evolutionary rationale for altruism,
00:33:20.260 really only makes sense
00:33:21.740 in a tribal context.
00:33:23.940 So that, like,
00:33:24.460 one of the silver linings
00:33:25.800 of internecine tribal conflict
00:33:28.640 was that in-group altruism
00:33:31.020 got selected for.
00:33:33.020 I don't know if there's
00:33:33.840 any recent work on that,
00:33:35.220 but that was my reading of things.
00:33:37.420 Yes.
00:33:37.860 That's not to say that we're stuck
00:33:39.220 with tribalism as the only
00:33:41.320 rationale for altruism,
00:33:42.560 but in terms of how apes
00:33:43.740 like ourselves became
00:33:44.860 as altruistic as we are,
00:33:47.400 it's thought that competition
00:33:48.860 among tribes was the basis.
00:33:51.660 Well, I suppose a Darwinian view
00:33:53.140 of altruism would go back
00:33:56.020 to a time when we lived
00:33:57.280 in small tribal groups,
00:33:59.760 and there were two things
00:34:02.880 about living in these small groups.
00:34:04.460 One was that you were
00:34:05.680 completely surrounded by kin,
00:34:07.860 cousins, second cousins,
00:34:10.560 siblings, nephews and nieces,
00:34:12.540 and so on.
00:34:13.740 And so there would have been
00:34:15.060 a Darwinian incentive
00:34:16.800 to altruism towards anybody
00:34:19.480 you meet,
00:34:20.340 because anybody you meet
00:34:21.400 is a member of your own village,
00:34:23.000 your own tribe,
00:34:23.820 your own clan.
00:34:25.960 And the second thing,
00:34:27.260 the other Darwinian engine,
00:34:29.920 motor of altruism,
00:34:31.940 is reciprocation.
00:34:34.100 And reciprocation depends
00:34:36.680 or largely depends upon
00:34:38.080 encountering the same individual
00:34:41.140 again and again.
00:34:42.020 And that again happens
00:34:43.560 within the village,
00:34:44.560 within the band,
00:34:45.800 within the tribe.
00:34:47.580 So there would have been
00:34:49.780 a selection pressure
00:34:50.760 in favor of within-group altruism
00:34:54.140 and out-group hostility,
00:34:56.680 xenophobia.
00:34:58.560 So we can expect
00:35:00.940 that there should be
00:35:02.340 this tendency
00:35:03.420 to despise the out-group
00:35:05.700 and to be altruistic
00:35:07.900 and cooperative
00:35:09.240 with the familiar in-group.
00:35:11.880 And that could be defined
00:35:12.680 as people you've known
00:35:13.420 all your life,
00:35:14.200 people you were brought up with,
00:35:15.780 people who look like you.
00:35:17.420 There are all sorts of ways
00:35:18.720 in which the rule of thumb
00:35:20.780 for how to behave
00:35:21.800 could have latched on.
00:35:24.020 And it's a pretty depressing outlook
00:35:27.740 when we've moved out
00:35:28.940 of our tribal past
00:35:30.280 and moved into big cities
00:35:32.980 where we're no longer
00:35:33.680 in small tribal groups.
00:35:35.860 But we still have
00:35:36.920 the same rules of thumb
00:35:39.620 which work,
00:35:41.040 and that is a good thing.
00:35:41.940 We have a rule of thumb
00:35:42.740 that says just,
00:35:44.300 in general,
00:35:44.820 be nice to anybody,
00:35:46.120 empathize with anybody,
00:35:47.560 because in the distant past
00:35:50.240 anybody would have been
00:35:51.760 defined as
00:35:52.800 your own tribe,
00:35:54.340 your own clan,
00:35:56.240 your own kin group,
00:35:58.060 your own reciprocation group.
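
[Dawkins' point that reciprocation "depends upon encountering the same individual again and again" is usually formalized as the iterated prisoner's dilemma, the standard model of reciprocal altruism from Trivers and Axelrod. A toy Python sketch, using the conventional illustrative payoff values and two stock strategies, shows why cooperating pays only once encounters repeat:]

```python
# Payoffs to the row player: reward, sucker's payoff, temptation, punishment.
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def tit_for_tat(history):
    """Reciprocator: cooperate first, then copy the partner's last move."""
    return "C" if not history else history[-1][1]

def always_defect(history):
    """Non-reciprocator: defect no matter what."""
    return "D"

def total_payoff(strategy_a, strategy_b, rounds):
    """Payoff to player A over repeated encounters with the same partner."""
    history, total = [], 0
    for _ in range(rounds):
        a = strategy_a(history)
        b = strategy_b([(their, mine) for mine, their in history])  # mirrored view
        total += PAYOFF[(a, b)]
        history.append((a, b))
    return total

# One-shot encounter: defection wins (5 vs 3).
# Fifty encounters with the same partner: reciprocity wins (150 vs 54).
for rounds in (1, 50):
    print(rounds, "rounds:",
          "reciprocator vs reciprocator =", total_payoff(tit_for_tat, tit_for_tat, rounds),
          "| defector vs reciprocator =", total_payoff(always_defect, tit_for_tat, rounds))
```

[The same logic motivates the in-group point: in a small band you meet the same individuals again and again, so the reciprocator's strategy is favored, while among strangers the one-shot payoff structure dominates.]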
00:36:00.100 So I wonder if it's
00:36:00.960 to our benefit to,
00:36:03.000 there's a couple of
00:36:03.940 potential ways to go there.
00:36:05.220 One is to get everybody
00:36:05.940 to realize that
00:36:06.800 everybody still
00:36:07.940 is part of our clan,
00:36:09.540 that we are one human clan.
00:36:12.460 I don't know,
00:36:13.920 I don't have the magic solution
00:36:15.240 to end the various divisions,
00:36:18.200 but the others,
00:36:19.620 maybe to get people
00:36:20.800 to recognize that
00:36:21.500 they can be a part
00:36:22.340 of a number of different glands
00:36:23.440 that overlap.
00:36:24.740 This is how we build societies.
00:36:27.780 Right.
00:36:27.980 I care more about
00:36:28.920 my immediate family
00:36:30.000 than I do my neighborhood,
00:36:31.200 but I care more about
00:36:31.860 my neighborhood
00:36:32.300 than I do,
00:36:33.920 you know,
00:36:34.180 the broader world,
00:36:35.020 but I can't
00:36:36.660 diminish my
00:36:37.780 caring for things
00:36:39.420 outside my scope
00:36:40.260 to zero
00:36:40.800 because we know
00:36:42.600 that we have
00:36:43.440 an impact on each other
00:36:44.960 even at great distances.
00:36:46.740 Peter Singer wrote a book
00:36:47.780 called The Expanding Circle
00:36:49.200 in which he starts out
00:36:51.660 by talking about
00:36:52.420 this in-group,
00:36:53.980 kin group,
00:36:54.940 and then talks about
00:36:56.380 the altruism
00:36:57.180 broadening itself out
00:36:58.500 to wider and wider
00:37:01.580 and wider groups,
00:37:02.360 and he would like that
00:37:03.920 to include non-human animals
00:37:05.540 as well,
00:37:06.980 that psychologically
00:37:08.220 we can extend
00:37:09.180 our tribal loyalty
00:37:10.920 to all sentient beings.
00:37:14.800 You!
00:37:15.720 Yeah.
00:37:20.080 There were some folks
00:37:21.160 out front
00:37:21.580 with pictures
00:37:22.540 of both of you
00:37:23.440 actually lobbying
00:37:25.020 for something
00:37:25.540 along those lines.
00:37:26.740 Oh, yeah.
00:37:27.040 Which was nice.
00:37:29.060 I think Singer's
00:37:30.520 heuristic is the right one,
00:37:32.940 that moral progress is.
00:37:35.400 If you'd like to continue
00:37:36.380 listening to this conversation,
00:37:37.560 you'll need to subscribe
00:37:39.000 at samharris.org.
00:37:40.540 Once you do,
00:37:41.440 you'll get access
00:37:41.940 to all full-length episodes
00:37:43.220 of the Making Sense podcast,
00:37:44.880 along with other
00:37:45.460 subscriber-only content,
00:37:47.200 including bonus episodes
00:37:48.500 and AMAs
00:37:49.580 and the conversations
00:37:50.620 I've been having
00:37:51.200 on the Waking Up app.
00:37:52.740 The Making Sense podcast
00:37:53.640 is ad-free
00:37:54.520 and relies entirely
00:37:55.920 on listener support,
00:37:57.100 and you can subscribe now
00:37:58.560 at samharris.org.
00:38:00.000 Thank you.