Making Sense - Sam Harris - June 05, 2023


#321 — Reckoning with Parfit


Episode Stats

Length: 52 minutes
Words per Minute: 165.4
Word Count: 8,713
Sentence Count: 524



Summary

David Edmonds is a writer and philosopher and the author of several books, the most recent being Parfit: A Philosopher and His Mission to Save Morality. In this episode, Sam and David talk about Parfit's place in philosophy, his work on personal identity, time bias, the non-identity problem (which Parfit discovered), population ethics and the so-called repugnant conclusion, the ethical importance of future people, effective altruism, moral truth, and other topics. They also discuss whether Parfit showed signs of dementia near the end of his life, and what was peculiar about him throughout it. If you're not currently on our subscriber feed, you'll hear only the first part of this conversation. To access full episodes, subscribe at samharris.org, where you'll find our private RSS feed to add to your favorite podcatcher, along with other subscriber-only content. We don't run ads on the podcast, and it's therefore made possible entirely through the support of our subscribers. If you enjoy what we're doing here, please consider becoming one. Thanks for listening to the Making Sense Podcast!


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.900 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.440 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.140 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.620 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.900 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.480 Welcome to the Making Sense Podcast.
00:00:48.660 This is Sam Harris.
00:00:50.720 Today I'm speaking with David Edmonds.
00:00:53.520 David is a writer and philosopher and the author of several books.
00:00:57.200 The most recent is Parfit, a philosopher and his mission to save morality, and that is
00:01:03.900 the topic of our conversation.
00:01:05.760 As many of you know, Derek Parfit was a philosopher I greatly admired.
00:01:10.360 As you'll hear, I almost interviewed him near the end of his life, but my timing was terrible.
00:01:17.080 We talk about Parfit's place in philosophy, his work on identity, time bias, the non-identity
00:01:26.740 problem, which he actually discovered, quite interesting, population ethics, and the so-called
00:01:33.160 repugnant conclusion, the ethical importance of future people, effective altruism, moral truth,
00:01:40.440 and other topics.
00:01:42.920 Anyway, it was fascinating to talk about Parfit and his work with someone who knew him.
00:01:47.340 So now I bring you David Edmonds.
00:01:55.340 I am here with David Edmonds.
00:01:57.360 David, thanks for joining me.
00:01:58.960 Thanks for inviting me.
00:01:59.740 So you have written a wonderful book.
00:02:02.140 You've actually written two wonderful books, although we'll focus on the most recent.
00:02:07.020 You may have written more books than that, and they may all be wonderful, but I've read
00:02:10.580 two of them.
00:02:11.660 The recent one is about one of the most interesting philosophers of our time.
00:02:18.200 I think it's almost surely an objective statement.
00:02:21.440 The book is about Derek Parfit, titled Parfit, A Philosopher and His Mission to Save Morality.
00:02:28.560 And the other book that I loved, which I believe you co-wrote, was Wittgenstein's Poker, which
00:02:35.820 is about a famous and maybe semi-apocryphal encounter between Wittgenstein and Karl Popper,
00:02:43.720 which maybe we can touch on.
00:02:46.040 We'll focus on Parfit, but before we get there, perhaps you can describe
00:02:51.480 your background in philosophy and just the kinds of things you have focused on and maybe
00:02:57.400 your connection to Parfit.
00:03:00.240 Gosh, well, my background in philosophy is that I studied it.
00:03:03.840 So I did what's called PPE at Oxford, which is Philosophy, Politics and Economics.
00:03:08.200 And then I did a two-year post-grad degree called the BPhil, which back in the 70s was kind
00:03:15.740 of the route into teaching.
00:03:17.580 And then after that, it became required that you have a PhD as well.
00:03:22.440 And then I went off and did other things, went into journalism, but I had a kind of philosophical
00:03:26.680 itch and so started a PhD, which I did, I guess, about, well, in the early 90s, I think.
00:03:34.900 And my supervisor for my BPhil was a chap we're going to talk about, Derek Parfit.
00:03:40.220 And my BPhil was on obligations we have to future people.
00:03:43.980 So these are people who are not yet born.
00:03:47.480 And then I did the PhD later.
00:03:49.900 And my supervisor for my PhD was a very good female philosopher called Janet Radcliffe Richards,
00:03:56.540 who went on to become Derek Parfit's wife at a later stage.
00:04:02.700 So when I wrote the biography, I was sort of well-connected.
00:04:05.440 So I knew both Derek and Janet, which was fortuitous for writing the book.
00:04:10.760 So, yeah, I never met Parfit.
00:04:14.000 I almost interviewed him.
00:04:15.980 I mean, my timing was just bad.
00:04:18.300 I reached out to him at the end of 2016.
00:04:22.260 And we set up an interview that was going to be, I think it was going to be written.
00:04:27.040 I wasn't yet podcasting as regularly as I am now.
00:04:30.240 And so we were going to have a written exchange.
00:04:33.700 And then I believe his wife got sick.
00:04:36.660 And then at some point he was unwell.
00:04:40.620 And actually, it was interesting.
00:04:41.940 Toward the end of your book, you speculate as to whether or not he had dementia.
00:04:47.240 We can talk about what was peculiar about him throughout his life.
00:04:51.720 But actually, I should say, in this email he did describe himself to me as showing signs of dementia.
00:04:59.420 So I assumed that was common knowledge.
00:05:03.720 No, not at all.
00:05:04.500 So I'm very, very interested in that.
00:05:05.740 He mentioned that to one person I interviewed.
00:05:07.620 And I was slightly skeptical because Janet didn't think he did.
00:05:11.300 And he didn't tell it to anybody else.
00:05:12.640 So you are a second source.
00:05:14.520 Yeah.
00:05:14.900 So that is interesting.
00:05:16.020 Interesting.
00:05:16.320 And you did extremely well to even sort of get him to engage with you because, as we might go on to discuss, by this stage, he was very unwilling to be interviewed.
00:05:29.040 He would have seen it, as it were, as kind of a slight waste of his time because he wanted to focus on research and writing.
00:05:36.420 And I tried to get a podcast interview with him as well.
00:05:38.600 And he turned that down.
00:05:39.800 He was more willing to engage in a written interview because then he could make sure that he didn't make any errors.
00:05:46.980 He was a perfectionist.
00:05:47.840 He didn't want to make any mistakes.
00:05:49.680 So if he had a chance to revise and edit his answers, that would make him happier, I think, than doing a verbal interview where he might fear that he would say the wrong thing.
00:06:02.160 And, yeah, yeah, I mean, I really consider it one of the great missed opportunities for me because I just love his work.
00:06:09.520 Although, like many who love his work, I can't say that I have read all of his last book, which is 1,500 pages long.
00:06:16.220 And for some reason, I, you know, it may just be due to length, but I also just got sidetracked while reading it, even though I'm quite sympathetic with what he was attempting there.
00:06:26.180 I aspire to get back to it and finish it.
00:06:28.300 But his first book, Reasons and Persons, you know, many of us, I think, appreciate as some kind of masterpiece.
00:06:36.220 It's structured strangely, and it seems like it probably could have become an even better book with some form of editing.
00:06:46.820 I don't know who would have forced Parfit to edit it, but there's something really otherworldly about the book.
00:06:53.880 It strikes one as written from the point of view of someone coming from outer space, just manufacturing thought experiments by which to understand human morality.
00:07:09.060 It's a very odd but brilliant book.
00:07:12.080 Did you know Parfit before he published that, or you came to him after he—?
00:07:16.880 No, so the book is published in 84, and I study with him in 87.
00:07:23.660 I think I bought Reasons and Persons in 1986, which is when the paperback came out.
00:07:28.820 So I gobble that book up.
00:07:31.260 I think, like you, it's a work of genius.
00:07:33.880 And in fact, that's widely acknowledged.
00:07:35.980 I mean, even the people who are his philosophical enemies—he had very few actual enemies—but his philosophical opponents, even they acknowledge that this is an extraordinary work of philosophy.
00:07:47.520 And what I didn't realize until I was writing the book was that it wasn't supposed to be one book.
00:07:52.160 He was working on lots of books at the same time.
00:07:55.660 And then he had this crisis in 1981 at the college that was his home, All Souls, a college where there are no undergraduates and no teaching at all.
00:08:08.300 The people who are based there are purely there for research.
00:08:12.340 They were threatening to throw him out because he hadn't managed to produce one book in 14 years.
00:08:18.140 And so they say they'll give him an extension to his fellowship, but they say he's got to produce a book by 1984.
00:08:26.800 And what he does is he throws everything together.
00:08:30.280 And what was potentially going to be several books turns out to be this strange mishmash.
00:08:36.260 And it covers a whole variety of things.
00:08:38.760 Most famously, there's a huge chapter on personal identity, what it is that makes me the same person I am now as the person I will be at the end of the interview, and the same person I am now as the person I was when I was five years old and the person I will be when I'm 85 years old, inshallah.
00:08:56.220 And then there's a huge section, which was the section that at the time most interested me, which was on future people.
00:09:02.200 And he basically invents this whole sub-genre of moral philosophy.
00:09:06.960 Until then, there was nothing that we now call population ethics.
00:09:11.660 And it's an area of philosophy that looks at our obligations to future people and puzzles about how many people we want in the world, whether we care about total well-being or average well-being.
00:09:25.220 And he comes up with these ingenious puzzles.
00:09:27.220 And even now, 40 years on at least, roughly, when people write about this area of philosophy, basically Parfit is the template.
00:09:39.720 I mean, people are responding to Parfit.
00:09:41.820 So I want to get into many of those specific problems.
00:09:46.460 But generally, what would you say his place in philosophy is now?
00:09:53.180 I mean, how would you describe him as a member of the pantheon of recent great philosophers?
00:09:59.420 Well, he divides people.
00:10:01.740 So I'm a Parfitian.
00:10:03.160 I'm a fan.
00:10:04.020 So you're going to get a biased view.
00:10:05.900 And I share the view of very many philosophers, including very many top philosophers, which is that he's one of the great moral philosophers of the 20th century.
00:10:15.560 I wouldn't necessarily go as far as some and say he's the greatest moral philosopher since John Stuart Mill, but some very serious philosophers make that claim.
00:10:22.680 But he's not like Wittgenstein, in the sense that even people who don't like Wittgenstein's philosophy don't dismiss him; nobody dismisses Wittgenstein.
00:10:29.840 But there are philosophers who think that he does moral philosophy in completely the wrong way.
00:10:36.000 And especially the later work was going down a cul-de-sac when he tries to prove that morality is objective.
00:10:43.220 He's desperate to prove that there are moral facts.
00:10:45.800 So he divides people, but there are many people like me who think he's definitely one of the greats in moral philosophy of the past hundred years.
00:10:54.500 Yeah, well, I'm very sympathetic with trying to prove that there are such things as moral facts.
00:10:59.020 And I know what sort of pushback one gets when one goes down that path.
00:11:03.580 And he focused on that more in his last book, On What Matters, which is really three large volumes, and perhaps we'll get there as well.
00:11:13.820 I guess just one more general question about him before we get into his areas of philosophical focus.
00:11:21.040 What do you think the significance of his psychology was for his philosophy?
00:11:26.920 I mean, he really did strike me, even just without knowing anything about him personally.
00:11:32.100 And there's a lot in your book that is revelatory as to what sort of person he was.
00:11:36.600 But just reading Reasons and Persons, I felt I was in the presence of a neuroatypical philosopher.
00:11:44.220 And many of the advantages of that book seemed to me born of a truly atypical angle of attack on all the questions he was touching there.
00:11:56.020 And, you know, so I always, without having any evidence for it, I always thought he was someone who must be, you know, on the autism spectrum to some degree.
00:12:07.100 I know you entertain that hypothesis in the book and are uncertain as to where you come down.
00:12:13.460 But let's talk about that.
00:12:15.180 Maybe we could bring in Wittgenstein here too, because he also struck me that way, insofar as I think I know anything about him from reading Ray Monk's biography and some other secondary work.
00:12:29.200 He struck me as neuroatypical as well.
00:12:32.840 And so much of what is interesting about his thought could be born of that.
00:12:37.960 Yeah.
00:12:38.100 So I think your instincts are right.
00:12:39.840 I think both Wittgenstein and Parfit were neuroatypical.
00:12:46.000 There are lots of interesting similarities between them, and there are lots of interesting differences between them.
00:12:50.580 And one is, Derek Parfit was just a lot more of a benign character.
00:12:56.320 So Wittgenstein went around trying to persuade everybody to give up philosophy.
00:13:00.000 And I think he damaged quite a few lives, because there were people who were potentially good philosophers and would have had an interesting, successful academic life, who he persuaded to give up the academic life and go and work with their hands, go and do manual work.
00:13:15.280 And as I say, I think that was very damaging to them.
00:13:17.780 Whereas Parfit was quite the opposite.
00:13:19.220 Parfit tried to persuade everybody to give up whatever they were doing and move to philosophy, because he thought philosophy was basically what really mattered.
00:13:26.000 But they were very, you know, atypical, I think.
00:13:30.020 The big puzzle in my book, the puzzle I really wanted to resolve was that I thought I knew Derek when I started, but then I started researching his early life.
00:13:42.380 And his early life is extremely rich.
00:13:45.520 He's got lots of interests.
00:13:47.320 He starts off as a historian.
00:13:49.340 He's interested in music.
00:13:50.920 He's interested in sport.
00:13:52.100 He plays chess.
00:13:53.180 He seems to have a kind of social circle.
00:13:56.300 He dabbles in student journalism.
00:13:59.060 He's a debater.
00:14:00.860 He has a very rich life.
00:14:03.400 And yet slowly he sheds all that.
00:14:07.600 And then in the second half of his life, he really becomes a duomaniac.
00:14:12.660 He has two interests.
00:14:14.180 One is philosophy and the other is photography.
00:14:17.280 And every year he goes to the same two cities.
00:14:20.520 He goes to Venice and he goes to what was then Leningrad, now St. Petersburg, and he photographs the same buildings.
00:14:28.780 He goes there at dawn.
00:14:30.400 He goes there at dusk.
00:14:31.500 And he photographs the same buildings every single year.
00:14:34.600 And what puzzled me was how Derek turns from this early Derek into the later Derek.
00:14:43.080 And also, which was the real Derek, which was the authentic Derek, as it were.
00:14:47.120 And it's an ironic question, of course, given that he spends much of his life worrying about what it is that makes us the same person.
00:14:53.800 And I was puzzling about what the, as it were, essence of Derek was.
00:14:57.980 And he rejects the idea that we have an essence.
00:14:59.840 But I came to believe that the later Derek was really the more natural Derek, that once he got into this strange institution, All Souls, and he didn't have to socialize, he could just focus on his work.
00:15:13.720 I think that was the happier Derek.
00:15:15.480 That was the more content Derek.
00:15:17.680 But you ask about the relationship between his, as it were, his neuroatypicality and his philosophy.
00:15:26.600 I mean, I think there are some interesting parallels.
00:15:28.300 He was essentially a consequentialist, especially in his early philosophy.
00:15:32.660 That's very strong.
00:15:33.400 So he believed that what matters were the consequences of our actions.
00:15:37.760 And if you look at his personality, his dispositions, they were very consequentialist.
00:15:42.820 Just to give you one example.
00:15:44.380 So, for example, he would often burst into tears when he read or heard about the suffering of distant strangers.
00:15:53.400 If you told him about what happened in the trenches of the First World War, he would stop and he would cry and be unable to carry on.
00:16:03.740 And yet he felt very weak obligations to his nearest and dearest.
00:16:08.380 And that's evident when later on, you know, his friends invite him to weddings and he says he hasn't got time because he's got to work.
00:16:16.580 And again, that's sort of in line with a consequentialist mentality that sort of everybody matters equally.
00:16:23.180 We don't necessarily have strong special interests with any particular people.
00:16:27.200 And I think consequentialism came extremely naturally to him.
00:16:31.400 And that is one link with his neuroatypicality.
00:16:37.100 Yeah, there's one story to touch on the mania for photography for a moment.
00:16:43.040 There's one story in the book which I think exemplifies what a peculiar person he was.
00:16:50.700 It's because perhaps you can tell the story where there's some photo, I forget of which building, that he took enormous pains to get exactly right.
00:17:01.220 And then one of his students comes and admires it.
00:17:04.800 And perhaps you can take it from there.
00:17:07.100 But where this story lands is, again, it's so strange.
00:17:11.820 I mean, I can't imagine behaving in this way.
00:17:15.540 The philosopher is, I think, his first PhD student, a very good philosopher, called Larry Temkin.
00:17:21.900 And Larry is in his, Derek's office in, or Derek's set of rooms in All Souls in Oxford.
00:17:28.880 And he's admiring this photo that Derek has taken of, I think it's the Radcliffe Camera, sort of outside Derek's room.
00:17:38.780 Beautiful, beautiful building.
00:17:40.720 And as you say, Derek didn't just take photographs.
00:17:43.760 What he did was he would take the photographs, and then he would send them.
00:17:47.440 This was in the days before Photoshop.
00:17:49.900 He would send them to a production company, and he would spend thousands of pounds back in the 70s and 80s.
00:17:57.320 He would spend all his savings, basically, on touching them up.
00:18:01.540 He would send them off, and he would say he wanted a bit more pink.
00:18:04.220 And they would come back, and the shade of pink was not quite right.
00:18:07.140 And he would send it back to have it adjusted.
00:18:09.260 And this would go back and forth multiple times.
00:18:11.340 It was a very, very expensive process, until he got the perfect shot.
00:18:15.480 And Larry was there one day, and Larry said, oh, I love that.
00:18:17.760 It's a beautiful, beautiful photo.
00:18:20.620 And Derek said, well, you have it then.
00:18:24.680 And Larry says, I can't take that photo.
00:18:26.780 That's absurd.
00:18:28.040 And Larry goes off home.
00:18:30.480 And at that stage, he's in Houston.
00:18:32.260 I think he's at Rice University.
00:18:33.540 And some time later, Derek has been invited to give a lecture at Rice University.
00:18:39.900 And he turns up, and he opens his suitcase, and he says, Larry, I've got a present for you.
00:18:46.540 And he scrunched up this incredibly beautiful photo that cost thousands of pounds to perfect.
00:18:54.960 He's kind of scrunched it up in his suitcase, and he hands it over to Larry.
00:18:58.600 And Larry is just appalled, of course, by the way Derek has treated this photo.
00:19:04.500 But I guess for Derek, he's achieved perfection, and that was what mattered.
00:19:08.800 But Larry then has to spend another few thousand pounds trying to iron out the creases.
00:19:13.200 And I think he's still got the photograph to this day.
00:19:16.160 And they've done everything they can.
00:19:18.040 He got the top professionals to work on it.
00:19:20.160 But there's still the sign of a crease where Derek had scrunched it up in the briefcase.
00:19:25.020 So that's the story.
00:19:26.420 Yeah, so everything up until the final scene is, you know, idiosyncratic, but amazing and, you know,
00:19:35.180 compatible with every ethical norm and psychological norm we might want to hew to.
00:19:41.360 But the scrunching up part is just bizarre, right?
00:19:46.240 It's either a total lack of awareness or, I mean, there's just something very peculiar
00:19:53.040 about a mind that would do that.
00:19:56.380 So can I tell you about a story I have in the preface, which is a similar kind of story
00:20:01.280 because it's totally, utterly baffling.
00:20:03.900 And to this day, I can't work it out.
00:20:06.120 So what happened was that back in 2014, I wrote an article about Derek and Janet because
00:20:13.700 a magazine in the UK called Prospect Magazine had named both of them among the
00:20:19.160 top 50 intellectuals in the world.
00:20:24.500 And I wrote to Prospect with whom I had a relationship.
00:20:27.080 And I said, did you know they knew each other quite well?
00:20:30.160 In fact, they were married.
00:20:30.920 And they said, no, we didn't know that.
00:20:32.360 And I said, I'll write you an article about their relationship, which
00:20:37.080 really was very interesting.
00:20:38.280 And they said, fine.
00:20:39.240 So I got Janet and Derek to agree, and I went to Janet's house in North London, and Derek
00:20:47.640 was there.
00:20:48.420 And I was interviewing them, and I was taking notes.
00:20:51.140 And sometime later, I completed the article, and I sent the article off to Janet and Derek
00:20:57.440 just to fact check it, because I wanted to make sure I hadn't made any errors.
00:21:00.920 And I was walking with my wife somewhere on a hill, and I glanced at my mobile.
00:21:05.720 And Derek said, I mean, I can't remember the exact words, but dear David, thank you very
00:21:10.740 much for the article, but you're not going to like the contents.
00:21:13.900 I'm afraid you can't publish it.
00:21:15.960 So I was very upset.
00:21:17.960 I'd spent a long time on this article.
00:21:19.680 I rushed home, and I looked at the attachment he sent, and there was a list of, I don't know,
00:21:24.600 two dozen, three dozen errors that I'd made in it.
00:21:27.600 And I started going through them.
00:21:29.000 And the first one was not in the article, and nor was the second, and nor was the third,
00:21:35.180 nor was the fourth.
00:21:35.820 And I just couldn't work out what was going on.
00:21:38.340 And then I realized that what I'd done was I'd sent Derek my notes that I'd been taking
00:21:44.860 while I'd been in Janet's house when I was interviewing them.
00:21:49.020 And what is so, you know, to this day, I can't fathom what he was thinking.
00:21:54.540 Nobody could have thought that was an article.
00:21:56.580 It didn't have a title.
00:21:58.400 It didn't have a beginning.
00:21:59.600 It didn't have an end.
00:22:00.520 It didn't have a middle.
00:22:01.820 It had half sentences.
00:22:03.720 It was impossible to believe that this was a finished article in any way.
00:22:08.740 And yet, because I'd said it was the article, Derek thought, well, it must be the article,
00:22:13.340 and had commented on it.
00:22:15.240 So it's a similar story to the photograph story in the sense that one just doesn't know
00:22:19.780 what to make of it.
00:22:21.240 Yeah, I don't mean to be
00:22:23.940 invidious when I keep returning to this phrase, but it does kind of cry out
00:22:28.420 for a neurological explanation, right?
00:22:31.240 Yeah, yeah, yeah.
00:22:32.660 I think so.
00:22:33.780 Okay, let's talk about the most interesting problems he tackled.
00:22:37.580 You mentioned a few.
00:22:38.900 I think identity is close to the top of the list.
00:22:43.260 And then there's the famous non-identity problem, which is very interesting and has interesting
00:22:50.280 implications for ethics.
00:22:51.440 What were Parfit's insights around the topic of identity?
00:22:55.720 Okay, so as I mentioned earlier, what he was interested in was what it was that makes
00:23:02.060 us the same person over time.
00:23:05.300 It sounds like a very abstract philosophical question, but actually, it has quite deep
00:23:10.740 consequences.
00:23:12.880 And his, I guess, he had a few insights.
00:23:16.720 But the first was that there was no essence of us.
00:23:20.360 We were constantly changing, our body was changing, we were losing memories, we were gaining memories.
00:23:27.540 There was no essence of us.
00:23:29.120 I mean, if you're a religious believer, then you believe in a soul.
00:23:31.720 And you think that soul is immutable.
00:23:33.600 And so you've got an answer to what it is that makes a person the same person over time.
00:23:38.220 They have the same soul.
00:23:40.040 But if you are secular, if you reject the idea of a soul, then the question is, well, what
00:23:45.260 is it that makes you, you?
00:23:47.120 And Parfit said, well, there's no essence of you.
00:23:49.640 Another insight was that identity, and this was an idea that he kept returning to throughout
00:23:56.720 his life.
00:23:57.300 He changed some of his views about identity, but this remained the same.
00:24:01.440 Identity was not what mattered.
00:24:04.020 What mattered was psychological continuity.
00:24:07.000 So you, Sam, in 10 years' time, what matters, what should matter to you, is whether you're
00:24:13.700 psychologically continuous in 10 years' time to your current self.
00:24:17.100 In other words, whether you have the same dispositions and the same memories and so on.
00:24:21.840 He says identity is not what matters.
00:24:24.200 And he has these various puzzles to kind of show that identity can't be what matters.
00:24:30.220 In one of them, he imagines that there are three brothers, and one brother's body
00:24:40.240 is dying, and they move the two hemispheres of that brother's brain into the two other
00:24:48.460 brothers.
00:24:49.300 So hemisphere A goes into brother one, hemisphere B goes into brother two.
00:24:55.240 And they both think they're the same.
00:24:57.780 They still have exactly the same memories, exactly the same dispositions, and so on.
00:25:02.700 And the question is, well, which is the same brother?
00:25:07.160 Which is identical to the first brother?
00:25:11.460 And I mean, it sounds sci-fi, as it were, but it's based on real-life cases.
00:25:17.940 Because with epileptic patients, what they've done is they've cut the brain stem, and they've
00:25:24.220 created a divide between these two hemispheres of the brain.
00:25:27.600 And what they found is that you get these two spheres of consciousness.
00:25:31.280 I think there's one case where they found that the left hemisphere was kind of religious,
00:25:36.000 and the right hemisphere was atheist, or something like that.
00:25:40.160 So you get these two streams of consciousness.
00:25:42.480 So Parfit's question is, well, which is the original brother?
00:25:45.120 Now, it seems arbitrary to say, well, one of them is original.
00:25:49.060 Why would one of them be original and not the other?
00:25:51.000 So that seems to make no sense.
00:25:53.040 But you also can't say that they're both identical to the original, because they're going to both
00:25:57.420 go on and have different lives, and then they'll have different memories, and they will marry
00:26:02.420 different partners, and so on.
00:26:03.380 They can't be identical to each other.
00:26:05.080 But if they're not identical to each other, they can't be identical with the original brother,
00:26:09.820 because then they would all be identical with each other.
00:26:12.180 So there's a case where we just don't know what to say about identity.
00:26:16.720 Yeah, I think the clearer case for me is the teletransporter case, which is the—I've
00:26:22.820 discussed this in various places before on the podcast, but people will be familiar with
00:26:28.120 the concept, usually from Star Trek, where, you know, Beam Me Up, Scotty, is imagined to
00:26:34.960 be a procedure where, essentially, all of the information in the atoms in the body gets
00:26:40.000 read out and encoded and sent at the speed of light to a new pod, where the body gets reassembled
00:26:46.820 atom by atom and everything is perfectly intact, and the person steps out of what Parfit called
00:26:52.760 the teletransporter pod, you know, on Mars, perfectly intact, with all of his memories
00:26:57.860 intact, and his last memories of just pressing a button in the pod on Earth, and now he's
00:27:04.180 stepping out onto the surface of Mars.
00:27:06.540 And you can go back and forth from Mars to Earth like that a hundred times, and you seem
00:27:11.880 none the worse for wear, but quite ingeniously, Parfit imagines a few different versions of
00:27:19.300 this procedure which change our intuitions, I think, fundamentally as to what's happening
00:27:25.960 there.
00:27:26.340 So in the first case, when you just get sort of disassembled and reassembled with all the
00:27:32.380 information preserved, there is this sense that basically you, the person, have been, you
00:27:39.280 know, albeit in a strange and, you know, potentially scary way, have been successfully sent back
00:27:46.860 and forth between Earth and Mars, and you are still you.
00:27:50.080 You remember everything about the process, you remember your life, you know, you're not
00:27:54.840 sick, you're not injured, your spouse still recognizes you and finds you familiar, everything's
00:27:59.600 fine, but then Parfit asks us to imagine the version where there's a delay, I forget how
00:28:06.420 he describes the reason for the delay, but you want to jump in here?
00:28:10.240 Well, yeah, so what happens is he imagines that there's a copy of you on Mars, but as
00:28:16.860 it were, you're talking to yourself, so that there's you back on planet Earth, and there's
00:28:21.160 the copy of you, and, you know, you're about to kind of talk to this copy, and then you're
00:28:27.760 told that you, the person back on planet Earth, are about, as it were, to implode, and your
00:28:33.980 life is about to end, and the copy of you will continue, and he's asking us, I guess, what
00:28:40.500 our intuitions are in that case, and Parfit's intuitions are, you are as good as preserved.
00:28:48.280 You know, the fact that you are made up of a copy of you is irrelevant, you've got what
00:28:53.720 matters, you've got psychological continuity, you've got everything that counts, and you
00:28:59.380 may think that you should be very upset about this, and probably lots of people would be
00:29:04.420 very upset about this, but Parfit's intuitions are not like that.
00:29:07.460 Parfit thinks, you've as good as survived, this is really what matters to you, that you
00:29:13.100 have psychologically survived, that that person up there is connected to the person that had
00:29:20.760 existed on planet Earth, and that's what you should care about.
00:29:25.820 Except if you let the person on Earth live any appreciable amount of time, then their timelines
00:29:31.760 begin to branch, and then you have to consider that person a different person.
00:29:37.760 Well, exactly right.
00:29:39.100 So that comes back to the twins example.
00:29:42.380 If they're both operating together, and then Parfit wants to say, well, the copy is as good
00:29:47.340 as you, but then the question is, well, is it identical to you?
00:29:50.540 Well, if it goes off and has a separate life, and has separate hobbies, and whilst you go
00:29:55.520 fishing, it goes to play chess, clearly you two are different people, but in a way, that's
00:30:03.380 two for the price of one, as far as Parfit is concerned.
00:30:06.240 You know, you're psychologically continuous in two different people, and that's doubly good.
00:30:12.060 Yeah, it's interesting.
00:30:12.920 I think I land in a slightly different spot.
00:30:15.620 I mean, when being transported is synonymous with being destroyed and reassembled, it just
00:30:22.680 seems like a successful maintenance of psychological continuity without problem, right?
00:30:29.060 But the moment there's a delay, and you first make the copy, and then you destroy the original,
00:30:36.660 that seems like a copying and a murder, and it's hard to see it otherwise.
00:30:42.060 So, Parfit changes his mind a little bit about this, but at least in reasons and persons,
00:30:47.960 he thinks that the body itself is almost an irrelevant.
00:30:52.820 So, our cells are constantly dying and regenerating.
00:30:57.180 If you cut off your nails, you're not changing your identity.
00:31:00.200 If you lose a finger, you're not changing your identity.
00:31:01.920 Your body, as it were, might hold your mind, but it's the mind that matters.
00:31:08.760 And so, at least in Reasons and Persons, he doesn't share your intuitions about that.
00:31:13.220 So, he wants to say, you know, you can call it murder if you want.
00:31:17.260 It might feel like murder to you, but he's very happy to be there on Mars,
00:31:21.820 continuing psychologically from the person that existed on planet Earth.
00:31:27.040 So, where does he change his opinion about it?
00:31:29.980 Well, later on, he becomes more sympathetic to what's called animalism,
00:31:35.480 which is the idea that we're organisms.
00:31:39.120 But he never goes all the way.
00:31:41.000 I think he comes to believe that it's still psychological continuity that matters,
00:31:46.320 but psychological continuity has to sit within a kind of organism.
00:31:51.560 So, if you want to know what matters, well, it's still psychological continuity,
00:31:57.220 but psychological continuity can only survive within an evolving organism.
00:32:02.780 So, it's a subtle difference.
00:32:04.880 The main thing is that his big claim that identity is not what matters persists throughout his life,
00:32:12.460 and he continues to believe that it's psychological continuity that matters.
00:32:17.020 And that has implications for how we should regard the past, how we should regard the future.
00:32:22.660 So, it has implications, for example, about whether we should hold somebody guilty for something they committed a long time ago and don't remember.
00:32:31.900 It has implications for thinking about whether we should save for the future,
00:32:37.620 because, you know, people do save for their pensions.
00:32:40.080 They worry about what their lives are going to be like in 25, 30 years.
00:32:43.840 They want to make sure that they're going to have a comfortable retirement.
00:32:49.260 And if Parfit's arguments are right, then the gap between us and our future and our past becomes greater,
00:32:59.440 and the gap between us and other people narrows.
00:33:04.180 So, we're more connected to other people and less connected to our past selves and our future selves.
00:33:10.960 Yeah, that's interesting.
00:33:12.060 I mean, the variable of time here is fascinating.
00:33:16.100 He has one thought experiment.
00:33:17.920 I forget the title.
00:33:18.860 Perhaps you'll recall it.
00:33:20.120 But it's the experiment where a person wakes up in the hospital and is told that they've either had a surgery
00:33:31.100 or they will have a surgery in the next 24 hours.
00:33:34.940 The nurse is unsure; she has to go check the chart.
00:33:37.420 But if they had the surgery, they had a very long and painful variant of the surgery.
00:33:45.140 You know, there was some glitch, and it was 10 hours of torture.
00:33:49.560 And if they're going to have the surgery, normally it just takes an hour and it's not all that bad.
00:33:54.360 And so the nurse says, I'll be back in a second.
00:33:56.320 Let me figure out which patient you are.
00:33:57.740 And so, the person is left to wonder and worry whether he has already had a surgery, which he can't remember,
00:34:04.760 or whether he will soon have a surgery.
00:34:07.120 So, this is, like, on a Tuesday.
00:34:08.780 And you can correct me if I've messed any of this up.
00:34:12.220 But, so, if you wind back the clock, if you put the person at time zero before any of these surgeries
00:34:19.900 and you asked him, well, which would you rather have, a 10-hour botched surgery that is a harrowing ordeal
00:34:27.220 and then we give you a drug and you no longer remember it,
00:34:29.940 or a much shorter normal surgery that you also don't remember,
00:34:33.900 well, if it's a choice between 10 hours of torture or something far more benign,
00:34:40.800 well, obviously, I want that thing that's far more benign.
00:34:44.640 But, if you tell me, well, the 10 hours might have happened yesterday
00:34:49.600 and the more benign surgery is going to happen tomorrow,
00:34:53.500 we have such a time bias with respect to past and future suffering
00:34:58.020 and the person is hoping he had the 10-hour ordeal yesterday
00:35:02.800 and just can't remember it rather than the more normal ordeal coming on the next day.
00:35:08.740 And so, what Parfit was suggesting is that
00:35:11.420 you could be unbiased with respect to past and future suffering
00:35:15.840 and just recognize that the quality of your life is,
00:35:20.280 it's just the area under the curve of, you know, total experience
00:35:24.000 and you should just want less suffering under the curve in its entirety
00:35:28.840 and therefore, in this case, you should want to be the person
00:35:32.340 who has the future surgery, not the past one.
00:35:34.780 And he seemed to find it inscrutable that we are so strongly
00:35:39.780 weighting our future suffering and so fully discounting suffering in the past.
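A minimal way to formalize the temporal-neutrality point Sam is sketching here, in notation of our own rather than Parfit's: let s(t) be the intensity of suffering at time t, let w(t) be a temporal weighting, and score a life by

    V = \int_{-\infty}^{\infty} w(t)\, s(t)\, dt

Temporal neutrality sets w(t) = 1 for every t, so only the total area under the suffering curve matters, and a 1-hour future operation (V = 1) is preferred to a 10-hour past one (V = 10). The ordinary future bias sets w(t) close to 0 for past times, which flips the preference: the 10 past hours then contribute almost nothing to V, so the patient hopes the torture is already behind him.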
00:35:47.240 Yeah, you've described that brilliantly.
00:35:48.680 I mean, I've nothing more to add to it, really.
00:35:50.680 You've described it exactly right.
00:35:51.760 He says that almost everybody, if they're lying in that hospital bed
00:35:55.760 wondering about whether they are the patient
00:35:58.400 who's had the terrible operation that lasted all night
00:36:03.260 or the patient who's about to have a short and painful operation
00:36:08.140 in the next 24 hours,
00:36:10.660 almost everybody will wish that they are the person who's already had.
00:36:15.100 Who's already been tortured, yeah.
00:36:16.320 Who's already been tortured.
00:36:17.320 And he is puzzled by that and thinks that perhaps it would be more rational
00:36:21.320 to prefer to be the person who is about to have the short and painful operation.
00:36:27.580 And towards the end of his life, time was something that really fascinated him.
00:36:31.240 And he leaves behind sort of indications of what he wanted to work on.
00:36:44.940 And we know that one of the things he wanted to work on when he died was time.
00:36:44.940 There were various issues he wanted to delve into more deeply.
00:36:49.700 One was free will.
00:36:50.820 He was fascinated by the topic of free will.
00:36:52.840 And he was very suspicious that any of us had free will.
00:36:56.560 He wanted to work more on that.
00:36:58.160 He wanted to work more on time.
00:37:00.400 He wanted to work more on something which he called the sublime,
00:37:04.160 the experience of the sublime and what it was to feel it,
00:37:08.600 which he thought was sort of the opposite of pain.
00:37:12.040 And he wanted to go back and work a bit more on future people and so on.
00:37:15.700 But he died when he was only 74.
00:37:18.640 Had he lived longer, I think we'd have had more writings from Derek on the subject of time.
00:37:24.100 So what was the non-identity problem?
00:37:26.740 Gosh.
00:37:27.020 Okay, well, I think the non-identity problem is brilliant.
00:37:31.540 And once I explain it, it will seem so obvious to you that you'll wonder why on earth nobody had ever thought of it before
00:37:40.180 and why it took Derek Parfit two millennia after the great kind of ancient Greek philosophers to come up with it.
00:37:46.920 How come no philosopher had ever thought of it before?
00:37:49.080 So I should start with an example he starts with, which I think is a slightly awkward example
00:37:55.840 because it feels a bit kind of sexist and classist somehow.
00:37:58.980 But anyway, this is how Derek sets it up.
00:38:01.100 He imagines that there's a 14-year-old girl and she is wondering whether to have a child.
00:38:09.740 Now, that might be bad for the 14-year-old girl.
00:38:13.220 Let's put that issue to one side.
00:38:15.560 We also think that it would be bad for the child, that if a child is born to a 14-year-old mother,
00:38:23.560 that's a bad start for the child.
00:38:26.580 And so it would be natural to try and persuade the 14-year-old girl to wait 10 years or 15 years
00:38:33.300 or whatever before she has a child.
00:38:36.440 And then the child would have a better start in life.
00:38:39.160 I mean, that's the natural intuition.
00:38:40.540 But Derek points out something, as I say, that is bleeding obvious.
00:38:46.160 If the 14-year-old girl has a child, so long as that child's life is not worse than nothing,
00:38:54.080 that child won't regret being brought into the world.
00:38:58.240 And if that 14-year-old girl waits for 10 years and has a child, she won't have the same child.
00:39:07.120 She will have an entirely different child.
00:39:09.380 And so the question is, well, if she has a child when she's 14, who has she harmed?
00:39:15.700 It doesn't look like she's harmed anyone.
00:39:17.100 She hasn't harmed the child who is born to the 14-year-old because that child's life is better than nothing.
00:39:22.740 Now, normally, we think that morality must be about harming particular individuals.
00:39:29.900 If I throw a stone at another human being and hurt that human being,
00:39:35.420 I've done something wrong because I've hurt a particular human being.
00:39:38.540 If I just throw a stone on the ground and it doesn't hit anybody, I haven't harmed anybody.
00:39:44.100 I've done nothing wrong.
00:39:45.380 What's wrong is when you harm a particular individual.
00:39:48.360 That seems to be the basis of morality.
00:39:50.600 But Parfit spotted that there are lots of areas where it looks like we've done something wrong,
00:39:57.380 and in this case, the 14-year-old girl having a child, where nobody is harmed.
00:40:02.100 And he then extrapolates that to a whole series of policies.
00:40:07.200 So, for example, think about climate change.
00:40:09.620 If we did nothing about climate change, then, well, we're already feeling the effects of
00:40:16.020 climate change, but two or three generations down the line, people are going to have very bad lives.
00:40:23.340 There's going to be hurricanes, there's going to be typhoons, there's going to be droughts, there's going to be mass migration.
00:40:29.440 The world is not going to be a happy place.
00:40:31.900 But let's assume that, although it's not a very happy place, those lives are still better than nothing.
00:40:37.680 Now let's assume that we did something about climate change.
00:40:42.040 For example, let's say we did something drastic.
00:40:44.640 Let's say we said people could only drive on Mondays and Wednesdays and Saturdays, or people couldn't fly planes.
00:40:52.460 Now, if we did something drastic like that, that would affect who was born.
00:40:58.220 Each of us is a product of a unique sperm and a unique egg.
00:41:04.540 And if my mother had come home late on a particular day, or my father had been delayed for some reason,
00:41:11.540 I wouldn't be here to talk to you today.
00:41:14.820 I'm the product of a kind of unique union of sperm and egg at a unique time.
00:41:20.480 If you did something drastic like you stop cars, then you would change who exists.
00:41:26.240 I mean, and the way Derek puts it is, imagine that trains had never existed.
00:41:31.640 Which of us would still be alive today?
00:41:33.240 None of us would still be alive today, because our lives have all been changed by the invention of the train.
00:41:40.820 I can't remember if you do this in the book or not, but I feel like you or someone imagined whether we would all exist if Hitler hadn't existed.
00:41:50.060 How fully can we regret Hitler if half the world would be peopled by different people?
00:41:57.660 Yeah, I had a very annoying review of the book where the reviewer just didn't understand the point.
00:42:03.980 And I made the point that you made.
00:42:06.300 I said, I wouldn't exist if Hitler didn't exist.
00:42:09.540 And she said, well, your mum and dad, she wrote in the review, might still have met each other.
00:42:12.940 It's absurd.
00:42:13.940 I am actually the product of two Jewish refugees.
00:42:17.820 They would work not for Hitler.
00:42:19.840 They would not have arrived in England.
00:42:21.600 They would not have met each other.
00:42:22.640 There was no way I would exist if Hitler hadn't existed.
00:42:26.660 But that's true of almost everybody.
00:42:28.180 I mean, he changes the whole world.
00:42:29.700 So you wouldn't exist if Hitler hadn't existed.
00:42:32.240 Nobody would exist if Hitler hadn't existed, because he changes the world.
00:42:35.900 And actually, that raises interesting questions, which maybe we shouldn't go into, but it raises interesting questions about reparations and so on.
00:42:44.340 Because, you know, Hitler did obviously very bad things, but I owe Hitler my life, as it were.
00:42:52.800 So one has a complicated relationship with the past when one begins to think of it like that.
00:42:57.840 Yeah, so, but back to the case Parfit's making here.
00:43:02.820 So what is inscrutable about this, as you said, we have this intuition that for something to be wrong, there has to be a victim.
00:43:13.160 And the idea that there's a category of crime called a victimless crime has seemed to many of us to be just a logical contradiction.
00:43:23.080 And yet, here, Parfit is exposing these cases where something, there are better and worse options, and yet they're better and worse for no existing people, because the people are different.
00:43:40.320 Exactly. There is no one person for whom they are better or worse.
00:43:44.460 But nonetheless, we can still make the judgment that if we do nothing about climate change, we've done something very bad.
00:43:51.140 We can still make the judgment that the 14-year-old girl has made a mistake by having a child at 14 rather than delaying becoming a mother.
00:44:00.640 So the way he does that, and it's very easy with what he calls same number cases, is this:
00:44:05.980 if you've got a choice between a person X born at one time and a different person Y born at another, then the way to judge whether you should bring X or Y into the world is, well, who has the better life?
00:44:22.180 So the reason why we should do something about climate change, let's assume that if we got rid of cars and planes, it would help a great deal.
00:44:33.400 But let's assume, let's make the absurd assumption that it wouldn't affect the numbers of people being born.
00:44:38.000 On that absurd assumption, the reason why it would be a good thing to do is because two or three generations down the line,
00:44:46.140 the people who would exist will be better off than the people who otherwise would have existed had we done nothing about climate change.
00:44:54.260 And that's how he solves the conundrum.
00:44:55.880 Yeah. In some ways, you can capture this by putting the phrase, identity is not what matters, to slightly different use here.
00:45:05.140 It's just that the identities of the people isn't what is relevant for the moral calculus.
00:45:10.400 It's just the fact that there are people in each case, and in one case, the people are much better off.
00:45:17.400 Yeah. Yeah. And I think both in his arguments about personal identity and his arguments about future people,
00:45:23.740 he converges on a kind of consequentialist conclusion, which is, it's the consequences that matter.
00:45:30.440 We should prefer the better consequences rather than the less bad consequences.
00:45:35.260 In the case of personal identity, because my future self has become less like me, as it were, than we thought,
00:45:42.760 we should be more interested in the lives of others. That's a consequentialist conclusion.
00:45:47.360 When it comes to future people, we should prefer the outcome in which the people are better off.
00:45:53.100 That's another consequentialist conclusion. So lots of his arguments, although he comes at them from very different angles,
00:45:59.660 have very similar conclusions.
00:46:02.180 Yeah. There's something very Buddhist about him. I don't know whether he was ever aware of how reminiscent of Buddhism
00:46:09.000 so much of his reasoning is, but, you know...
00:46:12.480 He was. He was. There's an appendix, actually, in Reasons and Persons,
00:46:16.340 where he draws a connection between his views on personal identity and Buddhist views of the self.
00:46:21.580 And there's an anecdote in the book about how a philosopher goes to Tibet, I think, or North India,
00:46:31.320 and somehow they get a copy of...
00:46:34.440 They're talking about personal identity and the connection between Derek's views and Buddhist views,
00:46:37.900 and they give a copy of Reasons and Persons to this Buddhist monk.
00:46:44.460 The monks in Dharamsala, I think, yeah.
00:46:46.400 Yes, yes, yes. And then later on, the philosopher goes back and he discovers that the monks are reciting
00:46:52.520 or chanting extracts of Reasons and Persons.
00:46:56.260 Yeah, I'd forgotten that. Yeah, that's a fantastic story.
00:46:58.700 Okay, so let's talk about population ethics, as it's now called, and the repugnant conclusion,
00:47:06.560 which was the, again, one of these found objects of philosophy that he drove to
00:47:14.020 on the basis of a very clever thought experiment, which it's very hard to think about.
00:47:22.120 It does seem, in some sense, destructive of the whole enterprise of doing population ethics
00:47:28.940 in a consequentialist way. Let's just describe it. What is the repugnant conclusion?
00:47:35.060 Right, well, some of the arguments to get there are quite complicated, and sort of it helps to be
00:47:40.420 able to draw diagrams, which we can't do in a podcast. But I'll tell you what the repugnant
00:47:44.140 conclusion is. The repugnant conclusion is Parfit's claim that for any set of people, let's
00:47:52.120 imagine there are 8 billion people on the planet who, all of them, have very happy lives. For any set
00:48:00.420 of people like that, there must be a set of people much, much larger than that, maybe trillions of
00:48:08.200 people whose lives are barely worth living. They're just better than nothing. And that outcome is better
00:48:15.480 than the outcome where there are 8 billion very happy lives. Now, that seems an absurd conclusion.
00:48:23.860 Parfit called it the repugnant conclusion. He felt it couldn't possibly be right because it's so
00:48:29.420 counterintuitive. And yet, he thought that there were arguments which suggested that it was the right
00:48:36.540 conclusion to draw. And to the end of his life, he was trying to find a solution to the repugnant
00:48:42.060 conclusion, trying to find a way out of reaching the repugnant conclusion. But that's what it was.
00:48:47.360 It was that for any set of people, there must be a much greater number of people whose lives are
00:48:53.940 barely worth living, which is a better outcome than the first set of people whose lives are very
00:48:59.860 well worthwhile. I think that we can get people there without diagrams, although those diagrams in
00:49:05.220 the book are surely helpful. But if you just imagine a billion people who are
00:49:10.980 more or less perfectly happy, you know, that's one circumstance. And just imagine adding some more
00:49:17.240 people who are not exactly as happy as the first billion, but they're still remarkably happy,
00:49:23.140 far happier than any average population on earth today. Surely, if moral value is in any way
00:49:31.580 additive, right, if more is better in any sense, well, then surely more good lives is better than
00:49:38.960 the first billion.
00:49:39.840 Or at least, if I can just interrupt, at least it's not worse.
00:49:42.940 Right.
00:49:43.120 It seems like you can't say that if you add a few happy lives, even if they're not as happy as
00:49:48.220 the lives that already exist, if you add a few happy lives, you can't be making the universe worse.
00:49:53.360 Maybe you're not making it better, but you're not making it worse. But you carry on,
00:49:56.600 you're doing a very good job.
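To make the additive point concrete, a toy calculation with made-up numbers (Parfit's argument doesn't depend on the particular values): suppose population A has 10^9 people at welfare 100 each, for a total of 10^11, while population Z has 10^13 people at welfare 1 each, lives barely worth living, for a total of 10^13. If value is simply additive, Z's total of 10^13 exceeds A's total of 10^11, so the total view ranks Z above A, and iterating comparisons like this is what drives you toward the repugnant conclusion.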
00:49:57.300 I think, well, I think I would push it a little further than that. I think most people will
00:50:01.840 recognize it to be obviously better, because if you don't recognize that, then you seem to be
00:50:07.780 saying that, you know, the average is the most important principle. So let's just rule
00:50:15.040 this out in advance. Many people might be tempted to think that what we really care about is the
00:50:22.100 average happiness. But if average is what concerned us, then consider a world in which there's one perfectly
00:50:30.400 happy person and billions upon billions of extraordinarily happy people who are just
00:50:36.660 not perfectly happy. Well, those billions have brought down the average. So it would be better
00:50:42.140 just to have a world with one perfectly happy person. And all those billions of wonderful lives
00:50:46.880 can be annihilated to the advantage of the universe. That seems crazy.
00:50:51.780 Yeah. And you can also do it the other way around as well. I mean, the average view is bonkers
00:50:57.680 because the flip side of that is you can imagine that there are a billion
00:51:03.440 people whose lives are totally miserable, who are tortured the whole time. And then you imagine that
00:51:09.820 you bring in another person who's tortured only six days of the week and the other day kind of
00:51:15.840 scrapes by. Well, according to the average view, you've improved the universe by bringing in this
00:51:21.620 really horrible life because you've increased the average. That's obviously nuts. I mean,
00:51:25.760 the average view makes no sense at all. Back to you.
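The arithmetic behind that flip side, again with made-up numbers of our own: a billion people at welfare -100 have an average welfare of -100. Add one person at welfare -90 and the new average is

    \frac{10^9 \times (-100) + (-90)}{10^9 + 1} \approx -99.99999999

The average has risen, so the average view scores the addition of this near-constantly-tortured life as an improvement, which is the sense in which, as David says, the view is bonkers.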
00:51:29.420 Okay. So now we're adding people. So we started with the perfectly happy people and now we're adding
00:51:34.160 billions upon billions of, again, extraordinarily happy people. I mean, the happiest person you have
00:51:40.200 ever met on the happiest day in their life. If you'd like to continue listening to this conversation,
00:51:45.700 you'll need to subscribe at SamHarris.org. Once you do, you'll get access to all full-length
00:51:50.460 episodes of the Making Sense podcast, along with other subscriber-only content, including bonus
00:51:55.540 episodes and AMAs and the conversations I've been having on the Waking Up app. The Making Sense
00:52:00.900 podcast is ad-free and relies entirely on listener support. And you can subscribe now at SamHarris.org.
00:52:10.200 Thank you for listening to the Making Sense Podcast.