Making Sense - Sam Harris - November 27, 2023


#342 — Animal Minds & Moral Truths


Episode Stats

Length

1 hour and 2 minutes

Words per Minute

163.3

Word Count

10,134

Sentence Count

410



Summary

Peter Singer is often called the father of the modern animal rights movement, and was named one of the most influential people in the world by Time Magazine. He is an Australian philosopher and a professor of bioethics at Princeton, he has contributed to more than 50 books in over 30 languages, and he is the founder of The Life You Can Save, a non-profit which recommends various effective charities. His seminal book, Animal Liberation, has been revised and published under the title Animal Liberation Now, which is the main topic of discussion today. In this episode, Sam and Peter talk about the moral status of non-human animals, the ethics of moral hierarchies, speciesism, the scale of animal suffering, animal experimentation, the tragic case of Sam Bankman-Fried, some concerns about effective altruism, and the problems with focusing on existential risk. They also discuss the important work of Derek Parfit, whether there are objective claims to make about right and wrong and good and evil, Peter's forthcoming book, The Buddhist and the Ethicist, and other topics. This public feed carries only the first part of the conversation; full episodes are available to subscribers at samharris.org.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.520 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.880 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.220 Today I'm speaking with Peter Singer.
00:00:49.380 Peter is often called the father of the modern animal welfare movement, and was named one
00:00:54.700 of the most influential people in the world by Time Magazine.
00:00:58.360 He is an Australian philosopher and a professor of bioethics at Princeton.
00:01:04.180 He's contributed to more than 50 books in over 30 languages, and he's the founder of
00:01:10.580 The Life You Can Save, a non-profit which you can find online that recommends various
00:01:16.440 effective charities.
00:01:18.240 And his seminal book, Animal Liberation, has been revised and published under the title
00:01:23.720 Animal Liberation Now, which is the main topic of discussion today.
00:01:28.540 We talk about the moral status of non-human animals, the ethics of moral hierarchies, speciesism,
00:01:37.500 the scale of animal suffering, animal experimentation, the tragic case of Sam Bankman-Fried, some concerns
00:01:46.140 about effective altruism, the problems with focusing on existential risk, the comparative
00:01:52.540 nature of human suffering, the important work of Derek Parfit, whether there are objective
00:01:58.180 claims to make about right and wrong and good and evil, and other topics.
00:02:02.440 I should say, on the topic of effective altruism, both Peter and I continue to support it, just
00:02:09.960 with various shadings and caveats.
00:02:13.600 The crucial thing for me is that systematizing one's philanthropy seems like an objectively good
00:02:19.840 idea.
00:02:21.020 Deciding, for instance, to give 10% of one's pre-tax income away each year to the most effective
00:02:26.840 charities, that seems like a good thing.
00:02:29.440 And in my experience, it's a fairly revolutionary thing to do in one's life.
00:03:34.660 As many of you know, I took that pledge through Will MacAskill's organization, Giving What We
00:02:38.760 Can, and I've since heard from Will that over 10% of the members who have taken that pledge
00:02:45.080 have referenced this podcast in their explanation of why they decided to do that.
00:02:50.820 And that represents over $300 million in pledged donations, which is amazing.
00:02:56.420 And Will tells me that even on a conservative basis, which takes into account pledge attrition,
00:03:03.040 as well as how much would have been given away anyway, and factors time discounting, that's
00:03:08.700 worth at least $20 million in present value to top charities.
00:03:13.100 So that's fantastic.
00:03:15.000 Of course, this is the time of year where many people think about giving.
00:03:18.680 So if you want some recommendations there, I suggest you check out Giving What We Can
00:03:24.640 and Give Well.
00:03:27.540 You can see the charities we support at the Waking Up Foundation over at wakingup.com
00:03:32.440 slash foundation.
00:03:34.300 And you can also consult Peter Singer's organization, The Life You Can Save, which also recommends
00:03:39.440 effective charities.
00:03:40.980 And now I bring you Peter Singer.
00:03:43.200 I am with Peter Singer.
00:03:51.080 Peter, thanks for joining me again.
00:03:53.400 It's my pleasure, Sam.
00:03:54.720 So you have two books, two newish books.
00:03:58.460 One is a revision of your classic Animal Liberation.
00:04:03.420 Animal Liberation Now is the current title.
00:04:06.560 And then you have a new book coming out, which I haven't read, The Buddhist and The Ethicist.
00:04:10.740 And we can talk about, I want to talk about both of those, but let's jump into Animal Liberation
00:04:17.240 now because it's, remind me, the book first came out in '71?
00:04:22.620 1975.
00:04:23.620 75.
00:04:23.820 The book came out, yeah, that's right.
00:04:25.680 So it's not quite 50 years for the book.
00:04:27.900 It's 50 years since I first actually published something on this topic, which was an article
00:04:32.280 called Animal Liberation in the New York Review of Books in April 1973.
00:04:37.700 Right, okay.
00:04:38.520 Well, that has been, you tell me, you're often credited as being the real father of the animal
00:04:47.860 rights movement.
00:04:49.220 You know, you detail in the book some of the history of our callousness toward animals and
00:04:54.360 how we made some moral progress, however incremental.
00:04:58.200 What was your experience as a philosopher writing a book of such compelling social importance?
00:05:06.800 And that's not the common experience of academic philosophers.
00:05:09.940 So tell me what happened to your life when you wrote that book.
00:05:12.800 Yes, well, it was very interesting because I really had no idea what it would do to my
00:05:17.120 philosophy career at that stage, which was really just beginning.
00:05:21.240 And philosophy was just on the cusp of coming out of this ordinary language mode of philosophy,
00:05:28.140 as it was sometimes called, or linguistic philosophy.
00:05:30.540 And some of the leading philosophers in that area had expressed the idea that philosophy really
00:05:35.300 has nothing to say about what is right or wrong, doesn't give advice in ethics.
00:05:41.940 A.J. Ayer, who was a very prominent philosopher at the time, said, that's the business of the
00:05:46.380 politician or the preacher, and we should leave it to them.
00:05:49.320 But what I was trying to do was to write something that would be both intelligible to
00:05:53.760 ordinary people, but still of philosophical interest.
00:05:58.200 And I wasn't really sure whether that was possible, but I was so compelled by the need
00:06:03.480 to write this book that in a way, you know, if it had harmed my career in philosophy, well,
00:06:07.700 I could see myself having had a career as an animal activist, I suppose.
00:06:12.240 But fortunately, the reaction was actually very good from philosophers.
00:06:16.920 At least a few of them, I don't know.
00:06:18.760 The ones who wrote about it mostly welcomed it.
00:06:21.100 There were a couple who ridiculed it, but most of them said, you know, yes, this is important
00:06:26.660 and philosophy should get back on track.
00:06:29.860 In the 1970s, as I say, it was on the cusp of change because there were other philosophers
00:06:34.580 who wanted to discuss, for example, the war in Vietnam, the right to civil disobedience,
00:06:39.600 and of course, the civil rights movement, which had been unfolding in the United States
00:06:43.360 for more than a decade prior to that point.
00:06:46.560 And what was your experience of, because, you know, subsequent to your publication of
00:06:52.040 this book, you have been no stranger to controversy.
00:06:56.720 I mean, you're a, unlike almost anyone else in your line of work, you're often noticed
00:07:05.100 by the wider public in terms of how your arguments brush up against concerns about public policy
00:07:12.080 and things like euthanasia.
00:07:14.220 And we'll get into some of the reasons why, and we'll talk about the foundations of your ethics.
00:07:20.600 But what has been the experience of being an academic philosopher whose work is so often cited
00:07:29.040 to resolve or to confound questions of public policy?
00:07:35.360 Well, I've certainly enjoyed it.
00:07:36.660 I've felt it was important, if you're writing in ethics, to contribute to some of the deeper
00:07:43.980 ethical questions that, you know, underlie our decisions about life and death, for example,
00:07:49.640 about what we eat, about what we do with our spare cash.
00:07:53.720 Those are all important questions.
00:07:55.220 And to some extent, they're novel questions in that they're being asked in a different world
00:08:01.400 from the world of a century or two ago.
00:08:03.520 So, to me, it's been, in a way, the stimulus to work hard in ethics and philosophy that
00:08:11.920 I can have an influence and that these are important questions.
00:08:15.220 I'm not just writing for my fellow philosophers to read and ponder and write replies to.
00:08:20.980 I'm also trying to change the world for the better.
00:08:23.980 And that's a huge motivating factor.
00:08:27.460 Yeah.
00:08:27.680 Well, I share that aspiration, and I should say that your work has been very influential
00:08:33.840 in my life, both directly and also as a result of the other people you have influenced who
00:08:40.140 have, in turn, influenced me, people like Will MacAskill.
00:08:43.640 So you're also credited with being, in some ways, the father of the effective altruism movement,
00:08:51.600 which has suffered some PR wounds of late.
00:08:56.800 We can talk about that.
00:08:58.280 I mean, Sam Bankman-Fried was also on this podcast back in the day.
00:09:03.280 So I'd love to get into all that.
00:09:05.500 But let's talk about the revised book, Animal Liberation Now, and your central argument against
00:09:13.120 what you describe as speciesism.
00:09:16.100 And just to kind of make the case here over the course of a few minutes, what is our current
00:09:22.100 prejudice as you see it?
00:09:25.100 And what do you think would be ethically normative?
00:09:29.180 And how do we get there?
00:09:30.840 Right.
00:09:31.280 Well, I think our current prejudice still is that members of our species, members of the
00:09:37.300 species Homo sapiens, automatically, and just in virtue of being a member of that species,
00:09:44.900 have a higher moral status than any other beings.
00:09:49.680 And that means that we are entitled to use other beings for our own ends, even when those
00:09:56.960 ends are not absolute necessities, even when they're not saving our life.
00:10:01.380 But for example, because we prefer a particular taste, a particular kind of food, that that
00:10:07.360 entitles us to rear and then kill vast numbers of animals, and not even to give them minimally
00:10:15.520 decent lives, but to lock them in huge sheds by the thousands or even tens of thousands in
00:10:22.440 the case of chickens, just to produce their flesh more cheaply than we would be able to
00:10:27.920 do if we gave them a life that is more normal for them, in a flock of 30 or 40 hens or chickens
00:10:34.680 maybe, and running around outside.
00:10:37.840 So that, I think, is a prejudice.
00:10:40.780 And I use the term speciesism, which I didn't invent, but I found in a leaflet published in
00:10:46.420 the early 1970s by a man called Richard Ryder.
00:10:49.760 And to me, when I saw that word, it's like a light bulb went on.
00:10:54.620 Yes, there is something going on here that is parallel to racism or to sexism or some of
00:11:01.680 the other isms that we reject.
00:11:04.080 I say parallel, it's not exactly the same, obviously.
00:11:06.560 But in all of these cases, we have a group that is able to be dominant over others.
00:11:14.580 So at least, say, in the 18th century, when the slave trade was at its height, that group
00:11:19.840 was Europeans who had technology that Africans did not have and could capture or buy Africans,
00:11:27.820 send them on a horrible voyage across the Atlantic in a ship, and sell them into slavery.
00:11:33.480 And obviously, they could do that because they had that technology, and then they developed
00:11:38.260 an ideology which justified it.
00:11:41.140 The idea that Europeans are superior, maybe that we were even helping these Africans by
00:11:46.900 Christianizing them and then saving their souls, or finding verses in the Bible that justify
00:11:53.520 what we're doing.
00:11:54.540 There were slaves referred to in the Old Testament.
00:11:57.520 And similarly, with men over women, there's also an ideology that it's natural for women
00:12:02.920 to be subordinate to men.
00:12:04.600 And so women were denied equality in terms of, certainly in politics, they didn't have
00:12:09.560 the right to vote.
00:12:10.880 In some countries, they did not have the right to own property.
00:12:13.940 If they were married, their property automatically all belonged to their husbands.
00:12:17.740 So when it comes to animals, we have the same attitude.
00:12:21.860 We are dominant over them.
00:12:23.440 We can do all kinds of things to them that they cannot really resist.
00:12:26.800 And we justify that with, again, an ideology.
00:12:30.920 And the ideology might, once again, be a religious one.
00:12:34.800 It says in the book of Genesis that God gave us dominion over the animals, so they're ours
00:12:39.700 to do as we please with.
00:12:41.300 Or it might be that this is a natural arrangement in some way.
00:12:45.480 We've always done this, and therefore it must be okay.
00:12:47.840 But again, I think it's unjustifiable to think that species membership somehow makes a crucial
00:12:54.200 moral difference.
00:12:55.600 That, you know, just as being, we now recognize that being of one race or another or one sex
00:13:02.040 or another does not give one a right to rule over the other.
00:13:06.720 So I think we should recognize that being of the human species does not mean that whatever
00:13:12.520 your interests are, override the interests of another sentient being, that is, another
00:13:18.020 being who can feel pain, whose life could go well or badly, and whose pain, you know, humans
00:13:24.780 should not be ignoring.
00:13:26.160 We should be saying, yes, pain is pain.
00:13:28.580 It matters just as much whether it's experienced by a human or a cow or a dog or a chimpanzee.
00:13:36.120 What matters is how severe the pain is or how great the suffering is, but not
00:13:42.520 what species this is, this being who is suffering.
00:13:46.500 Well, I think most people have a natural or culturally acquired intuition that there is
00:13:54.780 a moral hierarchy here.
00:13:56.400 I often think about this in terms of what I'm now going to dub as the windshield test.
00:14:03.080 I mean, if you're driving home in a car and a bug splatters on the windshield, you may
00:14:09.860 not feel much of anything about it.
00:14:12.220 I mean, it's not, it's certainly not a moral emergency.
00:14:15.180 And the absence of feeling there is based on an intuition, however inchoate, that not
00:14:21.920 much has really happened, all things considered.
00:14:25.540 You have not, this is not a tragedy that you have to spend the rest of your life trying
00:14:31.980 to figure out how to find some emotional equanimity over because you don't attribute that much
00:14:38.280 sentience or perhaps any sentience to bugs, right?
00:14:42.400 So if I say it's a bee, and this is both positively and negatively valenced, right?
00:14:48.740 So in terms of their capacity to suffer, if it exists at all, you must imagine it's minuscule
00:14:55.700 compared to that of more complex animals and certainly compared to humans.
00:15:02.040 And in terms of the type of happiness they might have enjoyed, but for the fact that they
00:15:07.760 came into contact with your speeding car, the loss of opportunity for that enjoyment
00:15:12.700 is also not a tragedy.
00:15:15.560 But as you move up the hierarchy, you know, phylogenetically, as you, you know, if you run
00:15:20.300 over a squirrel or somebody's dog or in the worst possible case, a person, what you recognize
00:15:27.400 there with each step up is kind of the wider implications of suffering and deprivations of
00:15:35.680 happiness and also the, you know, the social context in which that may or may not be happening.
00:15:41.460 So with a dog, you immediately think of the owner of the dog and the suffering of that
00:15:47.200 person and the family, et cetera.
00:15:50.440 And with a person, if you run over somebody's child, this is the sort of, you know, life deranging
00:15:56.160 catastrophe that, you know, you may never get over given its implications.
00:16:01.680 Now, part of your argument suggests that that moral hierarchy is not ethically defensible
00:16:10.180 or at least not fully defensible as given, or if we're going to defend it by reference
00:16:15.220 to capacities, capacities for suffering and capacities for happiness, we have to recognize
00:16:20.260 that those capacities don't, in every individual case, track the boundaries between species.
00:16:25.900 So just react to my intuitive sense of there being a moral hierarchy here and how one might
00:16:32.660 justify it or not.
00:16:34.380 Well, I think what you said is really compatible with what I said before.
00:16:38.460 I said that pain is pain and it doesn't matter what the species is.
00:16:42.480 What matters is how much the being suffered.
00:16:44.700 Now, what you're suggesting is that a bug that hits our windscreen may not be capable of
00:16:51.000 suffering at all, and I agree, that's, it's certainly, we can't be certain that insects
00:16:55.740 feel pain.
00:16:56.820 And if it does suffer, then we assume that the suffering is in some way less than ours,
00:17:03.400 that the capacities for suffering are far less, you know, the bug has far fewer, vastly fewer
00:17:09.460 neurons than we have and, you know, may not suffer at all or may have a quite different
00:17:16.340 kind of suffering that is less than ours.
00:17:19.400 So I think that's what, that's what we hope is going on.
00:17:22.040 And if we get up to, you said, do you hit a squirrel or a dog perhaps?
00:17:26.940 I think if we do that and we get, we stop the car, hopefully, and we see the animal is
00:17:32.900 injured and not dead and presumably suffering from the injury, I hope we would be concerned
00:17:38.500 about that.
00:17:39.800 And if we think the injury is serious and probably this animal is not going to survive, I hope we
00:17:45.220 do something about it, actually, you know, and maybe people would be reluctant to do
00:17:49.320 this, but I hope that we find a big piece of wood or a rock and we crush the squirrel's
00:17:56.100 head so that the squirrel is not going to have a slow, drawn out death.
00:18:00.320 Because I don't think in the case of a squirrel or even a dog, really, it's the killing that
00:18:05.080 is so significant because I don't think that they are beings who, like us, live as much
00:18:11.560 over time, have a sense of their biographical life and have hopes about what they're going
00:18:17.220 to do in the future in the way we do.
00:18:19.560 So I think that if we're just talking about the fact that a being is killed, those cognitive
00:18:25.300 differences are morally significant.
00:18:28.520 But if we are talking about the pain that they're feeling, then I think they're less
00:18:32.180 significant.
00:18:33.040 They may, of course, as in the case of the bug and possibly in the case of the squirrel
00:18:37.060 and the dog too, there may be lesser capacities for pain, but I think we're on pretty shaky
00:18:42.700 ground if we assume that with other birds and mammals that there is a lesser capacity
00:18:48.720 for pain than we have.
00:18:50.940 It might be a different capacity.
00:18:52.460 It might be different things that make them suffer.
00:18:54.480 But I think most people who live with dogs would think that their dogs are certainly capable
00:18:59.520 of suffering quite acutely in some circumstances.
00:19:01.660 But one of the implications of your argument is that these distinctions cannot be made neatly
00:19:08.920 at the species boundary.
00:19:11.220 So for instance, if we had a, I think you used an example of an anencephalic child, you
00:19:17.000 know, a child born without a cerebrum, you know, just with a brainstem keeping the child
00:19:23.120 alive.
00:19:23.520 But, you know, there's zero hope of a fully human existence and probably no reason to
00:19:30.980 attribute consciousness to that child.
00:19:33.260 This is a human child, but not one destined to become a person, really.
00:19:39.900 And our intuition is that still this child is, you know, however compromised, is more important
00:19:47.840 than any non-human animal by virtue of her humanness.
00:19:52.720 And your argument seems to cut across that.
00:19:56.200 Because if we're going to, if suffering really is the point, if sentience is the point, and
00:20:00.360 if we want to extend that by reference to various mental capacities, I mean, you just added
00:20:08.040 this sense of, you know, biographical continuity in time and, you know, future expectations of
00:20:14.720 the future, if that, you know, further elaborates one's capacity for suffering and happiness,
00:20:21.160 well, then this child is not the locus of any of that and never will be, and therefore
00:20:26.660 has to be morally less important than any fully intact dog or squirrel or chimpanzee.
00:20:36.120 So these distinctions do kind of run roughshod over any kind of species boundary.
00:20:41.820 Yes, that's correct.
00:20:42.440 In a way, that's the other side of my view about species not counting.
00:20:49.700 On the one hand, it means that non-human animals, at least those capable of suffering or enjoying
00:20:55.900 their lives, matter more than we generally allow when we weigh the significance of their pain or suffering.
00:21:02.420 And on the other hand, the idea of the equal value of all life or of the lives of all members
00:21:10.780 of the species Homo sapiens is also criticized by the view that I take.
00:21:16.320 Because as you correctly point out, if there is a child who, an anencephalic infant has only a brain stem,
00:21:24.300 they don't have the rest of the brain.
00:21:26.220 They're not brain dead because there is a functioning brain stem and that means, you know,
00:21:30.760 they can breathe and the heart beats, but they will never be conscious.
00:21:36.220 So I think that actually their life in itself has less significance or importance than the life of
00:21:44.140 a dog or a cow or a pig because those beings can experience things and can have good lives
00:21:51.060 or bad lives from their perspective.
00:21:53.560 Now, I say with the infant taken in itself, if the parents somehow want this child to live
00:22:00.320 and want it to be treated, well, that's another factor to consider.
00:22:05.300 But the example I give in the book, which I think I use to illustrate how far we go in this speciesism,
00:22:12.280 is one of a baby who was born with anencephaly.
00:22:17.820 And the parents actually, recognizing that this was a tragedy and understanding correctly
00:22:23.420 that their child would never become a person, would never recognize her mother or smile at her mother,
00:22:30.380 wanted something positive to come out of this.
00:22:33.560 And so they asked if the baby could be an organ donor
00:22:38.180 and if her organs could be given to, for example, another baby who was born with a major heart defect,
00:22:45.160 as occasionally happens.
00:22:46.980 And when babies are born with heart defects, with what's sometimes called a hole in the heart,
00:22:52.540 it's very hard to get organs for them because, of course, there are very few babies who die,
00:22:59.600 or are injured, say, in a car accident and are brain dead and whose hearts might then be removed.
00:23:04.260 So there was some potential for something good to come out of this.
00:23:09.360 But the hospital said they couldn't do it because that would be killing a human being.
00:23:15.500 And the parents even went to court to try to get that overruled.
00:23:18.860 But the judge said the same.
00:23:20.320 You can't cut the heart out of a living human being,
00:23:23.080 even if that human being has absolutely no potential to ever become a person,
00:23:27.420 to ever walk around and enjoy their life or experience anything.
00:23:31.160 On the other hand, we do, of course, experiment on animals all the time,
00:23:35.180 including removing their hearts and trying to do transplants of their organs,
00:23:40.580 trying to overcome the rejection that typically follows if you take an organ from one species
00:23:45.140 and transplant it into another.
00:23:47.120 And we do this with a variety of animals, including baboons.
00:23:51.200 It has been done with a chimpanzee as well.
00:23:54.500 And, you know, without recognizing that, well, this being is far more conscious,
00:23:58.800 far more aware of the world, has far more of a life to live than the anencephalic baby.
00:24:04.560 Yeah, well, this actually connects to some other useful fictions, which,
00:24:13.260 I mean, so when you discuss this early in the book,
00:24:15.080 you say that this notion that all humans are equally valuable is not a statement of fact.
00:24:21.800 It's a prescription for how we should treat other human beings.
00:24:25.460 And so this is a statement of political equality and it is in some sense just a heuristic for
00:24:34.060 fairness and justice and arranging a sane society under some quasi-Rawlsian principle
00:24:43.380 that's going to ensure the best outcomes for most of us most of the time.
00:24:48.920 But it's not strictly true and it's not strictly true even in
00:24:52.860 situations quite a bit divorced from what we're currently discussing.
00:24:56.900 So you just imagine like a hostage situation where
00:24:59.360 a U.S. president is one of the hostages.
00:25:02.260 Well, no one imagines that that is going to be treated
00:25:05.460 the same way as any other routine hostage situation.
00:25:10.300 The U.S. president is going to be treated as more valuable
00:25:13.360 given his or her role in the world and et cetera.
00:25:16.540 And no one's going to think it's a total derangement of our ethics
00:25:20.820 or a few people will think it's a total derangement of our ethics
00:25:23.680 for us to be prioritizing saving the president of the United States
00:25:28.420 over any random person who may need their life saved on any given Thursday.
00:25:34.040 So this notion that all human beings are equally valuable is not something that
00:25:36.900 we can strictly factually defend.
00:25:39.280 But I think there is a...
00:25:41.760 It struck me in reading your book that there...
00:25:44.880 One analogy came to mind which, again, it's just a heuristic
00:25:48.960 but it may be ethically justifiable.
00:25:52.860 One question.
00:25:53.880 Peter, have you ever spent much time firing guns or working with firearms?
00:25:59.420 I have not, no.
00:26:01.220 This is a bit of an American obsession, I think,
00:26:03.220 and I did not grow up in the United States.
00:26:05.500 So then I'll just briefly educate you.
00:26:07.600 So when you're working with firearms, there is a dogma that one is wise to always observe,
00:26:15.100 which is to treat a gun as though it is always loaded, right?
00:26:19.140 And even if you, you know, if you're handing me a gun, you will check to see that it's loaded
00:26:24.180 and you'll check in a way that is quite redundant.
00:26:27.680 I mean, it really is a kind of an acquired obsessive compulsive disorder.
00:26:31.440 I mean, you'll look into the chamber and you'll look to see that there's no magazine in it
00:26:35.400 and then you'll look into the chamber again and then you'll look to see that there's no magazine in it
00:26:38.800 and you might even do that a third time before handing me the gun.
00:26:42.920 And even if I have just watched you do that, I too will check to ensure that the gun isn't loaded.
00:26:49.220 And even once it's been established that it's not loaded,
00:26:53.200 I will still treat it as though it is loaded,
00:26:55.660 which is to say I won't randomly point it in your direction
00:26:58.440 or in the direction of any living being that I wouldn't want to put at risk, right?
00:27:04.420 So that kind of discipline is the only thing that ensures that as you spend more and more time around guns,
00:27:10.820 the probability that you are going to get killed or injured by them inadvertently
00:27:15.960 or kill and injure somebody else isn't going to increase intolerably over the months and years.
00:27:22.380 But it is in fact true that when, you know, if you have just handed me a gun,
00:27:26.860 which I've watched you check to see whether or not it's loaded,
00:27:29.300 and I've checked it once and I've checked it twice,
00:27:32.240 it's not true that I actually believe the gun might still be loaded.
00:27:37.780 I know it's not loaded.
00:27:39.460 I've seen you check it.
00:27:40.460 I've checked it once.
00:27:41.820 And now when I'm checking it again, I'm engaged in a kind of religious ritual, right?
00:27:47.400 And yet it is a truly wise one that has consequences in the real world.
00:27:52.680 And to not observe this kind of redundancy, you know, has obviously negative consequences.
00:27:58.100 And every negligent discharge of a firearm that results in the injury or death of some innocent person
00:28:03.880 is always the result of a failure to practice this kind of obsessive attention to this sort of detail.
00:28:11.280 So this analogy occurred to me in defense of something like speciesism with respect to people.
00:28:20.200 And so the valuing of a life that we know is not actually valuable in the case of a given human being
00:28:27.680 is a kind of bulwark, you know, an attitudinal bulwark against some of just the most obscene
00:28:35.900 departures from normativity that we know we've committed in our past.
00:28:42.000 I mean, when you just look at the, you know, the Nazi doctors or, you know, the Nazis prior to the Holocaust
00:28:47.260 in full swing just deciding, okay, well, it's time to sterilize all of these mental defectives.
00:28:53.740 And, you know, now that we're sterilizing them, why don't we just start killing them en masse
00:28:57.920 because these aren't lives worth living?
00:28:59.380 I mean, there's a slide into evil which could be prevented if you just had this sort of blind heuristic
00:29:09.600 which is to value human life simply because it is human at every stage of its capacity,
00:29:19.360 whatever its individual capacities.
00:29:22.480 And I just wanted to get your reaction to that.
00:29:24.960 Yes, that argument was widely used in the debate about voluntary euthanasia when it first got started
00:29:33.900 in the Netherlands in the 1980s when doctors began assisting patients to die on their request.
00:29:42.400 And the courts allowed them to do that saying that they faced a conflict of duties because of the
00:29:48.180 unbearable suffering of their patients.
00:29:50.180 And many opponents said, this is going to lead to a slippery slope.
00:29:54.800 We will end up killing off people who are, you know, regarded as useless or intellectually disabled
00:30:01.360 or even politically undesirable.
00:30:05.000 And that argument in the early 1980s perhaps seemed to have some weight.
00:30:11.820 It was something of an unknown.
00:30:13.220 We were not familiar with the idea of the legal or open performance of voluntary euthanasia or medically assisted dying.
00:30:21.940 Actually, Peter, let me just add one caveat to it, because as I put that forward,
00:30:27.360 it never occurred to me that it would be an argument against euthanasia in the case of there being real suffering
00:30:34.140 that we are preventing, right?
00:30:35.680 So I fully take your point that, you know, excruciating suffering is something that we want to relieve,
00:30:42.160 all things being equal.
00:30:43.400 And if euthanasia is the only way to do it, well, then the door is open to that.
00:30:47.380 It's more this argument that even in the case of a person who's no better than an animal,
00:30:54.240 and in fact, quite obviously worse, right?
00:30:57.180 You know, the person in the lifeboat who is less intelligent than the most intelligent chimpanzee for whatever reason,
00:31:06.080 it still might make sense to privilege their humanness over the chimp given a triage situation.
00:31:14.300 And looking strictly at capacities that cross the species boundary seems like perhaps a dangerous way to go.
00:31:23.520 That's more the argument I guess.
00:31:25.160 Okay. So in that case, it's not the suffering of the being whose life we are ending that is relevant.
00:31:31.440 But I do think we need to look, again, back at the other side of this,
00:31:35.960 because we are then still preserving the idea that every member of the human species is in some way more important,
00:31:42.420 or their lives are more sacrosanct or inviolable than every member of any other species.
00:31:48.580 So at a moral level, we're preserving this gulf between humans and non-human animals.
00:31:55.160 Now, your argument suggests, well, we're doing this to prevent a kind of slippery slope that gets us to Nazi holocaust or something like that.
00:32:04.020 And, you know, obviously, I don't want to take steps down that particular slope.
00:32:09.380 But on the other hand, it also allows us to treat the non-humans in this way that I think is totally horrendous and is on such a vast scale that you don't want to say this really,
00:32:24.120 but the scale of it is far greater than anything that has happened to humans because there are only 8 billion humans on the planet.
00:32:33.080 And each year, at present, we are raising and killing for food about 200 billion vertebrate animals, and inflicting miserable lives on them.
00:32:49.440 And I think that that's a huge cost that we need to think about as offsetting the risk.
00:32:57.540 I would see it as a rather small risk, but, you know, I don't deny that I'm not saying it's zero risk.
00:33:04.000 I think that breaking down this barrier and suggesting that, for example, in the case of that anencephalic baby,
00:33:10.120 it would have been all right to remove her heart and give it to a baby who needed a new heart,
00:33:16.200 that that kind of treatment is going to lead to these really, you know, great evils that we have certainly seen in the past.
00:33:24.040 And, of course, that we still see in different ways, although not quite in the same way that the Nazis did it.
00:33:30.360 Hmm. Well, I want to talk about some of the suffering you detail in your book.
00:33:36.060 But before I do, I just have a—I kind of want to jump to an ethical punchline question.
00:33:44.180 Because, you know, the cash value, the ethical cash value of everything we're going to talk about
00:33:50.500 comes down to questions of suffering and, you know, whether the lives of certain animals are net negative
00:33:58.680 based on, you know, how we raise them and how we treat them and how we kill them.
00:34:03.360 But what if—and I think, you know, and we'll get into some of the details,
00:34:06.980 but, you know, I don't think many readers of your book would be tempted to defend
00:34:12.620 most of the details you describe in your book around the use of animals in experiments
00:34:20.320 and their treatment when raised for food on factory farms.
00:34:25.260 I mean, the details really are appalling.
00:34:27.000 But what about the more enlightened or most enlightened smaller organic farms that may or may not yet exist?
00:34:37.060 I mean, just the ideal condition of, you know, pasture-raised cows or chickens, say,
00:34:42.900 if we agreed that animals raised under those conditions, you know, albeit raised for food
00:34:50.500 and eventually killed, live net positive lives, right, which is to say it would be better
00:34:55.720 to have been such an animal than to have not been at all, right?
00:35:00.220 Certainly better than to have been a wild animal that lives its entire life fleeing predators.
00:35:05.760 And it's certainly quite unlike what is happening on our industrial-scale factory farms.
00:35:14.760 If such idyllic or organic farms exist, and certainly some of them are currently advertised to exist,
00:35:23.180 would eating those animals be not only ethically permissible, but better than shunning all animal agriculture?
00:35:33.060 I think that's a very good question.
00:35:34.680 And quite a difficult question.
00:35:38.060 I accept that there are a small number of farms that do treat animals well, that give them good lives.
00:35:45.420 In the end, they kill them.
00:35:46.480 But especially if they can kill them on the farm, which in the United States is only possible for small animals
00:35:51.740 like chickens and ducks and rabbits, because otherwise you're not really allowed to kill them on the farm.
00:35:57.320 You have to take them to a slaughterhouse.
00:35:58.540 But if they have good lives and they die without suffering, I think there is a case for saying that we are not harming them by purchasing those products
00:36:10.300 because, on balance, their life was a good thing.
00:36:13.400 And if nobody purchased those products, then clearly they would not have existed at all.
00:36:19.900 Now, that gets you into this quite difficult philosophical argument that was first raised by the Oxford philosopher Derek Parfit
00:36:27.020 in his book, Reasons and Persons, about whether bringing more beings into existence, if they're going to lead good lives, is actually a good thing.
00:36:35.840 And of course, here there's also the question, does it in some way compensate for depriving an existing being of their life?
00:36:41.840 But, you know, I'm prepared to say that the answer to that question, maybe yes, it is a good thing to bring beings into existence
00:36:48.200 if they're going to live good lives.
00:36:50.000 And if the only way to do that is to, at some point in their life, kill them without suffering and sell their products,
00:36:57.480 that might still be, overall, something that you can defend.
00:37:02.140 So, to that extent, although I'm a, well, I call myself a flexible vegan, I'm always vegetarian,
00:37:09.080 and I'm vegan when I'm shopping and buying for myself, but it's not always easy to stick to that when traveling or moving around.
00:37:16.640 So, although that's my preferred way of eating, I don't really reject people who,
00:37:23.180 they're sometimes called conscientious omnivores, who really seek out these small places where they can be confident
00:37:30.820 that the animals have had good lives.
00:37:32.540 And I think it's difficult to find because you can't always believe the labels.
00:37:37.280 I think you really need to visit the farm and talk to the people who run it and make your own judgment
00:37:42.300 about how genuine they are in terms of what they're doing for animals.
00:37:45.700 But I'm not going to deny that some of them do exist.
00:37:49.240 So, you know, many of my fellow animal rights activists would say,
00:37:53.900 no, that's still a violation of the animal's rights.
00:37:56.060 You're still using it as a means to your end.
00:37:58.200 But I, as I say, to me, that's a difficult argument to make.
00:38:03.140 And I'm certainly not going to say confidently that that argument is wrong
00:38:07.200 and that that's why you should be a strict vegan or vegetarian.
00:38:12.820 But I am just pointing out that, you know, fewer than 1% of the animal products
00:38:18.400 produced in the United States or other affluent countries would meet that criterion,
00:38:23.920 quite possibly a lot fewer than 1%.
00:38:26.400 So it's not going to sustain the kind of diet that most Americans or most people in affluent countries will be eating.
00:38:35.400 And at the very least, we would need to drastically reduce our consumption of animal products
00:38:39.960 in order to be able to only limit it to animals who've had good lives.
00:38:44.340 Yeah, well, you mentioned Parfit.
00:38:46.960 It reminds me, I want to get to Parfit, too.
00:38:49.320 I think, you know, you're referencing the somewhat fraught discussion of population ethics
00:38:56.380 and whether it is justifiable in the end to just talk about the, you know, aggregate suffering and well-being
00:39:06.140 and how to do the moral math there.
00:39:09.080 But the math is, however we do it, it's at least implicit in more or less everything we say on this topic
00:39:16.760 because, I mean, as you said yourself, just the sheer magnitude of animal suffering
00:39:21.240 is what raises it to the current level of moral concern that you are giving it, right?
00:39:27.980 I mean, the fact that you're giving numbers like 200 billion animals a year,
00:39:32.320 the reason why that is more important than most other things or really all other things
00:39:37.060 is because of the numbers, right?
00:39:39.520 It's because of some sense that more is different,
00:39:42.700 which is to say more suffering spread over billions is more important.
00:39:48.520 And if we could reduce the number of animals treated in appalling ways,
00:39:53.260 well, then that would be making progress toward the good.
00:39:57.420 And it's just, perhaps we even spoke about the repugnant conclusion
00:40:01.660 and other paradoxes thrown up by Parfit's work in previous podcasts,
00:40:05.260 but it is, in fact, difficult to do the math under certain rather novel framings
00:40:12.240 of the sort that Parfit seemed to produce, you know, every minute of the day for decades.
00:40:18.420 But, I mean, it strikes me as morally uncontroversial
00:40:22.900 that the misery and death of X number of people or animals is just, you know,
00:40:29.040 all things being equal is not as bad as the misery and death of 1,000 X number
00:40:34.540 of the same people or animals.
00:40:36.960 And that's, I think, you know, that's everyone's intuition.
00:40:40.180 We'll certainly agree on that, yes.
00:40:41.720 Yeah.
00:40:42.420 Okay, so actually the most disturbing chapter in your book,
00:40:46.140 I think it was probably because I was much less familiar with the details,
00:40:50.060 was for me the chapter on animal experimentation,
00:40:53.780 which is, I have to say, almost unbelievable:
00:40:58.400 you know, especially in psychological science, we did and continue to do,
00:41:04.760 it sounds like, these experiments that just seem not only pointless and unnecessary,
00:41:11.600 but just sadistic and insane to the point where I don't know how these experimenters
00:41:17.680 do this type of work, and I certainly don't know how they attract graduate students.
00:41:22.340 What is the state of current practice now, and what sort of pressure has been put on
00:41:28.660 the scientific community to stop these types of experiments?
00:41:33.600 I mean, feel free to describe what I'm talking about, but I'm thinking in particular of,
00:41:37.260 you know, the learned helplessness experiments that supposedly offer some models of depression
00:41:43.020 or PTSD.
00:41:43.820 And it was quite amazing to discover that Martin Seligman, who's often credited as the
00:41:49.840 father of positive psychology, was among the people who have done these experiments and
00:41:55.080 seems to have endorsed them as important even up to the present.
00:41:59.380 The details are actually jaw-dropping, so I don't, you know, we don't need to be pointlessly
00:42:05.220 gruesome here, but it's just amazing what you describe in your book.
00:42:10.600 Yes, I have to say, I was really disturbed in writing that chapter.
00:42:17.300 I was disturbed in writing the experimentation chapter in the original edition in 1975, but
00:42:23.280 I had expected things to have improved more than they have.
00:42:27.260 I'm not saying they haven't improved at all.
00:42:29.700 They have, but there is still, as you say, a lot of quite horrendous things continuing.
00:42:35.980 And I expected things to have improved because one of the things that has happened, as a
00:42:40.540 result, I suppose, of pressure on scientists from the animal movement, was the introduction
00:42:45.720 of what are generically known as animal experimentation ethics committees, but in the United States
00:42:52.100 are known as institutional animal care and use committees.
00:42:55.760 And these are committees that look at proposals for experiments from people in the institution
00:43:00.300 that may be going forward for applications for funding or just to be done.
00:43:05.280 And they are supposed to vet the experiments.
00:43:09.940 And I'd been led by some people to think that they were doing an effective job.
00:43:15.160 Steven Pinker, for example, wrote in Better Angels of Our Nature that when he was a graduate
00:43:21.800 student in psychology, he did what he describes as one of the worst things he's ever done.
00:43:26.740 And he himself describes it as torturing a rat to death.
00:43:29.880 He didn't really mean to torture it to death, but he set up an experiment, left it overnight
00:43:34.760 and in the morning the rat was dead.
00:43:36.540 And he concluded that it had effectively been tortured to death because it was getting electric
00:43:40.620 shocks and had not learned to stop the shock in the way that he or his supervisor had expected
00:43:46.920 the rat would learn to stop the shock.
00:43:49.900 But he then says, you know, well, that happened, whatever the date was in the 1960s, I guess,
00:43:54.900 when Steven was a graduate student, but the difference now is like the difference between
00:44:00.300 night and day.
00:44:01.960 And unfortunately, it's not.
00:44:04.620 Unfortunately, there is still a lot of research that gets through these institutional animal
00:44:09.940 care and use committees that really should not be done.
00:44:13.280 That is clearly very painful and distressing to the animals.
00:44:17.280 And that is being done in the United States, but also in many other countries.
00:44:22.040 So the learned helplessness experiments that you talked about, that Martin Seligman and
00:44:27.440 others were involved in, was an attempt to produce an animal model of depression.
00:44:35.100 And the idea was that you train an animal to be able to escape an electric shock.
00:44:41.580 So a dog, for example, would be put in a sort of cage enclosure that had two sides to
00:44:48.460 it and it had a wire floor on both sides, but you could electrify the floor on one side and
00:44:55.000 the dog would then feel the shock and would rapidly jump onto the other side of this box
00:45:02.080 where there was no electric shock and it would learn to do that.
00:45:04.460 But then at some point, you put up a barrier so the dog can't jump away from the electric
00:45:08.860 shock.
00:45:09.820 And as the experimenters themselves describe, after a large number of attempts to escape,
00:45:15.040 and I think they used terms like running around, urinating, defecating, yelping, the
00:45:20.720 dog will eventually give up the attempt to escape and will simply lie on the electrified
00:45:25.400 floor and passively accept the shock.
00:45:28.880 And the idea was that this would be in some way a model for depression and that maybe we
00:45:33.580 would learn to treat depression, which of course is a terrible condition when humans
00:45:37.260 have severe and untreatable depression, from doing this to dogs.
00:45:41.080 But this went on for decades and we never learnt anything that enabled us to treat the severe
00:45:47.400 cases of depression from this.
00:45:50.480 And although it's now accepted by the experimenters themselves that this was not a good model of
00:45:55.840 depression, in fact, and that even the label they'd given to it, learned helplessness, turned
00:46:00.200 out to be wrong because it wasn't actually learned behaviour.
00:46:03.400 It was something that was more biologically innate.
00:46:05.920 But, you know, after they'd given up using that as a model for depression, somebody then
00:46:11.500 had the bright idea of saying, well, maybe this is not a good model for depression, but
00:46:14.900 how about post-traumatic stress disorder, PTSD, which is a problem that, you know, we talk
00:46:20.380 about a lot now.
00:46:21.640 So then they said, well, yeah, could we use it for that?
00:46:24.900 And then they said, hmm, but maybe it's not traumatic enough or, you know, maybe there's
00:46:30.540 a theory that PTSD comes from early childhood abuse and then that is reignited by a later
00:46:37.040 traumatic event.
00:46:38.820 So one of the experiments I describe, done with rats, attempts to replicate this, saying,
00:46:44.660 okay, we'll give them some abuse when they're very young and then we'll abuse them
00:46:49.300 again when they get older.
00:46:51.360 And they set up things where they do a whole variety of different forms of trauma to them.
00:46:57.160 So one is giving them inescapable electric shock, another is dropping them into a
00:47:03.840 sheer-sided container of water where they have to swim to stay alive, and you have what's called
00:47:11.220 the forced swimming test.
00:47:12.720 You leave them there for 20 minutes where they can't get out of this container, they have to keep
00:47:16.960 swimming and swimming, and probably they're fearful, of course, of getting tired and drowning.
00:47:23.760 Then another one is you immobilize them.
00:47:25.720 You put them in plastic cones where they can't move at all.
00:47:29.160 They're completely immobilized in all their limbs.
00:47:32.160 And yeah, or you may deprive them of food.
00:47:35.460 So again, a lot of pain and distress is being inflicted on animals.
00:47:40.040 And is this really a model of human post-traumatic stress disorder?
00:47:43.600 It seems very unlikely because it seems that we have a different kind of awareness of what's
00:47:48.800 going on.
00:47:49.360 And we talk about it with other people.
00:47:52.420 It's a social thing as well.
00:47:54.800 We may feel that we were humiliated because we were abused and mistreated, which perhaps
00:47:59.320 a rat does not feel or not feel in the same way.
00:48:02.580 So it is a kind of continuation of saying, well, we use animals in this way.
00:48:06.960 What else can we think of that we will use and abuse?
00:48:11.920 But very little of this translates to being useful for humans.
00:48:17.280 You can't say it's zero, but there's an immense amount of pain and suffering inflicted on animals
00:48:22.240 in the hope that it will do some good for humans.
00:48:24.940 But it very rarely does.
00:48:27.500 And of course, it uses a lot of resources.
00:48:29.280 It takes money.
00:48:29.980 It takes talented scientists to work on this.
00:48:33.800 And who knows, if we used those resources and those talented people to more directly try
00:48:39.320 to treat people with the disease, to do research and clinical studies of humans with the condition,
00:48:44.900 maybe we would have got better treatments for these conditions without abusing animals.
00:48:49.980 Well, also in the context of that discussion, you make the quite astute point really devastating
00:48:58.660 for the whole enterprise, which is to say that if this work really is to translate into
00:49:04.160 our understanding of human suffering, it will only translate because these animals are deeply
00:49:10.080 analogous to humans in their suffering.
00:49:12.540 And if they are deeply analogous to humans in their suffering, well, then that makes our
00:49:16.500 mistreatment of them all the more odious.
00:49:19.980 Ethically speaking.
00:49:21.080 So insofar as this work could be useful, it approaches the ethical asymptote of just
00:49:29.080 the monstrosity of treating other sentient beings in this way.
00:49:34.180 And insofar as they're not at all analogous and the suffering is, you know, in a Cartesian
00:49:39.080 sense, really just an illusion.
00:49:41.180 Well, then why are we doing the work in the first place?
00:49:44.260 Exactly.
00:49:44.860 Yes, that's right.
00:49:45.580 It's a dilemma that people who do this kind of work on animals in psychology
00:49:49.560 have to face.
00:49:51.480 Either the animal is really like us in terms of its psychology and its mind, its mental
00:49:56.320 states, in which case, how can we possibly justify doing this?
00:50:00.560 Or it's completely unlike us.
00:50:02.620 And then what are we going to learn from doing these things to the animal?
00:50:06.420 So yeah, I don't think they can win that argument.
00:50:08.720 Yeah, I think the most depressing studies you cite in the book, I've now forgotten whether
00:50:15.980 these are more from the first edition or whether this kind of work has continued up to the present,
00:50:22.040 perhaps you can tell me.
00:50:22.840 But it's all the maternal deprivation stuff with monkeys and apes where, you know, instead of
00:50:28.540 access to their actual mother, they're given access to
00:50:33.660 a wireframe simulacrum of a mother.
00:50:37.660 But, you know, every sadistic permutation of this seems to have been explored, including
00:50:44.020 mothers that, you know, pointlessly shock them or stab them or, you know, screech with
00:50:48.740 loud sounds.
00:50:49.640 Or, I mean, if you just imagine an alien race coming to Earth and beginning to treat
00:50:55.140 us this way, you know, the only theory of mind you could have about them is it's just
00:50:59.840 pure evil, right?
00:51:01.860 I mean, it's just like there is no greater evil than the adumbrations of those experiments
00:51:06.980 that some brilliant grad student or his or her supervisor has designed.
00:51:13.120 Does that work still continue?
00:51:15.220 And if so, what is the possible justification for it?
00:51:18.320 Well, that work went on for a very long time.
00:51:22.000 Harry Harlow was the one who really started this series of experiments back in the 1950s.
00:51:26.880 And then, yeah, I mean, he really did horrendous things, as you say, that you have to suspect
00:51:32.500 there was some kind of sadism behind this from the things he did and from the way he wrote
00:51:37.780 about it, right?
00:51:38.440 I mean, he created what he called a tunnel of terror to frighten these monkeys
00:51:44.140 to see what, you know, pathology, mental pathology developed.
00:51:48.660 And then he got these neurotic female monkeys and he wanted to see how they reacted with
00:51:54.860 their own babies, but they wouldn't allow the males to mate with them.
00:51:58.680 So then he constructed what he calls in his own paper, a rape rack, basically tying the females
00:52:04.080 down so that the males could rape them.
00:52:07.000 And, you know, then sees how they are, what sort of mothers they are with their babies.
00:52:12.020 And he describes how one of them takes its baby face down on the floor and rubs its face
00:52:18.900 across the grid of the wire cage.
00:52:21.320 So, you know, there's generations of suffering that he is causing.
00:52:25.960 So he trained his graduate students to continue to do this work.
00:52:29.840 One of them was Stephen Suomi, who continued to get large grants from the National Institutes
00:52:36.420 of Health in the United States, supporting this research.
00:52:40.000 Until finally, People for the Ethical Treatment of Animals, PETA, as it's more commonly known,
00:52:44.980 the animal group, aroused opposition to them.
00:52:48.080 And I think they stopped in 2015, something like that.
00:52:51.820 But so these experiments had gone on for 60 years, this vein of experiments.
00:52:56.840 As far as I know, they're not going on now.
00:52:58.840 I certainly hope they're not going on now.
00:53:00.380 But obviously, they went on far too long.
00:53:02.620 And there are other things that may be almost as bad that are still continuing.
00:53:08.400 Well, so now this is a line here that, you know, I don't know how to specify it.
00:53:14.020 But I mean, my associations with PETA as an organization, you know, albeit distant associations,
00:53:18.820 I've never had any direct experience of the group.
00:53:21.560 But I just, from the kinds of protests I've heard them, you know, perform, it has seemed
00:53:28.160 to be against all animal experimentation, no matter how seemingly sane and necessary it is,
00:53:36.880 right?
00:53:37.100 So, like, granted, the experimentation you describe in your book is something that I see
00:53:42.880 no ethical justification for.
00:53:44.880 But again, I believe I can imagine the judicious and careful use of non-human animals for the
00:53:52.320 purpose of mitigating the most appalling forms of human suffering.
00:53:56.640 And that there may not be any, you know, computer simulations that can come to the rescue here
00:54:02.840 to make that kind of work no longer necessary, right?
00:54:06.140 So, you know, I'm not sure what the best examples are at this point, but I imagine there are some.
00:54:12.520 So, can you speak to that issue of just the potential animal rights extremism here that would
00:54:18.100 prevent us from figuring out how to cure our children's cancers or spinal cord injuries or
00:54:23.900 whatever it is?
00:54:25.220 Right. So, I think actually that PETA does tend to focus on the experiments which are
00:54:30.220 not curing our children's cancers, and that's obviously good tactics if you want to change
00:54:36.020 something. You don't want to tackle the hardest cases, you want to tackle the cases that will be
00:54:40.560 more widely accepted by the public. But there's a lot of pressure, of course, you know, as with
00:54:46.760 any group or political group of activists or lobbyists, there's a lot of pressure to sort of
00:54:52.700 stick to a party line and not allow much nuance in your position. And I think that that's probably
00:55:00.280 responsible for the fact that there certainly are organizations against animal experimentation.
00:55:06.220 Whether or not PETA is one of them, I'm not quite in a position to say. That would say,
00:55:09.980 we're against all animal experiments, even the ones that will cure the cancer of your children.
00:55:16.420 So, I'm not in that position. I do think that you have to accept that there can be potentially
00:55:23.880 justifiable experiments. Now, always, of course, I think first you should see, as in fact you
00:55:29.640 suggested, is there some non-animal-using way we can make progress on this issue? And you mentioned
00:55:35.420 computer simulation. For some cases, it might be growing cells in vitro, where there's no conscious
00:55:41.240 being, but just cells that are being worked with. There's a whole range of fields of developing
00:55:47.080 alternatives to animal research. In fact, at the end of August, I attended a conference at Niagara Falls,
00:55:53.140 at which there were several hundred scientists from 40 different countries, all exchanging notes
00:55:58.480 on where they were making progress. But, you know, I certainly acknowledge that there are
00:56:04.380 experiments going on now that we cannot replace with non-animal-using methods. And some of them
00:56:10.020 will have benefits that are sufficient to say, yes, reluctantly, at the moment, we are justified in doing
00:56:17.680 this with animals, while trying to minimize their suffering to the greatest extent possible. One example
00:56:24.380 that I give in the book is research to alleviate the symptoms of Parkinson's disease, which many
00:56:30.340 listeners will know, somebody who has Parkinson's disease or may have it themselves in their early
00:56:35.440 stages. And it is a terrible condition. And it affects millions of people worldwide. So if you
00:56:44.200 have something that really has good hopes of curing the disease, or, in the case of the
00:56:50.080 research that I mention in the book, alleviating the symptoms of the disease, which is an important
00:56:54.480 part since, of course, it's a slow-acting disease, I think, you know, you could defend that if there were
00:56:59.900 really no other way to find that treatment. So, yeah, it makes life more complicated to recognize
00:57:06.460 that. And then you have to start drawing lines. It's a lot easier if you just say, well, no harmful
00:57:10.900 experiments can be done on animals at all. But the cost to that is that you have far smaller chances of
00:57:18.240 actually obtaining public support. Because, of course, the lobby that wants to continue to do
00:57:23.040 animal experiments includes not only the scientists, but also the big commercial companies
00:57:27.920 like Charles River Laboratories that produce millions of animals for use in laboratories and
00:57:34.040 make good profits from it. So the lobbyists are well-funded, and they will place ads, you know,
00:57:39.580 basically saying, your child or the rat, here are these fanatics who want us to stop, you know,
00:57:45.800 promising research to save your child from cancer, and they want to stop it because we're using rats.
00:57:51.900 And clearly, at this stage anyway, the public is going to say, oh, I'll choose my child, thank you
00:57:57.780 very much, not the rat. Yeah, yeah. Yeah, well, this is one of those instances where the perfect
00:58:04.680 and the pure can be the enemy of the good. And yeah, I would agree that it's important to be
00:58:11.460 pragmatic as well as principled here. But let's talk about effective altruism, because this is
00:58:16.960 this is something you and I have spoken about before. I've certainly spoken
00:58:22.820 to Will MacAskill many times on the podcast, and Toby Ord at least once. This is a movement that
00:58:31.120 I've never been officially an adherent of, though I've been very directly informed by it.
00:58:39.000 You know, its representation online, in the branded culture of effective altruism,
00:58:48.460 has always struck me as not perfectly passing the smell test. I mean, there's something quasi-cultic
00:58:55.220 about it or dogmatic or something that always concerned me, and this is something I've spoken
00:59:00.120 to Will about, at least. But generally speaking, it seemed like a major advance over the normal way
00:59:07.420 of approaching philanthropy, which is just to let your sentimentality and good feels be your guide
00:59:13.280 and to really have no rational accounting of the good you're doing or the harm you may be causing
00:59:18.720 apart from that. So it's, you know, it really has informed the way I give to causes very directly.
00:59:26.320 And then here comes Sam Bankman-Fried, who was very clearly the poster boy for the ultimate
00:59:33.460 instance of what effective altruists call earning to give, where, you know, you're a smart person
00:59:40.060 whose talents could be easily monetized, and you recognize that rather than joining an
00:59:45.320 organization like, you know, Doctors Without Borders, where you're explicitly doing good
00:59:52.360 in the world, it would be better to earn all the money you can and support those organizations,
00:59:58.040 because then you have the effect of hundreds or thousands or even tens of thousands of people
01:00:03.220 doing that good. And in his case, he seemed to earn more money than almost anyone in human history
01:00:09.840 and he was earning it explicitly for the purpose of doing all the good he could do.
01:00:15.180 And quite unhappily, this project was viewed cynically by many people who take a dim view
01:00:23.960 of effective altruism, or even a dim view of altruism as such. I mean,
01:00:29.280 just the kind of Ayn Rand types in Silicon Valley who think it's all just virtue signaling, and
01:00:34.280 any pretensions to the contrary are just kind of human vanity and, you know, status signaling under
01:00:41.500 some other guise. I'm not going to name names, but there are many people who view really
01:00:46.360 any philanthropy along those lines. And he seemed to be the living confirmation of all of their
01:00:53.780 prejudices. And I won't have to go into details here; I think everyone will be familiar
01:00:59.960 with just how fully it really became a Greek tragedy, at least when viewed from the point of
01:01:05.080 view of Sam Bankman-Fried's parents, I think, just how fully he soared in the estimation of everyone
01:01:11.060 and then immolated. What's been your experience, as certainly one of the patriarchs of effective
01:01:18.060 altruism, in the wake of the Sam Bankman-Fried catastrophe?
01:01:25.220 Well, it has been a tragedy. I think that's a good way to describe it. It's a tragedy viewed from
01:01:31.280 many different perspectives. You mentioned Sam's parents, who were professors of law at Stanford.
01:01:38.720 If you'd like to continue listening to this conversation, you'll need to subscribe at
01:01:43.760 samharris.org. Once you do, you'll get access to all full-length episodes of the Making Sense
01:01:48.420 podcast, along with other subscriber-only content, including bonus episodes and AMAs and the conversations
01:01:55.180 I've been having on the Waking Up app. The Making Sense podcast is ad-free and relies entirely on
01:02:00.720 listener support. And you can subscribe now at samharris.org.