#48 — What Is Moral Progress?
Episode Stats
Words per Minute
161.2551
Summary
Peter Singer is one of the most famous living philosophers. He is the author of many books, including Animal Liberation, which is often considered the Silent Spring of the animal rights movement. He has also written The Life You Can Save, The Most Good You Can Do, and The Ethics of What We Eat. His most recent book is Ethics in the Real World: 82 Brief Essays on Things That Matter, and I highly recommend it. Peter is a professor of bioethics at Princeton University and holds a regular visiting position at the University of Melbourne in Australia. He has long been interested in a range of issues, including ethics, politics, free speech, euthanasia, and the sanctity of life. Peter and I talk about many things, and we ran out of time, frankly; as you'll hear at the end, I come up against the brick wall of a time constraint, so I'll have to bring him back at some point. You can find a link to his new book on my blog. If you find conversations like this useful, you can once again support the podcast at samharris.org. As always, your support is what allows me to clear my schedule to do this sort of thing and keeps this all ad-free. Thank you for supporting the podcast. - Sam
Transcript
00:00:10.880
Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680
feed and will only be hearing the first part of this conversation.
00:00:18.420
In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at SamHarris.org.
00:00:24.060
There you'll find our private RSS feed to add to your favorite podcatcher, along with other subscriber-only content.
00:00:30.520
We don't run ads on the podcast, and therefore it's made possible entirely through the support of our subscribers.
00:00:35.880
So if you enjoy what we're doing here, please consider becoming one.
00:00:49.160
Peter is certainly one of the most famous living philosophers, and he's been very influential
00:00:54.980
on public morality, both with respect to the treatment of animals and in this growing
00:01:00.820
movement that I spoke about with Will MacAskill on a previous podcast known as Effective Altruism.
00:01:07.020
He's a professor of bioethics at the University Center for Human Values at Princeton.
00:01:13.820
He's the author of many books, including Animal Liberation, which is often considered the silent spring of the animal rights movement.
00:01:23.640
He's also written The Life You Can Save and The Most Good You Can Do and The Ethics of What We Eat.
00:01:30.300
His most recent book is Ethics in the Real World, 82 Brief Essays on Things That Matter, and I highly recommend it.
00:01:38.380
Peter and I talk about many things, and we ran out of time, frankly.
00:01:42.600
We had two hours booked, and as you'll hear at the end, I come up against the brick wall
00:01:47.980
of a time constraint and really was wanting to talk about many more things, so I'll have to bring him back at some point.
00:01:55.560
We spend the first half hour or so talking about how it's possible to talk about moral truth,
00:02:03.020
and if that's not to your taste, if you're not really worried about how we can ground our
00:02:08.020
morality in universal truth claims, you might skip 30 minutes in or so where we start talking
00:02:16.120
about questions of practical ethics, and we touch on many things, the ethics of violence,
00:02:25.620
There's just a lot we cover, and I hope you find it useful.
00:02:29.320
And if you do find conversations like this useful, you can once again support the podcast
00:02:39.600
And as always, your support is greatly appreciated, and it's what allows me to clear my schedule
00:02:44.280
to do this sort of thing and keeps this all ad-free.
00:03:05.280
Listen, everyone will know who you are, but perhaps you can briefly describe what you do
00:03:09.240
at this point and the kinds of questions you focus on.
00:03:14.160
I'm a professor of bioethics at Princeton University, and I also have a regular visiting position
00:03:19.320
at the University of Melbourne in Australia, which is where I'm originally from.
00:03:26.520
I've been interested in a range of different issues.
00:03:28.900
I wrote a book called Animal Liberation, published back in 1975, that some regard as having started the modern animal rights movement.
00:03:39.900
I've also been interested for many years in the obligations of the affluent people like
00:03:44.980
us to people in extreme poverty elsewhere in the world.
00:03:48.140
And I've written on issues in bioethics, questions about the sanctity of life, and a range of
00:03:57.780
Yeah, and your new book is entitled Ethics in the Real World, and I'll have a link to it on my blog,
00:04:05.340
and I certainly encourage listeners to get that.
00:04:08.640
It's great because it's divided into these 82 very short chapters, literally like three-page essays,
00:04:21.940
So you tackle questions like, you know, should poor people be able to sell their organs?
00:04:31.820
These are all questions where public policy and how people actually live their lives are just
00:04:38.520
explicitly in play, and just super digestible philosophical essays.
00:04:47.200
If I'm not mistaken, Peter, you and I have only met once, right?
00:04:50.040
I think it was at this Arizona event organized by Lawrence Krauss, which was...
00:04:55.500
Yes, that's the only time we've actually met in person.
00:04:57.720
Yeah, which unfortunately, it was titled The Great Debate, somewhat pretentiously perhaps,
00:05:02.760
but it was you and me and Steve Pinker and Lawrence and Patricia Churchland, I think, and a few other
00:05:09.440
people, and that's available for people to see on YouTube.
00:05:13.420
If I recall correctly, you and I got somewhat bogged down disagreeing about the foundations
00:05:19.520
of morality and human values, but I had the sense at the time that we were talking past
00:05:25.220
one another and getting derailed on semantics more than anything else.
00:05:30.420
So I'd like us to start with the topic of the foundations of morality and to answer the
00:05:37.580
question or attempt to answer the question, how is it possible for something to be right
00:05:43.420
and wrong in this universe or good and bad, and then move from there into what is the
00:05:49.060
relationship between the claims we make about good and evil and right and wrong and facts
00:05:54.720
of the sort that fall within the purview of science.
00:05:57.160
And then once we have just a concept of goodness in hand and how it relates to truth claims,
00:06:03.500
then I want to go on to talk about just the practical reality of doing good in the world,
00:06:07.780
and this will lead to questions of effective altruism and population ethics and moral illusions
00:06:14.620
So this first question I put to you is, how is it that you think about moral truth?
00:06:23.300
And if it does, what is the relationship between the true claims we make about good and evil
00:06:29.440
or right and wrong and facts of the sort we talk about in science?
00:06:34.120
That's a good question and a very large question.
00:06:37.460
It's one that I've grappled with on and off for most of my philosophical career, and I
00:06:44.100
have changed my views on it significantly in the last few years.
00:06:48.840
So earlier on in my career, I would have said that there are no objective truths in ethics,
00:06:56.660
but we can prescribe that certain things be done, and we can prescribe them not just for
00:07:04.740
ourselves or out of our own interest, but we can prescribe them in a way that is, to use
00:07:10.420
a term that my former Oxford supervisor, Professor RM Hare coined, universalizable.
00:07:16.920
That doesn't mean that they're the same for everyone, but what it means is that I can express
00:07:21.660
them without referring to myself or without using terms like I or proper names.
00:07:28.620
So, for example, if I were to say, as Donald Trump has recently been saying, it's fine for
00:07:35.780
me not to pay any taxes, then I would have to say it's fine for anyone in my situation not to pay any taxes.
00:07:44.400
Um, and, uh, of course, you know, one might not be so keen to do that.
00:07:49.360
You can think of other circumstances in which people might say even worse things.
00:07:54.540
The Nazi might say, uh, it's good for me to kill Jews.
00:07:58.680
Um, but then we, we can ask the Nazi to imagine, well, suppose you suddenly discovered actually
00:08:03.600
that you're of Jewish ancestry or your parents had hidden this from you.
00:08:07.800
Does that mean that it's fine for any Nazi to kill you?
00:08:10.600
Um, most Nazis probably would, would think twice.
00:08:13.860
There might be a few ideological fanatics who would still say yes.
00:08:19.760
So that was as far as I thought you could go really, that it did depend on people's inclinations
00:08:25.580
and prescriptions and there was no objective truth in it.
00:08:32.500
I think that there are some claims which you can say are truths, that they're things that
00:08:39.520
we can reflect on and that strike us as simply undeniable, if you like, as self-evident, although
00:08:47.300
that's not to say that everybody will immediately agree with them.
00:08:50.500
But an example would be that, uh, inflicting agony for no real purpose.
00:08:57.860
Let's say inflicting agony because it, on someone else, because it brings you some kind of moderate
00:09:04.140
enjoyment, mild enjoyment, uh, uh, that that's wrong.
00:09:08.320
Um, uh, and what's really, what's really at work here is the idea that, that agony is something
00:09:13.680
that's a bad thing, that the world is a better place if there's less agony in it.
00:09:19.340
Um, and I do think that that's, that's a very hard claim to deny, uh, that, uh, the fact
00:09:27.140
that someone is agony, is in agony provides us, provides anyone really with a reason to relieve it if we can.
00:09:36.900
Um, and the fact that doing an action will cause someone to be in agony is a reason not to do it.
00:09:42.680
Not necessarily an overriding reason, but it is a reason against doing it.
00:09:47.760
But now it's, uh, you say your views here changed recently.
00:09:51.840
Well, I guess, how recently are we talking, in the last, since I, since I actually, I saw you in Arizona?
00:09:57.820
Probably when I saw you in Arizona, uh, I was to some extent in transition.
00:10:05.360
Um, I was always trying, uh, even, you know, maybe 30 years ago, I was trying to find ways
00:10:11.380
in which you could tighten up the arguments, uh, that you could bring in some role for reason
00:10:17.060
in this so that, um, the, the problem with the position that, uh, my mentor RM Hare had
00:10:22.440
developed was that, uh, he said that, uh, universalizability just depended on the concept
00:10:29.240
of ought, the, the, the basic moral concepts ought, good and right incorporated this idea of universalizability.
00:10:35.980
And the problem with that was that if somebody said, okay, so I'm just not going to use those
00:10:40.940
Instead of saying you ought to do something, I'll say you schmort do something or, you
00:10:49.100
Um, and now it's fine for me to not pay any taxes.
00:10:52.880
Um, uh, and I don't have to say that anybody else in my position, you know, also doesn't
00:10:58.380
have to pay taxes or, or any of these other implications.
00:11:02.420
That seemed too easy a way out of moral arguments.
00:11:06.280
So, um, I was always looking for something, uh, a bit stronger and looking at whether you
00:11:12.860
could argue that, uh, there was a rational requirement, uh, that was corresponding to
00:11:20.440
And, uh, so I guess I, I only wrote about this in a book called The Point of View of the
00:11:27.580
Universe, which is a co-authored book with a Polish philosopher, Katarzyna de Lazari-Radek,
00:11:37.280
So it's, you know, perhaps a year or two before that, um, in the, in, in thinking about that
00:11:44.440
book, I had already come to the conclusion that, that you can argue that there are some
00:11:49.280
things that are, uh, moral truths or self-evident, um, uh, the 19th century philosopher, Henry
00:11:58.040
Sidgwick, who we discuss in that book, uh, describes them as moral axioms.
00:12:04.000
So, so it's, it's within the last, let's say five years, definitely that I've come to that view.
00:12:10.260
So if I'm not mistaken, this is anchored in a kind of consequentialism or utilitarianism.
00:12:17.100
So what do you do with the claim that consequentialism is itself just an expression of a mere preference
00:12:31.000
I think that, um, I mean, there are different forms of consequentialism that what they, what
00:12:35.280
they have in common, of course, is the view that what we ought to do is the act that will have the best consequences.
00:12:41.100
And then the discussion is what do we mean by best consequences?
00:12:47.160
I, I think that, uh, when you think about, uh, different actions, if it's clear that one
00:12:55.760
act will have better consequences, all things considered than any outcome, um, I'd be prepared
00:13:02.480
to say that it's then true that that act is the right thing to do.
00:13:07.360
Now, obviously, you know, that can be denied by people who think that there are some moral
00:13:13.260
rules, which we ought never to break, no matter what the consequences.
00:13:21.240
I, I would try to argue that, uh, my view is, is true, but, um, that really has to be at
00:13:29.780
least in part by undermining the foundations of the alternative view.
00:13:33.100
It's not, uh, it's not so self-evident that the consequentialist view is right, that one
00:13:38.600
can just state it and everybody will see it to be right, uh, because, you know, partly
00:13:43.520
because there've been a whole history and culture of moral thinking, which is based on
00:13:48.440
rules, um, moral rules do have a certain social purpose.
00:13:55.240
We can't calculate from first principles every time we act, which act will have the best consequences.
00:14:00.660
So it's not all that surprising that people sometimes think that these rules have a kind
00:14:05.520
of inherent objectivity of their own, um, and that we should obey them no matter what the consequences.
00:14:13.480
Um, but that, you know, that, that's the kind of argument you need to have.
00:14:20.040
I am, I'm tempted to not spend a lot of time fishing around for areas where we might disagree
00:14:26.500
in metaethics, but I think most of the listeners to this podcast will be familiar with my views
00:14:32.160
on, on morality and, and moral realism as I lay out in, in my book, The Moral Landscape.
00:14:37.400
I guess just a couple of points I would make here.
00:14:40.180
I think whenever I hear someone say that they are not a consequentialist, you know, whether
00:14:45.800
they hold to some rule that they think is important regardless of consequences, what I believe
00:14:54.180
I have found without exception in those conversations and in reading the work of people like Kant and,
00:15:00.740
and, you know, other famous non-consequentialists is that they smuggle in consequences into the,
00:15:06.940
the primacy of the rule that if the, if the rule had bad consequences, they would never,
00:15:12.160
it would never suggest itself as a reliable basis for ethics.
00:15:16.340
So if Kant's categorical imperative reliably produced needless human misery that was otherwise
00:15:23.300
avoidable, no one would think that the categorical imperative was a good idea, right? And so if you
00:15:29.200
drill down on why people are attached to a rule, you tend to get justifications for it, which have
00:15:36.120
to be cashed out in the form of consequences, whether they're actual or potential. Has that been
00:15:42.500
your experience or do you see it differently? I certainly think that the tendency of most of the
00:15:48.120
rules that are part of everyday morality is to produce better consequences. I think you're right
00:15:55.780
about that. And I think you're probably right that any rule that reliably produced more misery
00:16:00.180
would be dropped and, and a different rule would be substituted. But, but of course, what does
00:16:07.640
often happen and Kant is a good instance of this is that you have a rule that generally has good
00:16:12.620
consequences, like the rule that you should tell the truth. And then somebody imagines a situation
00:16:17.620
where, uh, a would-be murderer is, comes to your house and asks if you have seen so-and-so and it so
00:16:25.640
happens that so-and-so knows that this guy is pursuing him and has asked if you will hide him in your
00:16:31.140
basement. Now, you know, most of us would of course say, well, it is justified to lie in those
00:16:36.400
circumstances, but Kant actually sticks to the rule and says, no, it's wrong to lie even then.
00:16:43.060
So part of the problem with the people who are not consequentialist or some of them at least is
00:16:47.500
they want to stick to their rules no matter what, even if the general tendency of the rule
00:16:53.420
is to have good consequences. Um, and so I, I would describe that as a rule that we should apply
00:17:00.000
in everyday life. Um, but not as a rule, we should stick to no matter what.
00:17:05.080
The other thing here, which gives us this sense or gives many people the sense that there can't be
00:17:11.380
such a thing as moral truth is we, we value differences of opinion in philosophy and, and in
00:17:19.460
particular moral philosophy in a way that we don't in the rest of our truth claiming about the world.
00:17:26.480
So if someone comes to the table saying that they have a very different idea about how to treat
00:17:32.780
women, we should make them live in bags as the Taliban do. And this is how we want to live.
00:17:39.380
And there's no place to stand where you can tell me that I'm wrong because I'm just being guided by my
00:17:45.300
age old moral code for which I even have a religious justification. Many people in the West,
00:17:51.880
I think largely as a result of what postmodernism has done to the humanities, but perhaps there are
00:17:58.160
other reasons, imagine that there's just no place to stand where you can contradict that opinion or
00:18:05.060
dismiss it. You can't actually say, well, some people are not adequate to the conversation for
00:18:11.180
reasons that should be obvious. And yet we do this in science and everywhere else. I mean, just in,
00:18:17.120
in journalism or in history, in any place where people are purporting to make claims that are true,
00:18:23.740
it, when someone shows up and demands that their conspiracy theory, theories about alien abductions
00:18:31.000
or whatever it is, get, get taken seriously, we just say, you know, sorry, these, these views are so
00:18:36.080
uninteresting and so obviously incredible that they don't actually constitute any kind of rejoinder
00:18:43.940
to what is being said here. And so it's, it just, it's very easy to disregard them. Now,
00:18:49.760
occasionally some, you know, outlier view becomes credible for some reason, and then it, it subverts
00:18:55.580
what we think is true. And that's, that's a, just a process of criticism that just has to run its
00:19:00.700
course. But I feel like in moral philosophy, many people have just tied their hands and imagine that
00:19:09.240
everyone gets a vote in moral epistemology and it's an, it's an equal vote. And so, you know,
00:19:16.200
there's just, you know, what, what Derek Parfit thinks about morality doesn't matter any more than
00:19:21.840
what Mullah Omar thinks. And everyone is on all fours in their truth claims. And that's,
00:19:27.200
I think that's been very destabilizing for many people in the West when it comes time to talk about
00:19:32.420
the nature of human values and how we can talk about them in, in universal terms. Just, you know,
00:19:38.240
just the fact that you can still meet anthropologists and it might even be still a majority
00:19:43.620
of anthropologists who doubt whether a universal notion of human values is even a credible thing to
00:19:51.320
aspire to. Yeah. I mean, anthropologists seem particularly prone to that. Maybe it's kind of
00:19:56.740
an occupational hazard of studying a lot of different societies. And of course you do find different
00:20:01.340
particular practices in different societies, but you also find some common tendencies. For example,
00:20:08.740
reciprocity is pretty much a universal value. It's very hard to find a society in which it's not
00:20:16.120
considered a good thing to do favors to those who've done favors to you. And conversely, that you're
00:20:22.500
entitled to have some kind of retribution on those who've done bad things to you. But that's, of course,
00:20:29.500
not to say that this is actually the right morality. This is just to say that this is kind of the evolved
00:20:34.580
morality of our species and indeed not just of our species, but of long lived social primates who
00:20:41.380
recognize each other as, uh, as individuals. Um, but, but I wanted to get back to the larger point that you
00:20:48.660
were making. Um, and I'm not actually, I don't quite see as much reluctance to criticize ethical views of
00:20:57.500
different cultures as you described. Um, I think most people, uh, for example, who are not, uh,
00:21:06.900
of that religious group and it's not all Islam, but it's a particular part of, uh, Islam, uh, who think
00:21:14.380
that women should not go out in public, uh, without, as you put it, wearing a bag over themselves. Um, uh, I think
00:21:21.720
most people who are not of that religious group would be prepared to say that that's wrong, that, that it's,
00:21:26.260
it's wrong to treat women in that way. It's wrong to, uh, deny them, uh, privileges that, uh, men
00:21:33.640
automatically have. Um, and they would reject as unethical that way of treating women. Now, what
00:21:41.740
does come over the top of this is that we have, um, I think you and I would agree perhaps excessive
00:21:47.520
respect for religion. Um, and I think that comes out of a long tradition of, uh, people fighting over
00:21:55.880
religion and often killing over religion. And, uh, at some point, perhaps, you know, around the
00:22:02.020
17th and 18th centuries, people started to say, well, enough of this, you know, I'll, I'll leave
00:22:08.040
you alone to practice your religion and you leave me alone to practice my religion. And then we don't
00:22:12.260
have to kill each other over it. Uh, and of course, in, in some sense, that's a very good idea that we
00:22:16.760
don't have to kill each other over it, but, um, it can be taken too far. It can be taken to the
00:22:21.900
point of, well, um, you know, every religion is sort of somehow good in itself or
00:22:28.460
beyond criticism in itself. And, uh, I think that that's completely wrong. Uh, but if somebody
00:22:35.500
tried to put it forward as a sort of, you know, a secular belief that women should cover
00:22:40.560
themselves completely whenever they go out in public, whereas men are perfectly free to
00:22:45.880
display their arms and legs and, uh, so on, um, and of course face, uh, I think people would be
00:22:53.540
pretty baffled by that view. And I, I don't think that they would think, oh yeah, that's fine.
00:22:58.200
Well, you, you haven't spent as much time criticizing these views as I have, I think,
00:23:01.740
in public. I would agree that most people feel that there's something wrong there, but I, I've
00:23:07.440
noticed that the more educated you become, certainly in the humanities, actually not just the
00:23:12.600
humanities. It's, it's science as well. Hume's "you can't derive an ought from an is" has become
00:23:18.880
this shibboleth among very educated, but incompletely educated people. And...
00:23:26.160
Well, I'm going to disagree with you about that because I think that's true. Um, uh, that, that,
00:23:30.360
that's, that's a philosophical claim that I think is, is defensible.
00:23:34.260
I think it's defensible within a certain construal. Yeah, we can talk about that, but I know physicists
00:23:40.260
who will say, you know, I don't like slavery. I, I personally would vote against it. I would
00:23:45.540
put my shoulder to the wheel in resisting it, but I have no illusion that in resisting slavery,
00:23:51.420
I am making any claim about what is true. There's just no place to stand in science to make those
00:23:58.500
claims with respect to, to morality. And, and so what, what, what that does is that, that divorces
00:24:04.020
morality from any conception of the wellbeing of conscious creatures like ourselves. And it's
00:24:12.140
claiming that no matter how far advanced we become in understanding wellbeing and the possible
00:24:18.560
experiences that suitable minds can have, we will never know anything about what is better or worse
00:24:25.800
in this universe. I mean, but so my, my view, again, I don't want to spend too much time on this
00:24:30.460
because there's a, there's so much in applied ethics that I want to talk to you about, but
00:24:35.220
I do want to hear your pushback on the ought and is issue. But I just, my claim here is that,
00:24:42.100
I mean, we could forget about ought, as you suggested earlier, though I take that in a very
00:24:46.580
different direction. I mean, just imagine we have no conception of ought and we have no conception of
00:24:50.280
morality, but we have, we appear in a universe where certain experiences are possible. So I,
00:24:58.600
so I view morality as a kind of navigation problem. We have, we are conscious systems
00:25:02.860
and there is a possibility to experience unendurable and pointless misery for as long
00:25:10.320
as possible. And then there are all these other possibilities. And my view is that anything is
00:25:15.960
better than the worst possible misery for everyone.
00:25:19.340
Look, I totally agree with you about that moral judgment. And I also agree with you that
00:25:25.760
understanding wellbeing, uh, what causes brings about wellbeing, what reduces suffering, how our
00:25:31.960
minds work, all of that is highly relevant for deciding what we ought to do. But, um, and so I
00:25:38.960
think, and I, and I also think that the physicist, uh, that you mentioned is wrong if he says my judgment
00:25:44.460
that, um, you know, happiness is better than misery, let's say is, uh, uh, is not true. I don't think
00:25:52.360
that I can say that this is true. I think you can say that it's true, but I don't think it actually
00:25:56.660
follows from the description of the natural universe. I think it follows only if you make that judgment.
00:26:03.740
And I think you actually made it, you use the word better, that it's better if there's a world
00:26:08.880
in which there's, you know, people are living rich, enjoyable, fulfilling lives than if they're
00:26:14.500
miserable, suffering and so on. Um, but, but we have to, we have to say, well, what is that judgment
00:26:20.600
that it's better? I think, I think it's a judgment that we use our reason to get at. So I think that
00:26:28.020
we have normative reasons, even if we didn't use the word ought, even if we decided that the
00:26:33.460
institution of morality is not one that we want to be part of, I think we'd have to say,
00:26:37.300
do we have reasons for acting to bring about the happy world that you described rather than the
00:26:45.080
miserable world that you described? And I would answer, yes, we definitely do have reasons for
00:26:50.540
acting because it's a better world, but I wouldn't claim that I can deduce that reason for acting
00:26:57.460
simply from the description of here's one world and here's the other. Um, there has to be,
00:27:03.460
it's not just the description. There has to be, uh, as I say, something normative
00:27:07.160
by which I mean reasons that ought to move a rational being towards choosing one world rather than the other.
00:27:14.440
Right. Well, I would fully agree with that, except the reason why the is ought dichotomy
00:27:20.120
is uninteresting to me is you can't get to any description of what is without obeying certain
00:27:28.660
oughts in the first place. I mean, so you have to pull yourself up by your bootstraps at some point.
00:27:33.200
And so, you know, logical intuitions are not self-justifying. We just grab them and use them
00:27:40.920
and, you know, a desire for evidence. Why should you desire evidence? You know, what evidence are
00:27:46.600
you going to provide to convince someone that they should desire evidence if they don't desire evidence
00:27:50.160
for their truth claims? Or, you know, what, what logical argument will you use to prove to someone
00:27:54.980
that they should value logic if they don't value logic? I mean, these are, these are brute facts of
00:27:59.880
epistemology, which we use without any apology, really, because we can do no other. I would just
00:28:07.940
say that valuing any movement away from the worst possible misery for everyone, right? I mean, so
00:28:16.140
again, just, you know, I want our listeners to absorb what those words mean. Imagine a universe where
00:28:21.720
everything that can suffer suffers for as long as it possibly can, as deeply as it can, and no good
00:28:29.900
comes of it, right? There's no silver lining to this. This is a perfect hell for all conscious
00:28:35.160
creatures. That's bad if the word bad is going to mean anything. And getting out of that situation is
00:28:41.560
good, and it's something you should do if, if, if words like good and should mean anything. And that,
00:28:48.000
in my view, that's all we need to get this consequentialist machine turning.
00:28:53.200
Okay, okay. I don't think we're disagreeing on anything very significant here. Perhaps we're
00:28:58.240
disagreeing on whether these are facts that science describes, or whether they're reasons that are part
00:29:09.440
Yeah, I don't think we disagree there, because I mean, the point of confusion that I think you and I got
00:29:13.540
bogged down by the first time around was in a very different definition of the word science. I mean,
00:29:19.620
I was always using science in a much more elastic sense to coincide much more with the way you're
00:29:25.780
using the word reason. And so I'm not just talking about people in white lab coats who can run
00:29:30.860
experiments immediately on any given hypothesis, or ever. I mean, so there, so there are truth claims
00:29:36.380
we want to make or could make about the world, which we know we would never test scientifically,
00:29:42.300
but we know there are facts of the matter. And the fact that we can't get the data in hand doesn't
00:29:48.340
make the truth or falsity of those claims any less assured. So I mean, again, the one I use all the
00:29:55.580
time, and I might have even used it with you in Arizona was, you know, what was JFK thinking the moment
00:30:02.060
he got shot? Well, there's an infinite number of things I know he wasn't thinking. I know he wasn't
00:30:08.460
thinking, and I wonder what Peter Singer and Sam Harris are going to say about what I was thinking.
00:30:12.640
And you just, the list can just grow from there. And that's a claim about his inner life,
00:30:18.520
of what it was like to be him, that is ontologically subjective, right? I'm making a claim about his
00:30:25.260
subjectivity, and on some level, the state of his brain, but it's epistemologically objective,
00:30:31.760
which is to say it's true. You know, it's just there's every reason to believe it. And people
00:30:37.420
doubt that you can make claims about human subjectivity that are, wherever you stand,
00:30:46.040
surviving all of the tests of credibility that claims about physics and chemistry need to survive.
00:30:51.940
And I mean, that's my particular hobby horse, that people feel that this area is just, by definition,
00:30:58.440
less clear, less truthful, less grounded in the kinds of cognition we use to do science. But
00:31:07.860
again, I just think there's nothing more sure than some of these claims we could make about
00:31:13.900
morality once we look at the intuitions we're using to make the claims. The intuition that two plus two
00:31:20.660
makes four, and that this abstraction is generalizable. So it works for apples, it works
00:31:26.260
for oranges, it works for boats. That intuition that is at the foundation of arithmetic, again,
00:31:33.100
is just something we apes are doing with our minds, and it works. But I just think it's not in a
00:31:41.480
different sphere from the kinds of intuitions you and I are talking about with respect to,
00:31:47.560
it is good to reduce pointless agony, all things considered.
00:31:52.880
Yeah, we certainly agreed on that. And I think we agreed on rejecting the various forms of
00:32:00.180
subjectivism and relativism that the postmodernist ideas in particular have encouraged some people
00:32:10.600
to have. I find that quite disappointing in a way. You get people who come out of backgrounds where
00:32:17.880
they're doing cultural studies. And they come up with the same sort of views that freshmen come up
00:32:26.160
to Princeton with, and we discuss in early seminars. And they usually fairly rapidly see that those views
00:32:34.120
aren't really tenable, that they have implications that they don't want to accept. But there are certainly
00:32:40.660
more sophisticated forms of that kind of relativism and subjectivism that are still around. And I think
00:32:48.280
we're agreed that ethics is a field in which there are truths. Exactly how we classify those truths
00:32:58.720
is a fine point. But I think probably it needn't really delay us any longer. I think we've clarified
00:33:06.720
where we are. Okay, so let's move forward with that consensus in hand. So we want to reduce
00:33:12.120
suffering, all things considered, and maximize the well-being of conscious creatures. And we don't need
00:33:19.340
to waste much time justifying that going forward. So now what do you do in a situation where people
00:33:26.340
claim that suffering is being produced, and suffering is in fact being produced, but you feel that the
00:33:36.320
basis for the suffering is illegitimate? Let's say it's based on a religious dogma. So the example that
00:33:43.820
comes to mind now is the cartoon controversy. What if a consequentialist, philosophically minded but
00:33:50.680
still doctrinaire panel of Muslim philosophers came to you and said, listen, whenever you cartoon the
00:33:59.080
prophet or tolerate others cartooning the prophet, you produce a tremendous amount of suffering in
00:34:05.940
millions of devout Muslims, suffering that you can't compensate us for, suffering that we are
00:34:12.840
committed to feeling based on our beliefs. And therefore, it is just wrong to do this. And you
00:34:20.340
need to conform your freedom of speech to our religious sensitivities. How do you think about that?
00:34:27.280
I do think about that in terms of the consequences of the action. So I'm not somebody who's going to say,
00:34:33.080
no, I have a right to free speech. And if I choose to exercise that right to free speech,
00:34:37.440
I will do so no matter what the consequences. The question to be considered is, what are the
00:34:45.460
consequences of restraining free speech in this area? And there's no doubt, I think, that
00:34:54.880
these cartoons are offensive to Muslims, and they will cause some hurt feelings. Perhaps more serious than
00:35:04.900
that, because people can get over their hurt feelings, I'd say more serious is the fact that
00:35:10.240
some of them may then engage in protests that turn violent, may attack Christians, if there are
00:35:19.320
Christians or, let's say, people they consider to be infidels, not necessarily Christians, but
00:35:25.280
not Muslims anyway. If they're living in their country, may attack and kill them. These things have
00:35:31.500
happened. And I think that that's something that anybody thinking of publishing these cartoons
00:35:37.820
needs to give very significant weight to. On the other side, I think that religious intolerance is a
00:35:48.940
major source of suffering in the world. And of course, in the case of militant Islamic views,
00:35:57.560
we've seen very clearly in recent years, how that can cause specific violent attacks, which clearly
00:36:06.040
cause a lot of suffering on the people who are killed or injured and the families and relatives and
00:36:12.360
and so on. So the question is, do we want to just accept that those religious beliefs cannot be
00:36:22.060
criticised? And therefore, they will continue forever, or indefinitely, I guess nothing lasts
00:36:27.880
forever? Or do we want to see whether we can, in some ways, encourage fewer people to hold those
00:36:37.620
beliefs, or at least encourage people to hold them in a more open, tolerant form? And then, of course,
00:36:44.980
and I think the answer to that is yes, I do think that we should be free to criticise religious
00:36:49.740
beliefs, especially those that do a great deal of harm. And then the further question is,
00:36:55.780
is the use of ridicule an effective means of achieving that end? And on that one, I'm not so
00:37:04.880
sure. So, in other words, if it were a question of publishing arguments against the claims made by
00:37:13.900
in Islam, publishing historical studies about how the Quran came to be written and publishing studies
00:37:23.420
of the Quran showing contradictions or inconsistencies, which, of course, exist in the Bible and in any
00:37:30.040
of these substantial texts from long ago, going to be demonstrably inaccurate in some places.
00:37:38.780
So, is that the way we want to try to persuade people to shift their religious beliefs? Or should
00:37:48.440
we try actually ridiculing those beliefs? And my guess is that probably both have some effect,
00:37:57.700
but I'm not sufficiently convinced at the moment that ridicule is so much more effective as to outweigh
00:38:06.680
the serious consequences that it can have. Yeah, I guess my intuition here is that the rule of
00:38:14.360
privileging free speech over everything else is just so useful that the need to rethink it in any local
00:38:24.500
case is almost never pressing. I think free speech being essentially the equivalent of, you know,
00:38:31.020
sunlight spread on bad ideas. It's such a reliable mechanism for bringing bad ideas to light, criticizing
00:38:40.060
them, getting others to react to them, that the moment you begin to look for local instances where you need to
00:38:47.380
calculate the harm done by exercising it, I think it's, you know, almost always counterproductive. And for instance,
00:38:54.620
there's one area here where I know you and I agree, because I've read what you wrote in your most recent book,
00:38:59.180
but the idea that Holocaust denial should be illegal, right, because of all the harm it does,
00:39:05.980
both to the survivors and their descendants, and also just the fact that it seems to encourage,
00:39:11.960
or at least is imagined to encourage, the survival of these noxious views, you know, Nazism and
00:39:18.940
neo-Nazism in Europe. You and I both agree that it shouldn't be illegal, and you shouldn't put people
00:39:25.260
in prison for denying the Holocaust, that the appropriate response there is ridicule and the
00:39:31.380
attendant destruction of their reputation and just talking more about the evidence for the Holocaust and
00:39:38.380
just the normal process whereby we expose bad ideas to criticism and use the immune system of
00:39:46.940
conversation to deal with them. Yeah, I mean, we certainly agree about that example. I think that
00:39:52.400
the way to deal with Holocaust denial is to simply show the evidence that the Holocaust existed,
00:40:00.040
and that evidence is totally overwhelming, whereas locking somebody up who denies the existence of the
00:40:04.960
Holocaust probably just encourages conspiracy theorists to think, oh, well, if they have to prohibit
00:40:11.620
people denying it, that must be because there isn't really good evidence that it happened.
00:40:17.000
So that's a case where we're completely in agreement. But I'm not sure that, you know,
00:40:21.600
if you're saying there is no case where a restriction of freedom of speech is justified,
00:40:27.520
then I disagree. And I disagree with the kind of case that John Stuart Mill, who, of course,
00:40:32.780
is a famous defender of freedom of thought and expression, in his book On Liberty, carved out an
00:40:39.060
exception, where he said that, you know, in his day, corn dealers were very unpopular. It was thought
00:40:44.620
that they were hoarding corn, profiteering from it, and starving the poor. So he gave the example of
00:40:52.720
somebody who, standing in front of the house of a corn dealer, addresses an excited mob, saying that
00:40:59.360
corn dealers are starving the poor, or robbing the poor, or something of that sort. And he thought it
00:41:04.320
was legitimate to prevent that speech taking place. On the other hand, he said, if in different
00:41:10.540
circumstances, somebody wants to hand out a leaflet, expressing exactly the same views, but not in front
00:41:17.760
of the house of a corn dealer and in front of an excited mob, then that was perfectly legitimate. And
00:41:23.780
there was a right to freedom of thought and expression that extended to expressing that opinion.
00:41:29.400
Now, of course, times have changed somewhat, and we have instant communication anywhere. And
00:41:36.800
so the case of the cartoons, which are, then we know, going to get spread over the internet and
00:41:43.800
read about or reported in countries where there is a lot of militant Islamic thought,
00:41:52.480
may have a similar effect in terms of inciting a mob to attack and kill, as I say, the people they
00:42:01.200
regard as infidels, or representatives of the government where the cartoons are published, or
00:42:07.000
whoever it might be. So that's why I'm not sure that the exceptions ought not to extend here. Not that I
00:42:16.240
really want to see a law against those cartoons. That might be a step too far and might be difficult to
00:42:22.100
say exactly what is legitimate ridicule or satire and acceptable. But I would, I think if I were an
00:42:32.740
editor and I were aware of what the consequences were going to be, if I, let's say, had reliable
00:42:38.420
evidence that they would cause the death of hundreds of innocent people, I would choose not to publish them.
00:42:45.540
I would agree that if you are going to make the causality absolutely clear and say, well, somebody
00:42:52.220
is going to die if you publish this cartoon, we know that. Well, then that, it becomes difficult to
00:42:56.900
justify publishing it. But then we're always dealing in probabilities. And if the probability is high
00:43:02.100
enough, as it probably is in this case, you could reasonably expect that people will riot and someone
00:43:08.000
will get injured or killed as a result. But the thing is, it puts us in a position where a whole
00:43:15.120
civilization and whole societies can be held hostage to the whims of, in this case, religious
00:43:24.240
maniacs. But I'm by no means just focusing on the specific case of Islam. It's just anyone could
00:43:30.900
announce, Unabomber style, if you say X or you don't say X, I'm going to kill someone. And there's
00:43:38.660
just something so corrosive about that. And it can be so consciously and cynically used against us,
00:43:46.740
again, until the end of the world, that it's tempting to just say, well, sorry, we don't play
00:43:52.500
that particular game. And the game we do play is, we basically talk about everything. And we encourage
00:44:00.980
you to talk about everything, and you will feel a lot better once you do. The other thing that's implicit
00:44:06.820
in having a position of the sort we have sketched out here, where we think that moral truths exist, and
00:44:14.260
it's possible to be right and wrong, or more or less right and more or less wrong about what a good
00:44:19.300
life is, that entails the claim that certain people and even whole cultures may not know what
00:44:26.420
they're missing. What I would want to claim here is that a religiously blinkered culture that feels
00:44:33.700
no affinity for freedom of speech and thinks that cartoonists and novelists and other blasphemers
00:44:39.700
should be killed for saying the wrong thing about the provenance of a certain book or about a certain
00:44:44.740
historical figure. These people don't know what they're missing, and they don't realize how much
00:44:50.980
of a price they're paying for this attitude toward freedom of thought and freedom of expression. And
00:44:57.700
on some level, we know we're right about this. We know we're on the right side of history, and we have to
00:45:03.860
encourage, cajole, browbeat, and ultimately even coerce people to get on the right side of history.
00:45:11.300
Yeah. I mean, again, I largely agree with what you say there. And I don't think that if somebody
00:45:18.100
is trying to blackmail us into not saying something by deliberate threats and is using that as a tactic,
00:45:26.340
I don't think we should yield to that, although there might be a significant cost. But obviously,
00:45:31.860
once that succeeds, then it's going to be a tactic which will be used over and over again. And freedom of
00:45:37.300
thought and expression is something that is really important to defend. I certainly agree about that.
00:45:46.100
So, you know, it's really the rather different case that I was talking about where it's not a
00:45:50.740
deliberate tactic. It's just a reaction. And it's specifically about cartoons. It's not about expression
00:45:59.060
of ideas. It's not about being able to criticize the religion. I certainly don't want to see any religion
00:46:06.900
insulated from criticism because I do think that there's a lot of harm that flows out of that.
00:46:15.460
So if it gets to that point where people are saying, you know, if you even dare to say that
00:46:20.260
it's not the case that every word in the Koran is true and ought to be followed, you know, clearly,
00:46:27.300
we're not going to play that game. We are going to be free to criticize whatever religious texts
00:46:34.980
people put up that we disagree with. That's a very important and fundamental freedom and something that
00:46:42.580
we should defend even if there is some cost to doing so. Yeah, well, I would just argue that the
00:46:48.340
cartoons were of a piece with that larger consideration back there. If you'd like to
00:46:54.340
continue listening to this conversation, you'll need to subscribe at SamHarris.org. Once you do,
00:46:59.620
you'll get access to all full-length episodes of the Making Sense podcast, along with other
00:47:03.700
subscriber-only content, including bonus episodes and AMAs and the conversations I've been having on the
00:47:09.620
Waking Up app. The Making Sense podcast is ad-free and relies entirely on listener support,