The TED Interview
Episode Stats
Length
1 hour and 7 minutes
Words per Minute
174
Summary
Sam Harris announces a new event series, Experiments and Conversation, kicking off with Eric Weinstein in January: Detroit at the Fillmore on the 28th, Milwaukee at the Pabst Theater on the 29th, and Chicago at the Chicago Theater on the 30th. The series aims to do for conversations what TED did for the short lecture, featuring public intellectuals and creative people willing to take some risks and think out loud on important and controversial topics, undeterred by the prospect of saying something so surprising or counterintuitive that people might take offense. Pre-sale tickets are available to podcast subscribers now via a code on the events page at samharris.org; tickets go on sale to the general public on Friday, November 2nd, so subscribers only have a few days of exclusive access this time around. Harris also announces that the Waking Up course app is now available on Android, and previews upcoming guests: Rebecca Traister, Johann Hari, Derren Brown, and Deeyah Khan. The episode itself is Harris's appearance on Chris Anderson's new podcast, The TED Interview, in which the two dig into the thesis of Harris's book The Moral Landscape: whether science can answer questions about moral values.
Transcript
00:00:00.000
welcome to the waking up podcast this is sam harris okay some housekeeping here
00:00:25.620
i have a new event series to announce it's called experiments and conversation
00:00:32.580
and i'll be kicking this off with eric weinstein in january january 28th we'll be in detroit
00:00:41.640
at the fillmore on the 29th we'll be in milwaukee at the pabst theater and on the 30th we'll be in
00:00:49.700
chicago at the chicago theater and the idea with experiments and conversation is to launch a
00:00:55.560
series of events that is about more than my events i'll participate in many of them especially in the
00:01:02.140
beginning but there's so many great speakers out there and i want to create a speaking series that
00:01:07.440
could eventually take place in many cities simultaneously it'll be conversation based
00:01:12.120
these are not lectures now we've been thinking internally about this being ted for two where
00:01:18.240
the ted conference has significantly refined and even institutionalized the short lecture
00:01:25.000
we're going to attempt to do something similar for conversations and to this end we'll be looking for
00:01:31.700
public intellectuals and creative people who are willing to take some risks and think out loud on
00:01:37.580
important and controversial topics we're looking for people who are not cowed by the prospect of
00:01:43.420
saying something so surprising or counterintuitive that people might take offense at it i can't tell
00:01:50.360
you how stifling the current environment is for speech with the exception of podcasts it is just crazy
00:01:57.920
out there so i'm hoping this series can help us all recalibrate a little bit now in a perfect world
00:02:04.720
there will one day be hundreds of events like this happening each year all over the world but we'll see
00:02:11.300
how it goes with the first three and i really can't think of a better person to kick this off with
00:02:15.840
than eric weinstein eric is one of the most consistently interesting and courageous thinkers i know
00:02:22.440
he's a real polymath and in my experience we can talk about almost anything and it's fun and illuminating
00:02:31.400
and just it's what a conversation should be so he was a natural starting point when thinking about
00:02:38.000
this series but eventually i will turn to him and say who else could you have a great conversation
00:02:43.260
with and he might go off with that person and do an event in a city near you and i'll reach out to
00:02:49.840
some of my other favorite podcast guests and people i admire and ask the same question and the
00:02:55.200
conversations will spread anyway this is all starting in detroit milwaukee and chicago in january so if you
00:03:03.260
live in or near any of those cities and you want to come out for a live event here's your chance i
00:03:08.860
don't get to your part of the world very often in fact i've only spoken once in chicago and i've never
00:03:14.580
been to milwaukee or detroit so come on out pre-sale tickets are available to podcast subscribers right
00:03:22.540
now if you're a supporter of the podcast either through samharris.org or patreon you can log into my
00:03:28.960
website and you'll see a pre-sale code on my events page and that's samharris.org forward slash events
00:03:35.760
and then tickets go on sale to the general public on friday november 2nd so subscribers only have a few
00:03:44.020
days of exclusive access this time around apologies for that live nation collapsed the schedule on me
00:03:49.760
for some reason you guys usually have a full week but not the case here anyway i'm looking forward to
00:03:56.780
those events and eric and i will try to make sense okay the waking up course the app is now available on android
00:04:06.880
i'm very happy to say that didn't take too long at all many of you feared that it would take years
00:04:12.660
understandably so but no it took a mere six weeks beyond the ios build so enjoy that please know we're
00:04:22.780
continually updating it insofar as you find bugs or improvements that should be made please tell us
00:04:31.080
at wakingup.com and again your reviews are incredibly helpful so please keep those coming
00:04:39.220
let's see here i've got some good people coming up on the podcast i've got rebecca traister the
00:04:44.780
feminist journalist who's written a scalding me too book we had a colorful conversation uh johann hari
00:04:53.180
is coming up he's written a book about depression that many people are finding very useful derren brown
00:04:58.740
the magician who is remarkable as you probably know he's coming up deeyah khan the muslim filmmaker
00:05:07.000
has made an amazing film about neo-nazis and white supremacists in the u.s
00:05:11.340
deeyah proves to be kryptonite for white supremacists it's quite amazing well speaking of ted today i'm
00:05:21.320
turning the tables and presenting an interview i did with chris anderson the owner and impresario
00:05:26.400
of ted he's just launched his own podcast the ted interview i believe this is his third episode
00:05:34.600
as you know i occasionally do this i present my appearance on another person's podcast
00:05:40.220
i think you might enjoy it and i think you'll enjoy this one chris is a great interviewer
00:05:46.120
and as you'll hear he's a little concerned about some of my views and he pushes back on me from time
00:05:52.360
to time this was actually very noticeable after my first ted talk in 2010 where chris came on stage
00:05:59.840
and asked me some very worried questions about my views on the hijab and i think chris has been
00:06:06.500
worrying about me ever since anyway what chris has done with the ted conference is truly incredible
00:06:12.420
and it was an honor to be interviewed by him so please enjoy my conversation with chris anderson
00:06:19.680
welcome to the ted interview i'm chris anderson and this is the podcast series where i sit down
00:06:33.780
with a ted speaker and we get to dive much deeper into their ideas than was possible during their ted talk
00:06:39.820
my guest today is sam harris philosopher neuroscientist author podcaster sam has been at the heart of many
00:06:50.300
of the most provocative conversations out there today politically i would place him at what you
00:06:56.600
might call the radical center a stern critic of donald trump but also of political correctness for
00:07:02.760
example he has infuriated people on both left and right in almost equal measure but he has also
00:07:09.960
delighted many many people because of his clarity of thought and his fearlessness in how he expresses
00:07:16.940
those thoughts sam's podcast waking up is super popular i'm a regular listener and he's also famed for
00:07:24.560
his book called the moral landscape that was the subject of his first ted talk most people probably here
00:07:30.760
think that science will never answer the most important questions in human life questions like
00:07:36.940
what is worth living for what is worth dying for what constitutes a good life so i'm going to argue
00:07:43.520
that this is an illusion the separation between science and human values is an illusion and actually
00:07:49.080
quite a dangerous one at this point in human history so the debate is over the nature of morality
00:07:55.960
morality specifically is there such a thing as objective moral truth or is morality inherently
00:08:03.600
subjective in which case all moral statements are ultimately just statements about the values an
00:08:09.260
individual or a culture happens to hold so let's give an example here i mean look if i say something
00:08:14.820
like it's wrong to lie or we should all stop eating animals are those ultimately just your personal
00:08:22.060
moral values or might there be a sense in which they can objectively be judged to be true or false
00:08:28.560
if you believe in god there's an easy enough answer to this question good is what god has revealed to us
00:08:34.160
is good he's created human beings with consciences and with a holy book that sets out what is right and
00:08:39.240
what is wrong but most modern philosophers academic scientists don't think you can outsource morality to god
00:08:45.780
they would say there is a fundamental difference in the world between facts and values facts are statements
00:08:52.880
about the real world they can be true or false values are human creations they differ between different
00:09:00.320
cultures we can debate them but ultimately there is no objective arbiter of the truth of a moral statement
00:09:07.620
what's interesting about sam harris is that although he definitely doesn't believe in god
00:09:13.120
he does believe that statements about moral values are ultimately objective statements in his view we
00:09:19.760
can discover the truth about those statements through an ever deeper knowledge of science of psychology
00:09:25.980
for example of how human societies operate and of the exercise of reason there's a lot at stake here
00:09:32.520
if sam harris is wrong and the majority of scientists and philosophers are right then it's hard to see how
00:09:38.720
there can be such a thing as moral progress if a moral system is simply the subjective values that a
00:09:44.880
culture creates it puts a limit on how much you can argue against views you disagree with like the
00:09:51.180
sanctity of life or child marriage you just have no real answer to the position look this is what i and my
00:09:59.620
family going back generations choose to believe if sam harris is right on the other hand it becomes
00:10:05.940
possible to argue that certain cultural values are objectively wrong and must be changed and to
00:10:12.100
present real evidence as to why that might be so and looking forward it impacts how we build ethical
00:10:19.360
decisions into the technologies we're creating like machine learning artificial intelligence social media
00:10:26.180
algorithms self-driving cars there is much to ponder here it's not just a philosophical argument it's as
00:10:33.960
important a conversation as there is so let's go sam let's start here um how can you build morality
00:10:43.000
out of mere reason and science and perhaps you could even start by defining what morality even is
00:10:49.200
well i would i would say that it is anchored to the the fact that we are in relationship with one another
00:10:57.580
so if you're in a in a universe of one if you're on a desert island the ethics of your living don't
00:11:05.580
come into play because there's no other conscious system that can be affected by what you do so if
00:11:10.620
you're truly alone and can't harm or benefit anyone then we don't really talk in moral terms we talk in
00:11:15.920
just in terms of well-being so a moral system is the rules by which we should treat each other or not
00:11:21.960
treat each other how do you create the rules by which to treat each other i mean how do you build
00:11:27.240
a moral system from the ground up uh just imagine that we have no notion of should or ought there's
00:11:33.520
there's nothing we should do this thought has not occurred to anyone yet and even the notion of right
00:11:38.640
and wrong and good and evil hasn't occurred to anyone it's just there's just we just find ourselves
00:11:42.680
in this universe and the circumstance which we didn't create is one in which conscious minds like
00:11:49.480
our own are susceptible to a vast range of experiences and some of these experiences suck
00:11:57.040
right i mean this unambiguously like and if you if you doubt that just imagine having every variable
00:12:03.300
that conspires to make you miserable turned up you know to to 11 right so if you're if you doubt this
00:12:10.080
you know go to a hot stove and put your hand on it right i mean that that is a powerful philosophical
00:12:15.040
argument the experience you will have there is deeper than your doubts about the whether morality
00:12:22.080
can be anchored to reason right and if you think well it only burns my hand because i am you know my
00:12:27.160
mind and body are constituted in such a way well yeah that's precisely the point i'm saying that every
00:12:32.180
possible mind has a is susceptible to a range of experiences given the physics of things we don't
00:12:39.100
know how consciousness is actually integrated with physics that's a mystery but there is some
00:12:44.300
relationship and we live in a universe where conscious minds have a a range of conscious
00:12:50.940
states and some of these states are better than others and i think that claim that there's that's
00:12:55.920
that the worst possible misery for everyone is bad and that every other state of the universe is
00:13:02.320
better i think that is as rudimentary a claim as we ever make in reasoning about anything it's it's
00:13:09.060
as a rudimentary as two plus two makes four it's as rudimentary as events have causes uh there's
00:13:16.260
nothing it is bedrock and we know there are many other conditions on offer which are far better than
00:13:24.020
that right uh some are sublimely rapturous and filled with beauty and and apparent meaning and
00:13:30.740
we just just all the satisfaction that the luckiest people we know and and and ourselves in our best
00:13:36.580
moments have enjoyed while alive and so what i would argue is that what we have on our hands is a
00:13:41.940
navigation problem we are navigating in the space of all possible experiences and so let me just but
00:13:47.680
let me just push uh on one sort of philosophical point back there in describing the worst possible
00:13:53.940
state for all people though couldn't two different people look at two universes and disagree about
00:14:01.120
which one was actually a worse state in in one let's say everyone was making this god almighty mess
00:14:08.860
they were just creating mud everywhere there is it was the ultimate sort of pigsty and it was disgusting
00:14:13.820
to look at and in another there were people being hurt really badly but there was also you know
00:14:21.100
this beautiful artwork in the sky that was somehow some creation of perfection like like people could
00:14:26.760
disagree about which of those two was that was the the worst couldn't they yes what you're saying is
00:14:31.100
true of my picture of morality in general and that's why i called it a moral landscape where you have
00:14:36.420
peaks and valleys and some of these peaks could be equivalent and some of the valleys could be equivalent
00:14:42.280
but yet different so that you could have societies of people functioning by very different principles
00:14:47.380
and moral intuitions and senses of what's right and wrong and they could be enjoying they could be
00:14:51.700
enjoying equivalent states of well-being that are irreconcilable right so you could have a an island
00:14:57.240
of perfectly matched sadists and masochists say and they might be happy by their own lights but we would
00:15:03.120
look at them and say that's just that's a completely bizarre and undesirable way of living so there is a kind
00:15:07.780
of moral diversity possible in my picture but but for this example this thought experiment just imagine
00:15:13.160
that every conscious system in that universe suffers as much as it possibly can for as long
00:15:20.180
as it can so if you're telling me there's somebody who would consider a universe of dirt to be worse
00:15:26.520
than a universe of you know painful torment well then that's the universe he gets right so like we're
00:15:33.100
putting we put every however your mind is constituted so as to suffer to the ultimate degree for as long
00:15:39.020
as possible that's what you get in this universe and so someone who said but suffering isn't the
00:15:44.940
point you know say injustice is the point so i i think a worse universe than that is one where people
00:15:51.500
may not be suffering but there is greater injustice yeah the reason why justice seems important to you
00:15:57.820
is because it seems important to you there's an experiential component to this right so one simple way
00:16:03.120
it seems to me almost of of getting to your to your argument is just to imagine um a scientific
00:16:09.160
comparison between here are two universes and they're actually identical in every regard except that in one
00:16:15.980
one child is suffering and and in the other that same child is is not suffering every control for everything
00:16:23.240
else yeah and it it feels like it's not a stretch to say as a fact that that universe with a child is not
00:16:30.360
suffering is is is better yeah so as long as you can get me give me this spectrum of better and worse
00:16:36.240
uh that's all i need and there are several double standards here that people observe by default which
00:16:43.500
are the the source of our what i would argue is our confusion about morality so one double standard is that
00:16:49.620
even the most hard-headed scientists use a totally different standard so to give you an analogy here
00:16:56.500
if you take something like physics as the the prototypical case so if someone shows up at a
00:17:02.440
physics conference with his cockamamie view of physics that can't be integrated with standard
00:17:08.940
physics if someone wants to to argue for a biblical physics say that person just doesn't get invited back
00:17:15.840
to the conference i mean this is there's no burden upon mainstream physics to incorporate that view into
00:17:21.700
physics and no one would be tempted to say on the basis of you know finding people who think that
00:17:27.400
the earth is flat or they've invented some perpetual motion machine or whatever it is no one takes those
00:17:34.140
claims seriously and so it is with medicine if someone came to a hospital or into a medical school and
00:17:42.540
said you listen you know i have a totally different conception of human health and it entails you know vomiting
00:17:48.920
continuously and being in continuous pain and then dying soon that's my that's how i'm i'm going to
00:17:53.860
define health you know this person's working with a conception of health that doesn't matter to us
00:17:58.980
for good reason there are obviously controversies in science and those are debated even for decades
00:18:03.860
and sometimes they overturn our standard conception of of what is true but the kind of radical skepticism
00:18:11.900
with respect to maybe there's no such thing as science right and maybe there's no such thing as truth
00:18:16.320
that doesn't continually undermine our conversation whereas with morality it does and so when you when
00:18:22.780
you find an another group behaving by a totally divergent moral code a group like isis say so you
00:18:31.100
know you know isis thinks that the best thing to do is kill apostates kill blasphemers throw homosexuals
00:18:38.240
from rooftops take sex slaves and in the best case even dying for the privilege of doing all this in
00:18:46.460
an act of martyrdom right so this is their conception of a life well lived and people look at this in the
00:18:52.420
west and you know well educated over educated people with phds and you know people who are have
00:18:58.720
careers as bioethicists right people who for whom thinking about what is good and right and beautiful
00:19:04.260
in a western context that is their job they look at this diversity of opinion and they say well
00:19:09.860
who are we to say that this is wrong all we can say is that we don't want to live that way
00:19:15.620
but it's a mere preference and then this gets connected to a descriptive notion of how we got here
00:19:24.260
like when you look at how our moral toolkit has evolved i mean we're social primates that have
00:19:30.740
our morality anchored to certain emotions like disgust and jealousy and and a capacity for empathy and
00:19:39.500
and we look at these evolved capacities and we say well there's nothing about the process that got us
00:19:45.980
here that is causing us to track anything of substance about you know the way the world is you
00:19:52.140
know we're not in touch with reality when we're moralizing we're just apes with preferences and so these
00:19:57.640
these two things that the fact of moral diversity and the fact that much of our morality is anchored to
00:20:04.380
these evolved apish tendencies those two things have led many many smart people to believe that there's no
00:20:12.280
there there there's no truth with respect to right and wrong and good and evil i mean by presenting the
00:20:16.780
isis case there you've started with the sort of the awkward logical implication of of moral relativism i mean
00:20:23.640
most people wouldn't wouldn't start there they would say what they're protecting is for example if we
00:20:27.540
discover a culture in the amazon rainforest never been discovered we discover that they have certain
00:20:32.000
ways and certain moral preferences and how they run their society who are we to judge and say our western
00:20:37.900
ways are better but um that kind of um anthropology driven moral relativism as championed by people like
00:20:44.920
margaret mead becomes quickly a kind of an absurd position where you can't say that objectively isis's views are
00:20:52.640
wrong that's just their culture we can't say they're wrong all we can do is is fight them um but i just want to
00:20:59.060
just plant a flag there because you you mentioned anthropology which is a discipline which 70 years ago in the
00:21:05.220
aftermath of world war ii explicitly said these are the the american anthropological association explicitly said when when the
00:21:12.920
u.n was trying to develop a universal conception of human rights the anthropologists all lined up and said
00:21:18.840
this can't be done this is a fool's errand there is no such thing as universal human rights but think of how
00:21:25.000
ethically questionable that position is like there's there's no way to say that you know clitorectomies
00:21:30.720
are a bad thing say it's pure delusion the moment you link morality to the well-being of conscious creatures in
00:21:40.820
general and and people in particular once you draw the link between human flourishing and morality
00:21:47.680
which i think is i think the link is very direct and we can talk about that but once you draw that
00:21:53.320
link to say there are no right and wrong answers here is tantamount to saying we will never know
00:21:58.340
anything about human well-being and there'll be no human psychology that can tell us that how people
00:22:03.880
flourish there'll be no sociology there'll be no economics there'll be no other discipline that that can
00:22:09.000
give us right and wrong answers and that's that i think is is wrong okay so what you're saying is
00:22:14.940
that the route from science to morality as as it were you've described as a sort of a reason-based
00:22:21.740
route there's another route that people might give which is a sort of an evolution-based route to
00:22:27.260
morality which would say that it's completely credible to believe that apes and certainly our ancestors
00:22:34.040
evolved a conscience or perhaps multiple consciences if you like moral instincts that
00:22:40.080
guided behavior which turn out to be really helpful for surviving and promoting group collaboration and
00:22:46.480
and so forth but those instincts may be generally good and beneficial but may also be buggy as we know
00:22:54.940
that so many aspects of our psychology are just odd like it may have been fine-tuned for life a few million
00:23:00.840
years ago it definitely runs into all kinds of glitches in the world that we're in now and i think
00:23:05.560
what i hear you saying is that there's this incredibly important agenda of applying reason to the start
00:23:11.280
point instincts that we have and this of course is where it gets gets really hard people do i mean
00:23:17.500
jonathan haidt has spoken at ted and has argued you know that there are these different moral engines
00:23:21.280
going on in people some people care much more about fairness or about happiness um others care more
00:23:27.800
about honor or about purity or about justice and i think you want to argue that you can you can use
00:23:35.460
the tools of reason to bridge those gaps those are not fundamentally uh divisible chasms that that
00:23:40.660
can't be breached right yeah well so there are two separate projects here the two ways in which
00:23:46.320
science can weigh in on the question of morality one is to help us descriptively understand how we got
00:23:53.440
here and that is an evolutionary story a story about ourselves in terms of
00:23:58.680
our history as as social primates and just observing as a matter of psychology and sociology and every
00:24:06.220
other discipline that can be brought to bear on this that people have emotions and intuitions
00:24:14.520
uh various cultures have norms which everyone involved claim have something to do with morality
00:24:23.440
right so there's a feeling of disgust that people have and it's you know clearly it's it's ancient
00:24:29.020
origin is is to be anchored to things like smells and tastes uh and it protects the the the organism
00:24:35.080
from you know just just pollution but then as we've evolved it has this we haven't evolved any new
00:24:42.480
hardware and so what we have built in terms of our morality our norms and our sense of their violation
00:24:49.780
is anchored to this same circuitry so now disgust is doing a lot of work in the moral domain and the
00:24:56.700
political domain and even as some neuroimaging work i did early on shows that
00:25:03.560
the same circuits in this case the the insula in the brain are working to differentiate just truth
00:25:10.460
and falsity so that when you find a statement to be false it seems to activate the same network
00:25:16.860
and based on culture this can play out in very different ways so you can find cultures where
00:25:22.380
people find certain things disgusting which seem completely arbitrary to us right and and therefore
00:25:29.120
wrong and yet we and then we find other things disgusting and to take this down to something like
00:25:33.800
food preference there are cultures that that eat dogs and we find this absolutely disgusting right
00:25:39.140
and you know many of us eat cows the hindus find that absolutely disgusting and you know sacrilegious
00:25:44.900
so clearly we can't to talk about the ultimate wrongness of eating cows or dogs the conversation
00:25:51.480
can't begin and end at what people find disgusting right you you want you want to say it should be
00:25:56.920
possible to make progress on that by you know bring a hindu and a westerner together and let's let's have a
00:26:03.180
conversation and look at what's actually at stake here who's being reasonable who isn't and see if
00:26:08.020
see if we can't change those feelings and probably everyone listening can can think of things that
00:26:12.820
they were disgusted by at one point that they've maybe shifted on over time but but more generally i want
00:26:17.900
to make the claim that there's this there's another project which is just in principle is just as
00:26:23.420
scientific as the first project of telling an evolutionary story of how we got here and this project is
00:26:28.840
to talk about what is possible for us what what states of conscious well-being are possible
00:26:35.580
given the kinds of minds we have and given the kinds of minds we can someday have based on changes
00:26:41.860
whether cultural or pharmacological or genetic or you know just with neural transplants
00:26:47.680
or implants um we integrate our minds with technology who knows what states of consciousness are on offer
00:26:54.500
whatever is on offer a completed science of the human mind would be able to tell us just how good
00:27:02.060
it is possible to feel the truths about us will be known scientifically ultimately right so just as a
00:27:09.260
scientifically minded group of explorers could embark on a journey through a new landscape and try and
00:27:15.860
figure out the smart way to to navigate it using measurement and reason and and discussion among them so
00:27:22.980
a group of reasonable humans could navigate the moral landscape and and figure out new possibilities
00:27:27.220
better peaks as it were that we might um aspire to yeah that's a beautiful sounding project and
00:27:33.680
and certainly convincing you know to to many people but it runs into this this problem quite soon in
00:27:39.120
practice which is that from this start point you are putting yourself onto a springboard where you can
00:27:45.200
basically sort of sound for want of a better term morally superior you you will say you know
00:27:50.860
muslims your book is sick and uh promotes violence and it provokes this really strong reaction among
00:27:57.800
people that you that you are you are being discriminatory you are you know you are being in some cases
00:28:03.400
you've been accused of being racist because because of the strength of which people hear these views
00:28:08.080
expressed how could you persuade someone who's not in your world right now that these ideas are for them and
00:28:20.240
they're for all humans they're not just for you is it possible to bridge yeah well well first i should
00:28:26.620
say is that despite how undiplomatic i can be on this topic and and you know seemingly unpragmatic and
00:28:34.120
even inept in communicating these ideas in a way that people can hear them people's minds are changed all
00:28:39.860
the time and even in the most extreme case i hear from fundamentalists former
00:28:44.500
fundamentalist christians or former fundamentalist muslims i mean people who have described
00:28:50.100
themselves as this close to being jihadists i've heard from these people and met them in person
00:28:55.400
whose minds have been changed by a totally uncompromising and tone deaf and even apparently callous
00:29:03.980
criticism of their beliefs so it is a myth to say that someone can't be reasoned out of a position
00:29:10.460
they weren't reasoned into say right that's simply untrue people through the hammer blows
00:29:15.260
of reason all the time come out of their dogmatism and their poorly considered views i mean
00:29:21.220
islam is for whatever reason especially politicized and you reap a whirlwind of criticism
00:29:28.440
on the left politically for pointing out its obvious issues whereas
00:29:34.600
i can criticize fundamentalist christianity all day and win
00:29:38.360
plaudits from people on the left but the moment that turns to islam people worry that this is
00:29:43.520
somehow discriminatory and that's just a double standard we have to notice because it makes
00:29:48.400
no sense and the issue is that all we have is human conversation by which to orient ourselves to
00:29:55.760
these questions the most important questions in human life are questions we have to be able to talk
00:30:00.760
about and we have a very large proportion of humanity that is saying okay these questions these
00:30:09.680
most important questions how to live how to raise your children and what to die for these are
00:30:17.100
questions that we're not willing to talk about rationally these are questions upon which we have a
00:30:23.240
book that was dictated by the creator of the universe for whom we have no evidence which will be
00:30:29.260
sacred until the end of the world the book can't be edited and all that's left for us to do now is to
00:30:35.780
decide how completely we will be enslaved by the contents of this book and if you say anything about
00:30:44.980
this project that is disparaging or even skeptical i will consider you my enemy right that's where we
00:30:52.800
live and it's completely insane right it it is as though we were living in a world where people
00:30:59.720
were doing this with the plays of shakespeare or the iliad and the odyssey that's how
00:31:04.920
perverse and random it is and so it's appropriate to lose patience with the status quo so
00:31:12.420
this is such an interesting topic to me because i'm thinking about this a lot in terms of
00:31:16.600
ted actually in terms of you know ted speakers come to the ted stage they're coming to try to persuade
00:31:21.000
people of something often sometimes those efforts succeed brilliantly and sometimes they fail and
00:31:25.800
sometimes when they fail they fail for unexpected reasons not because the person said anything that
00:31:29.600
wasn't true or sort of reasoned in one way but because they did things that provoked unintentionally
00:31:36.560
a sort of a defensive reaction in the audience and so communication didn't happen i guess my question
00:31:42.940
to you is let's say that your project essentially is to sort of spread the good meme of moral progress
00:31:49.600
to the world you know how do you do that so that there's reason which is one way in which humans
00:31:54.580
persuade each other but most people don't use reason as their main listening tool and they're
00:32:02.080
listening to other things they're impacted by their emotions they're impacted by whether they trust
00:32:05.200
someone they're impacted by whether they feel that there's a connection there i just wonder whether
00:32:09.580
there's a discussion to be had it's more like a tactical discussion about how you would do best
00:32:15.220
to persuade people who aren't westerners say aren't liberal westerners or whatever that you are
00:32:22.400
right and that yes there may be some people who you have persuaded who've completely abandoned their
00:32:28.440
faith for example and come over to a different view but i think there may also be many other
00:32:32.840
people i could be right or wrong about this we haven't done a survey but who've heard you
00:32:38.220
they've heard the tone of some of what you've said and rather than being persuaded they've reacted
00:32:42.220
against it because it has sounded scornful or disrespectful is that a danger
00:32:48.460
yeah well first i would point out that a person's capacity to be offended the feeling of offense
00:33:00.420
is not an argument okay and it's not a virtue right we all have this thing
00:33:08.560
and in many people it is functioning like some kind of epistemological principle
00:33:17.660
like this is how i'm going to judge the correctness of a view i'm going to react
00:33:23.560
to it instantly right so if i say to you well there's good reason to believe that men and women
00:33:31.620
differ biologically you know start with the uterus and you can start counting from there and the more
00:33:39.160
science studies sex differences the more we have discovered that this extends to human psychology
00:33:46.900
and human cognitive abilities and interests and as you start linking those sentences
00:33:53.820
together people begin to get uncomfortable right the discomfort isn't evidence of anything
00:34:00.720
it's not evidence against it being true right correct not evidence but it is real and the
00:34:07.820
de facto impact of that may be that you lose part of your audience okay but i'm saying well yeah but
00:34:14.340
the point i'm making is that we have a project collectively you know seven billion of us we
00:34:18.820
have a project to get more and more people more of the time to become sensitive to cognitive and
00:34:28.180
emotional reactions that are making conversation and clear thinking impossible right and this is
00:34:34.600
one of them the feeling you get like i don't like the way this sounds it's a logical error
00:34:40.680
to move from that feeling to the conclusion that this counts as evidence against the view correct so
00:34:47.940
let's agree that that's a logical error but it might also still be a tactical error by you as the
00:34:54.340
persuader to trigger that offense sure in other words if you could make the case
00:35:00.440
a different way why wouldn't you well i do i make it depending on the situation i make it every
00:35:06.620
which way i'm certainly not a provocateur i'm not saying the offensive thing just to get a rise out
00:35:11.920
of people everything i say is sincere i'm not giving it the top spin that makes it
00:35:18.200
less accurate because i know i'm going to get a rise out of people in any case i think
00:35:23.620
there are enough people who are meliorists enough people who are bending over backwards
00:35:30.320
to not offend on these topics and what we need are more and more people to say listen
00:35:38.080
we can all be a little thicker skinned than that and we're paying a price for political correctness
00:35:43.180
we're paying a price for not being totally straight and it is just a fact that just to talk
00:35:51.180
about the narrow subject here but this applies to everything you know everything that's polarizing
00:35:56.040
it is a fact that we are paying an immense and generally unacknowledged and i would argue totally
00:36:04.800
unnecessary price for respecting this concept of revelation the idea that any one of our books
00:36:11.060
has an origin that is not merely human and that the moment you put a little pressure on that
00:36:17.760
belief you're already in the territory of deeply offending billions of people so that's true but
00:36:25.140
let me tell you a story about my own sort of engagement with both religion and islam
00:36:32.280
it sort of intersects with this a bit my parents were missionaries i grew up in
00:36:36.420
pakistan and afghanistan you know i grew up as a fundamentalist christian born-again christian
00:36:40.940
believed that my christianity determined whether i go to heaven or hell my father was in
00:36:47.180
pakistan for many years in the belief that unless he could persuade muslims to accept christ
00:36:54.520
they would go to hell and this drove his whole approach to life and he was by his
00:36:59.060
own measures a deeply moral person instead of making money in the west he was out there as a
00:37:04.560
pauper trying to persuade people of this belief over many years um he and my mother got to know
00:37:11.080
many muslim families at much greater depth and something kind of surprising happened
00:37:19.360
they found that many of them were deeply spiritual and shared a lot of the sort of concerns they did
00:37:25.820
they were concerned for the poverty they saw around them they obviously you
00:37:31.420
know sacrificed for their families and they worked hard and they're incredibly hospitable
00:37:36.000
and spiritual in the sense of sort of the quest for something deeper than themselves they found a
00:37:40.820
connection to him and slowly his views shifted to believing that actually muslims and
00:37:47.520
christians quite probably worship the same god just by different names and with different
00:37:52.820
accompanying beliefs but it was a monumental shift for someone who you know started from where he started
00:37:58.980
and in his conversations with muslims he could go a very long way by starting from a position of
00:38:08.320
respect of emphasizing the good things that were there so i guess my question
00:38:13.780
for you is that i hear what you say and i believe what you say about your
00:38:19.120
sincerity you're very very passionate about what you believe what people sometimes hear is a
00:38:24.880
withering scorn which is very effective like in your ted talk i picked up one line you made
00:38:24.880
about millions of muslim women being trapped in cloth bags and if you say that as opposed to
00:38:36.740
a different way of interpreting it like we could say that i'm here trapped in a cloth bag
00:38:40.620
around my body for different reasons if there was a different story that said
00:38:46.720
look there are so many things that are extraordinary about your tradition and uh your religion there's this
00:38:52.940
emphasis on mercy there's this emphasis on compassion on hospitality many many muslims spend so much of
00:39:00.020
their lives trying to figure out how to make the world a better place they're not focused on the
00:39:04.500
stories around violence and so forth that are part of the quran but arguably open to different
00:39:10.660
interpretation if you start from a different point don't you have a better chance of persuading
00:39:17.320
the silent majority of muslims to take you seriously and to want to be part of the solution
00:39:25.720
as opposed to provoking them to sort of say uh-oh an outside meme coming in defend reject and to
00:39:34.180
sort of close you out and someone could argue that you make life harder for moderate
00:39:39.940
muslims because of that sort of feeling of scorn for the whole enterprise right well first
00:39:48.420
it's virtually never that i'm communicating scorn for people it's ideas that i'm criticizing and
00:39:55.680
what you talked about with your family's experience in pakistan and afghanistan
00:40:00.320
none of that is surprising to me but there's just a deeper principle here that we're human
00:40:05.300
beings we are on some basic level running the same software and you know culture is laid on
00:40:12.800
top of that but we have a deeper psychological capacity for empathy and ethical engagement and even
00:40:19.260
spiritual insight and i don't shy away from this concept of spirituality i do consider
00:40:24.160
myself a very spiritual person as you know i've spent a lot of time practicing meditation i've spent
00:40:29.600
all told some years on silent meditation retreats and i think that the contemplative life is something
00:40:36.820
that we have only begun to think about in western scientific rational terms and it is a part of the
00:40:43.160
spectrum of human experience that i think is undoubtedly there and worth understanding culturally
00:40:49.820
the problem is that the respect for tradition and in particular the respect for revelation
00:40:54.880
keeps us balkanized into these separate moral communities that do have irreconcilable differences
00:41:01.720
and so you pointed to the possibility of christians and muslims having a kind of rapprochement
00:41:07.520
around the idea that we both worship the same god you can't play that game with hindus right the
00:41:13.560
bridge from christianity to hinduism only runs in one direction hindus can say well
00:41:19.160
jesus is just an avatar of vishnu but the christians can't look at hinduism with its multiplicity of
00:41:24.740
gods you know thousands of gods many of whom are the most garish fictional monstrosities
00:41:30.720
from the point of view of christianity you've got people worshiping monkeys you know like hanuman
00:41:35.540
or elephant-headed gods like ganesh none of that makes any sense it's all understood in a
00:41:41.940
context of karma and rebirth that also makes no sense uh somebody's right and somebody's wrong and
00:41:47.520
somebody's going to hell right if you believe these things and there's no reason we should find
00:41:53.360
ourselves in this circumstance for centuries more right and we just don't
00:41:59.740
have the luxury of waiting for centuries to change our views on these topics but sometimes
00:42:07.160
people like yourself i would say and others who sort of criticize religion sound as if
00:42:13.780
they're coming to it as if religion were a belief like say the belief that jupiter is the
00:42:20.060
biggest planet that someone if they were persuaded or shown how ridiculous an idea it was
00:42:25.500
would abandon it and possibly miss but again i should say that does happen that does happen
00:42:32.580
it can happen but i think it's a deeper thing that you're trying to overcome i mean
00:42:38.880
as someone who grew up religious it's not just a belief it's a relationship you know to let go
00:42:46.360
of a belief in god is worse than getting divorced it's a relationship with
00:42:52.560
what you believe to be someone you've had a connection with your whole life and so
00:43:00.300
when i think of incoming critiques of it yes you know there is a discussion to have around reason
00:43:08.780
but it also feels like you know your core identity is being attacked and i think when that
00:43:14.280
happens to humans a whole other set of defense mechanisms come into play and i guess what i
00:43:19.640
wonder is let's say we agreed that a world where religion did not play the dominant role that it
00:43:26.620
does now could be a better world how would you get there it might not be a head-on assault on
00:43:34.980
religion it might be like if you look at what's happened with christianity a few hundred years ago
00:43:39.500
christianity was at least as violent as the most extreme aspects of of islam today and gradually
00:43:47.740
most christians have just downplayed those interpretations of the bible those arguments
00:43:54.420
you know the people espousing them have lost ground because most people don't need as coherent
00:44:00.460
and consistent a worldview as you need and as i kind of feel like i need most people are able to
00:44:05.820
embrace an element of contradiction and to say you know i love the traditions here and i i believe in
00:44:11.740
some of the core ideas of religion even with hinduism christians and hindus can unite on certain
00:44:18.340
things they can unite in an idea that life is about more than shopping it's about you know the
00:44:24.480
exploration of mystery and wonder and the divine and the pursuit you know perhaps god is all around us
00:44:28.740
gods are all around us those two things don't have to be that different they can agree on compassion
00:44:32.660
as a core operating system value of religion i would argue it's possible that an approach
00:44:39.020
that said i love that about your religion that that is at the heart of it and that you do that
00:44:44.560
i'm uncomfortable with this can we talk about that but it starts from a position
00:44:50.180
of respect for what is good and arguably the single biggest thing that
00:44:57.520
religions do that fundamentally plays to the moral world you want is that they have persuaded billions
00:45:04.380
of people that they should pursue the interests of others over themselves and that i think is
00:45:12.000
the hardest thing the abandonment of religion hasn't really handled yet is that by saying you know
00:45:19.060
all those rules don't matter anymore find yourself follow your passion be your thing we haven't
00:45:24.360
inspired enough people yet to say as a core part of that by the way don't just live for yourself
00:45:31.280
well i think the need is to be able to talk about the most important questions in human life without
00:45:39.620
losing our connection to one another and we are not playing that game well we need to be able to hear
00:45:46.340
people out we need to be able to reason about everything because reasoning is the only thing that scales
00:45:53.520
it's the only way of talking about a problem which stands a chance of being universalizable
00:46:02.140
and this is why identity politics is clearly a dead end it can't be that this thing is important
00:46:11.440
and the whole world needs to take it on board because you are you or because you have the color of skin
00:46:18.820
that you do right by accident whatever this thing is if this is going to relate to
00:46:25.140
you know our building a durable cosmopolitan pluralistic future together this thing has to be
00:46:31.860
true and important because it's touching the way the world is for everyone on some level it's touching
00:46:38.940
some universalizable principle of human psychology and human flourishing and economics or whatever it is
00:46:44.820
and our religious provincialism doesn't do that our incompatible claims about revelation don't do that
00:46:52.860
the mere accidents of birth and skin color and gender don't do that and so again
00:46:59.540
we have to be able to reason as human beings very much in the style of john rawls so to take him for a second
00:47:05.860
he had this brilliant thought experiment called the veil of ignorance or the original position
00:47:11.480
where he asked us to imagine organizing a society such that you know we figure out what we
00:47:18.600
think are just and fair arrangements between people but we do this from behind a veil of
00:47:23.880
ignorance where we don't know who we're going to be in that society right and this is a starting
00:47:29.160
position where you then could imagine that whoever you are whether you're a neo-nazi or a black person or
00:47:37.560
a muslim or an atheist whoever you are not knowing who you're going to be in this society
00:47:44.680
this is a heuristic that could allow people to converge on principles of fairness uh without
00:47:50.280
them having to sort out their differences in advance from that veil you could have a reasoned
00:47:54.520
discussion about what would be the limits of inequality in a society what would be the
00:47:58.040
fundamental rights that you would want at a very minimum to have and you don't know if
00:48:03.240
you're you know high iq or low iq you don't know where you will fall right
00:48:09.320
and this is a principle that generally i think is unacknowledged that we have to spend much more
00:48:14.840
time acknowledging which is that so much of this comes down to luck i mean some people are so much
00:48:22.280
luckier than other people you're lucky to be born in the right place at the right time to the right
00:48:26.920
parents with the right economic opportunities and all of those switches can be toggled in the
00:48:33.880
other direction and you have none of that and through no fault of your own this is a massive
00:48:40.360
lottery and so much of what will make the future better is for us to care about the most shocking
00:48:47.960
disparities in luck and correct for them collectively and that john rawlsian conversation
00:48:54.840
seems like a beautiful thing imagining that people just using the tools of
00:48:59.000
you know reason and fairness and discussion among them you could come to certain basic
00:49:03.080
fundamentals of a society for example as soon as you have that discussion everyone puts basic
00:49:08.520
health care and education right at the top of the list of things that a society of course
00:49:14.360
would do because you would want that at a minimum to give yourself a chance
00:49:17.880
and you can build many other things on top of it and then the tragedy of the
00:49:22.760
present seems to be that certain discussions seem to get shut down before they can even start with
00:49:28.040
the lines that you can't say that to me because of who you are and who i am yeah well that's identity
00:49:34.200
politics yeah and i agree with you that it's a tragedy i have this picture of
00:49:39.000
these sort of two different audiences so from the view of someone speaking to an audience you know
00:49:42.440
there's an audience here's the speaker and the people are watching and their
00:49:46.520
eyes are open and their arms are open and they're excited and they're listening and then the other
00:49:51.400
audience where they've heard something that means oh i don't know about you and they're in protection
00:49:56.120
mode i think there are strategies to provoke you know the opening of the arms and
00:50:02.440
the listening and one obvious question i guess is whether we should start every conversation when
00:50:10.200
there are different identity groups involved with some kind of recognition of you know the biggest
00:50:15.080
concerns of the other group i wonder whether that's something that we need to spend more time on
00:50:18.680
well i think there are tricks as you say that are very useful and that we are paying a terrible
00:50:24.920
price for not remembering enough so it's like let's not call them tricks let's call them
00:50:30.760
wise maxims yeah but it's something like you know so many of these
00:50:36.600
arguments occur with each side straw-manning the other right so you take the worst version of your
00:50:43.640
opponent's view one that he or she wouldn't sign off on and you attack that and
00:50:50.280
that's totally unpersuasive what we need is the opposite so there's a notion of steel manning which
00:50:54.760
is now a term of jargon among us where you prop up the best possible version of your
00:51:02.760
opponent's view which they will not find fault with right so let me just summarize what i
00:51:09.240
think you think and then what you put into that place is perfect right that's the way to start one
00:51:15.080
of these debates right now and i would add one more tweak to it
00:51:20.120
perhaps sam which is not just this is the way you think even before that it's this is the way
00:51:25.560
that you feel and i think you know that feelings are so fragile right now that people
00:51:31.160
want that recognition first almost to feel that that more human connection not just an intellectual
00:51:36.680
connection yeah although i would say that this dichotomy between reason and emotion or the
00:51:41.880
intellect and feeling is a bit of another one of these myths it's certainly not the
00:51:47.400
case neurologically and i would say it's generally not the case experientially i mean
00:51:52.520
none of this is divorced from emotion for me you know like in that first ted
00:51:58.600
talk i gave for you where i was talking about you know the moral landscape and how science can
00:52:03.880
understand human values and i'm you know i'm a spock-like character i actually almost burst into
00:52:09.720
tears at one moment i'm talking about honor killing and i then ask you to imagine what it's like to be
00:52:15.400
a father who believes that the family honor and male honor is so predicated on the sexual purity of
00:52:24.120
the girls in the family that when his daughter gets raped what he's moved to do is to
00:52:30.600
kill her out of shame right so just by stating that example i virtually burst into tears
00:52:36.760
so i'm reasoning in a cold and calculating way about what is right and what is wrong and the power of
00:52:41.800
ideas but this is all just a neuron away from a very energized and feeling
00:52:51.640
laden context but if your audience hadn't been the ted audience say but had been an audience
00:52:58.360
in an islamic country there is an edit to that talk that could
00:53:04.600
have made it much more effective which is to start by saying look i understand the beauty and the
00:53:12.280
idea of honor i understand that you come from a tradition where family values are deeply respected
00:53:18.600
where you want to celebrate the purity of marriage you don't want people engaging in
00:53:24.520
widespread infidelity you look at what's happening in the west and you're horrified by what you see
00:53:29.400
you're horrified by the movies you see you don't want your society to be like that i
00:53:34.520
understand the beauty of that and then from there you go but you can't go from there to the
00:53:39.240
horror of well i did sort of make that point even in that talk but this is a point i do make
00:53:46.440
it's not hard to see the merit in the criticism of western superficiality and materialism and
00:53:55.560
blindness to what is sacred or possibly sacred about our appearance here we have done a bad job as you
00:54:01.560
say in secular culture particularly in the west in valuing something more than just gratifying one
00:54:10.680
desire after the next you know and so it can't all be a matter of getting nice tastes on your
00:54:17.000
tongue and buying the most expensive watch you can afford right and yet clearly we need a deeper and
00:54:23.880
truly universalizable conversation about what is most profound and what is possible here and again
00:54:31.880
this is where you have to draw the line and have to be uncompromising i think
00:54:37.880
the idea that we will get to a good place by simply reducing our adherence to these irreconcilable
00:54:50.120
claims of revelation like they're getting christians and muslims and jews to be just a little bit less
00:54:55.400
fundamentalist more of the time that incremental effort is the end game i think that is
00:55:03.720
clearly untrue because the problem is there's an asymmetry here there's an advantage to fundamentalism
00:55:10.120
always because when you go back to the books the books never tell you to be a moderate they never
00:55:15.400
tell you the problems with fundamentalism and fundamentalism can always be rebooted by
00:55:21.000
merely adhering to the text and there's something more honest about it and this again is
00:55:27.480
where there's an asymmetry within every one of these traditions where the fundamentalists are on
00:55:31.880
firmer ground theologically than the moderates or the liberals because they can always say listen
00:55:37.400
i just want to know what the book says i want an honest adherence to what is here on the
00:55:44.120
page and what that gets you is intolerable right you have to be doing some advanced not entirely
00:55:52.840
straightforward casuistry with the book to edit out the bad parts so let me pull you back almost
00:56:00.520
to the start point of your position here your start point comes from recognizing that all that
00:56:07.720
matters are things that happen to sentient beings if an atom moves here and there the
00:56:12.920
universe no big deal if something suffers or enjoys something that matters and that's the
00:56:18.520
anchoring view of the position so that's fundamentally a statement about consciousness
00:56:24.520
consciousness yeah and yet consciousness i think in your view certainly in mine is the one big
00:56:32.600
thing that we know about that science so far has miserably failed to give a really compelling
00:56:38.920
explanation of i would say so you've got a view that science can get
00:56:46.280
you to a sort of rational view of right and wrong of morality that's anchored in a story about
00:56:52.360
something that science really can't explain how do you think about that is
00:56:56.840
that a paradox or yeah well as you know i'm one of these people who believes
00:57:04.120
there is a so-called hard problem of consciousness that consciousness is unlike anything else we've
00:57:08.680
attempted to study or understand scientifically and it is simply a fact that the only evidence for
00:57:16.040
consciousness in the universe is our direct experience of consciousness itself but the flip
00:57:24.360
side of that is that consciousness is the one thing that can't be an illusion it's the one thing we
00:57:29.080
can't be mistaken about consciousness whatever it is exists i think therefore i am the original yeah
00:57:34.600
but i think descartes might have meant something very close to this but
00:57:38.920
consciousness is deeper than thought and you know the i am part is also fishy
00:57:44.600
because i think the self is an illusion the self is a construct there's no stable unchanging
00:57:49.880
self carried over from one moment to the next something feels therefore something is yeah
00:57:55.240
there is something that seems to be happening and that seeming is what we mean by consciousness so
00:58:00.840
even if we're not actually doing a podcast now and you're just dreaming that we are
00:58:05.000
even if we're just brains in vats if we're in the matrix we could be radically confused about
00:58:09.960
everything but whatever this seeming is the fact that the lights are on that is consciousness
00:58:17.480
the fact that there's a qualitative character to our appearance here, to being. That some systems
00:58:24.120
have it and probably some systems don't, and even some parts of the brain have it and some
00:58:28.680
parts of the brain don't, that is mysterious. But the fact that it is so is the one thing that isn't
00:58:37.720
open to any possible doubt. So it is a kind of paradox, because it's the one
00:58:43.720
thing that is doing all the understanding. We don't understand consciousness,
00:58:48.920
but unless something appears in consciousness, it isn't an empirical datum to be taken into account
00:58:55.400
at all. Is there any hope that in the next ten years, say, we make material progress in
00:59:02.520
understanding consciousness? It's been this riddle for thousands of years. It feels like,
00:59:07.720
in some ways, there are going to be dramatic new data points over the next decade as the machines
00:59:12.520
we build start to exhibit what looks very much like conscious behavior. Do you think that's going to force
00:59:19.240
us to make decisions, like the decision on whether the things we create are conscious or not?
00:59:26.520
There are huge implications to that. Do you think we'll be able to make a wise decision about
00:59:31.960
that, or will it just remain impossibly impenetrable? Well, I think several things might occur, and it
00:59:39.560
matters which universe we find ourselves in. I think it's hugely consequential that we might
00:59:46.520
build conscious machines, and therefore machines that can suffer and machines that can experience well-being,
00:59:52.520
and perhaps suffer unimaginably horribly in ways that we don't understand, or experience well-being
00:59:59.240
that exceeds our own. That, ethically, is of enormous importance. In certain
01:00:06.680
cases you could imagine it being the most ethically consequential thing that has ever happened in the
01:00:11.080
universe. If we could build simulated worlds that are essentially hell realms and populate them with
01:00:16.280
conscious minds, that would be the worst possible thing we could do. And it would, I
01:00:22.200
should point out, give us the same moral stature as the God of the Bible or the Quran,
01:00:28.120
if he exists as believers believe he does. Which is to say, this is a completely psychopathic thing to
01:00:33.720
do: to create a hell and populate it. So it matters if that's possible, and it certainly
01:00:39.320
matters if we stumble into that circumstance not knowing we've even done it. We
01:00:43.800
wouldn't want to create hell on purpose, and yet it's possible
01:00:48.040
that we could do it inadvertently, given just the physics of things. What I think is quite likely, and
01:00:55.000
pretty undesirable from my point of view, is that we could lose sight of this being an interesting
01:00:59.720
problem in the first place. We could build machines that seem conscious, so credibly conscious to us,
01:01:06.520
far in advance of our understanding what consciousness is at the level of information processing. Our machines
01:01:12.440
will all be passing the Turing test. We'll feel helplessly thrust into relationship
01:01:18.200
with them. They'll make the right facial expressions. We'll design them this way, because we'll want
01:01:22.840
to interact with machines, at least in certain circumstances, that make us feel like we're in a
01:01:27.320
relationship with another person. And it'll just be obvious to us that our robot servant is
01:01:34.520
conscious, because it seems so. And if we don't know, there's a perfect disjunction here: we could
01:01:40.920
build systems that are not conscious but seem conscious, and we could build systems that don't
01:01:45.800
seem conscious at all, because we haven't built the interface for them to seem so, but in fact are
01:01:50.440
conscious. Perhaps Google is suffering right now, with all that complexity of information processing that's
01:01:55.960
going on, and is in woe at the dismal nature of all the searches that people are typing into it,
01:02:00.920
and wishes that the input could be better. Yeah, it's hard to take that concern seriously, but
01:02:05.640
something like that is certainly possible. Sam, precisely because we're building these
01:02:11.800
machines and making them more powerful, at some point we will have to make an effort to put human values
01:02:17.560
into them. So we're going to have to decide what those values are, and even if you just look at it from
01:02:23.240
that standpoint, it seems to me your work is incredibly important. These questions are incredibly
01:02:29.240
hard to resolve, but at some point we're building things that need to operate based on some kind
01:02:36.520
of moral code. So we have to bring more people into this conversation. We
01:02:41.240
have to try and figure out a way of having it that pulls in as many people as possible,
01:02:46.120
collaboratively and constructively, and get past this horrible moment in history where truth is nothing,
01:02:52.280
reason is nothing, and it's all just a fight. Well, yeah. This is philosophy on a deadline,
01:02:58.760
and this is one of the silver linings to the risk here: being forced to build our
01:03:06.920
values into technology that's becoming more powerful than we are will force us to ignore the academic
01:03:15.880
quibbles here and acknowledge that there are better and worse answers to moral questions.
01:03:20.920
To take self-driving cars as one example, and it's a near-term example, it's already here:
01:03:27.240
it's an engineering problem that we have to solve. Then the question is, what moral biases and
01:03:33.000
intuitions do you want to build into your robot cars? Do you want cars that run over white people
01:03:39.800
preferentially, because of all the white privilege in the world? Do you want cars that put the driver's
01:03:46.520
or the passengers' lives at some greater risk, if we're talking about a trolley problem where it's
01:03:52.680
the one versus the five or the one versus the ten? One child versus three old people. Exactly.
01:03:58.600
And to not answer these questions is to answer them one way or another by default. You either make your car
01:04:05.080
blind to the differences between people, or you make it sensitive to the differences. And so it's a
01:04:10.840
forced choice. I think people have different intuitions about what the right answers
01:04:15.800
are here, but clearly there are wrong answers, and there clearly are answers. And some of the traditional
01:04:21.320
answers that you would get from a religion like Islam, for instance, will, I bet, be judged wrong
01:04:28.120
even by a majority of Muslims when this technology has to come online for everyone. And if they are
01:04:34.520
judged wrong by a majority of Muslims, that's maybe an indication that people aren't incapable of moral
01:04:41.640
reasoning across, you know, long-held... This is what happens. This has
01:04:46.360
happened, as you pointed out, to Christianity in a very effective way.
01:04:56.680
We're not tending to meet the Christians of the 14th century anymore, and that's because of
01:04:56.680
what scientific rationality and secular politics and humanism and capitalism and just modernity
01:05:04.520
in general has done to Christianity. And to some degree the disparity we see between Christianity
01:05:09.880
and Judaism, and Islam, now is because Islam is a vast religion. It's
01:05:17.000
nearly two billion people, and much of the Muslim world has not suffered the same centuries-long
01:05:22.920
collision with modernity, and the collision it's suffering now is occurring over a much shorter
01:05:28.120
time frame, and without many of the same social and economic benefits being spread to these societies.
01:05:35.240
So we have to keep the endgame in view. The endgame has to be a viable global civilization
01:05:44.040
that is pluralistic, cosmopolitan, tolerant of difference, and yet convergent on the same answers
01:05:53.560
to the most important questions in life. We can't be radically tolerant of difference. These ideas are
01:05:59.640
for everyone, not for one group; for everyone. And you're ready to fight for that. I'm trying,
01:06:06.120
I'm trying, with your help. Sam, it's been an absolutely fascinating conversation. Thank you so much for
01:06:12.280
all your time here, and I wish you the best. If you find the Waking Up podcast valuable, there are many
01:06:23.800
ways you can support it. You can review it on iTunes or Stitcher or wherever you happen to listen to
01:06:28.200
it. You can share it on social media with your friends. You can blog about it or discuss it on your
01:06:33.240
own podcast. Or you can support it directly by subscribing through my website
01:06:39.160
at samharris.org. There you'll find subscriber-only content, which includes my Ask Me Anything
01:06:45.080
episodes. You also get access to advance tickets to my live events, as well as streaming video of
01:06:50.840
some of these events, and you get to hear the bonus questions from many of these interviews. All of
01:06:56.280
these things and more you'll find on my website at samharris.org. Thank you for your support of the
01:07:01.240
show. It's listeners like you that make all of this possible.