#115 — Sam Harris, Lawrence Krauss, and Matt Dillahunty (1)
Episode Stats
Words per Minute
181.77829
Summary
Sam Harris speaks with Lawrence Krauss and Matt Dillahunty at a live event in New York, the first of three with the same participants; the others take place in Chicago and Phoenix. They discuss the false alarm warning that a ballistic missile had been detected heading toward Hawaii, the Doomsday Clock and the dangers of complacency about nuclear weapons, Christian support for Donald Trump, how to deal with a world where technology is in everybody's hands and can be used and abused, and whether science can ground a universal conception of morality. They also touch on the is/ought problem, the role of intuition in science, consciousness, free will, and the illusion of the self.
Transcript
00:00:00.000
you've heard me with Lawrence Krauss before on the podcast. Lawrence is a physicist who
00:00:20.640
will be familiar to most of you. And Matt Dillahunty has moderated a couple of discussions
00:00:25.980
I had with Richard Dawkins, and you've heard him here as well. So, without more introduction,
00:00:32.160
I'll just say we get into several interesting topics here. We talk about nuclear war and
00:00:38.800
Christian support for Trump. Trump does not come up much. Many of you will be happy to know. We talk
00:00:45.020
about science and a universal conception of morality. We talk about the role of intuition
00:00:51.620
in science, the primacy of consciousness as a fact, the nature of time, free will,
00:00:59.840
the illusion of the self. Lawrence does not agree that it's an illusion. We may have to cover that
00:01:05.960
topic again. And there's a few more topics here. In any case, it was a fun event. It was great to
00:01:12.240
meet so many of you afterwards. These particular events are always followed by book signings, so
00:01:17.060
the event itself was just an hour and a half, but the book signing winds up going for two hours or so.
00:01:25.320
And that really is the chance to say hi. So, if you enjoy this conversation, there will be two others
00:01:31.380
with the same participants in Chicago and Phoenix coming up. So, if you live close to either of those
00:01:37.860
cities, feel free to come on out. Otherwise, I will try to get the audio and release it here.
00:01:43.860
And now I bring you the event I did in New York with Lawrence Krauss and Matt Dillahunty.
00:01:54.220
Ladies and gentlemen, it is my great privilege and honor to introduce the gentlemen who will be
00:01:59.340
joining me on stage. Please welcome Sam Harris and Lawrence Krauss.
00:02:10.380
They're standing. You just can't sit in the room.
00:02:41.080
We'll bring the lights up before we get to the Q&A. How are you, gentlemen?
00:02:47.500
Yes. As you know, but they don't. I came down with food poisoning two nights ago. So,
00:02:52.960
if I either vomit or have to run off stage, it's not because of anything these two gentlemen
00:03:00.960
Maybe if he liked you better, he'd feel better.
00:03:05.360
We'll see who runs off stage faster if you vomit.
00:03:07.940
I promise not to run off stage, mostly because I'm in boots that won't allow me to run anywhere.
00:03:15.400
So, today, you know, we're going to be doing three of these. New York is the first time
00:03:19.740
for the three of us together. And something happened today that was all over the news.
00:03:25.360
And I thought it might be an interesting spot to start. Hawaii had an incredible false alarm
00:03:32.240
today where an emergency alert system sent out a text message essentially saying that
00:03:38.840
a ballistic missile had been detected heading towards Hawaii and to seek cover. And this is
00:03:42.860
not a drill. And 39 minutes later, they announced that it was a false alarm. And it both intrigued
00:03:53.220
me and terrified me about the new world that we live in compared to, you know, when I was a kid.
00:04:00.240
The technology that's there to save our lives. And yet, things can go wrong because we're fallible.
00:04:06.140
Are we better off if we're terrifying people with false alarms? And how do we go about dealing
00:04:13.720
with a new world where technology is in everybody's hands and can be used and abused?
00:04:22.420
We are in a context where it's plausible to worry that missiles could be headed toward Hawaii.
00:04:27.420
So, that's the underlying problem. But you should say something about the Doomsday Clock.
00:04:30.860
People aren't worried about it enough. I think in just a little under two weeks, I'll be going
00:04:38.180
to Washington to announce the new value of the Doomsday Clock. I'm the chairman of the
00:04:42.880
board of the Bulletin of the Atomic Scientists, as you know. And one of the things that worries me
00:04:47.900
is that I think people become very complacent about nuclear weapons. Because they haven't
00:04:55.440
been used in over 70 years, people tend to think they'll never be used. And the real problem
00:05:02.520
is that this kind of thing became public. But there's a great book called Command and Control,
00:05:08.260
which is terrifying. And you realize how many close calls we've had. It's kind of amazing
00:05:15.040
that there hasn't been either an accident or panic.
00:05:18.540
If you haven't read it, that's Eric Schlosser's book. And there was a PBS documentary done
00:05:24.860
on it. And you should either read it or watch that documentary.
00:05:28.100
Read it, but have a bottle of scotch or something when you're reading it. Because it is really
00:05:35.280
terrifying, as it should be. And so, part of the problem, in fact, of this, there's a lot
00:05:41.400
of problems that people don't realize, that in fact, because intercontinental ballistic missiles
00:05:46.220
act relatively quickly, you know, in 25 or 30 minutes they can do their work and go most
00:05:53.680
of the way around the earth. We still live in a world where the United States and Russia both have
00:06:01.500
about a thousand weapons on a status where they're prepared to respond immediately. And as a lot of
00:06:11.380
people, I didn't want to mention this word, but until a guy whose name I won't mention came in
00:06:18.240
the White House, people didn't realize this. And I actually didn't realize it either until I was
00:06:23.580
writing a piece. But people now know, and if you don't, you should know this, that there is no
00:06:28.140
safeguard against the president launching nuclear weapons. There's no one he or she
00:06:34.360
would have to ask. There's no one who can say no. And there's no constitutional check on
00:06:42.280
that. And recently, some Congresspeople did discuss producing such a check. During the Cold War,
00:06:48.560
at perhaps the height of the Cold War, there was some reason for that, because there were 20,000
00:06:53.360
nuclear weapons that Russia, then the Soviet Union, and the United States were aiming at
00:06:57.380
each other. And, and the idea was you have to launch them quickly. But now there isn't that reason. And
00:07:03.380
yet we still have that. And that's, that itself is terrifying, because if that warning had not got,
00:07:08.880
and by the way, the warning I understood was due to a shift change, and someone pressed the wrong button
00:07:13.680
when they went off the shift. This is true. That raises a problem. When I have to check
00:07:20.300
out at the grocery store and swipe my credit card, I have to click yes, like 18 times just to pay for my
00:07:25.320
book. How is it, how could you possibly hit the wrong button in a shift change and not get a,
00:07:29.240
hey, are you sure you want to send this message? But, but imagine that went not to the sensible,
00:07:36.560
but scared people in Hawaii, but imagine that went right to the White House. Right. Okay. Well,
00:07:41.400
and to read Command and Control is to witness how, by sheer dumb luck, we have avoided nuking
00:07:48.180
one another, and even ourselves. I mean, we just literally dropped live nukes on, like, North Carolina, and two
00:07:55.160
of three safeties failed. And the final safety was like a manual toggle switch that was just in the
00:08:01.020
right position. And it's just... this book begins with a
00:08:06.720
potential nuclear weapon exploding in a silo. It is truly amazing. And it really
00:08:12.940
argues for something that we've been arguing at the, at the Bulletin, and certainly I try to write
00:08:17.980
about, which is that, that we are safer with fewer nuclear weapons and not more nuclear weapons,
00:08:22.340
because the more you have, the more likely there will be an accident or a false alarm.
00:08:28.680
And, and yet we're in a situation right now where there are no arms control treaties. And what I was
00:08:34.160
going to say at the beginning, which I think we were talking about beforehand, is what
00:08:37.600
discourages me when I write about nuclear weapons compared to almost anything else I write about
00:08:42.340
in the popular media, there's less interest. I don't know whether people don't want to think
00:08:47.400
about it or are they just so complacent? Armageddon is boring. Yeah. Armageddon,
00:08:51.500
I guess, is boring, or you just don't want to think about it. Can you say what you said
00:08:54.740
about William Perry's opinion? Is that for public consumption? I don't know. Um, it's just us.
00:09:00.420
Well, uh, thanks, Sam. I'll think of something to say back. But, um, William Perry,
00:09:08.560
I'll actually use this as an opportunity. At my Origins Project in Arizona, we'll be
00:09:12.980
having an event, a workshop, on autonomous weapons, nuclear
00:09:18.500
weapons, and defense. And I'll be doing a dialogue with William Perry in a month.
00:09:24.000
Um, maybe give a two-line bio of... William Perry was the Secretary of Defense,
00:09:28.440
oh, for Clinton, I guess. And he has been, and is, an amazing man in many,
00:09:33.500
many ways, and has a long view. He's not a youngster like you. And, um, he said in
00:09:40.960
conversation that he thinks we are now living in a time that is more dangerous
00:09:48.000
than any time, even during the height of the Cold War, which is really kind of, uh, sobering.
00:09:54.640
With respect to this issue. With respect to nuclear weapons, yeah. And, uh,
00:10:00.500
it's an issue that people should be concerned about. So it's awful that that happened. But
00:10:04.120
if it raises public awareness of the kind of ridiculous accidents, the ridiculous false
00:10:10.120
alarms... There's a man who we actually nominated for the Nobel Peace Prize. He never won;
00:10:15.640
he's now dead. But he was a Russian, in my opinion one of the few people who probably
00:10:19.620
really deserved the Nobel Peace Prize, who was working in a missile silo.
00:10:24.860
And there was a computer glitch, and it showed a nuclear weapon being launched from the United
00:10:28.320
States. And he got the order to fire. And then it showed another weapon five minutes
00:10:32.380
later and another weapon. And he personally reasoned that if there was going to be an attack
00:10:36.780
from the U.S., they wouldn't wait, you know, four or five minutes between each other, what,
00:10:40.500
two minutes or whatever it was. So he disobeyed the command and probably personally saved the world.
00:10:46.500
It's nice to take that warning that went out today, even though, you know, it's a mistake.
00:10:50.840
It lets us know about human error. It also may raise awareness. There is a potentially
00:10:55.600
huge downside, in that this could end up looking a little more like a crying-wolf situation,
00:10:59.900
where the next time, if it's real and you get that warning,
00:11:02.900
you don't take shelter. But I think something you said is terrifying to me, and
00:11:07.560
not specifically because of who's in the White House. This is true no matter
00:11:11.720
who's there, the very idea that Congress has to declare war, but they don't have to declare
00:11:17.460
that it's okay to nuke people. You know, in a nation and a system
00:11:22.860
that's built on checks and balances, this one thing doesn't appear to have sufficient checks.
00:11:27.780
The most consequential thing has no check and balance. Yeah. And I worry.
00:11:31.900
It shocked me. I don't know if you knew about that earlier. I mean, literally I thought
00:11:35.180
that there had to be approval of the NSC staff, or at least a majority of cabinet members,
00:11:39.400
or something. Um, but in fact, there is no check on that.
00:11:43.320
I would like to think that if somebody decided to go rogue and do it, that there would be somebody
00:11:48.980
sensible nearby, some secret service person who would do what that Russian missile agent did.
00:11:55.240
Well, one hopes that, yeah, I mean, the people actually have to press the button and their button
00:12:00.400
is bigger than his. Um, it's sobering. I mean, you actually have
00:12:06.300
to do it. I think those people think very carefully, but you know, they're trained to realize that
00:12:10.020
they may have to do that. And so it's, it's, um, yeah. And they drill it all the time.
00:12:16.460
So I wonder how this ties in. I try not to paint with too broad of a brush when I talk about
00:12:22.680
any specific religion, including Christianity, but there are a number of Christians, including
00:12:26.480
some of my family members who are eagerly awaiting Armageddon. Uh, we all deal with people who
00:12:36.020
construct conspiracy theories on occasion. I don't think it'd be that hard to put together a conspiracy
00:12:40.720
theory that the reason we have Trump is because there were people who are okay with the idea of
00:12:47.220
Armageddon, because I know tons of evangelicals who were supportive of him when
00:12:52.840
there's nothing about this man that fits with the churches I went to, even though I know those
00:12:56.840
churches are waiting for an apocalypse. The, the most benign interpretation of the Christian
00:13:02.420
support is just their calculated assumption, which has been borne out, that he will give them what
00:13:09.000
they want because they're a voting bloc that he needs. Uh, I remember I ran into Ralph Reed,
00:13:15.600
the former head of the Christian coalition at a conference, and this was still during the campaign,
00:13:20.620
but when, when Trump was the nominee and was professing to be a Christian of some flavor.
00:13:27.240
And I had debated Reed once on television, but we actually had never met.
00:13:32.760
And I, I said to him, there's no way you think he's actually a person of faith, right?
00:13:38.500
I mean, what, how do you explain the Christian support? And he immediately fell back on this
00:13:44.380
trope, you know, who am I to judge what's in another man's heart? Insofar as I could tell
00:13:49.740
that he was bullshitting, he was really bullshitting. He's happy to judge what's in other
00:13:53.540
people's hearts. Yeah, right. But the worst possible interpretation is the one you just gave,
00:14:00.060
which is that there are at least some millions of people, and maybe tens of millions of people, in this
00:14:04.860
country for whom biblical prophecy is real. It's a real roadmap to the future. And they're expecting
00:14:12.800
the wheels to completely come off this car before the end. And that will be the best thing that
00:14:19.140
happens. That's necessary for the best thing that will ever happen to happen.
00:14:22.980
I have to say that since Trump got elected, I've been sort of hoping for Armageddon too, but in a way,
00:14:28.120
it just seems better than listening to tweets every day. But, but I actually don't think it's
00:14:34.980
the Armageddon thing. I was actually just thinking about writing a piece about this and I'll say it,
00:14:38.900
although it'll get a bunch of people angry. To me, it represents one of the real problems
00:14:44.900
of professed Christianity. Because when you said they don't think
00:14:51.120
Trump is a Christian, but they'll get what they want. What do they want? Do they want the things
00:14:55.640
that they're supposed to be embodying, like love and all those things? No, what they want is hate.
00:15:01.060
What they want is laws that restrict freedom of others. And that means to me that operationally
00:15:06.040
in this country, when it comes to politics, professed Christianity is equivalent to hate.
00:15:11.820
Well, to bend over backwards, no, before I want to see if there's anyone who would, I can't tell.
00:15:17.680
The most charitable interpretation is not that it's synonymous with hate all the way down the line,
00:15:23.860
because just imagine if you're someone who really thinks that abortion is akin to murder, right?
00:15:30.560
That there is no difference between killing a fetus at the eight week stage and killing a fully
00:15:36.560
developed human being. If you think that, then you think our society is just spectating on a
00:15:42.460
Holocaust that has been going on for your entire life. And it's easy to see how someone would not
00:15:48.840
be moved by hate and would be, would in their own mind, be moved by compassion and love and a concern
00:15:53.580
for divine retribution if they believe that God is watching all the while.
00:15:57.820
Yeah. I mean, you're right. It's extreme to say that. You could say the same thing about
00:16:01.720
restricting the rights of gay people. That it's really love because that's a sin in a lot of
00:16:07.540
people's hearts. And, and therefore, you can say the same thing about members of ISIS who were
00:16:12.860
throwing gay people off of rooftops. Some of you, you must have seen this footage of ISIS members
00:16:19.340
hugging with apparent sincerity, the people they were about to hurl off of rooftops.
00:16:24.740
I mean, this was not, this was not a naked declaration of hate. This was, sorry, this is how the game is
00:16:31.520
played. We, you know, we have to do this. Well, you know, that represents to me, I mean,
00:16:36.300
that's the paradox. And I don't know if I've said it before on stage with you, but Steve Weinberg,
00:16:41.460
who is a physicist, a friend, a Nobel laureate, and also an atheist, has said that there are good
00:16:49.520
people and there are bad people. Good people do good things, bad people do bad things. When good
00:16:54.400
people do bad things, it's religion. Yeah. And I think, I mean, there's a lot of truth to that.
00:16:59.000
And it's not just religion, it's ideology. Whenever people move away from
00:17:04.920
reason and justify bad actions, and we all do it, I think
00:17:13.240
very few people do bad things thinking they want to do a bad thing. Right. They're doing it for some
00:17:17.220
reason that they think is a good reason. Well, we can go right back to Voltaire to address all this,
00:17:21.560
which is, if you can get people to believe absurdities, you can get them
00:17:25.160
to commit atrocities. Yeah. Yeah. And once you've poisoned the foundation, which I think is the hallmark
00:17:31.520
of what many religions do, of right and wrong, of how we should go about determining what is
00:17:37.260
a moral good, if you poison that sufficiently, that's how you get people to do that. That's how
00:17:42.040
you get them to bomb abortion clinics. That's how you get them to throw homosexuals off roofs,
00:17:46.300
which kind of brings us to one of the questions, and we polled a little bit. I asked for
00:17:51.480
suggestions on Facebook, and Sam had asked on Twitter, and there's a couple things that keep
00:17:56.400
coming up, but I think, given what we're talking about, this issue of morality terrifies believers.
00:18:05.220
I've been told that, you know, atheists can't be moral, and then the people who have put like
00:18:10.360
another half second of thought into it will say, well, of course you can be moral, but you can only be
00:18:13.880
moral because you were raised in a Christian environment that taught you about morals.
00:18:18.360
And I gave a talk for a number of years, but you wrote The Moral Landscape, and I want you to just
00:18:25.820
take a couple of minutes and give a summation of objective morality, science-based assessments,
00:18:32.360
and why people don't have to be terrified, and why it may in fact be more terrifying,
00:18:37.260
if morals are just the dictates of some individual or being.
00:18:41.980
Well, it's clearly more terrifying if the Bible is true or the Koran is true, because then the
00:18:48.800
universe has been created and is now governed by an omniscient sadist. He's created a universe
00:18:56.300
with hell to be populated by people who he didn't give enough evidence to convince them of the
00:19:03.780
truth of his doctrine. So he could have just given enough evidence and we'd all be fundamentalist
00:19:09.460
Christians or Salafi Muslims. But the miracles are always thousands of years old,
00:19:15.280
or they're in India or somewhere. And strangely, they're in places where...
00:19:23.280
But they're not sort of like the UFO abductions and the cattle mutilations.
00:19:28.400
It could happen right here, right now in front of 2,000 educated people,
00:19:32.400
and we would all be convinced. But that's, that's not going to happen for some perverse reason.
00:19:45.400
But you can imagine, if you're in actual dialogue with an omniscient being who's bent
00:19:49.300
upon convincing you for your own good, that can happen very quickly.
00:19:58.500
So, but to tie into what he's saying, I've had Christians tell me that God wouldn't reveal
00:20:02.600
himself to me because I would continue to question and deny. And I'm like, what kind of weak-ass
00:20:07.720
God do you believe in who is incapable of convincing me? Oh, you're just too damn obstinate.
00:20:14.180
So we can leave that aside. There's something strange about believing
00:20:19.640
that these books as written give you a truly moral worldview that you would endorse. If any
00:20:26.640
person behaved the way the God of the Bible behaves, that is our definition of a psychopath.
00:20:32.780
But the reason why you can have objective morality, or, I think, why you can have a few short
00:20:37.800
steps to objective morality. What I mean by objective is not that it's all just a matter
00:20:42.920
of atoms. The universe includes subjective experience, includes consciousness as a natural
00:20:49.340
phenomenon. Consciousness is a property of the universe. We don't know exactly at what
00:20:53.920
stage it emerges in information processing in complex systems, or maybe it goes even deeper
00:20:59.600
than that. I mean, it's totally possible that there's some spooky view of consciousness going
00:21:04.640
further down than vast numbers of neurons or information processing units doing their thing.
00:21:11.440
There's no especially good reason to believe that, I would say.
00:21:15.280
Yes, but still, the jury is arguably still out on that. What it's not still out on is
00:21:21.460
a few fundamental questions. One, clearly consciousness exists. Even if we're living in a
00:21:26.700
simulation on some alien hard drive, something seems to be happening, right? And that seeming
00:21:32.660
is what I'm calling consciousness. So even if you're a brain in a vat right now, or you're
00:21:37.860
in the matrix, or this is all just a dream, and you're going to wake up in a few minutes and find
00:21:42.480
yourself in bed, no matter how confused you might be about your circumstance, there is still
00:21:47.500
consciousness and its contents in each moment. And there is a vast difference between excruciating
00:21:54.540
and pointless misery and sublime happiness and creativity and joy and love and all of the good
00:22:01.280
things in life. And we have no idea how far that continuum actually goes in both directions,
00:22:06.900
but we really know, really, that we like one side of it much better than the other side
00:22:12.260
of it. And we don't have to justify that preference. You don't have to justify preferring the happiest
00:22:19.120
possible life to being tortured for eternity, right? And the idea that you would need some
00:22:24.120
philosophical argument to justify that is just a specious claim that has confused a lot of
00:22:29.860
people. And the idea that you would need to be able to draw your preference there, again, for avoiding
00:22:36.820
the worst possible misery for everyone, that you'd have to draw that from some book that has been
00:22:41.660
dictated by an omniscient being, that also is a specious claim. So I view morality as a kind of
00:22:47.640
navigation problem. And the reason why this is of a piece with ultimately a scientific understanding
00:22:55.760
of the mind and a scientific understanding of human well-being and of conscious systems generally
00:22:59.920
is that navigating between these two ends of the continuum of experience, avoiding the worst
00:23:05.900
possible misery and finding the true bliss and creativity and connection and love, there are
00:23:13.680
right answers about how to do that for properly constituted minds. And for us, there are biochemical
00:23:20.440
answers, there are psychological answers, there are sociological answers, there's economic answers,
00:23:24.880
political answers. Every piece of human knowledge that's legitimate knowledge has to be brought to
00:23:30.000
bear on the question of how to live a fulfilling life. And it is possible to be wrong. And it's
00:23:35.520
possible to not know what you're missing. And it's possible to be right for the wrong reasons. And so
00:23:40.640
every permutation of ignorance and confusion is there to be suffered and endured. And we have to break
00:23:47.360
the spell of thinking that we need to live forever shattered by tribal dogmatisms in order to talk about morality.
00:24:01.820
As Sam knows, we had another event where Sam was talking about exactly this, and had a bunch of discussions,
00:24:11.420
so I think, you know, I've had a lot of discussions about this since then. And it may be,
00:24:17.100
it is probably true, that reason is a slave of the passions for most people. We
00:24:24.460
possess reason, but reason doesn't necessarily drive our actions. Yeah, and we
00:24:29.340
justify things after the fact on the basis of what we want it to be. And then we come up with a rationale.
00:24:35.900
Well, understanding that is another exercise of reason.
00:24:38.460
Exactly. And for clarity, there's flawed reasoning, but that doesn't mean that reason itself is flawed.
00:24:45.900
No, but I think we're capable of it, and you and I and everyone in this audience does it. We all rationalize
00:24:52.460
our lives every day. We, you know, wake up. We rationalize that we like our work or a spouse or whatever.
00:24:59.820
And let it be known that I did. Yeah, yeah, that's right.
00:25:08.220
Let it be noted. But, um, it's only audio. But, you know, so I'm not a hundred percent
00:25:13.820
convinced that you can always get ought from is, as some famous philosopher once said.
00:25:18.300
Um, but I do think, and I agree with you completely, that it's a process
00:25:22.940
where without is you can't get ought. I think that's the point: without is you can't get ought.
00:25:27.500
If you don't know the consequences of your actions in any way, and that's what science is,
00:25:32.780
science tells you that, or reason does, and I view science as reason based on empirical evidence, then
00:25:43.340
you can't determine what's right or wrong. You need to know what the goal is and what the outcome is going to be.
00:25:48.060
So without a careful
00:25:50.860
understanding of that, and some people call this utilitarianism, I guess, but I
00:25:54.860
just see it as, without science there can be no morality, in my opinion, or no
00:26:00.540
sensible morality. And I think what we've seen, and
00:26:03.660
Pinker and others have argued, I think pretty effectively, is that in some ways
00:26:09.180
the Enlightenment and rational thinking have led to a world where some things that were once
00:26:16.380
thinkable are not thinkable now. And so, there's no doubt, I don't know whether
00:26:19.820
I would argue that we can... well, certainly I would argue that we might not be able to understand morality now.
00:26:25.180
But that's irrelevant, because we agree that not understanding something is not evidence of anything.
00:26:32.540
The more we learn, the more we will understand. So I do think ultimately we'll have a
00:26:37.660
neural understanding of almost all our decision-making
00:26:41.020
capabilities. But certainly without that, without that reason, I don't think you could even discuss the question.
00:26:47.100
Well, let me just take one minute to say why I think this is-ought business is totally confused.
00:26:52.700
This comes from a paragraph in Hume's work where he was actually
00:26:57.820
trying to hold religious conceptions of morality at bay.
00:27:03.900
It's certainly been overused as one of these exports from philosophy that has just gotten into the heads of everybody and is
00:27:09.820
influential totally out of proportion to its actual validity.
00:27:13.180
One thing I would point out is that Hume said he found many seeming paradoxes, and one was with respect to causation.
00:27:22.140
You couldn't really take science very seriously because apparently there's no evidence of causation in the world.
00:27:27.820
We just see the contiguity of various events, but we never see causes between, you know, A and B.
00:27:35.100
This is-ought business: let's say there is no ought, there is no should, there is no
00:27:41.580
obligation to do anything in this universe. There is just what is, just the totality of facts that are
00:27:48.460
actual and perhaps possible, perhaps, you know, also impossible, whether there's such a real thing as possibility, or
00:27:57.260
everything is in fact actual and it's just happening in a parallel universe, right, or trillions upon trillions of such universes.
00:28:05.100
And the first thing I would ask you is, if you can't get your
00:28:09.260
sense of how you should live from the totality of facts, all of reality,
00:28:14.300
where do you think you can get this sense of how you should live?
00:28:17.020
So you're not impoverished having all the facts of the universe at your disposal.
00:28:23.740
And you still have, even if there's no such thing as morality...
00:28:29.420
You know, put your hand into a wood chipper and see how much you like it, right?
00:28:33.020
You will very quickly get the message that you don't want to do that again.
00:28:37.180
You will want to avoid that. And there are an infinite number of ways to suffer
00:28:43.820
pointless misery from which no good comes, and we
00:28:48.460
will find ourselves navigating. And all I'm arguing is that we call
00:28:53.020
morality that subset of behaviors and commitments
00:28:58.220
that relate in social space to this navigation problem of finding better lives together.
00:29:03.180
And if you were alone on a desert island, you wouldn't call it morality, but you would still talk about
00:29:07.180
well-being and happiness. And I agree completely.
00:29:10.140
I guess the question is one of what you would call subjective morality, if you want to use those terms, in the sense that...
00:29:10.140
The question I would have is that, at the same time, I think, because that navigation
00:29:19.020
effort sort of has an evolutionary basis as well as a cultural basis...
00:29:27.180
I think, you know, evolution is wrong on most of these questions. Yeah, but I think
00:29:31.580
that our thinking has an evolutionary basis, and I think it's clear that that's the case.
00:29:37.100
It means to me that morality is a moving target too. The question is whether humans are hardwired
00:29:44.700
or not, and that's an interesting question, to find out how they are. And, as you know,
00:29:54.220
some psychologists do test this, the famous trolley car experiment. And so,
00:29:58.940
when one talks about objective reality, I think it's based on a totality of experience.
00:30:05.420
But that totality of experience evolves, and therefore I'm a little more hesitant talking about absolute morality.
00:30:14.780
Yes, I don't have absolute... yeah, but it's evolving into a space of
00:30:14.780
right and wrong answers and real facts about the conscious experience of actual and possible beings. So there's a right answer to the question of,
00:30:21.500
if I add this compound to my neurochemistry, is it going to make me
00:30:31.020
happier or not, right? Insofar as we could come to some kind of completed
00:30:35.820
neuroscience of happiness, well, then we would understand more and more about the likelihood of, you know,
00:30:40.700
you helping or hurting yourself that way. But so too with any
00:30:45.340
use of your attention. If, you know, if I'm in this relationship,
00:30:48.940
there are right and wrong answers there, whether or not you discover them.
00:30:53.900
Yeah, but, like, you don't know what you're missing, right?
00:31:01.020
Like, you don't know what, in a counterfactual situation...
00:31:03.260
You could have done something yesterday that would have made today much better than it was for you, and you may never know what you missed.
00:31:06.780
Realism for me, so whether it's scientific realism or moral realism,
00:31:15.020
just amounts to the claim that it's possible to be wrong.
00:31:18.540
It's possible not to know what you're missing. It's possible for everyone to be wrong.
00:31:22.220
Like every physicist alive: you can ask some pressing question about physics,
00:31:25.660
and, I don't know how many physicists there are, 30,000, and all 30,000 could be wrong.
00:31:35.420
And then tomorrow someone could be right. And I get letters every day from those people who say they are.
00:31:42.780
But that's the whole... it's interesting you brought up the example, because that's the whole point of science.
00:31:49.260
The whole point of science: if you couldn't be wrong,
00:31:52.060
there would be no science. The whole point of science...
00:32:01.900
That's how science proceeds, because it doesn't prove things right.
00:32:05.340
It only proves things wrong, and then you narrow down what's left over. And so
00:32:10.140
you're absolutely right, and that's what makes empirical evidence so useful.
00:32:13.740
That's why it should be the basis of public policy, because you can find out what doesn't work,
00:32:25.340
utilizing that in every aspect of our experience, in my opinion. Yeah.
00:32:28.700
There's a couple of things about the moral issue, and
00:32:28.700
I'm glad you guys made the points. People confuse objective morality with absolute morality.
00:32:37.020
None of us, I assume, and I know Sam and I are not, is advocating for absolute morality. Actually, situational ethics is
00:32:42.540
probably the term that I use most often. When I talk about objective morality,
00:32:48.860
I just mean that it's not just subject to your whim or any subjective experience.
00:32:52.140
Because one of the objections we get, when you say you don't want to put your hand in a wood chipper, is
00:32:56.380
somebody will come along and say, well, somebody might want to do that. Who are you to decide what's right for them?
00:33:06.940
We are physical beings in a physical universe with rules that dictate what the consequences of our actions are.
00:33:06.940
And if there really were a masochist who wanted to do that, there would be a complete scientific understanding of
00:33:12.460
masochism. That's possible, sure, and it's possible that there's a way of being a masochist.
00:33:19.260
I highly doubt this is the case, but let's say that was the case.
00:33:27.740
So, having right and wrong answers to questions of morality, this is why I use this analogy of a moral landscape.
00:33:32.380
It doesn't mean there's just one right answer for everybody. There could be many, many,
00:33:38.780
a functionally infinite number of peaks on this landscape.
00:33:43.980
But there are even more wrong answers; there's a larger,
00:33:46.700
infinite set of wrong answers. And you know when you're not on a peak, because your hand is in the wood chipper and it turns out
00:33:49.980
you're not one of those masochists who likes it.
00:33:59.980
So the one question that keeps coming up, that you've been hammered with, that I've been hammered with,
00:34:04.780
is: oh, you're talking about objective morality, but your foundation of morality is well-being. Now, when it comes to the is-ought problem,
00:34:13.660
it's fallaciously pointed out that you may not be able to get from an is to an ought.
00:34:17.420
But I can get from two ises to an ought, because if I know what the goal is,
00:34:21.420
and I know what the consequences of my action are,
00:34:25.500
then I can tell what I ought to do to achieve that goal. Which was a good way to sum it up.
00:34:29.580
There's a problem in there that I'm not going to get into, but I believe I understand
00:34:35.980
what you assessed in The Moral Landscape, which is kind of my view of this, which is:
00:34:41.020
under what basis, what objective basis, have you decided that well-being is the standard?
00:34:46.780
And I think you said, what other standard could there be?
00:34:50.140
That's the thing about secular moral systems: they have at their foundation the goal of getting better.
00:34:55.500
And even if I pick three premises that are going to serve, I can pick them arbitrarily: death is preferable to life.
00:35:01.420
And you can work through and do thought experiments to see, does that get you towards a better world?
00:35:07.260
But all this little bickering about a better world: who defines better, what's better, well-being?
00:35:12.300
Has anybody, among all your detractors, suggested another
00:35:21.340
foundation that would be better than well-being, and if they did, how would you respond?
00:35:25.340
Well, there's two ironies here. One is that
00:35:29.180
the religious answer is also predicated on well-being.
00:35:33.020
I mean, when you ask religious people, why, what's wrong with going to hell for eternity?
00:35:41.740
You don't want to be there; heaven is much better. It's the most extreme version
00:35:48.060
of well-being, or its antithesis, that awaits us after death. Now, if that were true,
00:35:52.060
if there was a good reason to believe in the Christian heaven or the Muslim paradise,
00:35:56.780
I would be the first to say that it's really important to live so as to
00:36:02.140
place the right bet on eternity. I mean, because, you know,
00:36:04.620
what's 70 years compared to an eternity of suffering or happiness?
00:36:08.300
But it just so happens that there's no good reason to believe in those after-death states.
00:36:14.460
All we have is consciousness and its contents, and the difference between misery and well-being.
00:36:19.740
And for me, the definition of well-being is truly open-ended.
00:36:23.740
It's there to be refined and further discovered, and I think
00:36:28.860
there are possibilities of well-being that we can't imagine. The other irony here is that when people say that you
00:36:35.900
have this assumption that well-being is good or worth finding,
00:36:40.940
as though we could do otherwise, as though having an axiom at the bottom here were a problem.
00:36:47.900
Every science is based on similar axioms that can't justify themselves. So...
00:36:58.140
Right, assuming that two plus two makes four for every two and two.
00:37:03.100
You know, if it works for apples, it works for oranges. It's also going to work for cantaloupes.
00:37:07.100
How do you know it's going to work for ravens and chickens, and how does it generalize?
00:37:11.820
That's an intuition, right? That's a foundation.
00:37:14.220
It's still an assumption that you need to test, though, in physics, I mean, I think.
00:37:17.580
But you don't test it by continuing to count apples and oranges and cantaloupes. Well, we do things like that.
00:37:23.980
We do check to see if the rules continue to work in places we haven't looked before. The idea that events have causes...
00:37:31.260
Unless time begins, and then there was no cause because there was no before.
00:37:36.220
Right, but that's proffered as a violation of our intuition that works everywhere else in science.
00:37:44.380
In some sense I'm playing the devil's advocate in this regard, but violation of intuition...
00:37:48.780
In my field, violation of intuition is everything.
00:37:51.740
What was that? In my field, violation of intuition is everything; it's the least trusted thing,
00:37:55.500
where you think you have intuition. Yeah, no, but you're using other intuitions to get behind the bad ones.
00:38:02.940
If you'd like to continue listening to this podcast, you'll need to subscribe at samharris.org.
00:38:11.260
You'll get access to all full-length episodes of the Making Sense podcast and to other subscriber-only content,
00:38:17.020
including bonus episodes and AMAs and the conversations I've been having on the Waking Up app.
00:38:22.700
The Making Sense podcast is ad-free and relies entirely on listener support.