The "After On" Interview
Episode Stats
Length
1 hour and 48 minutes
Words per Minute
178
Summary
In this episode of the Waking Up podcast, I'm releasing a slightly edited version of an interview I did on someone else's podcast: After On, hosted by Rob Reid. Rob founded the company that built Rhapsody, the music service that created the unlimited on-demand streaming model Spotify, Apple, and others have since adopted. He has spent a lot of time throughout the Middle East, including a year as a Fulbright scholar in Cairo, and while he remains an investor, he is mainly a novelist these days. He started his podcast as a limited run to promote his novel, also titled After On, but now plans to continue it indefinitely. In the interview we talk about publishing, psychedelics, terrorism, meditation, free speech, and many other things. Chris Anderson, curator of the TED conference, heard the original episode and suggested I release it on my own podcast, and I don't take strong recommendations from Chris lightly; the man surely knows how to put on a show. You can find out more about Rob, his podcast, and his book at after-on.com.
Transcript
00:00:00.000
Welcome to the Waking Up podcast. This is Sam Harris. Okay, in this podcast, I'm actually
00:00:26.440
releasing an interview that I did on someone else's podcast. That podcast is After On,
00:00:33.360
and the interviewer is Rob Reid. Rob founded the company which built Rhapsody, the music service
00:00:39.700
that created the unlimited on-demand streaming model that Spotify and Apple and others have
00:00:45.220
since adopted. Rob has also spent lots of time throughout the Middle East, including a year as
00:00:51.680
a Fulbright scholar in Cairo, and he's an investor, but he's mainly a novelist these days. And he
00:00:59.520
started his podcast originally as a limited run to promote his novel, also titled After On, but now
00:01:06.000
he's going to continue it indefinitely. And many people who heard this interview originally thought
00:01:10.800
it was unusually good, not that I'm unusually good in it, but that we covered a lot of ground. And we
00:01:17.720
certainly did. Rob and I talk about publishing and psychedelics and terrorism and meditation,
00:01:24.520
free speech, and many other things. And in fact, Chris Anderson, the curator of the TED conference,
00:01:32.140
heard it and got in touch with me and suggested that I release the interview on my own podcast.
00:01:37.040
And he felt this interview covered topics that I don't often touch, or at least don't touch in that
00:01:41.660
way. And I don't take strong recommendations from Chris lightly. The man surely knows how to put on a
00:01:49.160
show. So with Rob's permission, I am giving you a slightly edited version of the podcast he released.
00:01:57.260
I have to give you a little warning about the sound quality. We tried to clean it up on our end,
00:02:01.840
but there's a lot of popped P's. It's probably best listened to in your car or at your desk. But Rob is a
00:02:08.940
great interviewer. And he's since had many other interesting guests on his podcast. So if you like
00:02:14.760
the angle he took with me here, you might check him out at after-on.com. And you can find out much
00:02:22.060
more about his book there too. And now without further delay, I bring you the conversation I had with Rob Reid.
00:02:33.440
So Sam, thank you so much for joining me here at Tom Merritt's lovely home studio.
00:02:38.860
You were a guest on the Art of Charm podcast about a year ago, and they asked you to describe what you
00:02:44.860
do in a single sentence. And you said, I think in public, which I thought was a very elegant way of
00:02:50.500
putting it. I was hoping you might elaborate on that. And in this case, feel free to use as many
00:02:56.000
Yeah, well, I'm glad you brought that back to me because I would have totally forgotten that
00:02:59.020
description. It's a useful one. Increasingly, I'm someone who's attempting to have hard conversations
00:03:04.840
about what I consider some of the most important questions of our time. So the intersection of
00:03:10.900
philosophy, particularly moral philosophy and science and public policy and just things in
00:03:17.520
the news, topics like race and terrorism, the link between, you know, Islam and jihadism and
00:03:24.040
things that are in the news but that have, when you begin to push on these issues, they run very,
00:03:31.220
very deep into the core of human identity and how we want our politics to proceed and
00:03:36.800
the influence of technology on our lives. So there's just, you can almost, you pull one of
00:03:41.620
these threads, everything that people care about starts to move.
00:03:45.280
Yeah, there's a great deal of interconnection. And I'd say, and correct me if this is wrong,
00:03:49.180
but I'd say you started thinking in public in earnest, perhaps back in 2004, with the release
00:03:54.240
of your first book, The End of Faith, in which you argued stridently against all types of
00:03:59.300
organized religion, and in favor of atheism. It peaked at number four, was it, on the New York
00:04:05.820
You know, I don't even remember. It was on for, I think, 33 weeks, but I think four sounds about
00:04:10.880
Yeah. So obviously you got out there in a big way with a book you've since written. Is it
00:04:15.740
four more bestsellers, New York Times bestsellers?
00:04:17.920
Yeah. Yeah. That designation means less and less, as it turns out. But I mean, there are bestsellers
00:04:26.280
There are the bestsellers that bounce off the list, which most of mine have been. And then
00:04:30.680
there are those that stay on forever. But yeah, I've had five that have hit the list. Yeah.
00:04:36.900
And what's intriguing to me is that quite recently you have developed a wildly successful podcast. And
00:04:42.880
I was hoping you could characterize the reach that the podcast has attained compared to that of
00:04:49.420
these very, very successful series of books that you did.
00:04:52.300
Yeah. The numbers are really surprising and don't argue for the health of books, frankly.
00:04:59.220
A very successful book in hardcover. Your book comes out in hardcover first, normally. Some
00:05:05.640
people go directly to paperback. But if you are an author who cares about the future of
00:05:11.540
your book and reaching lots of people, you publish your hardcover and you are generally
00:05:16.540
very happy to sell 100,000 books in hardcover over the course of that first year before it
00:05:23.760
Indeed, ecstatic. That would probably put you in the top percentile of all books published
00:05:29.540
Oh, yeah. And that is very likely going to hit the bestseller list. Maybe if you're a diet
00:05:34.960
book, you need to sell more than that. But if you sold 10,000 in your first week, depending
00:05:39.760
on what else is happening, you almost certainly have a bestseller. And, you know, in the best
00:05:45.740
case, you could sell 200,000 books or 300,000 books in hardcover. And that's a newsworthy
00:05:52.640
achievement. And then there's the one-hundredth of 1% that sell millions of copies. So, you know,
00:06:00.020
with a book, I could reasonably expect to reach 100,000 people in a year and then maybe some
00:06:05.360
hundreds of thousands over the course of a decade, right? So all my books together now
00:06:11.140
have sold, I haven't looked at the numbers, but I'm pretty sure I haven't reached 2 million
00:06:17.180
people with those books, somewhere between a million and 2 million. But with my podcast,
00:06:24.540
I reach that many people in a day, right? And these are long form interviews and sometimes
00:06:30.780
it's standalone, sometimes just me just talking about what I think is important to talk about for
00:06:34.800
an hour or two. But often I'm speaking with a very smart guest and we can go very deep on any
00:06:42.740
topic we care about. And again, this is not like going on CNN and speaking for six minutes
00:06:48.160
in attempted soundbites and then you're gone. People are really listening in depth.
00:06:54.360
And so if we were to clone you into two right now, and one of the Sam Harrises that we ended up with
00:07:00.780
was to record a podcast, and the other Sam Harris was to write your entire literary output...
00:07:10.940
Yeah, well, that's the other thing. Forget about the time it takes to write a book, right? Which in
00:07:15.480
some cases is years, in some cases is months, depending on how long the book is and how
00:07:21.140
research driven it is. But it's a lot of time. It's a big commitment to write a book.
00:07:25.680
Once it's written, you hand it in to your publisher and it takes 11 months for them to publish it.
00:07:34.900
Yeah, yeah. And it's, you know, that increasingly, that makes less and less sense.
00:07:41.100
Both the time it takes to do it and the time it takes to publish it don't compare favorably
00:07:46.620
with podcasting. You know, in defense of writing, there are certain things that are still best done
00:07:52.760
in written form. Nothing I said has really any application to what you're doing. I mean,
00:07:58.640
you're writing novels. Reading a novel is an experience that people still want to have.
00:08:03.520
But what I'm doing in nonfiction that's primarily argument-driven, right? There are other formats
00:08:10.800
in which to get the argument out. And I still plan to write books because I still love to read books.
00:08:17.660
And taking the time to really say something as well as you can affects everything else you do. It
00:08:25.140
affects the stuff you can say extemporaneously in a conversation like this as well. So I still value
00:08:29.880
the process of writing and taking the time to think, you know, that carefully about things.
00:08:35.720
The thing that is striking, though, is the extraordinary efficiency that the podcast has become as a way for
00:08:41.720
you and many others to disseminate ideas in terms of the hours that you put into the creation of it,
00:08:47.160
which are non-trivial. I'm learning that as a very new podcaster myself. It ain't easy
00:08:51.980
to research and put one of these things together. But compared to a book, it's just, there's just
00:08:58.000
incredible leverage there. Now, another thing, speaking of large audiences, I believe I read somewhere
00:09:03.060
that you were featured in the most heavily watched Bill Maher video clip of all time. Do you know
00:09:10.020
if that statistic is accurate? I suspect it still is accurate. It was at the time. It was the most
00:09:15.380
viral thing that ever got exported from the show. And you were discussing Islamophobia with the then
00:09:20.720
future Batman. Yeah. And why do you suppose that clip became so widespread? I mean, Bill Maher is no
00:09:29.020
stranger to controversy. The exchange between you and Ben Affleck and between Maher and Ben Affleck did
00:09:35.540
become quite heated. But in any given month, there are many interactions on cable news and on Sunday
00:09:41.860
talk shows that are at least as lively. What do you think it was about that that made it go
00:09:47.020
so widespread? And also, if you'd characterized it, if you care to just characterize it briefly for
00:09:51.920
those who haven't seen it. It was a combination of things. It was the topic. It was the fact that it
00:09:56.500
was a star of Ben Affleck's caliber going kind of nuts and going nuts in a way that was very polarizing
00:10:03.800
to the audience. So what happened briefly is I was actually on not to talk about Islam or jihadism
00:10:08.940
or terrorism or anything related to this topic. I was on to talk about my book on meditation,
00:10:15.280
Waking Up, you know, where I was trying to put our spiritual concerns, our contemplative concerns,
00:10:21.060
on a rational footing. And it just so happened that, I mean, this is a hobby horse that Bill and I have
00:10:27.680
ridden for a number of years, talking about the unique need for reform in Islam. You know,
00:10:34.780
I have an argument against all faith-based religion, but part of my argument is to acknowledge that
00:10:39.420
religions are not the same. They teach different things. They emphasize different points. And to its
00:10:44.940
discredit and to the reliable immiseration of millions of people, Islam emphasizes intolerance
00:10:53.100
to free speech and intolerance to political equality between the sexes, and a rather direct connection
00:10:59.940
between suicidal violence and martyrdom. And hence all the problems we see throughout the Muslim world
00:11:07.680
at the moment and our collision with it. So in any case, that topic came up of Islam and jihadism
00:11:14.260
in the middle of this interview. And Ben Affleck jumped in. I mean, clearly he had been prepared by
00:11:20.420
somebody to hate me because his intrusions into my interview with Bill were otherwise inexplicable
00:11:27.200
because he was sort of at my throat even before the topic of Islam came up. I was still talking
00:11:31.940
about meditation and he said something snide, again, in a mid-show interview that is normally
00:11:38.200
protected from the intrusions of the rest of the panel. So it was weird. And then the thing just lit up
00:11:44.160
with him seemingly completely misunderstanding what Bill and I were saying, but doing it in an
00:11:50.480
increasingly adamant and ultimately quite heated way. So he was unhinged and not making any sense
00:11:57.880
from my point of view. And he was calling us racists and bigots.
00:12:01.940
And in some ways proving the very points that you were making.
00:12:04.820
Oh yeah, in every way. My point was, listen, people get emotionally hijacked on this issue.
00:12:10.440
They don't actually follow the logic of what is being said. I'm criticizing ideas, not people.
00:12:17.920
Islam is a religion subscribed to one or another degree by people who call themselves Muslims. But
00:12:24.300
we have to speak specifically about the consequences of specific beliefs. It becomes incredibly relevant
00:12:30.640
to know what percentage of people think dying in defense of the faith is the best thing that could
00:12:36.480
possibly happen to you or that apostates should be killed. So we're talking about the consequences
00:12:40.100
of ideas. And there are many, many millions of Muslims who would repudiate both of those ideas.
00:12:45.900
So obviously I'm not talking about them when I'm talking about the problem of jihadism or
00:12:49.440
a belief in martyrdom or apostasy. And so he proved himself totally incapable of following the plot just
00:12:56.280
as I was talking about that very problem and went berserk. And the most depressing thing about that
00:13:02.120
encounter was to see how many people on the left, and in particular apologists for Islam and so-called
00:13:09.080
moderate Muslims who viewed his performance as just the height of ethical wisdom, right? Like he had
00:13:16.380
unmasked my and Bill's racism as though being Muslim was to be a member of a race. I mean, that non sequitur
00:13:23.960
was the first thing people should have noticed. But he was celebrated as just this white knight who came
00:13:29.700
to the defense of beleaguered brown people everywhere, right?
00:13:35.040
To a degree that is just... I mean, if you've looked on social media in the immediate aftermath of that,
00:13:40.460
it was just a tsunami of moral and political confusion, really. It was like a nuclear bomb
00:13:49.040
Well, what's interesting to me is I looked at that in preparation for today's talk, and it would seem
00:13:54.740
the tide has changed. I looked at the YouTube clip, and I know that you've said in other places that
00:13:58.840
YouTube seems to be a particularly bad cesspool for really vitriolic commentary at times.
00:14:04.920
And I figured I'd scan it quickly to get a sense of like, what's the percentage breakdown?
00:14:10.220
And I looked at almost 100 comments, I believe, and I did not find a single one that was pro Ben
00:14:17.900
Affleck. I mean, people were making the points that you just made, that he was essentially making
00:14:21.700
your points for you, in that when you start talking about ideas, people presume that you're trying
00:14:27.640
to paint with a broad brush, people, which you were not trying to do. So it might have changed
00:14:32.780
since then. But in the immediate aftermath, there was a very pro Ben kind of reaction to what it
00:14:39.200
Yeah. And it continues in a way that is quite shameful. So for instance, the comic Hasan Minhaj,
00:14:47.620
who just did the White House Correspondents Dinner, so he's now the one that Trump didn't attend,
00:14:52.980
but his stature has risen among comics of late. And he just released a Netflix special where he talks
00:15:01.580
about this issue, just praising Ben Affleck to the skies and saying quite libelously that Bill,
00:15:11.120
in that exchange, advocated for, quote, rounding up Muslims and containing them, as though in concentration
00:15:17.160
camps, or at the very least internment camps. How this got past Netflix fact-checkers...
00:15:27.360
Bill Maher said on camera, a YouTube clip viewed by millions of people, round them up.
00:15:32.880
This is his position, that he wants Muslims rounded up and contained, right? And happily,
00:15:38.940
he didn't mention me by name. He was talking about Bill and Ben in that episode. But it's just
00:15:43.620
pure delusion and slander. It's a massive applause line in his world. So this is a kind of form of
00:15:51.640
asymmetric warfare. Whenever I inadvertently misrepresent the views of my opponents, I mean,
00:15:58.880
no matter how malicious the opponent, right? If I say something that gets their view wrong,
00:16:04.620
and it gets pointed out to me, I publicly apologize for it. I am absolutely scrupulous to represent
00:16:15.740
Yes, yes. Because some of this gets fairly bloody. But when I'm pushing back against my
00:16:22.260
critics, and again, no matter how malicious, I am always holding myself to the standard of
00:16:28.040
articulating their position in a way that they couldn't find fault with. And then I can then go
00:16:34.240
on and demonstrate what's wrong with their view. Anyone who criticizes Islam as a doctrine, or really
00:16:41.960
anyone who touches any of these third rails that have become so fraught among liberals and progressives,
00:16:48.420
so to talk about race, to talk about gender, to talk about really any of these variables around which
00:16:54.400
identity politics have been built, reliably produces people who think that defaming you at any cost
00:17:01.920
is fair game. So they will attribute to you views that not only do you not hold, they are the
00:17:06.600
opposite of the views you hold. They will make any attempt to make that stick.
00:17:10.980
Do you think in their minds it's an ends justifies the means thing, where they are so committed to
00:17:16.480
their position, and they are so utterly certain that their position is objectively right, that they're
00:17:22.840
saying, okay, I know he didn't say round them up, but I'm going to say that he said round them up,
00:17:28.500
because that will eliminate his credibility, and the elimination of his credibility, even
00:17:33.800
by a dishonest mechanism, serves such a higher good. Do you think that's the calculus?
00:17:40.220
Obviously, there's a range of cases here. And so the most charitable case is that there's some number
00:17:45.400
of people who are just intellectually lazy, and are just guilty of confirmation bias, they're misled,
00:17:53.360
they hear a snippet of something which strikes them a certain way, and then they just run with it,
00:17:57.740
right? And they feel no intellectual or moral obligation to get their facts straight.
00:18:03.160
Anyone can fall prey to that. I mean, if, you know, I've been so critical of Donald Trump,
00:18:07.340
if you show me a tweet that looks insane from him, I'm not going to spend any time trying to figure
00:18:12.440
out if it's really a tweet from him, because all of his tweets have been insane. So the chances
00:18:16.640
this one's real are very high. If it's revealed that it was fake, well, then I'll walk back my forwarding of
00:18:22.920
it or whatever. But everyone only has so much time in the day, and so it's easy to see how people get
00:18:27.700
lured into just being lazy, right? But then there are the people who consciously manufacture
00:18:34.480
falsehoods. You know, I think there are actually real just psychopaths in any movement, right? And
00:18:39.460
there are people who just have no moral qualms in spreading lies, no matter how defamatory, no matter
00:18:44.880
how likely they are to increase the security concerns of the people involved. Spreading the lie
00:18:51.240
that someone is a racist or that they favor genocide against Muslims, say, which are, these are both
00:18:57.780
lies that are just endlessly spread about me and Bill and even former Muslims or Muslim reformers
00:19:05.700
whom I support. I mean, someone like Ayaan Hirsi Ali or Majid Nawaz, I mean, people who are,
00:19:09.600
have excruciating security concerns. Endless lies are told about them, and these lies have the effect
00:19:16.940
of raising their security concerns. It could jeopardize their lives. Yeah, yeah. This is well
00:19:20.800
understood by the people who are telling these lies. For instance, this is just, you happen to catch me
00:19:25.460
in a 24-hour period where this has happened to me in a fairly spectacular way. Really?
00:19:30.680
So yeah, I had Majid Nawaz, who's this brilliant and truly ethical Muslim reformer on my podcast.
00:19:39.820
And a reformed Muslim as well. He had been imprisoned for a period of time for radical
00:19:44.460
activities. Yeah, yeah. So he's a former Islamist, which is distinct from a jihadist. He was not a
00:19:49.320
terrorist, but he was trying to, you know, he was part of an organization that was trying to
00:19:53.240
spread the idea of a global caliphate, and they were trying to engineer coups in places like Pakistan
00:19:58.220
and Egypt. So he was doing fairly nefarious things. He was recruiting for this organization,
00:20:03.580
and then spent four years in an Egyptian prison, and got essentially deprogrammed in proximity to
00:20:10.160
jihadists and fellow Islamists, just understanding the kind of world they wanted to build more
00:20:16.440
deeply. And then he was also taken as a prisoner of conscience by Amnesty International. And that,
00:20:22.560
it was the juxtaposition of that kind of ethical overture from the enemy.
00:20:27.260
Because he at that time would have considered Amnesty to be the enemy.
00:20:30.900
This is a Western liberal progressive organization. Now, all of a sudden, they're coming in
00:20:35.100
and defending me, even though they know I loathe everything they stand for, because that is
00:20:40.160
what they do, that is consistent with their values. So that got through to him.
00:20:44.120
And who in the, what organization in the Muslim world or the Islamist world does that, right?
00:20:49.420
It broke the spell. And so he came out of prison, and very soon thereafter,
00:20:57.740
But did not disavow Islam, right? He's still a practicing Muslim.
00:21:00.940
He's at pains to say that he's not devout. He's not holding himself up as an example of
00:21:05.740
religiosity. But he's identified as a Muslim. He's not an ex-Muslim. He's not claiming to be an
00:21:12.900
atheist. And he started this counter-extremist think tank, the Quilliam Foundation in the UK,
00:21:18.180
that has attracted theologians and other former Islamists and has a very active program of
00:21:26.180
deprogramming extremists, both jihadist and otherwise. And this is just the most courageous
00:21:33.160
and necessary work. I mean, of all the things that human beings should be doing, especially people
00:21:39.300
in the Muslim community. This is just, it has to be at the top of everyone's list. And yet he is
00:21:45.640
demonized as an Uncle Tom and a native informant by so-called moderate Muslims, right? And so he and
00:21:53.240
I wrote a book together, which was initially a kind of debate. I mean, we're on, you know, I was the
00:21:58.960
atheist criticizing Islam and talking about the link between the doctrine and terrorism. And he was
00:22:05.400
arguing for a program of reform. And it was a very fruitful collaboration and a very useful
00:22:12.260
introduction to the issue for those who have read the book. And there's a documentary coming out,
00:22:17.940
you know, based on the book. And we did a speaking tour in Australia together. I'm totally supportive
00:22:23.320
of him. I mean, he's a real friend now. And so he was on my podcast in January, and we're having a
00:22:30.180
conversation about all these issues. And there's a part of the conversation where I'm essentially
00:22:34.800
playing devil's advocate with him. And so he had been talking about reform. And at this point,
00:22:42.020
we're speaking specifically about the migrant crisis in Europe, born of the civil war in Syria,
00:22:49.060
and just what to do about the millions of people who are pouring across the borders into Europe
00:22:54.580
at that point, and just the ethical challenges of that. And I'm on record, both in that podcast and
00:23:02.540
elsewhere, saying that I think we have a moral obligation to let in all the Syrians we can
00:23:08.320
properly vet. I talk about these people as the most unlucky people on earth. I am, you know,
00:23:13.620
I was against Trump's travel ban, right? And I have criticized that on television and on my podcast
00:23:20.560
Yeah, you've been quite unequivocal about that.
00:23:22.280
Yeah. And again, within this specific podcast made these points. I talk about secular and liberal
00:23:29.140
Muslims being the most important people on earth and the people who I would move to the front of
00:23:33.980
the line to get U.S. citizenship if they wanted it, if I had any influence there. So my views on this
00:23:40.340
matter are very clear. So there's a part in the conversation where I'm playing devil's advocate,
00:23:44.820
and there had just been a terrorist attack in Germany in the Christmas market where a jihadist in a van
00:23:51.040
plowed into dozens of people and I think killed 12 and injured 50. And at one point I said to Majid,
00:23:56.600
okay, so you've said many hopeful things thus far. I want to push back a little bit. I can well
00:24:02.120
imagine that there are millions of people in Europe at this moment, in the aftermath of this Christmas
00:24:06.300
market attack, who are thinking, why the fuck do we need more Muslims in our society? Surely we have
00:24:14.340
enough. Why not just not let anyone else in, right? So someone who apparently has been doing this to
00:24:20.940
all my podcasts, I only just noticed this time, someone in the Muslim community took a snippet of
00:24:27.460
the audio, starting with, why the fuck do we need more Muslims in our society, right? And then there's
00:24:33.440
just Majid's contribution here is just, he's just kind of nodding along, saying yes, doing nothing to
00:24:38.280
push back. I mean, just seeming to acquiesce to my position here. And he tweets this out, this minute
00:24:44.700
of audio: witness, you know, Sam Harris's genocidal attitude toward Muslims and, you know, Majid's
00:24:51.000
support. And then all the usual suspects, Reza Aslan and Max Blumenthal, you know, the odious son
00:24:57.840
of Sidney Blumenthal, who has never resisted an opportunity to lie about people like me and Ayaan
00:25:04.580
Hirsi Ali and Majid. All of them, just full-court press, push this out. I mean, now we're talking about
00:25:09.560
people who have platforms of hundreds of thousands, you know, and then that percolates down to all the
00:25:14.720
people who have tens of thousands of people on Twitter. So millions of people receive this.
00:25:20.980
Yeah, yeah, this is now 48 hours ago. And I'm seeing people from, I'm seeing a writer from the
00:25:26.860
Nation also push it out. And also, like, nearly doxes me, where she says, well, next time I see him at
00:25:32.740
my favorite coffee house, and she names the coffee house that I'm at rather frequently, I'll tell him
00:25:38.480
what I think of him, right? So it's the most irresponsible use of social media. And in the
00:25:45.280
case of people like Reza Aslan, he absolutely knows what my position is, and he knows he's
00:25:51.620
And there is clearly a world of difference between what you had characterized as the most charitable
00:25:56.060
case, which is this is just somebody who's incredibly lazy and doesn't research. This person
00:25:59.580
very plainly, surgically removed something out of context, very, very surgically, not an oopsie
00:26:06.640
blunder kind of thing. Put it out there. And those who picked it up, presumably knowing a thing or two
00:26:12.300
about both you and also the source, just spread it wantonly without any notion of checking to see
00:26:19.740
And the other thing that's crucial here is that even if you wanted to extend the most charitable
00:26:24.560
interpretation to them, that is a genuine mistake.
00:26:29.680
Within 15 minutes, the hoax is revealed, because I have, you know, nearly a million people following
00:26:36.660
me on Twitter. And I pushed back against it, you know, multiple times. And I sent a link to the
00:26:42.780
timestamp to the beginning of the actual part of the conversation that reveals just what is being
00:26:52.560
Which you wouldn't expect from the person who did it, because they did it quite wittingly.
00:26:56.280
But the people who forwarded it to hundreds of thousands of people, having been made aware,
00:27:00.600
would have a moral responsibility to walk that back. Because it does put you, it heightens the
00:27:05.700
physical threat that you live under. We are probably a double-digit number of months
00:27:12.120
from software, which we've seen the first prototypes already, that would allow somebody to basically
00:27:18.300
sample your voice, of which there are many, many examples, and basically do a marionette thing
00:27:23.100
where they have you say whatever they want. But these tools are going to be out there,
00:27:30.260
And you could be made to say, I could be made to say, the president, anybody could be made
00:27:34.940
to say absolutely anything. And I wonder if that's going to kind of, in a perverse way,
00:27:41.480
help things, because audio quotes will, from that point forward, just simply not be taken
00:27:46.600
Yeah. No, I'm really worried about that. But I do actually see the silver lining you
00:27:53.460
just pointed to. I think that it will be so subversive that people will realize that all
00:28:04.160
I imagine something similar has happened with Photoshop now, where people just don't use photos
00:28:11.320
as forensic evidence in the same way. And they just, they're fairly skeptical about what they
00:28:16.600
see in an image when it counts. Just imagine if you saw, if someone forwarded to you a photo of
00:28:22.480
Trump in some insane circumstance, your first thought before forwarding it would be, wait a
00:28:30.980
minute, is this photoshopped? We'll have to be that circumspect about audio and even video. So now
00:28:36.000
they have the mouth-linking fakery. The completely fake audio, which again, sounds exactly like the
00:28:42.960
person's actual voice, can be made to seem like it's coming out of his or her mouth.
00:28:47.500
You add the visual cue, and look, it always, what happens in audio happens next in video.
00:28:52.060
Well, to sort of go a little bit bigger picture for a moment, I'm delighted to be talking to
00:28:55.520
you now because there's almost an uncanny overlap between the subjects you've dedicated your life to
00:29:01.260
understanding and those that are discussed in my novel, After On. The main topic of the book is
00:29:05.720
super AI. You're very widely quoted on this subject. You gave a great TED talk about it almost exactly a
00:29:11.540
year ago. Another major theme in the book is consciousness. You spent an entire decade exploring
00:29:16.600
consciousness full time. I'm not sure if that's an overstatement, but it's an approximation.
00:29:21.200
A connected major topic is neuroscience. You are one, or you're a neuroscientist. And yet another
00:29:26.520
major theme is nihilistic terrorism. And of course, you're now one of the most outspoken people.
00:29:30.680
In the U.S. on this subject, I think the only lifelong focus of yours that's not a major
00:29:36.540
obsession of the book is jujitsu. So we will keep the jujitsu talk to an absolute minimum here.
00:29:41.740
But before we go back into all this, and particularly nihilistic terrorism, I'd like to consider
00:29:45.700
the life trajectory that made you expert in all these topics, starting at the first time
00:29:52.380
our lives overlapped without either of us realizing it. We were both undergraduates at Stanford at the
00:29:58.140
same time. I was a year ahead of you, young man. And I'd like to go back that far just briefly,
00:30:03.760
because you embarked on an unusually bold, and as it turned out, unusually long project for one of an
00:30:11.240
undergraduate age. And it's a project that I think has a great deal to do with who you are now.
00:30:17.000
So when you arrived at Stanford, you're on campus, you haven't yet made this bold decision to take an
00:30:22.680
enormous amount of time off. What was your thinking of religion at that point? Were you an atheist
00:30:28.280
already? If you were, was that a major part of your identity, a minor part?
00:30:33.140
Well, I was definitely an atheist, but I wouldn't have called myself one. The term atheist was not
00:30:40.220
really in my vocabulary. I was completely unaware of the history of atheism, organized atheism. I
00:30:47.680
wouldn't have known who Madalyn Murray O'Hair was. And I had never been given religion by my
00:30:54.340
parents, so I wasn't reacting against some dogmatism that had come from the family.
00:31:00.420
And you came, your parents were from very different religious traditions, correct?
00:31:06.380
Just unreligious, yeah. I mean, they were just, but again, they were not atheists. They wouldn't have
00:31:11.600
But you had, one of your parents was Quaker, is that right?
00:31:14.300
Yeah, Quaker, and my mother's Jewish. And so this is also slightly an artifact of what
00:31:20.120
it is to be surrounded by cultural Jews who are not religious. I mean, so Judaism is almost
00:31:25.240
unique in that you can have people for whom their religion is still a seemingly significant
00:31:31.180
part of their lives. They care that they're Jewish, but there is zero otherworldly or supernatural
00:31:38.220
content to their thinking about what it is, what it means to be a Jew.
00:31:41.560
I believe it probably is unique. I mean, maybe the Parsis have something similar.
00:31:45.600
Yeah, and this Jewish experience of secularism is fairly misleading to most Jews, I find,
00:31:52.700
because they kind of assume that everyone else has lost their religion to the same degree.
00:31:58.220
You know, so I've debated Conservative rabbis who, when push came to shove, revealed they believed
00:32:05.080
almost nothing that could be classified as religious. Their notion of God was so elastic as to commit
00:32:14.140
them to almost nothing. You know, nothing specific about what happens after death, nothing that can
00:32:19.300
necessarily be prayed to or that can care about human events. I'm not talking about Reform Jews,
00:32:23.800
I'm talking about Conservative Jews. You know, the ultra-Orthodox believe a fair number of imponderable
00:32:28.500
things. But short of that, Judaism has really been denuded of its otherworldliness. I grew up in
00:32:35.580
that kind of context where even religious people—again, my family wasn't—but even people who went to
00:32:41.520
synagogue didn't believe anything. So I was fairly sheltered from the culture wars in that respect,
00:32:47.820
and was just unaware of the kind of work that religious ideas were doing in the world or in the
00:32:58.500
phase. When I got to Stanford, I remember being in the Great Books Seminar, and the Bible was one of
00:33:03.940
the books that is considered great and that we had to read. And I remember getting into debates with
00:33:10.620
people who had clearly come from a Midwestern Christian background, say, or more of a Bible Belt
00:33:17.340
experience, and just, I mean, having absolutely no patience for their belief that this book was
00:33:27.620
fundamentally different from the Iliad and the Odyssey or anything else we were reading in this
00:33:32.480
seminar. And the professor's way of holding that text in particular compared to the other books—I
00:33:39.420
don't know if she was religious, but she seemed to be carving out a kind of different place on the
00:33:43.660
bookshelf for this text to occupy. And from my point of view, the stuff we were reading wasn't even
00:33:50.300
great. I would admit that there are great parts of the Bible, but, I mean, we were reading Leviticus
00:33:56.400
and Deuteronomy, and just, I mean, these are the most deranged recipes for theocracy that have ever
00:34:03.220
been written. I mean, certainly sections of them are worse than anything that's in the Quran or any
00:34:07.800
other terrible book. I was just astonished that we were wasting time reading this stuff. The only argument
00:34:15.360
for reading it, in my view then, and it's really my view now, is to understand how influential the
00:34:22.580
book has been elsewhere. I mean, you want to be able to understand the allusions in Shakespeare,
00:34:27.220
you have to be conversant with the Bible. But the idea that this is somehow a great flowering of human
00:34:33.680
wisdom, you know, again, specifically books like Deuteronomy and Leviticus.
00:34:39.280
Those are books in which the grim punishments for people who step out of line, among other things,
00:34:45.000
are detailed in kind of gory detail. Yeah, and they're not allegories for anything. It's just,
00:34:51.200
these are the reasons why you need to kill not only your neighbors, but members of your own family
00:34:56.880
for thought crimes. Here's how you should be living. And it's just, you almost couldn't invent a worse
00:35:03.700
worldview. And the corollary to that is anyone, any neurologically intact person, in five minutes
00:35:12.360
can improve these books spiritually and ethically and politically and in every other way, scientifically,
00:35:20.000
economically. I mean, there's just nothing that this is the best for, even good for, apart from
00:35:27.180
creating conditions of, you know, Taliban level intolerance in a society. That is, if, you know,
00:35:33.200
people actually believe this stuff. And, you know, very few Jews now believe that you should be paying
00:35:38.560
any significant attention to Leviticus or Deuteronomy. And Christians have their own
00:35:43.080
reasons for ignoring it. But what we're witnessing in the Muslim world is that there are analogous
00:35:48.180
texts, the parts of the Quran being one, and the Hadith, and the biography of Muhammad being the rest
00:35:55.100
of the canon, which detail, you know, very similar levels of intolerance and a commitment to prosecuting
00:36:02.200
thought crime. And many, many millions of people take them very, very seriously.
00:36:07.340
And so you were in a state of outrage at the fact that these texts were being held up as great.
00:36:12.680
You were certainly not a believer in any manner. Atheism may not have been a word you would have
00:36:18.400
applied to yourself, but it was something that you, essentially, from what you're describing,
00:36:25.000
If you look at the DSM, 10-year journeys of spiritual discovery are generally not considered
00:36:31.980
to be symptoms of atheism. Yet, from that point of de facto atheism, you essentially did take off on,
00:36:39.640
is it fair to say, a 10-year journey of spiritual discovery and near full-time exploration of
00:36:46.960
Yeah. So what happened is I took MDMA for the first time, and I had taken other psychedelics
00:36:53.840
as a teenager. I mean, really just mushrooms a few times.
00:36:57.200
And I will add that Stanford in the late 80s was awash in MDMA long before it entered the club
00:37:07.580
I didn't know that, actually. I'd never encountered it.
00:37:09.620
Yeah, yeah. No, it was all over the place. And we called it X in the United States, and then the
00:37:15.120
Brits, who kind of discovered it a few years later, called it E. And it was something that was just so
00:37:20.740
part of just sort of the fabric that I mistakenly thought it was a very, very widespread drug,
00:37:26.120
and it didn't become widespread until much, much later. Now, I wasn't as bold as you. I actually
00:37:30.100
was fearful of this stuff, but it was everywhere. It was definitely everywhere in the 80s. Yeah.
00:37:38.320
Well, you were hipper than I was because you actually tried it.
00:37:40.680
Yeah, no, I mean, maybe it was everywhere because I had taken it, and I was proselytizing.
00:37:44.420
Yeah. I was evangelizing pretty hard, at least to three captive friends when I got back to campus,
00:37:51.820
because it really did blow my mind. I mean, it just changed everything about what I thought was
00:37:58.380
So that was the pivoting incident. That was what caused you to—I didn't realize that. So that
00:38:02.640
was the thing that caused you to say, I'm out of here, at least for now.
00:38:06.520
Its connection to my dropping out was a little less direct than that. It took a little more time,
00:38:13.440
but it just took like a quarter, you know. But it was, you know, 10 weeks later, I was not
00:38:18.120
enrolling again. But I guess I took it during spring break or something. I wasn't at Stanford. I was
00:38:23.960
back home when I took it. This is something I write about in the beginning of my book, Waking Up.
00:38:29.500
It was the first experience I had where the implications of that change in my consciousness,
00:38:34.700
they were far more global, and they suggested something about the possibility of changing
00:38:39.760
one's consciousness in a more durable way. I wasn't left thinking, wow, ecstasy is amazing,
00:38:46.700
or, you know, that was a very interesting drug experience. It seemed to unmask something about
00:38:52.240
the nature of my own mind that was more true than what I was tending to experience. So the experience
00:38:58.960
of coming down from it was the experience of having my actual true self, in a way, occluded by neurotic
00:39:10.060
layers of my personality that were being rebuilt, that had been suppressed by the drug. So, I mean,
00:39:15.900
the experience was briefly of just feeling all self-concern drop away. So that I was, you know,
00:39:23.080
sitting, I was talking to one of my best friends. He still is one of my closest friends. And he had never
00:39:28.360
taken it before either. So we both took this, and we, again, we took it, this is before anyone had
00:39:33.680
a rave or, yeah, so, and we took it very much in the spirit of trying to find out something
00:39:39.520
interesting about our minds. We weren't partying, this was, this was...
00:39:42.820
More of a Timothy Leary than a Ken Kesey type of experience.
00:39:46.160
Yeah. I mean, this was given to us as, this had been kind of an export from the psychotherapeutic
00:39:51.420
community. Like, this is a drug that shows you something about the nature of spirituality,
00:39:55.840
the nature of love, ultimately. So we were just curious about what was there to be discovered.
00:40:02.160
And I just remember talking to him, and there was nothing psychedelic about it at all. I mean,
00:40:08.180
there were just no visual distortions, no sense of coming onto a drug, just this increasing sense of
00:40:14.080
moral and emotional clarity, where I just have more and more free attention to just talk to my friend.
00:40:21.620
I'm getting less and less at every moment as I'm coming onto this, and it took a while for me to
00:40:27.240
recognize what had happened, but I'm becoming less and less encumbered by the concern about what he's
00:40:33.580
thinking about me. I mean, so, like, I'm looking into his eyes, and I'm no longer, like, and, you know,
00:40:39.080
there's changes in his facial expression in response to what I'm saying, and I'm no longer reading that
00:40:45.280
as a message about me. It's like, it's like, I'm no longer behind my face looking at him,
00:40:52.220
no longer tacking in the wind of somebody else's attention on me. There was just a sense of zero
00:40:59.000
self-concern. I mean, I just, my attention was not on myself at all. I was simply paying attention to my
00:41:04.640
best friend. And that pure granting of attention was love. What I was experiencing more and more as the
00:41:13.760
minutes ticked on was just a total commitment to his happiness, just his well-being, just wanting
00:41:21.960
everything that could, that could possibly happen for someone to happen to him, right? There was
00:41:26.900
nothing transactional about that. It was just a pure state of being. It was just like the state of being
00:41:32.340
fully attentive to another person as just the locus of a moral concern.
00:41:40.240
And this led you to decide that you wanted to significantly alter your curriculum, I guess. I
00:41:45.580
mean, you were at that point taking the, you were a sophomore at this point. Yeah. So not a notoriously
00:41:51.320
delightful year for anybody, but you were taking a wad of things, preparing to declare your major if you
00:41:55.700
hadn't yet already. And so I assume that this made you realize that there was a different curriculum you
00:42:01.280
wanted to pursue in a sense. So ironically, it led me to realize that all of the otherwise incoherent
00:42:08.480
and offensive noises that religious people had been making for millennia actually, actually were
00:42:14.600
inspired, must have been inspired by experiences like this, right? So like, whatever you want to
00:42:21.560
think about Christianity and the Bible, Jesus was probably talking about this, right? Or something
00:42:26.820
like this. So the one thing that just bore in upon me like a freight train in that experience was
00:42:32.300
the recognition that millions of people had had experiences like this, and many not through drugs,
00:42:40.040
but through, you know, prayer and fasting and, you know, other contemplative exercises, yoga,
00:42:45.600
meditation. So there was a path. Your mind could be more and more like this than mine had tended to be.
00:42:54.700
Yes. Yeah. Because it's all just chemicals. I mean, the drug is, you know, drugs are mimicking
00:43:00.220
neurotransmitters or inspiring neurotransmitters to behave differently. I mean, you only have a few
00:43:05.920
levers to pull in there, but I didn't have a background in neuroscience at that point. And I
00:43:10.920
had been an English major. And so when I went back to school, there was nothing in school that I could
00:43:16.520
connect with that immediately seemed like this is the most rational use of your time, given what you just
00:43:21.880
experienced. And I also was writing, I was also planning to write fiction. I wanted to write...
00:43:26.840
I know you were working on a novel, weren't you?
00:43:28.460
Yeah. Yeah. So I had a kind of a dual agenda when I dropped out. I was going to write a novel and
00:43:34.320
study meditation. I started going on meditation retreats that were getting kind of longer and
00:43:39.780
longer. And then I was going to India and studying meditation with various teachers and going to
00:43:45.180
Nepal. And I mean, this is mostly in a Buddhist context.
00:43:48.080
And did you buy into the religiosity of Buddhism? Because often, I mean,
00:43:51.620
there's extraordinarily powerful spiritual practice that is embedded in Buddhism. But in
00:43:55.920
other contexts, you've said, you can access that and leave the religiosity behind if you wish.
00:44:01.160
You're coming in as a young person, as a novice of sorts into this community. Was it easy for you
00:44:07.500
to take sort of almost the neuroscientific wisdom that was being transferred and leave out
00:44:13.660
the religious wrapping that I imagine it often came in if you were going on retreat and going to
00:44:20.960
Yeah, not entirely. I mean, I was not, I never became a religious Buddhist or much less a religious
00:44:28.100
Hindu, though I was studying with teachers in both traditions. But I was not yet a scientist. I was not
00:44:36.600
yet really scientifically literate. I mean, my background, I'd been studying English at Stanford.
00:44:41.660
And hadn't taken many science courses at that point. And I became very interested in the
00:44:49.680
philosophy of mind and in the conversation that was happening between philosophers and scientists
00:44:55.060
about the nature of consciousness. So I was reading, I was getting some brain science in reading what
00:45:01.420
philosophers were saying. And I was reading some stuff at the margins of neuroscience. And then I was
00:45:07.040
also reading a fair amount of popular physics, because a lot of the popular physics was being
00:45:12.260
marketed as a way of cashing out a new age mysticism. People were hurling books at me on
00:45:22.760
And the scientific and philosophical confusion there was not yet obvious to me. So at a certain
00:45:28.320
point, undoubtedly, when I'm up to my eyeballs in Krishnamurti and reading patently magical books
00:45:36.640
like Autobiography of a Yogi, you know, Paramahansa Yogananda. And then I'm also reading, you know,
00:45:41.660
Ken Wilber and people who are wrapping up Eastern wisdom with basically the spookiest exports from
00:45:49.880
physics. So if you had asked me what I thought the universe was like at that moment, undoubtedly
00:45:56.440
some new age gobbledygook would have come out, you know, which I now view as quasi-religious.
00:46:05.120
There's a fair amount of confusion there. And I've debated people like Deepak Chopra, who still
00:46:10.060
promulgate that kind of confusion. I was always interested in just in the experiential
00:46:14.780
component of meditation and any of these paths of practice. But when you go far enough into the
00:46:22.080
experiential component and begin to confirm some of the very surprising things, some of the very
00:46:27.480
surprising claims about the nature of the mind that only seem to get made by people in the East,
00:46:35.260
for the most part, who are also making claims about the magic powers that come with attaining,
00:46:41.740
you know, very high states of meditation and the miraculous feats of various yogis and gurus.
00:46:48.520
Well, then you're surrounded by people who believe, for instance, that their favorite yoga teacher can read
00:46:55.100
their minds, right? And I was always somewhat skeptical of these stories. I mean, I don't think
00:47:02.480
I had the phrase confirmation bias in my head, but I could see that the disposition among these people
00:47:08.560
to believe, the desire to believe these stories to be true, yeah, was, I mean, there was very little
00:47:15.080
resistance in the system to just accepting everything uncritically. I think I was, you know,
00:47:20.600
I was on the skeptical end there, but I was not spending any time trying to debunk claims about
00:47:25.420
magic. I was simply just trying to get to the most qualified teachers and learn whatever they
00:47:32.140
had to teach. And it was roughly a 10-year period, correct? Yeah. Which you were going on to retreats,
00:47:38.300
coming back. How many of those 10 years were you in silent meditation? Was it, would it total to a
00:47:44.560
year or more? It totaled to about two years. If you strung them all together, the various silent
00:47:49.040
retreats. Yeah. I mean, I was, I was doing, I never did a silent retreat longer than three months,
00:47:54.520
but I did, I did a couple of three months, a couple of two months. Sounds like a doozy to me.
00:47:59.280
Yeah. I mean, it's long. It's, it's just an amazing experience. I mean, there's something,
00:48:03.560
you know, paradoxically, you can experience the same thing in a moment off retreat. It's not that
00:48:09.240
there's in principle the necessity of being in silence, but for most people, it's amazingly powerful
00:48:15.600
to go into silence. It's an experience unlike any you tend to have, even when you're spending much
00:48:21.460
of your day alone and out in the world, you know, for those who don't have an experience with
00:48:26.140
meditation, I guess some explanation is in order, but whatever practice of meditation you're
00:48:31.440
doing, you're really in two conditions while doing it. You're either lost in thought, you're just
00:48:36.980
distracted by your, the kind of the automaticity of discursive thought, and you've just forgotten that
00:48:41.740
you were supposed to be meditating, or you're paying attention to the thing you're trying to
00:48:45.320
pay attention to, and that is your practice of meditation. And we spend so much time in our lives
00:48:51.020
lost in thought, having a conversation with ourselves that we're not aware of having. And so
00:48:55.800
much of this conversation is neurotic. So much of it is producing unhappiness. You're thinking about
00:49:01.300
the things you, you regret having done. You're thinking about the things that didn't go well moments
00:49:05.700
before, hours before, days, or even years before. You're thinking about what you want,
00:49:10.200
what you're anxious about, what you're hoping will happen, you know, a moment hence, or at some point
00:49:14.900
in the future. And you're spending almost no time truly connecting with the present moment in a way
00:49:22.620
that is deeply fulfilling. And so, and to take my experience on MDMA, you know, one of its features
00:49:28.000
was just full immersion in the present moment. There was just zero past and future going on. And part of
00:49:35.660
the ecstasy of that experience is attributable just to that. And this is an experience you really can
00:49:41.040
have in meditation. Focusing on anything to sufficient degree produces an ecstatic state
00:49:48.180
of mind. I mean, there's bliss to be found just in being concentrated. It's just being sufficiently
00:49:53.600
concentrated on the breath or a light or anything. It doesn't matter what it is. You can also be
00:49:58.980
additionally concentrated in specific states of mind, like loving kindness, which is very much the
00:50:04.940
emotion that one often experiences on ecstasy. That is a specific meditation practice within the
00:50:10.980
Buddhist tradition. And, you know, in other traditions, there's a devotion to the guru. And, you know,
00:50:15.780
in the Western tradition, there's, you know, the love of Jesus, right? So there's no question that you
00:50:20.340
can be one-pointedly fixated on the object of your devotion and get that emotion so intensely realized
00:50:28.900
in your mind that it obliterates everything else. Incredibly expansive experiences await someone who
00:50:35.580
can get that concentrated. It need not even be in the positive emotion of love or devotion. It could
00:50:41.480
just be the breath. So I started, you know, I started training in various types of meditation for
00:50:48.060
periods up to three months or so. And so that was punctuating the decade of my 20s. And it took
00:50:54.140
me a while to realize that I had to go back to school. And did you come back to English at that
00:50:58.640
point? Because you were studying English at Stanford previously. I came back to philosophy because I had
00:51:03.740
been reading philosophy and essentially writing philosophy nonstop throughout this period for 10
00:51:08.500
years. So very much with the attitude of someone who's going to go to graduate school in philosophy,
00:51:14.180
I went back to finish my undergraduate in philosophy. With an idea that this is a segue
00:51:17.820
into graduate work, but then you ended up pivoting to neuroscience of all things, which is vastly
00:51:23.460
more of a hard science. How did that pivot come about? I mean, it makes eminent sense looking at
00:51:29.720
who you are now and regarding it with the benefit of hindsight. How did that come about in the moment?
00:51:36.280
The fact that I had dropped out of Stanford was also just sheer good luck because Stanford, as you
00:51:42.500
probably know, is like the one school, certainly the one good school that has this policy where
00:51:47.960
you basically can never drop out. I mean, you just...
00:51:50.560
Well, they call it stopping out. They don't even call it dropping out. So you've stopped out.
00:51:54.080
And there's a presumption that at some point in your life, you may wish to come back. And if you do,
00:52:00.140
Yeah. Yeah. So, you know, Tiger Woods can go back to Stanford today. I don't know how long it's been.
00:52:04.100
It's been 20 years or something, but he can just walk back in and the registrar will just
00:52:08.760
have his name in the computer. Take his check for sure.
00:52:11.960
You know, I guess that's the way it should be. I'm sure there's a reason why Harvard and Princeton
00:52:17.280
and other good schools don't do it this way. They don't want you back unless you've been writing
00:52:21.460
them letters every year. And at a certain point, I think you have to reapply. You have to give some
00:52:25.760
accounting for what your years in the wilderness have done to you.
00:52:29.600
Well, I think you're probably an object lesson in why that's perhaps not such a great idea, because
00:52:33.620
Stanford did get you back and it was to, you know, their benefit and yours, and I'd argue the
00:52:37.660
world's, that you were able to slide back into that and make this pivot to neuroscience.
00:52:42.400
It's interesting to look back on that because in my 20s, I remember at one point, I think I was
00:52:45.440
probably 25 and first had the thought, you know, I should really go back to school to do
00:52:51.420
this right. But the psychological barrier to going was real. I felt so old at 25. I felt so
00:52:58.480
neurotic about it: wait a minute, I can't go back and be a junior in college at 25. It's flabbergasting
00:53:05.320
for me to glimpse who I was at that moment because, you know, I went back at 30 or 31,
00:53:11.020
very close to 31. And that's a much more neurosis producing bit of arithmetic. And it was psychologically
00:53:18.080
hard to do because, I mean, you just picture it. I'm going back and I've, again, I've spent now a
00:53:22.760
decade reading and writing on my own. And I'm now having to do a full philosophy major,
00:53:29.920
taking all the courses. And I'm doing this as fast as I can because I want to get this done with.
00:53:33.900
Right. Because you started with English. So you're in, like, sophomore
00:53:37.000
seminars. You're in with freshmen.
00:53:39.180
Yeah. Yeah. I'm not getting any breaks. I don't have credit for
00:53:42.860
what I've already read. So I'm taking a massive course load to do this quickly,
00:53:46.740
but I'm also getting my papers graded by, you know, 20-year-old TAs. And it was just brutal.
00:53:58.580
It was an extraordinary experience, but it was, you know, ultimately a good one because it was
00:54:03.900
just at a certain point, it was not about saving face. It was just, you just have to use this as a
00:54:10.000
crucible to get the tools, to be able to speak clearly, write clearly. And you just have to get
00:54:16.220
out of your own way. I mean, I was spending all of my time focused on overcoming the
00:54:22.220
hallucinatory properties of the ego, right? It's like, I want to wake up from this hallucination
00:54:27.560
where it seems to matter what another person thinks about me and conditions how I feel about myself.
00:54:34.400
And you know, if 10 years of meditation aren't going to get you there, I guess it's just time
00:54:39.000
Yeah, exactly. And what meditation gets you, at least at my level, is not a permanent inoculation
00:54:46.060
against all of these unpleasant states of mind, but a shortening of the half-life of psychological suffering.
00:54:56.420
Yes. It's sort of up to you how rapidly. At a certain point, you can just decide, all right,
00:55:00.480
I'm going to stop suffering over this thing. And absent an ability to really meditate, you're a
00:55:06.400
victim of whatever half-life it's going to be in your day. So if you get
00:55:10.640
suddenly angry now about something that happens, you know, you could be angry for an hour, you
00:55:16.580
could be angry for a day, you could be angry for a week. And over that period, you could
00:55:20.940
do all the life-deranging things that angry people do to screw up their relationships.
00:55:28.460
If you're angry over a week or a month or whatever it is.
00:55:31.260
And the difference between being angry for 30 seconds and being angry for an hour, it's
00:55:38.020
impossible to exaggerate how important that is.
00:55:42.460
And so it is with embarrassment and everything else.
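(A side note on the arithmetic behind that "half-life" framing, added here as an illustration rather than anything said in the conversation: if you model an emotion's intensity as decaying exponentially, the total suffering you accumulate scales directly with the half-life, so shrinking it from an hour to 30 seconds cuts the cumulative burden by a factor of roughly 120.)

```latex
% Illustrative toy model only -- not part of the conversation.
% I(t): intensity of the emotion at time t; I_0: initial intensity; t_{1/2}: half-life.
\[
  I(t) = I_0 \, 2^{-t / t_{1/2}},
  \qquad
  \int_0^{\infty} I(t)\, dt = \frac{I_0 \, t_{1/2}}{\ln 2}.
\]
% Cumulative suffering is proportional to t_{1/2}, so going from
% t_{1/2} = 60 minutes to t_{1/2} = 0.5 minutes is a 120-fold reduction.
```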
00:55:44.600
So you got through and then neuroscience beckons.
00:55:47.780
I was going to do a PhD in philosophy, but again, my interest was in the philosophy of
00:55:53.620
mind. And it was just so obvious that the philosophers
00:55:59.900
were either having to become amateur neuroscientists to actually interact with what we were finding
00:56:06.700
out about the brain, or they were just having a conversation that was completely uncoupled
00:56:12.420
to what was known about the brain. And so I just decided I needed to know more about the
00:56:18.300
brain. But I went into neuroscience very much as a philosopher, I mean, with philosophical
00:56:23.960
interests. I never went in thinking, well, you know, maybe I'm going to work
00:56:28.100
Did you have to take, like, pre-med courses or anything? Because, I mean, I think of neuroscience as,
00:56:32.360
obviously, a deeply biological subject. You're going to need to understand, you know,
00:56:36.260
metabolic pathways, neurological pathways. Did you have to take like a whole pile of classes
00:56:41.000
having finally finished this philosophy degree to qualify?
00:56:43.900
As I was finishing my degree at Stanford and my interest in the brain was starting
00:56:50.180
to come online, I took a few courses that were proper neuroscience courses. And then when
00:56:57.420
I applied, I got provisionally accepted. They wanted me to take a genetics course at UCLA.
00:57:03.500
I had about nine months between when I finished at Stanford and started at UCLA. And I needed to
00:57:09.900
take a genetics course just to show them how I would function in a proper science class.
00:57:16.180
I've always been a bit of a drudge and a good student. So, I mean, there was no problem
00:57:21.960
doing that. And happily, what happens when you go into, I don't know if this is true in every
00:57:25.620
neuroscience program, but at UCLA, whatever you've come from, you have to take everything all over
00:57:30.420
again. So I'm surrounded by people who did their undergraduate degrees in, in neuroscience or in
00:57:36.140
molecular biology, but we have to take all these fairly basic courses in, you know, molecular
00:57:43.280
neuroscience and cellular neuroscience and systems neuroscience. And you just have to take it all
00:57:48.200
again if you've done that as an undergraduate. So it's review for them and arguably a little bit
00:57:51.540
easier, maybe a lot easier, but you're all going through it. You're getting put to the same level.
00:57:56.000
That's good. Yeah. And on some level, all of that is just a vast memorization feat. You know,
00:58:03.960
I mean, certainly neuroanatomy is just this memorization exercise, unlike any other. And
00:58:10.080
you're just learning how to play a language game. You're just learning just the concepts and the parts
00:58:15.860
and how to talk about them, and how we currently understand them to be interrelated.
00:58:21.660
Looking back on it, it would be daunting for me to have to do it again now, but it was,
00:58:25.660
it was totally fine. And then you get into your research, and you get into,
00:58:30.060
you know, having to use the methods and answer the kinds of questions you specifically want to
00:58:34.720
ask. And again, there, my interests were, you know, very high level and fairly philosophical. I mean,
00:58:43.060
I was studying belief with functional magnetic resonance imaging, fMRI. So putting people in the
00:58:50.200
scanner and having them evaluate propositions on various topics, propositions that were
00:58:57.600
either clearly true or clearly false or clearly undecidable. And so I was comparing belief and
00:59:02.480
disbelief and uncertainty and just looking at what it means neurophysiologically to be in a
00:59:09.860
state of accepting some propositional claim or rejecting it. So what brain regions were lighting up?
00:59:15.780
Yeah. And just what the difference is. And I was interested to know if it was
00:59:19.640
reasonable to speak about a kind of final common pathway, or a content-neutral property of
00:59:25.420
just belief. I mean, granting credence to a statement about the world, is that a unified
00:59:32.560
thing in the brain? And is rejecting something as false a unified thing that is, in some basic sense,
00:59:39.480
the same, whether you're talking about the virgin birth of Jesus or two plus two makes four,
00:59:45.280
we're recording a podcast right now, or you're a man, or you went to Stanford? To evaluate any
00:59:51.380
of those claims as true or false obviously invokes very different kinds of processing in the brain,
00:59:56.500
because, you know, math is one thing and your autobiography is another. The truth
01:00:01.480
testing wouldn't be the same there, but the granting of assent and crucially for me becoming
01:00:08.720
emotionally and behaviorally susceptible to the implications, really the imperatives of accepting
01:00:16.240
something to be true or rejecting it as false. So if someone comes in and says, you know,
01:00:19.860
I hate to tell you, but your wife is cheating on you. You know, I just saw her, you know, you think
01:00:25.620
she's on a business trip, but I just saw her at a restaurant with this Lothario who I know, right?
01:00:32.360
Is that true or false? Everything depends on whether that is true or false. And your evaluation
01:00:37.500
of it, given the right evidence, it's instantaneous, right? It's like your world changes in a moment,
01:00:44.460
this propositional claim, which is just language, it's just noises coming out of someone's mouth,
01:00:49.020
or, you know, it's just an email, right? It's just a bit of language that
01:00:53.080
becomes your world the moment you grant it credence. And so that shift...
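(To make the design being described a bit more concrete, here is a minimal, hypothetical sketch of the kind of within-subject contrast involved: comparing a response estimate recorded while subjects accept propositions as true versus reject them as false. This is not the study's actual analysis pipeline; the data are made up, and the simple paired t-test stands in for a full voxel-wise fMRI general linear model.)

```python
# Illustrative sketch only -- hypothetical data, not the published analysis.
# One response estimate per subject per condition for some region of interest,
# e.g. average signal change while judging statements true (belief) vs false (disbelief).
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)
n_subjects = 20

# Hypothetical per-subject response estimates (arbitrary units).
belief_betas = rng.normal(loc=0.6, scale=0.3, size=n_subjects)
disbelief_betas = rng.normal(loc=0.4, scale=0.3, size=n_subjects)

# The core move is a within-subject contrast: belief minus disbelief.
contrast = belief_betas - disbelief_betas

# A paired t-test asks whether that contrast differs reliably from zero across subjects.
t_stat, p_value = stats.ttest_rel(belief_betas, disbelief_betas)
print(f"mean contrast = {contrast.mean():.3f}, t = {t_stat:.2f}, p = {p_value:.4f}")
```

(In a real analysis the same belief-minus-disbelief contrast would be computed per voxel or per region from GLM estimates, which is what "what brain regions light up" cashes out to.)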
01:00:59.360
You almost made a belief detector, it sounds like.
01:01:01.700
We did, in fact, make a belief detector, which, you know, under the right conditions would also be
01:01:06.120
a lie detector. If you know whether someone is representing their beliefs accurately, you know
01:01:12.600
whether or not they're telling the truth. And, you know, that's an interesting topic, but the future
01:01:17.180
of mind-reading machines, I think, undoubtedly will be a future in which we will be increasingly
01:01:24.160
confident whether or not someone is telling the truth.
01:01:26.340
Yeah, because current lie detector technology is from the, what, the 1920s and is notoriously,
01:01:33.240
Yeah, but it's not even a valid science, even if you were not tricking it, you know, it's just...
01:01:40.960
Yeah, it's just measuring physiological changes that are correlated with anxiety.
01:01:44.560
But, you know, if you're not an anxious liar, then you're...
01:01:47.940
You're going to pass with flying colors. And if you're an anxious truth-teller, as some people are.
01:01:52.080
So in the middle of all this research, 9-11 happens.
01:01:55.160
And that, was that a direct trigger to the book The End of Faith?
01:02:00.000
Yeah, yeah. So within 24 hours, I was writing what became that book. I mean, I was writing initially a
01:02:06.420
book proposal, but I wrote essentially the first chapter of that book, you know, the very next day I
01:02:11.520
started writing it. So 9-11 came. I had finished my coursework. I was just starting my neuroimaging
01:02:18.280
work. I was already focused on belief, you know, and religious belief is a subset of that.
01:02:24.960
And I had just spent this previous decade plus focused on just questions of spiritual concern
01:02:33.160
and what is true in religion and why do we have these competing worldviews that are religious in
01:02:39.220
the first place? And what is it necessary to believe to have a meaningful life? And then people
01:02:45.500
start flying planes into our buildings, clearly expecting paradise. This is an act of worship,
01:02:51.840
you know, and we immediately start lying to ourselves about why they did it. And because I had read the
01:02:57.420
Quran, I hadn't focused on Islam to any great degree, but I was pretty sure I knew what
01:03:04.240
these guys were up to, right? Like the moment I heard about what Al-Qaeda was, just you have someone
01:03:10.460
like Osama bin Laden who could be doing anything he wants. He's got hundreds of millions of dollars.
01:03:14.680
He could be living in Paris and dating models, but no, he's decided to live in a cave and plot,
01:03:20.480
you know, the takeover of the world for the one true faith. I immediately recognized the spiritual
01:03:27.700
intensity of that enterprise. He was not faking his belief. He believed what he said he believed,
01:03:34.760
and it was only rational to take his stated beliefs at face value. I had been surrounded by people who
01:03:40.940
believed the Hindu version or the Buddhist version of karma and rebirth, right? And they believed it
01:03:47.080
absolutely to their toes. And I understood why they believed it. And many of them were having
01:03:51.960
intense experiences of the sort I was having in meditation or on psychedelics. And there's no doubt
01:03:58.440
in my mind that members of Al-Qaeda were having intensely meaningful experiences of solidarity,
01:04:04.580
you know, among their fellow jihadists. And many of us have gotten into things that suddenly seem to
01:04:12.540
answer much of what we were lacking in our day-to-day experience.
01:04:19.180
Yeah, but I mean, even seemingly more trivial things. So we all know that certain people,
01:04:23.440
you know, they become vegan or whatever, and all of a sudden it's all about getting their diet
01:04:26.900
straight, right? Or they get really into yoga. You know, and this happened to me with Brazilian
01:04:30.780
jiu-jitsu. I mean, I got into Brazilian jiu-jitsu, and all of a sudden, it's the only thing I can talk
01:04:34.780
about with people. Like, it's just, you know, I've become a cult recruiter for jiu-jitsu. I mean,
01:04:40.200
you go down the rabbit hole with these things, and suddenly you have immense energy for paying
01:04:47.540
attention. It just becomes effortless to pay attention to this thing. Now, just imagine
01:04:52.500
something that has all of these components. One, you actually believe the doctrine.
01:04:58.100
So you believe that this life is just a way station here. And the only thing that matters here
01:05:06.440
is getting your head straight about what's on the other side of death. You have to believe the right
01:05:12.700
things now. You have to get your life straight now so that when you die, you go to the right place,
01:05:18.860
right? There's no question that millions of people, billions of people, really, most people who have
01:05:24.700
ever lived believe something like that about the way the universe is structured. And Islam,
01:05:30.840
in particular, this especially doctrinaire version of it, gives a uniquely clear picture of just how
01:05:38.220
all of that is organized. I mean, it's just, it's a very self-consistent view of just what you need to
01:05:42.380
believe and how you need to live to get to the right place. Imagine having that kind of moral and
01:05:48.860
spiritual clarity in your life, which immediately translates into a recipe for how to live. I mean,
01:05:54.280
there's just zero ambiguity about how society should be structured, how men and women should relate.
01:05:58.660
But then there's this whole political layer, which is all of these historical grievances where the
01:06:05.620
West, the infidel West and the materialistic West, really the obscene West has, by some perversity of
01:06:13.520
history, acquired all this power and essentially trampled upon the only civilization that has ever
01:06:20.820
mattered to God, which is the Muslim one. In addition to everything else, you have essentially the yoga
01:06:27.180
component and the diet component and the personal life straightening component. You have this political
01:06:33.700
component where you have to right this great historical wrong and spread this one true faith
01:06:39.820
to the ends of the earth. I mean, this is a missionary religion. This is not Judaism. This is not Buddhism.
01:06:45.200
This is the way this works is you spread this thing, right? And there's nothing pacifist about this.
01:06:51.920
You, as a man, you get to harness all of your testosterone. You get to be essentially a spiritual
01:06:58.120
James Bond, right? You get to go to war for this thing. You get to kill the bad guys. You get to be
01:07:05.440
But with social approbation within your circles as opposed to the negatives that would come with
01:07:11.720
Exactly. Yeah, yeah. Like, this is a spiritual gang. It's also incredibly well-funded. I mean,
01:07:18.380
if you look at how the Saudis have funded the spread of the Wahhabi-style Islam, this is a gang
01:07:26.320
And the rewards are simply beyond comprehension, literally, because the rewards are paradise.
01:07:33.380
I mean, it's like we see gangs motivated by, you know, money and access to women and all the
01:07:38.740
things that, you know, have powered, you know, lots of gangs and lots of songs. And that's
01:07:44.320
teeny compared to the upside that these folks would imagine that they're playing with.
01:07:50.280
And so you felt you knew a thing or three or 10 or 100 about belief.
01:07:55.320
This happens, you dive into it. And it's interesting just talking about belief, because I know
01:07:59.540
one of the complaints that you have about a lot of your critics is that they don't seem to think
01:08:05.360
the Islamists believe that which they actually say.
01:08:09.360
Yeah, it's amazingly durable, this piece of confusion. But the idea is that the jihadists,
01:08:16.080
you know, even those who blow themselves up, right, in what is just transparently
01:08:20.660
kind of the ultimate act of self-sacrifice, they don't believe what they say they believe.
01:08:26.280
They're not being motivated by religion. Religion is, at worst, being used as a pretext
01:08:30.780
for political goals and economic grievances and, you know, psychological instability, right?
01:08:38.520
Or it's being cited by Islamophobes as a way to sort of slander Islam by saying,
01:08:45.000
well, these people did it for religious reasons. No, that's an Islamophobic thing to say. They
01:08:49.140
really did it for this other reason. What other reason is offered as an alternative to fervently held religious belief?
01:08:54.540
Political grievances or they were so despairing over the state of the Palestinians, you know,
01:09:00.580
under the Israeli boot. Again, this can be more or less plausible if you're talking about a
01:09:05.920
Palestinian who's being mistreated in Gaza. It's completely implausible when you look at a third
01:09:12.400
generation British Muslim recruit to ISIS who had to drop out of, you know, the London School of
01:09:18.640
Economics in order to go to Syria, right? And there are endless numbers of cases of people who have
01:09:24.460
every other opportunity in life who become, quote, radicalized in this way. There's a deep skepticism
01:09:30.960
among people who simply don't know what it's like to believe in God, frankly. A real God, you know,
01:09:37.320
a God who can hear your prayers, a God who can hate homosexuals, a God who cares how you live,
01:09:42.240
not this elastic God of just good vibes in the universe. People have lost touch with what that's like. Many
01:09:47.920
academics, you know, virtually every anthropologist I've ever talked to about this stuff,
01:09:53.000
many journalists, many so-called scholars of religion, just don't know what it's like to
01:09:58.360
believe in God, and they then doubt that anyone really does. They don't actually think that people believe
01:10:04.520
that they'll get virgins in paradise, right? They think this is just propaganda, propaganda that nobody actually believes.
01:10:10.880
Almost like the Judaism that you described of your youth, where people would go to synagogue and
01:10:15.580
they'll go through these things, but not because they believed in something ephemeral, but because that
01:10:19.080
was sort of a cultural or a community activity. People are projecting that onto this world. And
01:10:25.260
you certainly are not saying this as some kind of a neocon. I mean, I imagine you probably first
01:10:30.260
voted in a presidential election in 1988. How many Republicans versus Democrats have you voted for?
01:10:37.940
Never voted for a Republican. And you actually think that this was a decisive issue or potentially
01:10:43.580
decisive issue in the election that we just had, correct?
01:10:49.560
Well, yeah, because we had a president for eight years that just clearly lied about
01:10:55.880
this particular topic. I mean, he would not name the ideology that was delivering us this form of
01:11:04.600
terrorism. He would just talk about generic extremism or generic terrorism. And he was
01:11:10.700
quite hectoring and sanctimonious about the dangers of naming this ideology. So at one point he
01:11:18.160
gave a speech just pushing back against his critics. You know, I was a huge Obama fan, actually. And
01:11:22.840
when I compare him to our current president, it feels like we have kind of fallen into some new part of
01:11:28.160
the multiverse that I never thought we would occupy. I mean, it's just unimaginable that we've taken this
01:11:32.180
turn, where you have a totally sane, intelligent, ethical, professional person running the country,
01:11:39.580
and then you have this unhinged con man running it next. But Obama really got this part wrong,
01:11:47.460
and disastrously so. And Clinton seemed to be echoing most of his delusion on this part. I mean,
01:11:53.540
at one point she talked about extremist jihadism, or radical jihadism.
01:11:59.920
As if there's moderate jihadism. Yeah, so there's moderate jihadism that doesn't pose a problem
01:12:03.160
for us. But so in the immediate aftermath of Orlando, the Orlando shooting that killed, I think,
01:12:08.340
49 people. 49. It was the biggest mass shooting in American history. Right, right. And no parallel.
01:12:15.200
And clearly an act of jihadism. I mean, just transparently so. Everything that Omar Mateen said
01:12:21.340
was just, he just connected all the dots. It could not be clearer. And Hillary Clinton spoke only about
01:12:27.880
the need for gun control, and the need to be on guard against racism in the aftermath of Orlando.
01:12:33.940
And that was just, I know at least one Muslim who voted for Trump, just because of how galling
01:12:41.120
she found that. To use Trump's language about political correctness and delusion, it's all true.
01:12:46.560
I mean, it was just a refusal based on this fake concern about racism. I mean, Islam is not a race,
01:12:54.200
right? Not at all. You and I could convert to Islam right now, and we would be part of this
01:12:58.380
particular problem if we converted. When I lived in Cairo, I knew lots of Western, both American
01:13:03.580
and European converts, who were very sincere and devout Muslims, and they had not a drop of Arab
01:13:08.640
blood in them, etc. It is not a race, absolutely. And you can be more devout. You know, it's easier to
01:13:13.960
convert, because if you're actually going to convert on the basis of the ideas, the only way to convert is
01:13:19.460
to actually claim to believe these specific doctrines, right? And the doctrines get fairly
01:13:26.520
inimical to most things we care about in the 21st century very, very quickly. You can't convert to
01:13:32.480
the lived experience of just having been a nominal Muslim surrounded by Muslim culture, and analogous to
01:13:39.380
the Jewish experience that we just talked about. So I just had Fareed Zakaria on my podcast, and
01:13:43.800
he's a Muslim. He identifies as a Muslim. He's clearly not religious at all. I mean, most serious
01:13:50.640
Muslims would consider him an apostate. I mean, he's not a believer, right? But he has a Muslim
01:13:56.720
experience, analogous to the kind of Jewish experience, that matters to him, and he feels
01:14:01.820
solidarity with that community. You know, I can't convert to that, right? Because I don't have that
01:14:06.020
experience, but I could become a member of ISIS if I check the right boxes. Hillary was such an
01:14:11.260
obscurantist on this issue. And again, in the immediate aftermath of this horror, when you're
01:14:18.860
having attacks in Europe that are also enormous and seeming, you know, to presage more to come
01:14:26.680
in our own society, right? And this need not have been a winning issue for Trump, but it was among the
01:14:33.520
two or three things that... Yeah, in an election that tight, there are arguably probably dozens of
01:14:39.120
winning issues, because anything that swung a few tens of thousands of votes... Yeah, 75,000 votes.
01:14:44.280
Yeah, in the right or the wrong place. Now, you mentioned, you know, political correctness and
01:14:48.880
language. You have stated a few times that you view free speech as the master value. Would you care to
01:14:55.560
just say briefly why that is? Because I think it's an intriguing notion.
01:15:00.440
Because it's the only value that allows us to reliably correct our errors, both intellectually
01:15:07.540
and morally. It's the only mechanism we have as a species to keep aligning ourselves with reality as
01:15:17.540
we've come to understand it. So you're talking about the data of science, you're talking about
01:15:21.220
the data of human experience. Everything you can conceivably use to judge whether or not you're on the
01:15:29.040
right track or the wrong track. And again, this applies to everything. This applies to human health,
01:15:34.180
it applies to politics, it applies to economics, it applies to spiritual concerns, contemplative
01:15:38.660
concerns. It's the corrective mechanism. It's just, it's the only mechanism. And if certain ideas are
01:15:43.660
unutterable, you're not going to be able to correct, if there are certain things that you
01:15:48.800
refuse to talk about. This is what's so wrong with dogmatism. So dogmas are those beliefs or those
01:15:54.960
doctrines whose truth you assert, and which you demand people remain aligned to
01:16:00.640
without justification, right? It's like the time to justify them either never arrived or it's long
01:16:05.980
past. And these merely must be accepted going forward. So these are off the table. You know,
01:16:12.180
the Apostles' Creed, if you're a Catholic, that is off the table. It's instructive to know that the
01:16:16.260
word dogma is not a pejorative term in a religion like Catholicism, right? But it is everywhere else.
01:16:24.100
And there's a good reason for that, because even the most benign dogma can produce immense
01:16:30.740
human misery in surprising ways. And if you can't keep correcting for it, you're just
01:16:37.560
laid bare to the misery. So I mean, my favorite example of this, because it is such a surprising
01:16:43.560
mismatch between the seeming propositional content of the dogma and its effects in the world,
01:16:48.740
is a kind of twin dogma: life starts at the moment of conception,
01:16:54.260
and all human life is sacred. What could be wrong with that, right? So this seems to be the least
01:17:00.720
harmful thing you could believe about the human condition. How are you going to harm anyone believing
01:17:07.880
those things? All human life is sacred, and human life runs all the way down to a single cell.
01:17:15.680
What could go wrong? Well, what can go wrong is you suddenly get a technology like embryonic stem cell
01:17:20.820
research, where there's this immense promise, obviously unforeseen by the Bible, but also unforeseen
01:17:27.980
by every generation of humanity. Perhaps someone in the 1930s could have foreseen this was coming,
01:17:34.340
but not much before that, right? And you have this immense promise of alleviating scores of
01:17:43.200
conditions. Boundless suffering. Boundless suffering, full body burns and spinal cord injury and
01:17:48.860
Alzheimer's, just you name it. Who knows how much promise this technology holds for medical
01:17:56.680
therapy. And then you have people, and again, these people are the most influential people in our
01:18:02.400
society, from presidents and senators on down, and religious academics, and bioethicists who aren't
01:18:08.860
religious but still treat these magical doctrines as somehow deserving of respect. But you have this
01:18:14.460
idea that every fertilized ovum contains a human soul. You've got now souls in petri dishes, just as
01:18:21.840
vulnerable as the baby Jesus, that cannot be sacrificed, no matter what the argument is on the other
01:18:27.840
side. You can have, you know, people with Parkinson's or little girls in wheelchairs,
01:18:33.560
doesn't matter. I'm just as concerned about the life in this petri dish. And, you know, we've sort of
01:18:39.100
moved on because there have been workarounds found biologically, but basically we dragged our feet for
01:18:44.960
a good 20 years there. And who knows what medical insights weren't had as a result of that.
01:18:52.620
And what do you feel about the value of anonymous speech? There is inarguable value to anonymous
01:18:58.240
speech in brutal dictatorships where dissidents and others can get into enormous trouble, get tortured
01:19:04.420
and killed if they say something that gets detected by somebody who's incredibly nefarious and has really
01:19:10.900
no ethical standing in the minds of most folks in this country. So I think there's certain things,
01:19:16.160
I'm not talking about those relatively inarguable things, but I know that you don't enable comments
01:19:20.860
on your webpages. I know that you have had concerns about the quality of speech in places like the
01:19:28.140
YouTube forums and so forth. Do you feel that there is a fundamental difference between the value of
01:19:33.200
anonymous speech and, for lack of a better word, owned speech? Or do you feel that anonymous speech is
01:19:39.740
every bit as much of the master value, in a sense, that you attribute to free speech writ large?
01:19:44.840
But I wouldn't prevent it in most cases. Certainly there's like the whistleblower's role for it.
01:19:52.940
I'm in favor of journalists protecting the anonymity of their sources if, you know, great harm would come
01:19:58.400
to the sources. Generally speaking, I think it is one of the variables that accounts for why so much of
01:20:05.920
what is said online is so toxic. I mean, people feel a license to be jerks that they wouldn't feel if they had to put their names to it.
01:20:16.640
And then what about tools that enable tremendous anonymity to anybody? And I'm thinking particularly
01:20:22.640
of TOR. TOR, which is ironically a product of the United States Navy. It is something that I have no
01:20:28.100
doubt has masked the identity of lots of dissidents in ways that any reasonable person would applaud.
01:20:33.560
But at the same time, it preserves the anonymity and the secure communication, certainly between
01:20:38.400
terrorists. There's an enormous amount of child pornography there.
01:20:41.560
Again, it just cuts both ways. I think there is an argument to be made that something like that,
01:20:47.580
I mean, something like strong encryption is just inevitable. It's just a mathematical fact that
01:20:51.840
it's available and it will therefore always be available to anyone who's going to take the time
01:20:56.400
to acquire it. And this is something I kind of stumbled into on one of my podcasts or when the
01:21:02.000
first controversy around the FBI's unlocking of an iPhone came online. An iPhone, it was sort of
01:21:10.400
uncrackable by law enforcement. If you attempt the passcode too many times, it just goes into
01:21:16.280
permanent lockdown. Yeah, 10 times. Yeah, yeah.
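(An aside on the mechanism under discussion, not something said in the interview: here is a deliberately toy sketch of attempt-limited unlocking, in which the encrypted data is discarded after too many wrong passcodes, leaving nothing to recover without breaking the encryption itself. It is not Apple's actual design; it uses the Python cryptography package's Fernet API, the key derivation is a stand-in for a real hardware-backed KDF, and the 10-attempt limit simply mirrors the number mentioned above.)

```python
# Toy illustration of attempt-limited unlocking -- not Apple's implementation.
import base64
import hashlib
from cryptography.fernet import Fernet, InvalidToken

MAX_ATTEMPTS = 10  # mirrors the "10 tries" figure mentioned in the conversation

def key_from_passcode(passcode: str) -> bytes:
    # Stand-in key derivation for the demo; a real design would use a slow KDF
    # plus a device-held secret, which is what makes offline guessing infeasible.
    digest = hashlib.sha256(passcode.encode("utf-8")).digest()
    return base64.urlsafe_b64encode(digest)  # Fernet expects a base64-encoded 32-byte key

class LockedDevice:
    def __init__(self, passcode: str, secret: bytes):
        self._ciphertext = Fernet(key_from_passcode(passcode)).encrypt(secret)
        self._failures = 0

    def unlock(self, guess: str) -> bytes:
        if self._ciphertext is None:
            raise RuntimeError("device wiped after too many failed attempts")
        try:
            data = Fernet(key_from_passcode(guess)).decrypt(self._ciphertext)
        except InvalidToken:
            self._failures += 1
            if self._failures >= MAX_ATTEMPTS:
                self._ciphertext = None  # discard the data rather than allow more guessing
            raise ValueError("wrong passcode")
        self._failures = 0
        return data

device = LockedDevice("correct horse battery", b"texts, photos, location history")
print(device.unlock("correct horse battery"))  # the right passcode recovers the data
```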
01:21:18.280
And apparently no one can get in or almost no one can get in. And Apple was claiming not to have
01:21:23.920
devised its own ability to get in. And that struck me as a way of punting on Apple's part that was not
01:21:35.500
And their argument was that if they created a mechanism whereby they could answer a court order
01:21:41.720
and unlock an iPhone, that mechanism would be impossible to keep safe. Then everyone would
01:21:47.400
have a hackable iPhone. And I never really bought that. I felt like they could, if they had wanted to
01:21:52.860
keep it safe, they could probably keep it safe. And it seems to me that people do keep, I mean,
01:21:58.380
they keep other trade secrets safe, presumably. And...
01:22:03.380
Yeah. You know, if those are the keys to the kingdom, then presumably they could keep it safe.
01:22:08.300
Obviously, the tech community took a very strong position against the government there.
01:22:12.960
But we don't have the analogous right in any other area of our lives. When you draw an analogy to,
01:22:20.640
for instance, I want to be able to build a room in my house where I can put things and even put
01:22:28.080
evidence of all my criminal behavior that no one on earth in principle can get access to, right?
01:22:34.380
So there's no court order, there's no government process, there's no evidence of my own culpability
01:22:40.160
that could be so clear that it could get that room unlocked.
01:22:45.420
It's almost like your personal diplomatic pouch or having some kind of like privileged
01:22:50.020
communication with a lawyer. That is an unlockable box legally, but it's a physical box in this case.
01:22:55.280
Yeah. And so no one claims to feel that they have a right to that thing, right? It's not feasible.
01:23:00.860
We can't easily build it, right? Or we can't build it at all.
01:23:03.560
Or if we could, it would be unlikely that there'd be a mass movement for everybody to get one of those things.
01:23:09.020
Yeah. And so if someone had managed to build such a thing, and we had reason to believe that evidence
01:23:14.860
of his, you know, vast criminality was in there.
01:23:18.220
There was a severed head in it or something like that.
01:23:20.040
Yeah, right. So there's a murder that is going unsolved every day because we can't open this
01:23:24.460
closet, right? His argument that that's his personal property that can't be opened, that wouldn't hold
01:23:31.280
water to really, for really any of the people who are quite exercised about the necessity of keeping
01:23:36.160
their iPhones private, right? And then you have the cases. So I spoke to, I didn't have him on the
01:23:41.120
podcast, but I spoke to Cyrus Vance, who is, I think, probably still the district attorney of Manhattan.
01:23:49.700
And so we kind of ran through this with him for a couple hours. And he was telling me about, you know,
01:23:54.720
murders that are unsolved, where they know that the murder victim was texting with someone up to
01:24:02.020
the moment she was killed, or that the camera was on, right? Like,
01:24:07.160
people who had taken pictures of their murderers.
01:24:10.300
With the intention of them being seen, presumably.
01:24:13.200
And Apple was declining to help unlock these iPhones, right? And they had, at that point,
01:24:24.460
And you can imagine being the parent: your daughter gets murdered, and it is possible
01:24:33.540
Because she took the picture wanting her murderer to go to jail, and now all of a sudden it's a
01:24:42.400
The fact that we can't find some mechanism by which to right that wrong doesn't make sense to me. So I,
01:24:48.260
you know, I'm on both sides of this issue. I'm in favor of good people not having their
01:24:53.980
privacy needlessly invaded, obviously, and having secure communication. But at a certain point,
01:25:00.660
if you are behaving badly enough, I think the state has an interest in sorting out what
01:25:07.880
you've done, and why you did it, and who you collaborated with. And this controversy is going
01:25:13.040
to come back to us a hundredfold the moment we have reliable lie detection technology, right?
01:25:19.920
And I should also say that we have solved this problem in the opposite way, where people have
01:25:25.620
the opposite intuition with respect to DNA technology. So you do not have a right to keep
01:25:29.700
your DNA secret. You can't say, no, no, you can't take a swab of my saliva, because that's private
01:25:35.360
data, you know, that I don't want you to have access to. No.
01:25:37.720
And it would, on a certain level, be more logical for people to say, like, I'm sorry,
01:25:41.400
that is so intimate, you may not. It would be in some ways more defensible.
01:25:45.400
But it's not, and we've just steamrolled over that sanctity, because there's a forensic imperative
01:25:52.240
There's an overwhelming social benefit and crime-finding benefit.
01:25:56.200
Yeah. But the argument is, people are treating their iPhones essentially as a part of their minds.
01:26:04.980
Understandably, because there's so much information there. But when we can actually read minds,
01:26:08.960
right, the question is going to be: do you have a right to take the Fifth Amendment privilege
01:26:14.780
when we have lie detection technology that can sort out whether or not you're telling the truth?
01:26:21.080
And I mean, there are philosophical problems with relying on lie detection technology. I mean,
01:26:25.400
well, we know there are people who could be delusional, who could believe they are
01:26:29.200
telling the truth while giving a false confession, right?
01:26:32.120
Well, one of your guests, Lawrence Wright, wrote a book about that very phenomenon.
01:26:34.840
Exactly. Yeah, that was fascinating. So, I mean, that's a wrinkle we need to sort out. It seems
01:26:41.220
to me that there are certain moments where any of the claims of personal liberty and privacy
01:26:48.620
just break down. I mean, you make the stakes high enough and you make a person's culpability
01:26:54.300
obvious enough that we should be getting into their phones and computers by any means possible.
01:27:00.540
And because of the San Bernardino connection, this actually touches on another interest
01:27:04.620
of mine, which is that when you sit down to write a book
01:27:08.320
that's set in the very near future, certain depictions that you make of the near future
01:27:13.200
almost inevitably either come true or fail to come true during the period that you're
01:27:17.980
writing, particularly if you aspire for your book to be set roughly nine seconds into the future.
01:27:23.900
And one of the features of the world of After On is lone wolf terrorism, the self-organizing
01:27:30.900
kind that is inspired by ideology as opposed to being directed by a central group.
01:27:39.160
And to my absolute dismay, I take absolutely no pride in, quote unquote,
01:27:44.500
predicting this correctly. That has in fact started occurring to a significantly greater degree
01:27:50.960
in the couple of years since I started writing the book. Now, you made the point in your very
01:27:55.880
recent podcast with Graeme Wood that in some ways ISIS-inspired attacks are more scary than ISIS-directed
01:28:02.360
ones. And he made the counterpoint that ISIS-directed ones tend to have much, much higher death
01:28:08.260
tolls. But the ISIS-inspired ones, is it just their ability to pop up anywhere and spread like a virus that makes them scarier?
01:28:16.060
Yeah. Well, it's the demonstrated effectiveness and spreadability of the ideas that is the scariest
01:28:23.100
thing. I mean, there are two things to worry about in this world. You can worry about bad people and
01:28:28.020
you can worry about bad ideas. And bad ideas are much worse than bad people because they can
01:28:35.440
potentially inhabit the minds of good people and get even good people to do bad things. Right. So I'm
01:28:40.920
under no illusion, though many people are, that all the people who joined ISIS are simply bad people,
01:28:47.760
right? They're just people who believe these bad ideas. Many people imagine that ISIS is acting like
01:28:55.500
a bug light for psychopaths, right? That it attracts only people who would do bad things anyway, who would
01:29:00.360
have found some other reason to rape and kill and take sex slaves and cut people's heads off,
01:29:05.680
and just happened to find this reason. No, that's absolutely not what's happening. And we know
01:29:11.580
that that's not what's happening. There are psychologically normal people who become as
01:29:16.660
convinced of the veracity of ISIS's worldview as I became convinced of the utility of meditation
01:29:23.280
practice, right? And then they do something very extreme. What I did was very extreme. I dropped out of
01:29:28.480
a great college, right? And kind of derailed my life in conventional terms and forsook every other
01:29:36.620
reasonable ambition but to understand the nature of consciousness, for this significant period of
01:29:43.560
time, right? You know, you change a few of the relevant beliefs. I could have been, you know,
01:29:48.200
John Walker Lindh in Afghanistan with the Taliban, right? It's like, I recognize a person like that
01:29:53.760
as someone who's very familiar to me, you know? And John Walker Lindh, he's in prison now. He still
01:29:58.840
believes all this stuff. And he's getting out soon. And the force multiplier element of it matters a
01:30:03.940
great deal to me because I actually think a raw material that a lot of these nihilistic organizations
01:30:10.620
use is folks who happen to be feeling suicidal today. Humanity produces them in abundance, and has
01:30:17.060
across continents and societies and centuries; about a million people will kill themselves this year.
01:30:21.500
And by the way, it's very hard. I think probably impossible. If I were recruiting suicide bombers,
01:30:27.460
I would probably stay away from people who are happy and centered and empowered because talking
01:30:33.100
that person into killing themselves at all is an enormous lift compared to talking somebody into it who's
01:30:39.440
already coming to me out of their minds with, you know, addiction, with depression, with chemical
01:30:45.860
imbalances in their minds, whatever. So society produces this raw material in some abundance.
01:30:51.380
And some percentage of those people are inclined to take people with them. And some of those people
01:30:56.740
are secular. I mean, the guy who shot up the school at Newtown, he committed suicide. He was
01:31:01.700
relying on the police to kill him. He was committing suicide and taking as many people with him as
01:31:06.380
possible. Likewise, the guy who murdered the five cops in Dallas, that was, he didn't drop a bomb on
01:31:16.600
Yeah, Andreas Lubitz. And so that's the second force multiplier. And this gets me nervous. So when
01:31:22.220
somebody gets into that mental state, my feeling is that there are two force multipliers that stand
01:31:27.300
out. One is what is now animating them. And this gets to what you're talking about, the power of these
01:31:32.700
ideas. I mean, if you look at Mateen, the Orlando killer, he was a third rate loser who failed at
01:31:38.480
everything. He had been dumped by two wives before the age of 30. He could not hold down a job. I would
01:31:45.220
imagine that in many parallel universes, he's the kind of guy who might have killed himself or might
01:31:50.640
have killed an ex-wife or two ex-co-workers or something.
01:31:53.460
He probably also had some kind of gay shame thing happening.
01:31:57.140
Self-hating. Some self-hating thing going on. But there are many, many hundreds of people like that
01:32:03.340
who do themselves in. He got animated by an idea that inspired him to go out and literally commit
01:32:10.000
the biggest mass murder in the history of a country with a very high bar for biggest ever.
01:32:15.120
He killed 49 people. Now, the second force multiplier, as you just indicated, is going to be
01:32:20.220
weaponry. So this is a chilling fact. I wish I didn't know it, but I do. In the two and a half years
01:32:26.420
leading up to the Newtown attack, there was a series of very strange, unrelated school attacks
01:32:33.160
in China, mass murder attacks. And there were 10 of them.
01:32:37.200
And by chilling irony, the last one was literally just a few hours before the Newtown attack. Now,
01:32:43.740
those 10-ish attacks combined, all 10 of them put together, had roughly the same number of total
01:32:51.380
deaths as the lone Newtown attack, because they were being committed literally with knives and
01:32:57.260
hammers. Whereas the person who attacked in Newtown had the benefit of living in a society that sells
01:33:04.240
near cousins of machine guns to people who are on the no-fly list. Not that he was on the no-fly
01:33:09.740
list, but we permit that. So there's this huge force multiplier of weaponry. And then if you're
01:33:15.160
Andreas Lubitz and you have an airplane, okay, fine, you kill a couple hundred people more.
01:33:20.140
And with that chilling fact in mind, I'd like to just read a couple quotes to you from End of
01:33:26.180
Faith. Our technical advances in the art of war have finally rendered our religious differences,
01:33:30.780
and hence our religious beliefs, antithetical to our survival. We're fast approaching a time
01:33:35.560
when the manufacture of weapons of mass destruction will be a trivial undertaking,
01:33:40.240
while it—and these are from three different quotes—while it's never been difficult to meet your
01:33:44.140
maker, in 50 years it will simply be too easy to drag everyone else along to meet him with you.
01:33:49.480
So we have this force-multiplying spread of ideas, this proliferation of lone-wolf attacks.
01:33:55.200
We know what weaponry does. What weapons were you thinking about when you wrote that,
01:33:59.460
when you said in 50 years it will be simply too easy to drag everyone else? Were you thinking of
01:34:04.240
bioweapons, synthetic biology? Nuclear is harder to do.
01:34:09.320
Yeah, although it's not that hard, actually. I mean, it was hard to invent the technology.
01:34:14.460
The Manhattan Project was hard. It's not hard to render much of Los Angeles uninhabitable for
01:34:23.880
It's far less hard once it was invented, but still you need the resources of a nation-state
01:34:30.800
Well, you actually don't. I mean, you could actually—if you're willing to die,
01:34:33.980
you can be the weapon. And what you need is the enriched uranium or the plutonium. But you could
01:34:38.280
literally—you wouldn't get the full yield you would want if you want to kill the maximum number
01:34:44.020
of people, but you could take two, like, you know, 50-pound plates of enriched uranium and just
01:34:51.740
put one on the floor and slam the other one on top of it, and it would go critical. You would not get
01:34:59.720
Yes, but you would get—and you would be just kind of like the ultimate dirty bomb experience,
01:35:05.820
right? So you could actually be the bomb. But a much more reasonable thing to do if you're in
01:35:11.580
this business is to just do something that's analogous to the bomb design of Hiroshima and
01:35:16.500
Nagasaki, where you have a gun-style apparatus where you're shooting one piece of enriched uranium
01:35:24.280
or plutonium into the other, right? And just slam—essentially slamming them together harder
01:35:28.980
than you could physically. And again, that the yield there is not—it's not as complete as,
01:35:35.400
you know, a nation state would produce. But still, you could get a multi-kiloton yield.
01:35:42.480
There, the technical issue is just getting the fuel.
01:35:48.040
And so, yes, you do not need the tools of a nation state. You just need a few engineers
01:35:52.500
and machinists. You know, it's powered by ordinary explosives to get the things slamming
01:35:58.240
together. And I mean, there are a bunch of scenarios that have been described to everyone's
01:36:02.820
horror online, where you can do this in a shipping container, and you truck it into the DC, and
01:36:07.840
it can be activated with a cell phone. And William Perry has a terrifying bit of animation that he put
01:36:15.740
online that just shows you how simple and how totally destabilizing it would be to our society
01:36:23.520
to do this. So just imagine you build a simple device, which is just, again, just like Hiroshima,
01:36:29.740
you know, like a 15-kiloton explosion. If you put that, you know, right next to the Capitol building,
01:36:34.620
right, you just—now you have a continuity of government problem. You know, who did you kill?
01:36:38.680
You killed all the senators and congressmen and the president.
01:36:41.620
The Supreme Court and the Joint Chiefs, and yeah.
01:36:44.640
Imagine doing it in one American city, right, and then announcing, whether this is true or not,
01:36:52.180
who knows, but then announcing you have similar bombs placed in 10 other American cities.
01:36:59.600
Yeah, and you will do them—you'll do, you know, one a week until your demands are met.
01:37:05.140
How do we begin to respond to that, right? This is an act of terrorism, obviously orders of magnitude
01:37:10.420
beyond September 11th, which ushered in a decade of just derangement, you know, and cost trillions of
01:37:16.500
dollars in the aftermath, you know, at least two wars and financial crises. Imagine this happening
01:37:23.420
in one city. This is within the technical capacity of a group like ISIS or al-Qaeda. I mean, you don't—you
01:37:30.240
just need to get the fuel. And we have almost no way to prevent it. I mean, we don't—we're not screening
01:37:37.180
things at our ports so assiduously as to know this couldn't possibly get in.
01:37:44.200
Yeah, you just have to imagine weaponizing something akin to the Spanish flu, which, you
01:37:52.040
know, killed something like 50 million people in 1918. The sky is the limit there. You could get
01:37:58.060
something that is as easily transmissible and is even more deadly. When you're talking about a
01:38:06.080
bioweapon, the worst possible case is something that is easily transmissible and that doesn't make
01:38:12.840
you floridly ill for long enough for you to do as much damage as you possibly can.
01:38:18.480
You sneeze a lot on lots of planes, on lots of people.
01:38:23.060
Yeah. And then those people are sneezing on planes and on people, and then nobody knows there's
01:38:26.460
an outbreak until there's a million infectees or something like that.
01:38:29.540
Yeah. Something like Ebola doesn't have that going for it. You know, as bad as it is,
01:38:34.220
as horrible as it is, one of the reasons why it's not scarier is it is very quickly obvious
01:38:39.680
how sick people are. If you're talking about airborne transmission of something that has
01:38:44.380
very high mortality and a long incubation period, yeah, weaponize that. That is a civilization
01:38:52.900
canceling event if we don't have our act together.
01:38:56.660
And for now, George Church may be the only person who can do it. But in 25 years with biology
01:39:02.200
following what's sometimes called the Carlson curve, which is even steeper than the Moore's
01:39:05.700
law curve, who knows when it will be 10 people, then 100, then 1,000 people. So I'd like to close
01:39:10.800
on something that I wrestle with a lot. You gave a great TED talk on the risk of super AI. I won't
01:39:17.580
make you replay it here because people can access it. I'll just pull two quotes from it to just set
01:39:22.960
the context. You described the scenario of a super AI having better things to do with our planet
01:39:28.700
and perhaps our atoms than let us continue to have them as being terrifying and likely to occur.
01:39:35.360
And also saying it's very difficult to see how they won't destroy us. And I don't think that those
01:39:41.820
are shrill or irrational statements personally. I also don't think it's shrill or irrational
01:39:47.240
to think that what George Church alone can do today will be the province of many millions of lab
01:39:53.920
techs, probably in our lifetimes. And with those two forces out there, I don't know what scares me
01:40:01.740
more. And I think about proliferating, democratizing, existentially destructive technology. Just about
01:40:09.060
the only thing I can think of that might protect us against such a thing would be an incredibly benign
01:40:15.240
super AI that has functional omniscience because of its ubiquity in the networks. And it has functional
01:40:23.000
omnipotence because of its mastery of, who knows, nanotechnology or something else. But boy,
01:40:28.520
we're both scared about a super AI. It's almost like super AI can't live with them, can't live without
01:40:34.420
them. How do we navigate those twin perils? And do we need to perhaps embrace a super AI as a
01:40:43.760
protective mechanism for democratized, super destructive power?
01:40:49.280
Yeah, well, I do think it really isn't a choice. I think we will develop the most intelligent machines
01:40:55.700
we can build unless something terrible happens to prevent us doing it. So the only reason why we
01:41:02.100
wouldn't build... The civilization gets thrown violently backwards.
01:41:05.140
Yes. I mean, so, you know, George Church loses his mind or one of his techs does, and we have some
01:41:11.420
pathogen that renders us incapable of keeping our progress going on the technology front. And you
01:41:18.880
just have to imagine how bad that would have to be in order to actually stop...
01:41:23.920
Yes. Yes. You know, you'd have to have a world where no one understood how to build a computer
01:41:30.040
again, and no one ever understood how to build a computer again going forward from that point.
01:41:35.160
So beyond A Canticle for Leibowitz type of destructiveness. Yeah.
01:41:38.620
So if it's not that bad, we will keep making progress. And you don't need Moore's law. You just need to keep going.
01:41:49.580
And at some point we will find ourselves in the presence of machines that are smarter than we are,
01:41:53.120
because I don't think there's anything magical about the wetware we have in our heads as far as
01:41:57.460
information processing. So the moment you admit that what we call a mind can be
01:42:03.320
implemented on another platform, and there's every reason to admit that scientifically now. And I leave
01:42:09.900
questions of consciousness aside. I don't know that consciousness comes along for the ride,
01:42:13.660
necessarily, if you get intelligent machines. And ironically, the most horrible vision is one of
01:42:19.640
building super-intelligent unconscious machines. Because in the presence of consciousness,
01:42:24.980
at least you could argue, well, if they wipe us out, well, at the very least, we will have built
01:42:29.400
something more important than we are. We will have built gods. We will have built minds that can take
01:42:35.380
more pleasure in the beauty of the universe than we can. Who knows how well the universe could be
01:42:41.640
inhabited in their hands. But if the lights aren't on, if we've built just mere mechanism that is
01:42:48.380
incredibly powerful, that can be goal-directed, but for whom there is nothing that it's like to be
01:42:54.440
directed toward those goals. Well, that really strikes me as the worst case scenario, because
01:42:59.320
then the lights go out if we... If we go out. So it sounds like you believe that the super AI
01:43:04.300
is inevitable, unless... Something equally terrible happens, yes.
01:43:08.960
So our best shot at surviving is to do all we can to make sure the super AI that one day
01:43:15.140
inevitably arises is benign. Yeah, it's aligned with our interests. Intelligence is the best thing we
01:43:22.240
have, really. It's our most valuable resource, right? So it is either the source of or the safeguard
01:43:29.140
for everything we care about. And there are overwhelming economic incentives for thousands...
01:43:35.620
Yeah, you get immediately rich. Intensely smart people, intensely well-capitalized companies to
01:43:41.840
go screaming down that path. Yeah. So all of the incentives are aligned to get into the end zone
01:43:47.360
as quickly as possible. And that is not the alignment we need to get into the end zone as safely as
01:43:53.780
possible. And it will always be easier to build the recklessly unsafe version than to figure out
01:44:01.680
how to make this thing safe. Yeah. So that's what worries me. But I think it is inevitable in some
01:44:07.100
form. And again, I'm not making predictions that we're going to have this in 10 years or 20 years,
01:44:11.040
but I just think at some point... And again, the human level bit is a bit of a mirage, because I
01:44:15.940
think the moment we have something human level, it is superhuman. Yeah. Oh, it blows past that.
01:44:21.200
Yeah. That's a mirage. And people are imagining somehow that that's a stopping point, that it will
01:44:25.640
barely get there and then we'll stay there for a long time. That could only be the case if we are
01:44:31.240
ourselves at the absolute summit of cognition, which just defies common sense. And we just know
01:44:37.760
that's not true. We just know it's not true. Just take, you know, the calculator in your phone. I mean,
01:44:41.200
that's not human level. That is omniscient with respect to arithmetic. Yeah. You know, and
01:44:45.960
having the totality of human knowledge instantaneously accessible through the internet.
01:44:51.600
I mean, if we hook these things to the internet, it has a memory that is superhuman and an ability
01:44:56.580
to integrate data that is superhuman. So the moment all of these piecemeal cognitive skills
01:45:03.820
cohere in a system that is also able to parse natural language perfectly, you know, that you can
01:45:11.700
talk to it and it understands. It does what you want. All of the answers to the questions are no
01:45:17.560
longer like Siri answers, where they contain, you know, howlers every third trial,
01:45:23.760
but they're the most perceptive, best informed, most articulate answers you're getting from any mind
01:45:30.460
you ever interact with. Once those gains are made, they won't be unmade. It's like chess. It's like once
01:45:36.280
computers were better at chess than people, you know, and now we're in this sort of no man's
01:45:40.780
land, which I think will be fairly brief, where the combination of a person
01:45:46.520
and a computer is now the best system. But at a certain point, and I'm amazed that anyone doubts
01:45:52.260
this, but at a certain point, I think it will obviously be the case that adding the ape to the
01:45:56.680
equation just adds noise to the equation. And, you know, the computers will be better
01:46:02.000
than cyborgs. And once they are, there's no going back from that point. It may not be everything.
01:46:07.720
There may be things we neglect to build into our AIs that turn out to be important for
01:46:13.020
human common sense. Or, I mean, this is the scary thing: we don't know what is required
01:46:18.140
to fully align an intelligent system with our wellbeing, you know? And so we could neglect
01:46:26.260
to put something like our common sense, because we don't perfectly understand it, into
01:46:31.740
these systems. And then you can get errors that are deeply counterintuitive.
01:46:38.660
I mean, this is analogous to, you know, Nick Bostrom's cartoon thought experiment of the
01:46:42.480
paperclip maximizer. I mean, it's like, well, who would build such a machine? Well, we wouldn't,
01:46:46.680
but we could build a machine that, in the service of some goal that is obviously a good one,
01:46:53.460
could form some instrumental goal that we would never think an intelligent system could form and
01:47:00.820
that we would never think to explicitly prevent. Yeah. And yet this thing is totally antithetical.
01:47:05.960
It reaches some local equilibrium where it says more paperclips, good. Yeah. Going to do that for
01:47:11.040
a while. Yeah. And soon the universe is paperclips. Well, Sam, you have been extravagantly generous with
01:47:17.640
your time. I appreciate it. Not at all. It's a pleasure. And thank you very kindly. And we will,
01:47:23.980
I'm sure, remain in touch. Yeah. Yeah. And I wish you the best of luck, needless to say,
01:47:28.540
with your book and the podcast and everything else. Thank you kindly. It's a great idea that
01:47:34.020
you're combining both in this way. I think, obviously, this is the frontier of creative
01:47:38.180
use of these new media, and it's great to see you doing it.
01:47:40.980
If you're enjoying the Waking Up podcast, there are many ways you can support it at
01:47:48.640
samharris.org forward slash support. As a supporter of the podcast, you'll get early access to tickets
01:47:55.780
to my live events, and you'll get exclusive access to my Ask Me Anything episodes, as well as to the
01:48:01.060
AMA page on my website, where you can pose questions and vote on the questions of others.
01:48:05.560
And please know that your support is greatly appreciated. It's listeners like you that make