#112 — The Intellectual Dark Web
Episode Stats
Words per Minute
178.49
Summary
Live from The Masonic in San Francisco, Sam Harris sits down with Eric Weinstein and Ben Shapiro to discuss the primacy of free speech, the breakdown of communal sense-making, identity politics, and the role of religion in grounding human values. This is the audio from a live event Sam did with Ben and Eric a few weeks before this episode's release, presented as the first episode of the new year. Despite fundamental disagreements about religion and free will, the three find considerable common ground, and the conversation ranges from the economics of institutional decline to whether morality can be rediscovered without revelation.
Transcript
00:00:00.000
Okay, well, for this episode, the first of the new year, I am presenting the audio of
00:00:24.060
my live event with Eric Weinstein and Ben Shapiro that we did in San Francisco a few
00:00:29.440
weeks back. To say that this audio has been much anticipated really is an understatement.
00:00:37.980
Ben has an enormous following online, and I have been hearing from all of you, mostly on social
00:00:45.600
media and in comment threads. I really haven't been sitting on this audio for any other reason
00:00:50.620
than I had many other podcasts in the queue, and I couldn't decently hold them for much longer.
00:00:56.820
But the time has arrived, and I just have a few things to say by way of preamble. I introduced
00:01:03.320
both of these guys on stage, so I don't need to do that here. Eric Weinstein, many of you know from
00:01:08.620
a previous podcast, he's always great. Ben, as many of you know, is quite a polarizing and controversial
00:01:16.340
figure. I got a fair amount of grief when I announced that we would be doing a podcast
00:01:22.240
together. Also, a ton of enthusiasm from his fans. Needless to say, I can't take responsibility for
00:01:29.680
everything Ben has said or written in the past. I'm certainly unaware of most of what he's said and
00:01:35.400
written, but I'm aware of enough to know that he has, like many of us operating in this space on these
00:01:45.020
topics, been unfairly maligned and demonized by his detractors. I think any comparison between him
00:01:53.600
and Milo Yiannopoulos is quite unfair, given the differences in how they operate. This is something I say
00:02:01.600
on stage at the beginning of this event, but that's a comparison that's often made. Ben and I disagree about
00:02:08.960
some fundamental things here, and it was really, I found myself in a situation which I often find
00:02:13.920
myself in on the podcast, where I have to play host and debate partner simultaneously, and I've begun to
00:02:22.540
feel that there really is no perfect way to split that baby, and I certainly didn't do it perfectly here.
00:02:31.980
More and more I try to err on the side of being the gracious host who is not going to let things get
00:02:37.520
bogged down. But the scientist and philosopher in me who can't let a bad point stand invariably flirts
00:02:45.800
with the ditch on the other side of the road. So you can decide whether I struck the right balance
00:02:51.780
here. Ben and I disagree fundamentally about religion and its connection to human values. We disagree about
00:02:59.180
free will. I tackled some of these points as they came up and let others go in the interest of not
00:03:06.080
getting bogged down. But I think Ben and I did about as well as we could here where we disagreed given
00:03:13.500
the nature of the event. But you be the judge. I should say that despite our disagreements, the vibes
00:03:19.700
with Ben were great. In the green room beforehand, afterwards, this was a very fun and collegial
00:03:27.500
experience for everyone. And I'm very grateful for Eric and Ben's participation, as well as to all of
00:03:33.780
you who came out for the event. We had a packed house at the Masonic in San Francisco. And from
00:03:40.160
what I can tell, most of you had a lot of fun. So without further delay, I give you the Waking Up
00:03:47.400
podcast live from San Francisco with Eric Weinstein and Ben Shapiro.
00:03:59.400
Ladies and gentlemen, please welcome Sam Harris.
00:04:24.660
Okay, well, sorry for the late start. I'm going to jump right into it because we have a lot to talk
00:04:30.580
about. I've got two guests tonight. And the first is a mathematician. He has a PhD from Harvard and he
00:04:39.040
has held research positions at Harvard and MIT and Hebrew University and Oxford. You may have heard him
00:04:46.820
before on my podcast. He is one of the most interesting people I have ever met. And honestly,
00:04:54.000
that's saying a lot. And he has, along with his brother, Brett, who I just did a podcast with last
00:05:03.660
night in Seattle, he has become one of the most valuable defenders of free speech in our time. So
00:05:11.080
please welcome, Eric Weinstein. Thank you for coming. And our next contestant is the editor in chief of
00:05:33.260
dailywire.com. He is the host of the Ben Shapiro show. So guess his name, which is the top conservative
00:05:43.520
podcast in the nation. He is the author of seven books. And he has been a nationally syndicated
00:05:51.380
columnist since the age of 17. Pity he got such a late start. He's a graduate of UCLA and Harvard Law
00:05:59.200
School. Please welcome Ben Shapiro. Thank you for coming. So we have a lot to get into here and
00:06:19.280
there are areas of agreement. I mean, many of you know who these two guys are and you know,
00:06:24.820
you can imagine the course we're going to chart here. I want to start with Ben because he's had
00:06:33.100
a truly unusual experience. And many of you may not be aware of just how unusual. And this will take
00:06:40.300
us into areas of agreement, Ben, where we definitely agree, which is around the primacy of free speech
00:06:45.860
and how strange our national conversation is on so many topics. So Ben is, if you don't know,
00:06:53.320
is the person who, when he goes to Berkeley, requires $600,000
00:07:01.660
worth of security to give a speech. We have a little bit less security here tonight. So please
00:07:08.180
behave yourselves. But it's a bit, Ben, what's been going on? What has it been like to be you in the
00:07:15.080
last two years? Confusing. It's, I've always been a little bit bewildered by the scope of the
00:07:22.820
opposition at these college speeches because I don't actually think that my message is supremely
00:07:27.520
controversial. It's pretty mainstream conservative. And yet when I show up on campuses at Cal State,
00:07:33.920
Los Angeles, there was a near riot. When I went to Berkeley, obviously they required a fair bit of
00:07:37.140
security thanks to Antifa. And when I was at DePaul, DePaul University banned me outright. They
00:07:43.540
threatened to arrest me if I set foot on their campus, even though students had invited me.
00:07:47.800
University of Wisconsin, they actually tried to pass a law banning the heckler's veto basically
00:07:52.360
after I spoke at University of Wisconsin. So I think it has far less to do with me than it does
00:07:57.200
with this kind of mood in the country that's so polarized and so crazed. And I would say with
00:08:03.700
regard to college campuses, it's unique to the political left. I'm not seeing a lot of it from the political
00:08:07.040
right. The political right certainly has its own problems at this point in time. But what's going on
00:08:10.760
on campuses is something that, you know, I've been speaking on college campuses for most of my career,
00:08:15.720
so 15 years. And only in the last couple have I needed security. The first time I ever acquired
00:08:20.060
security guards was last year. And now, you know, every place I go, I have to have security guards
00:08:25.320
when it's a public event. So. And you also, you're getting it from both sides in a way that's
00:08:28.960
completely surreal. Because so, for instance, you were often disparaged as a Nazi or a white supremacist.
00:08:34.580
And yeah, it's the yarmulke that gives me away on that one. Yeah, yeah. But even if you were not
00:08:39.960
going to notice the yarmulke, you were actually the most targeted victim of anti-Semitism in,
00:08:47.680
what, 2016? Yeah, among journalists on Twitter, anyway. Yeah. It is upside down. And you're also
00:08:52.880
often compared to your former Breitbart colleague, Milo Yiannopoulos, right? And that's an unfortunate
00:09:00.520
pairing. Because the reason why I wanted to talk to you is because, while I think you and I will
00:09:07.440
disagree about several maybe foundational things, I see you as someone who is sincerely defending a
00:09:14.420
position based on a rational chain of argumentation, based on first principles that we may or may not
00:09:20.060
share. But you're not a performance artist. And that's a crucial distinction. I mean, that's at
00:09:26.220
least what I'm going for, right? I mean, I've always thought that what I'm trying to do anyway is say
00:09:30.280
things that I think are true. And if they piss you off, they piss you off. But I'm not going in there
00:09:33.660
with the idea I need to piss somebody off to make a headline. That's why I've always found this a
00:09:37.340
little bit puzzling. Because there are provocateurs whose job it is to go into a university, say
00:09:41.320
something deliberately provocative, just to get a rise out of people and get a headline. And since
00:09:46.280
that really is not my MO, I've been sort of puzzled, frankly, by the level of opposition
00:09:50.220
on all sides. It was a very weird year. I mean, last year was a weird year. I had the alt-right
00:09:55.600
calling me a Black Lives Matter activist and Black Lives Matter calling me an alt-righter. So it was
00:10:00.640
a unique time. 2016, we're still living in a parallel universe in which Marty McFly
00:10:06.000
actually did not stop Biff from using the sports almanac.
00:10:11.780
One lens through which I want us to view this conversation is really two lenses. It's what
00:10:19.360
most worries you and what most interests you at this point. Let's start with the worry. Where
00:10:26.700
Well, I guess for me, I've tried to localize my concern with the breakdown of what I call
00:10:35.140
semi-reliable communal sense-making. If something happens...
00:10:41.920
My Twitter follower count is orders of magnitude below yours. The idea being that if something
00:10:52.660
happens and everybody in the audience processes it, we will fall into certain clusters. And
00:10:59.700
those clusters are fairly reliable and dependable. And so to Ben's point that he is both Black Lives
00:11:05.920
Matter and Alt-Right in this Schrodinger superposition. So what is that? And it has to do with the fact
00:11:16.020
that traditionally we've used institutions to guide our sense-making and to make sense of things
00:11:23.200
collectively. And that has now gone away. And so depending upon what institutions I'm hooked up to,
00:11:30.520
what was my last... Where did the fox last have the scent? I can be at odds with somebody I love,
00:11:38.440
somebody who I've thought about as somebody I've shared a life with, because there's no longer any
00:11:43.700
way to do this communally. And the semi-reliable part, I don't think that Walter Cronkite was actually
00:11:48.200
always telling us the truth. But it was in some sense, to first approximation, close enough that
00:11:54.160
there was a national consciousness belief structure. There was enough shared
00:12:00.000
sort of complex for us to function as a country. And I think that that's gone away. So I think
00:12:07.620
this is the parent of the crisis, which I increasingly think of as this, you know, I call it the no-name
00:12:15.480
revolution or the N-squared revolution, where in some sort of new regime, which doesn't look like
00:12:21.400
any revolution we've seen before, it's much less physically violent so far. It's digitally
00:12:27.180
extremely violent. And it has to do with the fact that we can't make sense of things communally
00:12:34.760
And what are the ideas or sets of ideas that you think are most culpable for bringing us here?
00:12:41.540
Well, it's tough. I think that what really happened, if we think about it historically,
00:12:47.640
is that we had this beautiful period of fairly reliable, high, evenly distributed, technologically
00:12:55.500
led growth after World War II, up until about, let's say, 1970. And we predicated all of our
00:13:02.600
institutions on this expectation that this growth was some sort of normal thing that we could depend
00:13:08.420
upon in the future. When it ran out, we were left with all of our institutions looking in some form or
00:13:14.620
another like Ponzi schemes. And in order to keep running an institution that expected growth
00:13:20.180
in a steady state condition, let's say, you need to change the narrative to create some sort of as-if
00:13:26.840
growth story. So you start stealing from the future. You start stealing from groups that are too weak to
00:13:32.560
defend their own economic interests so that certain slices can keep growing, even if the pie doesn't
00:13:37.420
grow so well. There are certain areas that kept growing, like communications and computation.
00:13:42.780
So there was some real growth, maybe fracking. But that in general, what we have is we have a bunch
00:13:48.640
of institutions that used to be capable of honesty that had to develop narratives. And that the problem
00:13:54.800
is we've had as-if growth across the spectrum for most of our adult lives. And that story, which is a
00:14:02.920
You're not one of these anti-capitalists I've been hearing about, are you? You're the managing director
00:14:17.540
I mean, so you actually think economics is the longest lever here that's influencing the machine?
00:14:23.680
I mean, like this breakdown of our, this failure of polite conversation to get us to converge
00:14:31.280
Well, if you chase it all the way up the chain, I mean, markets are in some sense the
00:14:35.440
continuation of natural and sexual selection by other means. And so what you have is that
00:14:43.280
That sounds like a very creepy come-on line from a conference.
00:14:47.020
Yes. The night is young. I don't know how to record.
00:14:55.360
No, sorry. That's right. This is, let me just acknowledge, I'm actually a bad podcast host.
00:15:02.660
You're not with an expert here. You're a good sport.
00:15:05.660
Yeah. I think that we don't realize that when we look at the city, that nobody is telling
00:15:10.460
people where to drive, what to do. It's sort of self-organized with the markets being this
00:15:15.940
kind of invisible fabric that keeps us together. And so, yeah, it's really important that when
00:15:21.340
growth stops proceeding at the levels that it's expected, people can't form families in
00:15:27.600
real time. So fertility is threatened. People can't plan for, for coupling and for a future.
00:15:35.080
So I think it gets right into the most intimate details of our lives when, when the markets don't
00:15:47.260
Well, I worry a lot less about economics as the basis for social collapse. I don't think,
00:15:52.460
I think it's, it's easy to, to overstate the extent to which growth has stagnated. I mean,
00:16:00.060
we are at 4% unemployment. The, the economy is not, I mean, this is not 1935. This is not
00:16:08.160
even 1975. There's still, you know, significant economic growth. To me, it seems like the social
00:16:15.100
fabric has been completely ripped apart. And some of that is due to social media and the fact that
00:16:19.860
we coordinate with each other in a different way. But I think a lot of it has to do with loss of
00:16:23.300
common values, like even the ability to have a common conversation. In order to have a conversation
00:16:27.260
with one another, we have to take certain things for granted, like human reason, like objective truth.
00:16:32.140
If we don't take these basic things, at least for granted, then how are we even speaking the same
00:16:36.300
language? And it seems to me that a lot of those things have disappeared in favor of radical
00:16:39.840
subjectivism that may make us feel good, but doesn't provide the common framework for a
00:16:43.480
conversation. And objective truth goes by the wayside, because if we can't agree on the fact,
00:16:48.620
how are we going to have a conversation? You see this, particularly in our politics,
00:16:51.400
where it seems like there's two bubbles that have been created. And if you read Huffington Post,
00:16:55.240
you are in a completely different world than if you read Breitbart. And my mom actually first
00:16:59.920
noticed this in 2012, because she said, you know, I was working at Breitbart at the time,
00:17:02.940
and she said, well, it looks like from Breitbart, Romney's definitely going to win. I was like,
00:17:05.740
yeah, he's definitely going to win. And she said, and then all my friends at work read Huffington
00:17:09.080
Post, and they say that Obama's definitely going to win. And I don't know who to believe. And I
00:17:13.080
said, well, I really don't know who to believe either, because no one knows the answer to that
00:17:15.900
question. But you can see that it's broken down in incredibly radical ways now, because even things
00:17:21.040
where there should be a common basis of fact, people are disagreeing on, right? To take the Senate
00:17:24.940
race in Alabama, right? There's pretty good, reliable accounts that the Republican candidate in that race
00:17:32.340
is likely guilty of some form of sexual abuse of underage girls. And a huge percentage of the
00:17:39.580
Republican base, you know, my party, my group, a huge percentage of them will outright deny that
00:17:44.400
that's the case, because they'll say this is a witch hunt, people are out to get Roy Moore,
00:17:47.840
it's a conspiratorial attack on Roy Moore. So that's one example from the right. Then on the left,
00:17:52.880
you'll have examples where, you know, you will say things that are biologically true,
00:17:58.940
take a controversial example, like, there is a male sex, and there is a female sex. And if you
00:18:03.500
say that, then people will lose their minds, because you're somehow insulting their subjectivity.
00:18:08.100
And, you know, when you do that, it's hard to have a conversation, because people will change
00:18:13.960
the terms they're using, they'll change the frame of reference they're using. And how are we, and then
00:18:18.320
they'll toss reason out altogether, they'll say, you know, your specific bias as a person prevents you
00:18:22.760
from even having a reasonable conversation, right? Your white privilege, or your background,
00:18:27.240
or your ethnicity, or all of this prevents us from even discussing on a one-on-one level. Like,
00:18:32.440
I can recognize my background in having an impact on how I think. But if that is supposed to be a
00:18:37.780
conversation stopper, then how exactly are we supposed to have a conversation?
00:18:42.420
Yeah, so that's why identity politics is so toxic, in my view, because if identity is paramount,
00:18:49.780
communication is impossible. Exactly. Like, because you haven't shared my specific experience,
00:18:54.720
or because you don't have the same skin color, you're not the same gender, there's no bridge
00:18:58.520
between us, right? And you're, and there's no chain of reasoning from you to me that should trump
00:19:05.020
anything I currently think, because what I think is anchored to identity. Exactly. And we don't share
00:19:10.420
an identity. Yeah, well, we're atomized individuals kind of bouncing off one another as opposed to being
00:19:13.900
able to form some sort of molecular bond. Yeah. And I think that that's completely, it seems like
00:19:19.720
that's completely collapsed. Right. Right. And is, do you think social media is the main engine of
00:19:26.100
that collapse? Or is it just where we were headed there anyway? I mean, obviously Fox News and the
00:19:32.000
fragmentation of media precedes social media. So we had our echo chamber. Yeah, I mean, I really don't
00:19:36.780
think it's social media. And there was a, there's a study that came out from, I think it was Harvard
00:19:39.680
actually, reported by the New York Times, talking about how the impact of social media on polarization
00:19:44.520
is overstated, that if you look at the most polarized populations in the country, it's actually older
00:19:47.640
people. That people who are older are more polarized politically and are having fewer conversations
00:19:52.940
with people on the other side of the aisle than younger people. And younger people are obviously
00:19:57.000
more apt to use social media. I really don't think it's that. I think that there is a ground shift
00:20:01.260
in the way people think that's taken place even within our lifetime and has gained steam. And as I say,
00:20:09.140
even basic concepts like reason are being thrown out in favor of a philosophy of feeling because
00:20:15.900
maybe it does come down to lack of success for people. Maybe people do feel that they can't
00:20:21.360
succeed in other ways. And so the way that they feel fulfilled, the way that they feel whatever
00:20:24.780
need they have for fulfillment is by wallowing in themselves. If I can't find fulfillment in the
00:20:31.360
outer world, then I will look inside me and I will look at what makes me special. And we've all been
00:20:35.700
taught that we're special by Barney. And therefore, since we are all special, then you saying anything
00:20:40.660
that disagrees with me is taking away my specialness. And that can't be infringed upon.
00:20:45.180
You can actually try to look at the history of these ideas. Like, for example, you mentioned
00:20:49.560
white privilege. And I, at some point, tried to track it down. And there's some two-page,
00:20:55.020
it's not even an academic paper, "Unpacking the Invisible Knapsack," in the late 80s coming out of Wellesley.
00:21:00.820
Or, you know, intersectionality comes out of apparently UCLA Law School. A lot of these ideas
00:21:06.320
actually began as kind of minor, interesting ideas, heuristics, that couldn't support an entire
00:21:13.980
epistemology. And what happened was, is that you had some sort of vaguely approximate concepts
00:21:20.380
that got pushed so far beyond their domain of applicability, that they led to a kind of madness
00:21:28.120
when they became sort of the substrate for thought. You can't really have conversations where,
00:21:34.500
you know, white privilege is a barrier. If Ben has a drinking problem, and I have a gambling problem,
00:21:41.420
we may not be able to understand each other's addictions directly. But if I think about Ben's
00:21:48.160
problem... I asked you not to talk about that publicly.
00:21:56.580
The issue is that this idea of being able to hack empathy and hack understanding by using
00:22:03.360
our own personal experiences, our lived experience, to use the jargon, and the felt experience,
00:22:09.560
in order to empathize across these dividing lines, shows this incredible failure of imagination.
00:22:16.220
It's as if there was no screenwriter who was able to write both male and female characters
00:22:20.580
that men and women, you know, identify with. And so I think it has to do with pushing
00:22:26.440
interesting but very limited heuristics so far beyond their domain of applicability.
00:22:32.940
You can track each one of these things using Google n-grams to find out where they came from.
00:22:37.940
Right. It seems to me that we're struggling, and it's not just us, all of us are struggling
00:22:42.480
to find a way to capture meaning and value in the context of a rational worldview.
00:22:50.760
And I think that is a challenge that just doesn't go away. That is a kind of a perpetual challenge
00:22:57.080
insofar as we understand the situation we're in. We need to find ways of talking about that
00:23:05.020
so as to converge on a basic life plan with seven billion strangers. And the one difference
00:23:12.260
between us is what we think the value of religion is in that picture. So just to get a little bit of
00:23:19.120
the context here, you're an Orthodox Jew. What does that actually commit you to with respect to
00:23:24.540
belief? I mean, what do you believe that I don't believe that is salient here? I'm an atheist, so.
00:23:33.060
That gives you a clue, yeah. I hadn't picked up on that, but it's going to be so awkward now.
00:23:40.980
Well, I mean, I believe in a creator of the universe. I believe that he set certain guidelines
00:23:51.840
for human behavior, that he cares what happens to us. I believe that he endowed us with,
00:23:58.460
in the American sense, certain inalienable rights that accrue to us by virtue of being human.
00:24:05.020
You know, from a Judaic perspective, which doesn't really impact public policy so much,
00:24:10.100
one of the reasons that I think we can have a conversation is that when it comes to public
00:24:13.260
policy discussions, I try as little as possible to refer to biblical texts, which means I almost
00:24:18.180
never do. Mainly because what would an appeal to authority that you don't believe in do? I mean,
00:24:23.880
it's a waste of time. So in the areas where I think we can actually have a conversation,
00:24:28.040
where we're not talking about the value of kashrut or keeping Sabbath, which I think has very little,
00:24:32.380
you know, relevant input for public policy and the kind of social fabric building that we're
00:24:36.960
talking about doing. The stuff that I think is important where we disagree is man being made in God's
00:24:42.720
image, taking the premise on faith that God created us with certain inalienable rights,
00:24:50.200
endowed us with the capacity to choose, endowed us with the capacity to reason,
00:24:54.320
and cares about what happens to us. Right. So...
00:24:59.380
Not sure if you can say right any more cynically there, but...
00:25:09.160
So, I mean, so what I'm interested in is in a worldview that could be rebooted or rediscovered now. I mean,
00:25:20.060
just imagine we lost all of our, you know, we had all the libraries burned,
00:25:23.560
the internet went down, we lost all of our texts. How would someone rediscover this thing? Now, I can,
00:25:30.820
we can make an easy case that we could rediscover science. You know, it might take some time, but
00:25:35.620
if the literature of Judaism, in your case, were lost,
00:25:41.220
it seems to me patently obvious that whatever is true about reality is still there to be discovered.
00:25:46.940
And if there's some part of reality that is ethical or spiritual or divine or spooky,
00:25:55.300
it's there, it is there to be discovered by sentient creatures such as ourselves. So what would,
00:26:00.780
how would you reboot religion, the religion that's true? Because you are by accident born a Jew.
00:26:08.160
Right. And there's, you know, there are a billion people in India who weren't.
00:26:11.120
And I must imagine that on your account, they have by sheer bad luck, the wrong version of this
00:26:17.400
story. Well, I mean, so Judaism is actually not quite as exclusive as a lot of other religions
00:26:21.460
with regard to this. I mean, Judaism actually says that as long as you fulfill seven basic
00:26:25.420
commandments, like don't kill people, don't steal, don't eat the flesh of a living animal,
00:26:29.160
that you actually have a pathway into heaven. So Judaism is not particularly exclusive. And we
00:26:34.340
actually try to discourage converts. So it's not quite the same as some of the other converting
00:26:39.160
religions in monotheism. But as far as what's discoverable, I would agree with you. If the
00:26:44.020
Torah were to disappear tomorrow, it would not be discoverable, which is why there is a necessity
00:26:47.660
for revelation in the Jewish view. Right. The idea is that revelation was necessary, not that
00:26:52.340
revelation was unnecessary and that if people had not been graced with revelation, they would have
00:26:55.700
come to this on their own. But the principles you just gave me, you don't think those are
00:26:59.520
discoverable? Those are discoverable. Right. And that's the reason why I say that I think that the
00:27:03.700
principles that are granted through revelation are not necessarily, I think that they caused a
00:27:10.600
ground shift historically from certain ways of thought to other ways of thought. Like the advent
00:27:15.200
of Judeo-Christian thought changed the way of thinking. But I think that they are also things
00:27:20.580
that you can discover through contemplation, for example. So all of the things that I said about
00:27:25.860
free will and reason and the presence of an unmoved mover, that's more Aristotelian than it is
00:27:31.480
Judeo-Christian. Right. Right. And that is stuff that was essentially discovered through
00:27:35.560
philosophy, not through revelation. So that is the stuff when I talk about the necessity for
00:27:41.460
reason, that's the stuff I think that is more relevant. Now, I think that you do need a religious
00:27:47.000
system in order to inform people who are not going to sit around philosophizing all day
00:27:51.120
what are good and bad modes of behavior. And, you know, Voltaire thought the same. So I think that
00:27:58.280
the notion of a dual... But is it important to believe that those good and bad modes were approved
00:28:05.900
of or discouraged by an omniscient being? I mean, can't we just chart a course toward greater
00:28:12.500
fulfillment, greater peaceful collaboration based on just an intelligent analysis of what it is to be
00:28:17.600
social beings? So I don't think you can unless you're willing to acknowledge that reason,
00:28:22.700
the capacity to choose, the capacity to act in the world, that these things exist. And that has to
00:28:30.600
be done based on assumption because you actually oppose some of these things, right? Like you don't
00:28:33.420
think free will exists. Yeah. But I also don't think you need free will to live a moral life.
00:28:38.000
Right. I've never really understood that position. So we'll have to get into it.
00:28:40.320
But, you know, to me, if you're going to have a conversation with someone and convince them,
00:28:45.940
then we need to agree on the value of reason. The value of reason is not something that
00:28:48.720
evolutionary biology suggests, right? What does reason have to do with evolutionary biology per
00:28:53.600
se? It's a mode of action that is more likely to preserve your species. It doesn't create
00:28:58.380
objective truth. The notion of an objective truth that exists apart from you and would exist
00:29:01.940
whether or not you were living. This is not something that can necessarily be gathered from
00:29:06.380
science alone, right? You have to make certain assumptions about the universe and the way that
00:29:09.620
your mind reflects what is present in the universe, right? As Kant would argue.
00:29:13.240
Well, it's true that an evolutionary perspective on ourselves suggests that we have not evolved
00:29:22.240
to know reality perfectly. I mean, if you believe that we are apes that have been selected for
00:29:28.100
and all of our cognitive architecture is built by virtue of its adaptive advantage in evolutionary
00:29:34.800
terms, yes, it's hard to believe that we are perfectly designed to do mathematics or anything
00:29:41.740
else that is true. But you do feel that we can still gather objective truth.
00:29:45.820
But even that picture suggests a wider context of minds more powerful than our own that could
00:29:53.980
have evolved or our own future minds. I mean, it's like there's no...
00:29:58.340
Why would you appeal to minds that have not yet evolved or future minds as opposed to just a
00:30:01.800
creator who put us here with certain capacities?
00:30:03.820
Well, no, because that, I would argue, we don't have any evidence for. What we do have evidence
00:30:08.940
for is that we're here. We understand a lot about the mechanism that is operating now that got us
00:30:14.860
here and that is causing us to be the way we are. We can see our relationship to other life forms. We
00:30:21.080
know that we can look at chimps that share 99% of our DNA and they obviously share a lot of the
00:30:27.500
evolved precursors of our own social and cognitive architecture, but they have no idea what we're up to,
00:30:34.380
right? So they're cognitively closed to most of what we're doing and most of what we care about.
00:30:39.180
And by analogy, we know that we could be cognitively closed to what we might be capable
00:30:45.460
of in a thousand years from now. I mean, our sense of what engagement with the cosmos promises...
00:30:50.820
I know, but I guess the argument is if you're arguing that we're cognitively closed to certain
00:30:55.320
things, then why are you arguing which specific things we are cognitively closed to?
00:30:58.080
Well, no, I'm just saying that once you admit it's possible to not know what you're missing,
00:31:03.640
factually, ethically, spiritually, I mean, in any domain of inquiry, it's possible to come up
00:31:10.280
against a horizon line where the known meets the unknown.
00:31:14.980
Well, you wouldn't be the first to say it, but it's clearly possible not to know what you're missing.
00:31:28.140
But I mean, if you kill the hundred smartest mathematicians on earth right now, you would...
00:31:34.440
You would close the door to certain conversations maybe for 200 years.
00:31:39.460
But so, again, by analogy, it would be just sheer hubris to think that the 7 billion of
00:31:46.760
us who are currently here collectively or anyone individually have pushed the human conversation
00:31:53.900
to the limit of what's rationally apprehendable about the universe.
00:31:57.440
So we know there's more out there in every sense.
00:32:09.540
I'm going to have to have you over for Sabbath.
00:32:10.900
But from the atheist perspective, or from the perspective of not being convinced of any
00:32:16.720
religion, this is what's so limiting about this notion of revelation.
00:32:23.040
Because what you have, you're anchoring a worldview to a book that we know, just by the
00:28:30.200
time of its composition and by its actual contents, can't subsume the larger worldview that we're...
00:32:40.700
Because the argument that I was making was based on an Aristotelian philosophical view of an
00:32:44.440
unmoved mover and certain properties that we have to have as human beings in order to create a...
00:32:50.060
And you're arguing back to Revelation, which I freely admitted, that if Revelation were to be
00:32:53.660
destroyed tomorrow, I could not recreate the Torah from memory.
00:32:56.760
It's not a matter of not being able to recreate it.
00:32:58.400
It's just, what is its importance apart from being one among millions of books that have been written?
00:33:06.380
Well, I mean, the importance of Judeo-Christian revelation in our particular context is it
00:33:10.840
is the creator of the entire chain of events, or it is at least the progenitor, along with
00:33:16.380
Greek thought, largely, of an entire chain of events and thought that led to the establishment...
00:33:23.600
Well, no, but that's, again, that's a set of historical contingencies that are...
00:33:32.120
I mean, my argument here is that you could also say that virtually everything that has
00:33:36.220
been accomplished in human history was accomplished by people who didn't know a damn thing about molecular biology...
00:33:45.220
...every beautiful building that was built was built by somebody who knew nothing about molecular biology.
00:33:50.640
But that's not an argument that ignorance of molecular biology is a good thing or that it should be...
00:33:56.340
And I'm not arguing that ignorance is a positive.
00:34:01.280
Well, I would say that any kind of religious sectarianism is a species of ignorance now that...
00:34:09.060
And that's, again, an assumption that you're making based on premises that I don't necessarily...
00:34:16.800
But on your account, the Hindus have to have it wrong.
00:34:20.220
I mean, they're worshipping an elephant-headed god and a monkey god and, you know, I mean...
00:34:30.540
I mean, I do think that the Hindus are not correct.
00:34:37.980
If you're going to go to Aristotle and you're going to go to seven precepts that anyone could
00:34:41.640
discover so as to lead a well-ordered life, what is the significance of being Jewish?
00:34:46.400
So, the significance of being Jewish is that even the foundations of what Aristotle believed,
00:34:51.600
that he's trying to arrive at logically, have to
00:34:56.080
be undergirded by a faith in a god who also provides us some level of moral guidance.
00:35:01.460
Because even the precepts of Aristotle are too broad to actually create the civilization...
00:35:09.440
This is a Greek-slash-Judeo-Christian civilization.
00:35:11.480
It's the Athens and Jerusalem in the typical phraseology.
00:35:13.940
And if you just knock out the pillar of Jerusalem, then you're ignoring the impact that Jerusalem
00:35:19.220
has on Athens and that Athens has on Jerusalem, historically speaking.
00:35:22.780
Well, this is kind of reminding me of the moment when I debated Rick Warren once at Saddleback.
00:35:29.920
It was just the two of us and John Meacham who was moderating.
00:35:33.160
And he was telling me that basically without God, you know, people would just be raping and
00:35:38.320
killing and that you require this as an anchor for an ethical life.
00:35:41.840
And he even said of himself, and I mean, I don't believe this when anyone says it, but this is what he said.
00:35:47.940
He said of himself that if he didn't believe there was a hell, he would be raping and killing.
00:35:54.620
That's actually not something that I fully agree with.
00:36:07.080
But what I do believe is that a scientific materialist worldview cannot construct a moral
00:36:17.140
system because "is" has nothing to do with "ought."
00:36:19.940
Science is about "is" and has no capacity to say anything about "ought," other than constructions
00:36:25.680
that are based in a notion of free will that you yourself reject.
00:36:32.000
You know, time is short, but I've written two books on those two...
00:36:37.180
But if that were true, how would you explain the moral character of my life?
00:36:43.100
I mean, assuming I'm not raping and killing people or living a very...
00:36:51.060
I mean, as I just said moments ago, I don't think that you have to be a religious person to be a good person.
00:36:57.480
I do think that there has to be a religious underpinning to a moral system because I don't...
00:37:03.400
You're using terminology that is based in certain assumptions about human nature that I'm not
00:37:07.480
sure that you are recognizing that you reject, right?
00:37:11.620
Let's take the scientific materialist worldview at its very base, okay?
00:37:14.480
At its very base, we are basically balls of meat wandering through the universe with a bit...
00:37:19.000
We're sort of Spinoza's stones that have been thrown, and we know that we've been thrown.
00:37:27.400
First of all, many people who would take an evolutionary picture of ourselves also imagine...
00:37:33.600
I've never understood that perspective, to be honest with you.
00:37:35.480
I'll put the free will piece in play here because actually I think there are moral insights
00:37:40.520
we can have when we see through the illusion of free will, which we really can't easily...
00:37:56.560
Well, I mean, I think one of the problems, one of the problems is that in some very weird
00:38:01.760
way, because Ben is wearing a kippah, we think of him as being very orthodox, pious, and religious.
00:38:08.220
In fact, I'm always struck by just how much he eschews any appeal to text in his public...
00:38:16.120
So for functional reasons, I very often see him in a largely atheistic context.
00:38:22.160
I find, Sam, that you're always focused on what is, to my way of thinking, very clearly...
00:38:32.580
And that really does sound anti-Semitic somehow.
00:38:38.940
I'll have to ask my rabbi how I just got insulted.
00:38:42.680
I don't know, Sam, how much are you being paid tonight?
00:38:46.440
And, you know, as much as I take a scientific worldview, I find that if I'm really honest
00:38:52.260
with myself, I certainly have a lot of dialectical tensions that I can't resolve, needs for meaning
00:39:02.000
that I can't find easily met within the rational systems.
00:39:06.880
I think that the "is" and "ought" distinction is a good one.
00:39:10.700
I think a lot of this has to do with pre-existing architecture that predisposes us, even though
00:39:17.380
our rational minds may know better, towards something that functions very much in an as-if
00:39:24.120
Well, let's just take "is" and "ought" for a second, because here's one way those two things collapse.
00:39:29.180
If understanding how the universe is altogether, you know, all the possibilities of experience,
00:39:36.420
all the ways minds emerge, all of the kinds of good lives and bad lives and all of the mechanisms
00:39:42.160
that would determine one path over another, a complete understanding of the mind and the
00:39:48.460
cosmos, that's all the is, all the is that is there to be understood.
00:39:54.220
If understanding that couldn't give you a complete picture of what you ought to do, where would...
00:40:04.300
If you sum all the facts, how does that not give you a way to chart your course in this universe?
00:40:15.260
Well, there are these things that we notice in our minds, you know, that run
00:40:20.980
through our fingers like quicksilver, that aren't exactly facts: these intuitions, these things
00:40:26.060
that gnaw at us even though we know the answer; we feel superstitious, we feel guilt.
00:40:32.100
You know, economists talk about utility as a one-dimensional object, but how many kinds of utility
00:40:38.160
and dis-utility are there? I can be happy, I can be interested, I can be fulfilled, all these different
00:40:44.040
ways of tagging utilities and dis-utilities. And if you just notice your mind,
00:40:49.320
you'll notice that there are all sorts of things going on in it that really aren't about,
00:40:54.500
aren't about facts. And I don't know where they originate, neither do you.
00:40:58.680
But see, but just translate what you're saying, I mean, how I'm hearing what you're saying,
00:41:02.140
you're telling me facts about the mind, which I agree with. I mean, there's kind of a congress in there.
00:41:08.020
I mean, you guys decided that there was an objective reality when you were having that
00:41:11.780
conversation. And I suppose that there's probably objective reality, but I think that a lot of what
00:41:20.220
goes on is that we've been in the shallow end of science where more or less, you know,
00:41:26.820
me and let's say this gentleman over here share enough that we can probably agree that the square
00:41:35.640
root of two is provably irrational. I believe that that's probably an objective fact, but I don't
00:41:40.360
believe proof checking is objective because we have things like the Amabi problem that sit in the
00:41:44.900
literature for years and we think it's proved, but it turns out we didn't have the right proof,
00:41:49.360
you know? So we have situations in which we've been picking low-hanging, easy fruit to console
00:41:55.580
ourselves that we can all get at the objective reality. We've all seen optical illusions where,
00:42:01.640
you know, some color is exactly the same wavelength, but it looks two different ways because of the
00:42:05.480
surroundings. But so that's a great example. Let's linger there for a second. So again, we thought we
00:42:11.980
knew what we were talking about and then we find out at a deeper level that we didn't. And then we
00:42:15.580
think we know what we're talking about again and then it can reverse again. But that move to the
00:42:19.760
deeper level is more facts. It's more context. It's more objectivity. Right. But we already agreed
00:42:26.400
on something that turned out not to be true as objective fact. And so the point is that
00:42:32.300
I'm not entirely sure in any of these, like if I take this irrationality of the square root of two,
00:42:38.600
there's a concept called not worth worrying about. You know? It's just not worth worrying about whether or
00:42:46.140
not somebody is going to find a mistake in that proof because it's so short. You know, when it
00:42:50.320
comes to something like the ABC conjecture, you know, it's been going on for how many years? We
00:42:54.800
still haven't, you know, gotten our arms around it. We're now not in the shallow end quite so much.
00:42:59.800
And so my concern is that it doesn't do a lot of damage to say we can prove that the square
00:43:06.180
root of two is irrational and that that's an objective fact up until you start trying to extend that,
00:43:12.700
you know, to more and more complicated proofs. And, you know, then it actually matters that
00:43:19.340
the original concept was that the proof may exist out there, but proof checking isn't objective. And
00:43:23.860
therefore we may never exactly know, but there are things that aren't worth worrying about. We call
00:43:28.080
them objective fact for convenience. Sorry. Let me make what I think is an objective
00:43:34.880
claim of fact, one that you won't agree with, Ben, and that I think has moral
00:43:40.960
consequence that we should grapple with. And it connects to a very real-world
00:43:47.220
issue like wealth inequality, right? So wealth inequality is a problem if you think it's a
00:43:52.640
problem, or it's inevitable if you think it's inevitable, but I think everyone would
00:43:58.020
agree that some level of wealth inequality would be intolerable and that we would want to correct
00:44:02.180
for it. But wealth inequality is just one kind of inequality. There's every other kind of
00:44:06.720
inequality. And there's this fact, and this goes to the free will issue.
00:44:13.440
What we imagine is that people have a certain inheritance:
00:44:18.980
they have their genes, they have their environments. You know, you didn't pick the fact that you
00:44:23.280
weren't born yesterday in Syria. You were born in a stable society. We don't own,
00:44:30.060
we can't truly own all of our advantages. We didn't make ourselves, but most people feel that
00:44:35.320
there's something like a ghost in the machine that has free will that can make the best of even a
00:44:42.960
bad situation. Now, I think you probably agree that some situations are so bad that, you know,
00:44:46.980
that that can be so stacked against you that, you know, life is just unfair. I mean,
00:44:52.660
here are claims about you that I think are true and should be
00:44:58.540
morally salient. You didn't make yourself; you didn't determine anything about yourself that you
00:45:07.200
would use as an exercise of your own free will. So you're very intelligent, you're very
00:45:12.320
literate, you're very interested in things that get you ahead in all the ways you've gotten
00:45:16.940
yourself ahead. You didn't create that about yourself, right? And obviously there's a
00:45:22.580
genetic component to that. There's an environmental component to that. Maybe there's just,
00:45:25.980
you know, cosmic ray bombardment that can help or hurt who knows what, what influences are there,
00:45:31.120
but none of that is something that you have authored. And that's true of everyone in the
00:45:36.360
room. You have exactly the disposition you have, the effort you have. If you wake up tomorrow morning
00:45:41.720
with twice the capacity for effort and grit that you had yesterday, you won't know where that came
00:45:47.220
from. If it comes from a book you read, you can't determine the fact that the book had
00:45:52.100
precisely the influence it had and not a little bit less or a little bit more. Well, you are part
00:45:56.100
of a system of influences. And so this is a picture in my view that just makes a mockery of the notion
00:46:01.120
of free will, right? And it goes down to the smallest possible case of, you know, my getting to the end
00:46:06.940
of the sentence, right? It's just, you know, like if I have a stroke now, well then, you know,
00:46:10.640
sorry, I can't do it. But I didn't, and I didn't choose that either. So I think
00:46:18.760
taking that on board does not rob us of morality, because we still have a preference
00:46:25.940
between an excruciating plunge into civil war and needless misery and building a viable global
00:46:33.140
civilization where the maximum number of people thrive. So you're using a lot of active verbs for
00:46:37.920
a person who is a product of environment and genetics. Well, no, but it's
00:46:44.140
all happening. We can build robots that act, right? And I'm moving my hands
00:46:50.020
now, but I honestly don't know how. But is the robot moving the hands? I mean, the point
00:46:54.760
that I'm making is when you say we can discern, we can build, we can create, you know,
00:47:00.240
we can decide. But it's exactly like you speaking now. You don't know how
00:47:05.240
you follow the rules of English grammar. I'm not arguing that you can't make a convincing case
00:47:08.980
that I don't have free will. I'm arguing that you can't make a convincing case you can build
00:47:11.760
a civilization on lack of free will. Take this case. I mean, the moral relevance of this, and
00:47:16.800
Eric, I'd be interested to know if you agree with this. It seems to me that once you admit that you
00:47:22.360
either won the lottery or you didn't on some level, that conveys a kind of ethical commitment or an
00:47:29.500
ethical obligation that you wouldn't otherwise have. You can't be the person who then says
00:47:34.000
everyone is basically on their own. You either make it based on your
00:47:38.940
effort or not. I mean, this goes to questions of, you know, should we have universal healthcare?
00:47:44.060
It's not just a question with an economic answer.
00:47:46.240
You're going directly from "is" to "ought" with no stops on the train at all.
00:47:49.560
Well, no, it's just that for literally decades, there were very wealthy and very
00:47:55.600
sophisticated countries that took the premises that you are building upon and built some of...
00:48:00.400
Well, no, but the idea that... But they had other things going on. They...
00:48:09.260
But the point that I'm making is that you are making definitive statements about
00:48:12.860
value judgments with reference to a naturally selected interaction of biology and environment.
00:48:22.600
I just don't know how you're getting from one to the other.
00:48:26.680
I mean, do robots have morality is what I'm asking you.
00:48:28.460
Well, no, they certainly would if we built them to have conscious states that they could...
00:48:36.720
So then we can be God, but God can't make us those kinds of robots is the argument.
00:48:40.740
Should we maybe try taking the fun out of this?
00:48:44.820
So, you know, one possibility is that there's like a layer cake and at the bottom you've got,
00:48:49.800
you know, quantum field theory and then you get organic chemistry and you build this
00:48:53.840
thing up and you've got natural and sexual selection, then you get, you know, systems...
00:48:59.900
And there's some sort of weird category error between the layers of this cake.
00:49:05.260
So it may be that if you can get rid of quantum indeterminacy that you have effectively Laplacian
00:49:13.500
determinism and everything is a product of initial conditions.
00:49:17.140
And that takes place at the lowest level, but there's no morality at the level of,
00:49:22.480
you know, excited fields and electrons and quarks.
00:49:25.820
So, you know, you don't pair that observable, which is like, you know, that quark is being
00:49:32.240
unethical right now, with some behavior, which, you know, affected whether...
00:49:39.780
So that morality thing has to do with this very high-up layer, which is some sort
00:49:45.820
of social organization, which is not fundamental.
00:49:50.680
And so what I hear us doing is talking about free will down here and talking about morality up there.
00:49:57.240
And, you know, one of the lessons of physics is that every layer of the cake
00:50:02.660
has, well, its own language game associated with it.
00:50:07.680
And so those observables are paired with what we might call effective theories, right?
00:50:12.640
And so these effective theories are not to be mixed up.
00:50:16.500
And so every time we get into one of these free will conversations, I don't know whether
00:50:20.160
you're talking about free will... We have as-if free will. Who was forced to buy a ticket to the...
00:50:27.860
So, like, did you actually...? I didn't have to buy one.
00:50:37.640
But the point is that I'm perfectly happy with the idea that I have as-if free will.
00:50:45.220
And if we can get rid of quantum observation and get back to Laplacian determinism at some...
00:50:52.680
But it's as-if free will only because you actually are not aware of the proximate...
00:50:59.460
If I look at a chaotic pendulum over at the Exploratorium, it may have a very clear path that's determined
00:51:06.580
through Newtonian mechanics, but I'm not smart enough to figure it out.
00:51:11.360
I just sit there like an idiot twirling it thinking, oh, wow, I didn't think it was going
00:51:14.680
to do that, you know, even though I know the physics, right?
00:51:17.260
So the point is that if I try to compute something that's much larger than I am, my computer...
00:51:25.800
So, you know, this is why sort of self-reflection leads to madness very often.
00:51:30.640
And I thought you said this was going to be fun.
00:51:36.580
I'm still really interested in the app that you're coming out with for meditation.
00:51:46.720
But what I'm trying to get at is that the fun part of these conversations comes
00:51:51.120
from making these category errors and the unfun part comes from sorting it out.
00:51:55.780
And then, you know, when I play Johnny Raincloud, everybody will say, well, okay,
00:52:00.260
I guess that makes sense, but it's no fun anymore.
00:52:03.820
Well, but you're not disparaging the idea of a unity of knowledge, right?
00:52:10.600
Each layer of the cake, you can make a smooth transition between layers that doesn't usurp...
00:52:17.600
I mean, I have a fair idea when my wife's going to be angry at me for not doing the dishes,
00:52:23.220
but I can't recover it from quantum field theory, right?
00:52:26.300
So the idea is that maybe quantum field theory determines her behavior.
00:52:30.660
No, but there's nothing about doing dishes that violates quantum field theory.
00:52:36.540
I mean, it's not that you have to live in a different worldview in order to
00:52:43.260
talk about the human relations layer, the moral layer, the free will layer or not.
00:52:48.100
I can do my best, but I don't find it useful to try to think about human psychology from
00:52:53.200
the point of view of quarks. But, you know, could organic chemistry matter, if some neurotransmitter is
00:53:01.400
depleted? Yeah. You know, so there are some ways in which these different layers can
00:53:06.080
talk to each other, but there's no reason that I should be able to compute necessarily
00:53:09.800
across these layers successfully, even if there is some sort of concept of entailment or...
00:53:16.040
What I'm interested in is kind of a first-principles methodology of moving forward into the unknown.
00:53:24.780
So what I object to in religion and in this notion of revelation is
00:53:29.100
that there was some prior century where we were given the best ideas we're ever going to have
00:53:36.200
on a specific topic and we must cling to those ideas until the end of time.
00:53:41.120
This is the analogy, or the rubric, that I find most convincing.
00:53:51.460
It's like there's only ever been people talking about reality here, right?
00:53:51.460
And so you can either locate yourself in a current, modern, open-ended
00:53:57.540
conversation or you can anchor yourself to an ancient one and never give yourself the...
00:54:04.340
And you could have done it with Homer, you could have done it with Aristotle, you could
00:54:07.860
have done it with Shakespeare and the Hindus have done it with the Ramayana and the Mahabharata
00:54:12.980
and you're losing no sleep over whether or not you should do likewise, right?
00:54:17.820
And so my sense is that every question of societal importance requires that
00:54:24.860
we now outgrow the accidents of merely contingent history, outgrow the fact that people used
00:54:31.520
to be living in geographical isolation from one another and linguistic isolation from one
00:54:36.860
another for centuries and outgrow, therefore, our religious provincialism and just get to
00:54:42.660
a common humanity that's using the best tools available to solve the hardest problems.
00:54:49.960
If you'd like to continue listening to this podcast, you'll need to subscribe at samharris.org.
00:54:55.840
You'll get access to all full-length episodes of the Making Sense podcast and to other subscriber-only
00:55:00.760
content, including bonus episodes and AMAs and the conversations I've been having on the...
00:55:07.320
The Making Sense podcast is ad-free and relies entirely on listener support.