The Ben Shapiro Show


Sam Harris | The Ben Shapiro Show Sunday Special Ep. 9


Summary

Sam Harris, the author of Waking Up and the host of the Waking Up podcast, joins me to talk about the intellectual dark web, identity politics, where morality comes from, and whether reason alone can ground the good life. We disagree on a wide variety of topics, including religion, politics, and philosophy, but we're both willing to put our thoughts on the line and engage with the best version of our opponents. If you haven't checked out Sam's books or his podcast, you should; they're well worth the read and the listen. Thanks to our sponsors: PolicyGenius, which makes it easy to compare life insurance quotes and find the best policy for you, and ExpressVPN, which keeps your online activity private.


Transcript

00:00:00.000 What are the principles whereby we can navigate in the space of all possible experience and experience better and better lives and a better and better world?
00:00:17.000 So today's Sunday special featuring Sam Harris, the author of Waking Up and the host of the Waking Up podcast, will begin in just a second.
00:00:23.000 First, I want to remind you that you're going to die.
00:00:26.000 I'm just going to tell you right now.
00:00:27.000 It's going to happen.
00:00:28.000 And when that happens, if you don't have life insurance, then you're going to be really sorry.
00:00:31.000 71% of people say they need life insurance.
00:00:32.000 The answer is 100% of people need life insurance.
00:00:35.000 And only 59% of people have coverage.
00:00:37.000 So at least 12% of people who want it are procrastinating.
00:00:40.000 And sure, normally procrastinating is a bad thing, but if you're going to die, it's an even worse thing.
00:00:44.000 Because while you were putting off getting life insurance, getting life insurance was getting easier with PolicyGenius.
00:00:49.000 PolicyGenius is the easy way to compare life insurance online.
00:00:52.000 You can compare quotes in just five minutes when it's that easy.
00:00:54.000 Putting it off becomes a lot harder.
00:00:55.000 You can compare quotes while sitting on the couch watching TV or while you're listening to this podcast.
00:00:59.000 Just try it.
00:01:00.000 PolicyGenius has helped over 4 million people shop for insurance and placed over $20 billion in coverage.
00:01:04.000 They don't just make life insurance easy, they make disability insurance and renters insurance and health insurance easy.
00:01:09.000 If you care about it, they can cover it.
00:01:10.000 So if you need life insurance but you've been putting it off because it's too confusing or you think you don't have time, check out PolicyGenius.
00:01:15.000 The easy way to compare top insurers, find the best value for you.
00:01:18.000 No sales pressure, zero hassle.
00:01:20.000 It's free.
00:01:20.000 PolicyGenius.com.
00:01:21.000 When it's this easy to compare life insurance, why put it off?
00:01:24.000 Well, speaking of life and death, Sam Harris, thank you so much for stopping by.
00:01:28.000 Sure.
00:01:28.000 And, you know, there are certain weird things that happen in your life where you think, I'll never end up being friends with that guy.
00:01:33.000 And then you and I have ended up becoming pretty friendly, which is really kind of interesting.
00:01:36.000 So Sam Harris, for those who don't know, is not only perhaps the foremost atheist philosopher on planet Earth, he's also the host of the Waking Up podcast.
00:01:43.000 He's a neuroscientist and philosopher.
00:01:47.000 You should check out his podcast, it's just terrific, and all of his books are really worth the read.
00:01:50.000 I disagree with them strenuously, and they're really intelligent and really fun to read, and he's not trying to hide the ball.
00:01:56.000 So, Sam, thanks so much for stopping by.
00:01:57.000 Sure, yeah.
00:01:58.000 So, let's just jump right in with the fact that you and I are now sitting together, because that in and of itself is a weird thing.
00:02:03.000 What has happened in the country that we're now part of this kind of deviant conversation from the mainstream that seems to be growing in audience size?
00:02:13.000 Well, you and I are both part of this wing of the media where we're having long-form conversations on podcasts and on YouTube videos where we're reaching a surprisingly large audience.
00:02:25.000 And because of the format, we're not put in these weird rhetorical boxes where we have to struggle to make
00:02:32.000 the other guy look bad in the 45 seconds remaining, you know, on CNN or wherever it is.
00:02:37.000 And so we really can unpack an argument and we can search for
00:02:44.000 a possibility of convergence in real time together.
00:02:49.000 You might not notice the difference if you're listening to this in any 30 second bit of conversation, but over the course of an hour or two hours, you notice that you're hearing a conversation that you're not hearing elsewhere.
00:03:03.000 We don't agree, as you say, about many things, but we're in the same media channel and we're approaching these conversations in a similar spirit of just
00:03:12.000 being willing to put our thoughts on the line and, based on a principle of charity, engage with the best version of our opponents.
00:03:21.000 I think that's really important because, as you say, when you spend your life in cable news, it's always finding the worst version of the argument and then bashing it with a club until it's dead like a baby seal on the ice.
00:03:30.000 But, you know, it's really interesting, because the intellectual dark web, which is a term coined by our friend Eric Weinstein at your podcast, when we were doing a taping of your podcast in San Francisco, and that, of course, was a conversation between three people who disagree on a wide variety of topics.
00:03:42.000 What do you think sort of characterizes this?
00:03:43.000 Because it's become really
00:03:45.000 controversial, the term itself. There's been a lot of backlash to even the idea that there is an intellectual dark web.
00:03:49.000 Does it exist?
00:03:50.000 If so, what are the sort of common factors you think that unify the widely disparate viewpoints therein?
00:03:56.000 Well, I think it's what I just mentioned.
00:03:58.000 The fact that these conversations are happening in the dark with respect to the mainstream media.
00:04:03.000 I don't think most people at the New York Times or on CNN understand how big your audience is or how big Joe Rogan's audience is or my audience is.
00:04:10.000 It's just the numbers
00:04:12.000 Uh, would surprise them, and the fact that people are listening with that level of engagement would surprise them.
00:04:19.000 And, yeah, ideological commitments aside, or beliefs aside, we're having conversations that are not really yet on the radar of the mainstream media, and yet we have analogous large audiences.
00:04:35.000 And, again, it's a spirit of intellectual honesty and adventure where we're not
00:04:41.000 We're not stuck simply trying to win points.
00:04:44.000 I mean, you and I will debate many topics, and I think I'm right, you think you're right, and it will have the character of a debate, but in reality, as hard-hitting as any of those exchanges could ever be,
00:04:57.000 I'm not approaching it the way you approach a formal debate at a theater.
00:05:01.000 It's more fun.
00:05:01.000 You get to turn it over and look under the other side of the rock and look at these viewpoints all the way through.
00:05:06.000 And one of the things that I think has happened that's driven this group of people together is the fact that the hard left has become so ensconced in identity politics.
00:05:16.000 I know that you obviously got into a very well-publicized exchange with Ezra Klein at Vox over this.
00:05:20.000 Wasn't that fun?
00:05:21.000 I thought he was deeply intellectually dishonest.
00:05:23.000 Shocker.
00:05:24.000 I thought it was deeply intellectually dishonest and then his suggestion that you were saying identity politics is bad is in and of itself a form of identity politics.
00:05:32.000 What do you make of this whole identity politics rising tide and what do you think the backlash to that's going to be?
00:05:37.000 Where is it coming from?
00:05:39.000 Well, it's coming mostly from the left.
00:05:41.000 It's producing a counter effect on the right, which is white or even white male identity politics.
00:05:48.000 And so it's natural to see that these two sides are amplifying one another.
00:05:52.000 But on the left, it's much more troubling for me because the left is the space traditionally where
00:06:00.000 Self-criticism and wondering whether or not you might be wrong is just a paramount virtue, right?
00:06:08.000 That has metastasized in this context to a kind of
00:06:16.000 I don't know.
00:06:38.000 I now often describe this as just the most unhappy game of Dungeons and Dragons that ever was invented.
00:06:44.000 You have to negotiate these power differentials based on victimology.
00:06:50.000 You know, how many victimology points you have in this game.
00:06:53.000 And, you know, as a white, privileged, heterosexual, you basically have no points, right?
00:07:00.000 So you're not empowered
00:07:05.000 to have a credible opinion on any of the most important topics of our time, right?
00:07:11.000 You're either just part of the problem with respect to all of these variables, and you're mansplaining or you're, you know, cultural appropriation or you're, you know, wading in here.
00:07:23.000 I mean, so this is what happened with Ezra Klein.
00:07:26.000 At one point he said, you know, we're two privileged white Jewish guys who shouldn't be talking about race at all, right?
00:07:32.000 Like, this is not something we can weigh in on.
00:07:36.000 And then when I changed the topic to anti-semitism it got no better for some reason.
00:07:43.000 It's a problem because clearly if we want to get to a post-racial society, if we want to get to a society where human beings can simply be identified as human beings, the endgame can't be taking things like race and gender and gender difference and sexual orientation
00:08:04.000 Immutably, seriously.
00:08:06.000 These can't be just the ineradicable variables that define a person's position on important topics for all time.
00:08:15.000 If we get to Mars and we're still worried about skin color, in a Martian colony, we've done something wrong.
00:08:22.000 I think we need to reverse engineer what we think the
00:08:27.000 The end state should be and clearly identity politics is not the game we should be playing.
00:08:31.000 Agreed and I mean obviously it's destroying the capacity to even have these conversations because there's no way to have a conversation with someone who is spending the entire time assessing whether your point of view is even worth being taken seriously as opposed to the rational nature of what you're saying.
00:08:43.000 And I think that's one of the factors I think that's unifying those of us who are having these conversations.
00:08:48.000 I think another factor that's unifying
00:08:51.000 All of this is a belief in data.
00:08:55.000 Now, I know that you, obviously, are a deep believer in data and science.
00:09:01.000 I'm quite fond of data and science myself, although you would argue, obviously, that as a religious person, I'm not fond quite enough of data and science, and we'll get to some of that in a little bit.
00:09:08.000 But it seems to me that one of the downsides of identity politics is the attempt to paint science into a corner, as though science is an outgrowth of a particular
00:09:18.000 culture and is therefore irrelevant to general swaths of people.
00:09:21.000 So in the Ezra Klein interview, for example, when you suggested that group differences in IQ exist, and you weren't even saying that those group differences in IQ are attributable to environment or genetics.
00:09:31.000 You were saying, we don't know the answer to that.
00:09:33.000 I don't want to mischaracterize your view.
00:09:35.000 And he was immediately coming back with, well, you can't say that.
00:09:38.000 And I'm not sure how we're supposed to have conversations when you legitimately can't cite data before you even start having a conversation.
00:09:43.000 Yeah, yeah.
00:09:45.000 So, you know, in defense of people who worry about this kind of thing, you know, it's obviously possible for data to be wrong and the conversation continues even once you have data, but what we should all be anchored to is a good faith, intellectually honest, non-smear merchant approach to analyzing what we think we know and why we think we know it.
00:10:08.000 And those principles of rationality
00:10:12.000 and an empirical engagement with reality simply are not susceptible to an identity or even a political interpretation.
00:10:23.000 This is why reason is the only thing that scales.
00:10:26.000 If I have a good enough argument based on clear enough evidence, it should persuade you if you are being reasonable no matter what your background, no matter who your parents were, no matter how you were mistreated or not as a child,
00:10:56.000 in order to reach the right answer.
00:10:58.000 This is why prototypically reasonable or reason-based topics, things like mathematics, are so easily divorced from politics.
00:11:09.000 Anyone who's going to argue that mathematics or philosophical logic is just a tool of political ideology and oppression just knows nothing about those topics.
00:11:20.000 But it's true that
00:11:22.000 Virtually every other place that we really care about facts, and being right or wrong, has that character.
00:11:29.000 It should be true, ultimately, of journalism.
00:11:32.000 It's either Lee Harvey Oswald shot Kennedy or he didn't, right?
00:11:37.000 It's a fact about a human being holding a gun, right?
00:11:40.000 And we either get access to the data or we don't.
00:11:44.000 So, the fact that so much of our discussion about what's going on in terrestrial reality is based on, or is filtered through the lens of people's political commitments is just highly dysfunctional.
00:11:59.000 It's just not something to be maintained.
00:12:00.000 We should be cutting through it wherever we can.
00:12:02.000 So where do you think that the United States is going?
00:12:04.000 Because obviously we've seen, you know, the rise of your audience is enormous.
00:12:07.000 You have a huge audience.
00:12:09.000 I think a lot of folks, you mentioned Joe Rogan has an enormous audience.
00:12:11.000 Jordan Peterson obviously has a very big audience.
00:12:13.000 A lot of these people have a very big audience.
00:12:15.000 Do you think that the rise of these new conversations is... Are you optimistic or pessimistic?
00:12:19.000 Do you think that these new conversations are going to...
00:12:22.000 Turn into a new sort of brand of politics that ends up saving the country?
00:12:25.000 Or are you pessimistic and you think that the identity politics machines that are now operating at seemingly full blast on all sides, you think that those end up winning the day for the moment?
00:12:35.000 I don't know.
00:12:35.000 The future is a big question mark for me.
00:12:40.000 The present is fundamentally surprising to me.
00:12:43.000 I didn't think we would get Trump.
00:12:46.000 And getting Trump, I didn't think we would get the kind of reaction we see to Trump.
00:12:52.000 I could have predicted the reaction on the left, but it just amazes me every day that he is as
00:13:00.000 Untouchable.
00:13:14.000 I mean, I think you and I will be fine, right?
00:13:17.000 I'm very optimistic about our having conversations like this and this channel and media having a durable interest for people.
00:13:27.000 But whether it will affect any real political change in the near term, I don't know.
00:13:32.000 Because I see that the left is fully capable of
00:13:38.000 I don't think so.
00:13:57.000 And I'm, as you point out, or as we'll discover, I'm on the left on virtually every relevant question, except the ones we've been talking about, which is the virtues of identity politics and victimology.
00:14:11.000 So let's talk about your political viewpoint a little bit.
00:14:13.000 So you say you're on the left on a bunch of different issues.
00:14:15.000 Are you more on the left socially, as far as being libertarian?
00:14:19.000 Because, I mean, the truth is that you and I probably agree on the government's role in a lot of those particular areas.
00:14:24.000 Probably not on abortion, but on same-sex marriage, for example.
00:14:28.000 I think we probably agree the government should not be involved in these sorts of decisions.
00:14:32.000 But as far as economics, are you also on the left?
00:14:34.000 Are you more in favor of redistributionism?
00:14:36.000 And on what grounds?
00:14:37.000 Well, I think the government... I'm a fan of libertarianism up to a point.
00:14:43.000 I think the government should only do what it can do best, right?
00:14:46.000 So for me, nothing but free.
00:14:49.000 So then I think the arguments about what
00:15:09.000 I think waiting around for the market to get that perfectly right, which might only happen if we just run out of oil, I think that's waiting too long.
00:15:24.000 The oil industry is already subsidized.
00:15:27.000 So if you got those subsidies out, then there would be a clear competition with renewables.
00:15:31.000 But I think it would be rational for the government to, quote, pick a winner in that space.
00:15:38.000 Not a winner with respect to a specific company, but recognize that there are certain things we want to incentivize.
00:15:45.000 And one of them would be, say, to get off of oil, right?
00:15:48.000 Forget about global warming for a second.
00:15:50.000 Just, we want clean air, right?
00:15:52.000 So if we wanted to incentivize clean air, the market isn't necessarily the best way to do that.
00:15:57.000 Because if you are burning something horrendous in your factory,
00:16:02.000 Uh, you can't adequately compensate me for, for the smoke that, you know, blows over the fence.
00:16:06.000 Right, they're externalities, obviously.
00:16:08.000 Uh, so I think, I think the externalities, uh, uh, in many cases are, I think libertarians ignore them, or they think that, that far too blunt an instrument would correct for them.
00:16:21.000 Something like a, you know, a boycott.
00:16:23.000 You know, so if you're a massive polluter, you know, and I don't like it, I can organize a boycott against you, and in the fullness of time, that's gonna do its work.
00:16:31.000 Uh, I think that's getting truer and truer, right?
00:16:33.000 I think the power of a boycott now with social media is probably as sharp as it can get or has ever been.
00:16:43.000 It's easy to look back 20 years to see just how ineffectual those efforts might have been against big corporations.
00:16:49.000 I do wonder whether the government interventionism in environmental issues particularly is, in fact, a blunt instrument given that there are technological changes that take place that radically change the nature of many of these industries.
00:17:02.000 I'm a believer that in the next 20 to 30 years, very few people are actually going to commute to work.
00:17:05.000 I mean, the internet has made it essentially possible for you to work wherever you want.
00:17:08.000 It's made it possible for me to work wherever I want.
00:17:10.000 There's still factories out there, but that's a shrinking percentage of the American workforce.
00:17:15.000 I mean, you're seeing major retailers go out of business specifically because people are sitting home.
00:17:19.000 I don't know.
00:17:56.000 With the level of threat that's being talked about by the IPCC.
00:17:59.000 So I want to talk more about that with you in just a second.
00:18:02.000 First, we have to talk about protecting your data.
00:18:04.000 So with all the recent news about data hacks and breaches, it's hard for me not to worry about my digital privacy.
00:18:08.000 No matter what you do online, your mobile carrier and internet service provider, they track all of it.
00:18:12.000 Every website you visit, all the emails you're sending.
00:18:14.000 It's ridiculous.
00:18:14.000 So that's why I've taken back my privacy.
00:18:16.000 I do use ExpressVPN.
00:18:17.000 And these days I don't use the internet without it.
00:18:19.000 ExpressVPN is the world's leading VPN provider.
00:18:21.000 It lets you securely use the internet without being tracked by anyone.
00:18:24.000 And ExpressVPN keeps my online activity private and anonymous while I browse, or email, or download, or stream.
00:18:30.000 It's great for streaming content.
00:18:31.000 You can even use it to watch the World Cup without a cable subscription, if you're into that sort of thing.
00:18:35.000 They're easy to use.
00:18:35.000 The app encrypts all of my internet data and hides my IP address, which protects my connection.
00:18:40.000 Which is great for me.
00:18:41.000 ExpressVPN costs less than seven bucks a month and runs seamlessly in the background of your computer, phone, tablet.
00:18:45.000 Every time you use the internet without it, you're putting yourself at risk.
00:18:48.000 So take back internet privacy today.
00:18:50.000 Find out how you can get three months free.
00:18:51.000 Go to expressvpn.com slash ben.
00:18:54.000 That's E-X-P-R-E-S-S-V-P-N.com slash ben for three months free with a one-year package.
00:18:59.000 It is really easy to set up.
00:19:00.000 Takes legitimately only a couple of minutes and then you just run it in the background of your computer.
00:19:04.000 You never notice it again, but it's protecting you.
00:19:05.000 Visit expressvpn.com slash ben to learn more.
00:19:08.000 That's expressvpn.com slash ben.
00:19:10.000 Okay, so, let's talk a little bit about some of the kind of root issues that I think people want us to get into, which is gonna be all the religion versus atheism, and rationality, and all the deep stuff that is actually more fun to talk about than politics, in my view.
00:19:24.000 And I get the pleasure of talking about it with you here, because I don't usually have the pleasure of talking about it on a daily basis when I'm covering politics.
00:19:29.000 So, since we get to do this, let's do it.
00:19:31.000 So, let's talk about where you think morality comes from.
00:19:36.000 In the religious view, obviously.
00:19:38.000 There are certain things that I believe are capable of being understood by any sentient human being.
00:19:44.000 So I don't believe that all human beings in the absence of religion are immoral people who go around murdering their neighbors and raping their sisters.
00:19:50.000 I think that, in fact, this is
00:19:53.000 Pretty well-embedded in even Judaic philosophy, the idea that there is a sort of natural law theology where you, as just a normal person, know not to kill people and know not to steal and know to set up courts of law.
00:20:03.000 This is what they call the Seven Commandments to Noah.
00:20:06.000 But the idea is that anyone can basically discover these things.
00:20:09.000 And there are universals across culture about you're not supposed to murder your brother.
00:20:13.000 The biblical reading is that to reach a more sophisticated level of morality that leads to the sort of rights-based society we see here, you at least need the catalyzing enzyme of a Judeo-Christian religion in order to get here.
00:20:27.000 That would, I think, be the most rationalistic argument on behalf of Judeo-Christian values.
00:20:31.000 But where is here again?
00:20:33.000 Here would be a civilization that values individual rights above the values of the collective, that says that people are to be treated, to use the biblical phrase, as made in the image of God, that we should treat individuals as made in the image of God.
00:20:47.000 That does not happen in the absence of a Judeo-Christian value system.
00:20:51.000 That's the religious argument.
00:20:53.000 Although that is more of a historical argument.
00:20:57.000 Right, that's what I'm saying.
00:20:58.000 It's a rationalistic argument because the deeply religious argument would be God said so, so do it, right?
00:21:02.000 But that's not the argument that I think is the most compelling because that only works if you believe in God and if you believe in Revelation.
00:21:08.000 So that's not the argument that I tend to make because I don't find it intellectually convincing.
00:21:11.000 It's an argument from authority, which of course is not particularly convincing.
00:21:14.000 So I tend to make the historical argument, which is that history
00:21:43.000 A few points.
00:21:44.000 One, I'm not convinced by that historical argument.
00:21:46.000 I think you can cherry pick the data either way and come up with a different conclusion.
00:21:52.000 And even if I agreed with it, it wouldn't make the case I think you want to make, because it would be an instance of what's called the genetic fallacy, which is
00:22:03.000 Even if we granted that our respect for individual rights, say, came from a Judeo-Christian tradition, it doesn't mean that it can only come from there or that it even is best gotten from there.
00:22:16.000 I would say that it actually hasn't come principally from there.
00:22:20.000 For instance, you could say that
00:22:22.000 That Christianity, in particular, was responsible for, in part responsible, for the fall of the Roman Empire.
00:22:28.000 So Christianity undermined the notion that the Roman Emperor was a god.
00:22:35.000 It made it harder to recruit true soldiers, and they had to farm it out to mercenaries.
00:22:41.000 And it eroded what you might call traditional Roman values.
00:22:46.000 And then the Western Empire fell, and we ushered in the Dark Ages.
00:22:52.000 And insofar as there was a reboot to civilization at that point, it was largely the result of classical, the learning and philosophical insight of antiquity being preserved by, of all people, the Muslim community.
00:23:07.000 So, I think you can have it any way you want looking at history, but it just doesn't get you there in terms of the moral content and, in this case, the political or social content coming from the Bible or any other religious text.
00:23:24.000 So then why here?
00:23:26.000 Meaning, like, why in Judeo-Christian civilization, but not Islamic civilization?
00:23:29.000 Because you mentioned rediscovery of Aristotle and reuse of Aristotle in the 10th and 11th centuries was really beginning, you know, in the Islamic world long before Aquinas really repopularized it in the 13th and 14th centuries.
00:23:42.000 Yeah.
00:23:42.000 Well, one, I think it's, you know, from my point of view, it's impossible to ignore the influence of Islam.
00:23:48.000 I mean, Islam is its own ideology, its own set of dogmatisms that are inflexible and at odds with the spirit of science fundamentally.
00:23:58.000 And despite the fact that there was a brief period where there seemed to be some, you know, happy convergence between scientific and mathematical insight and Islam, for the most part, Islam has been hostile to, you know, real intellectual life and
00:24:14.000 In a way that Christianity was hostile, even when the scientific worldview was struggling to be born in the 16th century and the 15th century.
00:24:25.000 What we have historically is a real war of ideas.
00:24:32.000 It can be crystallized in the moment where Galileo was shown the instruments of torture and put under house arrest.
00:24:40.000 By people who refuse to look through his telescope, right?
00:24:43.000 I mean, so that was the genius of religion paired with the emerging genius of science in that room.
00:24:49.000 Well, to be fair, I mean, Galileo was originally sponsored by the church and so was Copernicus, but there's no question there was a backlash from the church to this stuff.
00:24:56.000 Yes, and the backlash makes sense, because there is
00:25:01.000 intellectual progress on questions of how the cosmos is organized or where it came from or how life began.
00:25:09.000 All of these questions, the scientific answers to which are in zero-sum contest with the doctrines found in the books.
00:25:19.000 Now, it's true that there are religious people
00:25:22.000 And now even the Pope, who have relaxed their adherence to tradition enough to make room for something like evolution, right?
00:25:28.000 But it's still, it is still a problem.
00:25:31.000 Not a super new idea, I mean, right?
00:25:33.000 Aquinas was talking about this in the 13th century and 14th century, the idea that if it was in science and it was contradicted by the book, then you're misreading the book, right?
00:25:43.000 I mean, that's a pretty old idea.
00:25:44.000 But that is to subvert science rather than the book, in Aquinas' case.
00:25:48.000 I mean, Aquinas thought heretics should be put to death.
00:25:51.000 His argument for that, for capital punishment for heresy, and Augustine made the same argument.
00:25:56.000 He thought they should be tortured.
00:25:59.000 So those two great lights of the Catholic Church gave us the Inquisition and gave us more than a century.
00:26:06.000 I think it's also fair to say that they were rather instrumental in the development of modern science.
00:26:09.000 So the Dark Ages, first of all, I think the Dark Ages are a bit of an exaggeration, in that the Dark Ages themselves saw a massive
00:26:18.000 growth in technology and architecture, for example.
00:26:20.000 I mean, Gothic cathedrals are built during the Dark Ages.
00:26:23.000 But the scientific world is, well, virtually every major university in the Western world was sponsored by the Catholic Church.
00:26:30.000 And I'm not a great Catholic defender, right?
00:26:33.000 But virtually all major universities were sponsored by the Catholic Church, which saw consonance between science and religion as a reason to actually investigate the natural world.
00:26:41.000 Well, no, I mean, again, I think that's backwards.
00:26:43.000 I think the reality is there was no one, I mean, everything that was good that was done anywhere at any time prior to, you know, pick your year, was done by some religious person.
00:26:53.000 I mean, there was just nobody else to do the job, right?
00:26:55.000 So you could make the argument that, you know, Catholics built every bridge in Europe until the Protestants came around and they built their half of the bridges.
00:27:02.000 I mean, so there was just no one else to do the job.
00:27:04.000 I don't know.
00:27:22.000 Who were Christians, who were, you know, as is often pointed out, Newton spent half his time worrying about biblical prophecy.
00:27:28.000 Now, I think that was a waste, an objective waste of his time.
00:27:31.000 He also spent a lot of his time worrying about astrology, right?
00:27:34.000 So, you're trying to... Alchemy, yeah.
00:27:37.000 Yes, and alchemy.
00:27:38.000 And alchemy, insofar as there was anything to it, apart from sort of the internal myth-making that may be of use to some people,
00:27:48.000 it edged into chemistry, right?
00:27:49.000 So there was often a real science at the back of a lot of merely mortal confusion where people were trying to work things out, you know, I would argue very much under the shadow of religious commitments that they need not have had and were not actually serving their ultimate ends.
00:28:09.000 And this for me is true in the moral sphere as well, because to take
00:28:14.000 This is why the Bible, in my view, can't be the real repository of our moral wisdom in any sense, because when you go to read it, you are forced to ignore certain passages or reinterpret them rather aggressively.
00:28:29.000 To conform to what you now, in the 21st century, have every reason to believe is good or a direction worth going, socially.
00:28:36.000 So, you know, it is just an inconvenient fact that slavery is endorsed in the Bible.
00:28:42.000 It's explicitly endorsed in the Old Testament and it's certainly not repudiated in the New, right?
00:28:47.000 And, you know, Jesus told slaves to serve their masters and to serve their Christian masters especially well.
00:28:53.000 So, there's no place in the Bible
00:28:56.000 Where you can get a truly compelling case against slavery, because the creator of the universe clearly expected slavery to be a human institution.
00:29:05.000 Well, except for abolitionists finding enough inspiration in the Bible to use it as their main text.
00:29:10.000 But they did that despite what's in the Bible.
00:29:12.000 Well, I think that that is... I mean, I don't want to... This shouldn't sound insulting, because it's not meant as an insult.
00:29:18.000 I think that, from a religious point of view, that's a simplistic reading of the Bible's role in human affairs.
00:29:24.000 Meaning that when any written document is given to any group of people, it has to be given to people in a way that they can understand.
00:29:32.000 It's not that slavery was endorsed by the Bible.
00:29:34.000 It's that slavery is universal among human civilization until modern times.
00:29:37.000 But it was... No, no.
00:29:38.000 There were...
00:29:39.000 There are religions that have different points of view on all these questions, right?
00:29:43.000 So it was possible in the 5th century B.C.
00:29:47.000 to have a take on ethics with respect to something like slavery or the killing of combatants or non-combatants that was quite a bit more modern and ethical and civilized
00:30:03.000 than what is found in the Bible. So you need to take, for a minute... you might not like some of their other commitments, but take something like Jainism. The Jains... Gandhi got his non-violence from Jainism. Jainism is just, in truth, a religion of peace, unlike Islam, which is, uh, you know, where the word peace is a euphemism for the word surrender there, or submission.
00:30:27.000 It's possible for people 2,500 years ago to wake up one day and even write a book which suggests don't harm anyone or anything, even a cricket.
00:30:39.000 Right, well that's fine, but the question is...
00:30:42.000 How about that?
00:31:05.000 And then it has developed over time.
00:31:06.000 This is why I think Judaism is particularly kind of unique in this respect, because that's been an ongoing dialectic for literally thousands of years.
00:31:13.000 I mean, there's legitimately, you know, thousands of pages of tractates of just people arguing about these particular issues.
00:31:18.000 You would say that the argument should have started from the point of there was no text for them to argue about, and they should have just argued from sort of a priori reason, maybe.
00:31:27.000 I respect text, but I think the principle of revelation is a problem.
00:31:30.000 But just to back up for a second, I think it's certainly problematic for you as a Jew to argue that the legitimacy or success of religion is best measured by the number of adherents in the year 2018.
00:31:43.000 No, but the point of Judaism also is that, I mean, it says in the Bible itself that God is going to make, you know, great peoples of all of Abraham's sons, for example.
00:31:51.000 And as Maimonides put it in the Jewish belief, even the growth of Islam and Christianity, which are obviously based on a certain Judaic root, I think Islam less so, because there's an actual rewriting of the Old Testament.
00:32:23.000 The basic principle, the bottom line is the basic principles of Judaism, those have been embraced in a way that the basic principles of Jainism have not across time, and they've shaped civilizations in a way that is significantly better than the principles of Jainism have shaped any number of small people.
00:32:37.000 Or small group of people, not small people, obviously.
00:32:39.000 Well, so, again, it's just...
00:32:43.000 The fact that you and I could improve the Bible with very little thought, just by taking out... If we just took out the worst passages that have no possible redeemable content this year, or I would argue any other year, the Bible's already improved.
00:32:59.000 So the fact that we could edit it to anyone's advantage is a problem for the idea that this was written by an omniscient being
00:33:24.000 as the product of human minds, brilliant or not, and every shelf in the bookstore or library has the same status with respect to the merely mortal provenance of these ideas, then it's fine.
00:33:37.000 Then you can pick and choose the best ideas.
00:33:40.000 Then you can be slavishly attached to Plato's Republic, and that can be your favorite book.
00:33:49.000 What revelation gets you is this notion that, no, no, this isn't just a book, right?
00:33:53.000 This is the product of omniscience on some level.
00:33:58.000 And that ties your hands intellectually, because then you are forced to make these acrobatic contortions around passages which clearly have no good application now and didn't even have a good application then.
00:34:13.000 And when you view it from the other side, when you think about just how good a book would be if an omniscient being wrote it.
00:34:21.000 It's very easy to see what could be in there that would still astonish us.
00:34:29.000 It's very easy to see what could be in there that would prove, just based on the time of its emergence, that this couldn't have been the product of merely human minds.
00:34:38.000 And there's nothing like that in the Bible.
00:34:39.000 To respond to some of those points, I think that there's a lot there, so I'm going to try and parse it as we go.
00:34:46.000 I think that one of the arguments that is made, certainly in the Talmud, is the idea that human reason was generated in order to help
00:35:49.000 Can take you in any number of horrible places that are significantly worse in virtually every way than the places that Judeo-Christian religion brought you for 2,000 years.
00:35:59.000 My argument is not that Judeo-Christianity itself, Judaism on its own is everything that you need, right?
00:36:04.000 As an Orthodox Jew, my argument is not that.
00:36:06.000 If it were, then I wouldn't be out there caring about science or about nature, as you say.
00:36:10.000 People who are fundamentalists don't care about any of those things.
00:36:12.000 They think everything you need to know is found in this particular book.
00:36:16.000 I don't think I know how to fix my car from the Bible.
00:36:18.000 But what I do think is that in the Straussian view, there's a tension between Athens and Jerusalem.
00:36:23.000 There's a tension between revelation and reason.
00:36:25.000 And that without either one of these things, that reason without revelation ends up in utopias of our own creation that can end up in
00:36:47.000 Omniscience matters is because if there is no belief in an objective level of moral truth, then everything becomes subject to interpretation.
00:36:56.000 Up to and including, if you were intellectually honest enough, laws like murder.
00:36:59.000 Because maybe murder doesn't apply to people who are outside my tribe, maybe it doesn't apply to people who are outside my family, or people I just want to kill for my own benefit.
00:37:05.000 So this is where we get into the alternative morality.
00:37:07.000 This is where I'm going to ask you about your moral basis.
00:37:09.000 Let me respond to some of that though, because there's a lot there.
00:37:12.000 So, it's not going to surprise you that I disagree with that summary.
00:37:16.000 This is the fun part, yeah.
00:37:17.000 So, to take your anchor...
00:37:20.000 I think there is an anchor, or at least a foundation, to everything we can discover or value, and it's human conversation, right?
00:37:30.000 And I think the morality, because I think Revelation simply didn't happen, right?
00:37:34.000 Right.
00:37:35.000 It's just human beings talking to one another at any point in history, thinking internally.
00:37:40.000 I mean, they've been extraordinary people, and they've had extraordinary insights, and they've shared them, and they've codified them in books.
00:37:46.000 So the morality or pseudo morality or barbaric morality that you find in various places in various texts was put there by people, right?
00:37:56.000 It's not that it came from some other source and that we need to be anchored to it.
00:38:01.000 It is the record of a merely human conversation.
00:38:06.000 That, I'll be the first to admit, has, in various parts, real value and real, you know, wisdom in it.
00:38:14.000 And that's, and hence, the reason why people are so attached to some of it, certainly.
00:38:18.000 But it's just, that's not unique for the Bible, that you can find that in, among Greek philosophers, or in various places in antiquity.
00:38:27.000 And you can find it in modern variants.
00:38:29.000 I mean, you and I, in one another's presence here, having a conversation that gets recorded,
00:38:35.000 We can say something that is highly relevant to the question of how to live a good life.
00:38:41.000 And if we were doing this 3,000 years ago and it happened to get written down, it would be one of the lines that people would think had been revealed if they lost track of what its actual source was.
00:38:54.000 And so it's people like ourselves that have always done this work.
00:39:17.000 The intellectual tools we can get in hand, and that includes whatever is good in religion, right?
00:39:22.000 So if there's something that is in Ecclesiastes that is better put there than any place else in the canon of all of human knowledge, well then of course we want to keep that, right?
00:39:32.000 So how do you decide what is the good?
00:39:35.000 Because right now you're essentially playing Buddha under the tree, right?
00:39:38.000 You get to sit there and say, here's a good moral standard, here's a bad moral standard, and you and I will sit here and our conversation will be better than a conversation a thousand years ago.
00:39:47.000 This is a question that I asked you actually last time we spoke publicly.
00:39:51.000 And the question is, okay, so if that's the case, you and I tend to agree, I would think, on probably 95% of our central values about what it is that makes for a good life, right?
00:40:02.000 I think that we both believe in individual freedoms.
00:40:04.000 I think we both believe in the ability to
00:40:07.000 Live as you choose, so long as you're not hitting anybody else in the face, as a general rule.
00:40:11.000 I think that we both agree on all of these things, and so what I asked you last time is, why do we agree on these things?
00:40:16.000 Is it because we just both happen to be super reasonable, like we are just the most reasonable people who ever lived, and we just happen to be here at this time, and why didn't people a thousand years ago know this?
00:40:24.000 Or is it, you know, back to my original argument, the fact that we grew up probably ten miles from each other in a city that was built in a country that was built by people who believed all of these things that
00:41:12.000 Yeah, and it's not unique there, too.
00:41:15.000 There are Greek philosophers, I think Epictetus articulated it someplace.
00:41:22.000 The truth is, it predates our humanity on some basic level.
00:41:25.000 You can see evidence of it in monkeys, right?
00:41:27.000 There's an expectation of fairness, even in monkeys, right?
00:41:30.000 So, this is a
00:41:33.000 We're running a software program that is morally relevant to us, that is riddled with bugs, but that predates our humanity.
00:41:44.000 And so largely what civilization is, the good parts of culture that will lead to something that is durable at the level of civilization, largely correct for our merely hominid, merely evolved, merely
00:42:00.000 Creaturely, moral intuition.
00:42:02.000 So, for instance, you and I have, as front and center in our moral hardware, a sense of disgust, right?
00:42:11.000 And disgust has roots below anything that could be considered moral.
00:42:15.000 It's just, you know, you could smell something bad and you feel like vomiting and that's... But the truth is, in terms of the evolution of the brain, the brain doesn't evolve new modules that can do fundamentally new things.
00:42:28.000 It has to evolve
00:42:30.000 capacities that are predicated on the old hardware that was anchored to things like, you know, I'm going to vomit based on that smell, right?
00:42:42.000 And much of our moral thinking about the world is disgust-based or fear-based and it gets applied to things like how you feel about gay marriage, say.
00:42:56.000 So if you go into a Bible-thumping
00:42:59.000 fundamentalist Christian context where you can find people who are just adamant that gay marriage is wrong, this disgust circuitry is tuned up through the lens of that social question.
00:43:09.000 And that's a... so I view conversation about ethical truth and progress in ethical space being more a matter of
00:43:23.000 Reasoning and unhooking our reflexive, in this case, disgust-based intuitions from our sense of what is ultimately good.
00:43:34.000 I mean, there are things that you and I might not be comfortable with the first time we consider them, that we can get comfortable with by thinking it through or imagining things from other people's perspectives.
00:43:45.000 Or at the very least, that we think are immoral but still think that we have no business in doing anything about.
00:43:49.000 Yeah, and ultimately,
00:43:51.000 There are cases of moral dumbfounding where you would have to admit that it's moral in the sense that you can't point to a victim, right?
00:44:01.000 Jonathan Haidt talks about experiments like this, like somebody who's having sex with a dead chicken.
00:44:05.000 But it still disgusts you.
00:44:07.000 To take it out of the moral sphere for a second, if I told you,
00:44:11.000 I have Jeffrey Dahmer's sweater, you know, it's been dry-cleaned.
00:44:14.000 Would you like to, would you try it on?
00:44:16.000 Right?
00:44:16.000 Now, everyone recoils from that, right?
00:44:19.000 But it's not, if you actually think about it, it's not something that, like, there's kind of a magical superstition intruding there, where you think there's something that's been deposited in the sweater, even though, you know, we dry-cleaned it, you know, 450 times, right?
00:44:35.000 So it's, and maybe a sweater is too charged, but to take something that, where there would be absolutely no question that his DNA, it's not covered with this creepy guy's DNA, it's something that you just, in order to recoil from it, you are thinking superstitiously, right?
00:44:51.000 So you can correct for those.
00:44:53.000 So you're making a sociobiological explanation for morality, like basically as an E.O.
00:44:56.000 Wilson explanation for the evolution of morality?
00:44:58.000 No, to the contrary.
00:45:01.000 I think that there are two very different ways to talk about morality with respect to our scientific understanding of ourselves.
00:45:09.000 One is
00:45:11.000 What you just referenced, this biological, evolutionary, descriptive story of how we got here.
00:45:17.000 Why is it that we're the sort of apes that feel these sorts of ways about social interactions?
00:45:22.000 And why is our moral thinking anchored to those properties?
00:45:29.000 Then there's a completely separate question, which is the question that interests me morally, which is, given what we are, given where we are right now, what is possible for us?
00:45:41.000 How good could human life be?
00:45:43.000 And what are the principles of neurobiology and everything else, economics and sociology and genetics, anything that can be brought to bear to change human experience?
00:45:54.000 What are the principles
00:45:56.000 Whereby we can navigate in the space of all possible experience and experience better and better lives and a better and better world.
00:46:04.000 And so that is a very different question, because it presupposes, just on its face, that most of our job is to fly the perch that has been prepared for us.
00:46:16.000 And I think you and I agree on this, but I'm not sure why.
00:46:18.000 Okay, so here are the, I think, the two big questions that I have.
00:46:22.000 One is,
00:46:23.000 Where does your concept of the good come from and why is it universal?
00:46:28.000 And two, you just spelled out sort of your differentiation from the socio-biological explanation for morality that we evolved over time and that our brains are evolved to perform certain tasks and we sort of naturally came to a level of morality.
00:46:42.000 But you're not a believer in free will.
00:46:43.000 So when you talk about reason, and you talk about the importance of reason, you and I fully agree on this, but my question is that, as a neuroscientist, if we are just pure material and we're just a bunch of neurons firing outside of our own control, obviously, because every...
00:46:58.000 Cause, every effect has a cause.
00:47:00.000 If it's just things happening, then why should we value reason?
00:47:04.000 Does it matter that we value reason?
00:47:06.000 Because we can't control whether we value reason anyway?
00:47:09.000 Are these conversations kind of pointless?
00:47:10.000 Because, or and then back to the first question, how exactly does reason play into the good?
00:47:17.000 Is that just a vague term for a particular system of neurons that convinces another particular set of neurons?
00:47:22.000 Right, right.
00:47:23.000 So two questions there.
00:47:24.000 What is the foundation of value and morality specifically?
00:47:27.000 And how does free will or its absence interact here?
00:47:36.000 And this connects to other questions where our intuitions probably divide and the questions about what is the meaning of life, what is the purpose of life, those are questions that people ask where religious people by and large feel like you need an answer, like there's a meaning shaped hole in the world and we should fill it.
00:47:54.000 And I, given how I view things, think it's the wrong question.
00:47:57.000 What I see us as having is an opportunity.
00:48:01.000 We are in a circumstance where
00:48:05.000 Based on the minds we have, there's a range of possible experience.
00:48:10.000 And we don't, and the horizon goes in both directions to the very, very bad and the very, very good.
00:48:15.000 And we don't quite know how bad things can get based on what we know.
00:48:20.000 We know they can get far worse than we ever want to touch.
00:48:24.000 And so it is with the good.
00:48:25.000 We don't know how good things can get.
00:48:28.000 And yet, we know the general direction where we want to head.
00:48:34.000 We know that if the world becomes more and more characterized by love, and joy, and creativity, and compassion, and insight, and fun, and we know that's all, that whole suite of, and you could list those characteristics
00:49:13.000 Seeming paradoxes, which we could both point to occasions where suffering has led to something good, right?
00:49:19.000 There's a silver lining to certain kinds of pain, right?
00:49:22.000 Or if you want to become a Navy SEAL and experience all the empowerment that comes with that, you have to go through the hell of becoming a Navy SEAL, and that's a test and a trial and
00:49:35.000 And yet, there's a massive silver lining for people who come out the other side of that.
00:49:37.000 And yet, if you could sample a person's experience in each moment through that ordeal, it might be indistinguishable from torture, right?
00:49:45.000 So, that's just to say that the frame around which we put certain sensory experiences matters, right?
00:49:51.000 If I tell you that, you know, the pain in your bicep is because you've been lifting weights so much and you're making so much progress,
00:49:58.000 Uh, you know, you'll feel one way about it.
00:50:00.000 If I said to an identical pain, well, you actually unfortunately have arm cancer.
00:50:04.000 It's a very rare cancer, but you've got it, right?
00:50:06.000 You'd be, you would feel the suffering attendant to that.
00:50:09.000 So, but all of this, these are all statements about what it's like to have a human mind and, and again, these are, I view us as having a navigation problem.
00:50:18.000 And the reason why I feel like
00:50:20.000 There's a foundational claim here that need not even be argued for, that is far more defensible than a claim about revelation or anything else where you might try to anchor morality, is that all I need is the acknowledgement that
00:50:39.000 If we imagine a universe where every conscious mind suffers as much as it can, for as long as it can, with nothing good ever coming of it.
00:50:49.000 There's no silver lining.
00:50:50.000 We have a perfect hell that has been designed for every possible conscious mind.
00:50:57.000 I call it the worst possible misery for everyone.
00:50:59.000 That's bad.
00:51:00.000 If the word bad is going to mean anything,
00:51:03.000 It applies there, right?
00:51:04.000 Now, if you're going to say it doesn't apply there, if you're going to say, well, yeah, that's kind of bad, but there are things that are worse, I don't know what you're talking about.
00:51:11.000 Because by definition, this is the worst possible misery for everyone, right?
00:51:16.000 So, as long as you are going to acknowledge that other states of the universe are better than that,
00:51:23.000 Right.
00:51:24.000 Then you've given me my continuum of better and worse.
00:51:26.000 You're making a very St.
00:51:27.000 Anselm-style logical argument here about the nature of bad.
00:51:30.000 And I'm not sure that... Well, no, no.
00:51:32.000 ...you can imagine the worst island that you could possibly imagine.
00:51:35.000 No, no.
00:51:35.000 It's very different from that.
00:51:36.000 Because it's not to say that... Because that's clearly specious.
00:51:40.000 If I said to you, well,
00:51:43.000 There's a perfect turtle.
00:51:46.000 Oh no, I agree with you.
00:51:47.000 I'm not defending the ontological argument.
00:51:49.000 I'm making the argument that you're sort of making an ontological argument.
00:51:51.000 It's not.
00:51:52.000 It may sound that way, but it's not.
00:51:53.000 There is... It's so rudimentary.
00:51:56.000 Because I think that you're failing to define a couple of terms.
00:51:59.000 Meaning that... So when you say the worst possible suffering, do you mean physical suffering?
00:52:03.000 Do you mean mental suffering?
00:52:04.000 And the truth is... Any combination that's the worst combined suffering.
00:52:08.000 But people...
00:52:09.000 I mean, it's all mental.
00:52:11.000 It's all mental in the end.
00:52:12.000 It's all a matter of consciousness and its content.
00:52:14.000 Right.
00:52:14.000 And I feel like you're playing a little bit of a trick when you sort of presuppose that we share a common definition of suffering.
00:52:23.000 I think there are certain things where we share a definition.
00:52:25.000 Let's say we don't.
00:52:26.000 So let's say we have a helmet we can put on and dial in every possible conscious state for a human brain.
00:52:32.000 So you can wear the helmet and I can wear the helmet.
00:52:35.000 And, you know, we each tune it to state, you know, X551, right?
00:52:39.000 Right.
00:52:39.000 And I say, well, how do you like that?
00:52:42.000 So one thing that is implicit here, though it's not a defeater if it turns out not to be so, is that I have an expectation that you and I will converge, perhaps not
00:52:55.000 on every specific state as our favorite or our least favorite.
00:52:59.000 But there'll be whole families of states there that you and I will acknowledge.
00:53:03.000 Well, these are all fantastic.
00:53:05.000 Maybe, I'm not sure which I like better than the other, but this is really good.
00:53:09.000 I'd like to feel more of this.
00:53:11.000 And we'll converge.
00:53:12.000 And this is just based on the similarity.
00:53:14.000 So how do you not end up, if you're pursuing the ultimate, if there is such a thing as the ultimate possible good and this good is what you're after, how do you not end up, number one, in sort of the Brave New World scenario,
00:53:22.000 Where you're drugging yourself the whole time for pleasure?
00:53:24.000 That's an interesting question.
00:53:26.000 And number two, this does bring to mind an essay by George Orwell that he wrote in 1940 about the rise of Nazism, and what he basically suggested was, why is it that everything is so good in the West?
00:53:37.000 Like, everything is much better in Britain than it is in Germany, and yet people are willingly joining up with this monster to go and fight.
00:53:44.000 And he said, because it turns out that a lot of people don't want freedom, a lot of people don't want
00:53:48.000 Pleasure.
00:53:48.000 A lot of people are willing to forego those things in favor of a higher pursuit.
00:53:52.000 And you see that now with literally billions of people. The Bush line, that every human soul yearns for freedom, I don't think that's true.
00:53:59.000 I think there are a lot of people who misdefine freedom or think that freedom is something it is not.
00:54:03.000 Otherwise they wouldn't willingly convert into these systems.
00:54:05.000 So I think that it is a little simple.
00:54:08.000 You know, no, but none of this contradicts the picture I'm painting.
00:54:11.000 It's just that all of these things, all of these differences among people will have explanations, and we don't yet have those explanations in hand.
00:54:18.000 But so take the simplest case, you know, you and I each put our hand on a hot stove, right?
00:54:23.000 Now, given our similarity neurologically, I would expect if the stove is hot enough, you and I will have indistinguishable responses, right?
00:54:30.000 And if one of us doesn't have that response, there's something wrong with our nervous system.
00:54:36.000 But let's just say we met somebody who had a different enough response that we couldn't even converge on the question of whether, you know, hot stoves are worth touching, right?
00:54:50.000 You have a masochist who likes
00:54:53.000 I think so.
00:55:13.000 In Lawrence of Arabia, you know, apparently that's a true story about him.
00:55:17.000 But, you know, you can train yourself to feel a different way about certain kinds of classically unpleasant stimuli.
00:55:24.000 But again, all of this fits in a complete picture that we don't yet have about just why it is certain minds are the way they are.
00:55:34.000 So the macro question for me is, given all the minds as they are, where
00:55:43.000 Should we go?
00:55:56.000 Given a million years to talk about it, we might not be able to distinguish which is better or worse.
00:56:02.000 Is Chinese food better than Thai food?
00:56:06.000 There's a range of differences there which don't matter for better or worse; it's just different.
00:56:11.000 And yet, at the end of the day, if you really preferred one and I really preferred the other,
00:56:17.000 We could find some reason why that's the case.
00:56:19.000 I mean, you might be a super-taster of certain tastes genetically, but then it's still coherent to ask, if we could really intrude in the brain and change our intuitions about better and worse, right?
00:56:34.000 If I could change your sense of the rightness or wrongness of certain actions, we could ask this additional question of,
00:56:42.000 Whether that would be good, because that would be a new way of navigating the space. And this brings us to your Brave New World question. So, if it's possible, let's say we had a...
00:57:15.000 But you could imagine someone who's just so destroyed by the experience of grief that they just can't get their life back on track.
00:57:21.000 Everyone in their life is worried about them.
00:57:23.000 They're, you know, on the verge of suicide.
00:57:25.000 At a certain point you'd say, well, let's just give you this pill and just see if we can bring a little daylight in there.
00:57:30.000 Even if you were against using it for yourself, right?
00:57:33.000 But presumably, you wouldn't want to take it 30 seconds after your kid was run over by a bus.
00:57:38.000 You know, you just see the worst thing that's ever happened in your life happen, and then you just pop this pill and you don't feel anything, you know, one way or the other about it, right?
00:57:46.000 You're ready to go to Starbucks, right?
00:57:49.000 That would be a complete fragmentation of who you are with respect to the love you feel or felt for your child, right?
00:57:58.000 Like, what does loving your child mean if
00:58:01.000 Immediately upon his or her death, you want to cancel your grief and you feel great, right? So we don't know what we'd do; it'd be hard to find the right answer. But, you know, this kind of thing is very likely coming, by the way. It's very likely we will one day have a cure for grief, and we'll have to figure out how to use it, and there will be wrong ways to use it. But I think the intuition that causes you to ask this question about Aldous Huxley and Brave New World is that
00:58:29.000 We're right to want to be anchored to reality in some sense.
00:58:32.000 And if we were ever faced with an opportunity of uploading ourselves into a simulation where just the world is a video game and nothing is real, right?
00:58:41.000 So like our states of happiness are totally divorced from the reality of our lives, right?
00:58:47.000 And our actual relationships and the conscious experience of other people, that would be a bad thing, right?
00:58:52.000 And yet, we could imagine a circumstance of maximizing pleasure in a way that's divorced from reality.
00:58:58.000 And that's an interesting argument to have, ethically, because I think our intuitions about that could change to some degree, and I think that there are ways in which we're already in something very much like a simulation.
00:59:13.000 To talk about what is real in this context is interesting.
00:59:16.000 But I share your bias here.
00:59:18.000 I share the sense that there are versions of pure pleasure, Brave New World futures where everyone's a heroin addict, perfectly medicated.
00:59:27.000 That's not good.
00:59:29.000 But I think you can adjudicate that based on
00:59:34.000 Other possible experiences in the landscape of all possible experiences that are clearly better.
00:59:40.000 And you would make an argument based on evaluating those experiences.
00:59:43.000 Okay, so this is one of those episodes where it's just, I'm going to be devastated that we don't have a second hour to actually go into all of these issues.
00:59:50.000 Because we basically scratched the surface on all of this.
00:59:53.000 But, a sort of final parting question, because we didn't even get into rationality or free will exactly.
00:59:58.000 A couple places where I'm sure we have more disagreements.
01:00:01.000 But I have a very short answer to that piece.
01:00:02.000 Perfect.
01:00:03.000 Okay, so go for it.
01:00:04.000 So, rationality, and I think I might have said this to you on stage,
01:00:08.000 at our event.
01:00:09.000 Rationality is not a... Successful moments of reasoning are not examples where it is even tempting to ascribe freedom of will.
01:00:18.000 So it's... If I give you an argument, if you strongly believe one thing, and I give you an argument that persuades you, that just knocks down the row of dominoes in your mind, that leads you to think... Right, no, I understand your argument, which is a naturalistic response to a reasonable argument, and you don't have any control over that.
01:00:34.000 But you don't have any control over...
01:00:53.000 Yes and no, because you see people who clearly resist the impact of a reasonable argument on themselves.
01:00:58.000 Yes, but the resistance is what it is to be unreasonable or to be under the sway of wishful thinking or confirmation.
01:01:04.000 Right, but the bottom line is that, from my perspective, if the idea is that reason is basically just eliciting a particular response, then people...
01:01:33.000 Is the argument in favor of reason a moral, "reason is good" argument?
01:01:35.000 Or is the argument a utilitarian, "reason is useful" argument?
01:01:38.000 Or is it both?
01:01:50.000 I can imagine a lot of ways to convince people of things that don't involve me making arguments to them and that historically have been used to great success with horrible, horrific human carnage, obviously.
01:01:59.000 Well, you're not necessarily convincing them in that case, you're just... Forcing them, right.
01:02:03.000 Yeah, that's right.
01:02:04.000 Although, I would say that you can fully indoctrinate millions of people into... You would think this too, right?
01:02:08.000 Yeah, true.
01:02:09.000 That it's possible to indoctrinate people into fully formed belief systems without using reason.
01:02:14.000 Yeah.
01:02:15.000 So, when you make the argument for reason, are you saying that reason is morally better?
01:02:18.000 And if so, why is reason morally better than, for example, the appeal of passion, which has obviously motivated billions of people over time?
01:02:25.000 Right.
01:02:25.000 Well, again, I don't think they're as separable as many people think.
01:02:29.000 I think you can't reason without emotion, and there's neurological evidence to back this up; Antonio Damasio did this work decades ago, where if you have certain neurological injuries in the orbital medial prefrontal cortex,
01:02:43.000 You can't be moved by the products of your, quote, reasoning.
01:02:47.000 Because, I mean, reason has to be anchored to emotion in a very direct way.
01:02:51.000 And, I mean, you can actually feel this in yourself.
01:02:53.000 So, if I say something that starts to sound like bullshit to you, right, that feeling of doubt, you know, the feeling that you have detected errors in my chain of reasoning, that feels like something.
01:03:03.000 That is an emotion, right?
01:03:04.000 And if you couldn't feel that, you know, if it was all just cold and calculating and just... That's why sociopaths, you know, can't reason their way to virtually anything.
01:03:11.000 Well, they can, well... Or they can't reason their way to everything.
01:03:15.000 They can reason fine, unfortunately.
01:03:18.000 They just don't care about other people's experience.
01:03:20.000 So they're very manipulative in ways that you and I wouldn't be... Reason itself doesn't arrive at a moral answer in any case.
01:03:28.000 Well, for me, reason, as we talked about at the top of this, is the only thing that takes us out of who we are and scales to some universal point of view.
01:03:44.000 It's not... You're not reasoning, if you're actually reasoning...
01:03:44.000 What do you think?
01:04:02.000 ...the effect of perspective.
01:04:08.000 Again, reason and scientific rationality, generally, is the thing that explains why, if you're colorblind, you don't see colors the way I do.
01:04:16.000 It's not that we can't get at what's actually real.
01:04:20.000 We can, because we can explain divergences of opinion.
01:04:24.000 Otherwise, you just have those divergences.
01:04:25.000 Well, we are definitely gonna have to have you back for a much longer conversation.
01:04:29.000 It's really a pleasure to have you here.
01:04:30.000 Thank you so much.
01:04:31.000 For folks who don't know Sam's podcast, and I'm sure you do know it,
01:04:35.000 it's The Waking Up Podcast, and his most recent book is Waking Up.
01:04:40.000 So check that out.
01:04:42.000 Sam, thanks so much for stopping by.
01:04:43.000 It really is a pleasure to see you.
01:04:50.000 The Ben Shapiro Show Sunday Special is produced by Jonathan Hay, Executive Producer Jeremy Boreing, Associate Producers Mathis Glover and Austin Stevens, edited by Alex Zingaro, audio mixed by Mike Coromina, hair and makeup by Jesua Olvera, and title graphics by Cynthia Angulo.
01:05:04.000 The Ben Shapiro Show Sunday Special is a Daily Wire Forward Publishing production.
01:05:08.000 Copyright Forward Publishing 2018.