Making Sense - Sam Harris - February 16, 2017


#64 — Ask Me Anything 6


Episode Stats

Length

42 minutes

Words per Minute

143.1

Word Count

6,085

Sentence Count

324

Misogynist Sentences

3

Hate Speech Sentences

8


Summary

In the wake of my appearance on Bill Maher's Real Time, where I was attacked from both the left and the right, I answer listener questions in this AMA episode of the Making Sense Podcast. Topics include the response to that interview, my documentary collaboration with Majid Nawaz, my conversation with Jordan Peterson about the nature of truth, Trump's so-called Muslim ban, the riots that greeted Milo at Berkeley, the state of the left, and the problem of echo chambers in social media. As always, thank you for all the questions you sent in. In order to access full episodes of the podcast, you'll need to subscribe at samharris.org, where you'll find our private RSS feed along with other subscriber-only content. We don't run ads on the podcast; it's made possible entirely through the support of our subscribers. So if you enjoy what we're doing here, please consider becoming one.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.240 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.880 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.760 Okay, there's been a lot going on.
00:00:49.500 Does anyone else feel like this year has been going fast?
00:00:55.740 We're about six weeks into it.
00:00:58.360 Feels like six months.
00:01:00.160 Feels like Trump has been president for six months.
00:01:03.520 Jesus.
00:01:05.660 In any case, I am doing an AMA podcast today.
00:01:09.900 So I went out to all of you on Twitter and Facebook.
00:01:14.240 And I gotta say, whenever I do this, the response is just hugely gratifying and overwhelming.
00:01:20.480 I get, no exaggeration, thousands of questions whenever I go out to you guys.
00:01:27.560 So thank you for that.
00:01:29.980 Needless to say, I can only answer a tiny fraction of them.
00:01:34.040 But many of you hit similar topics in similar ways.
00:01:39.240 So I'm aggregating a fair amount here.
00:01:43.040 Also recovering from a cold.
00:01:44.500 Hopefully that won't play too much havoc with your listening pleasure.
00:01:49.200 And I don't mention anyone's names when I do Q&As like this because, again, I do aggregate questions.
00:01:58.060 I occasionally reword them a little bit to make them more on point.
00:02:03.380 So if you asked any of these, you will no doubt recognize your handiwork.
00:02:09.640 But sorry not to give you credit because I can't really keep track of how I change things here.
00:02:16.120 And also I can only assume that some of you actually want to remain anonymous.
00:02:19.860 And given that I haven't communicated with you directly about this, I will err on the side of safety.
00:02:25.740 Before I get to the questions, I will do some brief housekeeping.
00:02:28.880 Just to put all of this in context, I just did Bill Maher's show.
00:02:34.420 And you can see the response to that playing out online.
00:02:39.180 I felt that interview was a bit of a tightrope walk given the previous time I'd been on the show.
00:02:46.380 And I'm reasonably satisfied that the whole story came out in those 12 minutes.
00:02:53.660 So that's good.
00:02:56.020 Of course, that doesn't prevent people from the left and the right going crazy in response to it.
00:03:03.480 And it's really been instructive to see that there's virtually no space to occupy between the extreme left and the extreme right
00:03:13.640 that doesn't get you attacked by both sides on this issue.
00:03:18.840 By virtue of that conversation, I am getting attacked as an Islamist shill and a racist xenophobe.
00:03:28.980 It's incredible.
00:03:30.300 There is no place.
00:03:31.420 It is not even a razor's edge where you can stand to make sense on this issue at the moment.
00:03:38.700 So in any case, if you've missed that, you can see that on my blog.
00:03:42.160 It's on YouTube.
00:03:43.440 I've embedded it on my blog.
00:03:45.180 And thanks to Bill for having me on.
00:03:46.760 It's always good to talk to him.
00:03:49.600 I was also just in New York with Majid Nawaz, and we finished filming this documentary on our collaboration.
00:03:57.060 I don't know when that's coming out, but I will keep you all apprised of that.
00:04:02.740 And it was great to see Majid again face to face.
00:04:05.520 It's always instructive in the aftermath of an interview like the one I did with Bill to receive Majid's hate mail, which is just mind-boggling.
00:04:18.760 The self-proclaimed moderates who attack Majid and Ayaan for their bigotry, it just proves how far we have to go.
00:04:29.160 I just noticed, for instance, among the usual suspects, and it really is the usual suspects, there's Bina Shah, who is a columnist for the New York Times and didn't like my conversation with Bill at all.
00:04:45.680 And she disavows Majid and Ayaan, and then says that she loves reformers like Tariq Ramadan.
00:04:57.120 This is the Tariq Ramadan who, when asked whether stoning women for adultery was okay, he recommended that there be a moratorium on it.
00:05:07.260 We just pause this edict for a while so that we can consider its wisdom.
00:05:12.120 That's how far he would go.
00:05:14.740 It's unbelievable.
00:05:16.860 This woman writes for the New York Times.
00:05:19.100 So, if nothing else, it proves this is a necessary conversation.
00:05:23.500 And, again, to clarify, and I said this in my interview with Bill, I don't think I'm going to reform Islam.
00:05:32.880 That is obvious.
00:05:33.720 I am urging Muslims to reform Islam and to speak honestly about the need for reform.
00:05:42.120 And if you think reform need go no further than a moratorium on stoning women to death for adultery, your theocracy is showing.
00:05:54.160 And the fact that you could be that confused as a woman New York Times columnist is fairly jaw-dropping.
00:06:02.040 Okay, first question.
00:06:05.540 Any update on the project manager position?
00:06:08.700 Many questions of this sort came in.
00:06:10.980 Yes, we are still in process over here.
00:06:13.740 There have been over 900 applications at this point.
00:06:18.980 So, closing in on 1,000, I actually need this position filled in order to vet the applicants, unfortunately.
00:06:27.340 But I do have some help with the vetting.
00:06:30.220 In fact, I'm not doing the first round.
00:06:32.240 I will see only the final 50 or so.
00:06:36.100 But, yeah, there's been a lot of interest, and I look forward to hiring that person.
00:06:42.220 That would be very helpful.
00:06:43.940 Question two.
00:06:45.320 Many questions on my conversation with Jordan Peterson.
00:06:48.280 Jordan is the clinical psychologist I had on two podcasts back, and we got bogged down in a conversation about scientific epistemology on the question of truth.
00:06:58.700 Um, many listeners seem confused about my reasons for not accepting Peterson's version of truth, which amounted to some odd form of pragmatism pegged to our ultimate survival as a species.
00:07:13.840 If you recall, according to Peterson, a claim is true if it helps us survive, and false or not true enough if it doesn't.
00:07:22.140 But I see so much wrong with this claim that it was really hard to know where to begin.
00:07:29.120 And, um, well, I don't think I said this in the podcast.
00:07:32.180 One wonders whether this claim applies to itself.
00:07:35.480 You know, is this claim about truth only true if it helps us survive?
00:07:40.320 And what if it doesn't?
00:07:41.580 Does it then bite its own tail and just disappear?
00:07:44.880 Do you see the problem there?
00:07:45.940 But I went round and round with Peterson for two hours on this, and this prevented us from getting into topics that listeners really wanted us to explore.
00:07:55.420 Again, he was my most requested podcast guest ever.
00:07:59.500 Now, some of Peterson's fans blamed me for this entirely, and they were alleging mostly that I'm a materialist,
00:08:08.580 and that I'm somehow dogmatically opposed to the idea that mind might play some role in defining reality or parts of it.
00:08:18.380 But that's just not true, and it's not even relevant as far as I can see, even if it were true.
00:08:24.460 If mind helps create reality, I would just claim that we can stand outside those facts as well
00:08:31.120 and say they are true whether or not anyone knows them, right?
00:08:36.740 So, for instance, if it's true to say that the moon really isn't there unless someone is looking at it,
00:08:44.940 which is to say consciousness is somehow constitutive of its being in reality,
00:08:52.140 well, that fact about the mind's power would be true whether or not any of us know about it or understand it, right?
00:09:03.280 So, you can still get a realistic picture of truth being as spooky as you want about the mind.
00:09:11.480 All I was arguing for was that there are facts of the matter whether or not anyone understands them,
00:09:17.780 and some of these facts have nothing to do with the survival of the species.
00:09:22.040 Now, some other defenders of Peterson have argued that I just don't understand pragmatism.
00:09:30.180 Okay, but that's not true either, as far as I can tell.
00:09:33.900 Pragmatism in its usual form has to make sense of the kinds of challenges I was posing to Peterson.
00:09:40.680 Okay, but Peterson's version wasn't doing that.
00:09:42.740 Pragmatism isn't just predicated on survival.
00:09:47.540 It's predicated on what works in conversation, what actually conserves the data,
00:09:54.560 what makes our statements about the world seem to cohere,
00:09:59.200 and the kinds of statements that square with our experience,
00:10:03.220 the kinds that become predictive of future experience scientifically.
00:10:06.900 All of that is what it means to be pragmatic, in the usual sense.
00:10:12.380 Most of that has nothing to do with the survival of the species.
00:10:16.960 So, again, statements about prime numbers can be understood pragmatically
00:10:21.480 when I make a claim that there is a prime number higher than any we have represented.
00:10:29.140 Unfortunately, that's actually a paradox,
00:10:30.620 to say that there's a prime number larger than any we have represented
00:10:34.200 is in fact to represent it, but leaving that aside, right,
00:10:38.500 let's talk about explicitly representing it,
00:10:41.180 which is to say it hasn't been discovered yet.
00:10:43.980 There are different ways to think about that being true,
00:10:47.660 but a pragmatic way is just to say, well, it certainly seems true.
00:10:51.860 Those kinds of statements function and conserve the experience
00:10:56.360 of what it's like to be us continually discovering new prime numbers
00:11:01.360 or seeming to discover them.
00:11:02.780 It doesn't mean that there is a reality outside of our conversation
00:11:07.640 where prime numbers really exist.
00:11:09.760 That's what the pragmatist wants to say.
00:11:12.280 Now, of course, the mathematical idealist wants to say
00:11:15.120 that there is some realm of number on some level
00:11:17.800 to be discovered by sentient beings like ourselves,
00:11:22.580 and it exists in some sense whether or not we discover it.
00:11:28.800 This is the kind of thing I got into with Max Tegmark,
00:11:30.840 who seems to be fairly idealistic on this topic.
00:11:34.280 In any case, normal pragmatism can skate across that thin ice fairly elegantly,
00:11:42.820 if not persuasively,
00:11:44.300 but a pragmatism that suggests that every statement about prime numbers
00:11:50.320 must be resolved in terms of the survival of apes like ourselves,
00:11:54.300 that doesn't make any sense.
00:11:56.820 But I couldn't seem to get Peterson to acknowledge that.
00:12:00.740 And most of you, the vast majority of you, it seems to me,
00:12:07.160 thought I made that case fairly well,
00:12:11.080 and therefore you agreed with me that Peterson's concept of truth was pretty wacky,
00:12:16.640 or at least that he wasn't communicating it well.
00:12:18.940 But many of you still faulted me as a podcast host
00:12:23.340 for not being gracious enough to just move on to other topics once we reach that impasse.
00:12:29.320 Now, I totally understand that criticism,
00:12:31.660 and I even, I think, anticipated it at some point in the podcast,
00:12:35.580 and I might have even learned something from it.
00:12:38.460 We'll see.
00:12:38.960 We'll see what I'm like next time I get bogged down like that with a guest.
00:12:42.500 But the truth is, I'm not a normal podcast host.
00:12:47.660 I view these exchanges as conversations, not really as interviews,
00:12:52.020 though occasionally it does play out a little bit like an interview.
00:12:56.560 But I'm usually trying to have a conversation,
00:13:00.360 and I'm trying to pressure test my own views
00:13:04.060 and refine my own understanding of the world.
00:13:07.300 So if the person I'm talking to isn't making any sense,
00:13:12.000 at least to me,
00:13:13.200 I really want to get to the bottom of what the problem is.
00:13:16.840 And now this is necessarily intrusive,
00:13:19.620 because on many of these points,
00:13:21.100 really one of us has to be wrong,
00:13:23.680 or at least confused.
00:13:26.200 And in this case, the disagreement was so fundamental,
00:13:29.720 and I knew Jordan wanted to move on to topics like
00:13:33.000 the existence of Jungian archetypes, for instance.
00:13:37.300 And I just couldn't see how we were going to make sense on that topic
00:13:40.960 if we couldn't agree about what it means to say that something is true, right?
00:13:46.700 I mean, how do you distinguish fact from fantasy?
00:13:51.160 We couldn't converge there, really, at all.
00:13:55.400 And the next topics on the menu were things like archetypes,
00:13:59.240 and mythology, and the reality of Christian doctrine.
00:14:04.380 And I wanted to get on to those topics,
00:14:07.180 because I knew how much our listeners wanted us to get there.
00:14:12.100 I was making increasingly desperate attempts
00:14:14.960 to try to get to some consensus so that we could move on.
00:14:19.040 And this had somewhat the character of
00:14:21.120 my attempting to perform an exorcism,
00:14:23.560 which didn't work.
00:14:25.600 It was like that scene in The Exorcist
00:14:27.460 when Max von Sydow does his whole spiel in Latin
00:14:30.620 and still winds up with a face full of green vomit.
00:14:34.980 Anyway, at the end of that podcast,
00:14:37.920 those of you who heard it know,
00:14:40.680 if you got to the end,
00:14:42.320 I put it out to all of you
00:14:44.280 to crowdsource the postmortem on it
00:14:47.960 to tell us what happened
00:14:49.460 and to decide whether we should go forward
00:14:52.280 and have a second conversation
00:14:53.540 on other topics.
00:14:55.040 And I put out a poll on Twitter,
00:14:59.640 which about 30,000 of you responded to,
00:15:03.560 and 81% of you said,
00:15:05.980 yes, we should have another conversation.
00:15:08.840 Now, I got to think that poll went fairly viral
00:15:13.240 among Jordan's crowd,
00:15:15.320 because on my own social media channels,
00:15:18.120 I got a lot of complaints about the conversation,
00:15:21.600 and I'd be very surprised if 81% of my listeners
00:15:25.960 want to hear Jordan and I go round and round again.
00:15:30.120 But I will take this recommendation seriously.
00:15:33.200 I don't yet know what I'm going to do.
00:15:35.300 It's somewhat amusing and somewhat disconcerting
00:15:38.860 that a fairly frequent criticism
00:15:42.540 from Jordan's crowd
00:15:45.080 seems to be that I didn't let the conversation move on
00:15:49.900 because I was afraid that Jordan
00:15:53.080 was going to dismantle my atheism,
00:15:56.040 that he was going to say something there
00:15:57.640 that was so powerful or so well-reasoned
00:16:00.880 that he would have revealed my doubts about God
00:16:03.600 to be completely bankrupt.
00:16:06.440 I got to say I'm open to that,
00:16:07.900 but the fact that anyone thinks
00:16:10.020 that is the reason why I didn't move on,
00:16:15.880 I got to hope that those of you
00:16:19.680 who are regular listeners to the podcast
00:16:21.220 know me better than that.
00:16:23.920 Anyway, I will let you know
00:16:25.500 if Jordan's coming back on.
00:16:27.840 I'm going to have a few more podcast guests
00:16:30.000 in the meantime before I rethink that.
00:16:33.280 I guess the implications of putting it to a vote
00:16:35.320 would be that I would simply do
00:16:37.320 whatever the majority of you say I should do.
00:16:40.480 I'm not sure this is actually a democracy.
00:16:44.300 I may be a little more autocratic than that,
00:16:46.340 but I certainly hope Jordan realizes
00:16:48.900 there are no hard feelings.
00:16:50.480 I just, in everything he has said since the podcast,
00:16:55.260 some of which I responded to on my blog,
00:16:57.880 none of that has clarified his position to me.
00:17:02.360 And if we do go for a second round,
00:17:08.100 I think we really do have to avoid
00:17:10.260 getting bogged down again the way we did.
00:17:13.040 So I have to figure out some kind of guidelines
00:17:15.840 so that we can actually have a conversation
00:17:18.560 that is productive and not excruciating for all of you.
00:17:23.680 So more on that when I figure it out.
00:17:26.960 Question three.
00:17:28.840 Many of you asked me about my views
00:17:30.700 on the so-called Muslim ban.
00:17:33.080 I'm just answering this now
00:17:34.540 just to say that I wrote a blog post about it
00:17:37.400 and then I was on Bill Maher's show to talk about it.
00:17:40.020 And both of those are on my blog.
00:17:42.200 Perhaps I'll just say that in my last meeting with Majid,
00:17:44.560 we spoke about it and he had a good distinction
00:17:47.540 or a reformulation of what is reasonable here,
00:17:51.420 which I fully agree with.
00:17:52.820 I think I said something like,
00:17:54.700 it's hard to get away from the logic
00:17:57.320 of some kind of religious test.
00:17:59.560 It is actually relevant.
00:18:02.000 Once you realize you're looking for jihadists,
00:18:05.340 it is relevant to know whether somebody is a Salafi Muslim,
00:18:09.680 right?
00:18:10.260 Because he would stand more of a chance of being a jihadist
00:18:14.200 than a Unitarian Universalist would.
00:18:18.440 But the way Majid talks about this,
00:18:22.000 we just want to know about people's beliefs and attitudes, right?
00:18:25.620 We're looking for illiberal beliefs.
00:18:28.700 And yes, it is true that Islam has more than its fair share of people
00:18:35.340 who are fundamentally illiberal at this moment,
00:18:38.460 who don't support free speech,
00:18:41.400 who think apostates should be killed, say.
00:18:43.700 But we are looking for illiberalism of that sort in general.
00:18:49.020 And if there's some new cult born tomorrow
00:18:51.160 that produces the same kind of illiberalism,
00:18:54.860 well then we'd want to stop those people at the border too,
00:18:57.860 if we could.
00:18:59.180 So a Muslim ban doesn't make any sense.
00:19:03.260 But nor does it make any sense to say you can't ask people
00:19:08.620 detailed questions about their worldview
00:19:11.400 in the process of vetting them.
00:19:13.460 Of course you have to ask people,
00:19:15.000 how would you feel if your daughter married outside your religion, say?
00:19:19.400 And there's a wrong answer to that question.
00:19:21.760 If you say, well, I would cut her head off,
00:19:23.940 we don't want you in the country, right?
00:19:25.820 And we are right not to want you in the country.
00:19:28.400 And it's instructive that there are Muslim organizations
00:19:32.980 that don't want those sorts of questions asked
00:19:35.740 in the vetting process.
00:19:37.620 Of course, as a very common theme with me,
00:19:39.900 this comes down to ideas and beliefs
00:19:42.020 and the degree to which people subscribe to them.
00:19:45.480 Because this is the best predictor of
00:19:47.720 what they will do in the world.
00:19:50.520 And we care about what people will do
00:19:53.760 and how likely they are to assimilate
00:19:56.940 into our society in a productive way.
00:19:59.340 And we're right to care about those things.
00:20:01.700 So if you want any more on Trump's executive order
00:20:05.420 and why I don't think it was a good idea,
00:20:08.380 you can see those blog articles
00:20:09.740 and the aforementioned interview with Bill Maher.
00:20:15.220 Next question.
00:20:17.080 Milo at Berkeley.
00:20:19.120 Wow, that was amazing.
00:20:22.020 Well, I guess I will just point out the obvious
00:20:25.040 that that was one of the best things
00:20:28.660 that could have ever happened to Milo
00:20:31.520 in terms of proving his points,
00:20:35.620 both the legitimate and illegitimate ones,
00:20:39.280 and raising his stature, right?
00:20:42.560 I mean, just what a short-sighted, idiotic,
00:20:46.060 counterproductive thing to do.
00:20:48.560 And what worries me about this moment politically
00:20:50.940 is that the left seems capable
00:20:53.160 of doing everything wrong
00:20:56.160 in response to the rise of the so-called alt-right
00:21:01.000 and the Trump presidency.
00:21:03.540 This antipathy to free speech,
00:21:06.760 this idea that rioting to prevent a lecture
00:21:11.920 is an example of liberal free speech in action.
00:21:15.760 And that is just so confused and destructive
00:21:19.020 that I'm tempted to say
00:21:20.920 that the left is just irredeemable at this point.
00:21:24.080 There seems to be so little insight.
00:21:28.600 And coming fresh out of my interview with Bill Maher,
00:21:32.600 I can see this.
00:21:34.320 I mean, there are people who have tweeted at me
00:21:36.860 and written to me
00:21:38.060 who heard in my discussion with Bill
00:21:41.780 a horrifying expression of racist hatred
00:21:48.540 or are pretending to have heard such a thing.
00:21:51.520 And this kind of judgment is, again,
00:21:54.620 echoed by the usual suspects on the left.
00:21:58.460 I mean, that position is so crazy
00:22:00.440 that I just don't know how to interact with it.
00:22:02.700 So it's not an accident that people on the right
00:22:05.260 can't see any way to interact with it.
00:22:08.360 I mean, all I can say is that
00:22:09.420 if I'm a bigot and a racist and a xenophobe,
00:22:15.040 if that's how I appear to you
00:22:17.000 based on what I said on real time,
00:22:20.180 what words are you going to use
00:22:22.420 for the real bigots and racists and xenophobes?
00:22:26.520 And what I've said before about Milo,
00:22:28.400 Milo is a, at this point,
00:22:31.060 kind of a professional troll, right?
00:22:33.380 I mean, some of his criticism of the left
00:22:36.480 is no doubt sincere,
00:22:38.640 but he's a kind of performance artist.
00:22:41.720 I mean, he's just winding up the left.
00:22:43.580 And, you know, perhaps I've missed it,
00:22:47.800 but I haven't seen anything from him
00:22:50.600 that is real racist bigotry.
00:22:54.760 Please take this caveat on board.
00:22:56.900 I have not read all of Milo's stuff or much of it.
00:23:01.000 Maybe there's something I've missed.
00:23:02.800 Feel free to point that out to me.
00:23:04.340 But the Milo I've seen is very far
00:23:07.240 from being a neo-Nazi
00:23:09.780 or someone whose attitudes
00:23:12.640 are truly of the right.
00:23:16.280 That's probably not an accident.
00:23:18.820 I mean, he's flamboyantly gay
00:23:20.500 and half-Jewish, I believe.
00:23:23.400 I don't know how right-wing
00:23:24.780 he could be in the end.
00:23:27.240 But this response at Berkeley
00:23:29.960 wouldn't even be warranted
00:23:32.640 if he was actually a KKK member.
00:23:35.180 Again, the moment you're using violence
00:23:38.700 to prevent someone from speaking,
00:23:42.160 you are on the wrong side of the argument
00:23:44.780 by definition.
00:23:46.740 How is that not obvious on the left
00:23:49.400 at this moment?
00:23:51.920 You're going to, what,
00:23:53.220 burn down your own university
00:23:55.020 to prevent someone from expressing views
00:23:58.700 that you could otherwise just criticize?
00:24:01.240 All of these protests were seen in response
00:24:04.400 to right-wing or quasi-right-wing speakers
00:24:09.320 being invited to college campuses
00:24:11.680 by, I'm sure, the campus Republicans.
00:24:14.680 These are so uncivil and unproductive.
00:24:18.280 And again, this is almost entirely
00:24:20.180 a phenomenon of the left.
00:24:21.800 If you heard generically
00:24:23.600 that some college campus
00:24:25.900 had erupted in violence
00:24:27.400 because a student mob
00:24:29.580 had prevented a lecture from taking place,
00:24:34.160 and the people who wanted to hear that lecture
00:24:35.740 were spat upon
00:24:36.940 as they tried to enter the hall
00:24:38.420 and finally attacked,
00:24:41.620 you could bet with, what,
00:24:43.400 99% confidence
00:24:45.540 that this was coming from the left.
00:24:49.820 Now, in the age of Trump,
00:24:51.640 when you really want to be able
00:24:53.340 to say things
00:24:55.280 against creeping right-wing authoritarianism,
00:24:59.800 having an authoritarian
00:25:01.760 anti-free speech movement
00:25:03.980 subsume the left
00:25:05.720 is a disaster politically.
00:25:08.680 But I actually think the left
00:25:10.680 is irredeemable at this point.
00:25:13.440 And this is why I've begun to use the phrase
00:25:15.400 the new center.
00:25:16.700 I think we need a new center to our politics.
00:25:19.680 I mean, I don't know how you ever get
00:25:21.520 the people writing for the Intercept
00:25:23.580 or the people on the Young Turks
00:25:25.600 to be reasonable human beings
00:25:27.700 given what they've done in recent years.
00:25:31.100 And so that's the left
00:25:32.180 as it currently stands.
00:25:34.420 Of course, it's no accident
00:25:35.340 that the Women's March,
00:25:37.680 which otherwise seemed like a great thing,
00:25:40.860 was vitiated by its alliance
00:25:43.220 with Linda Sarsour
00:25:44.840 and these closeted
00:25:47.080 and semi-closeted Islamists
00:25:49.420 who have co-opted
00:25:51.520 the women's movement
00:25:52.360 and convinced millions of women,
00:25:55.440 apparently,
00:25:55.980 that the hijab
00:25:57.180 is a sign of women's empowerment.
00:26:00.060 That's fairly mind-boggling.
00:26:01.940 Just so there's no confusion on this point,
00:26:04.040 I think you, dear listener,
00:26:06.560 should be free to wear the hijab
00:26:08.700 if you want to.
00:26:10.100 But you should also recognize
00:26:11.240 that most women the world over
00:26:13.660 who are veiled
00:26:15.160 to one or another degree
00:26:16.720 are living that way,
00:26:18.460 not out of choice
00:26:19.780 or certainly not
00:26:20.620 out of what could be considered
00:26:22.100 a free choice.
00:26:23.000 They're living in the context
00:26:24.500 of a community
00:26:25.840 that will treat them like whores
00:26:29.140 or worse
00:26:30.280 if they don't veil themselves.
00:26:32.900 Right?
00:26:33.380 That's not the political empowerment
00:26:35.420 of women.
00:26:36.520 And someone like Linda Sarsour,
00:26:39.240 again,
00:26:39.680 one of the principal organizers
00:26:41.120 of the Women's March,
00:26:41.980 is a theocrat
00:26:44.220 who lies about this,
00:26:46.400 who attacks Ayaan Hirsi Ali.
00:26:48.700 This is how the left will die,
00:26:51.600 by, on the basis of its own
00:26:53.240 moral relativism,
00:26:55.080 locking arms with Islamism
00:26:57.040 and stealth theocracy,
00:26:59.380 which is what it has done.
00:27:01.740 I mean, just as you know,
00:27:03.020 if you travel too far right
00:27:04.440 on the political spectrum,
00:27:06.440 you will encounter
00:27:07.260 the most repulsive,
00:27:09.940 the most callous,
00:27:11.600 the most authoritarian attitudes.
00:27:14.240 I think you should know
00:27:15.140 that if you travel too far left,
00:27:17.320 you will encounter
00:27:18.780 a kind of moral confusion
00:27:20.920 and identity politics
00:27:22.740 that is,
00:27:24.600 in its actual application
00:27:26.180 to the world,
00:27:27.540 little better.
00:27:28.980 And I don't see how that changes
00:27:30.880 at this point.
00:27:32.660 Next question.
00:27:33.520 How do you think
00:27:35.060 we can reasonably expect
00:27:36.120 to break the echo chamber
00:27:37.620 mentality in social media
00:27:39.420 and online information?
00:27:41.340 Do you think it's possible
00:27:42.100 or do you expect
00:27:42.760 our conversation
00:27:43.560 to grow increasingly
00:27:44.700 factionalized?
00:27:46.700 This is a good question
00:27:48.360 to which I really
00:27:50.000 don't have a good answer
00:27:52.120 apart from my acknowledging
00:27:54.340 that this is just
00:27:55.120 a huge problem.
00:27:56.820 This has to be high
00:27:57.540 on everyone's list
00:27:58.580 of problems
00:27:59.640 that really could make it hard
00:28:01.340 to maintain
00:28:02.140 our way of life.
00:28:03.680 We're talking about
00:28:04.700 how human beings
00:28:05.940 reach a common understanding
00:28:07.620 of reality, right?
00:28:09.400 How do we get
00:28:10.340 our view of the facts
00:28:12.120 to converge
00:28:13.060 and how do we get
00:28:14.220 the moral norms
00:28:16.100 that should guide
00:28:17.520 our behavior
00:28:18.160 to become aligned
00:28:19.760 collectively?
00:28:21.680 And if we're not
00:28:23.580 dealing with the same facts,
00:28:26.800 if my news sources
00:28:29.120 are fake news,
00:28:30.980 according to your own,
00:28:32.980 and vice versa,
00:28:34.420 it is hard to see
00:28:35.380 how we will make
00:28:36.080 any progress.
00:28:37.620 This isn't just about
00:28:38.380 agreeing that climate change
00:28:40.020 is a problem.
00:28:41.400 This is everything.
00:28:42.760 This is the wars we fight,
00:28:44.420 the laws we pass,
00:28:46.080 the research we fund
00:28:47.460 or don't fund.
00:28:48.820 It is everything.
00:28:50.520 There is a difference
00:28:51.940 between truth and lies.
00:28:54.580 There is a difference
00:28:55.620 between real news
00:28:57.000 and fake news.
00:28:58.040 There is a difference
00:28:59.180 between actual conspiracies
00:29:01.300 and imagined ones.
00:29:04.600 And we cannot afford
00:29:06.020 to have hundreds of millions
00:29:09.180 of people in our own society
00:29:11.200 on the wrong side
00:29:13.080 of those epistemological chasms.
00:29:16.820 And we certainly can't afford
00:29:17.820 to have members
00:29:18.520 of our own government
00:29:19.500 on the wrong side of them.
00:29:21.400 As I've said many,
00:29:22.480 many times before,
00:29:23.780 all we have is conversation,
00:29:25.780 right?
00:29:25.980 You have conversation
00:29:27.060 and violence.
00:29:28.280 That's how we can influence
00:29:29.580 one another.
00:29:31.060 When things really matter
00:29:32.620 and words are insufficient,
00:29:35.800 people show up with guns.
00:29:38.120 That is the way things are.
00:29:40.800 So we have to create
00:29:42.160 the conditions
00:29:42.900 where conversations work.
00:29:46.280 And now we are living
00:29:47.160 in an environment
00:29:47.920 where words have become
00:29:50.360 almost totally ineffectual.
00:29:52.900 And this is what has been
00:29:55.560 so harmful,
00:29:56.760 I would say,
00:29:57.620 about Trump's candidacy
00:29:59.840 and his first few weeks
00:30:01.260 as president.
00:30:02.220 Just the degree to which
00:30:04.220 the man lies
00:30:05.560 and the degree to which
00:30:07.400 his supporters do not care,
00:30:10.820 that is one of the most
00:30:12.000 dangerous things
00:30:13.140 to happen
00:30:14.300 in my lifetime,
00:30:16.220 politically.
00:30:17.200 There simply has to be
00:30:18.780 a consequence
00:30:19.420 for lying
00:30:20.620 on this level.
00:30:22.780 And the retort
00:30:24.360 from a Trump fan
00:30:25.260 is,
00:30:25.900 well,
00:30:26.220 all politicians lie.
00:30:28.140 No,
00:30:28.780 all politicians
00:30:29.640 don't lie
00:30:30.920 like this.
00:30:32.400 What we are witnessing
00:30:33.580 with Trump
00:30:34.320 and the people around him
00:30:35.880 is something
00:30:37.200 quite new.
00:30:39.080 Even if I grant
00:30:40.080 that all politicians
00:30:41.200 lie a lot,
00:30:42.660 I don't even know
00:30:43.500 if I should grant that.
00:30:44.980 All politicians
00:30:45.620 lie sometimes,
00:30:47.100 say.
00:30:47.400 But even in their lying,
00:30:50.180 they have to endorse
00:30:51.680 the norm
00:30:52.760 of truth-telling.
00:30:55.060 That's what it means
00:30:55.920 to lie successfully
00:30:57.280 in politics
00:30:58.980 in a former age
00:31:00.840 of the earth.
00:31:01.840 You can't
00:31:02.700 be obviously lying.
00:31:05.660 You can't
00:31:06.300 obviously be repudiating
00:31:08.560 the very norm
00:31:10.300 of honest communication.
00:31:12.680 But Trump has repudiated that norm,
00:31:14.500 and the people around him
00:31:15.760 have gotten caught
00:31:17.000 in the same vortex.
00:31:19.340 It's almost like
00:31:19.800 a giddy nihilism
00:31:21.280 in politics,
00:31:22.740 right?
00:31:23.080 Where it's just,
00:31:24.420 you just say
00:31:25.040 whatever you want
00:31:26.280 and it doesn't matter
00:31:28.200 if it's true.
00:31:29.380 Just try to stop me
00:31:31.020 is the attitude.
00:31:32.540 It's unbelievable.
00:31:34.540 So finally,
00:31:35.460 on this point,
00:31:35.860 I would just say
00:31:36.300 that finding ways
00:31:38.280 to span
00:31:41.180 this chasm
00:31:42.840 between people,
00:31:44.780 finding ways
00:31:45.680 where we can reliably
00:31:46.740 influence one another
00:31:48.180 through conversation
00:31:50.020 based on shared norms
00:31:52.380 of argumentation
00:31:54.400 and self-criticism,
00:31:56.060 that is the operating system
00:31:58.380 we need.
00:31:59.460 That is the only thing
00:32:01.600 that stands between us
00:32:03.760 and chaos.
00:32:04.600 There are the people
00:32:07.620 who are trying
00:32:08.140 to build that,
00:32:08.880 and there are the people
00:32:10.240 who are trying
00:32:10.800 to tear it down,
00:32:11.640 and now one of those people
00:32:13.620 is president.
00:32:15.280 And again,
00:32:15.560 I really don't think
00:32:16.400 this is too strong.
00:32:17.620 Trump is,
00:32:19.160 by all appearances,
00:32:21.260 consciously destroying
00:32:22.840 the fabric
00:32:23.720 of civil conversation
00:32:26.160 and his supporters
00:32:27.860 really don't seem to care
00:32:29.900 and I'm sure
00:32:31.080 that those of you
00:32:31.900 who support him
00:32:32.760 will think I'm just whinging now
00:32:34.900 in a spirit of partisanship,
00:32:37.060 right?
00:32:37.420 That's why I'm against Trump.
00:32:39.040 I'm a Democrat
00:32:39.760 or I'm a liberal.
00:32:41.220 That's just not the case.
00:32:43.280 Most normal Republican candidates
00:32:45.700 who I might dislike
00:32:47.600 for a variety of reasons,
00:32:49.580 Marco Rubio
00:32:50.660 or Jeb Bush
00:32:52.340 or even a quasi-theocrat
00:32:54.480 like Ted Cruz,
00:32:56.020 would still function
00:32:57.620 within the normal channels
00:32:59.620 of attempting
00:33:01.900 a fact-based conversation
00:33:03.700 about the world.
00:33:06.160 Their lies would be
00:33:07.180 normal lies
00:33:08.320 and when caught,
00:33:10.720 there'd be a penalty to pay.
00:33:12.520 They would lose face.
00:33:14.760 Trump has no face to lose.
00:33:17.000 This is an epistemological potlatch.
00:33:20.560 Do you know what a potlatch is?
00:33:21.760 It's a traditional
00:33:23.300 native practice
00:33:24.600 of burning up
00:33:26.580 your wealth,
00:33:28.520 burning up
00:33:29.220 your prized possessions
00:33:30.460 so as to prove
00:33:32.160 how wealthy you are,
00:33:35.040 right?
00:33:36.380 Look at me.
00:33:37.660 I can burn down
00:33:38.560 my own house.
00:33:39.960 This is a potlatch
00:33:41.340 of civil discourse.
00:33:44.080 Every time Trump speaks,
00:33:46.600 he's saying,
00:33:47.620 I don't have to make sense.
00:33:49.980 I'm too powerful
00:33:51.080 to even have to make sense.
00:33:54.900 That is his message.
00:33:57.760 And half the country,
00:34:00.520 or nearly half,
00:34:02.660 seems to love it.
00:34:04.620 So when he's caught in a lie,
00:34:06.860 he has no face to lose.
00:34:10.360 Trump is chaos.
00:34:12.900 And one of the measures
00:34:14.140 of how bad he seems to me
00:34:16.240 is that I don't even care
00:34:18.020 about the theocrats
00:34:19.500 he has brought to power
00:34:20.660 with him.
00:34:21.380 And there are many of them.
00:34:23.100 You know,
00:34:23.280 he has brought in
00:34:24.440 Christian fundamentalists
00:34:26.460 to a degree
00:34:27.020 that would have been
00:34:27.580 unthinkable
00:34:28.440 10 years ago.
00:34:29.900 And 10 years ago,
00:34:30.540 I was spending a lot of time
00:34:31.940 worrying about the rise
00:34:33.120 of the Christian right
00:34:34.020 in this country.
00:34:35.320 Well,
00:34:35.500 it has risen
00:34:36.440 under Trump,
00:34:37.880 but
00:34:38.420 honestly,
00:34:39.640 it seems like
00:34:40.480 the least of our problems
00:34:41.560 at this moment.
00:34:43.600 And it's amazing
00:34:44.440 for me to say that,
00:34:46.020 given what it means
00:34:47.420 and might yet mean
00:34:48.840 to have people like Pence
00:34:50.600 and Jeff Sessions
00:34:51.800 and the other
00:34:53.120 Christian fundamentalists
00:34:55.060 in his orbit
00:34:56.420 empowered in this way.
00:34:58.700 Next question.
00:35:00.900 Are you still giving
00:35:01.760 $3,500
00:35:02.560 each month
00:35:03.980 from the podcast
00:35:05.100 to the Against Malaria Foundation,
00:35:07.840 as you spoke about
00:35:08.580 in your podcast
00:35:09.480 with Will MacAskill?
00:35:11.460 Yes.
00:35:12.300 Yes,
00:35:12.620 I'm doing that.
00:35:13.300 That's happening
00:35:13.980 automatically.
00:35:15.540 I'm not continuing
00:35:16.500 to talk about it
00:35:17.380 so as not to wear
00:35:18.700 my philanthropy
00:35:19.780 on my sleeve.
00:35:20.620 But that was the
00:35:21.320 result of my conversation
00:35:22.980 with Will.
00:35:23.560 I highly recommend
00:35:24.580 you listen to that podcast
00:35:25.780 because Will MacAskill
00:35:27.000 is fantastic.
00:35:28.460 I just came out
00:35:29.400 of that feeling
00:35:29.980 that
00:35:31.280 however
00:35:32.840 conflicted I might be
00:35:34.920 about
00:35:35.300 the results
00:35:36.320 of any podcast,
00:35:38.420 however conflicted
00:35:39.460 I might be
00:35:39.780 about the use
00:35:40.540 of my time
00:35:41.320 in any given month,
00:35:42.860 however conflicted
00:35:43.560 I might be
00:35:44.100 around asking
00:35:45.480 listeners to support
00:35:46.840 the podcast,
00:35:48.160 I wanted to know
00:35:48.960 that at minimum
00:35:49.960 I was doing
00:35:51.260 some good in the world
00:35:52.160 and the value
00:35:54.240 of saving a human life
00:35:55.640 each month
00:35:56.320 really can't be disputed.
00:35:58.720 And $3,500
00:35:59.620 is still
00:36:00.680 the
00:36:01.520 statistical minimum
00:36:03.260 for what it takes
00:36:04.040 to save a life
00:36:04.780 through the most
00:36:05.460 efficient means,
00:36:06.260 which is still
00:36:08.060 anti-malarial
00:36:09.000 bed nets.
00:36:10.500 So anyway,
00:36:11.400 listen to my
00:36:12.260 conversation with Will
00:36:13.220 and you may find it
00:36:14.880 as inspiring
00:36:15.720 as I did.
00:36:17.360 Okay,
00:36:18.280 next question.
00:36:19.780 One argument
00:36:20.600 I've heard from
00:36:21.140 someone who believes
00:36:21.820 in God
00:36:22.440 and in afterlife
00:36:23.760 is that
00:36:24.480 quote,
00:36:25.060 energy can never
00:36:25.860 be destroyed.
00:36:27.220 I assume what
00:36:27.760 is meant by this
00:36:28.320 is that consciousness
00:36:29.020 survives the body
00:36:30.140 as a soul,
00:36:31.140 perhaps.
00:36:32.040 I think this is
00:36:32.600 nonsense but I don't
00:36:33.300 really have a good
00:36:33.800 enough comeback
00:36:34.340 for it.
00:36:35.200 What would your
00:36:35.580 response be?
00:36:37.180 Well,
00:36:37.500 it's not a matter
00:36:38.420 of energy
00:36:39.980 so much as it is
00:36:41.840 information and
00:36:43.180 organization
00:36:43.660 when you're talking
00:36:44.880 about minds
00:36:45.600 and even living
00:36:46.780 systems.
00:36:47.780 The difference
00:36:48.280 between a living
00:36:48.940 system and a dead
00:36:50.280 one is not
00:36:51.760 merely a difference
00:36:54.020 in matter or
00:36:55.780 energy.
00:36:56.900 When you die,
00:36:57.760 you don't suddenly
00:36:58.760 become physically
00:36:59.620 lighter.
00:37:01.520 Actually,
00:37:01.880 when your body
00:37:02.240 begins to cool,
00:37:03.560 you have to become
00:37:04.220 a little lighter
00:37:04.820 because you're
00:37:05.560 losing kinetic
00:37:06.160 energy but I
00:37:07.900 doubt the effect
00:37:08.860 is measurable.
00:37:11.140 There was actually
00:37:11.580 a doctor at the
00:37:12.260 beginning of the
00:37:12.780 20th century,
00:37:14.000 I think,
00:37:14.380 named Duncan
00:37:14.980 MacDougall, who
00:37:16.300 assumed that the
00:37:18.280 soul must have
00:37:19.080 mass and therefore
00:37:20.480 he weighed people
00:37:22.060 at the moment of
00:37:22.760 death and he
00:37:24.000 claimed to have
00:37:24.560 found that the
00:37:25.700 weight of the
00:37:26.180 human body
00:37:26.800 diminished by
00:37:28.120 something on the
00:37:29.340 order of 21
00:37:30.040 grams.
00:37:31.380 I think he also
00:37:32.680 did experiments on
00:37:32.680 dogs and found
00:37:33.580 that there was
00:37:33.980 no weight
00:37:34.460 difference.
00:37:35.440 And this
00:37:35.620 confirmed the
00:37:36.800 thesis that
00:37:37.340 unlike human
00:37:38.520 beings, dogs
00:37:39.960 have no souls.
00:37:41.580 Right?
00:37:42.000 Well, obviously
00:37:43.340 there's no reason
00:37:44.760 to believe any of
00:37:45.360 this is true,
00:37:46.240 but you can
00:37:47.000 sympathize with
00:37:47.760 the good doctor's
00:37:48.420 thinking there.
00:37:50.080 It's really not a
00:37:50.680 question of matter
00:37:52.160 or energy going
00:37:53.720 somewhere else.
00:37:55.160 Nobody thinks
00:37:55.860 that heat energy
00:37:56.880 is the basis
00:37:58.320 of your conscious
00:37:59.560 life.
00:38:00.080 In fact, you're
00:38:01.580 losing heat every
00:38:02.460 moment now; you're
00:38:03.380 just producing more
00:38:04.600 of it.
00:38:05.380 It's not like your
00:38:05.940 mind has migrated
00:38:08.380 out into the
00:38:09.440 environment because
00:38:10.680 some of the
00:38:12.080 molecular energy in
00:38:13.160 your body has.
00:38:15.560 So whatever
00:38:16.120 consciousness is,
00:38:18.180 whatever its
00:38:18.640 relationship is to
00:38:20.320 the brain, if it
00:38:21.920 is the product of
00:38:23.680 what the brain is
00:38:24.920 doing, it is the
00:38:26.220 product of the
00:38:27.600 organized information
00:38:29.460 processing in the
00:38:31.000 brain.
00:38:31.940 And once that
00:38:33.160 ceases to be
00:38:34.540 organized, once
00:38:35.920 those processes
00:38:36.760 stop, once neurons
00:38:38.860 are no longer
00:38:39.440 firing, once their
00:38:41.060 connections begin to
00:38:42.740 break down, it's
00:38:44.120 not so much a matter
00:38:45.940 of matter and
00:38:46.840 energy being lost,
00:38:47.900 it's a matter of
00:38:49.060 activity ceasing.
00:38:52.300 Where does a song
00:38:53.580 go when you stop
00:38:55.200 singing?
00:38:55.660 Where does a dance
00:38:58.080 go when you stop
00:38:59.600 dancing?
00:39:00.920 Do they still exist
00:39:02.360 in some way?
00:39:03.720 The distinction
00:39:04.300 between having a
00:39:06.080 mind and not having
00:39:07.120 one, or being alive
00:39:09.440 and being dead, is
00:39:11.080 more like that.
00:39:11.840 It's more like a
00:39:13.440 verb than a noun.
00:39:16.180 Living bodies do
00:39:17.540 things that dead
00:39:19.060 bodies don't.
00:39:20.300 And when they stop
00:39:21.180 doing those things,
00:39:22.640 they're dead.
00:39:23.220 Systems that
00:39:25.500 process information
00:39:26.960 and could be the
00:39:28.420 basis of minds
00:39:29.560 are doing things
00:39:31.360 that disorganized
00:39:33.580 systems don't.
00:39:35.960 And when they
00:39:36.540 become disorganized,
00:39:38.500 they cease to do
00:39:39.440 those things.
00:39:40.320 So this is a bad
00:39:41.320 analogy, this idea
00:39:43.120 that the conscious
00:39:44.700 mind is energy
00:39:46.260 and energy can't be
00:39:47.800 destroyed.
00:39:49.220 Energy can be
00:39:50.200 converted into forms
00:39:51.600 that are no longer
00:39:52.740 useful, where it
00:39:54.080 can no longer do
00:39:54.760 work, where it
00:39:56.000 contains no more
00:39:57.260 information.
00:39:58.840 This is entropy.
00:40:01.220 And we are
00:40:02.740 fighting entropy
00:40:03.480 every moment of
00:40:04.300 our lives.
00:40:04.780 And when we die,
00:40:07.280 entropy wins.
00:40:09.740 If you think in
00:40:10.660 terms of process,
00:40:12.840 it's a little easier
00:40:13.640 to see that
00:40:14.180 processes can become
00:40:15.420 disordered and
00:40:16.700 disrupted, right,
00:40:18.660 and finally cease.
00:40:21.860 So this is not
00:40:23.000 where I would put my
00:40:23.660 hopes for
00:40:24.940 immortality.
00:40:27.420 Next question.
00:40:29.580 What would you say
00:40:30.040 to someone who
00:40:30.640 claims that the
00:40:31.380 humanities are an
00:40:32.440 unnecessary waste of
00:40:33.480 money, because they
00:40:34.780 have no immediate
00:40:35.340 practical purpose, and
00:40:37.060 thus should not be
00:40:37.760 taught at universities
00:40:38.720 or given funds for
00:40:39.820 research?
00:40:40.740 I refer to subjects
00:40:41.660 such as history,
00:40:42.580 sociology, or
00:40:43.480 philosophy.
00:40:45.200 Well, I'm a
00:40:46.160 huge fan of the
00:40:48.060 sciences, obviously,
00:40:50.040 and also a critic of
00:40:53.180 some of the
00:40:54.260 ideological trends in
00:40:57.460 the humanities; much
00:40:59.060 of the derangement of
00:41:00.580 the left on college
00:41:01.920 campuses that I've
00:41:02.920 spoken about could
00:41:04.580 be laid at the
00:41:05.220 doorsteps of many
00:41:06.740 of the departments in
00:41:07.740 the humanities. But
00:41:09.100 speaking generally,
00:41:10.080 there's much more to
00:41:11.100 living a life worth
00:41:12.380 living and having a
00:41:14.040 mind worth having
00:41:15.480 than just
00:41:16.320 understanding the
00:41:17.260 world scientifically
00:41:18.380 or producing better
00:41:20.340 technology.
00:41:22.060 The humanities are
00:41:22.820 absolutely central to
00:41:25.080 intellectual life and
00:41:26.620 ethical life, and
00:41:28.540 while there really
00:41:29.080 is an infinite
00:41:30.060 amount to learn, and
00:41:31.440 I wish I had
00:41:32.260 studied some things
00:41:33.920 differently as an
00:41:35.260 undergraduate, I'm
00:41:36.140 very happy to have
00:41:36.940 done my undergraduate
00:41:38.280 degree in philosophy
00:41:39.180 because it gets you
00:41:41.640 thinking and arguing
00:41:43.360 clearly about more or
00:41:45.640 less everything, or at
00:41:46.680 least potentially can do
00:41:47.960 that, and I think
00:41:49.880 that's extremely
00:41:50.380 important.
00:41:50.940 So, you
00:41:51.760 know, while it's not
00:41:52.400 obvious what the jobs
00:41:53.460 are for most people
00:41:54.520 coming out of a
00:41:55.440 philosophy degree, when
00:41:57.060 people ask me whether
00:41:57.800 I recommend a degree
00:41:59.340 in philosophy.
00:42:01.180 If you'd like to
00:42:02.040 continue listening to
00:42:02.780 this conversation, you'll
00:42:04.180 need to subscribe at
00:42:05.160 samharris.org.
00:42:06.520 Once you do, you'll get
00:42:07.600 access to all full-length
00:42:08.760 episodes of the Making
00:42:09.580 Sense podcast, along with
00:42:11.200 other subscriber-only
00:42:12.140 content, including bonus
00:42:13.840 episodes and AMAs and the
00:42:16.080 conversations I've been
00:42:16.880 having on the Waking Up
00:42:17.760 app.
00:42:18.680 The Making Sense podcast
00:42:19.580 is ad-free and relies
00:42:21.380 entirely on listener
00:42:22.340 support, and you can
00:42:23.740 subscribe now at
00:42:25.020 samharris.org.