Making Sense - Sam Harris - May 12, 2017


#75 — Ask Me Anything 7


Episode Stats

Length

29 minutes

Words per Minute

136.9

Word Count

3,988

Sentence Count

229

Misogynist Sentences

1

Hate Speech Sentences

9


Summary

Sam Harris responds to a listener's question about whether the notion of the self being an illusion is itself an illusion, and discusses the benefits of mindfulness and meditation. He also answers questions about his podcast with Charles Murray and The Bell Curve, Elon Musk's Neuralink, wild animal suffering, Brazilian jiu-jitsu, Christopher Hitchens, Ben Shapiro, and Jocko Willink. The podcast doesn't run ads and is made possible entirely through the support of its subscribers, so if you enjoy what we're doing here, please consider becoming one at samharris.org, where you'll find full episodes and other subscriber-only content. Copyright 2019: Making Sense Media, LLC. All rights reserved.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.240 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.720 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.820 Okay, this is an Ask Me Anything episode.
00:00:50.680 And just one housekeeping item here.
00:00:52.960 I noticed the other day that Amazon canceled my affiliates account.
00:00:59.140 This is the account that allows me to post links to books and to have some portion of
00:01:05.500 your shopping on Amazon through those links come back to support the podcast at no extra
00:01:11.800 cost to you.
00:01:13.080 And they did this because apparently I was in violation of their policy.
00:01:19.100 You can't tell your podcast listeners that following these Amazon links does support the
00:01:25.580 podcast.
00:01:26.020 I'm not sure why they consider this some kind of unethical inducement.
00:01:31.980 It's obvious that this is why podcasters and content creators use Amazon affiliates links.
00:01:38.540 But honestly, I had never read the fine print.
00:01:40.860 And I don't know how frequently it gets updated.
00:01:46.140 I was certainly not in conscious violation of their policy.
00:01:49.780 And, you know, I don't see anything unethical about either way of thinking about this.
00:01:55.860 Obviously, Amazon can have any policy they want.
00:01:57.940 But this is just to inform you that those of you who have been supporting the podcast this
00:02:04.640 way can no longer do that and that those links are now retired.
00:02:09.960 The only ways to support the podcast are through my website at samharris.org forward slash support
00:02:18.640 or through Patreon.
00:02:21.140 And you can find a link to Patreon also on my support page.
00:02:27.500 But this is, I mentioned this for another reason.
00:02:30.840 This is a larger problem that people are running into online.
00:02:34.800 People who are creating content, those who use YouTube ads, which I don't, are often finding their videos demonetized suddenly
00:02:47.160 based on some algorithmic or editorial concern about the content.
00:02:54.160 Podcasters and videographers are just finding that their online businesses evaporate overnight.
00:03:01.300 And I've heard from many ex-Muslims and secularists that their Facebook pages have been canceled
00:03:12.840 based on some perceived blasphemy or even an organized campaign launched by their religious critics.
00:03:22.400 So, it's just a fact that many content creators are very vulnerable to the decisions made by these platforms.
00:03:34.580 And it's easy to lose sight of this vulnerability when we're on social media and building a platform there.
00:03:42.960 Platforms that can be not only useful but indispensable for writers and artists and podcasters.
00:03:49.660 We are using someone else's platform.
00:03:54.160 We are essentially sharecropping for Facebook and Twitter and YouTube.
00:03:59.220 And this can all go away overnight.
00:04:03.040 It's the Wild West out here still when it comes to producing digital content.
00:04:08.260 Okay, so this is an Ask Me Anything podcast.
00:04:11.720 I went out on Twitter asking for questions.
00:04:15.300 And it looks like I got 1,400 of them.
00:04:19.820 Or at least 1,400 responses to the tweet.
00:04:23.600 Now, I'm going to go through these more quickly than usual.
00:04:28.600 In the interest of both hitting more points and seeing if I can do it.
00:04:33.680 One of the features I'm building into my meditation app is a Q&A feature.
00:04:38.920 Where I can answer questions live.
00:04:42.500 So I can announce that I'm going to be on the app for the next hour.
00:04:46.120 Come and ask questions.
00:04:47.760 And it'll be like an audio version of Periscope.
00:04:50.440 Where you can type in questions and I can respond.
00:04:53.820 And I've never done this really.
00:04:56.280 So let's see how it works.
00:04:58.700 Maybe this feature is something I don't need at all.
00:05:02.360 Because it'll just cause me to put my foot in my mouth again and again.
00:05:05.980 Okay, first question.
00:05:10.960 Is it possible that the mindfulness notion of the self being an illusion is itself an illusion?
00:05:17.300 Well, almost anything is possible.
00:05:20.200 I'll tell you why I think it's not an illusion.
00:05:23.240 The classic illusion is something that seems a certain way.
00:05:27.920 But then you pay more attention to it.
00:05:30.680 You study it more carefully.
00:05:32.940 And it seems another way.
00:05:34.460 Right, so it collapses into another form on the basis of paying more attention to the phenomenon.
00:05:44.280 This can be true of visual illusions.
00:05:46.460 You think there's a triangle there.
00:05:50.500 But then you see that the sides of the triangle don't even exist.
00:05:54.540 Right, because they've been merely implied.
00:05:58.100 So you pay more attention and you see that there is no triangle there on the page.
00:06:01.560 Even though there seems to be one.
00:06:04.720 Now, the sense of self, the sense that there's a subject in our heads, a thinker of thoughts,
00:06:12.920 that is a feeling that if you pay more careful attention to it, goes away.
00:06:18.600 And every time it comes back, once you actually know how to pay attention,
00:06:25.100 it is by virtue of being distracted, being captured by something else, being lost in thought, actually.
00:06:33.800 And then when you pay attention again, it goes away.
00:06:36.600 And once you learn how to pay attention, once you really learn how to meditate, it goes away every time.
00:06:44.860 You reliably fail to find this feeling that you've been calling I.
00:06:50.360 Now, I talk much more about this in my book, Waking Up.
00:06:56.020 I will talk much more about this in my forthcoming meditation app.
00:07:01.360 But the idea that there may really be a self that just disappears or seems to disappear every time you look for it
00:07:10.960 is no more compelling to me than the idea that there really is a triangle on the page
00:07:16.480 in the Kanizsa illusion, and that it only seems to disappear every time you look for its sides.
00:07:22.800 And if you're not familiar with the illusion I'm talking about, Google Kanizsa triangle.
00:07:30.140 And you'll see a triangle bounded by three partial circles, or what seems to be a triangle.
00:07:39.380 But again, much more on that in my book, Waking Up, and in my forthcoming app.
00:07:45.860 Tell me some real-life examples that are good for society
00:07:49.600 and that are informed by Charles Murray's research in the bell curve.
00:07:55.440 I guess I should say a few things about the Charles Murray podcast.
00:07:59.520 I got some considerable criticism for that.
00:08:03.060 Glenn Greenwald called me a racist.
00:08:05.620 No surprises there.
00:08:07.360 But I got actually much less criticism than I would have thought.
00:08:11.800 As did Charles.
00:08:12.920 I think we were both pleasantly surprised by the reception.
00:08:18.020 I think he said in his email to me that I appear to have gotten more criticism for having him on the podcast
00:08:23.780 than he was getting for being himself.
00:08:27.940 But in any case, I didn't get all that much.
00:08:30.440 I think people got the point of what I was doing there, which makes me happy.
00:08:36.940 The point of the conversation was not to talk about differences in IQ across race.
00:08:44.640 As I think I made clear, that topic doesn't really interest me.
00:08:49.640 And I share some of the skepticism communicated in this question.
00:08:53.920 When I asked Charles what the point of this kind of research was,
00:08:58.440 many of you felt that his answer was insufficient and a little confusing.
00:09:04.360 I can tell you what I took his answer to be.
00:09:06.540 He seemed to be saying that if we are misled by an irrational expectation
00:09:14.040 that intelligence must be the same statistically across populations,
00:09:19.800 then we will perceive any difference in representation of racial groups or ethnic groups
00:09:27.280 in the various walks of life as being synonymous with racism or bad policy.
00:09:33.920 To take another potentially inflammatory example,
00:09:38.560 it would be conceivable to think that because the number of Jews in the NBA
00:09:44.280 isn't exactly in register with the number of Jews in the population,
00:09:50.140 well, then there's some latent anti-Semitism operating there,
00:09:54.960 keeping Jews off the basketball court.
00:09:57.760 Now, does anyone think that?
00:09:59.840 I doubt it.
00:10:00.880 But Charles' general concern is clearly that our expectations and our policies
00:10:06.860 track real facts in the world,
00:10:10.800 and that we not go in search of problems that don't exist,
00:10:14.660 and that we not make other problems that clearly do exist worse
00:10:18.480 by giving them bad remedies.
00:10:20.420 Now, our conversation didn't go into social policy with any depth at all.
00:10:29.580 And I think at one point in the podcast, I simply said,
00:10:32.020 I'm not informed enough about the consequences of various policies
00:10:36.400 to even have that conversation.
00:10:38.540 But the real purpose of that podcast episode
00:10:42.480 was to perform a kind of exorcism on the topic
00:10:46.320 and Murray's reputation.
00:10:48.900 Again, we're talking about a man who cannot stand up on a college campus
00:10:52.600 without encountering the threat of being physically hounded off of it.
00:10:56.880 UC Berkeley, just the other day,
00:10:59.320 declared that it could not keep Ann Coulter physically safe
00:11:03.260 were she to come to the campus
00:11:05.060 to talk to the college Republicans.
00:11:07.180 Now, I don't agree with Ann Coulter about much.
00:11:10.980 I'm not at all inclined to invite her on the podcast
00:11:13.420 because I think what she says is either boring or insincere.
00:11:18.540 But it's pretty clear we are having a breakdown of civil society
00:11:23.320 when a college cannot keep her safe
00:11:26.180 and puts the onus on her, at least implicitly,
00:11:30.640 and her views and the views of those who want to hear her speak
00:11:33.940 rather than on this moral panic
00:11:37.060 that is shutting down conversation on the left.
00:11:41.500 So I wanted Murray here above all
00:11:44.000 because I realized that I had been somewhat complicit
00:11:47.080 in his defamation
00:11:49.020 merely by my benign neglect of his work.
00:11:53.120 Once it became clear to me
00:11:54.380 that he was a well-intentioned and careful scholar,
00:11:58.300 whatever the merits of his research in fact are,
00:12:01.800 he was not at all the golem
00:12:04.580 that had been created by the hysteria on the left.
00:12:08.880 So, I had that conversation with Charles.
00:12:11.960 I enjoyed it.
00:12:13.380 Most of you seem to find it quite illuminating.
00:12:17.880 And I have no regrets there.
00:12:21.020 We have to be able to talk about facts
00:12:24.420 without at every turn
00:12:26.820 claiming that those with whom we disagree are evil.
00:12:29.740 If you want to see some criticism of The Bell Curve
00:12:33.020 that came out contemporaneous with its publication,
00:12:36.120 in this Twitter feed,
00:12:38.240 Michael Shermer linked to one of the articles
00:12:41.360 published in Skeptic magazine.
00:12:43.860 If you go to skeptic.com
00:12:46.160 and search Bell Curve,
00:12:48.060 presumably you'll get some older articles there,
00:12:51.540 at least one of which is critical.
00:12:53.320 But, I should say that there's nothing that I have heard
00:12:58.260 since my podcast with Charles
00:13:00.880 that suggests to me that he was misrepresenting
00:13:04.440 the state of the science
00:13:06.200 circa 2017.
00:13:09.740 In fact, one person I heard from was Richard Haier,
00:13:14.540 an emeritus professor at the University of California, Irvine.
00:13:17.520 Richard is a PhD in psychology
00:13:21.500 who studies the neurobiology of intelligence.
00:13:26.020 And he's written a very recent book
00:13:27.720 for Cambridge University Press
00:13:29.560 entitled The Neuroscience of Intelligence.
00:13:33.400 I have now read part of that book.
00:13:35.760 Just arrived the other day.
00:13:37.920 And what's clear from the parts I've read
00:13:40.280 and from his email to me
00:13:42.660 is that the basic science
00:13:45.300 that Murray was discussing
00:13:47.000 has held up.
00:13:49.120 As controversial as it still is in some quarters,
00:13:52.580 the notion of general intelligence
00:13:55.380 seems valid.
00:13:58.580 IQ tests can test for it.
00:14:01.680 There's no reason to think that
00:14:03.040 we are unable to do this
00:14:05.680 in an unbiased way.
00:14:08.340 And the results of these tests
00:14:11.080 are predictive of a wide variety of outcomes
00:14:15.000 educationally, occupationally, and otherwise.
00:14:20.060 And there seems to be absolutely no question
00:14:23.380 that intelligence is highly heritable
00:14:26.940 and correlates with neurophysiological facts
00:14:31.600 at the level of the brain.
00:14:34.360 So again, this is not my area of special interest.
00:14:38.280 And none of this is to claim that
00:14:40.240 intelligence is the only thing
00:14:42.400 that dictates success in life.
00:14:44.880 I'm sure many of you know
00:14:46.360 some very smart people
00:14:47.560 who haven't done much of anything
00:14:49.640 with their lives
00:14:50.860 or have done some very questionable things.
00:14:54.260 I certainly know such people.
00:14:56.580 And no doubt we will find out more
00:14:58.640 about the brain basis of intelligence
00:15:00.800 in the coming years.
00:15:01.760 whether we will be able to augment it
00:15:04.760 directly by brain-machine interface.
00:15:09.600 That's another question
00:15:10.680 that has come up, I see, repeatedly here.
00:15:13.940 Many of you have asked
00:15:14.960 what I think of Elon Musk's
00:15:17.200 new company Neuralink
00:15:18.620 and his goal of building
00:15:22.780 a brain-computer interface
00:15:25.160 that not only will be useful
00:15:27.160 for people suffering
00:15:28.800 neurological injury or disease
00:15:30.340 but will be so useful
00:15:32.080 and so readily adopted
00:15:34.520 that we will all become cyborgs
00:15:37.240 and plug our brains
00:15:39.420 directly into the cloud.
00:15:41.660 Well, first I should say
00:15:42.920 that I don't have any
00:15:43.580 inside information on this.
00:15:45.260 I actually haven't spoken to Elon
00:15:46.380 about this much
00:15:47.780 apart from early conversations
00:15:50.380 about the fact that he was doing this.
00:15:53.220 Most of what I know about the company
00:15:55.000 I learned recently
00:15:56.720 from Tim Urban's blog post
00:15:59.240 about Elon and Neuralink
00:16:01.720 which you can read
00:16:03.140 on the Wait But Why blog.
00:16:05.600 And if you don't know Tim Urban
00:16:06.620 and his blog
00:16:07.440 you really should.
00:16:09.160 He's fantastic.
00:16:10.860 He's another one of these
00:16:11.840 content creators
00:16:12.840 who you can support on Patreon
00:16:15.640 as I do.
00:16:18.300 He's amazing.
00:16:20.240 I'll have him on the podcast
00:16:21.480 at some point
00:16:22.240 because he's doing something
00:16:23.700 very unique.
00:16:25.000 And he's written these very long
00:16:27.480 really book-length blog posts
00:16:29.660 on Elon and his various companies.
00:16:32.140 He's done it for SpaceX and Tesla
00:16:34.060 and he just did one
00:16:36.460 for Neuralink.
00:16:38.820 So you can read there
00:16:40.600 just how daunting
00:16:42.320 the technical challenges are
00:16:44.500 in doing this.
00:16:46.020 Just what it means
00:16:47.060 to put an array
00:16:48.020 of whatever material composition
00:16:50.380 on the cortex
00:16:51.820 or implant anything
00:16:54.100 into the brain
00:16:55.580 hoping to be able to read out
00:16:58.580 the activity
00:16:59.560 of vast numbers of neurons
00:17:02.360 so as to get the data
00:17:04.280 of conscious and unconscious thought
00:17:06.280 out into the world
00:17:08.560 much less reading programs
00:17:10.360 from the world
00:17:11.060 back into the brain
00:17:12.560 so as to influence
00:17:14.100 its functioning.
00:17:15.140 This is an incredibly
00:17:17.180 daunting challenge.
00:17:18.660 I think it's
00:17:19.040 no exaggeration
00:17:20.000 to say
00:17:20.300 this is the
00:17:20.960 most ambitious
00:17:22.280 thing
00:17:23.420 technically
00:17:24.420 that we can imagine.
00:17:26.760 When you consider
00:17:27.880 the possibility
00:17:28.680 of helping
00:17:29.420 people
00:17:30.260 whose brains
00:17:31.760 have been damaged
00:17:32.600 either by injury
00:17:33.940 or illness
00:17:34.440 well then this is
00:17:35.320 totally uncontroversial.
00:17:37.480 It's a wonderful thing
00:17:38.780 to be attempting
00:17:39.960 and there's already
00:17:42.140 some progress
00:17:42.940 on those fronts
00:17:44.220 but when you
00:17:45.560 imagine the bigger picture
00:17:47.140 of
00:17:48.260 fundamentally
00:17:49.520 augmenting
00:17:50.420 human intelligence
00:17:51.300 so that we're not
00:17:52.740 in a losing
00:17:53.460 competition
00:17:54.380 with the machines
00:17:55.480 of the future
00:17:56.080 well obviously
00:17:57.760 there are a few
00:17:58.680 assumptions there
00:17:59.460 that will be
00:18:00.380 controversial
00:18:00.900 and at least
00:18:02.800 one
00:18:03.360 that strikes me
00:18:04.860 as
00:18:05.420 potentially
00:18:06.520 far-fetched
00:18:07.760 and that's
00:18:08.940 the assumption
00:18:09.320 that
00:18:09.960 it will be
00:18:10.480 possible to do
00:18:11.260 this in a way
00:18:11.740 that is sufficiently
00:18:12.640 non-invasive
00:18:14.200 so that
00:18:15.280 we'll all want
00:18:16.580 to have
00:18:17.180 our brains
00:18:18.320 connected to the cloud.
00:18:20.320 Anything that
00:18:21.220 requires
00:18:21.800 neurosurgery
00:18:23.400 obviously
00:18:24.240 is setting the
00:18:25.380 bar
00:18:25.660 pretty high
00:18:26.580 and
00:18:27.420 it remains
00:18:28.340 to be seen
00:18:28.880 just how
00:18:29.520 non-invasive
00:18:30.420 a brain-machine
00:18:31.720 interface
00:18:32.340 can become
00:18:33.160 but the
00:18:34.120 technical challenges
00:18:35.100 are fairly
00:18:36.380 astounding
00:18:37.340 and
00:18:38.180 Tim Urban's
00:18:39.540 blog post
00:18:40.040 will
00:18:40.240 give you
00:18:41.060 a good
00:18:41.280 sense
00:18:41.580 of what
00:18:42.380 they are.
00:18:44.220 Do you
00:18:45.120 think reducing
00:18:45.960 wild animal
00:18:47.060 suffering
00:18:47.520 is a moral
00:18:48.280 blind spot
00:18:48.940 of modern
00:18:49.780 humans
00:18:50.260 or a moral
00:18:51.160 error?
00:18:51.560 I remember
00:18:54.000 hearing about
00:18:55.580 some vegans
00:18:56.500 who thought
00:18:57.580 it a moral
00:18:58.500 duty
00:18:58.880 to prevent
00:19:00.660 various predators
00:19:02.300 on the African
00:19:03.680 savannah
00:19:04.220 from killing
00:19:05.360 their prey.
00:19:07.340 Who knows if that
00:19:08.460 was just a
00:19:08.980 slander of vegans
00:19:10.140 but I'm sure
00:19:11.340 somebody's capable
00:19:12.160 of thinking that.
00:19:13.240 It is a kind of
00:19:14.060 reductio ad absurdum
00:19:15.180 of an ethical
00:19:16.620 concern for
00:19:17.400 animals
00:19:17.820 but the underlying
00:19:19.320 fact is that
00:19:20.820 nature is
00:19:22.300 not a
00:19:24.060 theater
00:19:25.100 of moral
00:19:26.160 concern.
00:19:27.360 Really,
00:19:28.160 it is an
00:19:28.580 abattoir.
00:19:29.840 Everything
00:19:30.240 is getting
00:19:31.300 eaten.
00:19:33.080 Every
00:19:33.840 animal
00:19:35.240 with the
00:19:36.180 exception of
00:19:36.820 the apex
00:19:37.920 predators
00:19:38.480 lives in
00:19:39.740 perpetual flight
00:19:40.840 from the other
00:19:42.180 animals that
00:19:42.860 want to make
00:19:43.560 it a meal.
00:19:45.140 There is no
00:19:46.280 way to
00:19:47.120 intercede
00:19:48.820 here that
00:19:49.920 doesn't
00:19:50.440 directly cause
00:19:51.960 the starvation
00:19:53.300 and therefore
00:19:54.280 misery and
00:19:55.180 death of
00:19:56.420 some other
00:19:57.380 species.
00:19:59.220 And then when
00:19:59.600 you add the
00:20:00.260 layer of
00:20:01.360 contagious
00:20:02.420 illness and
00:20:04.000 parasites,
00:20:05.860 right, the
00:20:06.240 fact that
00:20:06.720 every creature
00:20:07.560 is more or
00:20:09.040 less all the
00:20:09.520 time being
00:20:10.020 victimized by
00:20:11.260 various worms
00:20:12.940 and amoebas,
00:20:15.500 it's pretty
00:20:16.160 clear that
00:20:16.880 there is no
00:20:17.880 all-seeing and
00:20:18.760 all-powerful,
00:20:19.860 compassionate
00:20:20.380 god who
00:20:21.660 set this
00:20:22.120 place up
00:20:22.740 for general
00:20:24.700 equanimity.
00:20:25.940 So, yeah, I
00:20:26.880 don't see how
00:20:27.440 we intercede on
00:20:28.280 behalf of the
00:20:28.800 rabbits and
00:20:29.380 the pigeons
00:20:30.440 and take a
00:20:31.780 position against
00:20:32.760 the foxes and
00:20:34.160 coyotes and
00:20:34.980 hawks.
00:20:35.680 I do feel a
00:20:37.980 little strange
00:20:38.580 about people
00:20:39.260 who keep
00:20:40.340 pet snakes
00:20:41.400 and repeatedly
00:20:44.240 feed them
00:20:45.040 mammals like
00:20:46.780 mice and
00:20:47.340 rats.
00:20:48.360 There's a
00:20:49.040 cognitive
00:20:49.480 hierarchy there
00:20:50.500 that I
00:20:51.000 wouldn't want
00:20:51.460 to keep
00:20:52.660 standing on
00:20:53.920 the wrong
00:20:54.320 side of
00:20:54.980 day after
00:20:56.180 day.
00:20:56.840 I think the
00:20:57.780 rats and
00:20:58.260 the mice
00:20:58.700 suffer more
00:20:59.900 than the
00:21:00.660 snakes.
00:21:02.140 That could
00:21:02.780 just be my
00:21:03.220 warm-blooded
00:21:03.820 bias, but
00:21:05.200 the neurological
00:21:05.680 details would
00:21:06.980 back me up
00:21:07.520 there.
00:21:09.220 Next question.
00:21:11.260 How is
00:21:12.160 Brazilian
00:21:13.080 jiu-jitsu
00:21:13.620 coming?
00:21:15.060 Slowly, as
00:21:16.260 ever.
00:21:17.480 I absolutely
00:21:18.540 love it.
00:21:19.440 I'm still in
00:21:20.240 the mode of
00:21:21.020 perpetually
00:21:21.620 mitigating
00:21:22.420 injury, so
00:21:23.200 I don't do
00:21:23.780 it nearly as
00:21:24.260 much as I
00:21:24.680 would want
00:21:25.360 to.
00:21:26.460 I think I
00:21:27.300 will probably
00:21:27.960 be a blue
00:21:29.040 belt for the
00:21:29.900 rest of my
00:21:30.320 life at this
00:21:31.500 rate, but it
00:21:33.040 does remain
00:21:33.680 one of the
00:21:34.300 most gratifying
00:21:35.740 hours I can
00:21:37.180 spend doing
00:21:38.080 anything.
00:21:39.760 What are your
00:21:40.400 thoughts on
00:21:40.800 Kevin Kelly's
00:21:41.780 article, The
00:21:42.980 Myth of
00:21:43.500 Superhuman
00:21:44.080 AI?
00:21:45.300 Actually, I've
00:21:46.160 read the
00:21:46.440 article, and
00:21:47.840 Kevin got in
00:21:49.140 touch with me,
00:21:49.760 and we're going
00:21:50.340 to do a
00:21:51.000 podcast, I
00:21:52.620 think in about
00:21:53.100 a month here.
00:21:54.860 I have to
00:21:55.220 check the
00:21:55.520 calendar, but
00:21:56.740 it's already
00:21:57.300 scheduled, and
00:21:58.280 I look forward
00:21:58.740 to that.
00:21:59.040 We disagree
00:21:59.800 about many
00:22:01.420 things on
00:22:02.460 this topic, and
00:22:03.520 that should
00:22:04.260 be a fun
00:22:04.740 conversation.
00:22:06.240 I think we
00:22:06.600 disagree about
00:22:07.160 religion and
00:22:08.260 a few other
00:22:08.820 things, too, so
00:22:09.500 I'm looking
00:22:10.660 forward to
00:22:11.000 that.
00:22:12.540 How do you
00:22:12.800 think your
00:22:13.100 friend, the
00:22:13.700 late, great
00:22:14.260 Christopher
00:22:14.680 Hitchens, would
00:22:15.880 have dealt
00:22:16.240 with the
00:22:16.560 Trump
00:22:16.800 presidency?
00:22:18.840 Well,
00:22:20.600 eloquently, no
00:22:22.160 doubt, and he
00:22:23.420 is missed more
00:22:24.260 than ever at
00:22:25.020 this point, I
00:22:25.500 would say.
00:22:26.740 Once again, many
00:22:27.380 people are under
00:22:28.120 the impression that
00:22:28.900 he hated the
00:22:29.620 Clintons so
00:22:30.640 much that he
00:22:31.660 would have
00:22:31.940 obviously sided
00:22:33.180 with Trump.
00:22:34.980 Given what I
00:22:36.180 know about
00:22:36.720 Hitch, that
00:22:37.780 seems almost
00:22:38.800 perfectly delusional.
00:22:41.020 I honestly
00:22:42.120 cannot imagine a
00:22:43.900 candidate and
00:22:44.900 his surrogates who
00:22:46.260 are more at
00:22:47.020 odds with
00:22:48.240 Hitch's deepest
00:22:49.500 intellectual values.
00:22:51.340 The lack of
00:22:52.340 honesty and real
00:22:55.080 intellectual engagement
00:22:56.240 with history and
00:22:58.600 with policy and
00:23:00.460 facts as they
00:23:01.540 can be known.
00:23:04.200 But unfortunately,
00:23:05.240 we do not have
00:23:05.780 the pleasure of
00:23:06.620 his company now.
00:23:08.800 And if you
00:23:10.120 think I'm soft
00:23:10.920 on the Clintons,
00:23:12.060 go back and
00:23:13.140 listen to my
00:23:13.660 podcast with
00:23:14.440 Andrew Sullivan
00:23:15.100 that we did in
00:23:16.320 the run-up to
00:23:17.260 the election.
00:23:18.720 I certainly
00:23:19.480 share most of
00:23:20.740 Hitch's view
00:23:22.540 of both
00:23:23.620 Clintons.
00:23:24.120 But we now
00:23:26.880 have the
00:23:27.300 president we
00:23:27.800 have.
00:23:29.400 And barring
00:23:30.060 some impeachment
00:23:30.940 proceeding relative
00:23:32.780 to the Russian
00:23:34.100 hacking scandal,
00:23:35.400 it would seem we
00:23:36.520 have to make the
00:23:36.940 best of it for
00:23:38.480 the time being.
00:23:40.900 Will you do a
00:23:42.060 podcast with Ben
00:23:43.120 Shapiro on
00:23:44.120 religion?
00:23:44.540 Many people
00:23:46.760 have asked that
00:23:47.320 I do something
00:23:48.140 with Ben.
00:23:48.960 I am certainly
00:23:49.720 open to it.
00:23:50.400 In fact, Dave
00:23:51.620 Rubin has
00:23:53.040 threatened to
00:23:53.740 get us together
00:23:54.380 on his show.
00:23:55.760 I'm not entirely
00:23:56.620 sure what we
00:23:57.720 would get into,
00:23:58.860 but Ben is
00:23:59.540 obviously smart,
00:24:00.640 and we disagree
00:24:01.480 about many things,
00:24:02.380 although Trump
00:24:03.120 is not one of
00:24:03.920 them.
00:24:04.340 Ben is a,
00:24:05.080 last I looked,
00:24:06.100 not a Trump
00:24:06.660 fan.
00:24:07.800 But he is a
00:24:09.000 conservative,
00:24:09.860 and I believe
00:24:10.840 he's a conservative
00:24:11.620 or even
00:24:13.340 Orthodox Jew.
00:24:15.000 Not sure,
00:24:15.740 but no doubt
00:24:17.020 there's something
00:24:17.400 to disagree about
00:24:18.100 there, and
00:24:18.880 I am open to
00:24:20.500 it.
00:24:20.940 Not sure when
00:24:21.520 it's going to
00:24:22.080 happen, however.
00:24:24.460 Some of you
00:24:25.100 have noticed
00:24:25.640 my trolling
00:24:27.380 of Jocko
00:24:28.140 Willink and
00:24:29.520 his trolling
00:24:30.400 of me online.
00:24:32.060 I forget how
00:24:32.580 this started.
00:24:34.000 I think I
00:24:34.560 once revealed
00:24:35.720 that I was
00:24:36.760 a fan of
00:24:37.500 Downton Abbey
00:24:38.340 on a podcast
00:24:39.220 and started
00:24:41.100 getting
00:24:41.320 slammed for
00:24:42.160 it on
00:24:43.000 Twitter,
00:24:43.640 and then
00:24:44.760 I just
00:24:45.600 went out,
00:24:46.180 I just
00:24:46.520 lurched at
00:24:47.720 Jocko on
00:24:49.140 Twitter saying
00:24:49.780 something like,
00:24:50.460 well, I
00:24:52.040 happen to know
00:24:52.520 that when
00:24:52.780 Jocko's doing
00:24:53.560 his deadlifts
00:24:54.400 in his
00:24:54.820 basement at
00:24:55.480 four in the
00:24:56.000 morning, he's
00:24:57.200 watching old
00:24:57.960 episodes of
00:24:58.780 Downton Abbey.
00:25:00.160 Isn't that
00:25:00.500 right, Jocko?
00:25:01.600 Those of you
00:25:02.300 who don't know
00:25:02.660 Jocko, Jocko
00:25:03.440 is a Navy
00:25:04.440 SEAL and
00:25:05.920 now New York
00:25:07.160 Times number
00:25:07.820 one best-selling
00:25:08.620 author of
00:25:09.880 the book
00:25:10.860 Extreme
00:25:11.340 Ownership
00:25:11.840 with his
00:25:13.420 Navy SEAL
00:25:14.380 co-author
00:25:14.980 Leif Babin
00:25:16.140 and Jocko's
00:25:17.140 also a
00:25:17.660 jujitsu
00:25:18.380 black belt.
00:25:19.980 He's about
00:25:20.680 the most
00:25:21.480 macho guy
00:25:22.440 you will
00:25:23.360 ever meet
00:25:23.820 but also
00:25:24.900 one of the
00:25:25.600 nicest guys.
00:25:26.960 Anyway, people have been trying to get us together to debate free will, I think because his core ethic and productivity hack is to take what he calls "extreme ownership" over the things that happen in your life. When your efforts come to naught, don't blame the world, don't blame other people; you have to own the whole process. That's how you improve, and that's how you inspire more trust in the people collaborating with you. There's a lot to be said for the wisdom of doing that, and people seem to think it is at odds with my view of free will. It may be in certain cases, but generally speaking, I don't think it is. In any case, people want us to debate free will, which I would be happy to do.
00:26:16.440 So we've been pitted against each other online: Jocko has been threatening to demolish me in debate, and I have been threatening to physically demolish him. I would encourage those of you who listen to the Jocko Podcast to listen to his more recent episodes and try to detect any sign of fear in him. He can only conceal it for so long. He's got to be under a lot of pressure, knowing that I'm out here training.
00:26:50.780 I've been hearing from some people here who used to be religious and got reasoned out of their religion by me and others. It's amazing that many people think this never happens. It happens all the time, and I continually see the evidence of it. So this idea that you can't reason people out of their faith is just not true. What you generally can't do is do it within the hour of a scheduled debate, or otherwise on demand. But it happens.
00:27:25.960 How would someone lose their faith? Just a blow to the head, or a good scare? No, generally it's one of two ways: they either discover the ways in which various doctrines don't make a lot of sense, or they don't want the sort of life that seems to be dictated by any serious adherence to revelation. So it's some combination of those two: either it doesn't make sense, or it doesn't lead somewhere good.
00:28:00.740 But the first reason, I think, is always the more compelling. People will make fairly impressive sacrifices to their own happiness, and even to the happiness of their children, if they believe that the doctrines that justify and even mandate those sacrifices are true. So the question of truth is everyone's concern.
00:28:23.840 This is another point of confusion: the idea that religious people, religious fundamentalists, aren't really concerned about what's true. No, they're as concerned as anyone. In fact, they're far more concerned about truth than many so-called religious moderates.
00:28:42.740 If you'd like to continue listening to this conversation, you'll need to subscribe at samharris.org. Once you do, you'll get access to all full-length episodes of the Making Sense podcast, along with other subscriber-only content, including bonus episodes, AMAs, and the conversations I've been having on the Waking Up app. The Making Sense podcast is ad-free and relies entirely on listener support, and you can subscribe now at samharris.org.