Making Sense - Sam Harris - January 10, 2017


#60 — An Evening with Richard Dawkins and Sam Harris (2)


Episode Stats

Length: 36 minutes
Words per Minute: 142
Word Count: 5,176
Sentence Count: 303
Misogynist Sentences: 6
Hate Speech Sentences: 20


Summary

Sam Harris sits down with Richard Dawkins before a sold-out live audience, the second of two such events in a single week, to work through questions submitted online by listeners. Richard opens with a verse about the minor stroke he suffered in February and describes the experience and his recovery. The conversation then ranges over whether our descendants 500 years from now will be recognizably human, the prospects for genetic engineering, how seriously to take the analogy between memes and genes, the extended phenotype and whether religion might qualify as one, widely misunderstood topics in science such as the power of natural selection, and how to publicly challenge the dangerous tenets of Islam without inspiring bigotry against Muslims, touching on Ayaan Hirsi Ali and ISIS's magazine Dabiq. Note that this is a partial episode. Full episodes of the Making Sense Podcast are available to subscribers at samharris.org via a private RSS feed, along with other subscriber-only content. The podcast runs no ads and is made possible entirely through listener support, so if you enjoy it, please consider becoming a subscriber.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.440 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.140 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.260 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.660 of our subscribers.
00:00:35.900 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.860 Please forgive me if I croak.
00:00:52.280 I've just had a minor stroke.
00:00:54.200 Basal ganglion on the right makes me walk as if I'm tight.
00:01:02.900 So if my voice descends to squawking, Sam will have to do the talking.
00:01:07.920 Thank you all for coming.
00:01:16.440 This is what a sold-out house looks like when the Cubs are in the seventh game of the World
00:01:20.800 Series.
00:01:23.400 There's a place in hell for those people who bought tickets and didn't use them.
00:01:28.320 Needless to say, it's an honor to be here and a real honor to be doing this with Richard,
00:01:32.080 and I get to do this twice in one week.
00:01:34.520 This is the second night, as I think you know.
00:01:37.140 And Richard and I were worried about this.
00:01:40.320 We were worried about this event because we thought we would have a great conversation
00:01:44.360 last night, and then we didn't want to spend an hour in front of you here trying to recapitulate
00:01:50.140 that conversation.
00:01:50.940 So as a way of avoiding that fate, I went out to all of you, I think, online asking for
00:01:58.020 questions, and I got thousands of questions, and I picked many.
00:02:01.420 So the questions we'll track through tonight are different from the ones we did last night,
00:02:05.860 and this is all being videotaped, and you can see what you missed last night once that
00:02:09.960 video is available.
00:02:11.980 And I wasn't actually planning to ask this, but I wanted to talk about your stroke because
00:02:15.820 we haven't spoken about this.
00:02:16.880 And I'm going to guess that the sock choice is not evidence of your stroke.
00:02:24.960 Well, I was explaining last night that at the recent skeptics conference in Las Vegas,
00:02:30.960 we had a workshop on cold reading, which, you know, is that system whereby you pretend to
00:02:38.540 read thoughts, and all you're doing really is sizing the other person up.
00:02:40.780 And my partner was a young woman who said, I seem to see there's something wrong with
00:02:48.220 your eyes, maybe colorblind.
00:02:52.640 She was looking at my, sorry, sorry.
00:02:55.880 I am trying to make a point.
00:02:59.100 I'm trying to spread the meme of odd socks.
00:03:02.780 Now, this is not for the reason given in Stephen Potter's Lifemanship.
00:03:10.160 Under woomanship, he recommends the odd socks ploy as a way of arousing the maternal instincts.
00:03:19.580 And then there's a footnote that says, buy our patent odd socks brand.
00:03:27.280 But my point is different from that.
00:03:30.040 It is that we should not be compelled to buy socks in pairs, because unlike shoes, which
00:03:37.080 have genuine chirality (you can't switch a left shoe and a right shoe),
00:03:42.920 socks do not have this property, and therefore it's ridiculous having to buy socks in pairs.
00:03:48.260 If you lose one sock, you have to throw the other one away.
00:03:57.520 So I want to make the point in as vivid a fashion as possible, and encourage everybody
00:04:02.540 to wear odd socks.
00:04:09.620 Tell us about the experience of having a stroke.
00:04:12.820 Okay.
00:04:13.620 Well, it was a bit scary.
00:04:15.040 I just suddenly became aware that my left hand wasn't working.
00:04:20.600 I couldn't pick things up.
00:04:21.640 Or if I managed to pick something up, I couldn't let go of it again, which is sort of kind of
00:04:26.500 scary.
00:04:28.520 And I was sort of staggering about and not able to stand up straight.
00:04:32.120 I couldn't do up buttons.
00:04:34.340 I think I'm pretty much recovered now.
00:04:36.260 I can both do up and undo buttons.
00:04:38.540 This was in February?
00:04:39.980 Yes.
00:04:40.380 The only thing is, I can't sing.
00:04:45.540 I never could sing very well, but I could at least sing in tune, and now I can't.
00:04:51.120 And my voice does tend to croak, so hence my introductory apology.
00:04:56.900 Well, was there any immediate emotional or cognitive or perceptual component to it, or
00:05:04.560 was it just a motor thing that you noticed?
00:05:06.000 No, it was just motor.
00:05:07.740 I mean, I was obviously scared.
00:05:09.800 It's in the basal ganglion, as I said, which doesn't affect cognitive function.
00:05:13.680 So, I hope that will become evident tonight.
00:05:21.260 Well, we will see.
00:05:23.120 If you come out as a Mormon at any point in the next hour.
00:05:29.640 I think it would take more than a stroke to do that to me.
00:05:31.780 Our first question that one of you may have asked, if you had a time machine and could
00:05:49.540 travel 500 years into the future, what do you think you would find biologically, assuming
00:05:54.900 our direct descendants still exist and haven't uploaded themselves into the matrix, will
00:06:01.380 we be recognizably human?
00:06:03.280 500 years is too short a time to expect any genetic evolutionary change.
00:06:09.680 What about with our own meddling, the genetic engineering that we're surely going to do?
00:06:14.540 Yes, I mean, I suppose that is a possibility.
00:06:17.280 If by then we've colonized Mars, such that there's a barrier to gene flow between the parent
00:06:23.740 planet Earth and the colony on Mars, then it's possible that the Mars colony might have
00:06:30.360 diverged.
00:06:31.460 But 500 years is a short time.
00:06:34.380 But how much of an appetite do you think we will have, given what we currently are, to
00:06:40.440 change ourselves, given the ability to do so in radical ways?
00:06:45.260 Well, we've had the ability to change cows and horses and pigs and cabbages and dogs and
00:06:51.900 roses for hundreds, thousands of years.
00:06:56.520 And although we've changed all those species, almost beyond recognition, when you think that
00:07:00.980 a Pekingese or a poodle or a pug or a bulldog is a wolf, it still thinks it's a wolf.
00:07:07.420 The world's worst wolf.
00:07:09.400 And yet we haven't done that to humans.
00:07:12.020 So it looks as though we haven't had much of an appetite to do that with respect
00:07:18.440 to the selection part of the Darwinian equation.
00:07:22.660 We're now just beginning to have the possibility of doing it to the mutation part of the Darwinian
00:07:27.960 equation, namely genetic engineering.
00:07:30.980 But it's not obvious why, if we didn't have the motivation to selectively breed humans,
00:07:36.220 we should have the motivation to selectively mutate humans.
00:07:39.800 Kind of a related point, you're obviously very famous for having introduced this concept
00:07:44.880 of a meme.
00:07:46.060 How seriously should we take the analogy to a gene with a meme?
00:07:51.440 It was intended as an analogy to a gene.
00:07:56.100 And the idea is that anywhere in the universe where self-replicating coded information arises,
00:08:05.240 that could be fair game for Darwinian evolution, for Darwinian selection.
00:08:11.580 And I wanted to end The Selfish Gene by making that point, because the whole of the rest of
00:08:16.120 the book had been extolling the gene as the unit of selection.
00:08:20.480 So I wanted to make the point, it doesn't have to be DNA.
00:08:22.960 It could be anything which is self-replicating.
00:08:25.080 Well, one could speculate about life on other planets being mediated by a replicator other
00:08:34.840 than DNA.
00:08:36.220 A computer virus would have done the job as well, but I didn't know
00:08:40.540 about them in 1976.
00:08:41.820 So I thought, well, what about cultural inheritance?
00:08:48.100 Anything where we have imitation is potentially analogous to genetic replication.
00:08:53.140 So something like a craze at a school, something like a craze for a particular kind of toy.
00:09:00.540 I introduced to my boarding school a craze for origami paper folding to make a Chinese junk.
00:09:08.660 And it spread exactly like a measles epidemic through the school, and then died away like
00:09:14.240 a measles epidemic.
00:09:15.680 Interestingly, I had learned to do this from my father, and he had learned it from an almost
00:09:21.680 identical epidemic at the same school 26 years earlier.
00:09:26.100 So the epidemiology of meme spread is very similar to gene spread.
00:09:32.380 But it's only interesting from a Darwinian point of view if the memes that spread are the
00:09:38.100 ones that are good at spreading.
00:09:39.680 If there is some kind of selective effect, and it's plausible that there should be: clothes
00:09:45.840 fashions spread because people find them cool, or something like a reverse baseball cap, which,
00:09:54.480 by the way, lowers the IQ by a full 10 points.
00:09:58.140 That's probably the first remark that he's going to get in trouble for.
00:10:07.880 But I think you can probably treat religious memes in the same way.
00:10:14.620 I mean, religious ideas spread like a virus.
00:10:19.060 So I call them viruses of the mind.
00:10:20.840 So they pass down the generations like DNA does.
00:10:27.100 And of course, obviously, religions pass down the generations.
00:10:30.200 But they also spread sideways in epidemics when you've got a particularly charismatic vector
00:10:37.460 of the virus like Billy Graham or one of those types.
00:10:41.900 So I think it's a genuinely interesting question whether the really successful religions like
00:10:48.380 Roman Catholicism and Islam spread because the memes have high spreadability in their own
00:10:55.680 right, like genes in Darwinism, or whether they're spread by Machiavellian priests who get
00:11:02.140 together and work out what's the best marketing strategy to spread them.
00:11:05.820 And I'm inclined to think that the pure memetic spread is plausible, and I'm interested in that.
00:11:12.900 I haven't really run very much with the meme idea.
00:11:15.880 The people who have are Dan Dennett, the philosopher who talks in a very interesting way about memes
00:11:23.340 in most of his recent books.
00:11:25.640 And Susan Blackmore is another one who wrote a book called The Meme Machine.
00:11:29.280 Actually, there are about 20 books now with the word meme in the title, which emphasise various
00:11:35.360 aspects of it.
00:11:37.200 The fact that memes don't change truly randomly, does that run roughshod over the analogy?
00:11:43.560 I don't think that really matters.
00:11:46.360 Genes mutate randomly in the sense that mutation is not directed towards improvement.
00:11:52.940 The only improvement comes from selection.
00:11:57.340 But mutation, nevertheless, is induced by things like cosmic rays, radioactivity, various mutagenic
00:12:04.920 chemicals.
00:12:06.560 The fact that memes are introduced by human creativity doesn't detract from the idea that
00:12:13.140 some memes spread better than others for selective reasons.
00:12:18.220 What do you think your most important contribution to science or culture at large has been or will
00:12:25.060 be?
00:12:25.720 I suppose The Extended Phenotype, which is the title of my second book.
00:12:31.160 It's the only book that I wrote with a professional audience in mind.
00:12:37.000 I could expound it, but this is supposed to be a conversation, not a monologue.
00:12:42.880 The question is for both of us, so I can answer it.
00:12:46.480 Well, you could answer it.
00:12:47.320 Well, okay, you do your answer first then.
00:12:52.280 I can tell you what I hope it will be.
00:12:53.940 I don't tend to think in these terms globally, but I think what I'm doing most of the time
00:13:01.900 and have done in most of my books is attempt to argue for the unity of knowledge and to
00:13:06.980 resist this balkanization of our epistemology by essentially what I view as the dictates of
00:13:14.320 university architecture.
00:13:15.780 The fact that there's the biology department over here where you study biology and then
00:13:19.960 there's the psychology department over there, that seems to articulate two separate spheres
00:13:24.860 of inquiry; at their centers they do have different methods, but there really are no
00:13:29.560 boundaries between those disciplines.
00:13:31.000 And I see that as true for not just for canonical scientific disciplines, but just fact-based thinking
00:13:40.600 about the nature of reality across the board.
00:13:42.780 And also the distinction that people make between third-person facts, classically physical facts,
00:13:49.300 and first-person subjective facts.
00:13:52.140 And some people think that distinction is so hard and fast that they imagine there are no
00:13:55.920 subjective facts. That, I think, is a boundary that I am consciously trying to erode.
00:14:01.840 And I think questions about moral truth and the truth of possible human experience or the
00:14:09.640 experience of conscious systems, those are questions that are every bit as grounded in reality as
00:14:17.400 any questions we ask in physics or chemistry or...
00:14:20.060 So introspection is a way of getting scientific data, do you mean?
00:14:24.780 Yeah.
00:14:24.980 I mean, that's...
00:14:25.980 Obviously, there are...
00:14:26.980 You have to issue certain caveats there.
00:14:29.680 I mean, there are ways in which introspection is a dead end.
00:14:33.560 I mean, for instance, I can't tell even with my best efforts, I cannot tell that I have a brain.
00:14:40.480 That's a pretty big blind spot.
00:14:42.040 But there are many things that you can introspect about, which give you scientifically valid data.
00:14:51.180 And in fact, you only...
00:14:53.660 I mean, if you're studying the mind, if you're studying what it's like to be a person,
00:14:57.360 at some point you are correlating third-person, quote, objective methods with first-person report.
00:15:04.740 You know, somebody says, you know, I ask you what it's like to have a stroke, or your neurologist
00:15:08.500 does, and he needs to know what your experience is.
00:15:12.960 I mean, it's not...
00:15:14.080 I mean, with a stroke, it's...
00:15:16.360 The final analysis seems to be looking at your brain, at, you know, what has actually been
00:15:22.860 physically affected, but the cash value of those physical effects is always what is showing up
00:15:29.800 in your experience and what is showing up in your function.
00:15:31.760 So if some canonical language area, say, was affected, but you spoke fine and
00:15:39.620 understood language fine and there was no discernible change in your language use, well, then that
00:15:46.520 would be the definition of those being non-linguistic areas of the brain, no matter
00:15:51.960 how close they are to, you know, the standard average atlas of language use.
00:15:57.060 So we do always link up with a subjective report, too, and first-person performance.
00:16:03.240 And so, yeah, I mean, in terms of the contribution I want to make, I want to argue that there's
00:16:09.340 a larger set of truth claims we want to make when we're reasoning about reality, and those
00:16:14.680 include things that we will never know.
00:16:17.020 I mean, they include abstract things like mathematics, you know, the physical foundation of which
00:16:22.020 is kind of hard to specify, and the example I always use is, you know, a question like,
00:16:29.100 what was JFK thinking the moment before he got shot?
00:16:33.260 Well, we know we'll never know, and that's data we'll never get, but there's an infinite number of things
00:16:40.120 we know he wasn't thinking, right?
00:16:41.780 So it would be wrong to say he was thinking, I wonder what Sam Harris and Richard Dawkins
00:16:46.680 think about what I'm thinking right now before I get shot.
00:16:48.880 There's an infinite number of things we could assert about the character of his subjectivity
00:16:55.480 there, which we know are wrong, you know, and we know that as fully as we know anything
00:17:00.500 in science.
00:17:02.620 And there are things that... it's like, you know, the mystery or pseudo-mystery of
00:17:06.460 how to integrate free will, or the experience of free will, with our scientific worldview, I think
00:17:12.460 can be easily resolved if you can introspect with sufficient perspicacity and notice that
00:17:18.820 you don't even have evidence for free will in your first-person experience.
00:17:22.640 I think that's, those are subjective data that are available.
00:17:25.760 So there are ways to get access to interesting things through introspection, but they don't
00:17:31.320 actually include the existence of your brain.
00:17:33.000 Very hard to communicate to other people.
00:17:35.620 Yeah.
00:17:36.320 But I mean, that's true of many things that we have, we don't begin to doubt.
00:17:42.100 I mean, just imagine what it would be like if only 1% of the population had vivid dreams
00:17:48.440 at night.
00:17:49.080 So most of us just sleep like animals.
00:17:51.460 There's nothing that it's like to be us for eight hours a night.
00:17:54.760 But then some percentage of the population talk about traveling and meeting people and having
00:17:59.880 all of these illogical encounters, dreams would be much stranger and many people would doubt
00:18:04.520 their existence, but they would exist just as much as they do now.
00:18:07.160 And we would doubt the sanity of people who had them, probably, as well.
00:18:10.020 Yeah, yeah, yeah.
00:18:11.800 But did you answer, did you fully answer your question?
00:18:14.380 Do you want to say more about the extended phenotype?
00:18:17.240 Well, I'll try.
00:18:19.740 I mean, first, to tell people what a phenotype is.
00:18:22.480 The phenotype is the external, or not necessarily external, manifestation of genetic effects.
00:18:29.920 And, from a Darwinian point of view, it comprises the phenotypic effects by which a gene is selected.
00:18:35.340 So there'll be genes that affect wing size, eye color, hair color, intelligence.
00:18:42.820 These are all phenotypic effects.
00:18:45.400 Conventionally, phenotypic effects are confined to the body in which the gene sits.
00:18:50.800 So genes exert their phenotypic effects by influencing embryological processes.
00:18:57.340 And so the shape of the body, the color of the body, the behavior of the body are all influenced
00:19:02.300 by the genes inside the body.
00:19:04.520 That's conventional phenotype.
00:19:06.700 Extended phenotype is phenotypic effects of genes which are outside the body in which the gene sits.
00:19:14.520 And the easiest examples to think of are artifacts, things like beaver dams, birds' nests.
00:19:23.000 These are quite clearly phenotypes.
00:19:25.600 They quite clearly influence the survival of the genes that make them.
00:19:29.660 So a bird's nest is made by genes in the same limited sense, or not so limited sense,
00:19:35.640 as the bird's tail and the bird's eyes and the bird's wings.
00:19:40.520 And the nest contributes to the survival of the genes, which is what matters in the selfish gene view of life,
00:19:49.440 just as surely as the wings and the tail of the bird contribute to the survival of the genes that made them.
00:19:55.820 So although the nest is not a part of the bird's body, it is a part of the phenotype
00:20:01.720 by which the genes lever themselves into the next generation.
00:20:06.820 Well, if you buy that, and I think you have to,
00:20:10.740 then effects that parasites have on hosts, there are numerous examples, fascinating, rather lurid examples,
00:20:17.640 of parasites, which affect the behavior or the morphology of the host in such a way as to improve the survival of the parasite.
00:20:28.980 Well, that means that parasite genes are influencing host behavior and host morphology
00:20:37.480 in the same kind of way as any gene influences phenotype.
00:20:43.200 So when an animal is induced by... there's a thing called a brain worm, for example,
00:20:50.080 which is actually a fluke that gets into an intermediate host, a snail,
00:21:12.080 and causes the snail to be more likely to be eaten by a sheep.
00:21:15.780 And it does so by moving into the eyes of the snail
00:21:19.880 and making the eyes pulsate in a sort of rather frightening way
00:21:25.780 and calling the attention of an animal like a sheep or a cow to eating the snail,
00:21:33.960 which means that the parasite, the fluke, then gets into the next part of its life cycle.
00:21:39.400 So the fluke genes are influencing the behavior of the snail and the eyes of the snail.
00:21:48.320 The change in the snail is part of the phenotype of fluke genes.
00:21:55.920 Extended phenotype.
00:21:58.200 And if you buy that, which is a sort of further step,
00:22:00.920 then something like a bird's song,
00:22:06.500 say a male bird's song, which influences female birds,
00:22:09.860 actually physically causes the ovaries of the female to swell.
00:22:13.460 This does happen.
00:22:15.640 The swelling of the ovaries of a female bird
00:22:18.580 is extended phenotype of genes in the male,
00:22:25.100 which make the male sing the song, which has this effect.
00:22:30.920 So the extended phenotype then becomes a way of looking at the whole of animal communication,
00:22:37.600 where one animal influences the behavior of another.
00:22:42.780 I have not done justice to the extended phenotype.
00:22:46.320 Read it.
00:22:46.880 So what are the prospects that religion or something like it is part of our extended phenotype?
00:23:02.940 Yes.
00:23:03.780 I don't think I want to say that.
00:23:06.820 I imagine you wouldn't.
00:23:08.320 Well, in order to qualify as extended phenotype,
00:23:13.960 it would be necessary that genes...
00:23:19.900 Well, say you took two preachers,
00:23:23.180 one of whom was a very good preacher
00:23:25.880 who recruited lots and lots of people into his church,
00:23:28.820 another of whom wasn't.
00:23:29.760 That could be extended phenotype,
00:23:32.160 but only if there was a genetic difference between these two preachers,
00:23:36.280 which caused one of them to be an effective recruiter and the other one not.
00:23:40.740 That would be, but I don't think that's very likely to be true.
00:23:44.260 Well, wait a minute.
00:23:45.640 Just to quite literally play devil's advocate here.
00:23:48.380 If there's a gene for religious enthusiasm
00:23:54.460 or a set of genes for susceptibility to that range of experience
00:24:00.520 and a fundamental lack of intellectual honesty
00:24:03.780 or a lack of concern that what you're saying is true,
00:24:07.620 so an increased capacity for self-deception
00:24:11.980 and therefore deception of others, say,
00:24:14.420 I mean, that seems to me plausible.
00:24:16.220 So that's a very effective preacher
00:24:18.300 who's filled with the charisma of being absolutely sure
00:24:23.200 about what he's saying
00:24:24.220 and energized by his passion for the whole project.
00:24:27.820 That's true, and I think it probably is true
00:24:30.980 that there are genetic...
00:24:33.220 When you say a gene for something,
00:24:34.680 all you ever mean is a genetic difference
00:24:37.000 that causes a phenotypic difference.
00:24:39.620 So the best way to show that would be twin studies.
00:24:42.960 If you can show with identical twins,
00:24:45.880 that's monozygotic twins, that
00:24:47.760 If one of them is a religious maniac,
00:24:49.860 the other one probably will be as well.
00:24:52.180 If that's true,
00:24:54.600 and if that's not so true of fraternal twins,
00:24:57.420 dizygotic twins,
00:24:59.420 then you've shown that there is a genetic effect
00:25:02.580 on religiosity,
00:25:04.980 and that's probably true,
00:25:06.680 and I think that's certainly true.
00:25:08.620 To be extended phenotype,
00:25:11.440 I think you've got to say
00:25:12.280 that genes engineer
00:25:15.580 their own survival and passing on
00:25:19.740 into the next generation
00:25:20.940 by making their victims religious.
00:25:27.500 And I suppose,
00:25:29.260 well, maybe that works, actually.
00:25:31.860 So quickly,
00:25:35.580 a life's work is undone.
00:25:42.760 Yes, I mean,
00:25:43.460 I suppose that...
00:25:45.320 Let's not go there.
00:25:50.680 What do you think
00:25:51.500 are the most misunderstood topics
00:25:54.840 in science
00:25:55.600 by otherwise smart
00:25:57.040 and educated people?
00:25:57.880 Or what's one
00:25:59.520 that you think
00:26:00.020 is often misunderstood?
00:26:01.460 Oh, evolution, surely.
00:26:04.200 Especially in this country.
00:26:09.500 But what do you think,
00:26:11.160 even many people in this room
00:26:12.660 who obviously are well-educated
00:26:14.500 and interested in the topic
00:26:15.680 to even be here,
00:26:17.580 what do you think
00:26:18.560 many people here
00:26:19.440 may be confused about
00:26:21.780 or be wrong about
00:26:23.600 and not know it
00:26:24.260 that's of consequence in science?
00:26:25.740 If you had to bet?
00:26:27.800 Well, I mean,
00:26:28.780 certainly there are no creationists
00:26:30.520 in this audience.
00:26:31.460 You were screened at the door, right,
00:26:32.720 with that wand?
00:26:35.760 I suppose there may be people...
00:26:38.740 I mean, I would say
00:26:39.680 it was a misconception
00:26:40.800 to...
00:26:42.740 to believe that
00:26:48.360 the majority of evolutionary change
00:26:50.960 as we observe it
00:26:52.480 is non-selective.
00:26:53.880 And there are people
00:26:55.280 who believe that
00:26:56.160 natural selection
00:26:57.400 is relatively trivial
00:26:58.600 compared to
00:27:01.160 random genetic effects.
00:27:05.360 Now,
00:27:06.200 that's a genuine
00:27:07.200 scientific controversy
00:27:08.300 and there may be people here
00:27:09.460 who subscribe to that.
00:27:10.880 And it's true
00:27:11.840 if you stick to
00:27:13.400 molecular genetic changes.
00:27:16.600 But if you're talking about
00:27:17.740 actual externally visible
00:27:20.140 phenotypic changes,
00:27:21.260 then I don't think it is true.
00:27:22.620 And I think that's a confusion
00:27:24.040 which I would expect
00:27:25.700 to find in this
00:27:27.560 sophisticated audience.
00:27:29.160 So just to
00:27:30.100 traverse that one more time,
00:27:32.580 the belief that
00:27:35.120 much of what we notice
00:27:37.200 about ourselves
00:27:37.960 was not selected for,
00:27:40.180 but just kind of came along
00:27:41.180 for the ride,
00:27:41.900 you think that's very likely untrue?
00:27:43.480 Yes,
00:27:44.940 but you have to be
00:27:46.040 sophisticated about it
00:27:47.120 because you may be
00:27:48.940 looking at the wrong thing.
00:27:50.280 We talked about this
00:27:51.040 last night.
00:27:51.920 Perhaps it doesn't matter
00:27:52.720 doing it again.
00:27:56.240 Many people think
00:27:57.220 that quite a lot of
00:27:58.460 characteristics are trivial,
00:28:01.700 sort of frivolous.
00:28:02.500 I mentioned eyebrows
00:28:03.480 last night
00:28:04.340 as being something
00:28:05.020 which nobody could
00:28:06.400 seriously think
00:28:07.120 that eyebrows
00:28:07.640 are doing anything useful.
00:28:08.580 How could eyebrows
00:28:09.320 possibly be
00:28:10.040 naturally selected for?
00:28:11.200 Well, I think
00:28:12.520 that's a mistake.
00:28:13.740 It's a very tempting mistake.
00:28:16.540 But something
00:28:17.720 that seems trivial
00:28:19.000 is almost certainly
00:28:20.260 not trivial
00:28:21.120 because the genes
00:28:22.840 that make it
00:28:23.740 have so many opportunities
00:28:25.800 to be selected.
00:28:26.900 They are represented
00:28:27.840 in thousands of individuals
00:28:29.680 and over lots of generations.
00:28:32.600 And this has been
00:28:33.700 worked out mathematically
00:28:35.260 as well.
00:28:36.580 So that is a very
00:28:39.720 common misconception,
00:28:41.020 I think,
00:28:42.060 that very slight effects
00:28:45.560 are too trivial
00:28:46.520 for natural selection
00:28:47.340 to care about.
00:28:49.400 And I think that is wrong.
00:28:50.540 I think natural selection
00:28:51.300 actually does care about
00:28:52.540 even what look to us
00:28:54.220 like very tiny,
00:28:55.280 trivial effects.
00:28:56.920 To make a disconcertingly
00:28:59.160 lateral move here,
00:29:01.120 how can we publicly
00:29:02.200 challenge the more
00:29:03.320 dangerous tenets of Islam
00:29:04.800 without further inspiring
00:29:06.720 bigotry against Muslims?
00:29:08.880 Now, you and I have both,
00:29:10.660 unlike many scientists,
00:29:13.120 we have sounded off
00:29:16.500 on this.
00:29:17.180 It's been,
00:29:18.360 for as long as I've been
00:29:19.160 an atheist,
00:29:19.720 it's been deeply
00:29:20.400 unfashionable
00:29:21.200 amongst atheists,
00:29:22.840 even atheists who are,
00:29:24.980 who think it's a legitimate
00:29:25.700 project to criticize religion.
00:29:27.140 It's been unfashionable
00:29:28.640 to criticize any one religion
00:29:31.380 more than any other.
00:29:33.260 Yes.
00:29:33.400 And I've noticed...
00:29:35.160 Especially one
00:29:35.920 more than any other.
00:29:35.920 Yeah, yes.
00:29:37.940 Yeah, you can go to town
00:29:39.360 on Mormonism
00:29:40.700 or Scientology.
00:29:42.320 But it's...
00:29:44.560 Or Christianity, actually.
00:29:46.240 I mean...
00:29:47.300 The default assumption
00:29:48.940 is that if you're
00:29:49.520 against religion
00:29:50.540 or if you think
00:29:51.480 that the evidentiary claims
00:29:53.880 upon which all these
00:29:54.880 revealed religions
00:29:55.620 are founded
00:29:56.100 are unjustifiable,
00:29:58.840 well then,
00:29:59.380 they're all on all fours together
00:30:00.620 and you don't really need
00:30:01.760 to weight your concern.
00:30:05.480 But it just has seemed obvious
00:30:07.520 at least since
00:30:08.740 September 11th, 2001,
00:30:10.800 that one of these religions
00:30:11.820 is producing more
00:30:13.700 than its fair share
00:30:14.560 of conflict and oppression.
00:30:17.080 So, back to the question,
00:30:18.680 how do you,
00:30:20.080 given that you and I
00:30:20.700 both think it's legitimate
00:30:21.620 to focus on
00:30:22.560 the most harmful instances
00:30:24.560 of religion
00:30:25.120 as we see it,
00:30:26.860 how do you avoid
00:30:28.600 energizing those voices
00:30:31.220 who are actually animated
00:30:33.380 by bigotry and xenophobia?
00:30:35.480 Yes, yes.
00:30:36.060 Well, I think we both
00:30:37.180 have this problem.
00:30:39.140 I suppose...
00:30:41.800 I mean, I listened
00:30:42.360 to one of your podcasts
00:30:43.500 about Islam
00:30:46.340 arguing against
00:30:49.880 the point of view
00:30:50.700 which says that
00:30:51.580 the terrorists
00:30:53.620 which we all
00:30:56.000 know about
00:30:57.140 in the Middle East
00:30:57.760 are not motivated
00:30:59.360 by religion.
00:31:01.360 They're motivated
00:31:02.840 by anything but religion.
00:31:04.100 There's a kind of
00:31:04.740 desperate desire
00:31:06.440 to blame anything
00:31:08.400 but religion
00:31:09.460 for what is going on.
00:31:12.120 This was the podcast
00:31:16.500 about that issue
00:31:17.940 of ISIS's magazine,
00:31:19.960 Dabiq,
00:31:20.580 where they just
00:31:21.440 spelled it out.
00:31:22.100 They were as fed up
00:31:23.460 with this as I was
00:31:24.680 and they just wrote
00:31:25.500 all of their reasons
00:31:27.240 for killing infidels.
00:31:29.600 It was amazing.
00:31:30.680 It was...
00:31:31.040 I felt like I was
00:31:31.740 in a lucid dream
00:31:32.580 that...
00:31:33.380 It's true.
00:31:35.800 Do listen to it.
00:31:37.040 What's it called,
00:31:37.860 Sam?
00:31:38.040 I forgot what it's called
00:31:39.720 but you'll find it.
00:31:40.680 It's sometime
00:31:41.520 in the last 10 podcasts.
00:31:42.980 I think what...
00:31:43.720 Sorry?
00:31:46.660 "What Jihadists Really Think,"
00:31:49.040 and it's "Why We Hate You
00:31:52.060 and Why We Fight You."
00:31:53.340 Yeah.
00:31:54.000 And it's absolutely...
00:31:55.200 I mean, Sam could have
00:31:55.920 written the script.
00:31:56.760 It's just completely...
00:31:59.040 We hate you
00:32:00.440 because you're not Muslim.
00:32:02.280 It amounts to nothing
00:32:03.480 more than that.
00:32:04.240 Yeah.
00:32:04.820 And we fight you
00:32:06.040 for the same reason.
00:32:08.040 But what do you do...
00:32:09.660 Actually, I had a podcast
00:32:11.340 that I just released today
00:32:12.520 where I was interviewing
00:32:13.340 our mutual friend
00:32:15.380 Ayaan Hirsi Ali
00:32:16.120 and I asked her
00:32:17.640 a related question.
00:32:19.140 Yeah.
00:32:19.900 Thank you.
00:32:23.520 This is the true
00:32:25.180 feminist hero
00:32:26.040 who was just declared
00:32:27.120 an anti-Muslim extremist
00:32:28.740 by the Southern Poverty Law Center.
00:32:30.940 That's absolutely unbelievable.
00:32:33.760 But as I asked her...
00:32:34.940 And has been disinvited
00:32:36.240 by at least several campuses
00:32:39.440 in this country,
00:32:40.660 including Brandeis.
00:33:41.780 So I asked her,
00:33:43.240 more to the point,
00:33:45.020 about conspiracy thinking
00:33:47.020 on the right.
00:32:47.960 So it's often...
00:32:49.980 It is simply a fact
00:32:51.200 that Islamists
00:32:51.940 and jihadists
00:32:52.740 are scheming
00:32:54.540 to spread their views
00:32:56.620 and, you know,
00:32:57.180 both by the sword
00:32:58.180 and otherwise
00:32:58.920 throughout open societies.
00:33:00.760 And they're using
00:33:01.780 the norms
00:33:03.220 and institutions
00:33:04.260 of our open societies
00:33:05.940 against ourselves
00:33:07.680 in a very cynical
00:33:09.020 and calculated way.
00:33:11.200 And it's not even
00:33:12.060 a conspiracy,
00:33:12.940 as Ayaan pointed out.
00:33:13.820 It's just there
00:33:14.480 in the open.
00:33:14.960 This is an agenda
00:33:15.640 that Islamists have.
00:33:17.100 They're totally open
00:33:18.000 about it,
00:33:18.460 totally honest about it.
00:33:19.520 Yeah, but the issue is
00:33:21.460 you can take this,
00:33:24.060 one's concern about this,
00:33:25.660 in truly paranoid directions.
00:33:27.860 So I hear from people
00:33:28.720 who think Ayaan
00:33:31.020 is a stealth Islamist
00:33:32.660 or jihadist.
00:33:33.760 I hear from people
00:33:34.360 who think that Majid,
00:33:35.280 who I wrote
00:33:35.680 Islam and the Future
00:33:36.660 of Tolerance with,
00:33:37.700 is a stealth Islamist
00:33:39.160 and jihadist.
00:33:39.780 And so there's no...
00:33:40.520 There's no obvious signage
00:33:44.180 on the way
00:33:44.780 to complete insanity
00:33:46.020 where you're told
00:33:48.060 that you are now
00:33:48.760 too fearful
00:33:49.380 and too concerned
00:33:50.780 about things
00:33:51.440 that actually are contiguous
00:33:53.280 with real reasons
00:33:54.940 for concern.
00:33:55.920 But I asked Ayaan,
00:33:57.880 so where's the boundary here?
00:33:59.600 How do we differentiate
00:34:01.040 a reasonable fear
00:34:03.120 about genuinely scheming people
00:34:05.280 and right-wing paranoia
00:34:08.120 in this case?
00:34:09.480 And she just said,
00:34:10.760 facts.
00:34:12.080 Just one word.
00:34:13.140 It's like you're either
00:34:14.020 talking about facts
00:34:14.960 or you're not.
00:34:15.400 And when you're talking
00:34:15.920 about facts,
00:34:16.680 you can't go wrong
00:34:18.260 in this space.
00:34:19.120 And I thought
00:34:19.660 that was a great answer.
00:34:21.160 I think one point
00:34:21.800 to make is that
00:34:22.460 the main victims
00:34:23.940 of these awful people
00:34:25.060 are actually Muslims themselves.
00:34:28.580 But what about
00:34:29.460 the attack from the left?
00:34:32.220 In liberal left circles
00:34:35.100 in America and Britain,
00:34:37.160 Islam gets a free pass
00:34:40.640 on all sorts of terrible things
00:34:42.640 like misogyny,
00:34:43.660 which no liberal
00:34:44.900 would actually sanction,
00:34:47.420 and yet if a Muslim
00:34:49.680 behaves in horrifically
00:34:52.320 misogynistic ways,
00:34:53.440 somehow that's ignored,
00:34:55.400 as though that's somehow
00:34:56.140 legitimate.
00:34:56.660 Oh, it's part of their culture,
00:34:58.220 so they're allowed to do that.
00:34:59.840 I must say,
00:35:00.680 I despise that kind of thing.
00:35:03.280 I think it was an anthropologist
00:35:14.960 whose quote
00:35:16.340 I'm about to butcher,
00:35:17.260 but it's a great point.
00:35:18.240 He said,
00:35:18.500 when one person
00:35:20.500 grabs a little girl
00:35:22.520 and cuts off her clitoris
00:35:25.240 with a septic blade,
00:35:27.260 he is a dangerous lunatic.
00:35:29.980 When a million people do this,
00:35:31.740 it's a culture,
00:35:32.480 and we need to respect it.
00:35:34.480 And that's the little crystal
00:35:37.520 of moral confusion
00:35:38.880 that's just at the center
00:35:40.100 of the liberal worldview
00:35:42.240 that we need to fully crush.
00:35:45.480 I dare to suggest
00:35:46.900 that there's too much respect
00:35:49.340 in the world.
00:35:50.180 If you'd like to continue
00:36:03.020 listening to this conversation,
00:36:04.600 you'll need to subscribe
00:36:05.620 at samharris.org.
00:36:07.240 Once you do,
00:36:08.060 you'll get access
00:36:08.560 to all full-length episodes
00:36:09.860 of the Making Sense podcast,
00:36:11.520 along with other
00:36:12.080 subscriber-only content,
00:36:13.840 including bonus episodes
00:36:15.140 and AMAs
00:36:16.220 and the conversations
00:36:17.260 I've been having
00:36:17.820 on the Waking Up app.
00:36:18.700 The Making Sense podcast
00:36:20.280 is ad-free
00:36:21.140 and relies entirely
00:36:22.560 on listener support.
00:36:23.960 And you can subscribe now
00:36:25.200 at samharris.org.