Making Sense - Sam Harris - August 05, 2019


#164 — Cause & Effect


Episode Stats

Length

50 minutes

Words per Minute

141.4

Word Count

7,103

Sentence Count

519

Misogynist Sentences

2

Hate Speech Sentences

6


Summary

After two mass shootings in the past 24 hours, Sam Harris observes that the conversation about such atrocities is generally so confused, and it is so frustrating to see people talking past one another for political or otherwise emotional reasons, that he decides to read the first part of an old blog post, just to put his argument in view in its clearest form, and then to say a few things relevant to the current moment. We have all heard of Jared Loughner, James Holmes, and Adam Lanza, but there are at least four types of violent actor, and they have different motivations. The first are those suffering from some form of mental illness that causes them to think and act irrationally. The second are prototypically evil psychopaths: not delusional, but malignantly selfish, ruthless, and prone to violence, who, given half a chance and half a reason, will harm others, because that is what psychopaths do. These first two types trouble us for reasons that have nothing to do with culture, ideology, or any other social variable, though it matters when such a person happens to be the head of a nation with power and influence. That is what is so abhorrent about North Korea: its leader is mad, or simply evil, and he is building a nuclear arsenal while millions starve. We didn't create him, apart from making it too easy for him to get nuclear weapons. Harris then turns to a conversation with Judea Pearl about the mathematization of cause and effect.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.620 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.660 of our subscribers.
00:00:35.900 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.820 Welcome to the Making Sense Podcast.
00:00:49.060 This is Sam Harris.
00:00:51.960 Okay.
00:00:52.400 Well, I'm recording this intro in the immediate aftermath of now two mass shootings, the one
00:01:01.660 in El Paso, and it appears there was one in Dayton a few hours ago.
00:01:09.040 Needless to say, social media is now a cesspool.
00:01:13.360 I guess there are a few things I could say about this.
00:01:16.180 Actually, I wrote a piece on my blog when I used to blog rather than podcast about six
00:01:25.660 years ago in response to some jihadist violence, and it really is the clearest articulation of
00:01:35.000 what I have to say at moments like this.
00:01:37.460 The conversation about atrocities of this kind, mass shootings, is generally so confused and
00:01:45.000 it's so frustrating to see people talking past one another for political or otherwise
00:01:53.880 emotional reasons that, I don't know, I think I'll read the first part of this blog post
00:02:02.260 just to put my argument in view in the clearest form and then maybe say a few things relevant
00:02:09.840 to the current moment.
00:02:10.820 A young man enters a public place, a school, a shopping mall, an airport, carrying a small
00:02:26.880 arsenal.
00:02:28.020 He begins killing people at random.
00:02:30.360 He has no demands, and no one is spared.
00:02:33.080 Eventually, the police arrive, and after an excruciating delay as they marshal their forces, the young
00:02:38.460 man is brought down, or arrested.
00:02:41.360 This has happened many times, and it will happen again.
00:02:45.020 After each of these crimes, we lose our innocence, but then innocence magically returns.
00:02:50.480 In the aftermath of horror, we seem to learn nothing of value.
00:02:54.420 Indeed, many of us remain committed to denying the one thing of value that is there to be learned.
00:02:59.840 After the Boston Marathon bombing, a journalist asked me,
00:03:03.160 Why is it always angry young men who do these terrible things?
00:03:07.840 She then sought to connect the behavior of the Tsarnaev brothers with that of Jared Loughner,
00:03:12.660 James Holmes, and Adam Lanza.
00:03:15.160 Like many people, she believed that similar actions must have similar causes.
00:03:20.120 But there are many sources of human evil, and if we want to protect ourselves and our societies,
00:03:25.160 we must understand this.
00:03:27.180 To that end, we should differentiate at least four types of violent actor.
00:03:30.780 And now this is a sidebar.
00:03:33.320 There may be one new subtype here that I'll add.
00:03:37.900 But here's the first.
00:03:39.700 One.
00:03:41.100 Those who are suffering from some form of mental illness that causes them to think and act irrationally.
00:03:46.680 Given access to guns or explosives, these people may harm others for reasons that wouldn't
00:03:51.080 make a bit of sense even if they could be articulated.
00:03:54.380 We may never hear Jared Loughner and James Holmes give accounts of their crimes.
00:03:58.840 And we do not know what drove Adam Lanza to shoot his mother in the face and then slaughter
00:04:03.740 dozens of children.
00:04:05.480 But these mass murderers appear to be perfect examples of this first type.
00:04:09.960 Aaron Alexis, the Navy Yard shooter, is yet another.
00:04:13.400 What provoked him?
00:04:14.580 He repeatedly complained that he was being bombarded with, quote,
00:04:18.240 ultra-low-frequency electromagnetic waves.
00:04:21.740 Apparently, he thought that killing people at random would offer some relief.
00:04:24.880 It seems there's little to understand about the experiences of these men or about their
00:04:29.500 beliefs, except as symptoms of underlying mental illness.
00:04:34.160 Two.
00:04:35.400 This is the second type.
00:04:37.600 Prototypically evil psychopaths.
00:04:39.760 These people are not delusional.
00:04:41.960 They are malignantly selfish, ruthless, and prone to violence.
00:04:46.280 Our maximum security prisons are full of such men.
00:04:49.080 Given half a chance and half a reason, psychopaths will harm others.
00:04:53.020 Because that is what psychopaths do.
00:04:55.880 It is worth observing that these first two types trouble us for reasons that have nothing
00:04:59.520 to do with culture, ideology, or any other social variable.
00:05:03.860 Of course, it matters if a psychotic or psychopath happens to be the head of a nation, or otherwise
00:05:08.940 has power and influence.
00:05:10.800 That is what is so abhorrent about North Korea.
00:05:13.340 The child king is mad, or simply evil, and he's building a nuclear arsenal while millions
00:05:18.440 starve.
00:05:18.880 But even here, there is very little to be learned about what we, the billions of relatively normal
00:05:24.320 human beings struggling to maintain open societies, are doing wrong.
00:05:29.180 We didn't create Jared Loughner, apart from making it too easy for him to get a gun.
00:05:33.460 And we didn't create Kim Jong-il, apart from making it too easy for him to get nuclear bombs.
00:05:39.840 Again, this was written six years ago.
00:05:42.660 Given access to powerful weapons, such people will pose a threat no matter how rational,
00:05:47.140 tolerant, or circumspect we become.
00:05:50.220 And I guess I would add another descriptor here.
00:05:52.080 There are people, it seems, who fall into one of these two categories, who are living in
00:06:00.100 an online culture of trolling now, where killing people and writing semi-bogus or entirely bogus
00:06:10.660 manifestos, merely designed to confuse the media, is becoming a new phenomenon, right?
00:06:18.340 These are people who are not moved by a sincere ideology.
00:06:23.340 They're just, quote, shitposting.
00:06:25.960 The behavior of trolling on websites like 4chan and 8chan has been exported to the real world
00:06:33.760 in the form of mass murder designed as a troll.
00:06:39.520 And to some degree, I believe the Christchurch shooting in the mosque had this form, right?
00:06:46.160 It's still not entirely clear what happened there.
00:06:49.740 So this is a kind of derangement that social media has introduced into our lives, where some
00:06:58.920 people are willing to commit murder, and even mass murder, simply to enjoy the spectacle
00:07:06.020 it creates online.
00:07:08.140 Again, they're either crazy or evil or both.
00:07:11.760 But in certain cases, the reasons for their behavior are not as they appear, right?
00:07:19.140 And the media seems to get very confused about this.
00:07:23.880 Okay, the third type here.
00:07:27.460 Normal men and women who harm others while believing that they're doing the right thing,
00:07:31.320 or while neglecting to notice the consequences of their actions.
00:07:34.540 These people are not insane, and they're not necessarily bad.
00:07:37.740 They're just part of a system in which the negative consequences of ordinary selfishness
00:07:42.740 and fear can become horribly magnified.
00:07:46.200 Think of a soldier fighting in a war that may be ill-conceived or even unjust, but who
00:07:51.300 has no rational alternative but to defend himself and his friends.
00:07:55.060 Think of a boy growing up in the inner city who joins a gang for protection, only to perpetuate
00:07:59.660 the very cycle of violence that makes gang membership a necessity.
00:08:02.700 Or think of a CEO whose short-term interests motivate him to put innocent lives, the environment,
00:08:08.460 or the economy itself in peril.
00:08:10.760 Most of these people aren't monsters.
00:08:12.700 However, they can easily create suffering for others that only a monster would bring about
00:08:16.480 by design.
00:08:18.340 This is the true banality of evil, whatever Hannah Arendt actually meant by that phrase.
00:08:23.460 But it is worth remembering that not all evil is banal.
00:08:26.100 Four, normal men and women who are motivated by ideology to waste their lives, and the lives
00:08:33.440 of others, in extraordinary ways.
00:08:36.360 Some of these belief systems are merely political, or otherwise secular, in that their aim is to
00:08:41.280 bring about specific changes in this world.
00:08:44.020 But the worst of these doctrines are religious, whether or not they are attached to a mainstream
00:08:48.100 religion, in that they are informed by ideas about otherworldly rewards and punishments,
00:08:53.680 prophecies, magic, and so forth.
00:08:56.100 Which are especially conducive to fanaticism and self-sacrifice.
00:09:00.660 Of course, a person can inhabit more than one of the above categories at once, and thus
00:09:04.820 have his antisocial behavior over-determined.
00:09:07.780 There must be someone, somewhere, who is simultaneously psychotic and psychopathic, part of a corrupt
00:09:13.820 system, and devoted to a dangerous transcendent cause.
00:09:17.820 But many examples of each of these types exist in their pure forms.
00:09:21.400 For instance, in recent weeks, a spate of especially appalling jihadist attacks occurred.
00:09:27.160 One in a shopping mall in Nairobi, where non-Muslims appear to have been systematically tortured
00:09:31.880 before being murdered.
00:09:33.400 One on a church in Peshawar.
00:09:35.400 And one on a school playground in Baghdad, targeting children.
00:09:38.640 Whenever I point out the role that religious ideology plays in atrocities of this kind,
00:09:44.080 specifically the Islamic doctrines related to jihad, martyrdom, apostasy, and so forth,
00:09:49.120 I am met with some version of the following.
00:09:52.260 Quote,
00:09:52.520 Bad people will always do these things.
00:09:55.200 Religion is nothing more than a pretext.
00:09:58.000 This is an increasingly dangerous misconception to have about human violence.
00:10:02.600 Here is my pick for the most terrifying and depressing phenomenon on earth.
00:10:06.940 A smart, capable, compassionate, and honorable person grows infected with ludicrous ideas
00:10:13.000 about a holy book and a waiting paradise, and then becomes capable of murdering innocent
00:10:17.540 people, even children, while in a state of religious ecstasy.
00:10:21.520 Needless to say, this problem is rendered all the more terrifying and depressing because
00:10:25.540 so many of us deny that it even exists.
00:10:28.700 Okay, I think I'll stop there.
00:10:30.800 Again, I wrote this six years ago in the aftermath of some jihadist attacks, and now I'm reading
00:10:35.720 it to you in the aftermath of some mass shootings in the United States, which attest at least
00:10:41.800 to the problem of gun violence here, as well as to our failure to make it difficult for
00:10:47.720 bad people, crazy people, dangerous people to get access to guns.
00:10:54.180 And it might in fact attest to a rise of white supremacist violence.
00:11:00.520 At the time I'm recording this, it's not yet clear what's what here.
00:11:04.780 But whatever's true of El Paso and Dayton, two things are absolutely clear.
00:11:12.100 One is that, again, we need some rational gun control in the U.S.
00:11:17.860 And I've written about guns.
00:11:20.120 My views on guns and gun control are hard enough to parse that they resist easy summary.
00:11:27.080 You can listen to the podcast or read the associated essay titled, The Riddle of the Gun.
00:11:32.480 Again, I can sound very pro-gun for part of that, but the punchline you should not lose
00:11:38.520 sight of is that the regulations I recommend on guns in the U.S. are more stringent than
00:11:45.600 anyone on the left is calling for.
00:11:48.880 So don't lose sight of that if you freak out over the other parts of that essay that sound
00:11:55.080 like they were written by the NRA, an organization which I hope will one day be destroyed.
00:12:00.100 The short form of this point is that we license people to drive cars, we license them even
00:12:07.340 more stringently to fly airplanes, and I think getting a license to own a firearm should be
00:12:12.260 like getting a pilot's license.
00:12:14.400 It shouldn't be easy, and if you're mentally ill or prone to suicidal depression, it should
00:12:22.100 be very difficult to get your hands on a gun.
00:12:24.500 But with 300 million guns already in existence in the U.S., this is a hard thing to bring
00:12:32.680 about, not to mention the political religion around gun ownership enshrined in the Second
00:12:39.680 Amendment.
00:12:41.200 Anyway, we need a conversation and research and political change around the epidemiology
00:12:49.300 of gun violence.
00:12:50.080 It's insane that we suffer this in the U.S. to this degree.
00:12:55.680 It's also true that we should keep some perspective.
00:12:59.840 In the hours in which, I think, 38 people have now died in two mass shootings in the U.S., more
00:13:06.600 people have died from ordinary shootings and by suicide and even by medical errors in hospitals,
00:13:14.720 right?
00:13:16.780 So, we should keep some proportion here.
00:13:20.740 And finally, whatever is the case with these specific shooters, whether or not they're both
00:13:25.740 people of the fourth type I describe in this essay, people who are motivated, in this case,
00:13:31.860 by the lunatic ideology of white nationalism, and that may yet prove to be the case, it is obviously
00:13:39.580 a bad thing that we have a president who utterly fails to be clearly and consistently opposed
00:13:49.160 to these ideas.
00:13:50.960 Yes, you can find him in the aftermath of Charlottesville saying one measly thing against
00:13:58.360 white supremacy.
00:14:00.380 But to say that he has been ambiguous on this issue is an understatement, right?
00:14:07.680 To say that he has given comfort to racists is an understatement.
00:14:12.400 He completely lacks a decent ethical political response to these trends.
00:14:18.700 I'm not a fan of dog whistle theory.
00:14:22.140 I don't actually think he's dog whistling in his statements to white supremacists.
00:14:28.320 I think he's just an ordinary Archie Bunker style racist who doesn't care about these issues
00:14:33.960 and doesn't want to alienate anyone in his base.
00:14:36.480 And I think the people who are endlessly talking about dog whistles are doing much more harm
00:14:41.460 than good in our political discourse.
00:14:43.940 Not everything is a dog whistle.
00:14:46.080 In fact, almost nothing is a dog whistle.
00:14:48.520 I'm not saying the phenomenon doesn't exist, but generally racists just tell you what they
00:14:52.740 think.
00:14:53.320 And when they talk to other racists, they're explicit about their racism.
00:14:57.100 And it really does matter that the left's allegations against Trump and his supporters are so poorly
00:15:05.060 targeted.
00:15:06.320 You know, when he tells Ilhan Omar to go back to where she came from.
00:15:11.760 On the left, that is proof positive of racism.
00:15:15.060 Again, I have no doubt that Donald Trump is actually a racist, but that's a bad example
00:15:21.140 of racism.
00:15:22.260 It can be read in other ways.
00:15:24.300 And to think that it's a dog whistle to neo-Nazis is just an act of leftist clairvoyance that strikes
00:15:31.940 me as totally counterproductive.
00:15:33.380 To remind you how crazy this has all become, there was a Washington Post opinion editor who
00:15:39.520 claimed that Nancy Pelosi was dog whistling to racists when she criticized AOC and Ilhan
00:15:46.840 Omar and the rest of the so-called squad.
00:15:50.280 Nancy Pelosi?
00:15:52.440 The dog whistle meme is going to prove politically suicidal on the left.
00:15:58.340 We have to be precise, even when attacking racists.
00:16:03.380 So whatever turns out to be true in this case, whether either one of these mass shootings
00:16:08.880 is a clear example of white nationalist terrorism, the problem with Trump is not that
00:16:16.600 he is a clear supporter of white nationalist terrorism or even white nationalism.
00:16:22.200 The problem is he is an obscenely amoral president who can't be counted upon to say anything
00:16:31.080 beyond what he imagines is narrowly self-serving politically and financially.
00:16:38.060 To use a great word which is now much overused, this is the U.S. presidency reduced to a grift.
00:16:46.040 And it's awful, but it is not always precisely awful in the ways that are alleged on the left.
00:16:53.280 And again, every error matters.
00:16:57.780 We are guaranteed to have Trump for four more years if the Democrats can't get their house in order.
00:17:05.780 So my political concern here is that this not get overplayed and overspun.
00:17:10.880 It's totally possible that one of these shooters is mentally ill.
00:17:17.100 And if this still gets talked about as white nationalist terrorism, rather than a symptom of mental illness,
00:17:25.480 that is going to be a political problem.
00:17:28.520 And no, this is not a double standard.
00:17:31.360 There are acts of violence perpetrated by Muslims that are not examples of jihadism, much less jihadist terrorism.
00:17:38.660 Sometimes people really are violent for other reasons, as I sought to make clear in this essay.
00:17:46.420 However, it is yet another very dark moment.
00:17:51.180 And this has all been horrible news.
00:17:53.520 But I will leave it there.
00:17:57.360 And now for today's podcast.
00:18:00.680 Okay, well, in this episode of the podcast, I speak with Judea Pearl.
00:18:05.420 Judea is a professor of computer science at UCLA.
00:18:08.660 He's the author of three highly influential scholarly books.
00:18:13.200 He's also the winner of the Alan Turing Award, often considered the equivalent of the Nobel Prize for computer science.
00:18:20.260 He's a member of the U.S. National Academy of Sciences.
00:18:23.300 He's one of the first ten inductees into the IEEE Intelligent Systems Hall of Fame.
00:18:28.000 He's received numerous awards and honorary doctorates, including the Rumelhart Prize,
00:18:35.460 the Benjamin Franklin Medal, and the Lakatos Award at the London School of Economics.
00:18:41.640 And he's also the founder and president of the Daniel Pearl Foundation.
00:18:45.220 And that is because he's the father of Daniel Pearl, who was, I believe, the first journalist killed by al-Qaeda.
00:18:57.740 At least the first that came to the attention of everyone in the aftermath of September 11th.
00:19:03.500 Anyway, I mentioned this at the beginning because it would have been awkward to have just ignored it,
00:19:10.540 but as you'll hear, I didn't have the heart to make Judea's experience there a topic of conversation.
00:19:20.360 So I opened that door only to close it, and then we just go on to have a fairly highbrow conversation
00:19:27.640 about how science has generally failed to understand causation.
00:19:33.540 We talk about the different levels of causal inference, counterfactuals, the foundations of knowledge,
00:19:40.480 the nature of possibility, the illusion of free will, artificial intelligence, the nature of consciousness, and other topics.
00:19:48.620 Anyway, at one point, I get confused about what we're talking about.
00:19:53.740 So it's a bit of a nerd fest, but I really enjoyed it.
00:19:58.800 And as you'll hear, Judea is a dear person, and it was a great privilege to meet him.
00:20:06.440 So now, without further delay, I bring you Judea Pearl.
00:20:15.860 I am here with Judea Pearl.
00:20:17.520 Well, Judea, thanks for coming on the podcast.
00:20:19.900 Thank you, Sam. It's great to be here.
00:20:22.400 So we've been circling this podcast for quite some time.
00:20:25.900 It's just taken a while to actually get together, and we have many areas of overlapping interest.
00:20:31.920 So I'm looking forward to talking to you about your work.
00:20:34.260 I was prepared, as I said, offline to just talk about your academic work, and we'll get deep into that.
00:20:41.020 But given my background as a critic of Islam and as a worrier about the link between specific religious ideas and specific forms of violence,
00:20:51.400 it's awkward for me to bring it up, but it's awkward for me to ignore it as well.
00:20:55.900 Well, Danny Pearl was your son, who was, I believe, the first, or at least the most visible, journalist murdered.
00:21:04.920 I think the first journalist.
00:21:05.860 Yeah, after 2001.
00:21:08.560 So I just, I wanted to kind of just mention that at the outset, we can talk about it or not, if, as you like.
00:21:15.760 Perhaps we should talk about this topic separately, so we can separate it into two discussions.
00:21:21.060 Okay, okay.
00:21:21.960 I don't feel it's strange talking about it.
00:21:25.260 I've gotten used to talking about it.
00:21:27.140 But I think, in terms of listeners' interest, some people have interest in the technical part, and some in the ideological part.
00:21:36.660 Right, right.
00:21:37.660 It's good to separate the two.
00:21:39.480 Okay, well, let's dive into your work and then see what happens, because your work is fascinating.
00:21:45.820 So, how would you describe what your intellectual focus has been in your career?
00:21:53.880 Recently, it has been the mathematization of cause and effect.
00:21:58.780 Let's put it very, very concisely and precisely.
00:22:03.300 But there's a direct connection to artificial intelligence that we'll talk about?
00:22:07.200 Because if we want robots to behave like us, to communicate with us in our language, we have to equip them with the ability to communicate in terms of cause and effect that this is our language.
00:22:22.000 If they act stupidly without knowing the difference between correlation and causation, they will not be able to supply us answers to questions that are burning for us.
00:22:34.980 Even simple questions like, why did the milk spill?
00:22:40.520 Because I pushed it or because I was irritated or things of that sort.
00:22:45.040 You want a good answer, a good explanation, so we can communicate.
00:22:49.840 So, you just mentioned this opposition between correlation and causation.
00:22:53.680 Yes, yes.
00:22:54.540 And this is a phrase that will be familiar to many people.
00:22:58.280 I think many people will be surprised that it has impeded scientific understanding to the degree that it has.
00:23:06.580 I mean, you make a very strong case that science has more or less ignored causation.
00:23:13.780 And yet, I think in the popular understanding, science is all about finding the causes of phenomena.
00:23:20.920 Correct.
00:23:21.560 And so, maybe we can speak for a few minutes about how statistics has rendered us unable to speak about causes.
00:23:31.280 It's not only statistics, it's science in general.
00:23:34.440 Yeah.
00:23:35.080 You see, we learn physics.
00:23:37.920 Every high school kid can solve physics homework.
00:23:42.100 And if you look at the physics homework, you have boundary conditions, you have the equations of motion, and find out what's going to happen.
00:23:54.840 Or even what's going to happen if you intervene and you change the spring length to double its previous value.
00:24:03.020 It's a causal question.
00:24:04.880 Right.
00:24:05.060 And every child can do that.
00:24:06.380 But when you're trying to transfer this knowledge to a computer, to a robot, then the robot is facing a clash here.
00:24:21.240 The equations of physics are symmetric, which means that x causes y to the same degree as y causes x.
00:24:28.680 Which means that the movement of the barometer depends on the pressure.
00:24:33.800 The same way that the pressure depends on the movement of the barometer.
00:24:38.620 So, when a robot comes in and looks at the equation and says, hmm, let me change the weather tomorrow by moving this barometer a little bit, right?
00:24:47.380 What would prevent the robot from doing that?
00:24:50.720 Yes, it's the same thing that prevents the high school kids from giving that same answer.
00:24:57.440 Right.
00:24:57.700 But what the high school kid has is the notion of cause and effect.
00:25:01.000 So, the high school kid filters the equations in his or her mind before giving you the answer.
00:25:09.660 And that is a kind of filtering that we need to do here to introduce the asymmetry between cause and effect and do it mathematically.
00:25:19.280 Because the robot doesn't understand the hand-waving.
00:25:23.620 Yeah.
00:25:23.880 Robots must understand equations.
00:25:25.500 So, we need an algebra, which is asymmetric, to capture the asymmetry in nature.
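To make the asymmetry concrete, here is a minimal sketch in Python of the barometer example. The numbers and the calibration constant are invented, and the do()-style intervention is only an illustration of the idea Pearl develops formally, not his actual notation:

```python
# A toy structural model of the barometer example. Each variable has its
# own mechanism, and an intervention overrides only the mechanism of the
# variable being set -- that is the asymmetry the equations alone lack.

K = 0.5  # hypothetical constant relating pressure to the reading

def simulate(do_pressure=None, do_barometer=None):
    # Mechanism 1: atmospheric pressure is set by the weather.
    pressure = 1013.0 if do_pressure is None else do_pressure
    # Mechanism 2: the barometer reads the pressure, not the other way around.
    barometer = K * pressure if do_barometer is None else do_barometer
    return pressure, barometer

print(simulate())                     # (1013.0, 506.5): the natural regime
print(simulate(do_pressure=990.0))    # changing the pressure moves the reading
print(simulate(do_barometer=700.0))   # changing the reading leaves pressure at 1013.0
```

Solving the symmetric equation barometer = K * pressure for either variable is algebraically trivial; it is the assignment structure above, not the algebra, that encodes which way the influence runs.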
00:25:31.780 Right.
00:25:31.820 So, it's asymmetric with respect to influence.
00:25:35.760 Influence.
00:25:36.160 Time is usually the signature of influence.
00:25:38.280 Correct.
00:25:38.740 But it's not only the time.
00:25:40.200 Yeah.
00:25:40.300 It's not only the time.
00:25:41.440 Yeah.
00:25:41.940 We can show many cases in which the temporal direction, temporal order is different.
00:25:47.960 And still, X causes Y, and Y doesn't cause X.
00:25:52.000 Right.
00:25:52.280 It's very simple.
00:25:53.600 I mean, you don't actually need teleology for that.
00:25:56.380 I'll give you an example.
00:25:57.660 The rooster crow precedes the sunrise.
00:26:02.260 Right.
00:26:02.400 And no one will say that the rooster crow causes the sunrise.
00:26:07.580 It's highly correlated, too.
00:26:09.200 Yeah.
00:26:09.320 So, the rooster crow appears to be a cause, if time were your only signature.
00:26:14.640 If the time is your only signature, right.
00:26:16.780 It's not sufficient.
00:26:18.580 So, you talk about three levels of causation.
00:26:22.240 Mm-hmm.
00:26:22.600 And maybe back up for a second and do a little more history of ideas.
00:26:26.860 So, David Hume, the Scottish philosopher, has been very influential here in alleging that,
00:26:35.200 at least in one place in his work, that we never, we have no direct knowledge of causes ever.
00:26:40.300 All we have is the conjunction or the correlation, the coincidence of two events.
00:26:47.040 And when, you know, event B reliably follows event A, we impute causation where, in fact, there's no other knowledge ever gained there.
00:26:57.340 And, you know, I've always felt that that's almost a kind of semantic game which ignores some background intuitions we have
00:27:05.400 that reach deeper into the way the world is than just mere B following A.
00:27:11.100 First, it's ignore experiments.
00:27:13.320 Mm-hmm.
00:27:13.880 And Galileo lived before Hume.
00:27:16.680 Right.
00:27:16.940 So, I'm surprised that Hume did not pay attention to Galileo, although Galileo didn't make it explicit
00:27:23.880 that with experiments we get additional knowledge that you could not get by passive observation.
00:27:29.820 But Hume puts too much emphasis on regularity, which was criticized by many other people.
00:27:39.020 But then Hume changed his mind.
00:27:42.700 Yeah.
00:27:42.820 Between his Enquiry and his Treatise of Human Nature.
00:27:49.160 Mm-hmm.
00:27:50.160 And after, I think, seven or nine years, he said, in other words, and then he brought up a counterfactual definition of causation.
00:28:03.500 Right.
00:28:03.800 Had the object been different, the results would have, I don't have the exact phrasing, I have it in my book,
00:28:10.740 that he changed from regularity to counterfactual.
00:28:15.140 Had the object been different, then the outcome would be different.
00:28:19.440 And even put the words, in other words, between them, as if they were the same.
00:28:26.040 Right.
00:28:26.320 But they are totally different.
00:28:27.840 The first one is statistical regularity, which sits on the lowest level of the ladder, and the counterfactual is the top layer, the third layer.
00:28:39.460 Yeah, so let's talk about the three layers.
00:28:40.960 You described them at one point as seeing, doing, and imagining.
00:28:45.660 Right.
00:28:46.040 So seeing is this, well, I'll let you describe it.
00:28:49.460 What is seeing?
00:28:50.360 Seeing is you are sitting there like an astronomer, passively observing phenomena, with your hands tied behind your back,
00:29:00.040 and you are talking about how your belief changes with additional observation.
00:29:07.400 That's statistics.
00:29:08.300 If you see another piece of evidence, you change your belief.
00:29:14.160 Whether you see symptoms and you change your belief about a disease, or you see a disease and you have expectations about symptoms.
00:29:21.800 So this is what statistics is all about.
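As a small sketch of this "seeing" rung, here is the symptom-and-disease update done with Bayes' rule in Python; all of the probabilities are invented for illustration:

```python
# Belief revision from passive observation: P(disease | symptom) via Bayes' rule.

prior = 0.01            # P(disease), the belief before any observation
p_sym_disease = 0.90    # P(symptom | disease)
p_sym_healthy = 0.10    # P(symptom | no disease)

numerator = p_sym_disease * prior
evidence = numerator + p_sym_healthy * (1 - prior)
posterior = numerator / evidence

print(f"belief before seeing the symptom: {prior:.3f}")      # 0.010
print(f"belief after seeing the symptom:  {posterior:.3f}")  # ~0.083
```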
00:29:25.520 And so that's the domain of mere correlation and Humean juxtaposition.
00:29:31.380 Correct.
00:29:31.700 At least the first, Hume in his first mood.
00:29:33.800 And that, by the way, is the domain of machine learning today.
00:29:37.000 Right.
00:29:37.360 Curve-fitting.
00:29:39.040 Yeah.
00:29:40.060 Under noise, of course.
00:29:41.820 Right.
00:29:42.680 So that has been the dominant theology.
00:29:47.280 Maybe, I mean, we're going to head toward AI for a second, but maybe we should elaborate on that just for a stretch of 30 seconds.
00:29:54.400 Machine learning takes in an immense amount of data and finds correlations which prove useful as long as we give it information as to what constitutes success.
00:30:06.500 Yeah, so it's like take a facial recognition task.
00:30:09.360 It's an example.
00:30:10.180 And there's just, there's that mere correlation combined with sufficient computational power can prove very useful.
00:30:20.340 Very useful.
00:30:20.660 It's just not, it's just...
00:30:21.640 Amazingly useful.
00:30:22.880 Yeah.
00:30:23.280 Just obviously not the basis of general intelligence of the sort that we are, we'll later talk about.
00:30:28.740 It is debatable whether it is sufficient for general intelligence or not.
00:30:35.380 Seems unlikely, yeah.
00:30:36.520 But my opinion is no, because I've seen mathematically that there are barriers that you cannot cross.
00:30:43.300 Right.
00:30:43.560 Okay, so we'll get to AI in a second and the robots that may or may not kill us.
00:30:48.520 So, seeing, then there's doing.
00:30:51.200 What is doing?
00:30:52.080 Doing is running an experiment.
00:30:54.780 I'm wondering whether cancer, whether smoking causes cancer.
00:31:02.300 So, I conduct an experiment.
00:31:04.440 It's as old as Daniel in the lion's den.
00:31:08.340 Okay, in the book of Daniel, you have a first experiment where Daniel and his fellow Israelites who were exiled refused to eat the food.
00:31:21.020 It wasn't kosher.
00:31:22.740 And the king, Nebuchadnezzar, commanded them to eat the king's food because it was much healthier.
00:31:30.560 And he depended on their talents to run the empire.
00:31:35.300 Right.
00:31:35.940 So, Daniel proposed an experiment.
00:31:39.200 Okay, take a few of us, give them vegetarian food, and take the other groups and give them the king's food and see who is going to be more healthier looking.
00:31:51.140 And that was the first experiment that we know of.
00:31:54.160 Almost controlled, almost randomized.
00:31:56.100 Yeah, I don't know which one is the control there, but yeah.
00:31:58.140 Well, take a group.
00:31:59.860 Yes.
00:32:00.140 So, let's say you split the group into two parts.
00:32:06.700 One of them is control, the other one is treatment, they call them.
00:32:10.820 And you see the difference in the outcome.
00:32:14.220 It's an experiment, but of course,
00:32:17.040 this was invented only in the 1930s.
00:32:21.140 The idea of randomized experiments.
00:32:24.180 A randomized controlled experiment.
00:32:25.600 Randomized, yes.
00:32:26.320 The 1930s, yeah.
00:32:27.400 Yes.
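A toy version of the design just described, in Python: split a group at random into treatment and control and compare outcomes. The effect size and the noise are invented for illustration:

```python
# A simulated randomized experiment. Random assignment balances the
# baselines across groups, so the difference in means estimates the effect.
import random

random.seed(0)
n = 100

def health_score(treated):
    baseline = random.gauss(50, 10)          # individual variation
    return baseline + (5 if treated else 0)  # assumed true effect: +5

treated = [health_score(True) for _ in range(n)]
control = [health_score(False) for _ in range(n)]

effect = sum(treated) / n - sum(control) / n
print(f"estimated effect: {effect:.2f}")  # close to the true +5
```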
00:32:28.260 But we have been dealing with cause and effect much before this, right?
00:32:32.920 Sure.
00:32:33.140 Even from a time of Daniel.
00:32:34.200 One hopes.
00:32:34.940 How did we manage?
00:32:36.100 Well, the child manages by conducting playful manipulation in the world.
00:32:43.040 The child finds out that moving one ball causes the other ball to move.
00:32:48.960 Playing with one toy makes a noise, and the other one doesn't.
00:32:52.560 So, it's called playful manipulation, and that, I believe, where we get most of our knowledge
00:32:59.320 about cause and effect in the world.
00:33:01.980 Yeah.
00:33:02.560 Yeah, you push the world and something happens.
00:33:04.820 With your own muscles.
00:33:06.440 Yeah.
00:33:06.760 Right.
00:33:07.640 Like Galileo dropped the two objects from the Tower of Pisa and looked at them with his own eyes.
00:33:15.400 Right.
00:33:15.720 That was essential.
00:33:18.580 So, the third level is imagining.
00:33:21.200 The third one is imagining, yeah.
00:33:24.400 Some people do not see the...
00:33:26.580 You can sit back if you want.
00:33:27.900 I can just swing this closer.
00:33:28.900 No, no.
00:33:29.060 Okay.
00:33:30.340 Yeah.
00:33:31.660 Imagining is looking at your theory of the world and manipulating it in your mind.
00:33:38.800 I start talking about imagining by showing the first sculpture that described impossible
00:33:49.320 objects.
00:33:50.660 It was a lion head connected to a human body.
00:33:53.720 Okay.
00:33:54.360 That was the first figurine, an ivory figurine, discovered from 32,000 years ago in a cave in
00:34:02.520 Germany, the first object, artifact, that depicted an impossible object.
00:34:10.220 And how was that created?
00:34:12.780 Well, the artist, in his or her mind, probably was his.
00:34:17.660 He imagined taking apart the human body, severing it, and putting on a lion's head.
00:34:29.420 Imagining it in your mind first and then putting it in the ivory.
00:34:33.500 Right.
00:34:33.920 And that was the key.
00:34:35.540 Okay?
00:34:36.060 You can manipulate things in your mind before doing it in the physical world.
00:34:41.640 And that is a terrific idea, because that creates, according to Harari, a market of promises.
00:34:50.420 Hmm.
00:34:50.900 Okay?
00:34:51.440 Yeah.
00:34:51.740 You all know Harari.
00:34:53.140 Yeah.
00:34:53.520 He's been on the podcast.
00:34:55.420 Yeah.
00:34:56.000 Do you know him?
00:34:56.600 He's very interesting.
00:34:57.660 I haven't met him personally.
00:35:00.180 We communicated in one message.
00:35:02.140 Uh-huh.
00:35:03.080 You guys should get together.
00:35:05.240 So, imagining is the domain of counterfactuals.
00:35:09.280 And counterfactuals are a very important part of this story.
00:35:13.040 It's essential for science.
00:35:14.540 How would you define a counterfactual?
00:35:17.100 It's figuring out an outcome that would have prevailed had a certain observation not taken place.
00:35:26.360 Had Hillary won the election.
00:35:28.980 Had Cleopatra's nose been longer than it really was, okay?
00:35:33.720 Had Julius Caesar not crossed the Rubicon.
00:35:36.960 Don't laugh, because that's how historians communicate.
00:35:41.580 Okay?
00:35:42.360 And they understand each other, and they form a consensus.
00:35:47.600 So, they can communicate, had Oswald not killed Kennedy, how would American politics develop?
00:35:55.640 When would we be pulled out of Vietnam and things of that sort?
00:35:58.940 And they can communicate that way, despite the fact that Oswald did kill Kennedy.
00:36:06.300 Right.
00:36:06.800 How can we form a consensus about things that are conflicting with the real trajectory of history?
00:36:18.460 So, it's a discussion of what might have been.
00:36:21.860 Might have been.
00:36:22.560 And it's anything that falls into the bin of, had the world been different, what could we say then?
00:36:30.660 Correct.
00:36:31.040 Right.
00:36:31.800 If I hadn't crossed the street at precisely that moment, how would my life be different?
00:36:36.880 And with that comes all the ethics.
00:36:39.240 You should have known better.
00:36:40.260 Great.
00:36:40.960 Yeah.
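As an aside for the technically minded: in Pearl's books, a counterfactual like these is computed from a structural model in three steps, abduction, action, and prediction. A minimal sketch in Python, with a hypothetical linear model and made-up numbers:

```python
# Toy structural model:  X := U_x,  Y := 2*X + U_y
# We observed X = 1 and Y = 3, and ask: what would Y have been had X been 0?

x_obs, y_obs = 1.0, 3.0

# 1. Abduction: infer the background noise consistent with what we saw.
u_y = y_obs - 2 * x_obs          # U_y = 1.0

# 2. Action: surgically set X to its counterfactual value.
x_cf = 0.0

# 3. Prediction: recompute Y under the same background conditions.
y_cf = 2 * x_cf + u_y
print(f"Had X been 0, Y would have been {y_cf}")  # 1.0, not the observed 3.0
```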
00:36:41.220 So, it can sound like a very dry export from the ivory tower, this notion of counterfactuals,
00:36:49.060 but it underpins so much of what we care about.
00:36:52.940 And I think we'll get into that.
00:36:55.220 There's another connection, for me, to the foundation of knowledge.
00:36:59.520 What does real knowledge consist in?
00:37:03.300 And it's not enough to be right by accident, right?
00:37:08.600 So, you can't, like, if I look at my watch and it's actually broken, but it happens to show the correct time at this moment,
00:37:16.740 it's wrong to say that I have knowledge of what time it is.
00:37:21.260 I, you know, because a minute later, you know, I will reveal that my methodology is such that it's not delivering me actual knowledge about the world.
00:37:28.920 So, you need to be able to ask, and this is a problem I always get into with religious people,
00:37:36.060 when I, you know, when I criticize religion, I criticize it for this.
00:37:39.280 When you ask yourself, I would invite any believer to ask this question of themselves now,
00:37:45.700 would you believe in God if God didn't exist?
00:37:50.380 Do you stand in such relation to the truth of his existence such that you would not form a false belief that he exists?
00:37:59.480 Is your belief in God the result of being in some contact with reality such that if God didn't exist, you wouldn't believe he exists?
00:38:10.160 And I think, you know, any look at the history and psychology of religion demonstrates that,
00:38:17.320 in almost every case, apart from the mystics who have some vision of God that, you know, may in fact be a vision of God,
00:38:23.780 you know, who are we to judge, believers routinely violate this principle,
00:38:27.840 because the truth is they inherit these doctrines from previous generations that have merely asserted
00:38:34.860 that certain books were dictated by the creator of the universe,
00:38:37.260 and there's no more burden of evidence than that,
00:38:40.680 and there's no more reality testing or updating of beliefs generation after generation.
00:38:46.200 There's still the mere assertion that these ancient books are the perfect record of God's existence.
00:38:51.020 Well, you are facing now a specimen of a person who answers your description.
00:39:00.020 I don't believe in God.
00:39:01.940 Actually, I know that God doesn't exist.
00:39:04.180 Okay, you give me one better.
00:39:05.460 And I still believe in him.
00:39:07.400 Okay, well, that's going to get complicated.
00:39:10.220 Why?
00:39:10.740 Why?
00:39:11.720 Okay, well, so, all right, so I'm reluctant to take a full detour here, but it's too interesting.
00:39:16.400 So, okay, so what do you mean?
00:39:18.480 What do you mean?
00:39:20.340 God and religion are just poetry.
00:39:23.400 Okay, well, that's...
00:39:24.020 So I'm using certain metaphors.
00:39:26.920 Sure.
00:39:27.180 Which are very helpful to my cognition.
00:39:32.560 I'm using them to communicate with you, with my children, and I say,
00:39:37.200 yeah, God will punish you if you talk like that.
00:39:41.760 Why not?
00:39:43.820 Which means, look, to be more scientific about it,
00:39:48.140 most of our reasoning works around metaphors, similarities.
00:39:55.520 And the deepest metaphors that we have are the metaphors of family relations.
00:40:04.460 We are born to mother and father.
00:40:07.820 Our perception system is so attuned to whether our mother frowns or smiles.
00:40:14.400 It's the first thing that we learn.
00:40:17.440 You grow up, and you find out that the world is not only mother and father.
00:40:23.080 It has stars, and it has other things.
00:40:25.520 So you create a metaphor, because I understand mother and father.
00:40:29.200 I don't understand this movement of the stars.
00:40:32.800 So I would immediately come out with a conclusion that there is some force there,
00:40:38.380 like my father, that moves the stars around,
00:40:41.400 and like my father, teaches me things and punishes me for things.
00:40:46.680 And sometimes it's very natural.
00:40:49.820 So that's the basics of our cognition.
00:40:52.280 So I do not fight it.
00:40:54.240 I use it.
00:40:55.380 But I remember there's only poetry.
00:40:57.680 Right, okay.
00:40:58.780 Well, then you're in a parish with very few members at the moment.
00:41:04.040 But, you know, I mean, that's a legitimate use of poetry and literature, certainly.
00:41:10.160 But it's not what most people most of the time mean by God, as you know.
00:41:15.900 This is just to say that thinking about what might have been different at the level of belief.
00:41:23.820 So I believe certain things about the world.
00:41:25.660 And if I believe I'm in touch with the world, I believe that, for instance, I'm staring at a microphone that I put here.
00:41:33.080 I believe there's a microphone in front of me on the desk.
00:41:36.760 But implicit in that belief, to say that that really is my propositional attitude, that there's a microphone on the desk,
00:41:44.880 is the assertion that if there weren't a microphone on the desk, I wouldn't think there was one.
00:41:52.240 Right?
00:41:52.700 So there is a counterfactual built into just the assertion that this is a microphone, whether anyone ever thinks about it.
00:42:02.260 But as you point out, an understanding of counterfactuals or an ability to model them is the necessary ingredient to understanding what in fact is a cause,
00:42:15.200 as opposed to merely an event that happens to precede some other event in time or be associated with it.
00:42:22.700 A mere correlation.
00:42:24.200 It's necessary to believe in actual cause.
00:42:29.660 By actual, I mean something different than average cause.
00:42:34.860 Smoking is, on the average, harmful to your health.
00:42:42.380 Some people could benefit from smoking.
00:42:45.740 When you talk about an individual, then you talk about counterfactuals.
00:42:49.480 Right.
00:42:49.680 Had I not smoked, I would have lived X number of years.
00:42:54.140 Well, let's talk about the smoking case, because that was a fascinating bit of history in your book, which I thought I was aware of,
00:43:01.940 but it was actually a fair bit more grim and delusional than I realized.
00:43:07.960 There was a period of such active and protracted debate about whether or not smoking caused cancer that it went on far too long.
00:43:19.000 And you had people, you had scientists who were smoking two and three and four packs a day, denying the linkage.
00:43:27.560 And there's a nicotine-empowered level of confirmation bias that was ruling the conversation there.
00:43:33.000 What lessons do you draw from that period in our history?
00:43:36.880 To me, it means something perhaps different than to other people.
00:43:42.740 For me, it was an example of how scientists can argue about things for which they don't have a language.
00:43:52.560 They didn't have a language of causation at that time.
00:43:56.780 They had a language of randomized experiment, which they couldn't conduct on smoking.
00:44:01.100 Right.
00:44:01.560 And that gave Fisher, who was the top statistician at the time.
00:44:08.460 An avid pipe smoker, if I recall.
00:44:12.840 Yeah.
00:44:13.520 Pipe smoker all his life.
00:44:15.500 And it gave him ammunition to claim, hmm, maybe what we see here is just coincidental correlation
00:44:23.160 between some genetic factor that makes you crave for nicotine on one hand,
00:44:30.440 and it puts you in a cancer risk on the other.
00:44:33.240 So what we are seeing is just the effect of a confounder, a third variable that causes both.
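Fisher's confounding hypothesis is easy to see in a simulation. In the following Python sketch, a hypothetical gene raises both the chance of smoking and the chance of cancer, while smoking itself does nothing; every number is invented:

```python
# Confounding: gene -> smoking and gene -> cancer, with no arrow from
# smoking to cancer, still produces a strong smoking-cancer correlation.
import random

random.seed(1)
counts = {True: [0, 0], False: [0, 0]}  # smokes -> [cancer cases, total]

for _ in range(100_000):
    gene = random.random() < 0.3
    smokes = random.random() < (0.8 if gene else 0.2)   # gene -> smoking
    cancer = random.random() < (0.2 if gene else 0.02)  # gene -> cancer only
    counts[smokes][0] += cancer
    counts[smokes][1] += 1

for smokes in (True, False):
    cases, total = counts[smokes]
    print(f"smokes={smokes}: cancer rate {cases / total:.3f}")
# smokers come out near 0.13 and nonsmokers near 0.04, purely through the gene
```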
00:44:43.080 I am not sure that he did it because he was a smoker himself,
00:44:48.740 or because he wanted to be an ipcha mistabra, which means just a smart, a smart alec.
00:44:55.600 A smart ass.
00:44:56.060 A smart ass, okay.
00:44:58.620 And to show off his knowledge about statistics and about the possibility
00:45:05.100 that you might get the same results with a different hypothesis.
00:45:09.360 Right.
00:45:09.620 Yeah.
00:45:10.320 I'm not sure which was the case, but the fact that he resisted the conclusion of other people
00:45:18.860 went on for more than 10 years.
00:45:23.240 I think millions of people died as a result of that.
00:45:27.900 But eventually it was resolved by a commission,
00:45:31.560 and the Surgeon General came out with a statement that it does cause cancer.
00:45:38.120 And the way that came about was interesting.
00:45:41.200 They looked into the plausibility argument and calculated the degree to which the hidden
00:45:51.180 genetic factor would have to change your craving for nicotine, and that made it impossible,
00:45:59.160 or implausible: if you had this genetic factor, you would have to crave eight times more than
00:46:04.540 if you didn't have it.
00:46:05.360 There is no mechanism by which a genetic factor could make that degree of craving plausible, you
00:46:12.380 would have to say, okay.
00:46:13.180 That was the key to the conclusion that they came up with, and the consensus they came up
00:46:18.760 with, and things have been different since then.
00:46:22.580 Right.
00:46:22.760 But still, what one confronts there is the sense that, based on a purely statistical argument,
00:46:32.460 it's always an overreach to establish causation, no matter how much data you have of correlation.
00:46:39.100 Correct.
00:46:39.780 And that has not been appreciated to the degree it should be.
00:46:44.960 No causes in, no causes out.
00:46:48.000 That was Nancy Cartwright's slogan, which people, it makes sense.
00:46:55.680 No, no, it doesn't.
00:46:56.820 No causation without correlation.
00:46:59.100 Everybody understands, okay?
00:47:01.040 But the idea is that if you want to get a causal conclusion, you must have some causal assumption
00:47:06.740 someplace, or experiments.
00:47:10.100 One or the two.
00:47:10.740 Right.
00:47:11.420 This is so important, because so many people have forgotten.
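As a sketch of what a causal assumption buys you, here is Pearl's backdoor adjustment applied to the invented smoking model above. Assuming the gene is the only confounder, purely observational probabilities yield the interventional one, and correctly report a null effect:

```python
# Backdoor adjustment: P(cancer | do(smoking)) = sum_g P(cancer | smoking, g) * P(g).
# In the toy model, cancer depends on the gene alone, so smoking drops out.

p_gene = 0.3
p_cancer_given_gene = {True: 0.2, False: 0.02}  # independent of smoking here

def p_cancer_do(smoking):
    # P(cancer | smoking, gene) = P(cancer | gene) in this toy world,
    # so the adjusted risk is the same whether smoking is forced on or off.
    return sum(p_cancer_given_gene[g] * (p_gene if g else 1 - p_gene)
               for g in (True, False))

print(p_cancer_do(True), p_cancer_do(False))  # 0.074 0.074 -> no causal effect
```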
00:47:18.160 Let's linger on this notion of counterfactuals for another moment, because, so it does suggest
00:47:27.380 that possibility is a real thing.
00:47:31.500 And I've occasionally wondered, in fact, last time I wondered this in public, it was John
00:47:37.320 Brockman's final edge question, and the one I suggested was, I don't know if you were in
00:47:42.760 that particular round, but my last edge question, the question that year was, what should the
00:47:48.540 last edge question be?
00:47:50.780 And I believe my question was, is the actual all that is possible?
00:47:56.560 Which is to say that, is possibility an illusion?
00:47:59.640 Is there only what is actual?
00:48:01.400 Is the notion that something else could have happened always just an idea, and does it actually
00:48:07.720 not reach into anything that we can profitably think about?
00:48:13.100 Is there simply just the fact of the matter in every case?
00:48:16.520 And counterfactual thinking is explicitly thinking about what is possible, what might have been,
00:48:24.840 had things been different?
00:48:27.280 And I guess I'll just put it to you.
00:48:30.040 How do we know that possibility is even a thing?
00:48:35.580 It's useful to speak as though it were a thing.
00:48:37.880 And this actually connects to the topic of free will, which you write about in the book,
00:48:41.100 because, you know, you and I are convinced, you know, happily, not many people agree with
00:48:45.300 us, but you and I are both convinced that free will is an illusion, but in one way or
00:48:50.340 another, it's a useful or inevitable illusion.
00:48:53.080 But we still don't understand what makes it useful.
00:48:56.020 Right, right.
00:48:56.900 And you and I might disagree a little bit about how useful it is, but is it possible, and here
00:49:03.400 there's the useful invocation of the concept, is it possible that possibility is an illusion
00:49:10.660 as well?
00:49:11.820 It is.
00:49:13.600 Because the history of the people.
00:49:17.980 If you'd like to continue listening to this conversation, you'll need to subscribe at
00:49:23.440 samharris.org.
00:49:24.380 Once you do, you'll get access to all full-length episodes of the Making Sense podcast, along
00:49:29.360 with other subscriber-only content, including bonus episodes and AMAs and the conversations
00:49:34.860 I've been having on the Waking Up app.
00:49:36.980 The Making Sense podcast is ad-free and relies entirely on listener support, and you can subscribe
00:49:42.360 now at samharris.org.
00:49:44.240 Thank you.