Making Sense - Sam Harris - July 04, 2016


#39 — Free Will Revisited


Episode Stats

Length

45 minutes

Words per Minute

154.9

Word Count

7,034

Sentence Count

348


Summary

Sam Harris sits down with philosopher Dan Dennett at the TED Summit in Banff to revisit their long-running disagreement about free will. Harris opens by recalling their barbed written exchange from two years earlier: Dennett reviewed Harris's book "Free Will," calling it a "museum of mistakes," and Harris replied with an essay titled "The Marionette's Lament." To show how vexed things became, Harris reads the opening of that reply, including his quip about Downton Abbey and the passage in which Dennett invokes Wolfgang Pauli's "not even wrong," and he reminds listeners that debates like this aren't necessarily pointless, even though we've come to expect stalemate on every topic no matter how intelligent and well-intentioned the participants. The two then discuss Rapoport's rules of argumentation, what compatibilism does and does not claim, degrees of freedom and control, the intuition that one "could have done otherwise," Harris's Atlantis analogy, and Luther's "I can do no other," before the preview ends as they turn to moral responsibility.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.260 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.880 So if you enjoy what we're doing here, please consider becoming one.
00:00:46.580 Well, I just got back from Banff, where I attended the TED Summit, and I brought a portable recording
00:00:53.120 device to the conference on the odd chance that I might find someone worth talking to who
00:00:57.140 wanted to record a podcast.
00:00:59.560 Needless to say, there were many people worth talking to, but not much time to sit down and
00:01:04.120 do a podcast.
00:01:05.480 But I did record one conversation with the philosopher Dan Dennett, who probably needs no introduction
00:01:10.700 here.
00:01:11.600 As many of you know, Dan and I have been brothers in arms for many years, along with Richard Dawkins
00:01:16.500 and Christopher Hitchens, as the so-called New Atheists, or the Four Horsemen after a video
00:01:21.920 by that name that we recorded in Hitch's apartment some years back.
00:01:25.540 Dan and I once debated together on the same team, along with Hitch, at the Ciudad de las
00:01:31.120 Ideas conference in Mexico, where we were pitted against Dinesh D'Souza and Rabbi Shmuley
00:01:38.040 Boteach and Robert Wright, I believe, and Nassim Taleb got in there somehow.
00:01:44.340 I hope it doesn't seem too self-serving or contemptuous of our opponents to say that
00:01:50.040 we came out none the worse for wear on that occasion, and needless to say, that video is
00:01:54.880 online for all to see until the end of the world.
00:01:58.740 But as many of you know, Dan and I had a very barbed exchange on the topic of free will
00:02:03.480 some years later, and that was a little over two years ago, and we never resolved it.
00:02:08.180 I came out with my short book on free will, and Dan reviewed it, and then I responded to
00:02:13.120 his review, and the matter was left there in a way that no one found satisfying, least
00:02:19.120 of all our readers.
00:02:20.780 There really was an outpouring of dismay over the tone that we took with each other, and
00:02:25.920 I must say that was totally understandable.
00:02:28.200 I want to begin by reading the first few paragraphs of my response to Dan's review, which includes
00:02:34.940 a quotation from him, so you can hear how vexed and vexing things got.
00:02:39.620 And if you're interested, you can read the whole exchange on my blog.
00:02:43.080 In fact, when I post this podcast on my website, I'll provide the relevant links.
00:02:47.660 So this is near the beginning of my response, written as a letter to Dan.
00:02:51.480 I want to begin by reminding our readers, and myself, that exchanges like this aren't necessarily
00:02:56.360 pointless.
00:02:57.740 Perhaps you need no encouragement on that front, but I'm afraid I do.
00:03:01.360 In recent years, I've spent so much time debating scientists, philosophers, and other scholars,
00:03:05.500 that I've begun to doubt whether any smart person retains the ability to change his mind.
00:03:10.420 This is one of the great scandals of intellectual life.
00:03:13.480 The virtues of rational discourse are everywhere espoused, and yet witnessing someone relinquish
00:03:18.120 a cherished opinion in real time is about as common as seeing a supernova explode overhead.
00:03:23.680 The perpetual stalemate one encounters in public debates is annoying because it is so clearly
00:03:28.180 the product of motivated reasoning, self-deception, and other failures of rationality.
00:03:33.080 And yet we've grown to expect it on every topic, no matter how intelligent and well-intentioned
00:03:37.800 the participants.
00:03:38.940 I hope you and I don't give our readers further cause for cynicism on this front.
00:03:43.740 Unfortunately, your review of my book doesn't offer many reasons for optimism.
00:03:47.640 It is a strange document, avuncular in places, but more generally sneering.
00:03:52.300 I think it fair to say that one could watch an entire season of Downton Abbey on Ritalin and
00:03:56.780 not detect a finer note of condescension than you manage for 20 pages running.
00:04:00.720 And now I have a quotation from Dan's review here.
00:04:03.980 This is Dan.
00:04:05.120 I'm not being disingenuous when I say this museum of mistakes is valuable.
00:04:09.320 I am grateful to Harris for saying so boldly and clearly what less outgoing scientists are
00:04:14.120 thinking but keeping to themselves.
00:04:16.080 I've always suspected that many who hold this hard, determinist view are making these mistakes.
00:04:21.180 But we mustn't put words in people's mouths.
00:04:23.360 And now Harris has done us a great service by articulating the points explicitly.
00:04:27.320 And the chorus of approval he's received from scientists goes a long way to confirming
00:04:31.780 that they have been making these mistakes all along.
00:04:34.400 Wolfgang Pauli's famous dismissal of another physicist's work as, quote,
00:04:38.280 not even wrong, reminds us of the value of crystallizing an ambient cloud of hunches
00:04:43.460 into something that can be shown to be wrong.
00:04:46.000 Correcting widespread misunderstanding is usually the work of many hands.
00:04:50.000 And Harris has made a significant contribution.
00:04:52.800 End quote.
00:04:53.200 So this is back to me.
00:04:55.020 I say, I hope you will recognize that your beloved Rapoport's rules have failed you here.
00:04:59.700 As an aside, I should say, for those of you who are not familiar with them, these rules
00:05:03.240 come from Anatol Rapoport, the mathematician, game theorist, and social scientist.
00:05:08.960 And Dan has been a champion of these rules of argumentation for years.
00:05:12.640 And they are, one, attempt to re-express your target's position so clearly, vividly, and
00:05:17.700 fairly, that your target says, thanks, I wish I thought of putting it that way.
00:05:22.320 Two, list any points of agreement, especially if they are not matters of general or widespread
00:05:26.920 agreement.
00:05:28.100 Three, mention anything you have learned from your target.
00:05:31.820 Four, only then are you permitted to say so much as a word of rebuttal or criticism.
00:05:35.820 So those are the rules, and Dan has often said that he aspires to follow them when criticizing
00:05:43.180 another person's point of view.
00:05:45.360 So back to my text.
00:05:46.960 I hope you will recognize that your beloved Rapoport's rules have failed you here.
00:05:50.920 If you have decided, according to the rule, to first mention something positive about the
00:05:55.040 target of your criticism, it will not do to say that you admire him for the enormity of
00:05:59.520 his errors and the recklessness with which he clings to them despite the sterling example
00:06:04.020 you've set in your own work.
00:06:06.000 Yes, you may assert, quote, I am not being disingenuous when I say this museum of mistakes
00:06:10.580 is valuable, end quote.
00:06:12.380 But you are, in fact, being disingenuous.
00:06:15.840 If that isn't clear, permit me to spell it out just this once.
00:06:18.920 You are asking the word valuable to pass as a token of praise, however faint.
00:06:23.520 But according to you, my book is, quote, valuable for reasons that I should find embarrassing.
00:06:29.400 If I valued it as you do, I should rue the day I wrote it.
00:06:32.940 As you would, had you brought such value into the world.
00:06:36.300 And it would be disingenuous of me not to notice how your prickliness and preening appears.
00:06:41.020 You write as one protecting his academic turf.
00:06:43.680 Behind and between almost every word of your essay, like some toxic background radiation,
00:06:49.140 one detects an explosion of professorial vanity, end quote.
00:06:53.900 So that's how snide things got.
00:06:56.980 And I must say that this is really a problem with writing rather than having a face-to-face
00:07:02.900 encounter.
00:07:03.880 If any of you have ever had the brilliant idea of writing a long letter or email to a friend
00:07:08.580 to sort out some relationship crisis rather than just have a conversation, you've probably
00:07:13.540 discovered how haywire things can go through an exchange of texts.
00:07:17.640 And the same can be true for intellectual debates among philosophers and scientists.
00:07:21.880 And it's especially likely to happen if either or both of the people involved are writers
00:07:26.700 who get attached to their writerly maneuvers.
00:07:29.580 I remember writing that quip about Downton Abbey, and it made me laugh at the time.
00:07:35.100 I knew it would make many readers laugh, and so I kept it in.
00:07:38.920 But lines like that just amplify the damage done.
00:07:42.600 So as I told Dan at the end of our podcast, I very much regret the tone I took in this
00:07:48.200 exchange, and I'm very happy we got a chance to have a face-to-face conversation and sort
00:07:52.380 things out.
00:07:53.480 I don't think we resolved all the philosophical issues, but we spoke for nearly two hours,
00:07:58.600 but there were several important topics that never came up.
00:08:01.680 As you'll hear, we were speaking in a bar using a single microphone, and this was at the
00:08:06.340 end of a long day of conferencing.
00:08:08.140 So this isn't us at our most polished or prepared, but I thought it was a very good conversation,
00:08:14.260 and I think those of you who are interested in the problem of free will and its connection
00:08:17.440 to ethics will find it useful.
00:08:19.880 I still think there's some sense in which Dan and I are talking past one another.
00:08:23.480 The nature of our remaining disagreement never became perfectly clear to me.
00:08:28.060 So perhaps you guys can figure it out.
00:08:31.260 And now I give you Dan Dennett, in a bar, overlooking the Canadian Rockies.
00:08:38.140 So I'm here with Dan Dennett at the TED Summit in Banff, and we have stolen away from the
00:08:49.100 main session, and we are in a bar and about to have a conversation about the misadventure
00:08:55.300 we had in discussing free will online in a series of articles and blog posts.
00:09:02.040 I mean, you and I are part of a community, and a pretty visible part of a community that
00:09:07.060 prides itself on being willing to change its opinions and views more or less in real time
00:09:13.060 under pressure from better arguments and better data.
00:09:16.460 And I think I said in my article in response to your review of my book, Free Will, that this is a
00:09:24.740 very rare occurrence.
00:09:26.300 I mean, to see someone relinquish his cherished opinion more or less on the spot under pressure
00:09:31.780 from an interlocutor, that's about as rare as seeing a supernova overhead.
00:09:36.940 And it really shouldn't be, because there's nothing that tokens intellectual honesty more
00:09:43.420 than a willingness to step away from one's views once they are shown to be in error.
00:09:49.540 And I'm not saying we're necessarily going to get there in this conversation about free
00:09:54.380 will, but there was something that went awry in our written exchanges, tonally, and neither
00:10:02.800 of us felt good about the result.
00:10:04.660 And so, again, we'll talk about free will as well, but I think this conversation is proceeding
00:10:10.120 along two levels, where there's a thing we're talking about philosophically, which is free
00:10:14.280 will, but then there's just the way in which I want us to both be sensitive to getting hijacked
00:10:19.760 into unproductive lines that make it needlessly hard to talk about what is just a purely intellectual
00:10:27.360 philosophical matter.
00:10:28.980 And one of great interest, surprisingly great interest to our audiences.
00:10:34.800 There's no topic that I've touched that has surprised me more in the degree to which people
00:10:41.340 find it completely captivating to think about.
00:10:43.840 And I know you and I both think it's a very consequential topic.
00:10:47.540 It's unlike many topics in philosophy.
00:10:50.600 This one really does meet ethics and public policy in a way that is important.
00:10:55.140 So one thing you all should know in listening to this is that we have one microphone.
00:11:00.160 Perhaps this is a good thing because we really can't interrupt each other and we're just going
00:11:03.780 to pass this microphone back and forth.
00:11:05.780 And I now give you Dan Dennett.
00:11:08.340 Thanks, Sam.
00:11:09.200 This is a beautiful setting.
00:11:10.520 If we can't agree on some things here, we shouldn't be in this business.
00:11:14.620 That's right.
00:11:14.760 I want to go back one step further in how this got started.
00:11:21.040 You sent me the manuscript of your book, Free Will, and asked me for my advice.
00:11:27.660 And I didn't have time to read it.
00:11:28.880 I just told you, no, I'm sorry, I don't have time.
00:11:32.560 And then when the book came out, I read it and said, oh, I wish you'd, I forgot that we'd
00:11:36.760 had that experience.
00:11:37.380 I said, I wish you'd showed it to me because I think you made some big mistakes here.
00:11:40.580 And I would love to have tried to talk you out of them too late.
00:11:45.740 And then we, time passed and then we had the, you said you wanted me still to say what I
00:11:53.000 thought the mistakes were.
00:11:54.180 And that's when I wrote my piece for your blog and for Naturalism.
00:11:58.700 And it certainly struck you wrong.
00:12:01.380 And I guess I regret a few bits of tone there, but I think everything I said there is defensible.
00:12:10.040 And in particular, I did use Rapoport's rules, contrary to what you say.
00:12:15.420 If you look at the first paragraph of my piece, I applaud the book for doing a wonderful, clear
00:12:21.080 job of setting out a position which I largely agreed with.
00:12:24.640 And then I said, you went off the rails a little later.
00:12:26.520 So I did try to articulate your view, and I haven't heard you complain about that articulation
00:12:33.800 of your view.
00:12:35.180 And I said what we agree about.
00:12:38.400 And I even said what I've learned from that book.
00:12:41.580 So I did follow Rapoport's rules quite well.
00:12:45.880 But we can just set that aside if you want and get down to what remains of the issue.
00:12:51.640 One thing in particular, which I know came off awfully preachy, but I really think it was
00:13:01.400 most unwise of you to declare that my position sounded like religion, sounded like theology.
00:13:13.660 If you call my view theology, you have to know that you're insulting me.
00:13:18.320 And that was a pretty deliberate insult, and that was in the book.
00:13:21.660 And I thought, come on, Sam.
00:13:24.000 So you can't expect kid gloves.
00:13:26.380 If you're going to call me a theologian, then I'm going to call you on it and say, as I said,
00:13:31.920 I tell my students, when a view by apparently senior, you know, an author worth reading looks
00:13:39.100 that bad, maybe you've misinterpreted it.
00:13:42.900 And of course, the main point of my essay was, yes, you have misconstrued my brand of
00:13:51.340 compatibilism.
00:13:52.720 You've got a sort of a caricatured version of it.
00:13:56.320 And in fact, as I say late in the piece, you are a compatibilist in all but name.
00:14:03.200 You and I agree on so many things.
00:14:05.060 You agree with me that determinism and moral responsibility are compatible.
00:14:13.620 You agree that a system of law, including punishment and justified punishment, is compatible
00:14:21.880 with determinism.
00:14:24.460 That's, we're just that close to compatibilism.
00:14:28.180 I've actually toyed with the idea, in part, provoked by you and some others, Jerry Coyne
00:14:35.700 and others, to say, all right, I don't want to fight over who gets to define the term free
00:14:40.240 will.
00:14:41.820 As I see it, there are two themes out there about what free will is, and they are completely in tension.
00:14:49.620 One is that it's incompatible with determinism.
00:14:52.140 And the other is that it's the basis of moral responsibility.
00:14:54.760 I think it's the second one that's the important one.
00:14:57.980 That's the variety of free will worth wanting.
00:15:00.500 And I think the other one's a throwaway.
00:15:02.780 And I agree with you.
00:15:04.200 Indeterminist free will, libertarian free will, is a philosopher's fantasy.
00:15:09.920 It is not worth it.
00:15:11.820 It is just a fantasy.
00:15:15.000 So we agree on so much.
00:15:16.260 We have no love for libertarian indeterminism, for agent causation, for all of that metaphysical
00:15:26.060 gobbledygook.
00:15:27.040 We're both good naturalists.
00:15:29.420 And we both agree that the truths of neuroscience and the truths of physics, physics doesn't have
00:15:36.860 much to do with it, actually, are compatible with most of our understanding, our everyday
00:15:45.680 understanding of responsibility, taking responsibility, being morally responsible enough to be held
00:15:56.000 to our word, I mean, you and I both agree that you are competent to sign a contract.
00:16:02.720 Me too.
00:16:04.340 Well, you know, if you go and sign a deed or a mortgage, very often, if it's notarized,
00:16:14.100 the notary public will say, are you signing this of your own free will?
00:16:17.320 Well, and I recently did.
00:16:19.880 I said, yeah, I am.
00:16:21.540 That's the sense of free will that I think is important.
00:16:24.280 I have it.
00:16:25.700 There are a lot of people that don't have that free will, and it has nothing to do with indeterminism.
00:16:30.420 It has to do with their being disabled in some way.
00:16:34.360 They don't have a well-running nervous system, which you need if you're going to be a responsible
00:16:42.860 agent.
00:16:43.360 I think you agree with all of that.
00:16:44.900 So, I certainly agree with most of that.
00:16:47.720 I think there's some interesting points of disagreement on the moral responsibility
00:16:52.920 issue, which we should talk about, and I think that could be very interesting for listeners
00:16:58.700 for us to unpack those differences.
00:17:01.340 I am, needless to say, very uncomfortable with the idea that I have misrepresented your
00:17:07.440 view, and if I did that in my book, I certainly want to correct that here.
00:17:11.520 So, we should clearly state what your view is at a certain point here.
00:17:15.920 But I want to step back for a second before we dive into the details of the philosophy of
00:17:20.120 free will.
00:17:21.040 What I was aware of doing in my book, Free Will, and again, I would recommend that our
00:17:26.560 listeners just go back and, you don't actually have to read my book, but you can read Dan's
00:17:32.660 review of it on my blog, and you can read my response, which is entitled The Marionette's
00:17:37.420 Lament, I believe, then you can see the bad blood that was generated there.
00:17:42.320 And I don't know, Dan, if you're aware of this, you don't squander as much of your time
00:17:46.580 on social media or in your inbox, but I heard from so many of our mutual readers that they
00:17:53.240 were just despairing of that contretemps between us.
00:17:56.560 It was like mom and dad fighting, and it was totally unpleasant.
00:18:00.060 The thing that I really regret, which you regret that you didn't get a chance to read my book
00:18:03.780 before I published it, which that would have been a nice thing for both of us, but what
00:18:07.720 I regret is when you told me that you were planning to write a review of it, I kept urging
00:18:14.660 you and ultimately badgering you to not do that and have a discussion with me because
00:18:20.760 I knew what was going to happen, at least from my point of view, is that you would hit
00:18:24.940 me with this 10,000 word volley, which at a dozen points or more, I would feel you had
00:18:32.320 misconstrued me or gone off the rails, and there would be no chance to respond to those
00:18:36.660 and to respond in a further 10,000 word volley in a piecemeal way would just lead to this
00:18:42.400 exchange that was very boring to read and yielded a much bigger sense of disagreement than
00:18:49.880 what was necessary, right?
00:18:51.200 So if I have to spend 90% of my energy taking your words out of my mouth, then this thing
00:18:56.000 begins to look purely adversarial.
00:18:58.760 So one thing I've been struggling for in my professional life is a way of having conversations
00:19:03.260 like this, even ones where there's much less goodwill than you and I have for one another,
00:19:08.040 because you and I are friends and we're on the same side of most of these debates, and
00:19:12.320 so we should be able to have this kind of conversation in a way that's productive.
00:19:15.140 But I've been engaging people who think I'm a racist bigot as a starting point, and I
00:19:21.840 want to find ways of having conversations in real time where you can be as nimble as
00:19:27.640 possible in defusing unnecessary conflict or misunderstanding, and writing is an especially
00:19:34.320 bad way to do that.
00:19:35.540 Certainly writing the 10,000 word New York Review of Books piece that then someone has to react
00:19:41.420 to in an angry letter, you know?
00:19:43.120 So in any case, I wish we'd had that conversation, but we're having it now, and this is instructive
00:19:47.980 in its own way.
00:19:49.480 Feel free to react to that, but I guess I want you to also express what compatibilism means
00:19:56.960 to you, and if you recall the way in which I got that wrong, feel free to say that, but
00:20:01.000 I'll then react to your version of compatibilism.
00:20:04.360 Well, my view of compatibilism is pretty much what I just said, and you were nodding, and
00:20:12.120 you were not considering that a serious view about free will, although you were actually
00:20:18.020 almost all of it you were agreeing with, and you also, I think, made this serious strategic
00:20:27.400 or tactical error of saying this is like theology.
00:20:32.460 It smells of theology.
00:20:34.260 Well, as soon as you said that, I thought, well, you just don't understand what compatibilism
00:20:38.840 is.
00:20:39.100 It's the opposite of theology.
00:20:40.840 It's an attempt to look at what matters, to look at the terms and their meanings, and
00:20:50.800 to recognize that sometimes ancient ideology gets in the way of clear thinking so that you
00:21:00.140 can't just trust tradition.
00:21:02.140 If you trusted tradition and the everyday meanings of words, we would have to say all sorts of
00:21:08.080 silly things we've learned.
00:21:10.180 In fact, one of the abiding themes in my work is there are these tactical or diplomatic choice
00:21:18.620 points.
00:21:19.640 You can say, oh, consciousness exists.
00:21:24.540 It just isn't what you think it is.
00:21:26.420 Or you can say, no, consciousness doesn't exist.
00:21:29.280 Well, if you've got one view of consciousness, if it's this mysterious, magical, ultimately insoluble
00:21:37.080 problem, then I agree, consciousness in that sense doesn't exist.
00:21:41.800 But there's another sense, much more presentable, I think, which of course consciousness exists.
00:21:49.180 It just isn't what you think it is.
00:21:51.660 That was a central theme in Elbow Room with regard to free will and in Consciousness Explained
00:21:58.080 with regard to consciousness.
00:21:59.100 My view, my tactic, and notice, those two views, they look as if they're doctrinally
00:22:08.920 opposed.
00:22:09.520 They're not.
00:22:10.420 They're two different ways of dealing with the same issue.
00:22:13.260 Does free will really exist?
00:22:15.160 Well, if free will means what Dennett says it means, yes.
00:22:19.880 And you agree.
00:22:20.820 If it means what some people think, then the answer is no.
00:22:24.820 Yeah, I understand that, but I would put to you the question, there is a difference between
00:22:31.500 explaining something and changing the subject.
00:22:35.820 So this is my gripe about compatibilism, and this is something we'll get into.
00:22:40.140 But I assume you will admit that there is a difference between purifying a real phenomenon
00:22:46.300 of its folk psychological baggage, which I think this is what you think compatibilism is
00:22:51.160 doing, and actually failing to interact with some core features that are just ineliminable
00:23:00.000 from the concept itself.
00:23:01.540 Let me surprise you by saying, I don't think there's a sharp line between those two, and
00:23:05.540 I think that's quite obvious. As for whether I'm changing the subject, I'm so used to that
00:23:11.380 retort about any move along these lines.
00:23:14.600 So, no, I think that's just a debater's point.
00:23:17.920 We should just set that aside.
00:23:20.100 Saying you're just changing the subject is a way of declaring a whole manifold, a whole
00:23:29.540 variety spectrum of clarificatory views, which you're not accepting because you're clinging
00:23:36.300 to some core part of what free will is.
00:23:39.320 You want to claim that free will, the core of free will, is its denial of determinism.
00:23:47.400 And I've made a career saying that's not the core.
00:23:50.840 In fact, let me try a new line on you, because I've been thinking, why doesn't he see this
00:23:57.020 the way I see it?
00:23:58.400 And I think that the big source, the likely big source of confusion about this is that
00:24:08.440 when people think about freedom in the context of free will, they're ignoring a very good
00:24:15.140 and legitimate notion of freedom, which is basically the engineering notion of freedom when you talk
00:24:21.400 about degrees of freedom, my arms, you know, my wrist and my shoulder and my elbow, those
00:24:27.060 joints, there's three degrees of freedom right there.
00:24:31.220 And in control theory, it's all about how you control the degrees of freedom.
00:24:36.240 And if we look around the world, we can see that some things have basically no degrees of
00:24:41.560 freedom, that rock over there.
00:24:43.100 And some things, like you and me, have uncountably many degrees of freedom because of the versatility
00:24:49.160 of our minds, the capacities that we have.
00:24:51.900 We can be moved by reasons on any topic at all.
00:24:57.060 This gives us a complexity from the point of view of control theory, which is completely
00:25:02.060 absent in any other creature.
00:25:04.380 And that kind of freedom is actually, I claim, at the heart of our understanding of free will
00:25:11.520 because it's that complexity, which is not just complexity, but it's the competence to
00:25:17.380 control that complexity.
00:25:19.100 That's what free will is.
00:25:20.820 What you want, if you've got free will, is the capacity, and it'll never be perfect, to
00:25:28.340 respond to the circumstances with all the degrees of freedom you need to do what you think
00:25:36.200 would be really the right thing to do.
00:25:37.840 You may not always do the right thing, but let's take a dead simple case.
00:25:44.480 Imagine writing a chess program which, stupidly, it was written wrong so that the king could
00:25:52.580 only move forward or back or left or right like a rook and it could not move diagonally.
00:25:57.560 And this was somehow hidden in it so that it just never even considered moves, diagonal
00:26:02.840 moves by the king.
00:26:03.580 It's a completely disabled chess program.
00:26:06.560 It's missing a very important degree of freedom, which it should have and be able to control
00:26:11.720 and recognize when to use and so forth.
00:26:14.000 What you want, I mean, let me ask you a question about what would be ideal from the point of view
00:26:23.180 of responsibility.
00:26:24.060 What does an ideal responsible agent have?
00:26:30.200 It's not mainly true beliefs, a well-ordered set of desires, the cognitive adroitness to
00:26:40.540 change one's attention, to change one's mind, to be moved by reasons, the capacity to listen
00:26:47.000 to reasons, the capacity for some self-control.
00:26:50.820 These things all come in degrees, but our model of a responsible adult, someone you would
00:26:58.160 trust, someone you would make a promise to, or you would accept a promise from, is somebody
00:27:03.940 with all those degrees of freedom and control of them.
00:27:07.060 Now, what removes freedom from somebody is if either the degrees of freedom don't exist,
00:27:15.860 they're blocked mechanically, or some other agent has usurped them and has taken over control,
00:27:23.520 a marionette and a puppeteer.
00:27:25.660 And so, I think that our model of a free agent says nothing at all about indeterminism.
00:27:36.640 We can distinguish free agents from unfree agents in a deterministic world or in an indeterministic world.
00:27:46.180 Determinism and indeterminism make no difference to that categorization, and it's that categorization
00:27:52.180 which makes the moral difference.
00:27:53.880 So, yeah, I agree with almost all of that.
00:27:57.120 I just need to put a few more pieces in play here.
00:27:59.860 I think there is an important difference, nevertheless.
00:28:02.980 I agree that there is no bright line between changing the subject and actually purifying
00:28:07.040 a concept of illusions and actually explaining something scientifically about the world.
00:28:12.700 But in this case, the durability of free will as a problem for philosophers and now scientists
00:28:19.620 is based on people's first-person experience of something they think they have.
00:28:25.560 People feel like they are the authors of their thoughts and intentions and actions.
00:28:30.540 And so, there's a first-person description of this problem, and there's a third-person description
00:28:35.740 of this problem.
00:28:36.360 And I think if we bounce between the two without knowing that we're bouncing between the two,
00:28:41.780 we are losing sight of important details.
00:28:44.580 So, people feel that they have libertarian free will.
00:28:48.380 And when I get emails from people who are psychologically destabilized by my argument that free will doesn't
00:28:54.420 exist, these are people who feel like something integral to their psychological life and well-being
00:29:00.860 is being put in jeopardy.
00:29:03.260 And I can say this from both sides because I know what it's like to feel that I could have done otherwise.
00:29:09.600 So, let me just, for listeners who aren't totally up to speed here,
00:29:14.440 libertarian free will is this, is anchored to this notion of I could have done otherwise.
00:29:19.020 So, if we rewound the universe to precisely as it was a few moments ago, I could complete
00:29:25.880 this sentence differently than I did.
00:29:28.520 You know, whether you throw indeterminism or determinism or some combination thereof,
00:29:34.100 there is no scientific rationale for that claim.
00:29:37.160 If you rewound the universe to precisely its prior state with all relevant variables intact,
00:29:43.520 whether deterministic or indeterministic, these words would come out of my mouth in exactly
00:29:47.800 the same order and there would be no change.
00:29:51.440 I would speak this sentence a trillion times in a row with its errors, with its glitches.
00:29:56.920 So, people feel that if they rewound the movie of their lives, they could do differently in
00:30:03.180 each moment.
00:30:03.860 And that feeling is the thing that is what people find so interesting about this notion that
00:30:09.400 free will doesn't exist because it is so counterintuitive psychologically.
00:30:12.900 Now, I can tell you that I no longer feel that subjectively.
00:30:17.100 My experience of myself, I'm aware of the fact that it is a subjective mystery to me how
00:30:24.960 these words come out of my mouth.
00:30:26.880 It's like, I'm hearing these words as you're hearing these words, right?
00:30:28.980 I'm thinking out loud right now.
00:30:30.180 I haven't thought this thought before I thought it, right?
00:30:32.460 It's just coming.
00:30:33.140 And I am subjectively aware of the fact that this is all coming out of the darkness of my
00:30:42.720 unconscious mind in some sense.
00:30:44.480 There's this sphere of my mind that is illuminated by consciousness, for lack of a better word.
00:30:49.700 And I can be subjectively identified with it.
00:30:55.220 But then there's all the stuff that is simply just arriving, appearing in consciousness, the
00:31:00.360 contents of consciousness, which I can't notice until I notice them.
00:31:03.920 And I can't think the thought before I think it.
00:31:06.100 And my direct experience is compatible with a purely deterministic world, right?
00:31:13.700 Now, most people's isn't, or they don't think it is.
00:31:16.100 And so that's where, when you change the subject, so the analogy I used in my article that responded
00:31:21.540 to your review, which I still think captures it for me, I'll just pitch it to you once more,
00:31:27.140 is the notion of Atlantis.
00:31:29.440 So people are infatuated with this idea of Atlantis.
00:31:32.480 I say, actually, Atlantis doesn't exist.
00:31:34.560 It's a myth.
00:31:35.500 There's nothing in the world that answers to the name of Atlantis.
00:31:38.020 There was no underwater kingdom with advanced technology and all the rest.
00:31:42.860 And whoever it was, Plato, was confused on this topic or just spinning a yarn.
00:31:48.760 And you, compatibilism, your variant and perhaps every variant, takes another approach.
00:31:56.260 It says, no, no, actually, there is something that conserves much of what people are concerned
00:32:02.040 with about Atlantis.
00:32:03.200 And in fact, it may be the historical and geographical antecedent to the first stirrings
00:32:08.680 of this idea of Atlantis.
00:32:09.860 And there's this island of Sicily, the biggest island in the Mediterranean, which answers
00:32:13.920 to much of what people care about with Atlantis.
00:32:16.780 And I say, well, but actually what people really care about is the underwater kingdom with
00:32:20.840 the advanced technology.
00:32:22.280 And that is a fiction.
00:32:23.920 So you and I are going to agree about Sicily.
00:32:26.320 99% of our truth claims about Sicily are going to converge.
00:32:30.500 But I'm saying the whole reason why we're talking about Atlantis in the first place is there's
00:32:35.220 this other piece that people are attached to, which by you purifying the subject, you're
00:32:41.300 actually just no longer interacting with that subjective piece.
00:32:44.420 Yeah, that's well put.
00:32:46.980 I think the analogy is, let's say, instructive.
00:32:53.300 I don't think it's entirely fair, but let's leave it at that.
00:32:55.780 But your position is that you can see very clearly that what people really care about
00:33:05.820 is that free will should be something sort of magical.
00:33:10.820 And you're right.
00:33:12.740 A lot of people, if you don't think free will is magical, then you don't believe in free will.
00:33:18.300 And that's what I confront and say, well, I got something which isn't magical, which is perfectly
00:33:28.280 consistent with naturalism and gives us moral responsibility, justification for the way we
00:33:37.240 treat each other, the distinctions that matter to us, like who do we hold responsible and
00:33:42.300 who don't, who do we excuse because they don't have free will.
00:33:44.700 It gives us all of the landmarks of our daily lives and explains why these are what matters.
00:33:54.980 And indeed, though, if the mystery, if the magic is that important to people, I agree with
00:34:01.860 you, that magic doesn't exist.
00:34:04.220 And if we're going to tie free will to that, then I would say, no, free will doesn't exist.
00:34:11.480 Now, you said something very interesting.
00:34:14.140 You said that the reason people believe in this is because they feel it or they think
00:34:22.200 they do.
00:34:22.560 They sort of intuit they could have done something different in exactly the same situation.
00:34:29.660 I agree with you that that's what they think.
00:34:34.060 But I don't think that it is a forlorn task to show them that that's not really what they
00:34:43.760 should think about this, about the very feelings they have.
00:34:48.280 Their sense that they are, as Kant says, acting under the idea of freedom.
00:34:53.760 That's right.
00:34:54.580 They are.
00:34:55.360 And that's the only way an agent can be.
00:34:58.440 This is a fairly deep point that an agent has to consider some things fixed and some
00:35:07.340 things not fixed.
00:35:08.280 You can't decide.
00:35:09.720 Otherwise, the whole setting of decision making depends on there being that kind of freedom.
00:35:16.520 And so it's no wonder, in a way, that people who are impressed with that decide that what
00:35:29.160 they experience is a sense of utter freedom.
00:35:34.440 They don't need utter freedom.
00:35:35.720 What they need and can have is the sense that in many very similar circumstances, circumstances
00:35:46.760 which differed maybe only in a few atoms, they would have made another decision.
00:35:53.040 And as soon as you allow any tiny change when you rewind the tape, the whole business about
00:36:02.960 determinism falls out of the picture.
00:36:04.560 And that's why in, actually, several places, I've gone to considerable length, probably
00:36:12.560 too long, to trot out examples where we have a decision maker which, in a demonstrably
00:36:20.880 deterministic world, is playing chess, and it loses the game, and its designer says, well,
00:36:28.660 it could have castled.
00:36:30.840 What do you mean it could have castled?
00:36:33.060 What the designer means is it was just the luck of the draw.
00:36:38.360 A chess program, like any complicated program, is going to consult a random number generator
00:36:44.540 or a pseudo-random number generator at various points.
00:36:49.340 And this time, it chose wrong.
00:36:53.440 However, it chose wrong because when it got a number from the pseudo-random number generator,
00:37:00.360 it got a one rather than a zero.
00:37:02.200 Flip a single bit, and it would have made the other choice.
00:37:05.460 In other words, it's not a design flaw.
00:37:10.280 An agent could be, as it were, impeccably designed.
00:37:15.500 You couldn't improve the design of the agent.
00:37:17.500 So, that's what justifies saying, yeah, it could have done otherwise.
00:37:24.340 Half the time or more, it would have done otherwise.
00:37:26.460 This is just bad luck on this occasion.
00:37:28.980 Normally, it would have done otherwise.
00:37:30.640 So, I agree with all that.
00:37:33.340 I think you're not acknowledging, however, how seditious those facts are,
00:37:40.820 the degree to which they undermine people's felt sense of their own personhood.
00:37:46.860 So, if you tell me that but for a single charge at a synapse,
00:37:54.520 I would have decided I didn't want to have this conversation with you,
00:37:57.600 or I wouldn't have proposed to my wife, right?
00:38:00.940 And my entire life would be different.
00:38:03.720 Acknowledging the underlying neurophysiology of all of those choice points
00:38:08.320 and how tiny a difference can be that makes the crucial difference,
00:38:13.960 that suddenly brings back the marionette strings.
00:38:18.100 Now, no one's holding the strings.
00:38:19.380 The universe is holding the strings.
00:38:21.280 But that is not what people feel themselves to be.
00:38:24.660 This feeling that if you had had just one mouthful more of lunch,
00:38:31.740 something very different,
00:38:33.720 you would make a radically different decision six hours from now
00:38:37.380 than you are going to make.
00:38:38.600 That is a life that no one, virtually no one, feels they're living.
00:38:43.780 Oh, this is going in good directions.
00:38:46.520 I think you're largely right and exactly wrong in what you just said.
00:38:54.660 I think you're right that this is a subversive idea to many people.
00:39:01.120 They're so used to the idea that unless they're completely, absolutely undetermined,
00:39:07.020 then they don't have free will.
00:39:09.620 Now, the trouble with that is, if you look closely at that idea,
00:39:14.060 you see, if they were absolutely undetermined,
00:39:16.420 that wouldn't give them free will either.
00:39:17.700 So, that's a red herring.
00:39:23.540 So, let's look at what does matter.
00:39:27.580 It's interesting that you say that if I thought that some tiny atomic change
00:39:36.660 would have altered the course of some big important life decision,
00:39:41.120 let's look closely at that, because what I think we should say is, it is indeed true that there
00:39:55.680 are times when a decision is a real toss-up.
00:39:59.040 When you've thought about it, thought about it, thought about it, you're going to have to act
00:40:02.960 pretty soon and you just can't make up your mind.
00:40:06.340 In cases like that, and it may be something that's morally very important, the idea that when you
00:40:14.140 do make the decision, had the few atoms been slightly different, you would have made the other decision.
00:40:17.400 I don't find that upsetting at all, because that's one of those situations.
00:40:24.580 And it doesn't mean that when the evidence and the reasons are preponderantly on one side, no,
00:40:34.760 then you'd have to make a very large change in the world for a different decision to come out.
00:40:41.500 Sometimes the indeterminists, the libertarians, in fact, it's a sort of signature of a lot of
00:40:48.800 their views, say that there has to be an absolutely undetermined choice of some importance that's
00:40:55.560 somewhere in the causal chain of your life for your action to be responsible.
00:41:04.280 Now, thus, I had long thrust into their faces the example of Luther, who says, I can do no other.
00:41:14.900 He's not ducking responsibility.
00:41:17.720 He's saying, believe me, it wasn't, had the light been different or the wind not been blowing,
00:41:24.700 I would have, no, he's saying, I was determined to do this,
00:41:28.620 and yet, he's not saying it's not a free decision.
00:41:32.720 They, some of them, amazingly to me, fall for the bait and say, oh, well, that's only because
00:41:39.020 it must have been the case that somewhere in Luther's life, there was a moment, it might have
00:41:44.780 been in his childhood when there were two paths, A and B, and he chose A, which led to him
00:41:53.560 nailing the theses on the door, and at that moment it was absolutely undetermined that he choose A.
00:42:00.160 I think that's the craziest fantasy imaginable.
00:42:04.720 It doesn't depend on that.
00:42:07.240 So, I agree with you that when we think about how chance, luck, enters into our lives,
00:42:21.340 that can be very unsettling.
00:42:25.000 And we should not hide from the fact that there are times when it's a toss-up, and we may
00:42:38.140 rejoice in the decision we make or we may bitterly regret it, and the fact that we couldn't do,
00:42:47.140 that it was not in our control, it's, maybe it's a tragic fact, but it's not a fact which
00:42:56.620 disables us for responsibility.
00:43:01.220 You're playing chess, to take a deliberately trivial case.
00:43:08.920 You, considering two possible moves, for the life of you, you can't see what the better one is.
00:43:16.080 You sort of mentally flip a coin.
00:43:18.260 You don't know.
00:43:19.720 Works out great.
00:43:20.880 You're like, yeah, that's right, that's what I, you're very likely to retrospectively decorate
00:43:27.880 that with the claim that that's what you've determined.
00:43:31.280 Nah, you're kidding yourself.
00:43:32.740 You're just taking responsibility for a little bit of lucky random coin flip in your decision process.
00:43:41.400 That does not, in fact, not only does that not disable you for free will, I think an important
00:43:48.580 human point about free will is that free, responsible agents recognize that when they act,
00:43:56.040 they're going to be held responsible whether or not they are in complete control of the, and
00:44:05.080 they can't be in complete control of the decision making that goes to making up their minds.
00:44:12.460 Well, now I think we're getting into some very interesting territory where we might actually
00:44:16.560 disagree, because I think perhaps your notion of moral responsibility is something that I don't
00:44:23.160 agree with, and I think I can do a kind of compatibilist...
00:44:25.960 If you'd like to continue listening to this conversation, you'll need to subscribe at samharris.org.
00:44:31.280 Once you do, you'll get access to all full-length episodes of the Making Sense podcast, along with
00:44:35.600 other subscriber-only content, including bonus episodes and AMAs and the conversations I've been
00:44:41.580 having on the Waking Up app.
00:44:43.480 The Making Sense podcast is ad-free and relies entirely on listener support, and you can subscribe
00:44:48.520 now at samharris.org.
00:44:53.160 The Making Sense podcast is a production of Waking Up.