#39 — Free Will Revisited
Episode Stats
Words per Minute
154.8964
Summary
Sam Harris revisits his long-running dispute with the philosopher Dan Dennett over free will. Two years earlier, Harris published his short book Free Will; Dennett reviewed it, calling it "a museum of mistakes," and Harris replied on his blog with a letter titled "The Marionette's Lament." The written exchange turned barbed, and neither party, nor their shared readership, found it satisfying. Harris opens the episode by reading the first paragraphs of his response, including Dennett's quotation, to show how vexed things became, and reflects on how written volleys can amplify conflict that a face-to-face conversation might defuse. He then sits down with Dennett in a bar at the TED Summit in Banff. Dennett defends his compatibilism: free will understood in terms of degrees of freedom, control, and the competencies of a responsible agent, fully compatible with determinism and sufficient to ground moral responsibility, contracts, and law. Harris counters that this changes the subject, because what people feel they have is libertarian free will, the sense that they could have done otherwise were the universe rewound, and that a purified concept no longer addresses that first-person intuition, likening the move to answering questions about Atlantis by pointing to Sicily.
Transcript
00:00:10.880
Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680
feed and will only be hearing the first part of this conversation.
00:00:18.420
In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:24.060
There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:30.260
We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:35.880
So if you enjoy what we're doing here, please consider becoming one.
00:00:46.580
Well, I just got back from Banff, where I attended the TED Summit, and I brought a portable recording
00:00:53.120
device to the conference on the odd chance that I might find someone worth talking to who
00:00:59.560
Needless to say, there were many people worth talking to, but not much time to sit down and
00:01:05.480
But I did record one conversation with the philosopher Dan Dennett, who probably needs no introduction
00:01:11.600
As many of you know, Dan and I have been brothers in arms for many years, along with Richard Dawkins
00:01:16.500
and Christopher Hitchens, as the so-called New Atheists, or the Four Horsemen after a video
00:01:21.920
by that name that we recorded in Hitch's apartment some years back.
00:01:25.540
Dan and I once debated together on the same team, along with Hitch, at the Ciudad de las
00:01:31.120
ideas conference in Mexico, where we were pitted against Dinesh D'Souza and Rabbi Shmuley
00:01:38.040
Boteach and Robert Wright, I believe, and Nassim Taleb got in there somehow.
00:01:44.340
I hope it doesn't seem too self-serving or contemptuous of our opponents to say that
00:01:50.040
we came out none the worse for wear on that occasion, and needless to say, that video is
00:01:54.880
online for all to see until the end of the world.
00:01:58.740
But as many of you know, Dan and I had a very barbed exchange on the topic of free will
00:02:03.480
some years later, and that was a little over two years ago, and we never resolved it.
00:02:08.180
I came out with my short book on free will, and Dan reviewed it, and then I responded to
00:02:13.120
his review, and the matter was left there in a way that no one found satisfying, least
00:02:20.780
There really was an outpouring of dismay over the tone that we took with each other, and
00:02:28.200
I want to begin by reading the first few paragraphs of my response to Dan's review, which includes
00:02:34.940
a quotation from him, so you can hear how vexed and vexing things got.
00:02:39.620
And if you're interested, you can read the whole exchange on my blog.
00:02:43.080
In fact, when I post this podcast on my website, I'll provide the relevant links.
00:02:47.660
So this is near the beginning of my response, written as a letter to Dan.
00:02:51.480
I want to begin by reminding our readers, and myself, that exchanges like this aren't necessarily
00:02:57.740
Perhaps you need no encouragement on that front, but I'm afraid I do.
00:03:01.360
In recent years, I've spent so much time debating scientists, philosophers, and other scholars,
00:03:05.500
that I've begun to doubt whether any smart person retains the ability to change his mind.
00:03:10.420
This is one of the great scandals of intellectual life.
00:03:13.480
The virtues of rational discourse are everywhere espoused, and yet witnessing someone relinquish
00:03:18.120
a cherished opinion in real time is about as common as seeing a supernova explode overhead.
00:03:23.680
The perpetual stalemate one encounters in public debates is annoying because it is so clearly
00:03:28.180
the product of motivated reasoning, self-deception, and other failures of rationality.
00:03:33.080
And yet we've grown to expect it on every topic, no matter how intelligent and well-intentioned
00:03:38.940
I hope you and I don't give our readers further cause for cynicism on this front.
00:03:43.740
Unfortunately, your review of my book doesn't offer many reasons for optimism.
00:03:47.640
It is a strange document, avuncular in places, but more generally sneering.
00:03:52.300
I think it fair to say that one could watch an entire season of Downton Abbey on Ritalin and
00:03:56.780
not detect a finer note of condescension than you manage for 20 pages running.
00:04:00.720
And now I have a quotation from Dan's review here.
00:04:05.120
I'm not being disingenuous when I say this museum of mistakes is valuable.
00:04:09.320
I am grateful to Harris for saying so boldly and clearly what less outgoing scientists are
00:04:16.080
I've always suspected that many who hold this hard, determinist view are making these mistakes.
00:04:23.360
And now Harris has done us a great service by articulating the points explicitly.
00:04:27.320
And the chorus of approval he's received from scientists goes a long way to confirming
00:04:31.780
that they have been making these mistakes all along.
00:04:34.400
Wolfgang Pauli's famous dismissal of another physicist's work as, quote,
00:04:38.280
not even wrong, reminds us of the value of crystallizing an ambient cloud of hunches
00:04:46.000
Correcting widespread misunderstanding is usually the work of many hands.
00:04:50.000
And Harris has made a significant contribution.
00:04:55.020
I say, I hope you will recognize that your beloved Rapoport's rules have failed you here.
00:04:59.700
As an aside, I should say, for those of you who are not familiar with them, these rules
00:05:03.240
come from Anatol Rapoport, the mathematician, game theorist, and social scientist.
00:05:08.960
And Dan has been a champion of these rules of argumentation for years.
00:05:12.640
And they are, one, attempt to re-express your target's position so clearly, vividly, and
00:05:17.700
fairly, that your target says, thanks, I wish I thought of putting it that way.
00:05:22.320
Two, list any points of agreement, especially if they are not matters of general or widespread
00:05:28.100
Three, mention anything you have learned from your target.
00:05:31.820
Four, only then are you permitted to say so much as a word of rebuttal or criticism.
00:05:35.820
So those are the rules, and Dan has often said that he aspires to follow them when criticizing
00:05:46.960
I hope you will recognize that your beloved Rapoport's rules have failed you here.
00:05:50.920
If you have decided, according to the rule, to first mention something positive about the
00:05:55.040
target of your criticism, it will not do to say that you admire him for the enormity of
00:05:59.520
his errors and the recklessness with which he clings to them despite the sterling example
00:06:06.000
Yes, you may assert, quote, I am not being disingenuous when I say this museum of mistakes
00:06:15.840
If that isn't clear, permit me to spell it out just this once.
00:06:18.920
You are asking the word valuable to pass as a token of praise, however faint.
00:06:23.520
But according to you, my book is, quote, valuable for reasons that I should find embarrassing.
00:06:29.400
If I valued it as you do, I should rue the day I wrote it.
00:06:32.940
As you would, had you brought such value into the world.
00:06:36.300
And it would be disingenuous of me not to notice how your prickliness and preening appears.
00:06:43.680
Behind and between almost every word of your essay, like some toxic background radiation,
00:06:49.140
one detects an explosion of professorial vanity, end quote.
00:06:56.980
And I must say that this is really a problem with writing rather than having a face-to-face
00:07:03.880
If any of you have ever had the brilliant idea of writing a long letter or email to a friend
00:07:08.580
to sort out some relationship crisis rather than just have a conversation, you've probably
00:07:13.540
discovered how haywire things can go through an exchange of texts.
00:07:17.640
And the same can be true for intellectual debates among philosophers and scientists.
00:07:21.880
And it's especially likely to happen if either or both of the people involved are writers
00:07:29.580
I remember writing that quip about Downton Abbey, and it made me laugh at the time.
00:07:35.100
I knew it would make many readers laugh, and so I kept it in.
00:07:38.920
But lines like that just amplify the damage done.
00:07:42.600
So as I told Dan at the end of our podcast, I very much regret the tone I took in this
00:07:48.200
exchange, and I'm very happy we got a chance to have a face-to-face conversation and sort
00:07:53.480
I don't think we resolved all the philosophical issues, but we spoke for nearly two hours,
00:07:58.600
but there were several important topics that never came up.
00:08:01.680
As you'll hear, we were speaking in a bar using a single microphone, and this was at the
00:08:08.140
So this isn't us at our most polished or prepared, but I thought it was a very good conversation,
00:08:14.260
and I think those of you who are interested in the problem of free will and its connection
00:08:19.880
I still think there's some sense in which Dan and I are talking past one another.
00:08:23.480
The nature of our remaining disagreement never became perfectly clear to me.
00:08:31.260
And now I give you Dan Dennett, in a bar, overlooking the Canadian Rockies.
00:08:38.140
So I'm here with Dan Dennett at the TED Summit in Banff, and we have stolen away from the
00:08:49.100
main session, and we are in a bar and about to have a conversation about the misadventure
00:08:55.300
we had in discussing free will online in a series of articles and blog posts.
00:09:02.040
I mean, you and I are part of a community, and a pretty visible part of a community that
00:09:07.060
prides itself on being willing to change its opinions and views more or less in real time
00:09:13.060
under pressure from better arguments and better data.
00:09:16.460
And I think I said in my article in response to your review of my book, Free Will, that this is a
00:09:26.300
I mean, to see someone relinquish his cherished opinion more or less on the spot under pressure
00:09:31.780
from an interlocutor, that's about as rare as seeing a supernova overhead.
00:09:36.940
And it really shouldn't be, because there's nothing that tokens intellectual honesty more
00:09:43.420
than a willingness to step away from one's views once they are shown to be in error.
00:09:49.540
And I'm not saying we're necessarily going to get there in this conversation about free
00:09:54.380
will, but there was something that went awry in our written exchanges, tonally, and neither
00:10:04.660
And so, again, we'll talk about free will as well, but I think this conversation is proceeding
00:10:10.120
along two levels, where there's a thing we're talking about philosophically, which is free
00:10:14.280
will, but then there's just the way in which I want us to both be sensitive to getting hijacked
00:10:19.760
into unproductive lines that make it needlessly hard to talk about what is just a purely intellectual
00:10:28.980
And one of great interest, surprisingly great interest to our audiences.
00:10:34.800
There's no topic that I've touched that has surprised me more in the degree to which people
00:10:43.840
And I know you and I both think it's a very consequential topic.
00:10:50.600
This one really does meet ethics and public policy in a way that is important.
00:10:55.140
So one thing you all should know in listening to this is that we have one microphone.
00:11:00.160
Perhaps this is a good thing because we really can't interrupt each other and we're just going
00:11:10.520
If we can't agree on some things here, we shouldn't be in this business.
00:11:14.760
I want to go back one step further in how this got started.
00:11:21.040
You sent me the manuscript of your book, Free Will, and asked me for my advice.
00:11:28.880
I just told you, no, I'm sorry, I don't have time.
00:11:32.560
And then when the book came out, I read it and said, oh, I wish you'd, I forgot that we'd
00:11:37.380
I said, I wish you'd showed it to me because I think you made some big mistakes here.
00:11:40.580
And I would love to have tried to talk you out of them too late.
00:11:45.740
And then we, time passed and then we had the, you said you wanted me still to say what I
00:11:54.180
And that's when I wrote my piece for your blog and for Naturalism.
00:12:01.380
And I guess I regret a few bits of tone there, but I think everything I said there is defensible.
00:12:10.040
And in particular, I did use Rapoport's rules, contrary to what you say.
00:12:15.420
If you look at the first paragraph of my piece, I applaud the book for doing a wonderful, clear
00:12:21.080
job of setting out a position which I largely agreed with.
00:12:24.640
And then I said, you went off the rails a little later.
00:12:26.520
So I did try to articulate your view, and I haven't heard you complain about that articulation
00:12:38.400
And I even said what I've learned from that book.
00:12:45.880
But we can just set that aside if you want and get down to what remains of the issue.
00:12:51.640
One thing in particular, which I know came off awfully preachy, but I really think it was
00:13:01.400
most unwise of you to declare that my position sounded like religion, sounded like theology.
00:13:13.660
theology, you have to know that you're insulting me.
00:13:18.320
And that was a pretty deliberate insult, and that was in the book.
00:13:26.380
If you're going to call me a theologian, then I'm going to call you on it and say, as I said,
00:13:31.920
I tell my students, when a view by apparently senior, you know, an author worth reading looks
00:13:42.900
And of course, the main point of my essay was, yes, you have misconstrued my brand of
00:13:52.720
You've got a sort of a caricatured version of it.
00:13:56.320
And in fact, as I say late in the piece, you are a compatibilist in all but name.
00:14:05.060
You agree with me that determinism and moral responsibility are compatible.
00:14:13.620
You agree that a system of law, including punishment and justified punishment, is compatible
00:14:24.460
That's, we're just that close to compatibilism.
00:14:28.180
I've actually toyed with the idea, in part, provoked by you and some others, Jerry Coyne
00:14:35.700
and others, to say, all right, I don't want to fight over who gets to define the term free
00:14:41.820
As I see it, there are two themes out there, completely in tension, about what free will is.
00:14:49.620
One is that it's incompatible with determinism.
00:14:52.140
And the other is that it's the basis of moral responsibility.
00:14:54.760
I think it's the second one that's the important one.
00:15:04.200
Indeterminist free will, libertarian free will, is a philosopher's fantasy.
00:15:16.260
We have no love for libertarian indeterminism, for agent causation, for all of that metaphysical
00:15:29.420
And we both agree that the truths of neuroscience and the truths of physics, physics doesn't have
00:15:36.860
much to do with it, actually, are compatible with most of our understanding, our everyday
00:15:45.680
understanding of responsibility, taking responsibility, being morally responsible enough to be held
00:15:56.000
to our word, I mean, you and I both agree that you are competent to sign a contract.
00:16:04.340
Well, you know, if you go and sign a deed or a mortgage, very often, if it's notarized,
00:16:14.100
the notary public will say, are you signing this of your own free will?
00:16:21.540
That's the sense of free will that I think is important.
00:16:25.700
There are a lot of people that don't have that free will, and it has nothing to do with indeterminism.
00:16:30.420
It has to do with their being disabled in some way.
00:16:34.360
They don't have a well-running nervous system, which you need if you're going to be a responsible
00:16:47.720
I think there's some interesting points of disagreement on the moral responsibility
00:16:52.920
issue, which we should talk about, and I think that could be very interesting for listeners
00:17:01.340
I am, needless to say, very uncomfortable with the idea that I have misrepresented your
00:17:07.440
view, and if I did that in my book, I certainly want to correct that here.
00:17:11.520
So, we should clearly state what your view is at a certain point here.
00:17:15.920
But I want to step back for a second before we dive into the details of the philosophy of
00:17:21.040
What I was aware of doing in my book, Free Will, and again, I would recommend that our
00:17:26.560
listeners just go back and, you don't actually have to read my book, but you can read Dan's
00:17:32.660
review of it on my blog, and you can read my response, which is entitled The Marionette's
00:17:37.420
Lament, I believe, then you can see the bad blood that was generated there.
00:17:42.320
And I don't know, Dan, if you're aware of this, you don't squander as much of your time
00:17:46.580
on social media or in your inbox, but I heard from so many of our mutual readers that they
00:17:53.240
were just despairing of that contretemps between us.
00:17:56.560
It was like mom and dad fighting, and it was totally unpleasant.
00:18:00.060
The thing that I really regret, which you regret that you didn't get a chance to read my book
00:18:03.780
before I published it, which that would have been a nice thing for both of us, but what
00:18:07.720
I regret is when you told me that you were planning to write a review of it, I kept urging
00:18:14.660
you and ultimately badgering you to not do that and have a discussion with me because
00:18:20.760
I knew what was going to happen, at least from my point of view, is that you would hit
00:18:24.940
me with this 10,000 word volley, which at a dozen points or more, I would feel you had
00:18:32.320
misconstrued me or gone off the rails, and there would be no chance to respond to those
00:18:36.660
and to respond in a further 10,000 word volley in a piecemeal way would just lead to this
00:18:42.400
exchange that was very boring to read and yielded a much bigger sense of disagreement than
00:18:51.200
So if I have to spend 90% of my energy taking your words out of my mouth, then this thing
00:18:58.760
So one thing I've been struggling for in my professional life is a way of having conversations
00:19:03.260
like this, even ones where there's much less goodwill than you and I have for one another,
00:19:08.040
because you and I are friends and we're on the same side of most of these debates, and
00:19:12.320
so we should be able to have this kind of conversation in a way that's productive.
00:19:15.140
But I've been engaging people who think I'm a racist bigot as a starting point, and I
00:19:21.840
want to find ways of having conversations in real time where you can be as nimble as
00:19:27.640
possible in defusing unnecessary conflict or misunderstanding, and writing is an especially
00:19:35.540
Certainly writing the 10,000 word New York Review of Books piece that then someone has to react
00:19:43.120
So in any case, I wish we'd had that conversation, but we're having it now, and this is instructive
00:19:49.480
Feel free to react to that, but I guess I want you to also express what compatibilism means
00:19:56.960
to you, and if you recall the way in which I got that wrong, feel free to say that, but
00:20:01.000
I'll then react to your version of compatibilism.
00:20:04.360
Well, my view of compatibilism is pretty much what I just said, and you were nodding, and
00:20:12.120
you were not considering that a serious view about free will, although you were actually
00:20:18.020
almost all of it you were agreeing with, and you also, I think, made this serious strategic
00:20:27.400
or tactical error of saying this is like theology.
00:20:34.260
Well, as soon as you said that, I thought, well, you just don't understand what compatibilism
00:20:40.840
It's an attempt to look at what matters, to look at the terms and their meanings, and
00:20:50.800
to recognize that sometimes ancient ideology gets in the way of clear thinking so that you
00:21:02.140
If you trusted tradition and the everyday meanings of words, we would have to say all sorts of
00:21:10.180
In fact, one of the abiding themes in my work is there are these tactical or diplomatic choice
00:21:26.420
Or you can say, no, consciousness doesn't exist.
00:21:29.280
Well, if you've got one view of consciousness, if it's this mysterious, magical, ultimately insoluble
00:21:37.080
problem, then I agree, consciousness in that sense doesn't exist.
00:21:41.800
But there's another sense, much more presentable, I think, which of course consciousness exists.
00:21:51.660
That was a central theme in Elbow Room with regard to free will and in Consciousness Explained
00:21:59.100
My view, my tactic, and notice, those two views, they look as if they're doctrinally
00:22:10.420
They're two different ways of dealing with the same issue.
00:22:15.160
Well, if free will means what Dennett says it means, yes.
00:22:20.820
If it means what some people think, then the answer is no.
00:22:24.820
Yeah, I understand that, but I would put to you the question, there is a difference between
00:22:35.820
So this is my gripe about compatibilism, and this is something we'll get into.
00:22:40.140
But I assume you will admit that there is a difference between purifying a real phenomenon
00:22:46.300
of its folk psychological baggage, which I think this is what you think compatibilism is
00:22:51.160
doing, and actually failing to interact with some core features that are just ineliminable
00:23:01.540
Let me surprise you by saying, I don't think there's a sharp line between those two, and
00:23:05.540
I think that's quite obvious, that whether I'm changing the subject, I'm so used to that
00:23:20.100
Saying you're just changing the subject is a way of declaring a whole manifold, a whole
00:23:29.540
variety spectrum of clarificatory views, which you're not accepting because you're clinging
00:23:39.320
You want to claim that free will, the core of free will, is its denial of determinism.
00:23:47.400
And I've made a career saying that's not the core.
00:23:50.840
In fact, let me try a new line on you, because I've been thinking, why doesn't he see this
00:23:58.400
And I think that the big source, the likely big source of confusion about this is that
00:24:08.440
when people think about freedom in the context of free will, they're ignoring a very good
00:24:15.140
and legitimate notion of freedom, which is basically the engineering notion of freedom when you talk
00:24:21.400
about degrees of freedom, my arms, you know, my wrist and my shoulder and my elbow, those
00:24:27.060
joints, there's three degrees of freedom right there.
00:24:31.220
And in control theory, it's all about how you control the degrees of freedom.
00:24:36.240
And if we look around the world, we can see that some things have basically no degrees of
00:24:43.100
And some things, like you and me, have uncountably many degrees of freedom because of the versatility
00:24:51.900
We can be moved by reasons on any topic at all.
00:24:57.060
This gives us a complexity from the point of view of control theory, which is completely
00:25:04.380
And that kind of freedom is actually, I claim, at the heart of our understanding of free will
00:25:11.520
because it's that complexity, which is not just complexity, but it's the competence to
00:25:20.820
What you want, if you've got free will, is the capacity, and it'll never be perfect, to
00:25:28.340
respond to the circumstances with all the degrees of freedom you need to do what you think
00:25:37.840
You may not always do the right thing, but let's take a dead simple case.
00:25:44.480
Imagine writing a chess program which, stupidly, it was written wrong so that the king could
00:25:52.580
only move forward or back or left or right like a rook and it could not move diagonally.
00:25:57.560
And this was somehow hidden in it so that it just never even considered moves, diagonal
00:26:06.560
It's missing a very important degree of freedom, which it should have and be able to control
00:26:14.000
What you want, I mean, let me ask you a question about what would be ideal from the point of view
00:26:30.200
It's not mainly true beliefs, a well-ordered set of desires, the cognitive adroitness to
00:26:40.540
change one's attention, to change one's mind, to be moved by reasons, the capacity to listen
00:26:47.000
to reasons, the capacity for some self-control.
00:26:50.820
These things all come in degrees, but our model of a responsible adult, someone you would
00:26:58.160
trust, someone you would make a promise to, or you would accept a promise from, is somebody
00:27:03.940
with all those degrees of freedom and control of them.
00:27:07.060
Now, what removes freedom from somebody is if either the degrees of freedom don't exist,
00:27:15.860
they're blocked mechanically, or some other agent has usurped them and has taken over control,
00:27:25.660
And so, I think that our model of a free agent says nothing at all about indeterminism.
00:27:36.640
We can distinguish free agents from unfree agents in a deterministic world or in an indeterministic world.
00:27:46.180
Determinism and indeterminism make no difference to that categorization, and it's that categorization
00:27:57.120
I just need to put a few more pieces in play here.
00:27:59.860
I think there is an important difference, nevertheless.
00:28:02.980
I agree that there is no bright line between changing the subject and actually purifying
00:28:07.040
a concept of illusions and actually explaining something scientifically about the world.
00:28:12.700
But in this case, the durability of free will as a problem for philosophers and now scientists
00:28:19.620
is based on people's first-person experience of something they think they have.
00:28:25.560
People feel like they are the authors of their thoughts and intentions and actions.
00:28:30.540
And so, there's a first-person description of this problem, and there's a third-person description
00:28:36.360
And I think if we bounce between the two without knowing that we're bouncing between the two,
00:28:44.580
So, people feel that they have libertarian free will.
00:28:48.380
And when I get emails from people who are psychologically destabilized by my argument that free will doesn't
00:28:54.420
exist, these are people who feel like something integral to their psychological life and well-being
00:29:03.260
And I can say this from both sides because I know what it's like to feel that I could have done otherwise.
00:29:09.600
So, let me just, for listeners who aren't totally up to speed here,
00:29:14.440
libertarian free will is this, is anchored to this notion of I could have done otherwise.
00:29:19.020
So, if we rewound the universe to precisely as it was a few moments ago, I could complete
00:29:28.520
You know, whether you throw indeterminism or determinism or some combination thereof,
00:29:34.100
there is no scientific rationale for that claim.
00:29:37.160
If you rewound the universe to precisely its prior state with all relevant variables intact,
00:29:43.520
whether deterministic or indeterministic, these words would come out of my mouth in exactly
00:29:51.440
I would speak this sentence a trillion times in a row with its errors, with its glitches.
00:29:56.920
So, people feel that if they rewound the movie of their lives, they could do differently in
00:30:03.860
And that feeling is the thing that is what people find so interesting about this notion that
00:30:09.400
free will doesn't exist because it is so counterintuitive psychologically.
00:30:12.900
Now, I can tell you that I no longer feel that subjectively.
00:30:17.100
My experience of myself, I'm aware of the fact that it is a subjective mystery to me how
00:30:26.880
It's like, I'm hearing these words as you're hearing these words, right?
00:30:30.180
I haven't thought this thought before I thought it, right?
00:30:33.140
And I am subjectively aware of the fact that this is all coming out of the darkness of my
00:30:44.480
There's this sphere of my mind that is illuminated by consciousness, for lack of a better word.
00:30:55.220
But then there's all the stuff that is simply just arriving, appearing in consciousness, the
00:31:00.360
contents of consciousness, which I can't notice until I notice them.
00:31:03.920
And I can't think the thought before I think it.
00:31:06.100
And my direct experience is compatible with a purely deterministic world, right?
00:31:13.700
Now, most people's isn't, or they don't think it is.
00:31:16.100
And so that's where, when you change the subject, so the analogy I used in my article that responded
00:31:21.540
to your review, which I still think captures it for me, I'll just pitch it to you once more,
00:31:29.440
So people are infatuated with this idea of Atlantis.
00:31:35.500
There's nothing in the world that answers to the name of Atlantis.
00:31:38.020
There was no underwater kingdom with advanced technology and all the rest.
00:31:42.860
And whoever it was, Plato, was confused on this topic or just spinning a yarn.
00:31:48.760
And you, compatibilism, your variant and perhaps every variant, takes another approach.
00:31:56.260
It says, no, no, actually, there is something that conserves much of what people are concerned
00:32:03.200
And in fact, it may be the historical and geographical antecedent to the first stirrings
00:32:09.860
And there's this island of Sicily, the biggest island in the Mediterranean, which answers
00:32:13.920
to much of what people care about with Atlantis.
00:32:16.780
And I say, well, but actually what people really care about is the underwater kingdom with
00:32:26.320
99% of our truth claims about Sicily are going to converge.
00:32:30.500
But I'm saying the whole reason why we're talking about Atlantis in the first place is there's
00:32:35.220
this other piece that people are attached to, which by you purifying the subject, you're
00:32:41.300
actually just no longer interacting with that subjective piece.
00:32:53.300
I don't think it's entirely fair, but let's leave it at that.
00:32:55.780
But your position is that you can see very clearly that what people really care about
00:33:05.820
is that free will should be something sort of magical.
00:33:12.740
For a lot of people, if you don't think free will is magical, then you don't believe in free will.
00:33:18.300
And that's what I confront and say, well, I got something which isn't magical, which is perfectly
00:33:28.280
consistent with naturalism and gives us moral responsibility, justification for the way we
00:33:37.240
treat each other, the distinctions that matter to us, like who do we hold responsible and
00:33:42.300
who don't, who do we excuse because they don't have free will.
00:33:44.700
It gives us all of the landmarks of our daily lives and explains why these are what matters.
00:33:54.980
And indeed, though, if the mystery, if the magic is that important to people, I agree with
00:34:04.220
And if we're going to tie free will to that, then I would say, no, free will doesn't exist.
00:34:14.140
You said that the reason people believe in this is because they feel it or they think
00:34:22.560
They sort of intuit they could have done something different in exactly the same situation.
00:34:34.060
But I don't think that it is a forlorn task to show them that that's not really what they
00:34:43.760
should think about this, about the very feelings they have.
00:34:48.280
Their sense that they are, as Kant says, acting under the idea of freedom.
00:34:58.440
This is a fairly deep point that an agent has to consider some things fixed and some
00:35:09.720
Otherwise, the whole setting of decision making depends on there being that kind of freedom.
00:35:16.520
And so it's no wonder, in a way, that people who are impressed with that decide that what
00:35:35.720
What they need and can have is the sense that in many very similar circumstances, circumstances
00:35:46.760
which differed maybe only in a few atoms, they would have made another decision.
00:35:53.040
And as soon as you allow any tiny change when you rewind the tape, the whole business about
00:36:04.560
And that's why in, actually, several places, I've gone to considerable length, probably
00:36:12.560
too long, to trot out examples where we have a decision maker, which is in a demonstrably
00:36:20.880
deterministic world, is playing chess, and it loses the game, and its designer says, well,
00:36:33.060
What the designer means is it was just the luck of the draw.
00:36:38.360
A chess program, like any complicated program, is going to consult a random number generator
00:36:44.540
or a pseudo-random number generator at various points.
00:36:53.440
However, it chose wrong because when it got a number from the pseudo-random number generator,
00:37:02.200
Flip a single bit, and it would have made the other choice.
00:37:10.280
An agent could be, as it were, impeccably designed.
00:37:17.500
So, that's what justifies saying, yeah, it could have done otherwise.
00:37:24.340
Half the time or more, it would have done otherwise.
00:37:33.340
I think you're not acknowledging, however, how seditious those facts are,
00:37:40.820
the degree to which they undermine people's felt sense of their own personhood.
00:37:46.860
So, if you tell me that but for a single charge at a synapse,
00:37:54.520
I would have decided I didn't want to have this conversation with you,
00:38:03.720
Acknowledging the underlying neurophysiology of all of those choice points
00:38:08.320
and how tiny a difference can be that makes the crucial difference,
00:38:13.960
that suddenly brings back the marionette strings.
00:38:21.280
But that is not what people feel themselves to be.
00:38:24.660
This feeling that if you had had just one mouthful more of lunch,
00:38:33.720
you would make a radically different decision six hours from now
00:38:38.600
That is a life that no one, virtually no one, feels they're living.
00:38:46.520
I think you're largely right and exactly wrong in what you just said.
00:38:54.660
I think you're right that this is a subversive idea to many people.
00:39:01.120
They're so used to the idea that unless they're completely, absolutely undetermined,
00:39:09.620
Now, the trouble with that is, if you look closely at that idea,
00:39:27.580
It's interesting that you say that if I thought that some tiny atomic change
00:39:36.660
would have altered the course of some big important life decision,
00:39:59.040
When you've thought about it, thought about it, thought about it,
00:40:07.360
and it may be something that's morally very important,
00:40:28.140
and the reasons are preponderantly on one side,
00:40:35.240
you'd have to make a very large change in the world
00:40:45.860
it's a sort of signature of a lot of their views,