#239 — Yet Another Call from Ricky Gervais
Episode Stats
Words per Minute
167.9
Summary
In this episode, Ricky Gervais calls in once again. We discuss consciousness, the nature of the mind, the evolution of the brain, free will as an illusion and what that means for punishment and retribution, whether a chimp has the rudimentary tools to invent gods or spirituality, and, of course, whether the brain is just a computer and whether a computer could ever be conscious. This is one of the bonus episodes released on the last Sunday of every month, so it's a good one to enjoy in a long bath! It gets a bit spooky and theological in places, and I hope it makes you think about what it means to be a thinker, a scientist, a skeptic and a seeker of knowledge. If you like what you hear, please leave a rating and a review on Apple Podcasts! I'll be looking out for your comments and questions, and I'll try to answer them in a future episode. Thank you so much for your support and good vibes - it really means a lot to me and I really appreciate it. Cheers, Jono.
Transcript
00:00:00.000
hey hi how's it going yeah good a bit uh sweating i just had a hot bath i was already in it was too
00:00:21.560
late i couldn't i couldn't back out but yeah good from judging from twitter you're a bath man
00:00:27.780
more than a shower man i love it i have i have two either two baths a day in the winter or a bath
00:00:35.360
and a shower or two showers in the summer sometimes i do it because i'm bored there's something i think
00:00:40.420
it's from my upbringing where you know we could only have one bath a week when i was little
00:00:45.960
wow sometimes secondhand water oh i've had it hard honestly it was like it's like a dickens novel
00:00:52.140
that is that is hard you you could you joke but that is deprivation one bath a week i mean that's
00:00:58.040
that's uh 17th century stuff well i remember in the winter in our house we had this is absolutely
00:01:04.320
true this sounds like a joke this sounds like a monty python sketch but we had ice on the inside
00:01:09.700
of the windows when i when i woke up yeah i used to dream i'd got up and got dressed and then i'd wake
00:01:19.000
up and go oh fuck i haven't got dressed oh i know anyway i've been um uh have you got a minute i've
00:01:27.000
got a question for you another question yeah yeah i'm just i'm just not in the bath no way um i've
00:01:33.880
been thinking a lot about the brain or rather my brain has that's sort of a point now this is a quite
00:01:42.840
long question it would stop me at any point if i've made some sort of fallacious leap but right
00:01:50.420
the brain i get i i i totally understand evolution by natural selection it's a no-brainer and that you
00:01:58.800
know the brain is just an organ like anything else okay it came from three billion years from a blob of
00:02:05.360
reproductive protein to this most complex computer right but it is just physical it's you know it
00:02:13.100
goes by all the laws the contingent laws of the universe chemistry physics you know energy electricity
00:02:20.800
all that right but obviously we've talked about this as the epiphenomenon of consciousness we feel
00:02:27.620
like we've got a self we feel like we've got free will even though that's an illusion and and this leads to
00:02:34.400
imagination invention of philosophy art gods so two-part question here one a chimp's going through
00:02:46.080
that do you think they've got all the rudimentary tools to invent their gods or have spirituality or
00:02:52.860
you know all you need is imagination and you know a decent brain or even a sense of self and two
00:03:00.700
if that is true if the brain is purely physical it can be reproduced so in the future will a computer
00:03:09.520
will we have paranoid computers will we have computers that are that are nice and nasty and
00:03:15.740
don't want to die and want to murder someone shoot yeah so that's a great question and that
00:03:22.380
is uh there there's so many questions contained yeah in it here's what's not controversial so there
00:03:28.960
are many places where one can try to find a foothold or a handhold to debate some materialist
00:03:39.420
assumptions right and then you know try to open the door to something that many people in science and
00:03:46.680
philosophy at the moment would consider spooky or theological or yeah just unwarranted so the central
00:03:54.020
drift of your question is fairly uncontroversial in science which is to say it's safe to assume
00:04:01.000
that everything we know and notice about the mind from the first person side as a matter of experience
00:04:09.560
you know what it's like to be us all of that is a product of what our brains as physical
00:04:15.480
information processing systems are doing right so our brains are essentially computers made of meat
00:04:23.020
although they're they're not computers that are all that similar to the computers we
00:04:27.420
currently call computers i mean they're different in important ways many people will point out that
00:04:33.020
science has been repeatedly confounded by bad analogies that we you know we used to make analogies to
00:04:40.180
water pumps and steam engines and now we no longer do that because now we have a much better analogy a
00:04:47.320
computer but many people would will be tempted to argue that it's still not a perfect analogy or not
00:04:53.720
even a good one but no but but but the the important thing is that intelligence is basically the ability to
00:05:01.160
problem solve negotiate the world and obviously those things if they work they're favored and they're passed
00:05:08.480
on and it presumably gets better and better or it doesn't work or a dead end or or whatever so i don't
00:05:16.980
yeah i i i i get that and you know it it starts it starts worrying me i come i came from a science
00:05:24.480
background and i went to do philosophy so all the things like determinism and materialism all those things
00:05:31.800
i sucked them up anything that felt a little bit new agey nonsense mumbo jumbo magic i i sort of
00:05:39.520
rejected but i kept no mind i said well prove it to me you know and so i i i am this sort of this
00:05:46.920
hardwired contingent i need proof i need physical proof and so even consciousness freaks me out because
00:05:55.960
yeah well it should yeah it should because it's really we we don't understand it physically yet
00:06:03.740
and it there are impressive impediments to doing that i think i mean the so-called hard problem of
00:06:08.960
consciousness is genuinely hard because it's not clear why anything we do as minds you know all of
00:06:17.640
our behavior all of our mental behavior everything including our intelligence needs to be associated with
00:06:25.280
experience right we could build robots and we undoubtedly will build robots eventually
00:06:31.460
that pass the turing test that are indistinguishable from humans and and in fact
00:06:36.460
only or become distinguishable from humans by their superhuman capacities though they will be as
00:06:44.480
intelligent as we are in every respect they'll be conversant with our emotions and display emotions
00:06:50.900
of their own because we will program that them that way very likely at least some of them that way yeah
00:06:56.200
and i think it's true to say they're they're already as good they might even be better at facial
00:07:01.940
recognition than humans are now and that will eventually include detecting and responding to
00:07:09.840
our emotions and i mean just so much of what makes us effective social beings you know millions of years of
00:07:17.240
evolution as social primates and you know 300 000 years or so of finishing school as homo sapiens
00:07:23.600
we're very good at this and there's no question we're going to build machines that are better than
00:07:28.660
we are and then then literally everything we do cognitively will be like chess where it will be true
00:07:35.900
to say that the best mind at that is now a computer mind not a human one yeah you know we will never be
00:07:44.600
the best at chess ever again right yeah and that's going to be true of looking at a person's face and
00:07:50.900
deciding how they feel will there be a robot right that's bigger and taller and stronger than me right
00:07:58.620
made of steel that can see in the dark and is a better stand-up that's that's that's the robots are
00:08:06.800
coming for your job i'll always have that i'll go out i'll fall over and the crowd will go wild they
00:08:14.240
go look at him they go look at that fat bloke he's dying and the robot will go i can't i can't compete
00:08:20.460
with that i'd never thought of that ricky and the steam engine yeah yeah but uh no i think it's true
00:08:28.780
if ultimately that something like that has to be true if intelligence and even comedic intelligence
00:08:37.880
and comedic timing and everything that gets built into that empathy i learned it i learned it there
00:08:45.680
was you know it was still my brain yeah exactly if that's just information processing there's just
00:08:51.000
there's no reason why a human has to be the best at that forever and in fact there's no way
00:08:57.920
one will be if we just keep making progress building intelligent machines so i think that
00:09:03.140
i even i but i don't i totally accept that i suppose my question is then what it comes down to is
00:09:10.120
why in this you know this illusion of free will is it the same as if it wasn't an illusion what's the
00:09:19.640
difference that's my question i totally accept it but so what we are what we are what does it matter
00:09:25.700
what does it matter that there isn't free will i mean the reason why it's important is that so much
00:09:32.460
of our psychological suffering personally and so much of our social suffering in terms of what we
00:09:40.940
the ethical and legal decisions we make is anchored to this illusion yeah the feeling that you are you
00:09:49.500
and really responsible for you it's not that it's never useful i mean it's useful in certain cases
00:09:55.800
but the fact that we put people in prison for the rest of their lives or even you know give them the
00:10:02.480
death penalty in certain states in my country and feel totally justified in doing it as a matter of
00:10:10.640
punishment not as a matter of social necessity that we have to just keep certain dangerous people off
00:10:15.020
the streets which obviously that's the difference and i think that's quite different yeah yeah it is
00:10:19.900
different and i and i'd say what i'd say with them i think to to and i know you're not saying this but
00:10:26.080
to say no one has free will so no one should be punished is a nonsense rather like if a machine breaks
00:10:32.380
down in a factory you don't go well he didn't mean to break down we keep it on you get rid of it
00:10:37.640
and get a new one it's not a punishment it's well we got to still protect the innocent and i get that and
00:10:44.160
and i think yeah definitely it's something else that there's there's there's loads of
00:10:48.560
punishment certainly makes sense still in in many cases but retribution doesn't or you know the
00:10:57.140
vengeance part of it doesn't yes morally it doesn't once you swallow this pill of free will
00:11:04.320
being an illusion what are the three reasons for punishment retribution rehabilitation and what's the third restitution
00:11:12.380
yeah there's have you read uh ted honderich's book on no punishment i think it's called i think it
00:11:20.460
might be called eye for an eye no i think it's just called punishment it's got a picture of an eye and a
00:11:23.960
tooth on it so it's my it was my professor oh yeah he told me about uh four years ago i was i was sold
00:11:31.760
on it as he struggled and um yeah he breaks down why that sort of punishment for retribution doesn't
00:11:39.320
work and you know we totally agree with and you know with the death penalty you can't go back and
00:11:46.600
say we were wrong you know we we know that we know the worries about that my point is even if everyone
00:11:53.220
understood free will is an illusion we're hard to work i don't think it should make any difference
00:12:00.200
because we're not saying oh he came from a tough background or it was a crime of passion
00:12:05.040
we're just saying we're all robots let's do what we like which we know isn't acceptable that's why i
00:12:10.960
mean that it doesn't make a difference all the other caveats would still be in place you know
00:12:15.540
a sympathetic you know judicial system and and uh act utilitarianism as opposed to rule utilitarianism
00:12:24.660
all those things would still be in place but what what i can never accept is that the people that
00:12:31.260
say if hard determinism is true no one is responsible for their actions on a societal level
00:12:38.280
that's the difference i'm making once you view people in this vein as akin to malfunctioning robots
00:12:47.560
right so so evil people if we built an evil robot it would reliably produce evil you know nature has
00:12:54.700
built evil robots for us as you know psychopaths and and other people who just reliably create a lot
00:13:01.840
of harm for everyone else the question is how should we feel about that and whether hatred is the right
00:13:10.080
emotional response now i mean it's a totally natural response certainly if you've been victimized by
00:13:15.240
such a person but i think we should treat it like any other force that isn't our fault you don't you
00:13:22.620
don't you don't go into morality of an angry bear exactly trying to attack you in the woods right you
00:13:29.180
don't go you might shoot the bear he came from a tough background i love animals but if a bear is
00:13:35.860
attacking me i don't i don't care about his home problems but he did come from a tough background he
00:13:41.480
came from the background of being a bear right like what else what was he going to do and i don't
00:13:46.100
care when it's whether whether should i should i rehabilitate this bear i get out if i if i can't get
00:13:54.080
out of there i try and stop him it's it's not a moral issue it's the fact that i don't deserve to die
00:14:00.420
by a bear yet that's that's what it comes down to
00:14:04.360
i love bears i love bears i've never hurt a bear i absolutely love them and good luck to them and
00:14:13.060
they've got to do what they've got to do but as i say if he's in my apartment i've got other word i
00:14:19.800
don't don't care yeah i don't know what that where that analogy goes what i'm saying is the psychopath
00:14:30.080
is part of nature like the bear i know it's not his fault he's a psychopath just like it's not
00:14:35.820
its fault that it's a hungry bear but that's no reason for me not to try and stop things we've got
00:14:43.660
to do something oh yeah but but you don't have to hate it and you wouldn't hate it in the same way
00:14:48.860
you'd hate a person and that this is the crucial piece for me that's a very good point ethically
00:14:53.860
it's like right it's like like and even even if it harmed you i mean this is i don't know if you
00:14:58.260
got to that part in my in my um yeah i know you heard some of the audio from from waking up where i
00:15:05.120
talk about free will but yeah just imagine the two cases you know one case you're attacked by a bear
00:15:11.960
and you know let's say you lose a hand right so you really are you've had a you've had a terrifying
00:15:17.320
encounter with near death but you're saved and the bear gets tranquilized and let's say it gets put in
00:15:23.280
the zoo right yeah that's one case the other case is you're you're attacked by an evil person
00:15:28.360
and suffer the same injury right so that yeah and so that but then the question is what is your
00:15:33.680
subsequent mental state no you're right for the rest of your life they're right no i mean you could
00:15:39.000
be hating the person and fantasizing over killing that person with your bare hands or hand yeah and uh
00:15:47.360
but with the bear you might actually laugh especially if he laughed in court yeah and and he said
00:15:52.540
he could he could just play upon your hominid emotions so that you would really hate him
00:15:58.160
you know and want to kill him and fantasize yeah yeah we're because we've got a sense of self
00:16:03.780
and morality and we feel what's right and wrong yeah we impose that on another human where we wouldn't
00:16:08.640
do it on the bear rather rather in the way if i walk if i walk into a tree and i scrape my nose i do
00:16:15.200
not hate that tree you hate yourself i hate myself and i'd i'd i'd try and i'd go why wouldn't the
00:16:22.060
council put a fence around it i would want someone to blame i want someone to blame with the weather
00:16:28.180
if it rains i go well who didn't who didn't tell me to bring out whose job was it
00:16:34.760
yeah yeah yeah that's true that's a very good point and we can't we it's hard to forgive
00:16:42.660
another human who hurts you for fun for supposed fun even though in our yeah in a in a naturalistic
00:16:51.540
framework they can't help it as i'm i'm doing i'm putting quote marks around help it but we mean
00:16:59.780
it literally as well don't we if we're determinists and honestly that does help me
00:17:05.220
now a fair amount psychologically i mean there's so many people out there on you know on social
00:17:11.780
media in particular who this is where i tend to see it i don't i don't see it in my life
00:17:15.720
who just maliciously attack me and attack people who are associated with me in any way
00:17:23.620
and it's why am i talking to you there good luck good luck on social media after this i don't know
00:17:29.780
anything about i didn't i thought you were super popular fuck's sake okay now anyone has to have
00:17:37.000
yeah i don't like sam i'm asking him i'm using him right if anything well that's all guys just if
00:17:43.440
you're listening yeah yeah that's a very very good point it's much easier to process when you
00:17:51.340
actually recognize that certain people are doing what they do because that's what they do they're
00:17:58.220
like bears yeah yeah exactly yeah and there's lots of other factors on social media getting getting
00:18:06.520
noticed wanting to be a part of someone else's cause uh web heckling they're not like that in real
00:18:11.960
life they ask you for an autograph all these things they're box that there's a you know if someone i get
00:18:19.000
it rarely if someone sends a nasty tweet i think i've told you this before that i thought why they
00:18:24.280
said that and i look back and they've sent 20 nice ones but i didn't notice them right and i think i
00:18:30.660
mean why would why would i put this line in afterlife as well why why would people rather want to be famous
00:18:36.920
for being an arsehole than not famous what what is the attraction of of being famous saying i was here
00:18:45.300
because cavemen just used to used to put their hand on the wall and blow woad over it and you know
00:18:50.980
that was i was here and now it's obviously got out of hand but there seems to be i think it's some
00:18:58.000
sort of cachet for eternal life i think that's a very human worry and quest what's the point well you
00:19:07.640
know what will happen after i die will people remember me will i will myself carry on will i come
00:19:12.580
back as a spirit is there a heaven um have i led a good life was it worth it will i come back as a
00:19:18.120
cow i think all those things as irrational as they all are are are very human and i and i don't know
00:19:25.080
why i i don't know i don't know again they could be upshots but yeah all right well we can work that
00:19:31.720
out after you've had your third bath of the day i'm gonna have a tea now so in conclusion yes
00:19:40.180
robots computers will soon be indistinguishable from humans final question is there a chimp somewhere
00:19:49.220
that sat down and looked up and thought where do we come from who did all this where are we going
00:19:57.320
has that happened yet has a chimp thought what the fuck is going on here i i would highly doubt that but
00:20:03.920
the interesting thing is that there are certain things we do that are really crucial to our being
00:20:09.780
smart like you know working memory which chimps are better at which is pretty and you can you can
00:20:15.600
see this display there we could find this video on youtube where given a memory task where there's a
00:20:22.500
keyboard like a foot you know a keyboard on a screen and many numbers and letters you know suddenly get
00:20:27.740
illuminated and then you have to recapitulate sure you have to press all the right keys
00:20:33.700
yeah chimps are so fast and so much better at it than humans that it really is it's kind of terrifying
00:20:42.420
have you seen that have you seen that um experiment that that shows it it's not just the arbitrary test
00:20:49.320
it's it's the the reward that has a a sense of it so they did a thing with a chimp with beads so if
00:20:57.360
it chose the the small pile of beads it got a jelly bean right it got it right every time choose the
00:21:05.320
smallest pile get a jelly bean when they gave it the choice to choose the smallest pile of jelly beans
00:21:11.200
it didn't it chose the big pile of jelly beans because it wanted all the jelly beans and it
00:21:16.160
couldn't it couldn't the experiment was out the window it just went fuck that that's the big pile
00:21:21.300
of jelly beans that's hilarious isn't that great that's fantastic that anyway is a genius now i don't
00:21:28.740
have a sense of self and i want to be a chimp brilliant cheers man cheers see you later