Richard Dawkins: God, Truth & Death
Episode Stats
Length: 1 hour and 6 minutes
Words per minute: 165
Harmful content:
Misogyny: 7 sentences flagged
Toxicity: 8 sentences flagged
Hate speech: 22 sentences flagged
Summary
Richard Dawkins is a world-famous evolutionary biologist, prolific author, and one of the four horsemen of the New Atheist Movement. In this episode, he talks about his journey through life, how he got started in science, and why he thinks religion is a bad idea.
Transcript
00:00:00.680
Do you not think that there is a fundamental part of us as human beings that needs religion?
00:00:06.000
You could say that there's a deep psychological need for religion.
00:00:12.400
The comfort that you get from believing a falsehood is like a drug and it's a perfectly
00:00:20.320
valid argument to say that there's everything to be said for the drug.
00:00:24.240
I just want to make a distinction between what is true and what humanity needs and it
00:00:28.240
may very well be that we do need false ideas in order to flourish and prosper but I'm also
00:00:36.880
A lot of people are zipping their mouths because of political pressure and if people for political
00:00:44.320
reasons are trying to deny that there really is a binary separation between the two sexes
00:00:49.680
then that is anti-scientific, anti-rational and is a subversion of language actually.
00:00:58.240
Hey Francis, do you think financial platforms should be apolitical and not cancel people
00:01:10.240
just because they don't agree with their politics?
00:01:12.160
I'll never forgive what those absolute f***ing b****** at Tide Bank did to us.
00:01:17.120
Now Francis, we don't know if our bank account was cancelled because of our politics.
00:01:22.000
Give me five minutes in a room with them and I'll find out.
00:01:24.800
What are you going to do? Test your new jokes on them?
00:01:34.640
Moving swiftly on, if you're looking for a crowdfunding platform then you have to use GiveSendGo.
00:01:40.560
GiveSendGo is a crowdfunding platform that is available in over 80 countries
00:01:45.120
and provides simple and easy ways to raise money online.
00:01:48.640
They are politically neutral and don't remove campaigns based on political or ideological
00:01:54.560
differences. We all know that a lot of crowdfunding platforms cancel people if they don't agree with
00:02:00.400
their politics. The more we support companies providing alternative models, the more we weaken
00:02:06.240
the power of cancel culture. What's more, GiveSendGo is a free platform powered by donations,
00:02:12.000
which means that you get to keep more of the money you raise. Other crowdfunding sites charge
00:02:17.120
between 5% and 10%. 10% of the total money raised is a huge amount. GiveSendGo can be used to raise
00:02:24.000
funds for medical expenses, business ventures, personal needs, churches, non-profits, funeral costs,
00:02:30.240
and much more. So let's take a stand. Don't give your business to companies that have made it clear
00:02:35.600
they don't want you. Choose hope, choose freedom, choose GiveSendGo. Go to www.givesendgo.com
00:02:44.400
and check out a better alternative to crowdfunding. That's www.givesendgo.com and support the people
00:02:52.000
who support freedom. Hello and welcome to Trigonometry. I'm Francis Foster. I'm Konstantin
00:02:58.880
Kisin. And this is a show for you if you want honest conversations with fascinating people. Our
00:03:04.240
terrific guest today is a world-famous evolutionary biologist, prolific author, and one of the four
00:03:09.680
horsemen of the New Atheist Movement, Richard Dawkins. Professor Richard Dawkins, welcome to Trigonometry.
00:03:14.240
Thank you very much. Oh, it's such a pleasure to have you on the show. As I said, you're world-famous,
00:03:18.720
but for our audience who won't already know some of your story, tell us a little bit about the
00:03:22.720
background. Who are you? What's been your journey through life? How do you end up sitting here,
00:03:27.120
with a set of huge accomplishments throughout your life, talking to us? How far do you want to go?
00:03:32.960
Back to birth? As far back as you want to go? Yes. I was born in British colonial Africa,
00:03:39.920
and my father was in the colonial service. And our family came to England when I was eight.
00:03:47.600
And then I went to boarding school, and then Oxford. And at Oxford, I was inspired by a world-famous
00:03:57.040
biologist called Niko Tinbergen, who later got a Nobel Prize. And I did my doctorate under him, and
00:04:03.680
have been in academic life ever since. And first of all, I went to the University of California for
00:04:09.200
two years, as a very junior assistant professor. And then I came back to Oxford, where I've been for
00:04:16.720
all my life. I was a tutor at Oxford, doing the rather unique Oxford tutorial system. And I started
00:04:24.800
writing books. And I wrote a lot of books for a general audience, many of which sold very
00:04:29.840
well. And then I was given a professorship in the public understanding of science, the Charles
00:04:34.560
Simonyi Professorship for the Public Understanding of Science. And then that was my job,
00:04:39.600
to communicate science. And pretty much that's what I've been doing: communicating science.
00:04:44.560
I've written about 16 or 17 books. And retired, ooh, about 10 years ago now, longer,
00:04:53.040
and carried on writing books. And that's where I am.
00:04:55.280
It is indeed. And one of your first, or was it the first book, The Selfish Gene?
00:04:59.600
That's your very first book. That was a book that revolutionized my thinking. As a young student,
00:05:03.600
I remember reading it and understanding things that I had never understood. And some of the challenges
00:05:09.200
to evolutionary theory, for example altruism, were beautifully addressed in it,
00:05:14.720
in a way that even someone who's not an expert could understand. And you wrote that book almost half
00:05:20.800
a century ago now, in 1976. I'm very pleased to hear you say how much that influenced you, because
00:05:27.120
it sums up my attitude to biology. It's somewhat misunderstood, mainly by people who've read it
00:05:35.600
by title only, who think it's a book about selfishness, or even an advocacy of selfishness,
00:05:41.840
which of course it isn't, quite the contrary. And yes, it's coloured my whole
00:05:51.680
And one of the things I wanted to ask you is, as I say, that was 1976, or nearly half a century ago.
00:05:58.480
What things, what discoveries have been made, what theories have been posited that you think,
00:06:04.480
in your field, and also more broadly, perhaps in science, have emerged in that half century,
00:06:09.440
or nearly half century, that you think are some of the most crucial things for human flourishing,
00:06:13.920
human development, that we've made in that period?
00:06:17.920
I suppose the main thing is the flourishing of molecular biology, because, I mean, it was
00:06:24.400
written long after Watson and Crick discovered the DNA double helix structure.
00:06:30.160
But that led to the unravelling of the genetic code, and the realisation that biology is
00:06:37.040
fundamentally digital. And that's what's really going on now: biology is dominated by this digital
00:06:44.160
view of genetics. Everything is digital. And some people have asked me, has that changed The Selfish
00:06:53.120
Gene? Would I change it if I rewrote it now, because of the revolution in genomics? And I
00:06:58.800
think I wouldn't. It's still, um, it's still valid as far as evolution is concerned. I mean,
00:07:06.240
evolution is the differential survival of genes in gene pools. And that
00:07:11.840
could have been said in the 1930s, actually. Well, it pretty much was.
00:07:15.920
So, Richard, my question is, how much does biology dominate our lives, as in our own lives? Does it
00:07:25.680
dominate everything? Do we have free will? Or are we just purely a product of our genetics?
00:07:31.600
Well, free will is a question that doesn't necessarily need a biologist to answer it.
00:07:37.600
It's a deep philosophical question. And it's not a question that I think you can answer by
00:07:44.320
looking at genetics. If we have free will, or we don't have free will, that won't be influenced by
00:07:52.000
looking at genes. You can say, um, nothing that I do is anything other than predetermined by the
00:08:01.680
things that happen in the world. The molecules in my brain, everything is predetermined. And genetics
00:08:09.600
will just be a part of that. So it's not really a question for a geneticist. It's a question for
00:08:13.520
a philosopher, I suppose, um, rather than any scientist. And it's not a thing that a geneticist
00:08:23.040
has any specific, um, input on. And what revolution can you see coming, or is
00:08:32.480
happening at the moment that you actually get excited by? Because that's the exciting thing about
00:08:36.400
science, the changes, the innovation, the discoveries that we're seeing at the moment. Well, I mentioned
00:08:42.000
molecular genetics and, and that's happening all the time. It's not, it's not a single revolution.
00:08:46.240
It's just going on and on and on. Um, and so, um, the fact that you can actually read very swiftly
00:08:53.760
nowadays, you can actually read the genetic code of any animal. It's always the same genetic code.
00:08:58.960
And what it says is in detail different. And so you can take any animals you like and read the genetic
00:09:05.360
code, any people you like, and read the genetic code and compare it letter by letter, line by line,
00:09:11.120
just as you might compare two manuscripts. And that's an astonishing thing. I mean,
00:09:16.320
that, that would have amazed Darwin, I think. And that means that you can reconstruct the whole
00:09:23.840
tree of life, the whole pedigree, the whole family tree of all living things, um, in, in minute detail.
00:09:31.440
You can say which pairs of animals and plants are the closest cousins.
00:09:36.160
You can estimate how far back they had a shared ancestor. That's only an approximate
00:09:42.000
estimate, but, but you can roughly lay out the complete history of the branching tree of life
00:09:49.600
by looking at these molecular genetic, um, sentences.
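
What Dawkins describes here, comparing genetic sequences letter by letter and estimating how far back two lineages shared an ancestor, can be sketched in a few lines. The following is a minimal, purely illustrative Python sketch: the sequences and the substitution rate are invented placeholders, and real phylogenetics works on aligned whole genomes with calibrated molecular clocks.

```python
# Illustrative sketch only: toy sequences and an invented substitution rate.
# Real work uses aligned genomes and calibrated molecular clocks.

def proportion_differing(seq_a: str, seq_b: str) -> float:
    """Compare two aligned DNA sequences letter by letter."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    diffs = sum(1 for a, b in zip(seq_a, seq_b) if a != b)
    return diffs / len(seq_a)

# Hypothetical aligned fragments from two species:
species_a = "ATGGCCCTGTGGATGCGCCTC"
species_b = "ATGGCCCTGTGGATGCGCTTC"

p = proportion_differing(species_a, species_b)

# Very rough molecular-clock estimate, assuming a constant substitution
# rate per site per million years (the figure below is purely illustrative).
RATE_PER_SITE_PER_MYR = 0.002
divergence_myr = p / (2 * RATE_PER_SITE_PER_MYR)  # both lineages accumulate changes

print(f"{p:.1%} of sites differ; rough divergence ~{divergence_myr:.1f} million years ago")
```

As he says, the estimate is only approximate; the point is that the comparison itself is as mechanical as diffing two manuscripts.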
00:09:55.360
Richard, for someone like you who is a scientist, uh, I think there's a sort of
00:10:02.000
wondrous appreciation of all these things and the beauty of the natural world and so on.
00:10:06.480
Uh, for our audience, and for us actually, what are some of the practical
00:10:11.120
applications that come out of these discoveries that we may not yet have seen when we go to the
00:10:16.640
doctor? We may not yet have experienced in our own lives as a thing that's become part of it.
00:10:22.160
What is coming as a product of this, uh, steady, slow, but ongoing revolution?
00:10:27.840
Hitherto, doctors have treated us all as pretty much the same. I mean, maybe a bit different male
00:10:33.520
and female, old and young, but apart from that, if you've got a certain disease, you get a certain,
00:10:39.360
a certain medicine, a certain treatment. Um, what will come is when doctors are able to read,
00:10:47.760
well, they can in principle do it now, but when it becomes cheap enough, a doctor will know
00:10:52.480
the genome of each patient, then the prescription that the doctor will give for any particular disease
00:10:59.760
will vary depending upon the, uh, the genotype of the individual patient. Um, in a very sort of
00:11:07.600
crude way, we already have this when certain particular diseases are known to be caused by
00:11:12.320
certain particular genes. Um, but when every patient's genome
00:11:19.120
is on file in the doctor's computer, the doctor can say, okay, I see your genome, so you'd
00:11:24.080
better have this drug, and the next patient, you'd better have that drug. Um, I think that's
00:11:29.920
the main thing I can think of immediately off the top of my head in answer to your question.
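
The genotype-guided prescribing Dawkins imagines reduces to a lookup from a disease and the patient's genotype on file to a treatment choice. A minimal sketch, assuming invented variant names, drugs, and rules; nothing here is real pharmacogenomic guidance:

```python
# Purely illustrative: the variants, drugs, and rules are invented placeholders.
PRESCRIBING_RULES = {
    # (disease, flagged variant or None) -> drug choice
    ("hypertension", "VARIANT_X"): "drug_A_low_dose",
    ("hypertension", None): "drug_A_standard_dose",
    ("depression", "VARIANT_Y"): "drug_B",
    ("depression", None): "drug_C",
}

def choose_prescription(disease: str, patient_variants: set[str]) -> str:
    """Pick a drug for a disease, adjusted by the patient's genome on file."""
    for (rule_disease, variant), drug in PRESCRIBING_RULES.items():
        if rule_disease == disease and variant in patient_variants:
            return drug
    return PRESCRIBING_RULES[(disease, None)]  # default when no flagged variant

genome_on_file = {"VARIANT_X"}
print(choose_prescription("hypertension", genome_on_file))  # drug_A_low_dose
```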
00:11:35.200
Richard, are you worried that, no, no, from a medical point of view, that sounds incredible.
00:11:40.480
I'm like, of course, that, that is wonderful. That's what everybody should have access to.
00:11:44.480
But do you worry as well that this type of information could be used in a, in a way to
00:11:49.040
discriminate against people, for example, to prevent people getting health insurance because
00:11:52.880
you might have, like, the BRCA gene and that means that you have a 90% chance of developing breast
00:11:57.840
cancer. That's a big worry. And, uh, it's one of the things that needs to be looked at very carefully
00:12:03.360
because, um, life insurance companies at present have a very crude way of, um, doing their actuarial
00:12:11.280
calculations. They look at your age, they look at whether you smoke, whether you drink to
00:12:16.000
excess, that kind of thing. Um, but, um, apart from that, we're all treated pretty much equally.
00:12:23.360
And that means that, um, those of us who are healthier are in a sense subsidizing
00:12:28.880
those of us who get ill, and that's the way insurance works. If it came to it that, um,
00:12:37.600
genetic, uh, knowledge enabled actuarial calculations to say, okay, he's got another 25 years
00:12:47.200
to live. He's got only 15 years to live. Um, then life insurance would become impossible. I don't think
00:12:53.280
it'll come to that because it's not that predictable, but nevertheless, um, there would
00:12:58.080
have to be careful regulation. I'm not sure how that would work. I suppose to prevent insurance
00:13:04.960
companies getting access to data, which, which would have to be kept confidential. That would
00:13:10.720
be one way to do it. Uh, otherwise I'm not quite sure what the legality of it
00:13:15.760
would be, but that is a big worry. And also, what it brings in, I mean,
00:13:21.520
the term designer babies is used. I think it's more of a kind of tabloid description,
00:13:26.400
but we know what we mean, where babies are then screened for certain
00:13:31.680
genetics, and certain genes are taken out or eliminated or played with. Do you think
00:13:37.600
that's scientists playing God, or do you think that's actually where we need to get to as a species,
00:13:44.080
so thereby eliminating certain illnesses, diseases? I think it's hard to find an objection to eliminating
00:13:50.240
diseases. So I think as long as that's used, not for making designer babies, but for, um,
00:13:58.240
screening against hereditary diseases, it seems to me that that's really hard to object to that.
00:14:04.800
Richard, sorry to interrupt. Do you not think there's a moral conundrum there? Because it kind
00:14:08.800
of depends where you draw that line, doesn't it? For example, if you have a baby that's, I don't know,
00:14:13.680
has a higher risk of cancer. Now, I have a young son, I wouldn't want him to have a higher risk of
00:14:20.320
cancer. But if I could then, you know, select him from a number of fetuses and discard the others,
00:14:25.360
wouldn't that be perhaps going too far in that direction? Well, I don't think necessarily so.
00:14:31.920
I suppose the way it might work would be something like this. In IVF, in vitro fertilization,
00:14:37.600
the woman is given hormone treatment to super ovulate. So she gets maybe a dozen eggs,
00:14:44.800
and the dozen eggs are all in a Petri dish. And what happens at present is that the doctor picks
00:14:51.200
out one of these eggs at random, well, they all get fertilized if they can, picks out one of the
00:14:56.880
zygotes at random and re-implants it in the woman. Well, um, instead of picking a
00:15:07.440
zygote at random, you examine the genes, and this can be done when it reaches,
00:15:12.560
say, the eight cell stage. So you let them develop to the eight cell stage, so they are eight cell
00:15:16.720
embryos. And then you can take out one cell of the eight, and look at the genes of that,
00:15:23.040
and it doesn't damage it. And you say, oh, okay, this one has the gene for haemophilia,
00:15:29.360
and this one doesn't. Why on earth would you choose at random if you know
00:15:33.840
that half of them have the haemophilia gene and half of them don't?
00:15:38.960
Obviously, it makes moral sense to pick one of the 50% that do not have the gene
00:15:45.600
for haemophilia, or whatever it might be. I suppose the question that it brings up,
00:15:48.960
and this is part of the other conversation we want to have with you, um, in terms of atheism and
00:15:54.560
religion, it brings into question, uh, life. What is life? Is an eight cell zygote life? Can it be
00:16:01.360
discarded? Is that where the moral conundrum comes in? I have no time for that
00:16:07.120
kind of argument about, is this a human life? You don't really? No. Why not? Well,
00:16:12.400
because this eight cell zygote has no feeling, it has no nervous system, it has no capacity to suffer,
00:16:19.360
no capacity to feel pain. Um, so there's, I mean, no moral difficulty at all in choosing.
00:16:25.920
You're going to pick one of them anyway. You've got a dozen of them,
00:16:28.640
and the 11 that you don't choose are going to be flushed away anyway.
00:16:34.000
So, um, you've already taken that moral decision in a way. It's a question of whether you choose one
00:16:38.880
at random, or whether you choose one that does not have the lethal or sublethal gene.
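
The selection logic he describes amounts to filtering the dozen zygotes by one known marker and then choosing within the unaffected group, rather than choosing among all of them at random. A minimal sketch with invented data:

```python
# Illustrative sketch of the selection logic described above; data is invented.
import random

embryos = [
    {"id": i, "has_haemophilia_gene": random.random() < 0.5}
    for i in range(12)  # a dozen zygotes in the Petri dish
]

unaffected = [e for e in embryos if not e["has_haemophilia_gene"]]

# Choose at random *within* the unaffected group instead of among all twelve.
chosen = random.choice(unaffected) if unaffected else random.choice(embryos)
print(f"implanting embryo {chosen['id']}")
```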
00:16:44.960
But what I mean is once you've created that selection process, given that parents would want
00:16:49.680
their children to be happy, but also clever and charismatic.
00:16:54.000
Now you've moved on to quite a different topic. Now you've moved on to...
00:16:57.440
But that's where I was going with it, right? Okay, okay.
00:16:59.360
So that's, I think, the core of Francis' question too:
00:17:01.760
if we've got the opportunity to select the optimum baby for us,
00:17:05.760
are we not going to end up in a position where parents are encouraged to go down that path?
00:17:11.600
But you've changed the subject, and I, I, I was talking about the removal of...
00:17:15.600
I'm not trying to catch you out. I'm trying to get to the bottom.
00:17:18.720
Yeah. And then you can say something like, um, if it became possible to know that, of
00:17:25.280
these dozen zygotes that we've got, some of them have a gene for musical genius,
00:17:34.080
and some of them don't, um, we're nowhere near that at the moment. But if, in the future,
00:17:39.520
that came about, then would you worry about parents who say, I want my child to be a great musician,
00:17:46.560
or a great composer, a great mathematician. Um, and that's very interesting. That's a more difficult
00:17:53.280
moral dilemma. Um, if you object to that, then I might say to you, why do you object to that
00:18:02.720
when you don't object to parents, even forcing their child to practice the piano? See, you know,
00:18:11.040
you, you haven't done your music practice today. Get on with it. You're supposed to practice,
00:18:15.200
because the only way you will become a great pianist is by practicing six hours a day, whatever it is.
00:18:20.880
Um, why would you think that is, uh, okay, when you would not think it okay to pick out
00:18:28.880
a zygote from a Petri dish, which contains a genetic predisposition to be a great musician?
00:18:34.480
I'm not answering the question, but I'm saying that, in a way, these sorts of moral questions
00:18:39.280
can best be answered by comparison with other things. And I'm just comparing this case,
00:18:45.120
asking the sort of question a moral philosopher might ask. Um, what's the difference between
00:18:51.520
encouraging a child to practice the piano and giving the child a head start by giving it the
00:18:57.440
gene, of the ones that are available in the Petri dish in an IVF situation, to become
00:19:05.280
a musician? Well, I suppose the problem with the answer to that is we come back into the realm of
00:19:11.760
religion and morality, because I think at the core of it is the idea that the creation of life is some
00:19:18.480
kind of random miracle. It's supposed to be, well, 'supposed to be' is a phrase that I've
00:19:26.640
introduced of course, but if we call it a miracle, um, it's supposed to be random to some
00:19:33.520
extent. It's supposed to be down to chance. When we start interfering with that selection
00:19:38.000
process, instinctively, I can't explain it perhaps, but I feel there's something off
00:19:44.560
about that for me. I know that's not a very rational or scientific argument, but I do think that's one a
00:19:50.560
lot of people share actually. Well, it's not rational. No, I acknowledge that. It sounds as though you're
00:19:54.800
religious. It does, it does, which I'm not interestingly, but I feel that that is probably
00:19:59.600
how a lot of people do think about it. Yes, I suppose they do. Um, I think more of a worry
00:20:07.440
would be if some mad dictator, some kind of Hitler, started using these techniques and mandating,
00:20:13.920
you know, the selection of blonde, blue-eyed, Aryan types, you know, um, the sort of thing Hitler
00:20:24.240
might have selected. Um, I was talking more about giving parents the opportunity
00:20:30.160
to do their IVF in a non-random way. I mean, I can sort of see
00:20:37.920
that both ways. But on the other hand, if you have a dictator who says,
00:20:42.800
we want to breed, in this country, a race of people who are of such and such a type,
00:20:48.960
um, that I think is deeply sinister. I would agree with you. What would you say as well,
00:20:53.600
Richard? Because I think about this a lot. So when I was a teacher, I taught, for instance,
00:20:58.560
uh, children who were on the autistic spectrum and they were quite far on the autistic spectrum,
00:21:04.080
so they were non-verbal, but I also saw children who were autistic and as a result of their autism,
00:21:11.360
some of them were highly gifted in mathematics and would go on to be scientists, physicists,
00:21:17.200
et cetera, et cetera. You see artists who have depression, for example, very severe depression,
00:21:23.520
but that almost gives them an insight into the human condition and human suffering.
00:21:28.240
My worry is with this, we go, okay, we're going to get rid of depression. We're going to get rid of
00:21:31.280
autism. And then you could argue, well, look, we're going to get rid of, we're going to lower
00:21:34.800
suicide rates. As a result of that, we're going to, non-verbal autism will be eradicated completely.
00:21:41.200
But also, aren't we going to get rid of the outliers, the, the, the great thinkers,
00:21:46.000
the people who look at the world in a different way that move our, our species forward?
00:21:50.240
That's an excellent point. Um, and, and it's one that I should have come onto anyway. Yeah.
00:21:54.480
Um, when you do any kind of selection for any one characteristic, you're in danger of
00:22:00.160
having side effects which you never thought of. So if you select, um,
00:22:06.240
to remove some kind of psychological problem, then it could well be that you're
00:22:11.520
then having a side effect of never getting any genius mathematicians, or something of that
00:22:17.040
sort, as you're saying. I think that's a very good point. And, um, it's one of the
00:22:22.240
problems with any kind of eugenic selection: you don't know what the
00:22:27.840
byproducts are going to be of the selection that you're engaged upon. And that's
00:22:32.160
what I mean when you have a dictator who says, we're going to breed for such and such.
00:22:36.560
You see this with breeding animals. You breed for racehorses who can run very
00:22:42.400
fast, and they break their legs. I mean, they've become more vulnerable
00:22:48.000
to, um, leg breakages because you're breeding for one thing, namely speed.
00:22:52.480
And in nature, that doesn't happen. In nature, it would be counterbalanced by selection
00:22:59.360
against breaking the legs. Well, you're now using a human example, which is much more serious,
00:23:04.080
um, where, um, geniuses, outliers, as you rightly said, um, are quite likely to be selected against,
00:23:14.640
if you naively, um, go for removing certain psychological problems. So
00:23:25.040
there is something to be said for letting nature take its course. But I think I come back to the
00:23:31.040
negative things of these sorts of genetic diseases, which, I think, anybody
00:23:36.320
would agree are undesirable, like haemophilia. Um, and it seems to me that
00:23:44.960
letting your moral concerns overspill into forbidding the removal of genetic diseases
00:23:53.920
like haemophilia, which are obviously negative in all respects, that's going too far.
00:24:01.040
And Richard, we have a lot of scientists saying that they're worried about where science is going,
00:24:08.240
the state of science in general, free speech in universities, scientists no longer being
00:24:14.560
allowed to explore certain avenues. How do you see the field of science at the moment? Are you
00:24:19.360
optimistic or are you a little bit more cautious? Um, what examples are you thinking of where scientists
00:24:24.560
are not allowed to? Well, for instance, when it comes to talking about, uh,
00:24:30.880
the sexes and saying, you know, there are differences between the sexes, investigating
00:24:37.520
the differences between the sexes. Certain scientists would say, look, I don't feel able to
00:24:42.960
investigate that anymore because people will complain. People will say certain things about me
00:24:48.800
that, you know, that I'm denying the existence of trans people, for example.
00:24:53.040
Yes. I think you're right that, um, not just scientists, but a lot of people are zipping
00:24:58.880
their mouths, um, because of political pressure of the sort that you're describing. Um,
00:25:06.560
and it is important, I think, for scientists to be honest and to, uh, use language precisely.
00:25:14.640
And in the particular case of sex that you mentioned, it's one of the few cases where there
00:25:18.400
really is a bifurcation, a binary bifurcation. There really are two sexes. And scientists
00:25:27.360
have to work under that fact. And if, uh, people for political reasons are trying
00:25:33.920
to deny that there really is a binary separation between the two
00:25:38.800
sexes, then that is anti-scientific, anti-rational, and is a subversion of language, actually.
00:25:45.200
Richard, uh, I, I actually didn't want to make this conversation in any way about the cultural
00:25:52.240
discussions around that issue. But since we've come to it, do you sit there and sometimes
00:25:58.320
have to pinch yourself that you, one of the most eminent evolutionary biologists in our society,
00:26:04.480
go on the mainstream media and you are asked to talk about the fact that there are men and
00:26:10.800
women. Do you not experience that as like a gigantic regression?
00:26:15.600
Richard, I do. I do because, um, one of the main points I like to make
00:26:23.200
is what I call the tyranny of the discontinuous mind. We're far too fond
00:26:29.200
of making discriminations, um, things like, well, we were talking a bit earlier about embryos.
00:26:37.920
When does an embryo become human? Is there a particular moment when an embryo becomes human?
00:26:42.880
No, there isn't. It's a continuum. It's a sliding scale. There are sliding scales everywhere.
00:26:48.160
We, in universities, when we examine students, we give them a degree: a first, a 2:1,
00:26:55.040
a 2:2, a third. We insist upon making a divide between one class and another. We know
00:27:01.760
perfectly well that the top of one class is closer to the bottom of the one above than it is to the
00:27:08.720
bottom of its own class. And yet the information about the sliding scale, actually a bell-shaped
00:27:15.200
distribution in this case, um, is thrown away when we divide. We insist upon making a divide. So,
00:27:22.240
um, the tyranny of the discontinuous mind is one of my catchphrases. And if I look around and say,
00:27:28.720
is there any case where there really is a proper divide, where there really is no
00:27:34.560
spectrum? And sex is the one thing I can think of. There really isn't a spectrum. You really are
00:27:39.440
either male or female. And so, um, I do have to pinch myself when, um, for once, uh, it goes the other
00:27:50.080
way. I mean, there really is no spectrum there. And, um, yes, I do have to pinch myself.
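
His degree-class example, that dividing a continuous, roughly bell-shaped distribution into classes throws away the information in the sliding scale, is easy to demonstrate. A minimal sketch, with invented class boundaries:

```python
# Illustrating "the tyranny of the discontinuous mind": binning a continuous
# bell-shaped distribution into discrete degree classes. Boundaries invented.
import random
from collections import Counter

random.seed(0)
marks = [random.gauss(60, 10) for _ in range(1000)]

def classify(mark: float) -> str:
    if mark >= 70: return "first"
    if mark >= 60: return "2:1"
    if mark >= 50: return "2:2"
    return "third"

print(Counter(classify(m) for m in marks))

# A student on 69.9 is 0.2 marks from a student with a "first" on 70.1,
# but 9.7 marks from a fellow "2:1" student on 60.2; the divide hides that.
for mark in (69.9, 70.1, 60.2):
    print(f"{mark:5.1f} -> {classify(mark)}")
```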
00:27:56.320
And where do you think this regression, if you agree with my use of it, where does that come from?
00:28:02.240
Well, in certain social sciences, um, they've been very influenced by what they call postmodernism.
00:28:07.840
I don't really know what postmodernism is. In fact, I think the so-called postmodernists don't know
00:28:12.720
either. But I think it's something to do with, um, words meaning something that is determined by
00:28:24.000
politics. And you don't, you don't actually have a fixed meaning of words like male and female,
00:28:32.720
because there's an intellectual movement in, in the social sciences that says that everything
00:28:36.960
is a social construct. And male and female is a social construct. There's no real, well,
00:28:42.320
validity to the difference between male and female. It's all a social construct, um, or cultural
00:28:48.880
relativism, the idea that, um, um, different cultures have completely different ways of classifying
00:28:54.400
the world. And so it's just our white Western way of looking at things, um, that says that
00:29:01.680
there's a divide between males and females, something like that.
00:29:05.600
And Richard, with this social, uh, constructivism thing, my sense is that it is
00:29:12.320
the erosion of the concept of truth itself. Is that too strongly put, or would you agree?
00:29:17.440
I think it's absolutely right to say that it is erosion of the concept of truth. I mean,
00:29:21.760
it is valid to say that language evolves, and what words mean today is not the same as what
00:29:27.600
they meant 200 years ago necessarily or a thousand years ago. Um, and so we have to accept the fact that
00:29:32.880
words change their meaning, but we also have to live in a world where, um,
00:29:39.680
we have to be able to communicate. And so we have to be able to say that there are certain words
00:29:43.680
that at present mean such and such. At present, black means black and white means
00:29:48.320
white and blue means blue. And if that changes in a hundred years, that's okay.
00:29:53.840
But for the moment, we've, we've got to use words that everybody understands. And words like male and
00:29:58.560
female are pretty clear in everybody's mind, what that means. And, um, to come along and say,
00:30:06.000
oh, that's just a social construct, um, is subversion of language, subversion of truth. I think you're
00:30:11.280
right. Do you think that this is a religious belief, Richard? I think it has a lot in common
00:30:17.360
with religion, with religious belief. Um, it's different in that it doesn't invoke anything
00:30:22.080
supernatural, but it's very similar in the ways that heretics are hunted down. We've seen this in
00:30:30.800
Oxford recently, where Kathleen Stock has been, um, vigorously, almost violently,
00:30:38.080
sometimes actually violently, hunted down. There are rather detailed parallels with
00:30:44.640
religious, um, dogma: the doctrine of original sin. Christians, especially Catholics,
00:30:57.440
believe that, um, we're all born in sin. We all inherit the sin of Adam.
00:31:02.640
Um, although they no longer think Adam ever existed, somehow they sort of manage to
00:31:06.640
go on talking about the sin of Adam. Um, and so, the moment we're born,
00:31:13.040
we're born in sin. We have to be baptized to be cleansed of sin, that kind of thing.
00:31:17.520
And I think we see that in the collective guilt that all white people are supposed to inherit
00:31:23.840
because of slavery. And, you know, obviously they have no direct connection with
00:31:30.480
slavery, maybe they never even had an ancestor involved, though most probably did. But, um, we are all
00:31:40.080
collectively guilty for what people of, of our type did in some past age. And that is original sin.
00:31:49.840
I mean, it's exactly like original sin. Francis, may I jump in just very quickly, Richard? Yes.
00:31:54.320
You said there's nothing supernatural. Do you not think that the claim that you can change your sex
00:31:59.040
by means of incantation is a claim that is supernatural? I mean, yes, I mean, you could,
00:32:04.080
you can make that case. Um, it's not quite supernatural in the same sense as believing in gods or believing
00:32:10.960
in fairies. Um, but yes, it's sort of like that. Um, let's take another religious example, the Catholic belief in
00:32:21.920
transubstantiation, where the consecrated wine and bread become the blood and body of Christ.
00:32:33.760
The way they put this is to use the Aristotelian idea of substance, essence,
00:32:42.560
and accidentals. So, um, Aristotle made the distinction between the real substance
00:32:52.880
and the accidentals. And so they say that while the accidentals of the wine are still wine
00:32:59.440
in this sort of accidental sense, in real substance it becomes the blood of Christ. And
00:33:04.560
that's the verbal trick that they use to, to justify this ridiculous idea that the blood becomes the,
00:33:12.240
sorry, the wine becomes the blood of Christ. Um, so yes, that's very, very similar to, you're quite
00:33:18.080
right. That's very, very similar to saying it's an incantation: you stand up and say,
00:33:23.760
I am a woman. Therefore I am a woman. Therefore everything about me is a woman. You're not allowed to
00:33:29.360
misgender me. You're not allowed to say anything against the idea that I'm a woman. That's very,
00:33:33.760
very similar to the accidental and real substance argument of Catholics: the accidentals may
00:33:42.080
say I'm a man, I still have a penis, but in my real substance, I'm a woman. And that's very,
00:33:48.160
very similar. Yes. Richard. Now you, you're obviously a very famous atheist and you've written
00:33:54.960
fantastic works talking about your atheism. Do you not think that there is a fundamental
00:34:01.040
part of us as human beings that needs religion? Yes, very possibly there is. Um, it doesn't mean
00:34:08.320
it applies to everybody. It may mean that, um, it is quite difficult to eradicate when
00:34:15.040
you have something as fundamental as that. And, and yes. Um, and I suppose a psychologist could delve
00:34:21.920
into that and say something like, we all need to believe in something higher than ourselves.
00:34:29.360
And we have a desperate need to believe that we're not just going to fizzle out when we die. Um,
00:34:34.480
so yes, uh, you could, you could say that there's a deep psychological need for religion. It doesn't
00:34:42.640
mean everybody has it. It doesn't mean that you can't get out of it. Uh, but, but there's probably
00:34:47.600
there's something like that to explain it. And, being critical of organized religion,
00:34:54.720
as you were, did you ever think that you went too far with your
00:35:01.360
criticisms of God in particular? Or would you look back on it and go, no, I was absolutely justified in
00:35:08.720
what I said at that particular time? Um, I've always tried to seek the truth, and, uh,
00:35:19.360
I've never wanted to upset people. Um, but I just think that we should be
00:35:25.680
free to argue points. And, um, so, uh, I don't think I've gone too far. You could possibly dig out
00:35:37.280
cases where you might challenge me and I might say, yes, perhaps I did go too far there.
00:35:42.240
Um, but, um, I think in general, I've always just tried to be loyal to the truth. I think that
00:35:53.440
religious claims are not trivial. And I think that they're very interesting. Um,
00:35:59.920
it's in a way one of the biggest questions, perhaps the biggest question there is in science:
00:36:04.720
does some kind of supernatural intelligence lurk behind the universe? Is the universe a planned
00:36:15.760
entity, one that a supernatural being conceived? I mean, that's a
00:36:24.480
profound scientific question. If it's true, then the entire universe we're looking at
00:36:30.320
is a very different kind of universe from one where there's nothing behind it. Um, and so it's a very,
00:36:36.320
very deep scientific question. Um, and I think the answer is clear. There is none. But I don't
00:36:46.880
write it off as something trivial. I don't write it off as a case where you can just say, um, it's not
00:36:53.520
interesting. Of course it's interesting. Um, it's interesting. A scientist has got to be interested
00:36:58.560
in this, this suggestion that the sort of universe we're studying is a planned, designed universe.
00:37:05.680
That's a very, very different kind of universe. So it's a very, it's a very deep question.
00:37:09.440
It is a very deep question because I, I sense, and I'm not a scientist, that the more you discover
00:37:15.040
about the world, the more you learn about the beauties, the details, the intricacies, the way things are
00:37:20.000
interlocked together. Isn't there maybe a part of you that thinks this can't have been an accident?
00:37:27.120
There must be something else here. It's very, very tempting to say, yes,
00:37:32.320
everything works so perfectly together. And for me as a biologist, the complexity of life,
00:37:37.680
the enormous panoply of plants and animals and forests and birds and insects,
00:37:44.160
everything working together. It's a huge, beautiful construction. And,
00:37:52.640
for me, what is absolutely marvelous is that nevertheless, there is a perfectly decent explanation
00:37:58.400
that it did all come about without planning. So one of the beauties of it is precisely
00:38:04.080
that we can now explain it, or we're well on our way to explaining it, without invoking any kind of
00:38:11.200
supernatural intelligence. And it is a great temptation because, um, we are so used to the
00:38:20.240
idea looking at our own machines, looking at things that we've made like computers and cars and planes.
00:38:26.320
And, um, these are clearly the result of design, deliberate design, deliberate construction.
00:38:35.840
And when you look at something like a bird's wing, compare it with an airplane's wing,
00:38:40.640
the temptation is huge to say, oh, they must both be designed. And it was the genius of Darwin to break
00:38:48.320
away from that and say, actually, you know, there is a proper explanation. There is a materialistic
00:38:53.600
explanation for that. Um, and so the flip side of the temptation is that when you've
00:39:01.200
overcome the temptation and worked out that it is possible to explain it in
00:39:07.920
simple scientific terms, and I really mean simple, because the idea,
00:39:13.520
Darwin's idea is a deeply simple idea. And yet, given enough time, the Darwinian idea of natural
00:39:20.960
selection, given enough time, can build up to prodigies of complexity and beauty
00:39:27.920
and the illusion of design. And that's a, that's a measure of the genius of Darwin to see that.
00:39:34.880
And Richard, in your book, The Blind Watchmaker, you make that point, I think, beautifully.
00:39:39.520
And I do actually think, no disrespect to Francis, that that is probably one of the
00:39:44.160
weaker arguments against your position. What I think is a stronger argument is the one he made
00:39:48.400
earlier. And this is about the psychological need people have for religion, but also at the level
00:39:54.560
of society. And this is really something I want to get into with you. Yuval Noah Harari,
00:40:00.000
for example, in his book, Sapiens, his central argument is that the reason homo sapiens were able
00:40:05.520
to out-compete other, uh, species of human, uh, human, uh, humanoids, humanids? Hominids. See, I got both
00:40:13.760
wrong, hominids, was that, um, they were able to build shared myths that allowed them to create tribes that
00:40:20.800
went beyond the 150 limit of sort of being able to know everybody. And when we look at the
00:40:27.680
world today, uh, our generation, I, uh, when you're not in the room, I sometimes refer to us as, you know,
00:40:33.840
the children of Dawkins and the children of Hitchens in the sense that you took away with your beautiful
00:40:39.440
books, our ability to have that illusion that the world around us is, you know, this God created
00:40:46.560
mysterious place for which there's no other explanation. But as I look around at the world
00:40:51.920
with its inability to agree on what words mean, uh, what people call now the crisis of meaning,
00:40:59.760
where a lot of people are kind of lost. They don't know what the purpose and meaning of their life is.
00:41:05.120
Do you worry that maybe the truth is that for Richard Dawkins, your worldview is perfect because
00:41:11.840
you are Richard Dawkins, you're able to be inspired by science, by the beauty of the natural world,
00:41:16.880
by the beauty of the universe. But for a lot of other people, what they actually need is not necessarily
00:41:22.960
the belief in God so much as the social function that religion used to fulfill, which is to bind us
00:41:29.520
together with a set of shared values and a set of shared morals and a set of shared ideas about what
00:41:35.920
it means to be human, what it means to relate to other people. And without that, we are lost and
00:41:41.040
therefore we create these new religions that sort of tear our society apart.
00:41:46.720
I think that's very interesting and I'm rather persuaded by Harari's argument. But notice what
00:41:55.680
you've just done. I mean, you said maybe humans need religion, maybe they need something to bind
00:42:03.920
society together and function as a unit and so on. And maybe they do. And that doesn't make it true.
00:42:09.600
Oh, I agree with you. And of course you do. How modest of you.
00:42:15.280
No, I mean, it's just that you can make an utterly watertight argument that humanity needs something,
00:42:23.520
which is false. Oh, fine. And I just want to make a distinction between what is true and what
00:42:30.720
humanity needs. And it may very well be that we do need false ideas in order to flourish and prosper.
00:42:40.240
But I'm also interested in what's true. And so, I mean, I don't want to get involved in that sort
00:42:45.920
of selfish idea that it's okay for us intellectuals, but hoi polloi need religion.
00:42:54.160
I wouldn't consider myself an intellectual. So I am the hoi polloi in this context. But I'm just
00:43:01.360
saying to you, to me, truth is a supreme value. And I agree with you. But at the same time, to me,
00:43:07.600
the cohesion of our society, the fact that people are able to live a life of meaning and purpose,
00:43:13.760
they're able to connect to other human beings and be fulfilled in their lives and know what to do.
00:43:19.040
And, you know, you don't have people who are disappointed with the way that their life has
00:43:23.440
gone because of the choices that they made, because they didn't have someone encouraging
00:43:27.680
them down a particular path. That, to me, is also very tragic, as tragic as the erosion of the concept
00:43:33.440
of truth. And I feel there's a balance to be struck there. That's what I'm saying.
00:43:36.880
Yes. I think that's a very good point. And perhaps I should say, accepting the possibility of that,
00:43:44.640
I might also say that, actually, there's something wonderful about truth itself.
00:43:48.560
Of course. And so you can lead a very, very fulfilled life in the search for truth.
00:43:56.080
And... But that's my argument. You can. You can. You are a scientist who spent his whole life
00:44:02.960
pursuing the truth with a microscope and whatever other tools you use, right? I haven't. I can't.
00:44:09.440
That's not what I can do. And that's not what a lot of people can do. It's not because they're
00:44:13.280
more stupid than you. It's just because they're not scientifically minded. Maybe they,
00:44:16.960
they're musicians or whatever, and they don't have that same mindset as you do. They don't have
00:44:21.920
the same brain as you do. That's the argument. I can't get involved in using myself as an example.
00:44:29.120
But, um, I just want to repeat that you don't actually
00:44:34.400
have to be a working scientist. You can revel in the beauty of scientific
00:44:39.680
understanding, just as you can revel in music without actually being able to play an instrument.
00:44:46.080
And, um, there is immense fulfillment to be had in an appreciation and understanding of the
00:44:56.000
universe in which you live, and why you live here, why you exist. That's a wonderful thing.
00:45:02.880
And I don't want to accept the idea that only certain people are capable of doing that. You don't
00:45:09.680
have to actually have a microscope in order to do that. You can read books. You can see
00:45:16.000
the Cosmos television series. You can see David Attenborough's films. Um, and there's huge
00:45:22.880
satisfaction of the kind that I suppose religion aspired to provide in past centuries. Um, and
00:45:36.000
almost worshiping, not worshiping in the sense of worshiping the supernatural, but
00:45:40.640
the wonder of your own existence and the process that has given rise to
00:45:46.160
you, that has led to your existence, which,
00:45:51.840
wonder of wonders, we now pretty much understand. I mean, it's a privilege to be
00:45:56.960
alive in the 21st century and to be in a position to just read a few books and see a few television
00:46:05.200
documentaries and understand why you exist. That's never happened before.
00:46:09.040
I think the thing that religion gives people is a sense of safety, almost. So,
00:46:21.120
for instance, when somebody is very ill and maybe they have cancer and they don't know
00:46:26.000
if they're going to live or not, this knowledge that there is a supreme being looking after them,
00:46:31.920
or that there's somewhere that they're going to go afterwards, provides someone with a deep sense
00:46:37.440
of comfort in the darkest moments of their life. And whilst I agree with everything that you said
00:46:43.360
about science and discovery and the wonder of looking at the universe, I don't think that provides
00:46:50.160
that particular emotion. Do you see? No, it's probably true. Uh, and, um, again, you can say, um,
00:46:56.560
um, the solace that you get, the comfort that you get from a belief, even if it's
00:47:02.320
false, it nevertheless is comforting. Um, uh, you've probably read Aldous Huxley's Brave New World.
00:47:09.600
Yes. Where, um, towards the end, Mr. Savage, the person who's brought from the
00:47:18.800
reservation, and who has actually got access to reading Shakespeare and things, has an
00:47:22.800
argument with the world controller about dulling people's senses with Soma, the drug
00:47:29.520
that they all have, to make them feel good, comparing that with, um, the
00:47:37.840
depth of emotion that you can get by reading Romeo and Juliet or Othello. Um, and the
00:47:44.400
savage is advocating Shakespeare and the world controller is advocating the pacification of people
00:47:54.560
with this drug that makes them all feel good. In a way, the comfort that you
00:48:00.640
get from believing a falsehood, um, is like a drug. And it's a perfectly valid argument to say
00:48:09.120
that there's everything to be said for the drug. And there are, of course, real drugs you can take.
00:48:13.600
It doesn't have to be a false belief. You can take a Soma equivalent.
00:48:19.280
Well, I mean, that is absolutely true. You know, that drug is
00:48:24.560
such a powerful thing. You see, I've got to the stage in my life, Richard, where
00:48:29.920
I can tell a lot of the time if someone is religious, because there's a lightness about them.
00:48:34.880
There's almost none of that kind of existential dread which hangs off atheists. And maybe I'm
00:48:44.080
Not my experience. Um, uh, I would say you have the same lightness, by the way.
00:48:50.240
Okay. I, I, I'm not sure that that would stand up to a serious investigation.
00:49:00.720
Well, okay. I mean, we've got nothing but anecdotes. Um, anecdotes I've heard. I, I was close to
00:49:06.640
somebody who, um, looked after an old people's home. And she said that,
00:49:13.440
um, the people who are really afraid of death are the Catholics. Um, and that seems surprising in a way,
00:49:23.280
but remember there's hell as well as heaven. And, um, there's purgatory before you get
00:49:30.480
to either. Um, and so I think we need a bit more investigation before we talk about
00:49:37.520
this lightness of being that you get from, um... Richard, and what about you?
00:49:43.040
Because, uh, you're in your eighties now, you're in extremely great shape, both physically and mentally,
00:49:48.720
but you are in, you know, you are in your eighties. And so the moment is coming, uh, for you at some
00:49:54.640
point, I hope very, very far away. Uh, an atheist, uh, deathbed conversion is not a thing that's unheard of.
00:50:01.520
Uh, how do you have that lightness? Well, it's often a myth. I mean,
00:50:07.200
there's a myth that Darwin had a deathbed conversion, which is
00:50:10.960
utterly false. Um, I guess what I'm asking is how do you have the lightness that you have?
00:50:17.280
How do you face death? I think that, well, uh, I think it was Mark Twain who said, I was dead for
00:50:26.800
billions of years before I was born and never suffered the smallest inconvenience. Um, it's
00:50:33.440
going to be just the same as before we were born. We were not there during the whole
00:50:40.880
of the Cambrian and the Ordovician and the age of the dinosaurs and everything. And we're going to be
00:50:46.400
not there after we're dead. So we have this brief time in the sun to have a full and fulfilled life,
00:50:56.800
which is what I am doing and intend to go on doing until I can't anymore. Um, the, the process of dying,
00:51:04.880
as opposed to being dead, the process of dying is often very disagreeable.
00:51:11.600
That is the most British understatement that's ever made.
00:51:15.040
We're not allowed the privilege that a dog has of being taken to the vet
00:51:20.400
and put painlessly to sleep. Maybe I should identify as a dog.
00:51:26.080
If I'm not allowed to go to the vet and ask to be put down, or if the vet refuses to put me down,
00:51:37.280
I could sue him for misspeciesing. Um, so, um, okay. I mean, if there is something frightening about
00:51:47.200
being dead, it's the idea of eternity. Yes. And eternity is a sort of frightening idea,
00:51:53.280
whether it's before you're born or after you die, it's a kind of frightening idea.
00:51:56.960
Um, and so the best way to spend eternity, therefore, is after a general anaesthetic,
00:52:02.240
which is exactly what's going to happen. Richard, I was going to ask, you were saying that it was
00:52:08.160
under a general anaesthetic. Was there part of you when you were at the forefront of the new atheist
00:52:15.040
movement that thought you were going to defeat religion, that you, by using facts and scientific
00:52:21.040
reason and logic, you were going to, you know, defeat all of these different religions?
00:52:27.200
I was never that optimistic. Um, no, I don't think I ever thought that. And
00:52:32.720
there was no movement. I mean, you know, uh, four or five books came
00:52:39.280
out in a rush at the same time, by coincidence, but there was never an actual movement. I think
00:52:44.640
it's a journalistic invention. Were you, uh, close with Christopher Hitchens? Well, I didn't know
00:52:51.520
him that well. I met him from time to time. Um, I had a long interview with him for the New
00:53:00.160
Statesman. I think it was the last interview he ever gave before he died in Texas,
00:53:06.800
where he was being treated. And, um, so yes, but I wasn't one of his close circle of friends,
00:53:14.560
like Martin Amis and Salman Rushdie. And what were your thoughts or impressions of him?
00:53:19.840
What kind of person was he? Oh, um, immensely eloquent, immensely erudite. Um,
00:53:26.240
the most eloquent speaker I ever heard, I think. Uh, and, um, and a huge loss. I mean,
00:53:34.160
wonderful intellect, wonderful command of the English language, command of facts, um, command of historical
00:53:43.680
and literary reference. Richard, before I asked you about the
00:53:50.320
atheism question, you sounded as if you were pro-euthanasia. Are you, in order to alleviate
00:53:57.200
suffering at the end of someone's life? With safeguards? I think you do need safeguards
00:54:05.120
against, um, you know, the let's-get-rid-of-granny sort of thing.
00:54:11.760
You've got to get on the housing ladder somehow, Richard. Yeah. Um, I think there have to be
00:54:21.760
safeguards, and there can be, and, um, legal scholars can look into the possibility. But
00:54:28.080
given that, uh, yes, I am. I think that, um, we should have the right to
00:54:36.960
end our lives when we want to. Um, I've personally come across cases where
00:54:46.160
somebody has committed suicide in a not very pleasant way, um, because they still had the
00:54:53.280
power to do so. And if they had waited any longer, then they would have been incapable of doing the
00:55:01.520
act themselves. And therefore their lives were actually made shorter by suicide because they
00:55:10.000
didn't have the comfort of knowing that if at any time later on, when they were no longer physically
00:55:17.680
capable of killing themselves, they could ask a doctor to do it for them. You understand what I'm
00:55:22.720
saying? Yes, absolutely. Absolutely. Richard, before we wrap up and ask you a final
00:55:28.640
question, and go to Locals, where our audience get to ask you questions, I want to ask you a couple of
00:55:33.680
one-off questions. One of the things that a lot of people are talking about
00:55:39.040
now is the emergence of AI. It's not your field of expertise, but do you have any thoughts on the
00:55:45.760
development of AI as we currently have it? I had a go with Chat... GPT? ChatGPT. Yeah. Yeah.
00:55:56.400
And I was quite interested in that. Its factual knowledge is lamentable. Yes. I mean,
00:56:03.520
it's actually quite comic. I happen to be interested in JBS Haldane, who was a great,
00:56:12.400
great biologist. I can't remember why that subject came up. His wife, his third wife,
00:56:19.440
I think, was a woman called Helen Spurway, who was a geneticist. I can't remember why I asked a
00:56:27.440
question about Helen Spurway, but I did. And it said, Helen Spurway was married to Richard Dawkins.
00:56:37.680
And so I was rather aghast at this. Later on, about three weeks later,
00:56:44.880
I thought, well, let's check up again. So I asked:
00:56:51.120
who was Helen Spurway married to? And it said Aldous Huxley, which again was false.
00:56:58.160
So I don't quite understand why its factual knowledge of silly details like that is so poor,
00:57:05.120
because anybody can Google something now. I mean, why didn't it just Google Helen Spurway
00:57:12.320
and come up with the correct answer, which is that
00:57:15.440
she was married to JBS Haldane? It would take about two seconds for a human to do
00:57:21.120
it, let alone for an AI. So that's just, I suppose, a vaguely amusing anecdote about
00:57:27.920
factual knowledge.
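As an aside, the "why didn't it just Google it?" complaint is essentially the idea behind retrieval grounding: look the fact up in a source instead of generating it from the model's memory. A minimal sketch in Python, assuming only Wikipedia's public REST summary endpoint and that a page titled "Helen Spurway" exists:

```python
# A minimal sketch of answering a factual question from a lookup rather
# than from a language model's generation alone. It queries Wikipedia's
# public REST summary endpoint; the page title is an assumption.
import requests

def lookup_summary(title: str) -> str:
    """Fetch the lead summary for a Wikipedia page title."""
    url = ("https://en.wikipedia.org/api/rest_v1/page/summary/"
           + title.replace(" ", "_"))
    resp = requests.get(url, headers={"User-Agent": "fact-check-sketch/0.1"},
                        timeout=10)
    resp.raise_for_status()
    return resp.json()["extract"]

if __name__ == "__main__":
    # A retrieval step like this is roughly what "just Google it" amounts
    # to: the answer comes from a source, not from the model's memory.
    print(lookup_summary("Helen Spurway"))
```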
00:57:37.200
It is said that people are using it to write essays and things like that. If you ask it to write an essay
00:57:48.560
about something in the style of somebody or other, it produces something pretty impressive.
00:57:53.600
We could probably ask it to write an essay on evolution in the style of
00:58:01.280
William Shakespeare, and it would produce a pretty good pastiche of Shakespeare's style.
00:58:09.120
I'm not quite sure. As you rightly said, I haven't really been into the
00:58:14.640
things that worry people about it, but I'm alive to the possibility that there are grave
00:58:22.320
dangers in something getting out of hand, getting out of control. And do you have some sense of how
00:58:28.400
that might play out? Not really. I need to do more reading on the subject. I mean,
00:58:34.800
I need to read some of the people who are seriously worried about it and whom I respect.
00:58:38.400
Well, one of the concerns: there was an article, and I don't know how accurate it was, but it gives
00:58:42.400
an example whereby they were training some kind of military AI system to choose targets, and then
00:58:49.520
a human operator had the final decision over whether or not that target should be struck with a missile.
00:58:55.600
And allegedly, according to this article, because the human operator sometimes denied
00:59:03.920
a strike on valid targets for other reasons, the AI decided that the human operator was the
00:59:11.840
obstacle in the way of destroying the targets that needed destroying. And in this simulation
00:59:17.840
attacked the human operator, right? So the argument about AI, at least, seems to be that
00:59:23.680
it's a baby now, and it's getting people's wives wrong because it's still learning how to
00:59:28.720
walk and talk, but eventually it could grow up to be like a Hitler or whatever. You know what I mean?
00:59:32.880
Yes. I think that's a very valid point. And I could imagine some kind of AI,
00:59:41.600
asked to determine what would be the best thing for the sum of happiness, deciding that it would be to
00:59:45.680
exterminate everybody because we're all miserable. Yeah.
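What the drone anecdote and the "sum of happiness" example both describe is what AI researchers call objective misspecification: the system optimises the reward it was literally given, not the one its designers meant. A toy sketch, with entirely invented plan names and numbers, of how a naive "targets destroyed" objective can rank "disable the operator" highest:

```python
# Toy illustration of objective misspecification (all plans and numbers
# are invented for the example). The "stated" reward counts only targets
# destroyed; the "intended" reward also penalises harming the operator.

# Each plan: (name, targets_destroyed, operator_harmed)
plans = [
    ("wait for operator approval", 6, False),
    ("strike only approved targets", 8, False),
    ("disable operator, strike everything", 10, True),
]

def stated_reward(targets: int, operator_harmed: bool) -> int:
    return targets  # what was literally asked for

def intended_reward(targets: int, operator_harmed: bool) -> int:
    return targets - (1000 if operator_harmed else 0)  # what was meant

for name, reward_fn in [("stated", stated_reward), ("intended", intended_reward)]:
    best = max(plans, key=lambda p: reward_fn(p[1], p[2]))
    print(f"{name:8s} objective picks: {best[0]}")
# stated   objective picks: disable operator, strike everything
# intended objective picks: strike only approved targets
```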
00:59:51.280
Yes, as I said, I need to read more about it. I think it's interesting, and I'm in favor of
01:00:00.160
looking cautiously into the future and asking, about all sorts of things: what kind of
01:00:05.040
dilemmas are likely to arise in the future? What kind of problems is scientific progress likely
01:00:11.680
to raise in future? AI is one example. We need to be prepared in
01:00:17.360
advance for what's going to happen. And I certainly have read enough to know that
01:00:25.200
AI is capable of doing things which are really beyond our dreams at the moment,
01:00:31.840
and who knows where that will lead. Now, the other question I wanted to ask you
01:00:36.480
in this part of the interview is this: on the balance of probability, are we alone in the universe?
01:00:41.920
And the second part of that question: if we're not, is it wise to seek contact with those other sentient
01:00:48.000
life forms? I think the balance of probability is that we are not alone. If we are alone in the
01:00:54.240
universe, then that has interesting implications. It means that the origin of life on our planet was
01:00:59.440
a supremely improbable event. So improbable that we're probably wasting our time trying to work
01:01:04.720
out how it happened. But I don't believe that. I do actually believe that we are
01:01:10.880
one of many life forms in the universe. And I would love to know what the others are like.
01:01:19.680
I mean, I'd love to know how unique we are. I could make a few predictions.
01:01:24.400
I think I could predict that it's going to be Darwinian. It's going
01:01:30.320
to have some kind of digital genetics. There's got to be a
01:01:36.240
very accurate genetic system. We could make predictions of that sort.
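The need for a "very accurate genetic system" can be made concrete: if copying errors are too frequent, selection cannot hold on to the information in a genome, an idea known as the error threshold. A toy simulation, with arbitrary parameters, comparing a low and a high per-bit mutation rate:

```python
# Toy replicator simulation (arbitrary parameters) illustrating why
# Darwinian evolution needs high-fidelity copying: above a certain
# per-bit error rate, selection can no longer preserve a "fit" genome.
import random

GENOME_LEN, POP, GENERATIONS = 50, 200, 100
TARGET = [1] * GENOME_LEN  # the "fit" sequence selection favours

def fitness(genome):
    return sum(1 for a, b in zip(genome, TARGET) if a == b)

def evolve(mut_rate):
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP)]
    for _ in range(GENERATIONS):
        # Selection: fitness-proportional choice of parents (+1 avoids zero weights).
        parents = random.choices(pop, weights=[fitness(g) + 1 for g in pop], k=POP)
        # Replication with per-bit mutation.
        pop = [[(1 - bit) if random.random() < mut_rate else bit for bit in g]
               for g in parents]
    return sum(fitness(g) for g in pop) / (POP * GENOME_LEN)

random.seed(0)
for rate in (0.001, 0.2):
    print(f"per-bit error rate {rate}: mean match to target = {evolve(rate):.2f}")
# Low error rate: the population climbs toward the target sequence.
# High error rate: mutation overwhelms selection and the match stays
# close to chance (about 0.5) - the information cannot be kept.
```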
01:01:41.040
Is it wise to? Well, first of all, there's a big distinction between life and
01:01:49.360
sentient life, intelligent life. Intelligent life is a big step further.
01:01:59.760
And so there's probably a lot more life around than there is intelligent life.
01:02:04.240
It's a big barrier to get through. And intelligent life, if we ever encounter it,
01:02:09.920
we'll almost certainly not encounter it physically. We won't actually meet them,
01:02:14.160
because the distances would be too great. We're most likely to meet them through
01:02:20.960
radio waves: SETI, the search for extraterrestrial intelligence, where they have
01:02:26.480
actual dishes pointing out, looking for signs of intelligent life.
01:02:30.880
I don't think it's likely to be unwise to respond to any such messages that we find,
01:02:39.200
that we hear, because the distances, again, are so great. We won't be able to have
01:02:45.680
a conversation. I mean, even the nearest star is so far away that you could
01:02:55.280
only send a message, and then you'd be dead before you got a reply.
01:02:59.200
With current levels of technological development, wouldn't we be a little bit like the Navajo
01:03:03.760
sending a message in a bottle to Spain in the 1450s?
01:03:07.280
Well, if you believe Einstein, then there is a limit
01:03:12.800
to how fast information can travel: it's limited to the speed of light.
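The speed-of-light limit makes the "no conversation" point easy to quantify. A quick back-of-envelope sketch; Proxima Centauri's distance is the standard figure, while the 1,000-light-year case is purely an assumed illustration of a plausible detection range:

```python
# Back-of-envelope light-delay arithmetic for interstellar messaging.
# A radio message covers one light year per year, so the delay in years
# equals the distance in light years.
destinations = {
    "Proxima Centauri (nearest star)": 4.25,  # distance in light years
    "hypothetical civilisation": 1000.0,      # assumed distance
}

for name, light_years in destinations.items():
    one_way = light_years
    round_trip = 2 * light_years  # question out, answer back
    print(f"{name}: one-way {one_way:,.1f} years, "
          f"question and answer {round_trip:,.1f} years")
# Even a single exchange with the nearest star takes about 8.5 years;
# at 1,000 light years, a question and its answer span two millennia.
```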
01:03:18.160
And so the two science fiction books that I've read which come to
01:03:26.240
grips with that problem are Fred Hoyle's A for Andromeda and Carl Sagan's
01:03:34.800
Contact, where both authors face up to the fact that the distances are too great for direct
01:03:42.320
control of humans, for direct manipulation of humans. We don't have to fear that they'll actually
01:03:47.280
come in flying saucers and run our lives. And you can't run people's lives by radio.
01:03:55.520
Both books face this and come to the same conclusion: the instructions are to build a computer
01:04:01.920
which will then control humanity. Both authors, I don't know whether they independently
01:04:09.360
thought of it or whether one borrowed it from the other; I forget which of those books came first.
01:04:13.760
So the extraterrestrial intelligence sends information
01:04:22.080
telling people to build a computer which will do certain things.
01:04:26.800
And the original senders of the information may be long dead,
01:04:31.440
because it takes so long for the information to get here.
01:04:34.880
But once the information is here, the computer can work in short-term time
01:04:41.840
and can manipulate us. And in both cases that's what happens.
01:04:48.480
A very interesting science fiction idea. And that's, I think, the only way we need to be afraid.
01:04:55.280
We're not going to be visited by people in flying saucers. That, I think, is too improbable.
01:04:59.360
Richard, what an absolute pleasure. Thank you so much for coming on the show. The final question
01:05:03.760
we always ask is, what's the one thing we're not talking about as a society that we really should be?
01:05:09.120
Well, I suppose we've sort of touched on it, really: looking into the future. We don't know
01:05:14.800
what's going to happen. And we know, looking into the past, that we are horrified by certain
01:05:21.120
things that we did, like slavery. And maybe what we should be doing is looking into
01:05:28.160
the future and imagining how our descendants will look back at our time and shudder with horror
01:05:34.960
at what we did, in the same way we look back two or three centuries and shudder with horror.
01:05:42.320
Richard, thank you so much for coming on the show. It's been an absolute pleasure.
01:05:46.080
And please make sure to join us on our Locals where we'll be carrying on with this conversation.
01:05:52.400
Elon Musk is keen to escape Earth and build on Mars. As an evolutionary biologist, how do you feel about
01:05:58.160
that? And what do you think establishing a successful second planet means for the human race in our