Net Zero Must Go - Matt Ridley
Episode Stats
Length
1 hour and 6 minutes
Words per minute
167
Harmful content
Misogyny
11
sentences flagged
Toxicity
13
sentences flagged
Hate speech
13
sentences flagged
Summary
Lord Ridley is a British author and science writer who has written a book about the COVID lab leak. In this episode, he talks about how science is being corrupted within the scientific establishment, and why he believes climate alarm is overblown and the UK should abandon net zero.
Transcript
00:00:00.000
I think we've got a crisis of confidence in scientific truth, and I think it's largely
00:00:07.440
self-inflicted. I've never thought that carbon dioxide has no effect at all. I've never thought
00:00:13.840
that we're not in a period of warming, but I've been blacklisted by the BBC. We're not in a period
00:00:19.640
of unprecedented warmth. We're not in a period of unprecedentedly fast warmth. We're not in a
00:00:25.040
period of increasing extreme weather. I think what the UK has done is unbelievably foolish,
00:00:30.280
and we should tear up net zero. Lord Matt Ridley, welcome back to the show.
00:00:36.660
Thanks very much for having me on. It's great to have you on. We want to talk about science,
00:00:40.820
and particularly one of the big things you were at the forefront of, of course, was the COVID lab
00:00:47.360
leak thing. We also want to talk about the climate conversation. Those are two, I think, major issues
00:00:52.900
where there are a lot of concerns from very sensible and reasonable people like you, not lunatics on
00:00:57.560
the internet, about the way that science is effectively being corrupted. And we want to
00:01:01.780
talk about that through the lens of those two issues in particular, and we'll talk about your book and
00:01:05.000
other things as well. So where are we with the COVID lab leak? It is sort of anecdotally pretty clear
00:01:11.680
that it came from a lab in China. Is that basically true? Yes, for most of us. So the CIA thinks that,
00:01:18.740
the FBI thinks that, Robert Redfield, the former head of the CDC in America, thinks that,
00:01:23.640
most of the world thinks that, ordinary people think that, the evidence is overwhelmingly strong,
00:01:29.780
that it's not a coincidence that it broke out in the only city with a huge research programme on
00:01:34.620
COVID-like viruses. But the exception is the scientific establishment, which has not budged an inch.
00:01:41.620
So the journal Science, the journal Nature, they keep publishing things saying it's not a lab leak.
00:01:46.040
The Royal Society, they refuse to have a debate on the topic, the National Academy in America,
00:01:51.780
likewise. I and a couple of colleagues, who are both academics, keep submitting a systematically
00:02:00.200
argued review paper, or versions of it, with all the arguments for a lab leak, spelled out with
00:02:07.760
references in the most objective and careful terms that we can, to journal after journal,
00:02:13.900
about six journals so far. They reject it within weeks. And the reason they gave to start with was,
00:02:20.820
this is a conspiracy theory, it's nutty, it's wrong, you can't believe a word of it. The reason
00:02:25.220
they now give is, everyone knows this stuff, so we don't need to publish it. Now,
00:02:30.580
both of those can't be true. And the obvious question that a lot of people will ask is,
00:02:36.400
Well, within science, there's a huge number of scientists who recognise that the evidence points
00:02:43.720
strongly to a lab leak. And they say things to us like, keep going, you're on the right track,
00:02:48.000
but I don't put my head above the parapet, because I'm worried about my funding, I'm worried about my
00:02:51.660
reputation with my university, that kind of thing. But there's a group of senior scientists,
00:02:59.260
scientists and editors of journals and things, who decided early on, that if they gave an inch on
00:03:05.960
this topic, and conceded that it might have come from an accident in a laboratory, the damage to the
00:03:12.760
reputation of the whole of science would be so great, that it would threaten their funding in
00:03:17.400
lots of different fields. I think they are 180 degrees wrong on that. I think if they'd come out
00:03:22.060
quite early and said, this is a serious hypothesis, we need to take it seriously and look at it and
00:03:25.740
discuss it, then the damage would be confined to one corner of very idiotic virology, which was
00:03:32.220
doing extraordinarily risky experiments that should never have been done, let alone at that biosafety
00:03:36.620
level in that lab. And, you know, this was 1% of 1% of 1% of science,
00:03:42.900
the rest of science is fine in terms of safety. It's just,
00:03:48.720
you could have confined the damage to one corner of the discipline, and then you'd have got full marks
00:03:55.200
for investigating yourselves, looking into it. As it is, they're saying, how dare you review
00:04:02.060
biosafety in laboratories? It's up to us to decide whether something's safe. Well, I'm a libertarian,
00:04:09.720
but even I don't go that far, if you know what I mean. It's remarkable what we're hearing from these
00:04:15.740
people. So I think they made a mistake early on at a time when it was all going to blow over in a few
00:04:20.640
weeks. It's just going to be a little local outbreak in China, and nobody would know about
00:04:25.000
it. They knew among themselves that it was very likely to be a lab leak, but they kept that quiet.
00:04:29.420
They published a paper saying the opposite, which became very influential, and then they found
00:04:33.600
themselves out on a limb and they couldn't get back. And so they're now, they're doubling down
00:04:37.880
on a lie that they know they told effectively, or a mistake they made, let's say. This would be
00:04:42.140
charitable. Yes, but of course, things are changing in America because this week,
00:04:46.500
Jay Bhattacharya, my good friend, has been confirmed as the head of the National Institutes
00:04:52.900
of Health, responsible for the whole funding apparatus for all biomedical research. And the
00:05:00.360
new head of the CIA, yes, is John Ratcliffe, who led an inquiry into this very
00:05:07.600
topic for the Heritage Foundation and came to the conclusion it's probably a lab leak. So suddenly in
00:05:11.620
Washington, you've got senior people saying the opposite of what they were saying a year ago.
00:05:15.940
Now, that means we're going to get more information coming out. We're getting bits of it already from
00:05:20.320
within the US. We may get some decent intelligence from whistleblowers in China or something. We may get
00:05:27.600
close to being able to say, case closed, this is definitely what happened. But five years have gone
00:05:33.580
by. And there's been a lot of track covering. And the hope of those who don't want to admit it was a
00:05:42.480
lab leak is that this gradually gets memory-holed and never gets resolved. And, you know, it just fades
00:05:50.400
into the background. And how much of the reason for these, you know, call it a cover up, call it
00:05:56.860
unwillingness to adjust course or whatever, is to do with the fact that the lab in China was being funded
00:06:04.640
by quite a lot of American scientists and scientific organisations. And so the people who are
00:06:12.760
in charge of funding that are the very same people who are then saying it didn't come from that lab.
00:06:18.460
Exactly. Well, I mean, Dr. Fauci, as I understand it, was heavily involved in all of this.
00:06:22.300
Anthony Fauci was responsible both for promoting this kind of research, for bringing to an end a
00:06:28.900
moratorium on this kind of research in the US, for allowing some of the funds to go through the
00:06:35.040
EcoHealth Alliance, a non-profit body in New York City, to the Wuhan Institute of Virology.
00:06:42.840
So, yes, you know, his fingerprints were all over it. So when Jeremy Farrar of the Wellcome Trust and
00:06:49.160
a number of other people came to him very early in the pandemic and said, we need to have a
00:06:52.960
conversation about whether this thing started in the lab. His initial reaction was, let's have the
00:06:58.040
debate and talk about it. And we know that at that meeting, they discussed, you know, whether to
00:07:02.920
discuss it. And within a day or two, Fauci is basically giving instructions to shut it down,
00:07:08.240
to make sure this never gets mentioned. And he's at the podium in the White House talking about this.
00:07:13.060
And he never admits that he's just had a meeting with a dozen other scientists at which they very
00:07:19.100
seriously discussed the possibility that it did start in the lab. He says it's conspiracy theory,
00:07:24.080
it's debunked, all this kind of thing. And they savage anyone in the press who comes out and says
00:07:28.580
that. So, yeah, you have to ask yourself, did they worry about the fact that they might have been
00:07:36.600
partly responsible? And then you find some of the emails that have come out through congressional
00:07:41.020
investigation. And you find that Fauci is sending emails to colleagues saying, we need to have a
00:07:46.620
serious conversation about the grants we gave to this lab. Now, why is he having that conversation?
00:07:52.620
He's concerned. So, yes, I'm afraid there is an enormous amount of motivated reasoning going on
00:08:01.180
here. You know, if you worry that you might be partly responsible for the death of many millions of
00:08:07.180
people, then of course you're going to, your brain is going to tell you that it's very unlikely that
00:08:12.780
that's the way it happened. There is a term for what you're describing, Matt, a cover-up.
00:08:19.360
Well, frankly, I personally think this makes Watergate look like a picnic. Because at Watergate,
00:08:26.280
yeah, there was a break-in, there was a bit of burglary, there was some financial corruption
00:08:31.820
involving a president, yes, but it didn't involve the death of somewhere between 7 and 28 million
00:08:39.740
people. It didn't involve the turning of the world upside down. The worst industrial accident
00:08:47.180
in the world is probably Bhopal, where 25,000 people died. This was probably an industrial accident,
00:08:55.100
in a sense. The scientific laboratory accident is much the same thing. We're talking about a
00:09:01.420
thousand times as many people dying. It's a very important issue. And that there was a cover-up
00:09:08.580
is not in doubt. Because we've seen the emails, we've seen the Slack conversations between the
00:09:13.880
people who were writing the paper, the so-called Proximal Origin paper, which came out pretty well
00:09:19.060
five years ago this week, which said there is no possibility it came out of a lab. And they were
00:09:25.160
saying to each other in private, in their emails, I still think it's frigging likely that it came out of
00:09:30.440
a lab. I hover one way or another, one of them said. And they went on saying that even after
00:09:35.760
they published the paper. So their current excuse, which is, oh, we changed our mind in the light of
00:09:39.780
new information, doesn't add up. Because even after they published the paper, they went on in
00:09:43.980
private saying, I still think it's a possibility. So I'm afraid that's the very definition of a cover-up.
00:09:49.000
You are covering up information that you have available and suspicions that you have about what
00:09:53.860
happened. Now, we, when I say me and a number of other scientists, well, scientists and journalists,
00:10:00.320
I'm not a scientist, but I'm a commentator on science, we are pushing the journal Nature Medicine
00:10:06.060
to retract that paper. Because it was clearly not just factually wrong, the arguments it
00:10:14.280
gives for why it couldn't have come out of a lab don't stack up. But it was written by people who
00:10:20.340
didn't believe it. Now, in my view, writing a scientific paper that says the opposite of what
00:10:24.580
you actually think in private is the very definition of scientific misconduct.
00:10:30.720
And Matt, this has damaged, and obviously millions of people have lost their lives, as you said,
00:10:36.960
but it has also damaged people's faith in science. The number of people now who were told
00:10:42.800
follow the science and now are rabidly anti-vax, rabidly anti-institution, saying that they can't
00:10:50.980
trust scientists anymore, they can't trust doctors. This has done a tremendous amount of damage to the
00:10:56.720
scientific institutions themselves, hasn't it? Absolutely. And this is a point I struggle to get
00:11:02.180
across to my scientific friends. They think that it's the fault of RFK and anti-vaxxers that people
00:11:11.840
have turned against vaccines. And I'm saying, look in the mirror. Who was it who built up,
00:11:17.240
who gave RFK the viability to make his arguments? How did his numbers shoot up? Because you made
00:11:25.180
claims for the vaccines that were way overblown about whether they prevented transmission,
00:11:29.940
about whether they were safe for children, et cetera, et cetera, which have damaged vaccines.
00:11:35.380
I'm pro-vax. You know, I think vaccines are a good thing. But the damage that's been done
00:11:40.980
to the reputation of that technology and, as you say, of science in general. So just to take
00:11:45.760
another very simple example. Five years ago this week, the World Health Organization was still
00:11:53.060
saying in capital letters, this virus is not airborne. Quite why they were so obsessed with
00:12:02.300
saying this, I don't know. And from that flowed all the stuff about if you stand six feet apart
00:12:06.980
and if you wash your hands all the time, you know, it'll be fine. Well, that was nonsense.
00:12:10.960
It's airborne. It doesn't matter where you stand. I'm breathing at you now. You know,
00:12:14.840
if I had COVID, you'd get it. I don't, by the way. At least I think I don't.
00:12:20.540
And so, you know, giving out genuine misinformation and then not saying, sorry, we were wrong leads
00:12:32.360
people to say, well, I wonder what else you're wrong about. And I'm someone who's covered science
00:12:39.020
all my life. I've championed science. I'm passionate about science. I love science. But I find myself
00:12:44.960
now taking any scientific announcement on any topic with more suspicion than I used to five
00:12:53.140
years ago. It's had that much effect on me. Now, imagine if the effect it's had on other people
00:12:59.240
who are not such fanatically pro-science people. So I think we've got a crisis of confidence in
00:13:07.440
scientific truth. And I think it's largely self-inflicted. To paraphrase a Republican
00:13:13.860
strategist in America who was saying something, I think just this week, there isn't a mirror big
00:13:18.660
enough for the scientists to see the damage they've done to scientific reputation.
00:13:25.960
And the worrying thing is, Matt, is that we are going to have another pandemic. These things
00:13:29.460
happen every 100 years or maybe less than that. So if we do have another pandemic and we have
00:13:36.740
scientists coming out, my concern is, and I'm sure yours is as well, how many people are
00:13:41.480
actually going to believe them this time around?
00:13:43.840
I hadn't thought of that, but it's a very good point. And, you know, I personally think
00:13:48.720
that proving this was a lab leak will be quite reassuring in one way, and that is it will be
00:13:54.720
the exception that proves the rule. It will be the case where because it was already trained
00:14:00.300
on human beings in the lab, it was infectious from the start and therefore nothing we could do
00:14:04.520
to stop it. Lockdowns didn't work, etc., etc. Whereas if it jumps out of nature, it stutters
00:14:12.160
like bird flu is doing at the moment. You get a few infections and then it dies out. It's quite
00:14:16.680
easy to contain. It's not very contagious. You can institute controls that will stop it.
00:14:23.280
And therefore, there is every reason to think we can stop natural pandemics, but we can't stop
00:14:28.000
artificial ones. Now, that's a very important lesson to learn if we want to learn it. But what's
00:14:33.000
happening in virology labs at the moment? There are more opening than ever before. There's no increase
00:14:41.660
in the biosafety regulations in any country on this. They're accelerating. The Wuhan Institute of
00:14:51.320
Virology Lab that is at the centre of this, run by a professor called Dr. Shi Zhengli,
00:14:56.980
announced a new experiment on a MERS-like virus the other day. Well, that's comforting.
00:15:03.640
Now, you know, maybe they sat down and said, let's do it in a more safe way. But actually,
00:15:09.700
in the paper they published in Nature the other day about this, they said we used biosafety level
00:15:17.260
2+. Well, we've never heard of 2+. We know 2 is not good enough for these kinds of viruses.
00:15:22.320
We know 3 is probably OK, but we don't like the sound of 2+. So what are these Western journals
00:15:29.060
doing publishing this stuff without saying, hang on, are you sure it was safe to do this
00:15:34.160
experiment? You know, there's an enabling factor in the West here that needs to be looked at.
00:15:41.980
And I just want to examine China's part on it, because how much responsibility does China have
00:15:48.320
to take for being opaque about the virus, not disclosing what happened, so as to let us really
00:15:56.000
understand what was going on? I guess what I'm really trying to say is, Matt, can we absolutely
00:16:00.840
say for certain that it came from a lab without Chinese cooperation?
00:16:10.720
Can we prove that without the Chinese cooperating with the investigation? Yes, I see what you mean.
00:16:14.720
I thought you meant without the Chinese cooperating in making the virus, which I wouldn't agree with.
00:16:23.620
Probably not. We probably won't get to 100% without some kind of whistleblower or other
00:16:33.720
opening up of the records in the laboratory. But in terms of beyond reasonable doubt, I think
00:16:41.720
we're pretty well there, frankly. I think the evidence when you look at it is so extraordinarily
00:16:46.480
strong. I mean, the coincidence of this thing happening without leaving a trace in any wild
00:16:51.760
animals, in the very city that is doing an extraordinary range of experiments that includes
00:16:58.540
a plan to do the very experiment that would have put in a furin cleavage site for the first
00:17:02.800
time in a virus. You know, I won't go into the details, but you know what I mean.
00:17:05.900
Now, what happened in Sverdlovsk in 1979 is relevant here, if you don't mind me going into
00:17:13.340
that for a second. And that is that there was an outbreak in which 65 people died. The Americans
00:17:23.600
said, we think you've just had a leak from your biowarfare anthrax plant in the city. The Russians
00:17:32.220
said, no, they've died of food poisoning. Come and look. And an international team went
00:17:38.820
in and looked and said, yeah, the Russians are probably right. And it all died down.
00:17:44.020
And then the Soviet Union collapsed about 12 years after the event. A scientist came to the
00:17:50.560
West and said, you were absolutely right. It was an anthrax lab. We left a filter off one
00:17:58.360
day and we sent a plume of anthrax over a suburb. If it had gone the other way, it would have killed
00:18:02.460
hundreds of thousands. In fact, it only killed 65 people. So it took a long time, but the truth
00:18:09.100
eventually emerged. Now, it's going to be, I still maintain, it's going to be quite hard
00:18:16.140
for the Chinese authorities to keep the lid on what happened in those months, in the autumn of 2019,
00:18:22.800
in that lab in Wuhan, where they had some kind of drill. They had some kind of emergency that caused
00:18:29.160
them to take their database offline, where they called for new ventilation
00:18:34.840
equipment for the lab, where they suddenly banned the sale of experimental laboratory animals in
00:18:40.440
markets as food. You know, there's a whole string of events that happened that autumn that we would
00:18:47.240
like to know more about. There are people who know exactly what happened. And are they still alive?
00:18:54.080
Well, one of them isn't. The guy who was developing a vaccine from very, very early in
00:18:59.720
the pandemic, so early that we think he probably started developing the vaccine before we
00:19:03.820
knew about the outbreak, because otherwise it's just impossible to see how he worked that fast.
00:19:08.600
He fell off the roof of a building and died. And coming back a little bit to these new labs
00:19:16.020
that you're talking about, because this is the thing that concerns me. We all make mistakes.
00:19:20.980
We all do stupid things. Governments do stupid things. Scientists do stupid things. We all make
00:19:25.660
mistakes. The mark of someone who is dangerous is someone who refuses to learn from their mistakes.
00:19:31.820
And in this instance, are you saying gain of function research is ongoing?
00:19:35.440
Well, gain of function can be quite a broad term. It can mean enabling a plant
00:19:45.060
to photosynthesize more efficiently. Let me rephrase. So are we still doing things that could lead
00:19:51.040
to another pandemic? Gain-of-function research of concern, GOFROC, is the sort of narrower category
00:19:56.820
that we're talking about. Sorry to be pedantic. No, no, please be pedantic.
00:20:02.980
It is continuing, I think, in the West with more caution. I'm sure lessons are being
00:20:19.000
learned even if they may not be acknowledged, but we have no such confidence about Chinese
00:20:25.500
laboratories. We certainly have no such confidence about laboratories in places like
00:20:31.160
Iran and North Korea where they would actually quite like to start another pandemic, if you see
00:20:35.800
what I mean. And in general, as I say, calls for stiffer regulation of virology experiments
00:20:47.880
that cause increases in lethality or infectivity of viruses are being resisted by the scientific
00:20:56.540
community. So there's a famous meeting called Asilomar, which happened in 1975, soon after the
00:21:01.440
invention of genetic engineering in California. And at Asilomar, they agreed on a set of self-policing
00:21:07.440
principles that the biotechnology industry and research would follow. And one of those was not
00:21:12.420
to work on highly pathogenic organisms. Well, somewhere along the line,
00:21:17.500
they gave up on that. An Asilomar 2.0 meeting happened a few weeks ago, the 50th anniversary of that
00:21:26.120
meeting. Yeah, it was 1975, so the 50th anniversary. Nobody who has been arguing for a lab
00:21:35.640
leak in this case was invited to that meeting. The people who went were all the people who were saying
00:21:40.700
this was definitely not a lab leak. So there's a real danger of science turning in on itself and
00:21:46.580
talking to itself and thinking it doesn't need to listen to the concerns of the public, um, or of
00:21:51.620
informed critics. And I don't think that's healthy. There's something powerful about hearing the
00:21:58.460
story of a country, not just as a list of events, but as a journey. There aren't many I find more inspiring
00:22:05.280
than America's story. The Great American Story, A Land of Hope, is a free online course from Hillsdale
00:22:12.060
College that brings it to life. It covers the American Revolution, the Civil War, the World Wars,
00:22:17.340
the Cold War, the defining moments that shaped a nation. But it's more than a timeline. It's about
00:22:24.180
the ideals that held through those challenges. Liberty, self-governance, perseverance, and hope.
00:22:30.020
What makes this course special is the way it blends honesty with inspiration. It doesn't avoid the
00:22:36.480
hard moments, but it also highlights the achievements, the progress, the principles,
00:22:42.000
the resilience. It's taught by Hillsdale president Larry Arnn and historian Wilfred McClay. And it's
00:22:49.700
designed for anyone who wants to understand America's past in a way that actually builds perspective.
00:22:55.980
The course is completely free. Just go to hillsdale.edu slash trigger. That's hillsdale.edu slash trigger.
00:23:06.440
And enroll today. You'll come away with a deeper appreciation for the story and the promise of America.
00:23:13.700
00:23:45.360
Are we missing something in this analysis, Matt? Is there a holy grail at the end of this type of
00:23:51.580
research that the trade-offs of killing a few million people are just so worth it?
00:23:56.920
Well, that, of course, is the phrase that, in fact, you used in a paper, you know, that the
00:24:03.360
Right. So there must be some massive benefits to this.
00:24:05.200
A long time ago. We can't see it. I mean, the benefits, in theory, are being able to predict
00:24:11.460
and prevent the next pandemic. Well, that went well, didn't it?
00:24:16.780
But even at the time this stream of research was kicked off 15 years ago, there were other
00:24:25.360
virologists saying, look, going out and finding viruses in bats in the real world and bringing
00:24:34.440
them into a laboratory and souping them up to see how dangerous they are. You're very unlikely
00:24:39.720
to find the one that actually is going to cause the next pandemic anyway. And if you do, you
00:24:44.220
might make it more dangerous. It's not really a very good use of money, this. Why not just beef
00:24:49.220
up surveillance and testing and tracing so that when there's a local outbreak, we're
00:24:56.360
quicker onto it? Why try and predict it? Why not just be better at stopping it when it first
00:25:02.120
starts? And that has always seemed to me a very good argument, because if this kind of
00:25:08.140
research can lead to 20 million deaths, then it's got to have an enormous upside for it to
00:25:13.980
be worth it. And the upside they talk about is pretty small.
00:25:17.300
Well, in that case, I am in a position where I don't understand. I always think, why are
00:25:23.640
people doing certain things? And people generally respond to some kind of incentive structure
00:25:28.360
within the culture and mentality they operate in. So if what you're saying is this type of
00:25:34.760
research is ongoing in one form or another, why are those people doing it?
00:25:40.360
To get papers in Nature, to get promotion, to get scientific fame and renown. That's the big
00:25:57.320
You don't need to get complicated about this, if you know what I mean.
00:26:00.640
That tells me the incentive structure within science is all wrong then.
00:26:06.780
Well, and in quite a lot of areas. Because I think if you examine what gets you fame,
00:26:17.880
renown and promotion in science, it's publishing a lot, it's doing something that nobody's done
00:26:22.700
before, etc. It's not necessarily solving a problem that humanity wants you to solve.
00:26:28.720
Now, I mean, they'd like to do that as well. And I don't... I'm not here to say that all
00:26:34.000
scientists are evil. You know, I don't think that... Evil is the wrong word here. But I think there
00:26:39.700
are misaligned incentives that have become very out of control. And for me, the big problem is the
00:26:46.780
monolithic nature of the funding and publishing system. The fact that basically all the money flows
00:26:53.580
through one government agency, so in this country, UKRI, in America, NIH, or whatever. And you say,
00:27:00.200
well, there's the Wellcome Trust. Wellcome Trust just falls in line with UKRI. You know, there's no...
00:27:04.820
There's no attempt to be the red team to their blue team and to fund different things.
00:27:11.980
And then when it comes to publishing, you know, you don't find Science saying, well, I'm not going to publish
00:27:17.040
that, and Nature saying, I will, except in a sort of competitive sense: we'd like to do it first. You know,
00:27:21.420
they all have exactly the same criteria. So it's become very monopolistic, science. And for me,
00:27:26.920
monopoly is the big theme of what's wrong with this world, what's wrong with government, what's wrong
00:27:30.800
with crony capitalism, etc. You know, wherever you get monopoly, you've got a problem.
00:27:37.320
It's a fantastic point, because I have a lot of friends who work in science. And what they say is,
00:27:43.340
the most important thing for a scientist is actually to get funding.
00:27:47.880
People would be very shocked to hear that, and so was I, but apparently
00:27:52.300
it's true. If you want to get funding, you know the avenues that you need to go down. You know the
00:27:58.000
research that is looked upon favourably. So you're not going to do something which is deemed to be,
00:28:04.780
you know, slightly conspiratorial or, you know, slightly dodgy for whatever reason. You are going
00:28:11.240
to stick to the current paths. And that's a tragedy for science and human innovation, isn't it?
00:28:17.300
Well, to take another example from a different field, Alzheimer's research decided some years ago
00:28:26.780
that the hypothesis they liked best was the amyloid plaque hypothesis, that if we could find a way for
00:28:34.900
these plaques from forming in the brain, then we would be curing Alzheimer's.
00:28:42.420
And at various points along the way, scientists have said, hang on a minute.
00:28:48.660
I'm not sure you're not just treating the symptoms rather than the causes here.
00:28:53.940
We don't seem to be getting anywhere with this hypothesis. And they have been excluded. There's
00:28:59.540
been a really shocking degree of, we refuse to publish you if you don't subscribe to the amyloid
00:29:05.620
plaque hypothesis. You're a nutcase, you're a conspiracy theorist, blah, blah, blah. And there are
00:29:10.260
other hypotheses out there and they get ostracised. An even more striking example, which is admittedly
00:29:17.060
from longer ago, in the 1980s, is the stomach ulcer story.
00:29:22.420
The guy who proved it by swallowing the thing and giving himself a stomach ulcer.
00:29:26.420
So Glaxo and others were making an absolute fortune out of these antacid drugs that fought
00:29:32.580
the symptoms of stomach ulcers, but never cured them. And this guy comes along and says, look,
00:29:37.540
I think they're caused by a bacterium called Helicobacter. And they wouldn't publish him
00:29:41.860
and they wouldn't fund him and they wouldn't give him grants. Barry Marshall was his name in Western
00:29:46.900
Australia. And eventually, in order to prove his point, he swallowed a glass of Helicobacter and got
00:29:52.340
the most ferocious stomach ulcers. And he then swallowed an antibiotic and cured himself.
00:29:57.860
And at that point, the world had to sit up and take notice. And he eventually got the Nobel Prize.
00:30:02.260
But the degree to which he was pushing uphill for most of his career is quite shocking.
00:30:08.260
So in order to disprove what was a commonly thought hypothesis, he had to give himself literal stomach ulcers.
00:30:22.740
There's a great tradition of self-infection. I think Ronald Ross, the guy who made the link
00:30:30.260
between malaria and mosquitoes, he deliberately infected himself too. They're quite mad,
00:30:35.940
some of these doctors. Brave is the word I should use.
00:30:39.460
Yes. So moving that along to a subject which is even more controversial than COVID, which is
00:30:46.340
climate change or the climate crisis. I mean, that's going to be even worse, isn't it?
00:30:51.140
Well, I've covered that topic for 45 years now. I wrote about it when I was science editor of The
00:30:57.380
Economist in the mid to late 80s. And I have watched a perfectly reasonable hypothesis that carbon
00:31:07.220
dioxide might cause runaway warming, be tested and discussed and debated. And for 10 or 20 years,
00:31:15.060
you could argue both sides of it and then gradually get closed down to the point where if you say
00:31:21.860
something like, yeah, I think carbon dioxide is a greenhouse gas, but I don't think there are positive
00:31:27.620
feedback effects. And I think it's a diminishing effect. So I don't think we're going to get runaway
00:31:32.980
warming or dangerous warming. And I don't see any evidence that it's causing more extreme weather,
00:31:38.660
which is basically what the facts show after 40 years. If you say that, you are a denier.
00:31:46.500
Now, what does that word even mean in this context? It's an extraordinary thing to level at someone.
00:31:53.860
And now, you're not just a denier if you say, I think climate change is not as bad as you think.
00:32:01.540
You're a denier if you don't use the word crisis. You know, you and I have always been in meetings,
00:32:08.820
I'm sure, where people say the climate crisis is very important. If you stick your hand up and say,
00:32:12.260
look, can we talk about climate change, but not the climate crisis? Because I feel that's emotive
00:32:15.860
language. It's like you farted. And as I say, you know, I've never thought that carbon dioxide has
00:32:29.940
no effect at all. I've never thought that we're not in a period of warming. But I've been blacklisted
00:32:37.380
by the BBC, because I once said some of these things on the Today programme, etc. And the idea
00:32:45.540
that, you know, and someone with my view in a university would never get funding, would never get
00:32:53.140
published, and would be quite quickly cancelled. Now, I don't think that's healthy. I mean,
00:32:59.860
I'd love to hear. There are other people out there who say there's no carbon dioxide warming.
00:33:05.300
It's all caused by the sun. I think they're wrong too, by the way. But I don't see why they couldn't
00:33:11.060
occasionally be allowed one job in a university or one paper in a journal, which we can then
00:33:16.660
criticise, if you see what I mean. And is it your assertion, therefore, that the consensus that we're
00:33:22.900
often told about that exists within science, you know, ninety-something percent, and it changes every day, whatever that
00:33:27.940
percentage is. But basically, the way it lands with a member of the ordinary general public like me
00:33:33.940
or Francis is, basically, all the same scientists agree. And there's this guy who thinks, you know,
00:33:40.100
that the sun revolves around the earth, and he's a bit weird. And that's why no one listens to him.
00:33:45.620
Yeah. Well, are you saying that's manufactured by the fact that, basically, you can't say a
00:33:51.940
different opinion, and therefore 93% of scientists agree?
00:33:55.060
Yes, it's partly that, that it's a self-fulfilling prophecy. But also, the very study that got to
00:34:01.140
97% of scientists said only that they think climate change is happening. Well, I'm in the 97%.
00:34:08.420
They didn't say they thought it was a crisis. You cannot find any study that says 97% of scientists
00:34:14.020
think it's a crisis. Literally. I mean, you could probably get to 50 or 40 or 60.
00:34:18.580
So what's the, in your assessment, what is the reality of climate change and the human impact on
00:34:27.700
that process, and therefore, the measures that we ought to be looking at and taking to mitigate
00:34:33.460
that problem? Well, that's a very big topic. But to be blunt, I think we are in a period of warming.
00:34:40.100
I think we're going to have to adapt to that anyway, because there's every chance that we're not going
00:34:44.660
to be able to stop it continuing. Can I just stop you there, Matt? You said we're in a period of
00:34:49.700
warming. What does that mean? What does that actually mean in terms of data, et cetera?
00:34:54.020
Because there'll be people going, I don't know what that means. I mean, me, by the way.
00:34:56.980
Exactly. Well, the average temperature of the planet does seem to be going steadily upwards.
00:35:06.020
It went down in the 70s. I remember all the fears about the ice age coming.
00:35:09.460
Yes, in the 70s. So we were told the planet is about to freeze in the 70s.
00:35:13.780
Yeah. And that wasn't a fringe view, by the way. It was, you know, covers of Time and Newsweek
00:35:18.500
and all that kind of stuff. Yeah. People have forgotten that.
00:35:20.660
Yeah. And it was going to be six degrees of cooling and, you know,
00:35:23.060
it was going to be amazing changes. And that didn't happen. It bottomed out and started warming gently.
00:35:29.860
And since then, it's warmed at, you know, about a quarter of a degree per decade,
00:35:37.460
on average across the last 30, 40 years. And it's maybe, they say it's 1.5 degrees above
00:35:45.540
pre-industrial levels now, but pre-industrial levels did this, you know. So which bit do you want to
00:35:50.820
prove? I mean, we know that in the medieval period, it was warmer than today because you find
00:35:54.740
whole forests that are emerging from melting glaciers now. Well, there were forests in 1500 where there's
00:36:01.060
since been a glacier, which means it must have been warmer then. So the glaciers have come and then gone away again.
00:36:05.380
So we're not in a period of unprecedented warmth. We're not in a period of
00:36:09.620
unprecedentedly fast warmth. We're not in a period of increasing extreme weather.
00:36:14.100
Floods, droughts, storms, there's no increase in either frequency or severity.
00:36:23.940
Floods, you need to take into account the fact that we're building on floodplains and making it
00:36:28.020
harder for rivers to lose their energy and things, you know. So there are, of course,
00:36:32.260
there's man-made factors, but that's not the same as saying that it's the climate that's making
00:36:36.580
them worse. So for me, climate change is real, it's happening, it's producing an effect,
00:36:44.500
some of which is deleterious and we need to work out what to do about it.
00:36:49.780
But at the same time, the carbon dioxide we're putting in the air is having a very measurable
00:36:54.500
effect that's beneficial. And we refuse to take that into account. And that is global greening.
00:37:00.660
You know, there is more green vegetation on the planet now compared with the 1980s,
00:37:06.500
equivalent to a continent the size of North America that's been added of green vegetation.
00:37:13.220
That's an enormous impact. It's approximately 16 or 17 percent more green vegetation on the planet
00:37:21.620
now than there was in the 80s. And there's lots of ways of proving that. It's satellites,
00:37:26.740
it's the variation seasonally, but on the ground too. You take photographs in deserts, you can see
00:37:32.660
how much greener they are than they were then. And that's because carbon dioxide is plant food.
00:37:37.620
And when there's more in the air, the plants grow better, particularly in arid areas.
00:37:41.620
Now, that's had an effect on crop yields. It's not the only effect on crop yields, but 15 percent
00:37:47.140
is quite a good improvement. And if you add the dollar value of that up, it's very hard to make
00:37:56.420
it smaller than the dollar value of the increase in warmth that has done damage. Because even the warmth,
00:38:04.420
that means that fewer people are dying in winter. Most of the warmth is concentrated in winter at
00:38:09.540
night in the north, not in summer and daytime and in tropical areas. So for me, the cost benefit
00:38:16.260
analysis is not clear. And yet there are so many vested interests now in continuing to talk about it as
00:38:23.220
a crisis and fund it as a crisis that I'll not get a hearing for what I just said.
00:38:30.100
Well, you're getting it here. And I find it very interesting because it sounds to me a little bit,
00:38:34.820
and, you know, as part of the media ecosystem now, I kind of start to see human behavior over
00:38:41.940
time. And I begin to see patterns. And I think one of the patterns is, particularly among the
00:38:47.140
commentariat, the people who may be not informed, is to kind of go, well, you know, the line is moving
00:38:52.820
in this direction. And if we continue moving in this direction for the next 50 years, things are going
00:38:57.700
to get really bad. And then the line starts moving in another direction. And we're like,
00:39:01.940
well, if it moves in this direction, 50 years. And we forget that, like, the line moves up and down.
00:39:08.020
And that's where I think the real concern about climate change is, which is the runaway effect.
00:39:12.820
Yes. Which is, if this carries on for a long time at this rate, we really do have a problem. And I don't
00:39:21.140
Absolutely. So what is the reality of the runaway thing? Because that's really the big piece of it.
00:39:26.420
Yes, you're quite right. The tipping point where it's a little bit of warmth creates a lot more
00:39:32.260
warmth. That would be the worry. You start to collapse the ice sheets, that makes everything
00:39:36.340
darker and therefore warmer, because ice is white and water is dark. You know, things like that.
00:39:43.220
Maybe when you make it warmer, you get less clouds, and therefore you get more sunlight coming in.
00:39:47.460
Most of the evidence on that is that the positive feedbacks are pretty small and diminishing. We
00:39:55.220
know, for example, that for every extra bit of carbon dioxide you put in the air, you get a
00:40:00.900
diminishing effect. The first bit has a big effect, the second bit has a smaller effect, and so on.
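[Editor's note: the diminishing effect described here corresponds to the standard logarithmic approximation of CO2 radiative forcing. A minimal sketch, assuming the widely cited Myhre et al. (1998) fit, ΔF ≈ 5.35 ln(C/C0) W/m², which is not stated in the conversation itself:]

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate radiative forcing (W/m^2) relative to a baseline
    concentration, using the common logarithmic fit dF = 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each successive +100 ppm of CO2 adds less forcing than the one before:
levels = [280, 380, 480, 580]
increments = [co2_forcing(b) - co2_forcing(a) for a, b in zip(levels, levels[1:])]
print([round(x, 2) for x in increments])  # roughly [1.63, 1.25, 1.01]
```

[Because the forcing is logarithmic, each extra increment of CO2 contributes less warming than the last, which is the "first bit has a big effect, the second bit has a smaller effect" point being made.]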
00:40:06.900
So there are negative feedback effects and positive feedback effects.
00:40:10.820
Most of the attempts to find tipping points, you know, the collapse of the Atlantic circulation,
00:40:18.260
collapse of the Greenland ice sheet, etc., and the acceleration of sea level rise, that's another
00:40:24.660
possible threat. After 40, 50 years, we can be much more relaxed about them than we were.
00:40:30.740
And, you know, it was reasonable to worry about them in the 80s, as I did. And I wrote some
00:40:35.060
alarmist articles in the 80s, saying this thing might go haywire after a certain point.
00:40:40.500
I now think that's pretty unlikely. But if we had a technology that said we can stop emitting
00:40:49.060
carbon dioxide tomorrow and it won't cost you a penny, fine, no problem. You then have to say,
00:40:55.940
well, how difficult is it to stop emitting carbon dioxide? And we've tried for 30, 40 years now to do
00:41:04.340
that. And where are we today? 82% of the world's energy comes from fossil fuels. In the year 2000, 83%, roughly.
00:41:17.860
We've hardly changed. It's gone up a bit and then down a bit, you know, may have got to 84 at one point,
00:41:23.540
down to 81 at one point. But, you know, we can't find a replacement for fossil fuels that is both
00:41:30.820
reliable and cheap. And, you know, just think about heating your home or driving your car or whatever.
00:41:39.460
You know, it ain't that easy. And if you're living in Burkina Faso and you're burning brushwood,
00:41:47.540
which you've collected from the surrounding forest or scrub to keep yourself warm and cook food at night,
00:41:55.460
and the World Bank says you can't have money for a bottled gas program in that country because
00:42:06.020
it's a fossil fuel, then I think you should be pretty cross about that. Because your burning fire
00:42:12.260
is killing your kids. Indoor air pollution kills 4 million people a year. It produces more carbon
00:42:19.940
dioxide than burning gas. It steals the wood from beetles and other creatures who want to eat it.
00:42:25.700
Whereas gas doesn't do any of those things. And yet, so that's the reality of our obsession
00:42:31.780
with trying to stop using fossil fuels is that we are doing genuine harm today. And you have to put
00:42:37.540
that in the balance against the potential future harms of runaway warming. And what are the policies?
00:42:43.620
Sorry, Francis, go for it on this. What are the policy implications of what you're saying, Matt?
00:42:47.940
If you were the chief scientific advisor or chief advisor to blah, blah, blah, would you go full
00:42:54.500
Trump drill, baby drill, no net zero, scrap all of that stuff? Should we be trying to reduce carbon
00:43:01.220
emissions at all? I think it's relatively simple. The advice I'd give is
00:43:07.060
don't set a deadline. I mean, 2050, net zero, UK, only country doing it. We only produce 0.87% of
00:43:14.020
the world's emissions anyway. It won't make a damn bit of difference whether we hit that or not.
00:43:19.140
And the technology to do that, as I say, is not here. And it might come along in 2051. And then
00:43:24.900
you'd look a fool, wouldn't you? You'd spend a fortune trying to get rid of emissions and you
00:43:28.100
could have done it for free. So I think that's a crazy way of going about it. I think what the UK
00:43:33.300
has done is unbelievably foolish. And we should tear up net zero, get rid of the Climate Change
00:43:37.700
Committee, and instead fund research into energy technologies that might be able to solve the
00:43:49.380
problem in the future. Because if you could get fusion going economically five years earlier than
00:43:58.340
it would otherwise by a bit more funding, or if you could get small nuclear reactors cheaper five years
00:44:05.060
sooner, that would make far more difference than heat pumps and electric vehicles and all these kind
00:44:11.780
of things. So, you know, for me, it's about researching the problem to find solutions rather
00:44:21.220
than enacting deadlines today.
00:45:54.500
One of the things when we talk about climate change is that people tend to fold it into pollution
00:46:00.500
and damage to the wildlife and the environment, which I know you're very passionate about. I am,
00:46:05.940
so is Constantine. How do we disentangle these things? Because it's very important,
00:46:12.660
because I think we can all agree that we need to do more to preserve our wildlife and our ecology,
00:46:19.060
not only for ourselves, for our children and our grandchildren. So how do we
00:46:23.300
essentially disentangle us and make sense of this?
00:46:26.340
Well, I was in a meeting the other day where a presentation was made in which
00:46:31.540
the importance of preventing nature decline and nature depletion was talked about,
00:46:36.500
comma, as a result of the climate crisis. And I stuck up my hand and said, look,
00:46:44.420
wherever I go in the world, and I'm a keen bird watcher and love nature,
00:46:50.980
the threat to wildlife that I see is nearly always not climate. It's nearly always invasive
00:46:57.540
species or habitat destruction. So I was recently in India and I said, what's your biggest problem in
00:47:02.420
this lovely habitat on the edge of the Thar Desert? And they said, well, we've got this new invasive
00:47:06.980
tree that's coming in. Well, a couple of years ago, I was in Africa. I said, what's your biggest
00:47:10.500
problem? Oh, there's an invasive ant, which is displacing the... And then when I think about my own
00:47:15.700
farm in Northumberland, the water vole has gone extinct because of invasive mink. The
00:47:20.340
red squirrel is going extinct because of invasive grey squirrels, etc. So,
00:47:23.700
and if you look at extinctions of birds and mammals, invasive alien species that compete with
00:47:33.380
them, particularly on islands, are the biggest threat. Now, that topic gets neglected. But you're
00:47:39.060
right. People think of it as part of the same... They say, oh, I didn't mean just the climate crisis.
00:47:43.780
I meant man's impact generally. So they're sort of... They're already there, if you know what I
00:47:49.460
mean. They're just using that as a shorthand for human impacts, which is fine. But I do think it
00:47:57.780
distorts our priorities. Because if you... Climate change can be a very convenient excuse for
00:48:04.740
politicians. Yes, we can't do anything about nature decline because it's climate change. Yes,
00:48:09.300
we can't stop fires destroying Malibu because it's climate change. Politicians love talking about
00:48:15.780
climate change because then they don't have to take any responsibility for not doing preventive
00:48:20.900
burns in California, not designing better flood mitigation schemes in Britain or whatever it might
00:48:28.260
be. Not doing anything about the grey squirrel. And it becomes this... It becomes like a doomsday cult
00:48:34.500
where you can't do anything or you can't say anything against it because
00:48:39.300
there is going to be this day where the fires are going to engulf us all. Well, I think
00:48:43.700
there is a sort of love of doom in the human spirit. And if you want to get attention on the
00:48:54.020
media, which you guys do, then it really pays to do the, you know, sackcloth and ashes,
00:49:01.860
prophet on a hill thing about how we're all doomed, we're doomed. You know, people have said to me,
00:49:09.220
you keep writing these books saying that people are wrong to be so pessimistic about things.
00:49:14.180
You realize you could make more money if you said the opposite.
00:49:19.460
Does that, I guess, explain the phenomenon of Greta Thunberg?
00:49:24.580
Well, she's moved on, mate. We're getting left behind. So she was the head of the climate thing.
00:49:33.140
She's on Palestine. She's on Palestine, mate. So your references are behind the times.
00:49:38.420
But I mean, I'm sorry, that was peak strangeness. I was going to say a ruder word, but...
00:49:46.180
It was weird. Let's just call it what it was. It was deeply weird.
00:49:49.220
That photograph of Layla Moran and Michael Gove and Ed Miliband standing, looking at a 14 or 15
00:49:58.100
year old girl, as if she was Joan of Arc, hanging on their every word when she came to parliament here.
00:50:04.420
You know, I mean, I'm sorry, she knew relatively little about the topic.
00:50:10.500
I love how diplomatic Matt is. A 14 year old knew relatively little about one of the most
00:50:18.820
complicated scientific subjects in human history. Yeah. She didn't quite understand the climate models
00:50:24.980
at 14. It, of course, showed that, you know, there is an element of religious dogma in this topic.
00:50:36.900
And it's got all the features of a religious cult. Now, that doesn't mean there isn't any science
00:50:43.060
there either. Of course there is. But, but, you know, if you're banning Nigel Lawson from the BBC,
00:50:52.500
which happened to him before he died, but you're allowing Greta Thunberg, something's wrong with your
00:50:59.300
priorities about truth. Yes, indeed. We had Nigel on the show during the pandemic,
00:51:04.660
right before his death, actually. A very interesting conversation. Oh, we didn't talk
00:51:08.260
about climate. Well, Matt, one of the things you talk about in your latest book, which is there,
00:51:14.500
is, you know, one of the things that we wanted to explore is the kind of maverick,
00:51:19.620
the maverick theory of science, like the great man theory of science, if you like, which is,
00:51:25.940
you know, the idea that quite a lot of progress does get made by, as you call them, brave,
00:51:30.500
stroke, mad people. Is that an exaggeration? Is that a story we like to tell our kids so they
00:51:37.780
think anything is possible, but actually most science is sort of a bunch of people working
00:51:42.340
together on a thing and eventually coming to a conclusion? Or is quite a lot of it genuinely,
00:51:47.300
you know, the crazy guy coming up with a theory, everyone thinks he's an idiot for 20 years,
00:51:51.220
and then we go, oh, actually, he gave himself an ulcer. He's right. Where's the, where's the balance
00:51:56.740
between those two? It's a very good question, and one that I wrestle with, and I don't fully know
00:52:00.980
the answer to. On the one hand, yes, of course, an awful lot of science is incremental
00:52:08.020
improvements to knowledge by perfectly sensible, normal people working in the orthodox mainstream.
00:52:13.620
You know, one can't deny that. And I'm not a huge fan of the great man theory of history. I do
00:52:18.900
think there's an inevitability about certain things happening in history. But you cannot deny
00:52:26.900
that an awful lot of scientific theories start out as heresies and are advanced by mavericks.
00:52:34.900
And those mavericks have a tough time being heard. But that doesn't mean
00:52:41.140
that every maverick is a genius. You know, not everyone is Galileo, not everyone is Charles Darwin.
00:52:47.460
But it, I mean, in the book I've written, Birds, Sex and Beauty, it's, it's about the, the debate on
00:52:52.660
sexual selection, which is a, an idea that Darwin had, that the choice of your mate can drive evolution
00:53:03.060
in interesting ways that survival of the fittest can't. Seduction of the hottest versus survival of
00:53:08.740
the fittest is the way I put it. And it's a much more creative and different force. And it often
00:53:13.140
doesn't help you survive, but it does help you get a mate and therefore have attractive grandchildren,
00:53:17.460
which can attract others, et cetera, et cetera. So it can, it can result in peacock's tails and
00:53:21.940
flamboyant plumage and all that. Now, Darwin had that idea quite early. He pushed it quite hard.
00:53:27.780
All his friends said he was wrong. Wallace completely disagreed with him. Alfred Russel
00:53:33.220
Wallace is his rival co-discoverer of natural selection. Even his chums like Thomas Henry Huxley
00:53:39.700
and Herbert Spencer thought he was talking through his hat. But we now know he was basically right,
00:53:45.300
that mate choice is an important force in evolution and not to be underestimated.
00:53:50.180
And so I just, I was very curious about the idea that we think of Darwin as triumphantly successful,
00:53:55.780
but actually one of his favorite ideas that he'd devoted a lot of energy to, he lost the argument
00:54:00.180
in his lifetime. And partly because in Victorian England, if you say female lust may be behind
00:54:09.060
male coloration, it doesn't work for crusty old Victorians who are sitting around discussing these
00:54:17.620
things. They don't understand about female lust. I mean, they certainly don't want to
00:54:22.100
talk about it. So there's a cultural background there that's quite interesting. But partly this was an
00:54:29.380
excuse to go birdwatching for a year and write a book about it. Partly I'm fascinated by the colors and
00:54:34.500
flamboyant plumage of birds. Francis, you're a bird watcher, you know what I mean.
00:54:43.220
And it's genuinely difficult to explain why should birds be so conspicuous? Why do they
00:54:48.020
dance so much? Why do they sing so much? And maybe it has a story to tell about human beings too,
00:54:55.940
because we are selective about our mates both ways. Males are very selective in which females they
00:55:01.540
mate with and vice versa. And a chap called Geoffrey Miller has written a book called The Mating Mind
00:55:10.740
Big fans of his. Yeah, he's a great guy. And it's a really good book. And in it, he argues that
00:55:19.140
we think the human brain exploded in size over the last couple of million years
00:55:24.100
because of natural selection. It helped us survive on the savannah. Well, why? I mean,
00:55:30.980
A, why did it do that? And B, why didn't baboons or gazelles need bigger brains too? They were also on
00:55:37.460
the savannah. So maybe it wasn't about surviving in practical terms. Maybe it was about social
00:55:43.700
survival. Maybe it was about working out what's going on in other people's heads was an important
00:55:48.980
part of our life. And so there was a sort of social arms race between us to have big brains
00:55:52.740
to figure out what each other was thinking. That's a perfectly good hypothesis. People discussed that
00:55:58.100
ad nauseam. Geoffrey says, well, maybe actually it was about seducing each other. Maybe once we got
00:56:02.980
into the habit of saying the guy who sings well or makes good jokes is the one I want to mate with,
00:56:09.780
that could drive an explosion of the human brain that could suddenly take off in the way that it
00:56:14.100
did take off. It's such an interesting idea, I think. But nobody takes it very seriously. And nobody
00:56:21.940
has thought through the implications of it in terms of how we rethink social sciences, economics,
00:56:27.860
sociology, all these things, if it is what we devote a lot of our mental energy to.
00:56:34.660
Um, uh, you know, but having a good sense of humour is an incredibly important part of seduction, and
00:56:41.860
you're comedians, aren't you? So I shouldn't really tell you that, but
00:56:44.340
Well, he's single mate, so it hasn't worked for him. By the way, I was just thinking, Francis,
00:56:50.740
that book title in your accent takes on a whole different meaning, doesn't it?
00:56:55.460
Birds, sex and beauty. Birds, sex and beauty. Yeah, exactly. Why is that a different take?
00:57:00.340
Well, birds is, I mean, isn't it cockney to talk about women as well?
00:57:04.820
Women, yes. Birds, sex and beauty. Yeah. So if you read, if you've written that book,
00:57:10.500
it would take on a whole different meaning. Yeah, more like a naturist approach. If I was,
00:57:15.060
you know, I'm a Northumbrian, if I spoke in Geordie, bord, sex and beauty, like.
00:57:19.300
Yeah. But, but it's an interesting point you make. And it's, there's, there was a term for it,
00:57:26.260
uh, on the internet many, many years ago, which is called peacocking.
00:57:30.180
Really? Yes, yes. And the term peacocking in the kind of dating slash manosphere culture of the time
00:57:36.260
was you displaying your peacock feathers in whatever, in like making jokes or
00:57:41.540
displaying a particular talent that women would then become interested in. Right.
00:57:45.700
So it's interesting that you, you say that when it comes to birds. Right.
00:57:50.580
And by the way, it's quite an important point that I make in the book,
00:57:53.860
because we've done experiments to show that in some birds, it goes both ways. You know,
00:57:56.820
that males have to be brightly coloured to attract females, and females have to be
00:58:00.260
brightly coloured to attract males. So, so this isn't necessarily about saying it was the male brain
00:58:06.260
that did all this. It could have been just as much the other way. I think it's quite important
00:58:10.180
to say that. Well, bird feminists will be very happy you've made that point.
00:58:13.300
The avian feminist movement will send you a letter of thanks for that meaningful contribution.
00:58:22.180
Quick question, though. So if, why is it that in some species, it's a female that's colourful?
00:58:28.260
Why is it in some species, it's the male that's colourful? Or is it arbitrary?
00:58:32.100
No. Well, the answer to that was brilliantly spotted, and we should have spotted it years
00:58:36.500
earlier by a guy called Robert Trivers at Harvard in the 1970s.
00:58:39.460
He said, whichever sex invests most time and energy in bringing up the offspring
00:58:48.500
ends up being dully coloured and competed for by the other one that gets brightly coloured,
00:58:53.700
because he's got the spare time, as it were, to show off. And so the exceptions prove the rule again,
00:58:59.540
which is that there's a bird I used to study, and I mentioned it in the book, called the phalarope,
00:59:05.940
in which it's the females that are brightly coloured and do a lot of the displaying,
00:59:10.100
and they do a lot of the harassing of the males. And the males sit on the eggs. And it's because the
00:59:16.260
males sit on the eggs that they're the scarce resource that the other sex is going to fight for.
00:59:20.420
Do you see what I mean? Grabbing a mate who's going to look after your eggs for you for two weeks
00:59:26.580
is so valuable that it makes sense for you to invest time and energy in being seductive.
00:59:32.580
Because there was something that I used to think when I used to see these brightly coloured
00:59:36.660
tropical birds in Venezuela, where I was like, I understand there's a beauty aspect to it,
00:59:41.940
and then you're demonstrating to your potential mate how healthy you are, how strong you are.
00:59:48.260
But doesn't that then make you more vulnerable to predators?
00:59:51.700
Absolutely. And this was one of the points that got Darwin interested in this, because he's saying,
00:59:59.940
how does this help survival? I mean, it's supposed to be survival of the fittest. And the answer is,
01:00:04.500
yeah, no. If you can win the jackpot in the next generation by fathering beautiful sons,
01:00:12.580
who all get lots of mates, then it doesn't matter if your life is short because you get hit by a hawk,
01:00:21.460
as long as you've managed to mate first. Do you see what I mean? So the price you pay
01:00:30.020
for winning the seduction lottery is that you lose the survival lottery. And obviously there's a balance.
01:00:36.740
But it makes perfect sense within the selfish gene explanation of these things, doesn't it? Because
01:00:41.780
it's not about the unit, Matt Ridley or Konstantin Kisin, it's about me passing on my genes. So if I
01:00:47.780
can live to 50 and have 10 kids, that's a hell of a lot better than living to 100 and having none,
01:00:56.500
Matt, you should come on many, many more times.
01:00:58.500
You put it beautifully. See, this is why Matt is a regular guest on the show. He's just,
01:01:05.460
he's so spot on about everything, isn't he? Yeah.
01:01:08.020
He just knows how to just, you know, say it exactly how it is.
01:01:11.220
And he does it at the end of the interview. So we remember it. So then we invite him back next time.
01:01:15.380
Matt, always a pleasure to have you on. We're going to take some questions from
01:01:18.660
our audience in a second on our sub stack. Before we do, we always end with the same question,
01:01:23.140
as you well know, what's the one thing we're not talking about that we should be?
01:01:26.340
Is it Anthony Fauci getting that pardon? Pre-pardon? Wasn't it a pre-pardon? It was a preemptive pardon.
01:01:34.100
Yeah, it went back to 2014. There's something on the tip of my tongue and the back of my mind. I think
01:01:48.180
we're not talking about that asteroid that's aiming at us in 2032 or something.
01:01:57.460
It's only a 1% chance it's going to hit us, or 2%, I can't remember, but you know.
01:02:11.300
Well, we're going to have to work out how to deflect one. It's not an Earth killer,
01:02:18.660
but it could be a city killer, this one. Now, I gather they've downgraded its chances. It was 2%,
01:02:24.340
it's now 1% or something, you know, so we probably don't need to worry about that.
01:02:29.540
No, no. I mean, it's a long way off yet. I mean, you know, whatever.
01:02:35.860
That was a very non-scientific way of putting it.
01:02:44.820
No, sorry. It's a better one, actually. I don't know if you can edit that out.
01:02:50.740
Yeah, exactly. There's a guy called Avi Loeb, who's the professor of astrophysics at Harvard University,
01:02:59.220
and he is convinced that an object came in from another solar system, vertically sort of through
01:03:13.620
the plane of our solar system and passed between us and the sun in 2017. It was called something
01:03:19.380
like ʻOumuamua. It's got a Hawaiian name. And this is true. I mean, that's a fact that it did that.
01:03:26.820
But he's convinced it might have been an artificial object, not a natural object.
01:03:32.580
Because it was flashing on and off. It was bright, then dark, bright, then dark, bright, then dark.
01:03:37.700
Probably that means it was long and thin. And so it was spinning. And when it was pointing towards us,
01:03:41.060
it was dark. And when it was sideways on, it was brighter. It then did an acceleration that is
01:03:47.060
quite hard to explain. Not impossible to explain, but quite hard to explain.
01:03:50.980
Now, most of the astrophysics community says Avi's talking through his hat,
01:03:54.580
and he doesn't know what he's talking about. And it's far more likely to be natural.
01:03:59.380
Well, why? He's saying we don't know the Bayesian prior logic here, whether it's more likely to be
01:04:05.940
natural or artificial. If it's come from another solar system, an interstellar trajectory,
01:04:14.020
the chance, you know, it's possible that it was sent to our solar system.
01:04:22.820
And it's possible it had a solar wind collector to speed it up. So as it gets past the sun,
01:04:27.540
it does speed up. You know, you can imagine a technology that would do that. A solar sail,
01:04:31.940
I think is the word. And he wrote a wonderful book about this. And all his colleagues think
01:04:37.780
he's a complete nutcase. And I like nutcases.
01:04:41.220
Mm. Clearly. Well, thanks for being here with two other nutcases. We'll take a look at it. But,
01:04:46.900
I mean, this object went past us, but didn't make any contact and
01:04:57.380
probably took a look at what's happening on Earth and went off and just carried on.
01:04:57.380
Well, Matt, it's great to have you on the show. Head on over to Substack where we ask
01:05:02.180
How much of this politicization of the sciences can be attributed to people
01:05:07.860
being just being unwilling to accept that life isn't fair?