The Jordan B. Peterson Podcast


315. The World is Not Ending | Bjørn Lomborg


Summary

With decades of experience helping patients, Dr. Jordan B. Peterson offers a unique understanding of why you might be feeling this way. In his new series on Daily Wire Plus, Dr. Peterson provides a roadmap toward healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward. Today's guest is Dr. Bjørn Lomborg, who was named one of Time Magazine's 100 Most Influential People in the World. He is a visiting fellow at Stanford University's Hoover Institution and a frequent commentator in print and broadcast media for outlets including the New York Times, Wall Street Journal, The Guardian, CNN, Fox, and the BBC. His monthly column is published in many languages by dozens of influential newspapers across all continents. He is also a best-selling author whose books include False Alarm: How Climate Change Panic Costs Us Trillions, Hurts the Poor, and Fails to Fix the Planet; The Skeptical Environmentalist; Cool It; How to Spend $75 Billion to Make the World a Better Place; The Nobel Laureates' Guide to the Smartest Targets for the World; and Prioritizing Development: A Cost-Benefit Analysis of the UN Sustainable Development Goals. In this episode, Dr. Peterson and Dr. Lomborg discuss the role of human beings on the planet in relation to the environment: the implicit religious metaphor underlying what young people are told about climate change, what the data actually show about deaths from climate-related disasters, reforestation, and global poverty, why apocalyptic narratives dominate the competition for attention, and why humanity is well placed to master its environmental challenges without panicking.


Transcript

00:00:00.960 Hey everyone, real quick before you skip, I want to talk to you about something serious and important.
00:00:06.480 Dr. Jordan Peterson has created a new series that could be a lifeline for those battling depression and anxiety.
00:00:12.740 We know how isolating and overwhelming these conditions can be, and we wanted to take a moment to reach out to those listening who may be struggling.
00:00:20.100 With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way in his new series.
00:00:27.420 He provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward.
00:00:35.360 If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better.
00:00:41.780 Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety.
00:00:47.460 Let this be the first step towards the brighter future you deserve.
00:00:57.420 Hello everyone watching and listening on YouTube or associated podcasts.
00:01:13.520 I have the great privilege today of speaking once again to Dr. Bjørn Lomborg.
00:01:17.640 We've talked several times on my podcast before, but it's always good to talk to him.
00:01:21.640 Dr. Lomborg researches the smartest ways to do good with his think tank, the Copenhagen Consensus.
00:01:30.800 He's worked with hundreds of the world's top economists and seven Nobel laureates to find and promote the most effective solutions to the world's greatest challenges,
00:01:39.560 from disease and hunger to climate and education.
00:01:43.260 For his work, Lomborg was named one of Time Magazine's 100 Most Influential People in the World.
00:01:48.280 He's a visiting fellow at Stanford University's Hoover Institution and a frequent commentator in print and broadcast media
00:01:55.600 for outlets including the New York Times, Wall Street Journal, The Guardian, CNN, Fox, and the BBC.
00:02:02.020 His monthly column is published in many languages by dozens of influential newspapers across all continents.
00:02:08.540 He's also a best-selling author whose books include
00:02:11.100 False Alarm: How Climate Change Panic Costs Us Trillions, Hurts the Poor, and Fails to Fix the Planet,
00:02:17.200 The Skeptical Environmentalist, Cool It, How to Spend $75 Billion to Make the World a Better Place,
00:02:24.180 The Nobel Laureates' Guide to the Smartest Targets for the World,
00:02:27.680 and Prioritizing Development, a Cost-Benefit Analysis of the UN Sustainable Development Goals.
00:02:35.420 All right. Hello, Mr. Lomborg.
00:02:37.440 Very nice to have the opportunity to speak with you again.
00:02:40.480 I thought today we would start our discussion by talking about what young people are being told.
00:02:49.000 And I want to lay out a few ideas for you, and we can delve into this, and we'll move on from there.
00:02:55.500 So I've just been reading Alex Epstein's book, Fossil Future.
00:02:59.660 And in that book, he details, first of all, his belief that in the foreseeable future,
00:03:06.960 not only will we have to use fossil fuels, but we should use them.
00:03:12.340 And he explains why, I would say, on ethical and practical grounds.
00:03:17.180 But he also says something that struck me as very interesting,
00:03:19.880 which is that the view that's being put forward to young people of the role of human beings on the planet
00:03:29.120 in relationship to the environment is essentially predicated on an implicit religious metaphor.
00:03:35.700 And I want to lay out the metaphor.
00:03:38.060 And I want to lay out why I think his claim that it's a religious metaphor is technically correct.
00:03:44.860 So the story is something like this.
00:03:47.980 The planet is fragile and virginal and continually pillaged.
00:03:55.020 The pillaging forces are the patriarchy, essentially, the social structure.
00:04:00.400 It's a masculine metaphor.
00:04:02.880 The social structure is viewed as a force that's nothing but devouring and negative.
00:04:12.320 And so you have nature, you have culture.
00:04:15.000 Nature's all positive, culture's all negative.
00:04:17.980 Then you have the individual, also part of the story.
00:04:21.780 And the individual is basically characterized as some combination of predator and parasite.
00:04:30.540 And so the reason that's a religious story, as far as I can tell,
00:04:35.560 is this is complicated, but I'd like to be able to lay it out.
00:04:38.780 When I wrote my book in 1999 called Maps of Meaning, it struck me that the basic cognitive and perceptual categories
00:04:48.580 were something like chaos, order, and the process that mediates between them.
00:04:55.600 I looked at a lot of mythological work, a lot of religious writing across multiple cultures and tried to look at the correspondence between that and certain neuropsychological models that were being built,
00:05:09.060 including models of hemispheric processing.
00:05:11.200 So our hemispheres are set up in some real sense so that the right hemisphere processes novelty and chaos and possibility,
00:05:19.600 and the left hemisphere imposes order.
00:05:21.940 And the fact that the hemispheres have this structure indicates, because they're adapted to the natural world, let's say,
00:05:29.500 indicates that the most fundamental way of perceiving the world is something like a place of possibility and chaos and potential, on the one hand,
00:05:38.540 and a place of habitable order and culture and predictability on the other.
00:05:43.920 So you have those two domains, and then consciousness looks like it's the process that mediates between the two.
00:05:51.720 And Epstein, now I learned in 1999 that these domains, chaos, order, and the process,
00:06:01.240 were always represented metaphorically or symbolically.
00:06:04.560 It's like an a priori axiom of cognitive function and perception itself.
00:06:10.380 The chaotic domain, potential and so forth, tends to be represented with female symbols, feminine symbols,
00:06:17.840 and the orderly domain tends to be represented with masculine symbols.
00:06:23.080 And so you can see how this plays out in the modern world, because you have Mother Nature,
00:06:28.000 who's virginal and fragile, being raped by the catastrophic patriarchy.
00:06:33.320 And you can see those metaphors lurking underneath, right?
00:06:37.240 There's the positive female, the negative male on the cultural front,
00:06:41.500 and then you have to lay the individual on top of that.
00:06:44.340 And the individual in that story, positive feminine, negative masculine, is also represented negatively.
00:06:51.500 Now, that's a very compelling story, because it does cover all the domains of existence.
00:06:57.440 And there is a beautiful and plentiful and positive element of untrammeled nature, let's say.
00:07:05.400 And there is a tyrannical and predatory aspect of culture.
00:07:09.240 And the individual can be a destructive, parasitical, and predatory force.
00:07:13.580 But that's only half the story.
00:07:15.500 And that's the problem.
00:07:16.620 And so the point I'm trying to make is that we can't structure our perceptions
00:07:24.700 without using something like an a priori category system.
00:07:30.960 And whatever your a priori category system is, that's your religion,
00:07:36.480 it functions in exactly the same way.
00:07:38.580 And we have a religion now that's focused on nature worship,
00:07:42.120 the derogation of culture, and the damnation of the individual.
00:07:46.040 And that's the story that's being told to young people, right?
00:07:50.280 The planet's fragile, culture is nothing but a destroying force,
00:07:54.360 and individual effort is to be construed as predatory, say, in the patriarchal sense,
00:07:59.940 and parasitic in relationship to the natural world.
00:08:05.100 So I'm wondering what you think about that.
00:08:09.600 Yeah, no, I think it's a great metaphor.
00:08:14.080 So again, if we go along with this, and if we all have religion,
00:08:18.380 I would tend to say that my religion is data.
00:08:21.780 You know, there's a famous statistician who said,
00:08:24.360 without data, you're just another guy with an opinion, right?
00:08:28.380 We have a lot of knowledge about the world.
00:08:30.820 And the reality is that much of this is built on, you know, stories and metaphors
00:08:36.000 and things that we've heard.
00:08:37.140 And it's probably not very conducive to understanding what the world is actually like.
00:08:42.820 And I totally agree with you that everybody, not really just young people,
00:08:46.780 but especially perhaps young people, are told this is the end of times.
00:08:51.200 You know, this idea of should you really have children?
00:08:53.360 Should you really put them into this world, this terrible world?
00:08:55.700 Well, the world is going to end in, you know, whatever the number is right now,
00:08:59.200 but, you know, eight years or 12 years or whatever.
00:09:01.800 The feeling is that this is sort of end of times.
00:09:05.720 And that's very much, as you point out, a sense of we have this beautiful world
00:09:10.860 that we somehow, this natural world that we've somehow despoiled
00:09:15.400 and made terrible in so many different ways.
00:09:18.720 And I would argue that certainly if you look back in time,
00:09:22.660 this very clearly is a very modern way of thinking about the world.
00:09:27.860 You know, two, three hundred years ago, we were terrified of nature
00:09:31.380 because we really worried about, you know, the wolves out there.
00:09:34.940 We were terrified about nature in the sense that it would kill us in all kinds of ways.
00:09:40.340 Just, you know, think about one of the things I'll talk about a little later.
00:09:43.820 Smallpox, a disease that we eradicated in 1978.
00:09:49.120 But even in the 20th century, it killed about 300 million people.
00:09:53.360 So, you know, it killed a couple of, you know, somewhere up to 5 million people every year.
00:09:58.100 This is a terrible disease.
00:09:59.620 It was not the only disease that you were struck with.
00:10:02.980 This killed, you know, royalty and everybody else.
00:10:06.140 Nature used to be terrible.
00:10:08.320 What has happened is that we have actually found a way to live such that
00:10:13.960 we can now say we like nature, we love nature,
00:10:17.560 we want to set aside lots of nature.
00:10:20.060 Remember, you know, most of European nations, for instance,
00:10:22.880 cut down most of their forests to build navies, to, you know, fight each other.
00:10:27.300 But the fundamental point is you get reforestation when you're rich,
00:10:32.720 when you're well off, when you can actually deal with the issues.
00:10:36.100 And so, again, that tells you something that I think is incredibly important
00:10:40.160 if we're actually going to have a good conversation.
00:10:42.560 It is that you need to understand, overall, things are moving in the right direction
00:10:48.780 and that we're much better off, you know, with just one statistic.
00:10:52.880 If you look at the number of people that die from climate-related disasters.
00:10:56.980 So these are the disasters that we hear about all the time.
00:11:00.480 You know, floods, droughts, storms, wildfires, extreme temperatures.
00:11:04.440 Those kinds of things, we have pretty good data for that.
00:11:06.760 We certainly have good data for the last hundred years.
00:11:09.260 How many people die every year?
00:11:11.120 Well, it turns out that in the 1920s,
00:11:13.960 about half a million people died each year from those disasters.
00:11:17.660 That's a terrible outcome of the world.
00:11:20.280 How many people die today?
00:11:21.760 If you ask most young people, if you ask most people in the world,
00:11:24.460 they'll probably think that number has gone up and up and it's just worse and worse.
00:11:29.260 Nothing could be further from the truth.
00:11:31.820 Last year, so the last full year that we have, 2021,
00:11:36.040 fewer than 7,000 people died.
00:11:38.780 We've seen a decline of more than 99%.
00:11:41.720 Why?
00:11:42.740 It's got nothing to do with climate.
00:11:44.700 It's not because climate has gotten better or indeed really worse.
00:11:47.940 We can't really tell in most of these impacts.
00:11:50.500 There's a few of them we can, but mostly we can't.
00:11:52.900 What has changed is our ability to handle it.
00:11:57.420 That's why we don't die from smallpox.
00:11:59.640 That's why we can afford to actually make sure we have forest.
00:12:02.940 That's why most rich nations are reforesting.
00:12:05.760 And that's why fewer and fewer people are dying from these disasters.
00:12:09.740 So I think we need to tell that alternative story, if you will,
00:12:13.880 that yes, you hear all these terrible things.
00:12:17.340 And it doesn't mean that there are no problems.
00:12:19.360 There's still lots of people that are terribly troubled from floods.
00:12:25.100 There's still lots of people that are terribly troubled by droughts and all these other things.
00:12:30.020 There's still a lot of infectious disease from both tuberculosis and malaria.
00:12:34.380 The world is not fine, but the world is much better.
00:12:37.820 And that's important because that puts us in a very different frame of mind.
00:12:41.720 It means we are not looking at the world being despoiled.
00:12:46.200 And hence, we need to make some sacrificial offerings to please this deity that we're worried about.
00:12:53.820 It's instead to say, look, we're actually dealing with this in the right direction.
00:12:58.340 We're actually making things better, but we can do even more.
00:13:02.160 That's a very different message.
00:13:03.740 And of course, one that's much more optimistic.
00:13:06.820 And I would hope.
00:13:07.800 Well, it's also more balanced.
00:13:09.920 Okay, so you started talking about the relationship, you know, your concentration on data.
00:13:15.420 And so I wanted to delve into that a little bit.
00:13:18.560 So there's data, of course, and data would be something like a representation of the patterns in the world, not merely the subjective patterns, not merely the psychological patterns, but the patterns in the, well, let's say the objective world, the patterns that exist in somehow that transcend mere subjectivity.
00:13:40.420 And so those are patterns that we're going to test our presumptions against.
00:13:43.280 But part of the reason I wanted to delve into the underlying metaphorical substructure is because a lot of your work and the work of the more non-naive optimists that I've encountered in the last 10 years has this counter-narrative element that structures it.
00:14:00.320 I mean, because your a priori axioms, they're the reverse of the environmentalist axioms in some sense.
00:14:06.520 And in this way is what I mean, is that you started your description by pointing out that we shouldn't be lulled into thinking that nature is only benevolent.
00:14:17.920 It's only been a very short period of time, historically speaking, that any of us at all anywhere on the planet had the luxury of ever assuming that nature was a benevolent force for more than a few seconds.
00:14:30.260 Right? And so, because nature is conspiring in all of its benevolence to destroy us as rapidly as it possibly can all the time as well, with cold and heat and floods and disease and acts of God and volcanoes and earthquakes, etc., etc.
00:14:46.780 And so, you have to be extremely naive if you don't also see nature as a threatening force.
00:14:53.060 And so, now, you shouldn't see it as only a threatening force because we're also dependent on it.
00:14:59.900 All right? So, you flesh out the malevolent nature side of the story.
00:15:06.100 But then you also say, well, look, everyone who's listening, don't be so pessimistic about our culture, our Western culture, let's say, but the global culture even more broadly.
00:15:17.880 Because in many ways, we've been moving in the right direction.
00:15:21.540 Things are a hell of a lot better by almost every metric you can imagine than they were 100 years ago.
00:15:26.500 They're better by most metrics than they were 50 or 20 years ago.
00:15:30.240 And not just a little bit better, a lot better.
00:15:33.140 So, then you can flesh out the positive side of the culture.
00:15:39.460 And then on the individual side, you can say, well, you know, there are people who are predatory and there are people who are parasitical.
00:15:45.320 And everyone is subject to temptation and failure to hit the mark, let's say.
00:15:52.980 But by and large, people are striving in the right direction.
00:15:56.380 And you can view human beings as a positive force, even though there's some ambivalence about that.
00:16:01.060 And so, that fleshes out the story.
00:16:03.600 You know, you can also think about it as Rousseau versus Hobbes.
00:16:07.440 And strangely enough, you come down more on the side of the Hobbesians, even though I don't think that's your temperamental proclivity.
00:16:13.700 Because for Rousseau, right, the nature was all positive.
00:16:18.980 We were turned into negative creatures because we were perverted by our socialization.
00:16:26.500 And human beings, well, for Rousseau, human beings were innately good, assuming that they weren't warped and twisted by culture.
00:16:34.860 But Hobbes had the alternative viewpoint.
00:16:37.180 Hobbes said, well, the state of nature is chaos and war.
00:16:40.400 And we need a strong socializing force in order to integrate and organize us so that peace can obtain.
00:16:48.180 And I've thought for a long time that a comprehensive worldview melds Rousseau and Hobbes.
00:16:55.960 It's the same comprehensive religious idea in some sense, as you need a representation of nature that's positive and negative.
00:17:04.600 You need a representation of culture that's positive and negative and a representation of the individual that's positive and negative.
00:17:12.620 And we've offered a crippled religious view to young people.
00:17:18.260 It's also got this apocalyptic end, right, this apocalyptic undertone, which is not only is nature virginal and fragile and culture rapacious and predatory and the individual corrupt,
00:17:30.340 but this is a bloody emergency and the apocalypse is upon us, like if it isn't tomorrow, it's 10 years from now.
00:17:36.180 All of that's religious force, I would say, operating at the metaphorical and mythological level.
00:17:41.800 And a lot of what you've been doing, and I want to go, I want to get at the foundations of this,
00:17:47.440 a lot of what you've been doing is saying, well, look, let's just hold on on the apocalyptic vision side.
00:17:52.600 It isn't obvious that the bloody catastrophe is upon us now in any manner that would make incautious emergency action anything other than destructive.
00:18:05.860 There's no reason to assume that as a social force, we're only predatory and parasitical,
00:18:11.280 and we could give ourselves some credit for striving in the right direction and also being able to master this.
00:18:18.000 Because one of the things that I really liked about your work and about many of the people who are working in the optimistic front,
00:18:24.300 this is mostly economists do this, is the idea that, well, we don't have an apocalyptic challenge on our hands,
00:18:31.620 but we have some challenges, but we're the sort of creatures that can actually master those challenges
00:18:36.160 if we don't panic and do something too stupid.
00:18:40.260 Going online without ExpressVPN is like not paying attention to the safety demonstration on a flight.
00:18:45.280 Most of the time, you'll probably be fine, but what if one day that weird yellow mask drops down from overhead
00:18:51.240 and you have no idea what to do?
00:18:53.520 In our hyper-connected world, your digital privacy isn't just a luxury.
00:18:57.340 It's a fundamental right.
00:18:58.660 Every time you connect to an unsecured network in a cafe, hotel, or airport,
00:19:02.920 you're essentially broadcasting your personal information to anyone with the technical know-how to intercept it.
00:19:07.960 And let's be clear, it doesn't take a genius hacker to do this.
00:19:10.840 With some off-the-shelf hardware, even a tech-savvy teenager could potentially access your passwords,
00:19:16.300 bank logins, and credit card details.
00:19:18.600 Now, you might think, what's the big deal? Who'd want my data anyway?
00:19:22.220 Well, on the dark web, your personal information could fetch up to $1,000.
00:19:26.820 That's right, there's a whole underground economy built on stolen identities.
00:19:30.880 Enter ExpressVPN.
00:19:32.640 It's like a digital fortress, creating an encrypted tunnel between your device and the internet.
00:19:36.900 Their encryption is so robust that it would take a hacker with a supercomputer over a billion years to crack it.
00:19:42.980 But don't let its power fool you.
00:19:44.800 ExpressVPN is incredibly user-friendly.
00:19:47.140 With just one click, you're protected across all your devices.
00:19:50.160 Phones, laptops, tablets, you name it.
00:19:52.340 That's why I use ExpressVPN whenever I'm traveling or working from a coffee shop.
00:19:56.480 It gives me peace of mind knowing that my research, communications, and personal data are shielded from prying eyes.
00:20:02.200 Secure your online data today by visiting expressvpn.com slash jordan.
00:20:06.900 That's E-X-P-R-E-S-S-V-P-N dot com slash jordan, and you can get an extra three months free.
00:20:13.480 ExpressVPN dot com slash jordan.
00:20:18.180 Yeah, and look, that's exactly the right point.
00:20:21.500 I actually think I love your Hobbes and Rousseau comparison.
00:20:25.320 In some ways, you're probably right. I would argue, of course, the world originally was like Hobbes and not Rousseau.
00:20:33.840 But what we've actually managed with 300 or 400 years of hard work is that we have turned the world into something that's much closer to Rousseau.
00:20:44.220 I'm not sure I'm going to go down in history with this philosophy lesson.
00:20:48.480 But the fundamental point is that what we've achieved, by making the world safer for us, by actually making sure that people don't die from smallpox and that they don't die from all these other things, is
00:21:00.660 that we can actually produce a lot of the things that we need for our world in a much more sustainable way.
00:21:08.120 Remember, you know, if you look at the history of, for instance, fire over the last 10,000 years,
00:21:14.280 typically whenever, you know, humans came around, they just burned the whole thing because it was in their way.
00:21:21.340 You know, take the American Indians: we have lots of evidence to show that they just burned large tracts of land because it brings out the animals.
00:21:29.220 You know, it makes them defenseless and you can kill them and you eat them.
00:21:31.580 It makes a lot of sense.
00:21:33.120 But that's really destroying nature.
00:21:35.220 What we're doing now, of course, is to a very large extent that we grow very efficient food so that in rich countries, at least in countries that have sufficient resources to actually care about other things than just surviving, they set aside more and more nature.
00:21:51.000 That's why, you know, just in Denmark, where I am right now, you know, we used to have about a third of the country covered in forest.
00:21:59.280 Then we cut about all of it down.
00:22:01.560 So we're down to about 2%.
00:22:02.920 Now we're back up to 14%.
00:22:05.500 Why?
00:22:06.280 Because we're rich.
00:22:07.400 We actually like forests, and we plant them so that we have places to take our kids out.
00:22:13.140 And so, again, the point here is it's not just optimism.
00:22:17.060 It's actually realism to recognize that you're only going to fix the problem by looking at the data, finding out what are the challenges, fix many of these challenges, and realizing you can't fix everything or at least not everything at once.
00:22:31.640 So you fix the most important challenges that take the least resources to get the most impact.
00:22:38.280 So, you know, fundamentally, and again, this is what the economists love to say, the ones that have the biggest bang for the buck.
00:22:44.440 But in reality, it's much more about making sure that if you can only do some things, sometimes you do the smartest stuff first.
00:22:52.060 And that's what's brought us to here.
00:22:54.020 And that's why we should stop saying it's the end of the world but still recognize that there are plenty of troubles around.
00:23:01.680 And, again, also, just let's remember, you know, we're sitting in two developed countries where we're very well off, where we're not worried about, you know, neither smallpox because we've eradicated that.
00:23:12.580 But we're not worried about tuberculosis either.
00:23:14.640 We're not worried about not having enough food.
00:23:16.440 We're not worried about—
00:23:17.220 Malaria.
00:23:17.280 All these—yes, all these different things that most people in this world—so, you know, by far over 4 billion, so probably more like 6, 6.5 billion of the 8 billion we are on the planet—are worried about every day.
00:23:33.480 And that's why I'm also really frustrated with this way that we're very often so focused on saying, you know, for instance, on climate change, which is a real problem.
00:23:43.120 We're saying it's the only problem, and then we forget about all these other things where we could help much more, make sure that people are saved much better, and that they could also then eventually get to a point where they would want to, you know, preserve nature and think about other things and just simply making sure they survive the night.
00:24:03.480 Okay, so we talked about some of the reasons that this new quasi-religious view of the world and our place in it might have arisen.
00:24:12.220 Your point was that, well, because of technological progress, we've been able to begin to view nature as a much more benevolent force than we ever had the luxury to before.
00:24:23.360 But there's some other social pressures, let's say, that are pushing this narrative forward that I think are worth delving into.
00:24:32.620 You mentioned one when we just had a bit of a preliminary conversation is that there's a huge competition for people's attention online, and that competition has intensified dramatically in the last 20 years because there are so many voices clamoring for everyone's attention all the time.
00:24:51.020 And one of the advantages to an apocalyptic vision is that it is attention-grabbing.
00:24:58.380 And so any narrative that tilts towards the apocalyptic is likely to get magnified in online communication because things are good and slowly getting better isn't much of a headline.
00:25:13.820 Right, and there's nothing novel about it.
00:25:17.260 Okay, so that's another possible contributing factor.
00:25:20.760 Can I just give you one?
00:25:22.960 Yes, please do.
00:25:23.580 So Our World in Data, it's a wonderful website, and they point out, I love the statistic, you know, we have no sense of how many people we've lifted out of poverty.
00:25:35.720 So for the last 25 years, it's in the order of almost a billion people.
00:25:39.680 So every year for the last 25 years, we could have had a headline in every paper in the world, everywhere around, telling us: yesterday, 138,000 people were lifted out of poverty.
00:25:56.780 How come you never hear that?
00:25:58.840 That's just an astoundingly amazing thing.
00:26:01.340 And yes, there are still many problems.
00:26:04.660 Yes, there are still many poor people.
00:26:06.500 But the fact is, 200 years ago, we used to be almost all extremely poor.
00:26:12.820 There were a few royals, but that was about it.
00:26:14.720 Back then, 90%, 95% of all of us lived on what is typically known as a dollar a day, although it's really about $2.50 now.
00:26:22.700 But the fundamental point is, we were incredibly poor.
00:26:26.260 Now, we have less than 10% that are extremely poor.
00:26:30.040 That's still a problem.
00:26:31.200 We should still help them.
00:26:32.400 And there's a lot of ways we could do that.
00:26:34.160 But that is one of those many stories that you don't hear because, yeah, it doesn't generate clicks.
00:26:41.100 Starting a business can be tough.
00:26:42.960 But thanks to Shopify, running your online storefront is easier than ever.
00:26:47.100 Shopify is the global commerce platform that helps you sell at every stage of your business.
00:26:51.340 From the launch your online shop stage, all the way to the did we just hit a million orders stage, Shopify is here to help you grow.
00:26:57.940 Our marketing team uses Shopify every day to sell our merchandise, and we love how easy it is to add more items, ship products, and track conversions.
00:27:06.420 With Shopify, customize your online store to your style with flexible templates and powerful tools, alongside an endless list of integrations and third-party apps like on-demand printing, accounting, and chatbots.
00:27:17.880 Shopify helps you turn browsers into buyers with the internet's best converting checkout, up to 36% better compared to other leading e-commerce platforms.
00:27:25.700 No matter how big you want to grow, Shopify gives you everything you need to take control and take your business to the next level.
00:27:32.720 Sign up for a $1 per month trial period at shopify.com slash jbp, all lowercase.
00:27:38.660 Go to shopify.com slash jbp now to grow your business, no matter what stage you're in.
00:27:44.040 That's shopify.com slash jbp.
00:27:45.700 Yeah, well, part of it, and it's a deep psychological problem too, is that we are structured psychologically so that the negative has more impact than the positive does.
00:28:01.400 And that's a very difficult bias to work against when you're in a situation where you might be making the case that the positive should be what's predominating.
00:28:12.640 It's like, well, fair enough, but that isn't exactly how we're wired, and I suppose that's why, because we can be 100% dead, but only so happy.
00:28:21.920 And so it, right, it's conservative in some sense, and I don't mean politically, to be a little more hyper-alert to the negative than to the positive.
00:28:31.280 But that's a tough thing to fight against when the negative can grab attention, especially when it's blown up to apocalyptic proportions.
00:28:38.360 Okay, so there's a couple of other things I wanted to delve into there too.
00:28:41.660 So there's this psychologist, Jean Piaget, and Piaget is very interested in ethical development and cognitive development.
00:28:50.140 He developed a stage theory of human cognitive and moral development across time, and the last stage in his sequence of cognitive-slash-ethical transformations was the messianic stage.
00:29:04.400 And not everyone hits that, but more philosophically sophisticated young people pass through something approximating a messianic stage.
00:29:14.000 And it occurs somewhere between the ages of 16 and 21, which, by the way, is the right stage of life to bring young men into the military if you're going to do it effectively.
00:29:25.240 Like, there's a whole radical process of neuronal pruning that takes place between age 16 and 21 that's analogous to what happens between the ages of 2 and 4.
00:29:37.720 It's almost as if, at that point, you die into your adult configuration, right?
00:29:43.880 So, because you're pared down to what is only going to work for your environment.
00:29:48.760 Okay, so now, one of the psychological consequences of that is that when young people are in this stage of development,
00:29:58.060 and they're looking for how to separate themselves from their parents,
00:30:02.380 and to maybe even move beyond the narrow confines of their immediate friendship group,
00:30:07.560 they're trying to catalyze their identity with a broader social mission.
00:30:13.180 And in archaic societies, that step would be catalyzed by something like an initiation ritual,
00:30:22.720 where the old personality is symbolically destroyed, put to death.
00:30:28.880 That accounts for some of the torturous elements of the initiation ceremonies.
00:30:33.200 And then the new man, because the initiation ceremonies tend to be more intense for boys,
00:30:39.600 the new man is brought into being as a cultural entity.
00:30:43.880 And then he's aligned with the mission and purpose, let's say, of the tribal unit.
00:30:49.480 It's something like that.
00:30:50.720 Well, right now, I think the radical leftists on the environmental side
00:30:57.120 have been very good at capitalizing on those urges,
00:31:00.100 because what they offer to young people is this,
00:31:03.380 but it's pathological in some real sense,
00:31:05.400 because it's a shortcut to messianic moral virtue.
00:31:10.560 So the idea would be, well, there is an apocalypse.
00:31:13.620 We need to save the virginal planet.
00:31:15.880 So there's a bit of a St. George thing going on there to protect the virgin, let's say.
00:31:19.880 And the way to do this is to become something approximating an activist
00:31:24.000 who's dead set against the evil patriarchy and the predatory and parasitical individual.
00:31:30.340 And you can understand why that's attractive,
00:31:32.400 because it does offer young people a grand vision.
00:31:35.680 They're now protectors of the planet.
00:31:37.700 They're participating in something that's beyond themselves.
00:31:41.620 But the problem with it is that it's an invitation to a very one-sided story,
00:31:46.260 and it's got this terribly destructive anti-human element.
00:31:49.860 And so, well, I'm curious about what you think about that.
00:31:53.540 I think it's a very good metaphor for how the world, in many ways, has come to work.
00:31:59.900 I think you're absolutely right.
00:32:01.920 It's a very sort of stimulating and very easy message to fall into.
00:32:07.860 The world is terrible, but here is how we can help.
00:32:10.980 And the story very easily becomes, I'm going to help by cutting tons of CO2.
00:32:17.140 Now, again, I bring my data points to this.
00:32:19.240 Yeah, yeah.
00:32:19.580 And so I think there's two parts of it.
00:32:21.620 I mean, first of all, I think we should recognize it's wonderful that young people,
00:32:26.500 and really everyone, wants to do good.
00:32:29.620 We should encourage that.
00:32:31.540 That's wonderful.
00:32:32.300 And, you know, again, it's part of the fact that we're now well off,
00:32:36.880 that we can stop worrying where's our next meal coming from,
00:32:40.020 and then we can start thinking about, so how are we going to help the world?
00:32:43.300 But the reality is that when we're being told this, it's the end of the world,
00:32:47.840 and hence this is the only thing that matters,
00:32:51.040 we're very likely to make very poor decisions.
00:32:54.700 I mean, if it were true, you know, if there was a meteor hurtling towards Earth
00:32:58.640 that was going to, you know, wipe out the whole world,
00:33:03.180 then the only thing that mattered would be to get the space shuttle,
00:33:07.640 or the starship or whatever, up there and deflect it.
00:33:10.360 That's what we should be focused on.
00:33:12.220 But that's not the right metaphor for climate change.
00:33:16.220 It's a problem.
00:33:17.200 Right, right.
00:33:17.460 And it's a problem that we, in many ways, as we saw with that statistic I told you before,
00:33:23.380 you know, the fact that we've seen dramatically declining levels of people dying
00:33:27.440 from climate-related disasters because we can actually adapt to much of this
00:33:32.280 and because we can predict it, we can make sure that people become more safe from these things.
00:33:38.080 It's not the end of the world.
00:33:39.960 It is a problem.
00:33:41.580 And saying this is the only problem makes us very likely to make really poor decisions
because we only focus on this and forget all the others.
00:33:50.740 Let me just, one other thing.
00:33:52.660 So I think there's two points to it.
00:33:55.020 One is that thinking it's the end of the world and thinking this is the only problem
00:33:58.940 make you forget all the other problems.
00:34:00.700 But also when you look at them, what are the solutions that are typically offered?
00:34:05.860 They're terribly inefficient.
00:34:08.200 So they will typically involve something along the lines of saying, you know,
00:34:11.620 I'm going to forego driving my car, which will at best have virtually no impact.
00:34:17.840 It's not that, you know, please do it if it makes you feel good,
00:34:21.720 and especially if it works into your plans, but it's not how you solve the world.
00:34:26.460 And, you know, people will talk about going vegetarian.
00:34:28.840 Again, great thing.
00:34:29.780 I'm vegetarian.
00:34:31.200 But, you know, it's not going to save the world.
00:34:33.780 You need to get a sense of proportion.
00:34:35.500 Most of the things that people talk about are small fractions of what it will actually take.
00:34:40.400 And what they're really suggesting and what everybody's now talking about is this net zero idea
00:34:46.040 that we need to cut all carbon emissions from all economies by 2050.
00:34:52.760 This would be enormously costly and also terribly, terribly fatal for many countries,
especially the poorer countries, who basically stay alive by having lots and lots of access to fossil fuels.
One way of just seeing that is, right now, half the world's population survives on nitrogen that comes from fertilizer,
00:35:17.700 which comes from natural gas.
00:35:19.700 We have no way of knowing how we could possibly get enough nitrogen to feed most of the world if we went to net zero.
00:35:28.720 We saw a small example of that.
00:35:30.340 It was a very badly performed example in Sri Lanka.
00:35:33.020 But still, it's worthwhile to point out, you cannot actually feed most of the people on the planet
00:35:38.640 if you want to go organic and go net zero right now.
And that tells you a story because, as Norman Borlaug loved to point out,
00:35:48.060 one of the Nobel laureates that actually helped save, you know, a billion people or so,
00:35:52.700 he said, I look around the world and I don't see four billion people willing to give up their lives.
Right. So there are no four billion volunteers to say, all right, I'm not going to be here.
We need to be realistic about this and say the current solutions are often very counterproductive.
00:36:08.280 So stop believing it's the only problem and stop arguing for bad solutions.
00:36:12.700 OK, so let's talk a little bit about that, too, from a motivational perspective.
00:36:18.120 So we have this appeal to the messianic urge of young people.
00:36:22.100 But now here's how the appeal gets warped, because it is the case that each young person
00:36:28.340 should take their place as a responsible, productive and generous member of the broader social order.
00:36:36.080 But that's really difficult. That's genuine moral effort.
00:36:39.660 And that would require growing the hell up, being willing to make sacrifices,
00:36:44.600 including the future in your deliberation so you're not impulsively hedonistic,
00:36:48.580 serving other people, starting a family, starting a business,
00:36:52.780 like all these things that you have to do in the micro world that require real effort.
00:36:57.880 Well, here's the shortcut that's being dangled in front of young people.
00:37:02.860 It's like, forget about all that, all that activity,
00:37:06.120 that difficult, painstaking, conscientious, local activity.
00:37:10.640 That's all just part of the predatory, parasitical patriarchy.
00:37:14.240 So you can just dispense with all of that.
00:37:16.560 Instead, you can put yourself forward as an ally of virginal nature.
00:37:21.500 And instantly, as an anti-apocalyptic advocate of that sort,
00:37:27.280 you're elevated to the highest possible moral stature,
00:37:30.100 which is something like, well, it's something like a messianic figure.
00:37:33.920 I'm saving the planet.
00:37:35.460 It's like, well, I don't think you are.
00:37:37.520 I don't think you're doing any of the work necessary to save the planet.
00:37:40.620 I mean, one of the things I really liked about your work when I came across it,
00:37:44.080 and that's probably, it's got to be 15 years ago or more now,
00:37:47.540 was that you had done the detailed, data-driven work
00:37:53.380 that was necessary to differentiate the landscape of problems.
00:37:58.140 So first of all, you'd admit to the complexity of the problems that were in front of us.
00:38:02.340 You weren't falling prey to the idea that there was only one problem,
00:38:06.240 there was one solution, and you were the person merely by advocating for that solution
00:38:11.780 who was now God-Emperor of the world.
00:38:14.060 There was none of that in your work.
00:38:16.000 And I think part of the reason it's had a hard time getting traction to some degree
00:38:19.980 is that you're insisting to people that they actually pay some attention
00:38:23.980 to the complexity of the challenges that confront us, right?
00:38:30.100 And the problem with that is that that runs contrary to this narcissistically attractive
00:38:36.520 meta-narrative, which is, no, no, you can just oppose the patriarchy
00:38:40.340 and everything that goes along with it.
00:38:41.760 All that responsibility, which maybe you don't want to shoulder anyway,
00:38:46.180 and you can instantly become morally superior by being a climate change activist.
00:38:51.060 And some of that's attractive to the messianic drive in young people,
00:38:55.000 but also some of that's attractive to just straight, bloody, what would you say,
00:39:00.280 hyper-simplified narcissism.
00:39:02.240 Because people, one of the dark motivations of people is to obtain unearned moral virtue,
00:39:08.400 because we need reputation.
00:39:10.240 And if you can put yourself, that's why we worship allies now, you know,
00:39:13.780 if you can put yourself forward as an ally of the noble cause,
00:39:18.200 then all of a sudden you have as much moral stature as anybody can hope to gain.
00:39:23.140 But you haven't done any of the real work.
And when it comes to the real work, the devil's in the details and in the data.
00:39:31.520 And so one more thing to add on top of that.
00:39:34.160 So there's this enticement that we're offering to young people.
00:39:37.720 It's like, well, here's a worldview.
00:39:40.960 We can identify the villains.
00:39:42.760 The villains are culture and the predatory individual.
00:39:45.380 You can be an ally.
00:39:46.300 Now you have overblown moral virtue merely because you're on the right side.
00:39:51.120 And then you don't have to think through any of this because you've already got the story right,
00:39:55.440 even though it's a one-pixel story.
00:39:57.480 So that's a very bad moral trap.
00:39:59.940 But then there's something darker going on, too.
00:40:02.180 You know, in Epstein's book, Fossil Future, he cites some of these,
00:40:05.800 the more radical environmentalist types who say things like, I think it was McKibben he quoted,
00:40:11.520 who said something like, as far as I'm concerned, the vista of an unspoiled river,
00:40:18.960 so any natural environment that's completely untouched by human beings,
00:40:23.280 is so valuable that one person or a billion isn't worth that.
00:40:29.040 It's something very close to that.
00:40:30.420 And so there's a malevolent anti-humanism that's at the bottom of this, too,
00:40:35.880 which is also, it's part of this metaphor.
00:40:39.100 It's part of the idea that intrinsically,
00:40:42.000 human beings are something like a cancer on the face of the planet or a virus
00:40:46.040 or a biological force that's gone wrong, a Malthusian nightmare,
00:40:50.440 and that only that which is completely unsullied by human beings,
00:40:55.480 untouched by human hands, is intrinsically valuable.
00:40:57.920 And you can put that forward as a moral claim and say that you're on the side of nature,
00:41:03.160 but the flip side of that is, yeah, like, if there's too many people on the planet there, mate,
00:41:08.500 which of them do you think should go?
00:41:11.240 And exactly how are you going to bring that about?
00:41:14.100 And so that's the dark side of this, of this, like, apocalyptic environmentalist utopian narrative,
00:41:20.620 is that human beings are categorized as evil in and of themselves,
00:41:24.400 and all human activity as evil.
00:41:26.400 And, you know, one of the things that we've discussed is that not only is that a pathological viewpoint
00:41:32.420 and extremely dangerous, but interestingly enough,
00:41:36.020 it's probably also counterproductive from the hypothetical perspective of the environmentalist utopians.
00:41:44.320 Because if the goal is to produce a greener, more biodiverse planet, let's say,
00:41:49.940 then it seems to me, this is something we can discuss,
00:41:54.680 that the evidence suggests very strongly that if you make people richer,
00:41:59.000 we can talk about what that means,
00:42:00.380 if you make people richer in a benevolent manner,
00:42:04.800 or at least you get the hell out of their way,
00:42:06.740 then they start caring about the environment in a distributed manner,
00:42:10.420 and you get a positive relationship between the remediation of absolute poverty and environmental awareness.
So not only does this narrative fail to solve the climate problem while destroying the economy,
00:42:22.240 it actually makes, I think it makes the climate problem a lot worse.
00:42:25.400 And we're seeing that play out in Europe right now.
00:42:27.320 I think there's a number of things to unpack here.
So I think you're absolutely right that we have very good evidence to say that
00:42:36.580 if people are better off, they're much more likely to be environmentally concerned.
Environmental problems are poverty problems.
00:42:44.940 That's really what that is.
You know, when you're poor, yes, you just cut down forests in order to feed your kids.
00:42:51.380 You'll basically litter everywhere because, honestly, you have other things on your mind right now.
00:42:55.940 Whereas once you're well off and most of your future is secure,
00:43:00.180 you can care a lot more about the environment.
00:43:03.440 And I also think, yes, you're absolutely right.
There's a lot of people who seemingly get a lot of sort of instant credit
00:43:09.900 by just throwing paint, you know,
00:43:13.160 at whatever famous painting they're in a museum with,
00:43:18.660 or, you know, gluing themselves to highways or whatever.
00:43:22.720 That's not how you solve this problem because it is very, very complicated.
And as you also point out, if you actually want to be part of the solution,
00:43:35.220 to help bring the world onwards, it's actually going to take a lot of painstaking work.
And I think you nailed it on why my solutions are much harder to sell.
00:43:47.080 Well, fundamentally, because they're more boring.
00:43:50.820 It is not as flashy and exciting as being able to, you know, get on a TikTok video and show your virtue.
00:43:59.140 But it's actually about a lot of hard work.
I mentioned Norman Borlaug.
00:44:03.620 He got the Nobel Peace Prize in 1970.
He was the originator,
00:44:08.480 along with a lot of other people, of the Green Revolution that basically worked through the 60s and 70s.
00:44:15.660 And most people today probably don't even remember.
00:44:21.000 But if you go back and read what people were worried about,
00:44:23.740 we were incredibly worried about the fact that most nations just would not have enough food.
And so, you know, there were literally people considering maybe we should do triage and say,
00:44:35.660 well, India is just a goner.
00:44:38.000 You know, they'll just have to sort of die out.
And what Norman Borlaug said was, we actually have the technology to make much more food on every hectare or every acre of land.
00:44:51.500 We do that by making genetic modifications.
00:44:54.820 So he just did it with, you know, normal genetics.
He simply constructed, together with lots and lots of other researchers, seeds of both rice and wheat that were shorter.
00:45:07.200 And because they were shorter,
00:45:08.620 they could put more energy into their kernels.
00:45:11.800 And that meant there were many more kernels, much less straw.
00:45:15.220 And we got much more food.
That is, you know, in very shorthand form, basically what fed the world.
00:45:22.200 That's what brought India from being a basket case to now being the world's leading rice exporter.
00:45:28.400 It doesn't mean that there are not problems in India.
00:45:30.640 It doesn't mean that we've fixed everything.
00:45:33.080 But what that tells you is, this is the way you actually walk towards a solution.
00:45:38.980 So a lot of environmentalists, a lot of, you know, very, very smart thinkers back then basically said,
00:45:44.340 lots and lots of people are going to die.
00:45:46.400 Literally hundreds of millions of people are going to die.
00:45:48.880 And I'm okay with that because, you know, it had to happen.
00:45:52.220 Whereas the right way is to sit down and actually use science and spend, you know, your entire life working on making these rice grains more effective.
00:46:04.360 It's not nearly as sexy.
00:46:05.900 And, of course, I'm only telling the big story because he got the Nobel Prize.
You know, there were lots and lots of other researchers whom I don't even know, whom none of us really remember anymore.
00:46:14.060 But those are the people that actually made it work.
00:46:17.120 And so I'm often struck, as you also pointed out, I'm often struck when people say there are too many people on the planet.
00:46:23.280 Because when you drill into it, it means, you know, just enough of me but too many of you.
00:46:30.440 It's never, you know, you're not actually going to have you or your family leave.
00:46:34.880 But you think someone else should go.
00:46:36.420 Now, I get the idea of saying that maybe in some sort of very detached way we would like to see a world that had fewer people.
00:46:45.880 I think that's probably wrong.
00:46:47.360 But you can have that argument.
But if you actually look at, you know, the philosophical implications of that, it's that you're telling lots and lots of people to die.
00:46:56.040 The reality should be, I think, and that's what our history shows us, is when you have rich and wealthy countries, you can actually get both.
00:47:04.740 You both get fewer kids because once you grow up, once you get rich enough, kids actually start to be really expensive.
00:47:11.860 So you have few of them, and that's one of the reasons why we no longer see this population explosion, as people talked about, in most of the rich world.
00:47:20.520 Actually, we're likely to see that spread over the whole world in the next 40 years or so.
00:47:25.440 So we are over most of the problem, and what we have managed to do is we can now grow food much more effectively,
00:47:33.380 and we should be moving towards growing it even more effectively so that we can have all the people well-fed on less and less land so there's more space for nature.
00:47:45.540 We're doing that in the rich world.
We can also do that in the poor world.
00:47:48.680 The thing that's interesting here, or one of the things that's interesting, you talked about Norman Borlaug and about the sexiness of, say, your vision.
00:47:56.760 And the thing is, when I started to delve into the research on the economy and environment front,
00:48:03.000 I actually found the work that you were doing, so to speak, highly sexy, because I thought, oh my God, here's a better story.
00:48:11.700 We could make everybody in the planet rich, and I want to go into what rich means,
00:48:15.900 and at the same time make the planet much more sustainable on the biological front.
00:48:21.480 We could do both of those.
00:48:22.500 Why isn't that just way better than the Malthusian zero-sum game?
00:48:28.240 Let's delve into those issues a little bit.
00:48:31.760 So we're offering young people a cheap way out of their privilege-induced guilt.
00:48:39.780 So now they have this Rousseauian landscape set in front of them.
00:48:43.080 They're pretty secure.
00:48:44.600 They're pretty comfortable.
00:48:45.980 They're not going to die of malaria or smallpox.
00:48:48.060 They have enough to eat.
00:48:48.880 They have educational opportunity.
00:48:50.180 But now they're scrounging around trying to figure out what to do with their life,
00:48:53.500 because they need to justify their miserable existences to themselves.
00:48:57.080 They need something meaningful.
00:48:58.720 And so the radicals come along and say, well, just be an ally of the virginal planet.
00:49:03.200 And that is simple, so it has that appeal.
00:49:08.000 But it's also simple in an underhanded way, because it isn't the message,
00:49:13.060 look, why don't you be like Norman Borlaug and develop something like a noble vision,
00:49:18.000 which is, well, maybe we don't have to starve four billion people to death.
00:49:21.760 Maybe we can feed them.
00:49:23.060 OK, what do people eat?
00:49:24.740 Oh, they eat food.
00:49:26.520 Yeah.
00:49:26.900 So how about if we make food more efficient?
00:49:29.440 We make agriculture more efficient.
00:49:30.920 Let's see if we can feed all those people.
00:49:32.740 And that's a pretty hard problem.
00:49:34.240 So how would I devote my whole life to this?
00:49:37.340 And then you might say to young people, well, that's a hell of a price to pay to devote your
00:49:40.740 whole life to something.
00:49:41.640 But we could be saying forthrightly, well, don't you want an identity?
00:49:47.220 Don't you want to devote your life to the solution of some genuinely difficult problem?
00:49:52.920 I mean, that's where you're going to find meaning.
00:49:55.020 I mean, how meaningful has your work been to you?
00:49:58.540 Very meaningful.
00:50:00.060 And I think, so I just wanted to slightly flippantly, but not only flippantly say,
00:50:05.460 I'm very, very pleased and gratified that you thought this was very exciting.
00:50:10.280 I think it's also a little bit because you're a nerd.
00:50:13.920 So, you know, it is a more nerdy solution, you know, and it is less immediately satisfying.
00:50:20.180 But I think that's exactly the point we need to get out.
We need to tell people this will ultimately be a much, much more rewarding undertaking.
00:50:30.840 Look, we should also have people that work on climate because, again, climate is a real
00:50:35.800 problem.
00:50:36.280 But you're not going to solve it by throwing paint at something.
00:50:39.000 You're not going to solve it by telling people you can't, you shouldn't, you should
00:50:43.060 freeze, you should not have a nice life.
00:50:45.820 The way you're going to solve this, of course, is by being the guy that comes up with the
00:50:52.300 technology that actually delivers clean energy or cleaner energy at much lower cost.
00:50:59.520 This is how we've solved pretty much all problems.
00:51:02.440 We haven't solved them by wishful thinking or telling people, I'm sorry, could you not do
00:51:07.240 stuff that you like to do?
00:51:08.780 That never works.
00:51:10.040 What does work is you come along with a better solution.
00:51:14.060 You know, this is a slightly trite metaphor.
But back in the 1860s, the world was basically fishing out all the whales.
00:51:24.180 Why?
00:51:24.700 Because whales have this wonderful opportunity of whale oil.
00:51:28.980 It turns out the whale oil just burns much, much cleaner and much brighter than any other
00:51:33.620 oil.
00:51:33.880 Remember, that was pretty much the only lighting that you had back in the 1860s.
00:51:39.260 So pretty much all Western European and North American rich homes were lit up with whale
00:51:46.880 oil.
00:51:47.720 And so everyone just went out to the ends of the world to catch whales.
00:51:52.860 You could not have stopped the slaughter of whales by telling everyone, I'm sorry, could
00:51:57.900 you dim your lights a little bit?
00:51:59.920 Could you go back and have that sooty light that you didn't like?
00:52:02.660 Like, that's not going to work.
00:52:04.180 What did work was, ironically, that we found oil in Pennsylvania, right?
00:52:08.740 That we actually found ground oil, the oil that we just use today, mineral oil, and you
00:52:14.800 could substitute that for whale oil.
00:52:17.500 Turned out it was much cheaper.
00:52:18.880 It burnt better.
00:52:19.600 It was much easier to get hold of.
00:52:21.680 And so we pretty much stopped hunting whales after that.
00:52:25.440 There are still some because, you know, they also give meat.
00:52:27.940 But the fundamental point is technology solves this problem, not good intentions.
00:52:34.320 Right.
00:52:34.920 Well, so that means that we can thank the fossil fuel industry for saving the whales.
00:52:39.240 And you can say, you know, if you think about it, well, we can thank the fossil fuel industry
00:52:44.000 for a lot of things.
00:52:44.920 If you think about around 1900, almost everyone worried about the fact that you could see cities
00:52:51.300 becoming more and more congested.
You had horse carriages, and they left an enormous amount of manure.
00:52:57.120 So there were lots of people who were really worried about the fact that by, you know, by
00:53:00.560 extrapolation, by 1920, 1930, all of New York, all of London would be covered by feet and
00:53:07.100 feet of horse manure.
00:53:08.540 How were you going to solve that?
00:53:10.160 And along came the automobile.
00:53:12.400 Again, the point here is not to say that a technology that we then innovated 120 years
00:53:18.320 ago is the right one for today.
00:53:20.600 Eventually, that will go, you know, the way of the dinosaur.
00:53:23.740 We'll find other ways.
00:53:24.960 But we should not be kidding ourselves and believing that just wishing it wasn't so makes
00:53:30.560 it go away.
00:53:31.320 The way you do this is through technology.
00:53:33.480 Well, especially, okay, so let's talk about wealth a bit.
00:53:36.480 Because in the West, it's easy for people, I saw yesterday, I think it was Extinction Rebellion
00:53:42.080 or one of these damn groups put out this message saying that, well, you know, people should
00:53:46.040 just stop flying because flying produces water vapor and carbon dioxide.
00:53:50.800 And, you know, really, we don't need to fly.
00:53:54.040 And so I'm reading that.
I'm thinking, well, because there's Marxism of a terrible type lurking under the surface.
00:54:01.820 It's like, well, who the hell determines what we need exactly?
00:54:06.680 I mean, needs are, first of all, needs aren't self-evident.
00:54:09.720 Really, what you need to do is you need to breathe.
00:54:12.840 You need to drink water and you need to eat.
00:54:15.000 After that, what constitutes a need gets pretty damn dubious.
00:54:19.280 And my concern is that if you get people adjudicating the, what would you call it, the comparative
00:54:25.220 validity of need, you turn the whole world over to people who say, well, you don't really
00:54:30.020 need that.
00:54:31.320 You don't really need shelter.
00:54:33.280 You don't really need, well, you don't need, you can have bugs.
00:54:36.020 You don't really need food.
00:54:37.080 You can eat a minimal protein source.
00:54:39.780 Well, you don't really need children because they're kind of hard on the planet anyways.
00:54:44.320 You certainly don't need pets because they add to the carbon dioxide load.
00:54:47.540 You don't need your fireplace.
00:54:49.180 You don't need a gas stove.
00:54:50.520 You don't need a heater.
00:54:52.320 And so, and then what you have is this insistence that the way to planetary salvation is to tell
00:54:57.160 other people what they don't get to have.
00:54:59.460 And what's interesting about that too, and this is the hypocritical element, and I certainly
00:55:03.860 see this at the elite Davis global level.
00:55:06.580 It's like, well, exactly who are you telling here that they don't get to have what they need?
00:55:11.180 Because you don't mean that for yourself.
00:55:13.400 You're not going to go live in a damn hut in the middle of Africa and burn dung.
00:55:18.460 You're not proposing that.
00:55:19.840 You're proposing that these damn poor people in the third world country, and maybe in your
00:55:24.040 own country, and there's too many of those blighters anyways, that they should just be
00:55:28.240 bloody well satisfied with the fact that they've got what they have now.
00:55:31.340 And they shouldn't in any manner ever dream of having this sort of wealth of opportunities
00:55:37.400 and security that we have in the West.
00:55:39.580 Then we could talk about wealth, because people in the West are guilty about wealth.
00:55:44.800 Well, we have all these things we don't need.
00:55:46.840 It's like, well, yeah, that's actually the definition of wealth.
00:55:49.240 You've got a choice of toothbrushes.
00:55:51.280 Maybe you don't need it, but it's not a bad side effect.
00:55:54.020 But we should get down to brass tacks here, people.
00:55:56.740 When we're talking about wealth for the typical person, here's what we're talking about.
00:56:01.900 Your house isn't too cold or too hot, so you have heating, and maybe you have air conditioning.
00:56:07.420 That'd be kind of nice.
00:56:08.960 You have running water.
00:56:10.980 You have good sanitation, so you have a toilet, and you have clean water.
00:56:15.100 You have a plentiful supply of high-quality food that you don't have to spend all your
00:56:19.880 time scrounging around to deliver, and it's reliably sourced.
00:56:23.700 And your children have the opportunity to live, to live healthily, and to be educated.
00:56:29.680 That's like 90% of wealth.
00:56:32.380 And so when we're talking about wealth that we want to provide the rest of the world,
00:56:36.600 we're not talking about 1920s spats-wearing, capitalist depredations, champagne, hookers,
00:56:43.540 and cocaine.
00:56:44.360 We're talking about the basics of life, right?
00:56:48.300 Temperature regulation, provision of water, provision of food, health and opportunity for
00:56:53.140 children.
00:56:53.720 And we still haven't provided that to everyone in the world, and we could.
00:56:57.000 That's one of the things that's so optimistic about your work.
00:56:59.680 Not only could we do that, we should do it, and we could and should do it in a way that
00:57:05.260 would benefit the long-term sustainability of the planet.
00:57:09.200 Yeah.
00:57:09.740 And look, again, you rightly point out that people will want to manipulate your choices.
00:57:17.920 And not only does that have a dubious sort of moral impact, but it's also, you know, just
00:57:24.740 from an economist's point of view, if you tell people you can't fly, it's not like they're
00:57:29.100 going to say, oh, I was actually going to—I was planning on spending 5% of my income on
00:57:33.680 flights, so I'm just going to burn this money.
00:57:36.320 I'm just going to spend it on other stuff, which also produces carbon emissions.
00:57:41.740 And so, you know, it makes no sense to say the only real way—and I have some respect,
00:57:46.680 sort of intellectual respect, for these people who are actually saying the only way to solve
00:57:50.580 global warming—is by making everyone poor, what's called degrowth.
00:57:55.340 First, we make the rich world poor.
00:57:57.760 And then once the poor world have gotten slightly richer, we also say stop to them.
00:58:02.760 At least it's intellectually honest.
00:58:05.100 It's also terribly, terribly anti-human.
00:58:07.840 And it's not going to happen.
00:58:08.940 There's no constituency.
00:58:10.900 No politician would ever get voted into office, or if he or she actually delivered on it,
00:58:16.260 would get re-elected on that sort of platform.
00:58:19.860 And what that tells you is this is just simply, you know, again, wishful thinking.
00:58:25.280 And I keep getting back to saying if you're actually serious about problems, are you going
00:58:29.760 to, you know, are you going to suggest something that'll actually work?
00:58:33.480 Are you just going to suggest something that makes you feel good or, you know, that you know
00:58:37.260 has no chance on earth of getting carried through?
00:58:40.340 Again, there is an argument, and I think there's a legitimate argument, for putting a carbon
00:58:45.520 tax on things.
00:58:46.740 That's a simple way that we make regulation that says there's a global bad here, we tax
00:58:53.000 it, and then you put that global bad into your considerations.
00:58:58.360 But that's how you solve it efficiently.
00:59:00.680 And of course, the reality of that is that in any realistic formulation of this, people
00:59:05.900 will fly slightly less.
00:59:08.140 And that's no good for many of these moral crusaders because they want to completely
00:59:13.840 get rid of it.
00:59:14.620 It's not going to happen.
00:59:16.100 What needs to happen if you actually want to solve this problem again is to get innovation.
00:59:22.160 You know, we already know how to, for instance, decarbonize most of the electricity system.
00:59:26.460 It's just through nuclear.
00:59:27.960 We know that's worked for 50 years.
00:59:30.220 The reason why we're not doing it and the reason why I'm a little skeptical about it is
00:59:33.520 that it's too costly right now.
00:59:35.040 We can have a whole conversation about why that's the case.
00:59:37.720 But there's a lot of innovation going on about fourth-generation nuclear that could become
00:59:42.280 much cheaper.
00:59:43.380 We just saw the breakthrough.
00:59:45.080 Maybe fusion as well.
00:59:47.060 But the point is that there's lots of technologies.
00:59:50.300 Those are the ones that we're going to focus on because, again, you're not going to tell people
00:59:54.420 you can't have your whale oil.
00:59:56.420 You have to dim your light.
00:59:58.100 What you can say is, oh, here's a better alternative.
01:00:00.480 Oh, it also happens to be cheaper, and it doesn't kill whales.
01:00:04.960 No, and so the degrowth model is predicated on the idea that, well, let's lay out the
01:00:11.600 ideas, that really there are too many people on the planet, and the people who are there
01:00:17.140 now, especially the rich ones, are consuming far too many resources per capita.
01:00:22.000 And so the only way forward to a sustainable planet is through degrowth.
01:00:25.740 Okay, so let's take that apart.
01:00:27.740 First of all, who's going to impose that degrowth?
01:00:31.820 Can you imagine the totalitarian state that would have to be built in order for every single
01:00:38.280 one of your purchases to be monitored, which is really what the plan is in some real sense?
01:00:43.140 I mean, that's just a terrible, nightmarish vision of petty tyranny everywhere, where every
01:00:48.980 single move you make is analyzed in terms of its overall planetary consequence, and only
01:00:55.280 calculated by people who do not have your best interests in mind, by the way.
01:00:59.620 But worse than that is that there isn't any evidence whatsoever that if you use this strategy
01:01:06.800 of degrowth and you make people poor, that you're going to get anything approximating the
01:01:11.780 beneficial effect that you propose.
01:01:16.100 I mean, let's look at how this is already playing out.
01:01:18.820 So what has Germany managed with an approximate degrowth strategy?
01:01:25.400 I mean, so part of the degrowth is, well, we certainly don't need fossil fuels or nuclear
01:01:29.700 on the energy front.
01:01:31.920 Okay, so from what I've been able to gather now, energy is about five times more expensive
01:01:36.840 per unit cost in Germany than it is in the United States.
01:01:40.280 A huge number of industrial endeavors are fleeing Germany for the U.S., which isn't so bad,
01:01:46.040 or China, which is really not good, because it's too expensive even to do such things as
01:01:51.400 build batteries for electric cars in Germany now.
01:01:54.340 But also, you might say, well, that's all worth it, because now we have all these renewables
01:01:58.800 and the place is much cleaner.
01:02:01.200 And if we have to pay a price in terms of loss of industrial productivity, c'est la vie.
01:02:05.140 But I read the other day, too, and I hope this is accurate, that Germany has fallen to
01:02:09.800 something underneath the hundredth position in the world in terms of emissions per unit
01:02:16.900 of energy produced.
01:02:18.160 Because as we've moved foolishly and precipitously towards unreliable renewables, especially on the
01:02:25.700 wind and solar front, the Germans have had to, especially because they killed their nuclear
01:02:29.620 plants, they've had to turn back to coal.
01:02:31.500 And so now they're burning way more coal than they used to for five times the electricity
01:02:37.220 price and unreliable power to boot.
01:02:41.080 And so this degrowth philosophy, which violates the presuppositions of economic motivation,
01:02:48.140 let's say, on a psychological and economic front, is not only going to demolish the industrial
01:02:53.780 structure, make the poor much poorer and more desperate, but it's going to bring about worsening
01:03:00.400 of the conditions for environmental sustainability and very, very rapidly.
01:03:04.320 Now, we're also seeing in Europe, and you can comment on this a bit, is that because we're
01:03:09.220 now in an energy crisis, a foolish energy crisis, that people are starting to deforest Europe
01:03:15.180 and they're starting to burn peat in Ireland again because they have to heat their damn homes.
01:03:20.100 So even by the standards of the environmentalists themselves, perverse though those standards
01:03:27.640 may be, the degrowth philosophy is completely unsustainable politically and psychologically
01:03:33.960 because you're just not going to tell people, you know, you can let grandma freeze in the
01:03:37.900 dark.
01:03:38.400 I think you're absolutely right that it's not going to happen.
01:03:41.400 People are just not going to allow that.
01:03:43.100 And they won't accept that sort of tyranny, as you're talking about.
01:03:49.380 You would need much, much more than anyone would be willing to offer up, just to give
01:03:54.840 you one sense of it.
01:03:56.120 For the average American, some estimates indicate that going net zero would actually cost you
01:04:02.760 in the order of $12,000 per person per year by 2050.
01:04:08.020 You know, that's just impossible to imagine that anyone would ever accept.
01:04:14.260 And also, I think we need to separate.
01:04:17.660 There has been this tendency in almost all conversation to totally conflate climate and environment.
01:04:24.160 Now, environment is a lot of other things.
01:04:26.360 And it's, for instance, air pollution and all these other things that are fairly, you know,
01:04:30.040 local to you.
01:04:31.920 So degrowth would actually solve climate change because we would stop producing.
01:04:37.140 It wouldn't solve a lot of the other environmental problems because now we would start, you know,
01:04:42.880 burning everything else and we would be terribly poor and, you know, life would be horrible
01:04:47.280 in so many other ways.
01:04:48.480 I think fundamentally we just need to get back to realizing that when you're saying, and,
01:04:53.600 you know, these protesters will glue themselves to the roads and say, we don't want you to
01:04:59.140 use any fossil fuels.
01:05:00.780 But, of course, when they go back to their lives, they're crucially dependent mostly on lots and lots of
01:05:07.360 energy produced by fossil fuels.
01:05:09.600 And they would probably, most of them, not be willing to give that up.
01:05:14.060 And if they're not even willing to do that, of course they're not going to get anyone else to do that
01:05:18.780 just by gluing themselves on highways.
01:05:21.720 What they could do is come up with a new innovation that was cleaner than
01:05:30.600 anything else. If they, instead of, you know, gluing themselves to the road, would actually
01:05:34.640 go to university and, you know, figure out, and it need not be the solution to climate
01:05:40.300 change, but one of those solutions, one that was actually effective, then, of course,
01:05:45.240 they would have done real good in the world.
01:05:47.000 And so, again, I think that's the sort of purpose of our whole conversation, to say, you know,
01:05:51.860 stop believing this is the end of the world, because it's not.
01:05:54.680 But also, stop believing that the solution to the end of the world is to glue yourself or to
01:06:00.620 just be gloomy or tell everyone we should just stop with everything.
01:06:04.140 The solution is smart technologies.
01:06:06.940 That the solution to the end of the world is not to become a frightened tyrant that says,
01:06:15.880 oh my God, the sky is falling.
01:06:18.140 I need all the power now.
01:06:20.440 I need to make these centralized decisions.
01:06:23.180 These centralized decisions are going to affect every single element of your life and your
01:06:28.200 children's life, assuming you get to have children at all.
01:06:31.120 And that's going to bring us towards a more benevolent planet.
01:06:35.120 It's like, none of that's true.
01:06:37.060 There's not an apocalypse.
01:06:38.600 You don't get to be a frightened tyrant.
01:06:40.600 We don't need centralized, top-down control of absolutely everything we do.
01:06:44.220 And even if we did have all of that, what we would get wouldn't be the positive outcome
01:06:48.600 that everyone who's on that side is touting.
01:06:51.700 What we'd get is the same kind of centralized planning disasters that we've seen play out
01:06:56.100 time and time again over the last hundred years all over the world.
01:06:59.280 Every single way you cut this, this is a bad set of solutions.
01:07:02.980 Now, I want to steelman a little bit.
01:07:05.920 So, you know, the idea that there are too many people on the planet is actually predicated
01:07:12.040 on, you might say, data, and this is one of the weird things about data:
01:07:17.720 if you take a Petri dish that's full of a nutritive medium and you put a mold
01:07:24.700 on it, the mold will grow until it eats all the available nutrients and then it will all die.
01:07:30.080 And so that's a kind of limits to growth model.
01:07:33.080 But the world in that model is an encapsulated Petri dish and the biological agents are mindless
01:07:40.980 single-celled organisms.
01:07:42.920 So you might say, well, that's not a great metaphor for human beings because first of
01:07:46.960 all, human beings aren't mold or viruses or cancers.
01:07:50.280 We're a very strange breed of creature because we can think and we can react and we can modify
01:07:58.740 the environment and we can modify ourselves.
01:08:01.520 And so it isn't obvious at all.
01:08:03.680 People say, follow the science.
01:08:05.140 The science is Malthusian.
01:08:06.620 We're all, you know, microorganisms in a Petri dish or rats in a colony.
01:08:11.140 We're going to overpopulate till we devour everything.
01:08:13.460 It's like, I don't think so.
01:08:15.460 So if you, first of all, why would you use single-celled organisms as the cardinal metaphor
01:08:23.220 for human populations?
01:08:28.300 It's a preposterous biological metaphor.
01:08:30.960 And one of the things the economists have repeatedly pointed out, as opposed to the biologists,
01:08:36.000 who tend to be more Malthusian, is: look, you can create your linear models of what's
01:08:44.720 going to happen if.
01:08:46.840 But what you're failing to take into account is the proclivity of people to revolutionarily
01:08:53.440 transform the way they interact with the world and to continually figure out how to do more
01:08:58.760 with less.
01:09:00.160 And there's no indication whatsoever that we've run to the end of that process.
01:09:04.820 Quite the contrary, we'd seem to be getting faster at it all the time.
01:09:08.540 And there's no reason to assume that the limits to growth Petri dish model of human catastrophe
01:09:15.560 is the appropriate biological metaphor.
01:09:18.000 That's not justified by science.
01:09:20.160 And 99% of scientists don't believe it.
01:09:23.660 And one of the things that's so interesting, you know this perfectly well, is that economists
01:09:28.260 and biologists have been betting against each other really formally since the mid-60s.
01:09:33.080 And the biologists like, what's his name at Stanford, Paul Ehrlich, who's been screeching
01:09:39.160 since the mid-60s about the fact that we're Malthusian organisms doomed to extinction.
01:09:44.980 He's been wrong in every single one of his predictions and publicly and massively, and I would say even
01:09:51.820 murderously wrong in some fundamental sense.
01:09:54.920 So he predicted that there would be way too many people on the planet by the year 2000.
01:10:00.000 That was seriously wrong.
01:10:01.500 He predicted that all of the prices of our commodities were going to spike through the
01:10:06.620 roof as everything became more and more scarce, as there became more and more of us.
01:10:10.460 That was 100% wrong.
01:10:13.280 He had a famous bet with Julian Simon, and Simon collected, I think, just after the turn
01:10:17.920 of the millennium, because Simon, the economist, claimed that, no, no, quite the contrary.
01:10:22.280 As there are more people, there'll be more commodities, there'll be more resources, and the prices
01:10:27.160 will fall.
01:10:27.720 He even offered Ehrlich the opportunity to pick the basket of commodities they would
01:10:33.040 bet on.
01:10:34.340 And Ehrlich got stomped.
01:10:36.060 And what we've seen continually, time and time again, is the economists have been right,
01:10:40.920 which is that there's no limit to human ingenuity.
01:10:44.200 And that also means that if we got food, water, sanitation, opportunity to the billions of people
01:10:52.260 that don't have it, we would produce a few more spectacular hypergeniuses like Elon Musk,
01:10:58.000 let's say, or Norman Borlaug.
01:10:59.580 And God only knows what sort of efficiency they can produce for us.
01:11:03.200 So the idea that we could convert natural resources into human cognitive ability, that seems like a
01:11:10.200 pretty damn good trade from the perspective of economic flourishing and environmental
01:11:14.920 sustainability.
01:11:16.100 Yeah.
01:11:16.360 So just two points in that.
01:11:17.800 I think it's incredibly important to remember that it's not just the Elon Musks and the
01:11:23.600 Norman Borlaugs that make up the world.
01:11:25.820 Also because, you know, you very early on said that I've done all this great
01:11:31.660 work, and thank you very much.
01:11:33.280 But this is the work of literally many hundreds of the world's top economists that I've helped
01:11:39.100 sort of shepherd together.
01:11:40.760 But there are lots and lots of people involved.
01:11:43.000 And likewise with Norman Borlaug, I'm sure that's also true with Elon Musk.
01:11:46.560 And so the fundamental point is this is about getting everybody on board with this.
01:11:51.900 And that's also why I'm so excited we have this conversation, because I think this is about
01:11:55.560 telling people you don't need to be Elon Musk to be on the positive side of history.
01:12:01.140 You need to make sure that you're pitching into this very long battle in order to make
01:12:06.360 the world better.
01:12:07.660 And again, also, sorry, I'm just being the nerd here, right?
01:12:10.660 But of course, nobody's 100% wrong.
01:12:13.760 Yeah, I get that this is a sort of metaphorical 100%, but the argument here is more to
01:12:20.360 recognize something about a lot of biologists. Julian Simon actually wrote about that.
01:12:24.740 The guy you mentioned who did the bet with Paul Ehrlich, he said it is very curious
01:12:30.340 how most of the people who think the world is going to end are typically natural scientists.
01:12:36.500 They're typically biologists or biologist-inspired.
01:12:40.260 And I think that is because the models that cover those, and it's not just mold, but it's
01:12:45.120 also, you know, fox and rabbit populations that will sort of interact, and then there's
01:12:52.260 too many rabbits, and then there's too many foxes, and so on.
01:12:54.960 Those are all models of individuals that act without foresight.
01:13:00.960 And it makes sense.
01:13:01.840 That's how, you know, 99% of nature is.
01:13:04.880 But we're not that.
01:13:05.940 We actually not only know how to think ahead of ourselves, that's, of course, why we're
01:13:11.220 having this conversation.
01:13:12.180 That's why we're worried about things like climate change.
01:13:15.040 And again, you mentioned very early in the program, you know, that there's sort of evolutionary
01:13:19.580 reasons why we tend to be worried about stuff.
01:13:23.760 I heard this one guy say, you know, we're the descendants of the guys who worried about
01:13:28.660 the saber-toothed tigers.
01:13:29.780 You know, the guys who said, oh, it'll be fine, are the ones who, you know, didn't get
01:13:33.760 to pass on their genes.
01:13:35.120 So in that sense, it makes perfect sense.
01:13:37.560 And we should be happy that there's a lot of people out there pointing out, this might
01:13:41.940 be a problem.
01:13:42.660 Oh my goodness, this could actually be a problem.
01:13:44.880 We just shouldn't believe that all of those problems are then all of the ends of the world.
01:13:50.760 Because if our evidence has ever told us anything—
01:13:53.220 Or they're all the same problem.
01:13:54.300 Yes.
01:13:55.220 If our evidence has ever told us anything, sorry.
01:13:57.660 That is that overall, we have enormously succeeded.
01:14:02.880 I just want to give you one other data point.
01:14:05.540 You know, in 1900, the average life expectancy on the planet Earth was 32.
01:14:11.060 Today, it's like 74.
01:14:13.740 We have more than doubled our lifetime.
01:14:16.020 Each one of us has gotten twice the amount of life on this planet.
01:14:20.880 That's just astounding.
01:14:22.600 And actually, it hasn't stopped. Because of technology and because of medical advancement,
01:14:27.340 but also because we get better, you know, sanitation and everything else,
01:14:30.820 we actually continuously see life expectancies go up.
01:14:34.700 It hasn't done so in the U.S. and we could talk about that.
01:14:37.600 But globally, it absolutely does.
01:14:39.940 And it still does.
01:14:40.960 In the sense of every year you live, you add about three months to your life expectancy.
01:14:47.360 How amazing is that?
01:14:48.460 A stunning statistic.
01:14:50.020 Yeah, that's true.
01:14:50.800 Okay, so let's go back to the biological metaphor here.
01:14:55.440 Okay, so we've already established, I hope for everyone listening, that we are not mold in a petri dish.
01:15:00.480 That's a bad metaphor.
01:15:02.000 So now you said, well, maybe we're foxes and rabbits.
01:15:04.580 And there is this nature red in tooth and claw that's going to modulate our population by necessity.
01:15:12.280 And that's sort of the biological argument.
01:15:13.940 That even though we're not mold, let's say, we're more like foxes or rabbits.
01:15:18.920 And we're going to multiply until the predators take us out or something like that.
01:15:22.980 But I would say, here's why.
01:15:25.380 Biologically, that's not true.
01:15:28.100 So Alfred North Whitehead, I believe it was, said the reason we think is so that our ideas can die instead of us.
01:15:37.100 So here's the human cognitive transformation.
01:15:41.220 So imagine that you go do something stupid and you get killed.
01:15:45.420 Well, that's not so good.
01:15:46.420 Now you're dead.
01:15:47.060 But your pattern, even your DNA, is no longer around.
01:15:51.000 That's a dead end.
01:15:52.040 It's an evolutionary dead end.
01:15:53.220 You did something stupid and now you don't exist.
01:15:55.420 And neither do your descendants.
01:15:57.360 Okay, so that's how animals work.
01:15:59.940 But that's not how human beings work.
01:16:01.760 Because we've taken this new leap.
01:16:04.080 And our leap is, well, let's make a virtualized self.
01:16:08.000 Let's make an avatar in imagination.
01:16:13.300 Let's play out a few different scenarios of what might be.
01:16:17.460 Let's allow the stupid scenarios to die before we implement them.
01:16:23.600 And then let's do that broad scale.
01:16:25.360 That's why we have free speech.
01:16:26.680 So what you and I are doing in this conversation, to the degree that it's successful, and this is what everyone who's listening is doing too, is that we're undergoing a sequence of micro-deaths and micro-rejuvenations.
01:16:39.480 And so you'll offer an idea, and then I'll criticize it or add to it and kill some of it and shape it.
01:16:47.540 And then you'll take that and you'll kill some of it and we'll toss it back and forth with the hopes that by the time we finally implement it, we actually won't have to die.
01:16:58.600 And so human beings have substituted the ability to think abstractly, which is partly to die abstractly, for the process of real death.
01:17:08.280 And in principle, if we are capable of maintaining open dialogue and engaging in critical thinking, we can make most of the death that would otherwise be necessary to control our populations virtual.
01:17:20.440 We don't actually have to act it out in reality; we act it out in simulation, and then we only implement the ideas that seem to be productive.
01:17:27.880 And we're actually really good at that.
01:17:29.440 The whole human enterprise is precisely that.
01:17:31.880 So that's another reason why the biological model is just not tenable.
01:17:35.200 We are not of the same kind even as foxes and rabbits, certainly not as mold, and certainly not as cancer or viruses.
01:17:44.460 And so the biologists who are thinking seriously, they have to take this into account.
01:17:48.180 And I don't think they are, you know, to paint with a broad brush.
01:17:53.200 No, and I think that is why you have all these economists telling us, actually, in many ways, we have moved on and we're much better at fixing problems.
01:18:04.780 And, again, I think it gives a better way of thinking about problems that you say, yes, it's great that there are people out there pointing out problems.
01:18:12.660 I'm happy that organizations like Greenpeace are there because they point out and nip at the heels of, you know, corrupt officials or governments that don't do their job and simply tell us these are potential problems.
01:18:26.240 But it shouldn't be taken as, oh, my God, that means we're all going to die.
01:18:30.880 No, it means here is another of the many, many problems that have beset us from all, you know, time immemorial and that we've also fixed and typically fixed in a way that actually left the world better, not worse off.
01:18:43.460 The environment is not purchased at the expense of the economy, and the economy is not purchased at the expense of the environment necessarily.
01:18:51.620 They can work in harmony.
01:18:53.540 And we know that.
01:18:55.100 Well, how do we know that?
01:18:55.920 We know that because as you accelerate people up the GDP production curve, so every individual is making more money, you get to a point where people, as we already pointed out, start to take a longer-term vision.
01:19:09.840 And that vision includes environmental maintenance.
01:19:12.200 So let's say we want a vision for the planet on the environmental sustainability side.
01:19:16.840 So how do we do that?
01:19:18.780 Well, why don't we produce as many people as we possibly can who are as concerned as they can be that their relatively local environment, the one they can actually control, is as green, productive, and sustainable as possible?
01:19:32.640 We want billions of people working on this, not just a few.
01:19:36.220 Well, how do we get billions of people working on it?
01:19:38.760 Well, we help them with cheap energy and the provision of plentiful food.
01:19:42.900 We help them provide security for their family and opportunity for their children.
01:19:47.740 And then we enable them to take a longer-term view.
01:19:50.860 And they'll automatically start attending to the sorts of concerns that hypothetically predominate among the environmentalists.
01:19:59.200 And the data that that's going to happen is very clear.
01:20:01.660 Yeah.
01:20:02.140 So just to take a step back, the economists typically call this the inverse Kuznets curve.
01:20:09.900 So basically what you see with most environmental problems, as you get richer, first problems increase.
01:20:16.680 You know, you get more air pollution as you industrialize in China or in India.
01:20:20.220 And then once you get sufficiently rich that this has actually meant now your kids are not dying, you have enough food, then you start to worry and say, I'd actually like to cough less.
01:20:29.700 And so you get the other side of this.
01:20:31.500 So there is a sort of intermediate disconnect.
01:20:35.700 So once you start getting people out of poverty, they actually get more pollution.
01:20:40.500 Now, if you lived in that situation, you would undoubtedly make the same decision.
01:20:44.700 You would say, I'd like to have more food and more opportunity for my kids, and then I'll cough a little bit.
01:20:49.600 But that's, remember, what we also did back in the 1800s, when pretty much all cities in Europe and the U.S. were terribly, terribly polluted, but we were getting richer and richer.
01:20:59.780 So there is this disconnect for a short time.
01:21:03.520 But it's very hard to imagine that the right way is to say, well, then let's not at all start down the route of getting much better off and actually living in a world that we both like and where we'll actually be worried about the environment in the long run.
01:21:17.060 Well, it's not like the developing countries are going to go along with this.
01:21:22.240 No, they aren't.
01:21:22.540 They're just going to tell us colonials to go take a flying leap, which is exactly what they should do.
01:21:27.360 Because basically what we're telling them is, well, you know, we got pretty rich, and we're pretty happy to fly in our private jets to Davos and think about the globalist utopia.
01:21:36.820 But we don't think you guys should have any of that.
01:21:39.400 And, you know, the faster you get at being poor, the better.
01:21:42.040 And there's just absolutely no likelihood at all that places like India and China are going to do anything but lift a middle finger to us when we do that, and rightly so.
01:21:51.840 And then on the pollution front, we should differentiate that a little bit.
01:21:56.640 It is true that as the world got more industrialized, and that will happen in places like China and India, that air pollution increased, for example.
01:22:06.340 Particulate production increased, but you could argue that it decreased inside houses even as it increased outside them.
01:22:16.200 And so even that wasn't a clear, like, downside on the pollution front, because, well, your work has indicated this quite clearly, or at least you've brought it to people's attention.
01:22:26.020 In the developing world, because people burn dung and wood, very low-quality fuel with high particulate content, many, many young people around the world are dying every year, and elderly people as well, because of indoor air pollution.
01:22:41.240 Yeah. We have no sense of this.
01:22:43.360 So, you know, three and a half billion people in households, mostly in the very poor South, they basically, as you say, cook and keep warm with dirty fuels like dung and cardboard.
01:22:54.180 And the impact of that, according to the World Health Organization, if you look inside huts, if you've ever been in one in Africa, they're terribly polluted inside,
01:23:06.000 is equivalent to smoking two packs of cigarettes every day for three and a half billion people.
01:23:12.380 It's not surprising this kills millions of people every year.
01:23:15.200 And, again, it's not to say, you know, if anything, this just simply makes us realize that there are a lot of different problems.
01:23:23.000 And some of them you solve very simply by getting people out of absolute poverty.
01:23:27.880 Not only do they stop dying from not having enough food and getting easily curable infectious diseases, but also they stop dying from indoor air pollution.
01:23:35.740 One of the first things they do is they get a stove that actually runs on natural gas.
01:23:41.420 Remember, that's why we're not, you know, afraid of going into our kitchens in the rich world.
01:23:45.680 And so it should be in the poor world.
01:23:48.560 And we need to have that conversation.
01:23:50.680 We need to understand.
01:23:51.620 And that's, of course, why overall getting to developed country status is something that almost everyone aspires to and certainly something that's worth having.
01:23:59.140 Okay, so let's talk about some of the, so people who are listening, we've spent a lot of time in the philosophical realm and in the relatively low resolution realm trying to lay out the underlying conceptual landscape and to, what would you say, delineate something like a metaphysics of optimism.
01:24:20.580 That might be a good way of thinking about it, but one of the things that's admirable about your work is that you also concentrate on the devil in the details.
01:24:31.800 And so we wrote an op-ed recently that got a fair bit of distribution on a couple of problems that we could solve globally, let's say, we could address at a relatively low cost.
01:24:45.540 Billions of dollars instead of trillions, and so that's like one thousandth the cost for people who want to do the mathematics.
01:24:52.300 So what do you see, what could people think about in terms of low-hanging fruit?
01:24:58.480 What are things we could address in the next 10 years to speed the process of improvement and to address both economic and environmental issues simultaneously?
01:25:07.500 Yeah, you know, it's a great question.
01:25:09.720 And that's really what I've been spending the last couple of years on and really a very large part of my career is basically engaging people and saying, there are lots of problems.
01:25:19.420 And we should be honest about that.
01:25:21.100 The world still has lots and lots of problems.
01:25:23.800 Some of them are very hard to solve.
01:25:26.280 Some of them are very easy to solve.
01:25:28.460 If that's true, why wouldn't we want to solve the easy ones first?
01:25:32.360 Some of them are incredibly expensive to solve.
01:25:34.460 Some of them are very, very cheap to solve.
01:25:35.960 Why wouldn't we solve the cheap ones first?
01:25:37.540 So what we try to go for is simply, as you said, the low-hanging fruit, saying of all the different problems in the world, where are some really smart solutions?
01:25:46.200 And it typically ends up being such that you can't solve all of the problem.
01:25:50.080 Remember, we rarely solve all of any problem.
01:25:53.920 You know, you don't go to university to learn everything.
01:25:57.300 You go to university to learn enough.
01:26:00.580 You don't, you know, well, I could go on with that metaphor.
01:26:03.240 I don't think I will.
01:26:03.860 But the point here is to say you need to find out when is enough enough.
01:26:10.160 What are the really smart things?
01:26:11.520 So take one thing that we actually wrote about in the op-ed.
01:26:16.140 Everyone needs education.
01:26:18.680 One of the reasons why countries have gotten rich is that people have learned reading, writing, communicating, understanding, and becoming much more productive citizens.
01:26:30.820 So if you look back in 1800, almost the entire world, except for a very tiny sliver of the aristocracy, were basically illiterate.
01:26:41.740 We are now in a world where more than 90% are at least technically literate.
01:26:46.640 We've moved an enormous amount of way.
01:26:48.560 And that's why a lot of rich countries are rich.
01:26:51.380 That's why we're well-off.
01:26:52.840 That's why we have the human flourishing that we have.
01:26:55.560 So this is incredibly important to understand.
01:26:57.880 We believe that almost half of the difference between being poor and being rich is whether you have an educated population.
01:27:06.160 Now, nobody disagrees.
01:27:07.540 Yes, we should all have educated people.
01:27:09.860 But the truth is, it's really, really hard.
01:27:12.620 We know that in rich countries because we have that conversation constantly.
01:27:16.300 How do we make our schools better?
01:27:17.780 But it's much, much clearer in the rest of the world.
01:27:21.900 So we look a lot on what the World Bank calls the low-income and lower-middle-income countries.
01:27:27.020 So that's about half the world's population.
01:27:29.060 It's 4 billion out of the 8 billion people on the planet.
01:27:32.160 So you could say it's the poor half of the world.
01:27:35.160 In that part of the world, when you do studies, you probably heard about the PISA studies.
01:27:40.240 You know, we try to find out how good students around the world are at doing different things.
01:27:45.020 So there's similar kind of studies done across pretty much all of the world.
01:27:50.520 It turns out, and this is terrible, so there's 650 million kids in school.
01:27:56.460 So kids and adolescents in school in the lower 4 billion of the world, the low- and lower-middle-income countries.
01:28:04.180 Of these kids, 80% cannot read and do math in any reasonable way.
01:28:13.080 And let me just give you an example of what that means.
01:28:15.540 It's not, you know, rocket science.
01:28:17.260 It's, for instance, you let them read a statement like this.
01:28:20.860 Vijay has a red hat, a blue shirt, and yellow socks.
01:28:26.380 What color is the hat?
01:28:29.320 80% of the kids can't answer this when they're 10 years old.
01:28:33.280 Likewise, a 10-year-old's math question would be, we have six pieces of cheese.
01:28:39.560 What is the way to divide this between two people so each gets the same amount?
01:28:47.160 And again, about 80% can't.
01:28:48.980 The right answer is three, by the way.
01:28:51.320 But, you know, this is really depressing.
01:28:53.880 And the truth is, we don't know how to fix this.
01:28:57.260 We know a lot of ways that don't work.
01:28:59.300 So, for instance, in Indonesia in 2001, they were spending about 10% of public expenditure on education.
01:29:08.880 So, a lot of money on education.
01:29:10.840 But they decided, we're going to do more.
01:29:13.180 So, they decided, we're going to put in the constitution that we need to spend 20% on education.
01:29:19.900 That's, you know, potentially a great thing.
01:29:22.180 You really want to help the country.
01:29:23.640 So, what they did was they built a lot of new schools, they got many more teachers, so they went up from 2.7 to 3.8 million teachers.
01:29:32.780 They now have one of the lowest class sizes in the world.
01:29:37.100 They have great teachers.
01:29:38.500 They have lots of teachers, and they're really well-paid.
01:29:42.000 Unfortunately, you couldn't tell the difference in the outcome on students.
01:29:46.280 They were still just as bad.
01:29:48.180 And, of course, what that tells you is, so there's this wonderful study, it was called Double for Nothing.
01:29:54.780 You know, we basically, in Indonesia, paid twice as much and got nothing out of it.
01:29:59.920 That's the worst kind of way to try to help the world.
01:30:03.820 Now, a lot of people will make these arguments.
01:30:06.160 We need to pay teachers more.
01:30:07.920 It turns out that if you pay teachers more, they become really, really happy, which is not surprising.
01:30:12.820 But it doesn't actually increase learning by students.
01:30:16.360 Likewise, if you make class sizes smaller, it has virtually no impact.
01:30:21.180 What is the main problem here?
01:30:23.380 The main problem is it's really hard to teach a lot of kids.
01:30:27.520 So, you know, say there are perhaps 60 kids in a class in the typical Global South.
01:30:33.900 They're all 12-year-olds, but they're wildly different abilities.
01:30:38.120 You know, some of them are incredibly bored because they know all the stuff and they want to go on to the next class.
01:30:43.460 Many of them have no clue what's going on.
01:30:45.840 No matter what the teacher tries to do, it's always going to be wrong for most of the students.
01:30:51.480 This is why, you know, a lot of people have then tried to say, are there ways to solve this?
01:30:55.620 And the answer is yes.
01:30:57.100 These are the Norman Borlaugs, if you will, of the world who've come up with new and interesting and amazing ideas.
01:31:03.820 I'll tell you about one of them.
01:31:04.860 So this one is about getting basically a teaching aid on a tablet.
01:31:11.220 So it could be an iPad, but it'll probably be a cheaper, you know, knockoff Android kind of thing.
01:31:16.520 And then it teaches you at your level.
01:31:19.640 So what it does is it starts asking some questions and you can pretty quickly find out what is actually your level.
01:31:26.440 And then it'll teach you through that school year.
01:31:29.940 So one hour a day.
01:31:32.160 This is partly because then you can actually still have most of the school running as it usually does.
01:31:37.120 Then it also means you can share the tablet with many, many other students over the day.
01:31:43.120 If you have one hour a day for a year, the amazing thing is we now know that you can triple the learning.
01:31:51.840 You can actually make kids learn three times as much as if they'd gone to school for three years.
01:31:57.820 Now, remember, it's a low-quality school, so it's not as amazing as it sounds, but it's still much, much better.
01:32:03.240 Yeah, but it's an improvement.
01:32:04.360 Well, what it means, Bjorn, is that you're now putting children for an hour a day into the only place that learning actually takes place.
01:32:14.300 So we've known this, psychologists have known this for 100 years.
01:32:17.760 So this psychologist named Vygotsky, Russian psychologist, came up with this notion called the zone of proximal development.
01:32:25.240 And what he noted was that parents spontaneously speak to their infants and their toddlers.
01:32:34.360 At a level that slightly exceeds their current comprehension level.
01:32:40.500 And they do this automatically.
01:32:41.940 And so you can imagine that there's a horizon of learning.
01:32:44.880 And the horizon of learning is the place that's optimally challenging for you.
01:32:49.820 That's the only place.
01:32:50.900 That's also, by the way, on the border between order and chaos, technically speaking.
01:32:55.220 That's the only place that learning ever takes place.
01:32:57.520 And so if you have a classroom full of 12-year-olds, say 60 of them, some of them have an IQ of 70, which means no matter how hard you try, no matter how much effort you expend, you'll never get them beyond the basics of rudimentary literacy.
01:33:13.780 And some of them have IQs of 145, which means those are kids who could learn to read at 1,200 to 1,500 words a minute and who'd be capable of operating at the highest end of cognitive development.
01:33:25.240 They're all in the same class.
01:33:26.920 Well, obviously, you can't pitch to the middle of that because, as you said, you'll make a shambles of it.
01:33:31.360 But the data that you're laying out in terms of the effectiveness of this technology is an indication of the utility of finding that zone of proximal development.
01:33:41.160 That's what people talk about, by the way, when they talk about the zone, being in the zone.
01:33:45.960 And I like the particularity of your solution, too, because you're saying, well, look, we need to educate people.
01:33:52.460 We need to educate them because educated people generate more of the wealth that provides security and opportunity.
01:33:58.020 Literacy is core to that and, say, basic numeracy.
01:34:01.840 You can't even operate a computer without that.
01:34:04.400 And then we have some very efficient technological strategies that are also cost effective where we can target and solve that particular problem.
01:34:13.200 It's very particularized.
01:34:14.560 And so and that's also a lovely vision.
01:34:16.600 It's like, well, why not make education cheap and useful?
01:34:19.980 It's incredibly simple.
01:34:21.860 And also remember, there are lots of other solutions.
01:34:24.580 Don't first focus on those.
01:34:26.060 Focus on these incredibly effective ways.
01:34:29.320 So, you know, we talk about this, and there's lots of different data on it: how do you make sure these tablets don't get stolen?
01:34:35.840 So you need a place that you lock them up for the night.
01:34:38.820 How do you recharge them?
01:34:40.120 You need a solar panel.
01:34:41.360 You need all that cost.
01:34:42.480 You also need some people to operate them.
01:34:43.880 Right, right.
01:34:44.100 But all of this has actually been proven.
01:34:46.700 So one of the things that we've helped do, and this is by no means just us, is now Malawi, one of the poorest countries in the world, is actually aiming at, over the next four years, to spread this out to all of their schools.
01:35:00.580 So that's more than 3,000 schools, and they're going to get this out to all first through fourth graders.
01:35:08.000 That's an amazing achievement.
01:35:09.480 Again, this is the kind of thing that will make Malawi richer because they'll become more productive in the long run.
01:35:15.660 So we do the whole cost and benefit calculation.
01:35:19.100 So this will have real costs.
01:35:20.640 You know, we're talking about several billion dollars if you were going to do this globally, but billion is a very low number when you're thinking about how much we're spending on some of these other problems.
01:35:29.460 As you mentioned, it's not a trillion.
01:35:31.420 And secondly, it's going to dramatically improve all of these countries, so that they'll both be better off in terms of human flourishing and, in the long run, have a better environment.
01:35:41.400 We actually then try to say, well, so for every dollar you spend, how much good do you end up doing?
01:35:47.520 Turns out that you make so much good that it's equivalent to $54 of social good in the long run.
01:35:55.260 So every dollar in, $54 out.
01:35:58.520 And this is based on realistic assumptions.
01:36:00.720 Right, which is $54 more that you could go off and do some more good with, too.
01:36:06.860 Yes, yes.
01:36:07.340 Although, remember, again, much of this is somewhat out in the future because that's what education is all about.
01:36:13.760 Of course, but we have this reflexively anti-capitalist notion in the West often that lurks at the bottom of some of the metaphorical realms that we've been discussing, that there's something suspect about wealth.
01:36:27.120 But the thing about wealth is that if you use it ethically, it's life more abundant.
01:36:32.760 Part of the utility in gathering wealth is that, hypothetically, you can, first of all, make more wealth with it, but you can also do good things with it.
01:36:42.660 And so it isn't like Scrooge McDuck luxury in the money bin that we're talking about here.
01:36:47.800 It's the opportunity to make things better for people.
01:36:50.060 Now, the other thing we wrote about in that op-ed, you just talked about the education front.
01:36:53.920 Maybe we can close with this next proposition.
01:36:57.660 And by the way, for those of you listening and watching, this is just one of many, although Bjorn's done a very good job of trying to rank order these in terms of, well, let's do the cheap and easy things first.
01:37:07.680 We might as well hit the ball out of the park, at least in the places we can.
01:37:11.520 And maybe we can save some of the problems we don't know how to solve for the future, which is a perfectly reasonable way of going about it.
01:37:17.880 We also talked about the provision of nutritional aid to pregnant women and women who have infants.
01:37:25.920 And so do you want to just walk through that briefly?
01:37:28.060 Just very briefly, again, we've gone from a world over the last 100 years.
01:37:32.820 In 1928, we estimate about two-thirds of the whole world was deficient in food.
01:37:39.400 So basically, we were starving.
01:37:41.800 Two-thirds of us were starving.
01:37:43.620 Today, that number is down below 10%.
01:37:45.720 But that's still very high.
01:37:47.140 That's still about 800 million people.
01:37:48.880 So there's still a big problem.
01:37:50.600 Again, we can have diminishing problems and still want to do something about it.
01:37:54.760 How do you fix this?
01:37:55.800 Well, in the long run, you actually need to get much better productivity in agriculture.
01:37:59.380 And we have some other solutions for that.
01:38:01.380 But it turns out just giving out lots and lots of food is not only very expensive because you actually have to distribute it,
01:38:07.580 but it also typically destroys local agriculture.
01:38:11.000 If you give out all the food, then they don't have any incentive to produce it next year,
01:38:15.000 and you can very easily end up making more problems.
01:38:18.100 Here is, we have a few really good solutions.
01:38:21.420 One of them is this, and that's the one for pregnant women.
01:38:24.580 So one of the things with pregnant women is they basically start off the next baby, right, in their bodies.
01:38:31.080 And they typically have very low levels of vitamins and minerals and nutrients.
01:38:36.440 Now, again, if we gave them all the nutrients, that would probably be better, but that turns out to be really hard.
01:38:41.080 But we can give them all the necessary micronutrients and vitamins in a pill.
01:38:46.740 We already do that in a simple sense because we give them what's known as vitamin A and—oh, I'm sorry, I'm just blanking on that.
01:38:57.780 Yeah, some other thing.
01:39:00.120 That's sort of in the standard package from the World Health Organization.
01:39:03.540 But if we expand that, the pill will be a little more expensive.
01:39:07.000 But remember, we already have the whole infrastructure in place to hand out this pill.
01:39:10.840 It's one pill every day for every pregnant woman.
01:39:13.720 That's about 50 million pregnant women we're talking about every year.
01:39:18.160 So it's not a small operation, but it'll actually be fantastically cheap because we just need to exchange that pill.
01:39:24.320 It's already being done there.
01:39:26.600 We just need to get many more produced and get them distributed and get some more information out.
01:39:31.400 And if you do that, it means that the child will be born with a better possibility of developing its own mind.
01:39:41.600 We know that because if you're not born premature or at a low birth weight, you have better chances in your life.
01:39:48.000 So we can basically give about 50 million kids better chances next year.
01:39:53.840 If we did this, it'll cost in the order of $140 million.
01:39:57.240 So it's, you know, literally peanuts that we're talking about.
01:40:01.020 But if we did that, these kids would grow up.
01:40:04.180 It would basically mean that somewhere between 2 and 3 million more kids would be better at school.
01:40:10.360 They would go longer.
01:40:11.480 They would learn more.
01:40:12.560 They'd become more productive.
01:40:13.800 They would help their countries become richer.
01:40:16.200 Again, this is a simple thing and it's a very well-documented thing.
01:40:20.620 And we know that if you spend $1 here, you can do about $38 of social good.
01:40:27.160 And again, this is not just, you know, making up more value, or Scrooge McDuck as you were talking about.
01:40:33.680 This is actually making sure that really poor women can make the life for the next generation much better.
01:40:40.980 The point here is not that this is what you should be doing, but it's one of the many things that you could choose and say,
01:40:47.100 look, I want to do something that actually matters and that has huge impact rather than just go with,
01:40:53.740 oh my God, the world is terrible, and then I'm just going to give up, or I'm just going to go for this cheap, easy virtue signaling.
01:41:01.720 This is about, there are a lot of technologies.
01:41:03.980 There are a lot of innovations where we can actually make a lot of good at very low cost, very effectively.
01:41:09.380 Isn't that what we should be doing for 2023?
01:41:11.600 Okay, so let's sum up this conversation.
01:41:15.200 I'll sum it up briefly and then let's see what you have to add or subtract from that.
01:41:20.540 So we started by talking about the underlying metaphysic or even theology of the current worldview
01:41:28.440 and laid out the proposition that we tend to insist that young people see the planet as a fragile virgin
01:41:35.620 and culture as a rapacious predator and the individual as a parasitic predator.
01:41:40.400 And that that has its echoes in the a priori religious landscape, but that it's a very one-sided story.
01:41:48.860 And the corrective to that story is, well, nature can be pretty damn hard on us
01:41:54.020 and needs to be tamed and controlled in a manner that's sustainable.
01:41:58.180 We should be extremely grateful for what our culture has provided us with,
01:42:02.140 not least the ability to look at nature as though it was benevolent.
01:42:05.400 And we shouldn't be so damn hard on individual people because for all their flaws,
01:42:10.180 they can be Norman Borlaug, for example, or the people who worked along with them,
01:42:14.900 who are genuinely contributing not only to a much more productive and generous economy,
01:42:21.700 but also doing that in a manner that's beneficial on the environmental front.
01:42:26.340 And so we need to balance our viewpoint and we need to stop terrifying young people into apocalyptic nightmares
01:42:32.160 by insisting that the world is going to collapse and that all the power should be given to terrified, centralized tyrants.
01:42:40.020 So that's the first part.
01:42:42.700 Then we talked a little bit about the motivational landscape for that worldview,
01:42:47.400 is that people are being enticed into these apocalyptic views partly by being offered an easy pathway to moral virtue
01:42:56.140 when they're susceptible to that need.
01:42:58.940 And so it's all about climate.
01:43:00.880 It's all about carbon dioxide.
01:43:02.160 There's only one problem.
01:43:03.620 If you're just concerned about it, that means you're morally virtuous.
01:43:07.160 You've identified the enemy and now you have nothing else to do.
01:43:10.160 That's a bad model.
01:43:11.340 The proper model is for people to develop a theory of the world and of their own action
01:43:19.140 that's as sophisticated as the actual problem set
01:43:22.740 and to be willing to devote mature time and energy to the solution of a set of real problems which we could solve.
01:43:30.520 And then we sort of closed the conversation, mostly you did that,
01:43:34.300 by delineating a couple of the areas that basically constitute low-hanging fruit.
01:43:41.340 We could educate poor people for a relatively low outlay economically,
01:43:48.540 certainly one that would produce a tremendously high return on investment.
01:43:52.140 We already know how to do that.
01:43:53.480 The infrastructure is already in place.
01:43:55.120 And we could do the same thing on the nutritional front.
01:43:57.760 And then we'd have fewer starving and stunted children, and they'd all be more well-educated.
01:44:02.520 And then we also pointed out, being wealthier and being more intelligent,
01:44:08.540 they'd also be more likely to take a long-term view of, let's say, ecosystem sustainability,
01:44:14.360 and they'd start to work in a distributed manner to serve the environment locally.
01:44:19.400 And so, like, why the hell is that a bad idea?
01:44:21.700 That seems to me like a really good idea.
01:44:23.760 So, anyways, that's my summary of the conversation.
01:44:27.240 Do you have anything that you want to add or subtract to that?
01:44:30.140 I think it's a great summary.
01:44:32.060 I think it really just boils down to: you've got to stop believing that this is the end of the world.
01:44:38.600 That's not what the data shows us.
01:44:40.280 And we've talked about that for a number of different things.
01:44:43.040 So, we live much longer.
01:44:45.140 We're much less poor.
01:44:47.140 So, there are many fewer poor people.
01:44:49.300 Indoor air pollution, for instance, has gone down dramatically.
01:44:52.560 We know how to solve many of these problems, and we are a smart species, so we will keep on learning how to fix this.
01:45:00.160 Yes, this is not about moral virtue and just showing up and saying, I want to do good.
01:45:05.860 This is about the long, hard grind of the Norman Borlaugs and all these other guys that actually helped us think out what are smart ideas.
01:45:14.940 That's what this third part of the conversation was really about.
01:45:19.920 There are a lot of smart things.
01:45:21.440 Do you want to be part of that?
01:45:23.040 I would love for all of us to be part of that.
01:45:26.400 And I think that was part of why we also wrote this op-ed. But, you know, the new year is a time when you start talking about, so, what do you want to do?
01:45:36.060 What do you want to do for next year?
01:45:38.040 How do you want to look at the world?
01:45:39.620 Stop being scared.
01:45:40.860 Start thinking about, how can I help?
01:45:43.360 And wouldn't that be amazing if we actually had a lot more people saying, I want to help.
01:45:48.180 I want to be one of the guys who helped get, you know, these tablets out in a developing country somewhere.
01:45:54.620 I want to be the gal who focuses on making sure that we get these cheap tablets out to pregnant women.
01:46:02.840 I want to help push forward these very simple ideas and, of course, come up with new ideas.
01:46:09.400 This is the way we solve problems.
01:46:11.520 This is the way we actually make the world even better.
01:46:13.620 Right. Well, that was the other stream that I didn't summarize, is that we'd also talked about the fact that the mold-in-a-Petri-dish biological model of human existence is not appropriate.
01:46:25.240 Neither is the fox and rabbit model, is that we have the capacity to generate and kill new ideas constantly.
01:46:32.020 We're very good at testing them in those countries where there's freedom of expression and freedom of thought.
01:46:36.960 We're very good at testing those ideas.
01:46:39.060 We're very good at implementing them.
01:46:40.600 We've learned continually how to make more with less.
01:46:43.700 We're getting better and better at that in every possible way, especially as our computational power increases.
01:46:49.200 And so what that would mean is that if we could shed the apocalyptic pessimism and encourage young people to work diligently towards a mature and integrated vision of the economy and the environment,
01:47:03.080 invite them to participate, invite them to participate as people whose basic destiny is to make the world a better place for people and for nature itself,
01:47:12.560 that that's a viewpoint that's much better, that's much more likely to lift people out of abject poverty and also to produce a greener and more sustainable world.
01:47:24.180 Well said.
01:47:25.100 All right, Dr. Lomborg. Hey, for everybody who's watching and listening, I mean, first of all, thank you for participating in this conversation and, you know, and more power to you, by the way, on the upward and onward front.
01:47:39.860 There's no reason to be destroying your motivation by engaging in apocalyptic doomsaying when there's many things that need to be done and could be done that are productive and useful.
01:47:53.840 And many things that are positive that are beckoning and that have already made themselves manifest.
01:47:58.860 Bjorn noted, for example, that we've lifted a tremendous number of people out of poverty in the last 15 years, and we could do an even better job at that.
01:48:06.420 In the meantime, I'm going to talk to Dr. Lomborg for another half an hour on the Daily Wire Plus platform.
01:48:13.640 With all my guests, who are generally very successful and interesting people, I'm always interested in the manner in which their responsible destiny made itself manifest, right?
01:48:25.780 So one of the things that all young people contend with is the issue of where to find the central purpose, the meaning in their life.
01:48:31.880 And, you know, it's easy to get nihilistic and cynical about that and think that life has no purpose in the final analysis.
01:48:38.180 But I think that's a pretty gloomy and unwarranted supposition.
01:48:42.660 And one of the things I have seen among the people I've met whose lives are together and who are doing productive and generous things is they do find engagement in something that's truly meaningful.
01:48:53.200 And it does get them out of bed in the morning and help motivate them to be productive and not only for themselves and not only for their own gain, in case that has to be said, but so that they're working in a manner that's extremely socially responsible and meaningful in a reciprocal manner.
01:49:10.260 And so I'm going to talk to Bjorn about how his interests made themselves manifest in the early part of his life in my attempt to trace how such things come about.
01:49:18.900 And so we'll see you in January and Merry Christmas to you, by the way.
01:49:23.920 Hello, everyone.
01:49:25.060 I would encourage you to continue listening to my conversation with my guest on dailywireplus.com.