Bjørn Lomborg
Episode Stats
Length
1 hour and 32 minutes
Words per Minute
187
Summary
In this episode, Dr. Jordan B. Peterson interviews Bjørn Lomborg, a Danish author and president of the Copenhagen Consensus Center, a project-based U.S. think tank where prominent economists seek to establish priorities for advancing global welfare. Dr. Peterson and Dr. Lomborg discuss how to prioritize the things that do the most good, rather than the things that make us feel best about ourselves, and how cost-benefit analysis can direct the limited resources we have toward the places where they help the most. Dr. Lomborg is the author of The Skeptical Environmentalist, How to Spend 75 Billion Dollars to Make the World a Better Place, Cool It, and Prioritizing Development: A Cost-Benefit Analysis of the UN's Sustainable Development Goals, among many other works.
Transcript
00:00:00.960
Hey everyone, real quick before you skip, I want to talk to you about something serious and important.
00:00:06.480
Dr. Jordan Peterson has created a new series that could be a lifeline for those battling depression and anxiety.
00:00:12.740
We know how isolating and overwhelming these conditions can be, and we wanted to take a moment to reach out to those listening who may be struggling.
00:00:20.100
With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way in his new series.
00:00:27.420
He provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward.
00:00:35.360
If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better.
00:00:41.780
Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety.
00:00:47.460
Let this be the first step towards the brighter future you deserve.
00:00:59.780
You can support these podcasts by donating to Dr. Peterson's Patreon, the link to which can be found in the description.
00:01:06.840
Dr. Peterson's self-development programs, self-authoring, can be found at selfauthoring.com.
00:01:42.040
Bjorn Lomborg is a Danish author and president of the Copenhagen Consensus Center, a project-based
00:01:50.760
U.S. think tank where prominent economists are seeking to establish priorities for advancing
00:01:56.480
global welfare. He's the former director of the Danish government's Environmental Assessment
00:02:01.400
Institute in Copenhagen. Bjorn became known internationally for his book, The Skeptical
00:02:07.240
Environmentalist in 2001, How to Spend 75 Billion Dollars to Make the World a Better
00:02:12.920
Place, Cool It in 2007, which is also a movie, and, most recently published in 2018, Prioritizing
00:02:20.760
Development, a Cost-Benefit Analysis of the UN's Sustainable Development Goals, among many
00:02:27.040
other works and books. And so I'm hoping we'll have a very productive conversation today, and
00:02:32.460
welcome to this discussion. Thanks a lot, Jordan. All right, so why don't you just start
00:02:38.420
by letting people know what you've been up to over the last, let's say, two decades. I know
00:02:43.880
that's a very broad question. And why? I mean, I've been interested in talking to you because
00:02:48.660
I read a number of your books a few years ago, and I was interested in your economic analysis
00:02:54.500
as a way of determining which crises, let's say, or which potential crises are real, and
00:03:01.960
also how they might be managed most intelligently. Thanks a lot, Jordan. And so, you know, fundamentally
00:03:08.920
what I try to do is to say, and this really is very, very obvious, we only have a limited
00:03:14.420
amount of resources. So let's make sure we focus those resources on the places where we can help
00:03:20.660
the most, where we can do the most good. Remember, what is it that we mostly focus on in the world?
00:03:25.760
It's very often the things that get the headlines, the things that people talk about. And very
00:03:31.160
often that ends up being the things that have, you know, the cutest animals or the most crying
00:03:35.560
babies or the groups with the best PR. And surely that's not the right way to prioritize. The right
00:03:41.720
way should be to look at, if I spend a dollar or a peso or a rupee here, will I do more good
00:03:48.600
than if I spent that same dollar or rupee or peso over here? So basically asking across all the
00:03:54.360
different areas you can spend resources, where do I help the most? Now, obviously, that's a huge
00:04:00.220
conversation in and of itself, because it's not easy to just determine that. But the basic idea is
00:04:05.080
simply to say, let's focus on the places where you can do the most good, rather than the places where
00:04:11.380
it makes us feel the best about ourselves. So that's really what I've been trying to do for two
00:04:16.880
decades. And, you know, prioritizing the world and trying to say, where should you spend money,
00:04:22.400
of course, means that the projects that we say are the really, really great interventions,
00:04:27.940
they all love us and think we're like the best thing since sliced bread. But of course, the projects
00:04:32.820
and the policies that are not so effective, they think we're terrible. So, you know, it creates a lot
00:04:37.960
of antagonism and makes a lot of people annoyed and interested. But I think it's crucial to ask these
00:04:43.460
questions. Well, one question would be, what's the alternative? You know, like when I was looking
00:04:48.200
at the UN Development Goals, for example, if I remember correctly, there was something approximating
00:04:52.500
200 of them. And this was a few years ago, I worked on a UN panel. And I thought, well,
00:04:57.480
the problem with 200 goals is that you can't have 200. They're not goals if there's 200 of them,
00:05:03.620
because you absolutely have to prioritize in order to move forward, assuming some limitation on
00:05:08.240
resources, which is exactly what you just described. And so then the question
00:05:14.220
would become, well, how do you calculate benefit? And that's a really difficult problem, which is,
00:05:19.600
I think, why it wasn't addressed with the mishmash of 200 goals, apart from the fact that you're going
00:05:24.480
to offend people by rank ordering their priorities. So why don't you tell people a little bit about the
00:05:29.740
methods that you used? Because I think they're extremely interesting.
00:05:33.460
So you're absolutely right. On the sustainable development goals, we actually worked with the
00:05:38.440
UN back in 2015, when they were doing this. It was about 60 to 90 targets at that point; it was very
00:05:45.120
unclear. The UN ambassadors, I actually met with a quarter of them in New York, and to each one
00:05:50.500
of them I said, shouldn't we try to, you know, focus on the targets that would do the very most good?
00:05:55.140
And of course, each one of them said, yes, individually. But the combined effort of all
00:06:02.220
the UN ambassadors was, of course, not to actually do the best goals. It was to get everybody's goals
00:06:07.380
in there. So, you know, the Norwegians had three ideas, and the Brazilians had four, and everybody
00:06:12.040
else had, you know, three or four that they wanted in there. That's why we ended up with 169 targets,
00:06:17.720
which, of course, simply means you promise everything to everyone everywhere. And that means
00:06:22.620
a lot of people are going to be very disappointed when 2030 rolls around. And we haven't actually dealt
00:06:28.280
with all the things that we promised. Right. So what happens there is
00:06:32.640
that the people who are doing that, by including everyone's goals on the list, maximize the short-
00:06:38.440
term emotional well-being of the people who had been doing the consultation at the cost
00:06:43.160
of medium- to long-term progress. But then again, they're not going to be around in 2030, in all
00:06:47.940
probability, to suffer the consequences of that. Or at least, yeah, at least
00:06:52.860
nobody's going to see that we failed to do as much good as we possibly could, because
00:06:57.400
we will have done a little bit of good everywhere. But doing a little good everywhere is not nearly
00:07:02.920
as good as doing an enormous amount of good in the places where you can do the very most
00:07:07.300
good with extra resources. Again, remember, we're estimating the total cost of the SDGs is
00:07:12.740
somewhere in the range of two and a half trillion dollars. And the actual amount available is about
00:07:19.480
140 billion dollars. So we literally have five percent of what we're promising. So we're promising
00:07:25.640
the world. And then we say, hey, here's a small amount of money. And let's spread it thinly. So
00:07:30.600
everybody gets a little bit of it. Okay, so that's partly where you derive your premise
00:07:36.100
that we're dealing with limited resources is that you're actually using a real number. And the number
00:07:41.100
you're using is what's actually available. And when you wrote How to Spend 75 Billion, you basically
00:07:45.800
took half of what was available. That, yes, that was much, much earlier on. And that was
00:07:52.000
mostly sort of a, you know, it's a fun idea to say, how would you spend a specific amount? Yes,
00:07:56.800
and that was half. So we weren't saying we should spend all of it in the exact way that we
00:08:01.800
were talking about. Okay, so to agree with you, people have to agree that
00:08:06.880
everything can't be done for everyone all at once at infinite expense, and that it's useful
00:08:12.260
practically, and also, even in a utopian sense, in the desirable sense, to rank order,
00:08:18.960
so that the obvious money that's available, the money that's genuinely available can be targeted
00:08:24.180
best. And so then the next question would be, how do you go about that in the least controversial
00:08:29.420
and most empirically sound manner to do the rank ordering? So we use cost benefit analysis, which is a
00:08:36.700
very well established economic tool that tries to say, all right, for each of the proposals that you
00:08:41.640
come up with, how much will that proposal cost? Now remember, this is not just economics, most of
00:08:48.000
the costs will be money. But for instance, if you want to immunize small children, you also have to
00:08:54.560
ask the mothers to spend perhaps a day to go to the place where their kids will be immunized,
00:08:59.700
that'll both cost them labor, they can't do labor that day, maybe they'll have transportation costs,
00:09:05.380
they'll have food costs, there'll be extra other costs. So we try to add all of those costs up and say,
00:09:10.560
okay, so what's the total cost of this project? Then we look at all the benefits. And remember,
00:09:14.920
the benefits are both economic, yes, but they're also social, for instance, kids not dying or kids
00:09:20.260
not being sick. And they're often also environmental. So we try to take all of those, all of those
00:09:26.340
benefits, so both the economic, the environmental and social benefits, add them all up into one number
00:09:32.080
that is denominated in dollars or rupees or whatever your currency is. And then you can say,
00:09:37.680
well, for this many dollars, you can do this much good. And that means you can also say for every
00:09:43.220
dollar spent, you can do this much good. Now, obviously, I'm simplifying this. But in a sense,
00:09:49.280
what it means is if you do it right, and a lot of economists spend a lot of time trying to make
00:09:53.620
this right. If you have all the same parameters across all these different areas, it actually means
00:09:58.560
you can start comparing different interventions across all the different areas and say, where do you
00:10:04.280
get the biggest bang for your buck? And of course, that is what matters if you're actually going to
00:10:09.480
do good. So we did, and this is not, please don't buy this book, because this is a very long and
00:10:15.120
academic book. But we did this long and academic book with more than 50 of the world's top economists
00:10:20.420
looking across all these different areas. But the beauty is, you can actually put it in just one
00:10:26.560
chart. And I'm going to show you that. And then we'll also put it up on your website.
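To make the mechanics concrete, here is a minimal sketch of the kind of benefit-cost ranking Lomborg is describing: monetize every cost (including time and transport, as in the immunization example) and every benefit (economic, social, environmental), then sort interventions by dollars of good per dollar spent. All intervention names and figures below are hypothetical placeholders, not Copenhagen Consensus numbers.

```python
# Minimal sketch of benefit-cost ranking as described above.
# All figures are hypothetical placeholders, not Copenhagen Consensus numbers.
from dataclasses import dataclass

@dataclass
class Intervention:
    name: str
    monetary_cost: float          # direct spending, in dollars
    time_cost: float              # e.g. mothers' lost labor, transport, food
    economic_benefit: float       # benefits already converted to dollar terms
    social_benefit: float         # e.g. deaths and illness averted, monetized
    environmental_benefit: float

    @property
    def bcr(self) -> float:
        """Benefit-cost ratio: 'dollars of good per dollar spent'."""
        total_cost = self.monetary_cost + self.time_cost
        total_benefit = (self.economic_benefit + self.social_benefit
                         + self.environmental_benefit)
        return total_benefit / total_cost

interventions = [
    Intervention("Childhood immunization", 1.0e9, 0.2e9, 20e9, 50e9, 0.0),
    Intervention("Freer trade (Doha-style)", 20e9, 0.0, 2000e9, 100e9, 0.0),
    Intervention("Some low-value target", 5e9, 1e9, 6e9, 2e9, 1e9),
]

# The "menu": longest bars (highest benefit-cost ratio) first.
for iv in sorted(interventions, key=lambda iv: iv.bcr, reverse=True):
    print(f"{iv.name:26s} ~${iv.bcr:,.0f} of good per dollar spent")
```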
00:10:30.280
So this is the one page chart that has all the targets here. And for each of the targets,
00:10:38.100
there's an analysis that says, how much will this cost? How much good will it do? And then it shows
00:10:43.080
for every dollar spent, how much good will it do? If it's a long line, it'll do a lot of dollars of
00:10:48.360
good. If it's a short line, not so much. So it really becomes this very simple menu for the world
00:10:54.460
to say, where can you spend your resources? And of course, this doesn't mean, you know, just like when
00:10:59.640
you go into a restaurant and get a menu, it doesn't mean you buy the cheapest thing or the most
00:11:04.400
nutritious thing. Maybe you're in the mood for, you know, an unhealthy cake. But it's incredibly
00:11:10.360
important to know what is the cost and what's the benefit. And knowing this makes it a lot more
00:11:16.840
likely that the world is going to focus itself on some of the really long bars where it can do a lot
00:11:22.160
of good for every dollar or rupee or peso spent. Okay. So now, if I was thinking about critiquing
00:11:29.000
this, let's say from a social science or a political perspective, the first thing that I
00:11:35.240
would object to is, well, how is it that you know that your calculations of the costs and the benefits
00:11:42.200
are accurate? And how do you know that they're essentially as free as possible of any undue
00:11:50.920
political bias? So because your critics no doubt will object, as perhaps they should,
00:11:58.460
that there will be, let's call them implicit biases, even though I'm not a fan of that idea
00:12:04.480
in some sense. There's going to be underlying presuppositions that weight the manner in which
00:12:08.960
the economic calculations are made. And then, of course, you also have to buy the idea that the
00:12:13.340
cost-benefit analysis approach is actually valid. So can you tell me what you guys did to forestall
00:12:19.240
such criticisms, or to take them into account? Sure. And so, first of all, we don't just ask one
00:12:25.920
team of economists. We also have other teams critiquing those economists. And we exactly try
00:12:31.140
to make sure that there's sort of critiques from both sides, if you will. So we know that this is
00:12:35.940
not just an ideologically driven number, but it's actually an empirically pretty clear number that says,
00:12:42.420
for instance, if you focus on vaccinations, you're actually going to do, and what we find is,
00:12:47.460
you're going to do $60 of social good for every dollar you spend. Now, you could argue, well,
00:12:54.060
what would be sort of the ideological spin you could put on that? Well, one thing is to say,
00:13:00.000
well, did you measure all humans as equal? Yes, we actually did. And of course, you could argue
00:13:05.520
from an economic point of view that they're not. I think very few people would want to do that.
00:13:10.680
But what we do is, across all these areas, all people are ranked as equally important,
00:13:17.320
that is equally valuable. And that's obviously a political consideration. But I think, one,
00:13:22.260
if we're looking across the world, that is the right one. If anything, that probably means that
00:13:28.000
most rich country people are valued at way too little compared to what they're actually willing
00:13:32.840
to do themselves. But this means that we actually get the weighting right in the sense of saying,
00:13:38.180
where are you going to spend extra money if your goal is to help the most in the world?
00:13:43.420
Right. Okay. So you used an approach that was basically, I would say, a solid measurement approach
00:13:49.140
from the theoretical perspective, because the idea would be to have multiple measures of the same
00:13:55.600
phenomenon, or set of phenomena, and to see where they dovetail. So that would be an issue of
00:14:02.980
reliability. So if you have multiple teams of economists, and they all converge on something
00:14:08.180
that approximates an agreement, and you look at a diverse range of opinions, then you can be
00:14:13.260
reasonably certain that you've converged on something that's real. And then you also have
00:14:16.940
an element of peer review in there. Yes. Another way of... Go ahead.
00:14:22.320
And exactly. Both of these are obviously important to make sure, as you say, that you get the
00:14:27.480
reliability, and you actually get something that resembles somewhat of a truth. But the real point
00:14:34.180
here is, of course, in some way, if we just take a step back, this is not a question about getting
00:14:38.940
the absolute numbers exactly right. I mean, that would be wonderful if we could do that. But when
00:14:44.180
I just told you that, you know, vaccination, every dollar you spend will give you $60 back in the dollar,
00:14:49.220
it'd be very, very unlikely that that's the right number. But it's much, much more about getting the
00:14:54.860
order of magnitude right. Right. Sure. Is it $60, or is it $600 you get back? Because really,
00:15:01.640
the point here is to compare across all these other things that you could also have spent that
00:15:06.200
money on. And there, we have a much greater sort of reliability, because we're pretty sure that even
00:15:13.840
if you change the assumption very much, you will still get much of the same kind of picture. You won't
00:15:20.000
get exactly the same picture, but you'll very much get the same sort of picture, which indicates
00:15:24.620
that there's a few targets that will do an amazing amount of good. And there's a lot of
00:15:28.660
targets that will just do a little good. And so again, what we try to say is, do the amazing targets
00:15:33.480
first. Okay, so the most critical thing for you guys to get right is the rank ordering, and not the
00:15:37.840
absolute magnitude. So as long as your method is stable across all the different domains, then the
00:15:44.020
rank ordering should be relatively stable. Now, did you get a Pareto distribution with regards to
00:15:49.400
positive impact of investment? Like, is there a handful of interventions that are clearly head
00:15:57.420
and shoulders above the rest in terms of generating positive economic outcome? And what proportion of
00:16:03.060
the total number of, say, 170 goals? Like, if you accomplished 10% of the 170 goals,
00:16:09.940
how much of the economic bang for your buck would you accrue?
00:16:13.740
I should actually just say I didn't plant that question. That's wonderful, because that's exactly
00:16:19.040
what we tried to do. So if you look across all these areas, every dollar spent will do about $7 of good
00:16:25.640
if you just did it across all of them equally, which is probably unreasonable to assume, but it's not
00:16:30.780
unreasonable in sort of first order approximation. If you spend it on the best 19 targets, you would do
00:16:38.660
$32 of good, so more than four times more good. So you can simply spend the same dollar and do more
00:16:46.340
than four times more good than you would do if you just scattershot it across all areas. And again, what
00:16:52.780
you also have to remember is a lot of the things that we're looking at which are really, really
00:16:56.460
effective are also things that are much easier to do. So two of the best things that we point out
00:17:02.680
is actually contraception. So family planning for women. Why? Because if you do that, there's about
00:17:10.720
13% of women that still don't have access to contraception. And if they got that access,
00:17:17.860
they would be able to better space their kids. And we know that that means that you can actually have
00:17:21.900
your kids when you're ready to have them. That means you put more effort and investment into your kids
00:17:27.280
so they'll grow up better, they'll be fed better, they'll have a greater chance of surviving,
00:17:31.740
but they'll also become more productive in the long run. It also means fewer of those kids are
00:17:36.460
going to die, fewer moms are going to die. So we actually estimate you'd see about 600,000 fewer
00:17:42.000
kids die. And you would get a demographic dividend that is basically because you have slightly fewer
00:17:48.760
kids, you can invest more in them, you get better return on every kid, you have slightly higher growth
00:17:54.620
rates. And so we estimate for every dollar you spend on family planning, you will do $120 of social
00:18:00.960
good. The other great thing is to invest in free trade. Free trade is something that we've sort
00:18:07.720
of forgotten. We actually had the last big free trade discussion from Doha back in 1999. It's
00:18:15.900
basically, you know, stopped being a concern, obviously with Trump, but also with many other
00:18:20.880
people who've sort of given up on free trade. Yet we have to remember that one of the basic things
00:18:26.000
that have made us wealthy is the fact that we trade with each other, you do what you're good
00:18:30.760
at. And I do what I'm good at. And that means when we exchange, we actually all get better off.
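As an aside, the arithmetic behind the "more than four times more good" figure a moment earlier is easy to check with the numbers Lomborg quotes (roughly $140 billion actually available, about $7 of good per dollar spread across everything, about $32 per dollar on the best 19 targets):

```python
# Back-of-envelope check of the figures quoted above.
available = 140e9          # ~$140 billion actually available
bcr_spread_evenly = 7      # ~$7 of good per dollar across all 169 targets
bcr_best_19 = 32           # ~$32 of good per dollar on the best 19 targets

print(f"Spread evenly:   ${available * bcr_spread_evenly / 1e12:.2f} trillion of good")
print(f"Best 19 targets: ${available * bcr_best_19 / 1e12:.2f} trillion of good")
print(f"Ratio: {bcr_best_19 / bcr_spread_evenly:.1f}x")  # ≈ 4.6x, i.e. "more than four times"
```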
00:18:36.960
Maybe you could tell people in some detail what that would mean practically,
00:18:42.200
like what are the sorts of barriers that you guys determined were particularly troublesome that
00:18:48.160
need to be addressed? So we actually estimated what would it take to get a reasonably successful
00:18:54.760
Doha round, which would not be free trade, but it would be freer trade. So it would simply be reduced
00:19:00.780
tariffs, make it easier for everyone to trade across the world, especially from the developing world to
00:19:06.520
the developed world. And one of the outcomes we found was not only that on average, every person in
00:19:12.180
the world would be, sorry, every person in the developing world would be about $1,000 richer per person per
00:19:17.820
year in 2030. So by the end of these sustainable development goals, we would lift 145 million
00:19:23.980
people out of poverty. But we would simply make everyone better off because if you're in a poor
00:19:30.540
country, you would be able to sell the things that you do best, a little easier, a little cheaper,
00:19:35.980
and hence be able to market more of it. And you'd be able to buy back more from other developed
00:19:41.040
countries. And that would make everyone better off. So again, this is not rocket science. I mean,
00:19:46.680
if you look at China, for instance, China over the last 30 years lifted, what, 680 million people out
00:19:53.380
of poverty, very largely driven by the fact that they could trade with the rest of the world. Imagine
00:19:58.520
if we could make that happen for sub-Saharan Africa, if we could make that happen more for Latin
00:20:03.280
America. So again, it's more about realizing that for very little money, some of these things can do
00:20:09.380
an amazing amount of good. And we tend to forget because there's no focus on these issues. We
00:20:15.320
don't think about family planning or free trade or indeed vaccinations. We think about all kinds of
00:20:23.140
other things like plastic waste or global warming or many other things that have a lot of sort of
00:20:29.020
attention from celebrities and getting in the newspaper. And that's not because there are no
00:20:34.380
problems. But it's about getting a sense of what's the magnitude of how much good we can do
00:20:39.380
with little money. And let's talk about global warming, because, okay, I've talked
00:20:46.980
to lots of people about your work. And you know, you obviously have a lot of admirers and you have a lot
00:20:51.280
of detractors. And I'd been listening to the detractors, because I believed, after I had reviewed the UN
00:20:58.400
development goals, I'd come to the same conclusion that you had come to before I knew what you had done,
00:21:03.320
which was: these things need to be rank ordered, because otherwise it's not a plan. It's not
00:21:07.680
a strategy. And so then there has to be some mechanism for rank ordering them. And I ran across
00:21:12.540
your work and I thought, well, that seems to be exactly right. I mean, from an agnostic
00:21:17.380
perspective, let's say it's an interesting idea because what you have to start with is the willingness
00:21:22.220
to be agnostic about what the worst problems are, and about what the worst and most solvable problems are,
00:21:28.440
let's say, at the same time. You have to be agnostic about that. And I actually think that that's part of
00:21:32.920
the reason why what you do bothers so many people because they have an a priori commitment to what
00:21:37.420
constitutes the most salient catastrophe. And clearly, at the moment, that is the idea
00:21:44.020
that climate change or global warming, depending on how you want to phrase it, constitutes such an
00:21:51.520
immediate and pressing threat of overwhelming economic magnitude that a sacrifice of any amount is
00:21:58.560
worth some probability of forestalling that. And so, um, one of the things that's striking about
00:22:06.760
your list, and maybe what I should do is have you tell us what the top seven or eight are, just so
00:22:12.080
that we have some sense of what the priorities are. Because one of the things I noticed was that
00:22:16.860
they don't tend to include measures that are designed to forestall global warming.
00:22:21.440
Yeah. So we actually, uh, had two Nobel laureates, uh, look over
00:22:28.920
all of this evidence and set priorities and come out with 19 targets that they focused on. And
00:22:34.380
actually one of them was, uh, stop fossil fuel subsidies, which are obviously a stupid idea in
00:22:40.000
so many different ways. Uh, remember, this is mostly in developing countries, uh, where it's very often done,
00:22:45.260
uh, to, you know, basically pacify the population a little bit like you, uh, make, uh, subsidies for
00:22:51.160
bread or other things. Uh, but of course the idea of subsidizing fossil fuels like Venezuela has done
00:22:56.500
and Indonesia has done, uh, is basically a way of subsidizing fairly wealthy people to drive their
00:23:02.460
car, because you have to have a car in order to enjoy this, uh, and to drive it more, and actually create more
00:23:08.100
congestion, more air pollution, and with very few benefits. Uh, so clearly what you should be doing is
00:23:14.440
scrap those subsidies for fossil fuels, not only because they lead to more, uh, air pollution and
00:23:19.720
CO2 emissions, but also because they're just terribly bad use of public resources that could
00:23:24.880
have been spent on education or health or other places where they could have done a lot more good.
00:23:29.700
But just to give you a sense of some of the other ones that we were talking about, uh, expanded
00:23:34.480
immunization, as we talked about, that's an incredibly good way. We know that we've right now cut
00:23:40.660
child mortality, uh, that is under five mortality from about 12 million kids dying every year in
00:23:47.260
1990 to about 6 million. That's a fantastic achievement. But of course, 6 million is still
00:23:52.540
a mind-boggling number. That's way, way too large. And we actually know that by investing about a billion
00:23:58.040
dollars, we could save a million kids every year. You've got to almost say that again: for a
00:24:03.880
billion dollars, you could save a million kids' lives every year. Why the hell is that not one of our top
00:24:09.740
priorities? And that's also why we show for every dollar spent, you'll actually do $60 worth of
00:24:15.160
good. Another incredible, uh, investment is in nutrition. Uh, so, you know, everybody kind of
00:24:21.680
knows that it's not right that people are starving, and we know that we could actually feed
00:24:26.900
everyone. The main reason why people are still starving is because they don't have enough money.
00:24:30.760
It's not because we can't produce it. So it's because they're poor and they don't actually have
00:24:34.560
the demand capacity. But the real tragedy of malnutrition is that if you get it when you're
00:24:41.960
really small, so from zero to two years of age, your brain develops less. And that means when you
00:24:47.800
get into school, you're actually less able to learn. And that stays with you for your entire life.
00:24:53.580
You stay less, uh, long in school, you learn less and you come out and you actually are not very
00:24:59.620
productive. We know this now. We've had this as a theoretical argument for a long time. Uh, but
00:25:04.760
researchers and some of the researchers that we work with have now actually proven this because they
00:25:09.480
went back to an old study, uh, done in Guatemala in the late 1960s, uh, where researchers went to two
00:25:15.860
small villages in rural Guatemala and gave the kids there, so the really smallest, the zero-to-two-year-
00:25:20.980
olds, good food. And then they took two other rural villages nearby and gave the kids essentially
00:25:26.860
sugar water. Of course you couldn't do this today, but the brilliant thing about it is that our
00:25:31.980
researchers then re-found these kids. They're now in their late thirties, early forties, and you could
00:25:36.740
see what had happened. And it was exactly what the theory predicted. If you had gotten good food,
00:25:42.860
you stayed longer in school, you learned more every year in school. And so when you came out,
00:25:48.300
you were much more productive. And one of the ways we measure that is you had higher incomes.
00:25:53.380
If you avoided being stunted, you had a 60% higher income. That's just a phenomenal outcome. So again,
00:25:59.700
spend money, for instance, on malnutrition by getting good food and also research and development into
00:26:04.760
better yielding varieties. And you can do an incredible amount of good for every dollar. We
00:26:09.400
estimate $35 or thereabouts. Right. And so what you're doing is forestalling cognitive deterioration
00:26:15.320
in the first two years. And because cognitive ability is a great predictor of long-term
00:26:19.680
success, then you're producing people who are much more likely to be economically productive for
00:26:25.720
themselves and for other people. Right. And so, yeah. And the crucial bit is it's also for other
00:26:30.000
people, right? If you're good, you're likely to make other people better too. Yeah. Right. Oh yes.
00:26:35.240
That's an absolutely crucial issue. So, yeah. It was quite striking to
00:26:42.580
me when I came across your work to find out to what degree it was focused on targeting children's
00:26:48.420
health in some fundamental sense in the developing world. That seems to be, I mean, if you had to put
00:26:53.680
it in a nutshell, correct me if I'm wrong, that seemed to be where you guys focused, or
00:26:59.240
where your focus took you. And so what else? Can I just say, because there's also some other very
00:27:05.640
low hanging fruit, for instance, what you're seeing increasingly across the world is that more and more
00:27:10.580
people die, not of infectious diseases because we've actually tackled many of those, but they die
00:27:16.200
from old age diseases like cancer and heart disease. Those are by far the biggest issues. Cancer,
00:27:22.940
it turns out to be fairly costly to deal with, but heart disease, we've actually now figured out
00:27:27.860
pretty much how to deal with, not to the extent that we will live forever, but that we can make
00:27:32.480
people live much longer. And that's basically by giving them very cheap and off patent heart
00:27:37.900
medication. So we give this to a lot of middle-aged people in the developed world. And it's very,
00:27:43.940
very cheap to also do in the developing world. So we can save about three years of life for these
00:27:49.240
elderly people, both men and women, and it costs peanuts and you can basically make all lives longer.
00:27:56.800
And so that's one of the places where we also show there's a huge benefit.
00:28:00.960
We also emphasize, and this is again, one of the depressing things that we should be focusing a
00:28:06.460
lot more on tuberculosis. Tuberculosis is now the world's leading infectious disease killer. It's no
00:28:12.260
longer HIV AIDS. That's still a good idea to invest in, but it's actually an even better idea to invest in
00:28:18.280
tuberculosis. But again, because it's an old disease, it's, you know, it's been with us for hundreds of
00:28:23.020
years. And we kind of learned how to fix it a hundred years ago. It's not an issue in the
00:28:28.320
developed world. So most people don't want to hear about it, don't care about it, but it's a crucial
00:28:33.180
killer that kills 1.4 million people every year in the developing world. And we have the means to
00:28:38.700
eradicate pretty much all of those deaths very, very cheaply. So again, that's one of the places
00:28:43.700
where we say spend money here because you can do an amazing amount of good. So I think we're just
00:28:48.580
looking for where are the really good deals. If you want to do something about global warming,
00:28:54.520
as you then mentioned, you should ask yourself, well, how are we going to fix this? So there's
00:28:59.840
two things to global warming. One is, as you mentioned, there's a sense in which people believe
00:29:04.800
it's the overwhelming danger that's going to undermine the entirety of human civilization.
00:29:11.000
And just like pretty much all other problems, that's just not true. This is a problem. It's not the
00:29:16.380
end of the world. If you look at the economics that's been done, the Nobel Prize was just
00:29:21.580
awarded in climate economics to William Nordhaus this year. And he's been a guy working almost
00:29:27.860
three decades on what are the costs and the benefits of climate action. And he finds the
00:29:34.500
cost of climate change, and he's backed up by a lot of other economists here,
00:29:40.980
is somewhere between 2% and 4% of GDP by the end of the century. So remember, by then we'll be,
00:29:48.540
say, four or five times richer. So we'll be 400 to 500% as rich as we are now. But we will see a drop in our
00:29:56.120
incomes worth about 2% to 4% less than we would otherwise have had. That's a problem. But it's by
00:30:03.400
no means the end of the world. And that's the first thing you sort of need to recognize. This is a
00:30:08.320
problem; it is not the end of the world. Because if you think it's the end of the world, as you rightly
00:30:12.480
pointed out, then you're willing to throw everything and the kitchen sink at it. But if it's a problem,
00:30:17.640
you will act exactly like what I think we should do with all problems, say, all right, there's a lot
00:30:22.440
of problems. Let's ask, where can we spend a dollar and fix most of that problem? And unfortunately,
00:30:28.960
that's not climate change. It's actually really, really hard to just change a tiny bit of climate change
00:30:34.720
with a lot of money. And that's why we find that most of the interventions that you do for climate
00:30:39.920
change turns out to be fairly poor. They're not necessarily bad investments. Some of them are.
00:30:45.120
But even, you know, for instance, adaptation, or getting more energy for poor countries, gives you sort of,
00:30:52.820
you know, $2, $5 back in the dollar, which is nice. But in the big scheme of things, there are much,
00:30:58.700
much better places you can spend your resources on.
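To see how "2% to 4% of GDP by the end of the century" coexists with "four or five times richer," here is the compound-growth arithmetic behind Lomborg's framing. The 2% annual growth rate is an illustrative assumption for the sketch, not a figure taken from Nordhaus:

```python
# Compound-growth arithmetic behind "much richer, minus 2-4%".
# The growth rate is an illustrative assumption, not Nordhaus's estimate.
years = 80                                 # roughly now to 2100
growth = 0.02                              # assumed real growth per year
income_multiple = (1 + growth) ** years    # ≈ 4.9x today's income

for damage in (0.02, 0.04):                # 2-4% of GDP climate damage
    print(f"2100 income: {income_multiple:.1f}x today; "
          f"minus {damage:.0%} damage: {income_multiple * (1 - damage):.1f}x")
```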
00:31:01.000
Okay, so let's look at this. Well, because I'm really curious about this,
00:31:06.140
because I can't see any a priori problems with your method. It seems to me to make a lot of sense. And
00:31:12.880
if your goal is to do the most amount of good in the shortest period of time with the least amount
00:31:18.440
of resources, which seems like a pretty damn good goal, and to be realistic about what's
00:31:23.620
attainable, then I can't see that anyone's done a better job from a methodological perspective than
00:31:29.500
you guys have. Okay, but you still face a tremendous amount of opposition. And most of that
00:31:34.580
does come from the climate side of things, as far as I can tell. And so I've been trying to
00:31:40.320
think through why that might be. Going online without ExpressVPN is like not paying attention
00:31:46.280
to the safety demonstration on a flight. Most of the time, you'll probably be fine. But what if one
00:31:51.500
day that weird yellow mask drops down from overhead and you have no idea what to do? In our hyperconnected
00:31:57.260
world, your digital privacy isn't just a luxury. It's a fundamental right. Every time you connect
00:32:02.140
to an unsecured network in a cafe, hotel, or airport, you're essentially broadcasting your
00:32:07.060
personal information to anyone with the technical know-how to intercept it. And let's be clear,
00:32:11.540
it doesn't take a genius hacker to do this. With some off-the-shelf hardware, even a tech-savvy
00:32:16.380
teenager could potentially access your passwords, bank logins, and credit card details. Now, you might
00:32:22.040
think, what's the big deal? Who'd want my data anyway? Well, on the dark web, your personal
00:32:26.680
information could fetch up to $1,000. That's right, there's a whole underground economy
00:32:31.600
built on stolen identities. Enter ExpressVPN. It's like a digital fortress, creating an encrypted
00:32:37.660
tunnel between your device and the internet. Their encryption is so robust that it would
00:32:42.100
take a hacker with a supercomputer over a billion years to crack it. But don't let its power fool
00:32:46.900
you. ExpressVPN is incredibly user-friendly. With just one click, you're protected across
00:32:51.680
all your devices. Phones, laptops, tablets, you name it. That's why I use ExpressVPN whenever
00:32:56.880
I'm traveling or working from a coffee shop. It gives me peace of mind knowing that my research,
00:33:01.560
communications, and personal data are shielded from prying eyes. Secure your online data today
00:33:06.720
by visiting expressvpn.com slash jordan. That's E-X-P-R-E-S-S-V-P-N dot com slash jordan,
00:33:13.680
and you can get an extra three months free. Expressvpn.com slash jordan.
00:33:22.040
Starting a business can be tough, but thanks to Shopify, running your online storefront is
00:33:26.600
easier than ever. Shopify is the global commerce platform that helps you sell at every stage of
00:33:31.640
your business. From the launch your online shop stage, all the way to the did we just hit a
00:33:35.820
million orders stage, Shopify is here to help you grow. Our marketing team uses Shopify every day to
00:33:41.560
sell our merchandise, and we love how easy it is to add more items, ship products, and track
00:33:46.120
conversions. With Shopify, customize your online store to your style with flexible templates and
00:33:51.600
powerful tools, alongside an endless list of integrations and third-party apps like on-demand
00:33:56.500
printing, accounting, and chatbots. Shopify helps you turn browsers into buyers with the internet's
00:34:01.760
best converting checkout, up to 36% better compared to other leading e-commerce platforms.
00:34:07.240
No matter how big you want to grow, Shopify gives you everything you need to take control
00:34:11.100
and take your business to the next level. Sign up for a $1 per month trial period at
00:34:15.940
shopify.com slash jbp, all lowercase. Go to shopify.com slash jbp now to grow your business
00:34:23.000
no matter what stage you're in. That's shopify.com slash jbp.
00:34:29.600
And so when I reviewed the climate literature, which was a few years ago, I had some real concerns
00:34:36.800
about measurement accuracy and so forth, because it's a very complicated issue, and
00:34:41.280
the constants that should be associated with an increase in carbon dioxide aren't obvious, and
00:34:47.700
there's quite wide error bars around them. And then carbon dioxide has all sorts of weirdly
00:34:51.840
complicated effects like increasing global greening, which is quite an interesting one.
00:34:57.320
And so anyways, and it also struck me that if you project out the climate change estimates across
00:35:04.720
about a 50 to 100 year period, the error bars grow very large as you move outward, obviously,
00:35:10.000
because the errors multiply. And then it struck me that we're in a situation where the error bars out
00:35:16.760
50 years are so wide that even if we did what people recommended now, we could never be sure that
00:35:22.340
it actually worked, because the propagation of error across all those decades makes the picture
00:35:29.680
so blurry two or three, four decades down the road that there's no way of garnering evidence about
00:35:35.460
the effectiveness of your intervention. And if it's a high-cost intervention, that seems to be a really bad idea.
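Peterson's point about widening error bars can be illustrated with a toy simulation: if each decade's projected change carries its own uncertainty, the spread of plausible outcomes grows with the horizon. This is purely schematic, with invented parameters, and is in no way a climate model:

```python
# Toy illustration of error propagation over a long projection horizon.
# Purely schematic: all parameters invented, not a climate model.
import random

def interval(horizon_decades: int, trials: int = 20_000) -> tuple[float, float]:
    """95% interval for a projection where each decade adds an uncertain step."""
    finals = []
    for _ in range(trials):
        value = 0.0
        for _ in range(horizon_decades):
            value += random.gauss(0.2, 0.15)  # assumed change per decade, with noise
        finals.append(value)
    finals.sort()
    return finals[int(0.025 * trials)], finals[int(0.975 * trials)]

for decades in (1, 3, 5, 10):
    low, high = interval(decades)
    print(f"{decades:2d} decades out: 95% interval roughly [{low:.2f}, {high:.2f}]")
```

With independent decadal errors the interval widens with the square root of the horizon; correlated or multiplicative errors, which Peterson seems to have in mind, widen it faster still.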
00:35:39.400
So, okay. I'm going to step one more step backwards, which is this:
00:35:45.480
I also found it difficult to trust the climate science. And the reason for that was that it struck me as
00:35:53.900
motivated by issues, at least in some part, that were outside of the science. It seems to me that
00:36:03.820
a tremendous amount of what motivates people's psychological commitment to the idea of climate
00:36:09.220
change is something like an underlying anti-Western or anti-capitalist ethos. And that the idea is that
00:36:17.400
we should restrict growth and we should restructure the economic system. And that would address climate
00:36:22.140
change. And that would be a positive thing and forestall the apocalypse. But the real goal seems
00:36:26.860
to be more to find an ethical justification for the political position that requires the retooling
00:36:32.080
of these economic systems. And so, well, I guess the first thing I'd like to know is this:
00:36:40.840
does that strike you as a reasonable argument? Because I can't see otherwise why people would be
00:36:45.680
objecting to what it is that you're doing. Yeah. So there's a lot of questions there, and then
00:36:51.280
let me just try to unpack some of this. So, I think if you look out, yes, there's a lot
00:36:58.940
of uncertainty going forward. Some of that actually cancels out when you're saying, well, we're uncertain
00:37:04.940
about how much the temperature rise will be. But we do know that if the causal mechanism is CO2 leads
00:37:10.460
to higher warming, if you take some of the CO2 out, you will get less of it. Now we don't exactly know
00:37:15.980
how much less you'd get, whether it was from up here and down a little bit, or whether it
00:37:22.000
was down here and down a little bit, but you actually get about the same difference. So you can take
00:37:26.820
some of the error bars out, because you're only looking at the difference and you're not actually
00:37:31.920
looking at the actual input. The other thing, I'm in no doubt that there's a lot of, you know,
00:37:38.360
other reasons why people latch on to social phenomena. So, you know, when some climate
00:37:43.780
scientists say we should cut carbon emissions because this is leading to a really dangerous
00:37:48.200
issue, there's a lot of other people who will see, oh, that actually fits with my ideological
00:37:52.440
presupposition. So I'll actually join in this conversation. I think that happens in a lot
00:37:59.520
of different areas. And what we try to do is to sort of step back and say, look, I'm not going to get
00:38:05.020
into all of that. I'm simply taking it as the starting point. We're economists. I'm actually
00:38:09.640
not. I'm a political scientist, but all the people I work with are economists. We just take as given
00:38:14.440
what the climate scientists are telling us. I think it's an interesting conversation to have,
00:38:20.540
and say, did they actually get it somewhat wrong? And I think certainly, you know, somebody should be
00:38:26.200
looking into it, but, you know, I've met a lot of these climate scientists. My sense is that
00:38:30.580
they're good, hardworking, you know, scientists who are actually trying to find out what's up
00:38:35.200
and down in this area. So we simply take our starting point with the UN climate panel. What
00:38:40.320
we do is simply ask how much will it cost to cut carbon emissions so much that we will see a
00:38:47.900
significant change in temperature. And of course, remember, we emit CO2, not because we want to bother
00:38:54.080
Al Gore or anyone else. It's a byproduct of having a life that is incredibly much nicer than one we
00:39:02.220
would have if we didn't have access to a lot of energy. I mean, we can sit and talk here across the
00:39:07.640
continent, but also, you know, you have heating and you have cooling and you have fertilizer, that
00:39:12.680
is, you know, artificial fertilizer, that basically feeds half the world population and a lot
00:39:18.320
of other benefits that come from mostly using fossil fuels. So if you want to get rid of some of those
00:39:23.700
fossil fuels, possibly all of them, you will have to replace them with more expensive energy. And
00:39:30.540
that's why it costs to cut carbon emissions. Now, that may be worth the cost, but that's exactly the
00:39:36.640
question that we try to ask. And that's what William Nordhaus, the Nobel laureate in climate
00:39:41.100
economics and many others have asked. So it's really important. It's really important to note that
00:39:46.220
you're not questioning the science as it stands now, the consensus, so that this is purely a
00:39:50.980
consequence of economic calculation. This is only about saying, good, CO2 impacts warming,
00:39:57.480
and it does so in the way that the UN climate panel tells us. Now, that's not entirely true,
00:40:03.020
because there's a lot of things that they tell us, and there's a whole variability, and we try to take
00:40:06.980
that into account. But honestly, it turns out that if you do it on the central estimates, you get pretty
00:40:11.860
much what I'm about to tell you. Okay. So what you find is, if you cut carbon emissions now,
00:40:17.140
you can have a little bit of impact in 100 years, but it will have a significant cost now.
00:40:23.480
It's not going to put us to the poorhouse. Nobody's talking about that. Just like we're not talking
00:40:28.380
about the end of the world if we don't do something about climate change. We're not talking about the
00:40:32.440
end of the world if we do something about climate change either. So I want to sort of dial back in
00:40:37.100
the rhetoric, both from the alarmists that say, oh my God, the world is coming to an end, and from the people
00:40:41.160
who are saying, oh, we can't afford this, and we're all going to the poorhouse if you want to, you know,
00:40:44.960
have solar panels and wind turbines. No. These are both manageable costs. These are sort of in the
00:40:50.200
order, you know, two to four percent. But the problem is that if we do sort of things that cost
00:40:56.140
one to two percent of GDP right now, and for the rest of the century, we basically solve almost no
00:41:03.060
part of the global warming problem. We'd probably solve about one percent of it. So basically by incurring
00:41:10.620
a cost of one to two percent of GDP now and every year throughout the rest of the century, you'll
00:41:15.780
have solved almost none of the problem come 2100. So you still have to pay for all the same problems that
00:41:22.920
global warming is incurring, minus a slight amount, and then you've paid one to two percent every year.
00:41:30.240
That's the basic idea of why most cost-benefit analyses show that unless you do it very carefully
00:41:36.540
and only do a little bit of cutting and do it really smartly, you're actually incurring higher costs
00:41:42.920
than the problem you're trying to solve. So we can thread the needle and do it really carefully,
00:41:48.700
but that requires a lot of smartness from a lot of politicians. But if we do it really bluntly,
00:41:54.180
we're just going to incur lots of costs and actually not get very many benefits.
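A rough, deliberately simplified version of that arithmetic, holding world GDP flat for simplicity (the GDP figure is illustrative, and holding it flat distorts both sides while preserving the point):

```python
# Rough arithmetic behind "pay 1-2% of GDP every year, solve ~1% of the problem".
# All inputs are illustrative; GDP is held flat for simplicity.
gdp = 100e12                       # assumed world GDP, dollars per year
years = 80                         # roughly now to 2100

for cost_share in (0.01, 0.02):    # spend 1-2% of GDP per year on blunt cuts
    total_cost = gdp * cost_share * years
    print(f"Spending {cost_share:.0%} of GDP/year: ~${total_cost / 1e12:,.0f} trillion by 2100")

for damage_share in (0.02, 0.04):  # problem worth 2-4% of GDP, ~1% of it solved
    avoided = gdp * damage_share * 0.01
    print(f"Solving ~1% of a {damage_share:.0%}-of-GDP problem: "
          f"~${avoided / 1e9:,.0f} billion/year avoided")
```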
00:41:57.700
Okay, so, the people who are not happy with you for not prioritizing climate change to the degree
00:42:03.120
that they think it should be, hypothetically, how do they criticize your statements? On what basis do
00:42:10.880
they decide that your conclusions are inappropriate? I think, I mean, if you read most of the internet,
00:42:17.140
there's a lot of things that'll just say, oh, you're a denier, you can't say, you know, sort of,
00:42:21.340
we're going to excommunicate you, we don't want to talk to you, that's not right. I'm not really going
00:42:26.100
to talk about those people because that's just, you know, that's political posturing rather than
00:42:30.080
anything else. I think there's a reasonable argument to be made to say, that's not the only
00:42:35.760
way that you can approach this. So there's two ways that you can think about cost benefit analysis
00:42:40.180
actually showing that climate interventions are worth doing. One is to say, ah, but there might be a
00:42:48.060
tiny risk, even just a tiny risk, that the whole world is going to spin out of control and we're all
00:42:52.980
going to die kind of thing. And, you know, that's not implausible, especially if you say sort of,
00:42:58.160
I'm not going to put a percentage on it, but it's greater than zero. So there's a
00:43:03.760
non-zero chance that global warming will spin out of control and we'll basically be, you know,
00:43:08.940
relegated to a few hundred couples of humans living on an ice-free Antarctica.
00:43:14.500
So that's a positive feedback loop argument, right? That we might trigger mechanisms like the
00:43:19.220
melting of the Greenland ice sheet, for example, that would flood the world or cause some
00:43:23.400
irreparable catastrophe of unparalleled magnitude. Okay, so I have a question about that because
00:43:30.020
that's a tricky one to address, right? That's kind of an apocalyptic argument. And then the argument
00:43:34.420
would be, well, if there's a 1% chance of an infinite apocalypse, it's worth any allocation of
00:43:39.560
resources to stave it off. So one way of thinking about that is actually to multiply the catastrophes
00:43:46.900
because my suspicions are that that same argument could be used in relationship to a lot of the
00:43:53.920
other problems that you are trying to address. Like, I don't know what the probability is that if we
00:44:00.380
keep a substantial number of people in abject poverty over the next 20 years, let's say, more than we'd
00:44:05.800
have to, that we would increase the probability of the generation of epidemic and infectious
00:44:13.900
disease. Because, poor people are a risk to everyone, and that's a horrible way of
00:44:20.620
thinking about it. But poverty is a risk to everyone, that's a better way of thinking about it. Because
00:44:24.840
decreased global human health is also a breeding ground for all sorts of catastrophes that might
00:44:30.480
emerge. And then there's also the possibility of political instability and nuclear war and all of
00:44:36.340
those things that are also equally apocalyptic. So it seems to me to be reasonable to some degree to
00:44:42.140
say, look, there's the possibility across a wide range of potential crises of unforeseen positive
00:44:48.400
feedback loops spiraling out of control. But we can't introduce them into the argument unless we can
00:44:53.060
parameterize them, because otherwise it's an unfair game move in some sense, because you can't be proven wrong
00:45:00.240
about that. Exactly. And you're basically saying, if you allow it in this area, just because you like
00:45:07.700
that particular area and say, I want you to focus more money on my thing, which is climate change, you could
00:45:13.720
equally well do it in all kinds of other areas. And I actually think you can make your argument even
00:45:17.980
stronger. It's not just that poverty sort of breeds a lot of risk. It's also that it can
00:45:23.120
breed terrorism and a willingness to, you know, do a lot of bad things. So it's not just
00:45:29.720
something that happens, but, you know, throw in bioterrorism and our ability to keep, you know, all
00:45:34.940
the plutonium in the world under lock. And, you know, you can get catastrophe from anything.
00:45:40.480
So abject poverty, for example, you could imagine that there might be two socioeconomic
00:45:45.640
contributions to that. There would be the absolute number of people in abject poverty who are therefore
00:45:50.800
desperate. And then there would also be maybe another contribution of excess inequality, and a sense that
00:45:57.680
the world isn't laying itself out fairly, and that that would justify political and revolutionary
00:46:03.580
instability. And also just, you know, state failure and many other things that we know make it a lot
00:46:09.840
easier for people to do really bad things, like we saw out of Afghanistan with 9-11, and a lot of other
00:46:17.620
things, you know, once you get failed states, you get a lot of bad things happening. Right, okay. So we try to
00:46:22.420
make people better off globally, so that we decrease the probability of
00:46:27.380
large-scale political and economic instability. And that's another way of staving off an apocalypse.
00:46:35.280
And so, can I just say, so Nordhaus actually looked at this, the Nobel laureate, because some people
00:46:41.120
argued exactly what you said, there's a tiny risk that things go really, really bad, so we should
00:46:45.820
spend all of our money on this issue. But of course, that's a failed argument, because likewise, there's a
00:46:51.600
lot of small risks everywhere else. We know one risk, which is being hit by an asteroid.
00:46:57.940
That could wipe out the Earth. And Nordhaus looked at that, and I thought it was very, very elegant.
00:47:04.660
We know that we can track all of those asteroids out there, you know, 99.99%. But we chose to only
00:47:13.560
track 90% because tracking the rest was too expensive. So you can actually see that we put a
00:47:19.660
price on how much we are willing to pay for this. And of course, that's just one place where we have a very
00:47:24.260
clear example, that we say, we care somewhat for the future, but we don't care about it entirely.
00:47:30.380
We have lots of other issues that we want to focus on right now. So that's just like every other area.
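That revealed-preference logic can be made concrete with a toy calculation. Every number below is invented purely for illustration, since the conversation gives none; the sketch only shows the shape of the reasoning, not any figure Nordhaus actually used.

```python
# Hypothetical revealed-preference sketch: all numbers are invented.
# If society declines to pay `tracking_cost` to cut asteroid risk from
# p_before to p_after, it implicitly values the avoided expected loss
# at less than that cost, i.e. it discounts the far future.
tracking_cost = 1e9              # assumed extra cost of near-complete tracking ($/yr)
p_before, p_after = 1e-6, 1e-8   # assumed yearly impact probabilities
damage = 1e15                    # assumed loss from a civilization-ending impact ($)

expected_loss_averted = (p_before - p_after) * damage
print(f"expected loss averted per year: ${expected_loss_averted:,.0f}")
print(f"declining the spend implies we value that below ${tracking_cost:,.0f} per year")
```

On these made-up inputs the averted expected loss and the cost are about the same size, which is exactly the kind of margin at which a society reveals how much it cares about the distant future.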
00:47:36.960
You have to ask, what are the risks and what are the opportunities here? And absolutely,
00:47:43.660
given that there are some risks, and there are probably more downside risks than upside risks for
00:47:47.480
global warming, we should probably do a little more than what we would otherwise do. And that's exactly what
00:47:52.060
the models show us. And those are the models that we use in making the estimate of how much you should
00:47:56.760
be paying for global warming. So yes, we should take that into account. But it's not a good argument.
00:48:02.280
That was one argument. I'm sorry, I'll try to be a little quicker with the other one. The other
00:48:06.980
argument is that people will say, one of the reasons why it doesn't pay to act on global warming is
00:48:12.240
because you have to pay now, but the benefits come far out in the future. So basically, you do something
00:48:17.980
now that's fairly costly. And then you get a tiny benefit in 100 years. So you have to really say
00:48:24.240
the future is incredibly important. In economic speak, that means the discount rate is really low,
00:48:30.020
that you really care a lot about the future. Now, a lot of people would argue, we should, you know,
00:48:35.060
we're rich, we should be able to care enormously about the future. And if you change that parameter
00:48:41.260
enough, and you say we care enormously about future generations, it actually turns out that global
00:48:46.660
warming becomes a good deal. You know, doing something about global warming actually
00:48:50.480
goes from, you know, spending a dollar to avoid a couple of cents of climate damage, to
00:48:54.960
spending a dollar and maybe doing $2 of climate benefit. So
00:49:00.860
it actually turns it into a good idea. But here's the kicker. If you care that much about the future,
00:49:07.380
you change every other one of all of these priorities and make them boom, right? Because what
00:49:13.040
obviously happens is, you've just said, I care so much about the future that the guy that I will
00:49:18.000
save from not having tuberculosis and dying from it tomorrow, will now go on to have a successful
00:49:23.380
life, his kids will live longer, they'll do better, his, you know, his nation will do
00:49:29.480
better. That means they'll be much, much better off in 2100 and so on. And that means this is no longer
00:49:35.140
a question of saying you spend a dollar and you do $43 worth of good; now you spend a dollar and you do
00:49:40.520
a thousand dollars' worth of good. So what you've achieved is basically just to make everything a
00:49:46.780
great idea. And that's also, intellectually, what making the discount rate very, very small does. What
00:49:52.340
it means is you should basically starve, you should just eat porridge every day and spend all of the rest
00:49:58.560
of your money on the future, because you care so much about it. Now, if you do that, I applaud
00:50:04.260
your consistency. But most people just don't do this. And we certainly don't act this
00:50:10.160
way: you know, we don't seem to care all that much about our pension systems, which in many
00:50:14.500
rich countries are going to fail in the next 20 to 40 years, and all these other issues. So as long as
00:50:20.040
you're saying, no, no, what you're really saying then is on climate, we want to care a lot about
00:50:24.660
the future, but we don't want to do it in all other areas. If you're going to be consistent, you need to do it
00:50:29.280
across all areas and then you need to be very hungry and only eat porridge. Right. Okay. So your claim
00:50:33.780
basically from a methodological perspective is that your rank ordering remains constant across
00:50:38.200
variable discount rates. Yes. Right. And that's a really important, a really
00:50:42.920
fundamentally important point. It's not entirely constant, I've got to say; it does change some of these. Yeah,
00:50:47.980
sure. Because obviously, you know, education, for instance, is one of those where you pay now and
00:50:52.820
you only get benefits 10, 20, you know, 50 years out when the kids grow up and they actually become much
00:50:57.440
more productive. So there is a change, but mostly the rank ordering remains the same. And I'm simply just
00:51:03.540
insisting that people need to be consistent. You can't just say the future is important when you
00:51:08.580
talk about climate. Yeah. If you want to say the future is important, it's important across all
00:51:11.940
areas. And then you really have to do everything and, you know, forego having a good life yourself.
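The discount-rate argument lends itself to a small worked example. Below is a minimal Python sketch with made-up cash-flow profiles; the $3-in-year-100 climate benefit and the $2-per-year health benefit are illustrative assumptions, not figures from the Copenhagen Consensus or Nordhaus models.

```python
# Minimal sketch of the discount-rate point, with made-up numbers.
# This is not the Copenhagen Consensus model; it only illustrates the logic.

def npv(cash_flows, rate):
    """Net present value of (year, amount) pairs, discounted at `rate`."""
    return sum(amount / (1 + rate) ** year for year, amount in cash_flows)

# Two hypothetical interventions, each costing $1 today:
climate = [(100, 3.0)]                      # $3 of benefit arriving around year 100
health = [(t, 2.0) for t in range(1, 61)]   # $2/year over a 60-year saved life

for rate in (0.05, 0.001):                  # a market-like rate vs. near zero
    print(f"discount rate {rate:.1%}:")
    print(f"  climate benefit per $1 spent: ${npv(climate, rate):.2f}")
    print(f"  health benefit per $1 spent:  ${npv(health, rate):.2f}")
```

At 5% the climate payoff is worth a couple of cents; at 0.1% it climbs above $2. But the health intervention's payoff balloons too, from roughly $38 to over $100, so the rank ordering survives the change, which is the consistency point being made here.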
00:51:17.200
Right. Right. Okay. Well, so the other thing that's worthwhile pointing out about the discount rate is
00:51:21.580
that, you know, you can make a case that the future is more important than the present,
00:51:26.400
especially because it extends out so far. But, but the other reason that people discount the future is
00:51:31.200
because you can't make the case that you can predict the outcome of your actions. And that
00:51:36.820
means that the error in prediction magnifies itself as you move out farther in timeframe. And so what
00:51:42.300
happens is you get to a point where there isn't a lot of point in adjusting your behavior in the
00:51:47.640
present. If you look like a hundred years out, let's say, or a thousand years out, because you
00:51:52.000
actually can't calculate with any accuracy the consequences of your actions or inactions. And so
00:51:57.480
the cumulative error makes discounting the future the appropriate thing to do, because
00:52:02.820
you can predict what's going to happen if you act now, and maybe tomorrow, but you get much
00:52:07.820
less accurate as you move forward. So even in the best case scenario, where we had the best wishes
00:52:13.580
for the future, that doesn't mean we could justify incredibly radical sacrifices now, because we
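As a rough numerical illustration of how that prediction error compounds, here is a small sketch; the 2% yearly noise figure and the random-walk assumption are hypothetical, chosen only to show the shape of the effect.

```python
# Sketch: if each year adds independent forecast noise, uncertainty compounds
# with the horizon. A simple random walk is assumed, so the standard deviation
# grows with the square root of time; real systems may degrade even faster.
import math

noise_per_year = 0.02   # hypothetical 2% of noise (one standard deviation) per year
for horizon in (1, 10, 100):
    sd = noise_per_year * math.sqrt(horizon)
    print(f"{horizon:3d}-year forecast: +/- {sd:.1%} at one standard deviation")
```

On these assumptions a 100-year projection carries ten times the uncertainty of a 1-year one, which is one way of motivating a positive discount rate.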
00:52:21.080
No, there's a good, there's a good way of thinking about this. You know, if you look at what
00:52:25.760
previous generations have done for us, the only thing that they've really managed to do is to
00:52:31.320
give us a lot more knowledge. So investment in knowledge is actually a great way to help future
00:52:36.340
generations, because it helps them in all kinds of ways. And we can't really predict how, but it's
00:52:40.960
probably a good idea to leave them with a lot of books. I'm using that as a metaphor. But trying to
00:52:46.800
help them in a specific way: imagine, back a hundred years ago, our forefathers back in the 1920s,
00:52:53.620
uh, no, the 1910s, trying to do what would have reasonably helped us now. Chances are
00:53:00.700
they would have wasted a lot of money on things that never turned into problems. There's a wonderful
00:53:05.320
book called Today... now, uh, sorry, oh God, I'm forgetting. Tomorrow Now? Oh God.
00:53:14.980
Anyway, it has a great and clever title, which I've clearly screwed up now. But it
00:53:20.600
was, back in 1893, that the World's Fair in Chicago, I believe, asked 50 of the world's
00:53:27.980
smartest people to predict what the world would look like in a hundred years. And so all of them
00:53:33.840
first started by saying, you know, I love the fact that I'm not going to be around when this prediction
00:53:37.960
comes true or not. But then they made all their predictions, and they were almost
00:53:42.940
entirely off. You know, there were some that were pretty close. Uh, you know, there's one that
00:53:47.720
predicted sort of email, in the sense of, I can never pronounce it, pneumatic
00:53:54.320
pneumatic tubes, where you put a letter in and, you know, it would get sucked away. And so you could send
00:54:01.860
a letter somewhere. They imagined that that would be a worldwide thing, so you could actually send
00:54:06.420
a letter everywhere. You'd just sort of put it in a very, very long tube. Right. Right. Which is kind
00:54:10.820
of right. And you can sort of see, yeah, that's not entirely wrong. You know, they got the whole
00:54:15.660
technology wrong, but had the right idea. But the fundamental point is we're probably wrong about
00:54:21.360
so many things, as you say, that trying to help the world in the future by, for instance,
00:54:27.080
cutting temperature might be one of the least effective ways of helping. Okay. So, so there's
00:54:32.260
another thing that's interesting about that. I think that speaks to the fundamental intelligence
00:54:38.640
of your approach. So one of the things that has struck me as highly likely is that given how
00:54:44.900
complex things are and how rapidly they're changing, that the best thing we could do to
00:54:50.340
prepare ourselves for the future would be to make better people, smarter people, wiser people,
00:54:56.300
more responsible people, all of that. So there's a psychological element to that. And so I would say
00:55:01.860
some of my work has concentrated on that and the public lectures that I've been doing. But what's
00:55:06.080
interesting about your approach, the economic approach, is that you're diverting a lot of
00:55:10.120
resources to the creation of better people for tomorrow by investing, especially in childhood
00:55:16.520
nutrition. So if you, and I suppose this is kind of how economists look at the world, but maybe
00:55:21.840
biologists could look at it this way too. If you think of people as general problem-solving machines,
00:55:27.060
which is not a bad way of thinking about us, then it might be that given that you don't know which
00:55:32.380
problems are going to be paramount, what you want to do is improve the machines that will solve
00:55:36.540
the problems, whatever those problems happen to be. And so that investment in early childhood
00:55:41.700
development seems particularly apropos in that regard. And so having said that, I want to return
00:55:49.640
to the climate issue one more time, because here's something peculiar. This is something I don't
00:55:54.880
understand. It's a real mystery. So let's say that, just for the sake of argument, that most of the
00:56:00.960
people who are concerned about climate change and its relationship to economic development are on
00:56:05.600
the left side of the spectrum. Okay, but let's also say that those who are on the left side of the
00:56:11.240
spectrum are hypothetically also concerned with the economic well-being of the most dispossessed,
00:56:17.800
and that those might be equally important concerns, and they're integrally locked together.
00:56:22.420
Well, the strange thing about so many of your recommendations is that they're directly aimed at
00:56:28.380
addressing the immediate now concerns of the fundamentally dispossessed. And so you'd think
00:56:34.780
that that's part of the reason that I can't understand why there's so much objection to your
00:56:40.400
methods, because it's not like what you came up with looks like support for something that's
00:56:46.760
like a right-wing agenda by any stretch of the imagination. I mean, first of all, it's predicated on
00:56:51.620
the idea that there's a certain amount of development aid, especially directed to children,
00:56:55.640
you said not exclusively, that would be of great use. And it seems to me to be undeniable that the
00:57:01.320
most dispossessed people in the world are impoverished infants of impoverished people.
00:57:07.700
So if some of the objections to what you're doing are ideologically motivated, why doesn't that
00:57:17.360
That's a good question. I don't quite understand it. My sense is that in some way, so I was in New York
00:57:24.060
in September, there was the first ever summit at the UN for tuberculosis. And I was there because
00:57:31.820
one of the things that we've identified is this is a great investment. And of course, all the
00:57:36.000
tuberculosis people love us because, you know, we're pointing out you should spend more money on their
00:57:40.360
problem. And unfortunately, almost nobody went. I wrote an op-ed together with the South African health
00:57:47.960
minister, and it was widely published in the developing world. But almost no one in the developed
00:57:53.280
world picked it up. It was only when I wrote another op-ed where I said there were two meetings
00:57:57.520
taking place in New York. One was this TB summit about, you know, the biggest infectious disease killer
00:58:03.060
in the world, where almost nobody turned up. And then there was the other meeting, which Macron,
00:58:08.540
the French president, and Bloomberg and others attended, which was the climate summit.
00:58:15.620
You know, I sort of pointed out the disparity. And then it was picked up in all kinds of developed
00:58:22.340
country papers. I think fundamentally, it comes down to this: while everybody says they care a lot
00:58:28.100
about the world's poor, the reality is that, you know, people care somewhat about them. But most rich,
00:58:35.860
well-meaning people probably care a lot more about the fact that they worry that their kids might be in
00:58:40.760
a position where global warming is really going to undermine the climate.
00:58:43.680
Right, but even that doesn't make sense. Because look, we already established the fact that
00:58:48.320
there's equal reason to be apocalyptically concerned about unchecked poverty and inequality.
00:58:55.260
So this is why I suggested to begin with that one of the lovely things about the idea of climate
00:59:01.180
change is that it really justifies the idea of overthrowing the current system or of undermining
00:59:06.800
the current system. And that if you're inclined to do that, rather than inclined to truly help the
00:59:12.400
dispossessed, let's say, then you'd be more inclined to support a theory that justifies that sort of
00:59:18.100
radical, let's say, interventionist policy. And I can't see a way out of that logically, given that
00:59:23.900
the work that you're doing, and tuberculosis is a great example, is directly and evidentially
00:59:31.480
associated with a marked increase in the well-being of the dispossessed. And so
00:59:38.120
you'd think that would attract the proper amount of ideological attention, but it doesn't. And that's
00:59:43.860
a great mystery. There's something extraordinarily odd about that.
00:59:47.340
And it's a good point. And I think you have a consistent argument. The thing is, I decided a long
00:59:54.760
time ago that I don't want to argue, and I think that's probably the difference between us, you
00:59:59.200
being more psychologically focused. I don't want to argue about what I think might be people's
01:00:04.840
sort of inner motivations. I want to actually take them at face value. And, you know, many people I
01:00:10.160
meet, they say, I worry a lot about climate change. I worry a lot about the world's poor. And then I try
01:00:15.320
to show them, well, if you actually do that, why the hell would you be focusing on spending lots of
01:00:20.300
money that'll do almost no good, instead of spending possibly less money and probably doing a lot
01:00:25.580
more good. And it creates some cognitive dissonance. And I think it switches people a little bit
01:00:32.340
towards spending smarter. But yes, you're right. But telling people...
01:00:36.660
It also might just be ignorance. You know, it's like, what you're doing is pretty new.
01:00:41.960
And it takes a long time. I mean, it's not new for you. And it's not new, considering the span of a
01:00:47.880
single lifetime. But, you know, what you're doing is very radical, in some sense. And there isn't
01:00:54.040
anybody else doing it. And so it might be that it will take 20 years, or 25 years, or something
01:00:59.880
like that, for the approach that you're publicizing and have developed, for people to actually
01:01:06.640
know about it. You know, and so what do they say? You should assume ignorance instead of malevolence
01:01:12.000
when you can. I do think that ignorance is a part of this. It's not obvious to everyone that
01:01:17.520
there is this method of rank ordering, that people have done it, and that there are consequences to
01:01:22.660
that that could be laid out in an intelligent economic plan. You know, and I know that it takes
01:01:27.520
a finding in the scientific literature, if it's going to make its way into the public at all, something
01:01:32.460
approximating 15 years. And those are only the ones that actually do manage to make it. And so it could be
01:01:39.880
that just way more people need to know what you're doing and why and how it was done before it gets
01:01:45.800
the steam going. I'm interested in the psychological issues, in part, to try to help figure out what it
01:01:51.600
would take to motivate people to be more attentive to the sorts of solutions that you and your people
01:01:57.780
have been putting forward, and to eliminate those barriers. But it could just be, as I said, it could
01:02:03.840
just be ignorance. And look, one of the problems that we're facing constantly, and I know
01:02:09.700
why there's no one else doing what we're doing: because when you do prioritization,
01:02:15.760
you inevitably end up antagonizing a lot of people. I mean, climate change is the most obvious
01:02:21.560
one. But for instance, sanitation, water and sanitation, huge problem. So, you know, there's
01:02:26.860
about two and a half billion people affected by this. One of the points that we emphasize is that
01:02:31.880
doing sanitation, while a good thing to do, probably only pays you about $3 back on the dollar.
01:02:37.960
Why? Because it's actually fairly costly to do sanitation, and also because the benefit's not
01:02:43.720
nearly as great as what many have assumed, and this is the new global burden of disease estimates,
01:02:48.720
that the real problem is that what you're doing with sanitation is that you're not removing fecal
01:02:53.700
matter from the environment, you're simply reducing the amount. And so you're not actually having all
01:02:58.260
that much of an impact on the disease. You're having a little bit, but not nearly as much as what we
01:03:02.580
would like to see. That obviously pisses off all the people who are doing sanitation. And, you know,
01:03:07.860
we end up pissing off a lot of people. And the truth is, I think it's necessary, if you're going
01:03:13.240
to do this, that when you rank order, of course, the causes that come out on top love it, and the
01:03:19.200
causes that don't, don't love it. But it's also important to make that argument. And so at the end of the
01:03:25.460
day, certainly my sense is, it's necessary to do it, but it will always entail a great amount of sort of
01:03:33.240
unease. Because it doesn't feel like we're saying, you know, kumbaya, and we should do everything. But
01:03:40.300
we're actually saying, no, you should do these things, but not all of these things. Although they seem
01:03:48.020
Right, right. Well, it would be nice if we could do everything good that we possibly
01:03:55.180
could all at once. But it's not realistic, because you can't do everything at
01:04:01.260
once. And you don't have infinite resources, as we've already pointed out. So, okay, so what would
01:04:06.660
you say would be, if you're going to play devil's advocate against your own position, which I presume
01:04:12.040
you've done a lot of anyway, what, are there criticisms against, let's say, your aims and
01:04:19.700
your methods that you regard as unresolved? Like, what is it that you're doing that's still weak and
01:04:26.800
wrong in your own estimation? Or where are the, where are the limitations in your methods?
01:04:31.340
Well, look, no method does everything. So we have two very obvious problems. One is,
01:04:37.400
not all issues are about money. So we're looking at how you prioritize money. But sometimes money is
01:04:44.240
not the issue. For instance, on free trade, as we talked about before, we estimate the benefit is
01:04:49.460
incredible. But we actually look at the cost as the cost of subsidizing Western farmers, because those
01:04:55.840
are the ones that basically make, you know, a killing from not freeing up global markets. And those are the
01:05:02.320
ones that have usually held it back. But what has happened is it has become much more sort of an
01:05:07.980
emotional thing. It's sort of an identity thing. And I'm not sure how we would cost that. So to the
01:05:13.760
extent that things have nothing to do with money, but are just simply about political
01:05:19.120
willingness or interest or want, then we are not making the argument that is going to convince you.
01:05:27.400
So in some sense, we are telling you where can you spend money, but we're not talking about the
01:05:32.060
things that don't require money. Right. So people would object that the problem with your
01:05:36.640
method is that you're measuring everything that can be tangibly measured from an economic perspective,
01:05:42.200
but that's actually a small fraction of the universe of things that need to be properly attended to.
01:05:47.540
Yeah, I would tend to say it's probably, you know, sort of 70 or 80%. I'm not quite sure how you'd back
01:05:52.880
that up. But you know, the biggest policy decision in most countries is the national
01:05:59.020
budget. It's very clear that that is a very substantial part of deciding how we're going
01:06:04.160
to allocate money. Well, the problem with the objection is that the people who are
01:06:12.760
objecting could be right, that your methods are narrow. I mean, you're making the case that
01:06:17.820
they're not as narrow as a pessimist might assume, but that puts the onus on them to
01:06:23.160
come up with an alternative way of ranking. Right? I mean, I would just say, we're not talking about
01:06:29.020
those last 30% that are, you know, purely about, you know, should we have transgender bathrooms or
01:06:34.960
something? Yeah. Well, that's possibly a bad example, because that actually has costs, building a third
01:06:40.080
bathroom or something. But you know, there's some things that are mostly just about what do you think?
01:06:45.740
What do you believe? What are your intuitions, rather than actual costs? Right, but there's no
01:06:51.540
way of adjudicating between those claims. That's the problem: all that people can do is push each
01:06:56.100
other around about them if they can't adjudicate. And the point is, there's still a substantial number of issues
01:07:01.160
where you do need to look at resource allocation. And there we have a good argument. The second part,
01:07:07.160
the second sort of criticism, which is a very fair criticism is, we don't look at inequality. So
01:07:13.640
economists are very, very bad at dealing with inequality, because fundamentally, that's a
01:07:18.760
political issue. So when we look at you spend a dollar and you do $60 worth of good, we don't look
01:07:24.860
at who gets that $60. Now, to be fair, most of the things that you also pointed out, most of the
01:07:32.400
things that we actually indicate are really, really low hanging fruit in the world are things that will
01:07:36.320
help the world's poorest, mostly because the world's poorest have so many things that they haven't
01:07:41.380
gotten that would be hugely beneficial for them. So it mostly actually helps with inequality too. But we
01:07:48.100
don't measure it. And so we don't actually look at, would this be a good expenditure in the sense
01:07:55.820
of helping the world's poorest? Mostly it would, but it's not part of our framework. And there's
01:08:01.660
a reason for that: because cost-benefit analysis is basically assuming that everybody is
01:08:09.380
worth the same. We talked about that earlier. There's no way of, well, you can weight,
01:08:16.100
but it becomes incredibly unclear and very unintuitive if you start making weightings on
01:08:22.200
who is actually worth more. So we're again saying, it's a little bit like the menu,
01:08:29.060
you know, you get in the restaurant, we're telling you, hey, the spinach is cheap, and it has lots of
01:08:33.460
vitamins, and the cake is expensive, and it's bad for you. But you know, you go ahead and make the
01:08:39.120
choice. And I think that's the fair way to have that conversation, that we're telling you some
01:08:44.020
important facts about your decisions. But these are not the only things that are going to guide your
01:08:48.400
decision. And I'm absolutely happy to say that. So in some sense, you asked me to be, you know,
01:08:53.240
devil's advocate; I just think it's important to clarify: we don't look at all issues, because we
01:08:58.360
only look at issues that require resources. And we don't deal with inequality, which is also an
01:09:04.080
important issue. But apart from that, we have to make priorities. And we're simply making that a little
01:09:10.540
clearer. At the end of the day, you can choose to totally disregard it. But I would imagine that you
01:09:15.280
would at least like to know, what does the evidence tell you if you spend a dollar here? How many people
01:09:20.960
will you save? How many lives will be improved? How much environment will be improved, and so on,
01:09:25.920
versus all the other things where you could have spent that dollar and done different amounts of
01:09:30.780
good in all those different areas. And that's what we provide with the menu. So, okay,
01:09:37.100
well, I appreciate that very much. Do you have any sense, off the top of your head,
01:09:43.200
of what the total capital expenditure for the minimization or eradication of tuberculosis actually
01:09:48.760
would be? What are you talking about in absolute dollar amounts? So, it depends a lot,
01:09:55.440
because, well, we estimated you need about $2 billion. The Global Fund, which we were also
01:10:02.700
campaigning with, says it's about $5.8 billion. And to be quite honest, I'm not quite sure which of
01:10:10.760
these two numbers is the right number. But, you know, just to give
01:10:17.040
you a sense of proportion, the amount of subsidies that we give to solar and wind is
01:10:22.800
about $120 billion right now. So, you know, we're talking about a very, very small amount,
01:10:29.640
certainly a very small amount. You know, it's about, what, three to
01:10:35.120
four percent of global development spending, the amount of money that we spend every
01:10:40.820
year to try and help improve the world. And it will probably be one of the very, very best things
01:10:45.540
that we could do. So again, a very, very small thing. People need to know these things.
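A quick arithmetic check on those proportions, using the figures quoted above; the $150 billion total for annual global development spending is an assumed round number supplied here for illustration.

```python
# Proportion check on the quoted figures (annual, US dollars).
tb_low, tb_high = 2e9, 5.8e9      # the two estimates for tackling TB
solar_wind_subsidies = 120e9      # quoted subsidies to solar and wind
dev_spending = 150e9              # assumed global development spending

print(f"TB vs. renewables subsidies:   {tb_low / solar_wind_subsidies:.1%} "
      f"to {tb_high / solar_wind_subsidies:.1%}")
print(f"TB share of development spend: {tb_low / dev_spending:.1%} "
      f"to {tb_high / dev_spending:.1%}")
```

With that assumed denominator, the higher TB estimate lands at about 4% of development spending, consistent with the three to four percent mentioned.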
01:10:49.700
And I think that if they did know, they would start to care.
01:10:54.180
So, okay. I've got two final questions for you, I think. And then I'll ask you if there's
01:10:59.360
anything else you wanted to bring to people's attention that you thought was particularly
01:11:03.460
necessary. Okay. So, actually, I'll ask the questions all at once.
01:11:12.080
To what degree do you think you've been making headway? Like, obviously you've been successful
01:11:17.640
in putting together your institutes, and your work has garnered a substantial amount of attention,
01:11:22.920
published and otherwise. And so it's not like you've been silenced or imprisoned or
01:11:28.560
anything like that. And so are there reasons for optimism as far as you're concerned? And, and then
01:11:35.560
the next thing that I'd like to ask you about is what's happening in France? Because one of the things
01:11:40.940
that the people who are pushing for radical current interventions with regards to long-term
01:11:47.040
climate change haven't factored in is the reverse apocalyptic issue, which is that there's going
01:11:52.260
to be substantial resistance to the short-term costs that will cause spinoff disasters of their
01:11:58.280
own. And so the French example seems to be a very interesting case in point. So, so the first question
01:12:04.200
was, how do you feel about the impact that you're having? And the second is, what do you have to say
01:12:08.940
about what's happening in France? Yeah. Can I answer them in reverse? Because I think
01:12:14.840
the France point is really a good argument. If you ask people around the world,
01:12:20.220
do you care about global warming? Almost everyone will say yes. Do you want to do something for
01:12:25.360
global warming? They'll say yes. Then when you ask them, how much are you willing to pay? The typical
01:12:30.620
answer, both in rich and poor countries is a couple of hundred dollars per year. So it goes from a hundred
01:12:36.920
to two hundred dollars. So fundamentally what people are saying is, yes, I do worry about this issue.
01:12:42.380
I'm willing to spend a little bit of money, but not very much. And I think this is the fundamental
01:12:46.840
thing that we just have not been able to get to the attention of a lot of people who are pushing for
01:12:52.600
really, really radical solutions. You're never going to succeed in a democratic situation.
01:13:00.140
If you keep ramping up the taxes on fossil fuels, if you keep making energy more and more expensive,
01:13:07.160
it's going to harm the poor the most, first of all; it's very, very regressive. And that obviously
01:13:12.800
means a lot of heartache for a lot of poor people. These are typically also the people who are least
01:13:21.120
able to defend themselves because they're just so busy just surviving their day to day. So it
01:13:27.140
typically has to hit the middle class before you really get sort of an eruption as what you've seen
01:13:31.560
in France and elsewhere. Let's remember there's also a lot of other issues in France. So it's not
01:13:36.100
just because they put three cents on a liter of diesel, but it is an issue of saying, if you push
01:13:44.680
people too far, you will actually not be able to do the solution for climate. And so your very,
01:13:49.860
very expensive solutions are never going to be long-term viable. Think of the idea of
01:13:55.080
saying, if we had Obama, and if he would actually have managed to put a carbon tax on CO2. Remember,
01:14:01.920
he actually had a Democratic Congress the first two years of his presidency, and they were still
01:14:07.580
not able to get a carbon tax implemented. But of course, when you have a Republican Congress,
01:14:12.920
it becomes really hard. When you have a Republican president also, it falls apart. You just can't do
01:14:18.360
this for a hundred years. It's just not going to work out. And that's why, and we never got to that,
01:14:24.140
we actually did a climate consensus where we brought together 27 of the world's top climate
01:14:29.420
economists, three Nobel laureates, to look at where you can do good for the climate. And what they found
01:14:35.200
was that by far the best investment to tackle global warming is to invest in green energy R&D. So
01:14:42.980
fundamentally, if we could invest in making better green energy for the future, hopefully eventually
01:14:48.900
get it to be so cheap that it will out-compete fossil fuels, we will solve global warming just
01:14:53.920
simply because the green energy became cheaper. If you'll allow me a slight detour,
01:14:58.700
back in the 1860s, we were hunting whales to
01:15:16.000
extinction, because whales have this wonderful oil that just burns a lot cleaner and a lot brighter.
01:15:21.740
And so it was wonderful for the houses in North America and in Europe to burn this whale oil and
01:15:28.320
they were all excited about it. And it had the bad side effect that it was actually pushing whales to
01:15:35.220
extinction. Now, the sort of global warming approach to that problem would have been to say,
01:15:40.440
could you please turn down the light? Could you please have a little less light in your room?
01:15:45.260
And of course, you would have entirely failed. What did save the whales was that we discovered oil,
01:15:51.120
you know, fossil fuel oil, which actually burned cleaner. It was much cheaper and you didn't have
01:15:55.620
to go out and kill whales for it. And so what happened over about a decade was we stopped killing whales,
01:16:00.760
because we got a better technological product. And we've seen this a lot of times: technology can
01:16:05.940
simply invalidate an old issue, a problem that you thought was almost intractable. If you get cheaper,
01:16:12.840
smarter, new technology, people will switch. Right. So imposing limits, expensive limits, on
01:16:19.420
people is not an appropriate long-term solution, because of implementation resistance and cost. And
01:16:25.280
the best solution is to come up with, well, to put it in a cliched manner, to come up
01:16:30.620
with a better solution, which is cheaper energy that has all the advantages and fewer
01:16:35.720
disadvantages. And that's really how you solve the problem. Wind turbines, solar panels, and I'm just
01:16:41.400
taking the two most popular things and batteries together. If they were cheaper than fossil fuels,
01:16:45.960
which they aren't right now, but if they were, of course, everyone would buy them. We'd stop buying
01:16:50.720
coal-fired power plants. So it's really not rocket science that way. Now, I'm not saying it's going to
01:16:54.760
be easy and it's certainly not going to happen right now, but it's the only viable long-term solution.
01:16:59.500
It's much cheaper and much more effective. So we estimate that for every dollar spent,
01:17:04.220
you'll actually do about $11 of climate benefit. Okay. So that's on alternative
01:17:09.600
energy. Yes. Yes. Okay. So, you know, it's not the best thing on the list, but it comes in fairly high
01:17:16.760
up. So it actually, you know, is a pretty good investment. Yeah. It's not the best in the world,
01:17:21.520
but we should definitely be doing it. Okay. The second question that you had was the, you know,
01:17:26.320
the optimism. So, you know, fundamentally how much of an impact does this have? Well,
01:17:31.320
it's had the very predictable impact that when we come out and say, for instance,
01:17:42.600
more immunization gives you $60 back on the dollar, the people who are doing
01:17:48.620
immunization tell you all the time, you should fund us, because we do $60 of good for every dollar
01:17:54.620
you spend. So very clearly... you know, I sat down with a guy from a family
01:17:54.620
foundation, a big family foundation. You know, we were at a malaria event, and, you know,
01:17:59.680
we sat and politely conversed, and he would say, so what do you do? Well,
01:18:04.840
I work with the Copenhagen Consensus Center. You know, total blank stare. And then I asked him,
01:18:09.540
what do you do? Well, we work with malaria. You know, did you know that actually,
01:18:13.680
if you spend a dollar on malaria, you'll do $33 worth of good? And I was like, yeah, we did that
01:18:17.820
number. And, you know, this is exactly the point. We're not, we're
01:18:22.780
not there to, you know, get attention. We're there to make sure that we get attention to some of these
01:18:27.440
top ideas. And I think we've definitely helped a lot of these top ideas get a little easier ride
01:18:32.640
in the world. Okay. So let me ask you another cost-benefit question. So, what if I said, wouldn't it be
01:18:40.800
interesting to do a meta cost-benefit analysis on how much money you would
01:18:47.220
need to raise and spend to effectively market your findings, you know, and to hire people who are
01:18:54.260
really good at doing such marketing, so that you had the appropriate advertisements? Because,
01:18:59.740
and this is not a criticism, believe me, but your approach is very academic
01:19:05.020
and very objective, and that's all well and good and reliable and valid and all of those
01:19:11.240
things. But could your economists compute the utility, in dollar value, of establishing
01:19:20.800
an extensive and appropriate marketing scheme? Because, you know, I've
01:19:26.720
been watching what's been happening with the Democrats in the US, and a PAC that
01:19:32.100
I know about has been making new ads for the Democrats, trying to move them towards the
01:19:38.920
center. And they've had a substantial amount of impact because the advertisements have been very
01:19:43.440
professionally crafted and constructed. And so, well, I don't know what you think about that,
01:19:50.020
but part of the reason I'm interviewing you today is because I want to put this up on YouTube
01:19:55.360
and I want to get it out there as a podcast, so that more people know about this. And so, all right.
01:20:00.800
Yeah. So the short answer, of course, is we have tried to do that, because we're economists and
01:20:06.800
we think that would be a good idea. So let me just tell you about
01:20:10.520
something else that we've done. So, uh, so we've been working a lot and we were just talking about
01:20:14.760
the globe. Uh, so when you do prioritization on the planetary scale, it's academically very
01:20:20.920
interesting, but unfortunately the impact is mostly that people say, yeah, that's probably true
01:20:26.000
somewhere else. Yeah. So when we go and tell the Indians this, they'll say, yeah, that's probably true
01:20:30.000
for Mexico and Mexico will say it's probably true for Argentina. And so, so everybody just pushes
01:20:34.560
it off to somebody else. The globe is never anyone's problem. You know, someone else needs to fix it.
01:20:40.440
And so what we've increasingly done is to do this in nations. Uh, so we did this a couple of years
01:20:46.380
ago in Bangladesh, uh, and in the nation, of course, there's an overlap between who actually decides
01:20:52.220
where to spend the money and the problems that we're analyzing. So we did the exact same
01:20:57.000
thing for just for Bangladesh. So we did a prioritization list of all the things. So
01:21:02.240
we talked to everyone in Bangladesh. We got them to say, what are the best things that you want to
01:21:06.820
focus on? What are the smartest new solutions? We worked with the premier think tanks, the finance
01:21:11.200
minister, everybody else, and talked about where can you actually spend money? Right. Again, this is,
01:21:16.140
this is a menu that's not entirely politically correct. Certainly the politicians don't like all of this,
01:21:21.440
but what the, you know, the finance minister would go like, no, no, no. Oh yes. Yes. I like this.
01:21:26.340
Yeah. I see. And so they did some of the things. Huh. So that solves the Tower of Babel problem to
01:21:31.200
some degree, which is that there's too much distance. Yeah. There's too much distance between the local and
01:21:35.520
the global. And so you're using the nation state as a psychological intermediary. Yeah. So there we
01:21:43.120
actually estimate that every dollar spent on what we were doing, uh, does, you know, at least $10,000 worth of
01:21:49.080
good simply because we helped shift spending in Bangladesh, both on their national budget,
01:21:55.380
but also the development budget spending in Bangladesh. And again, we didn't change, you
01:22:00.500
know, Bangladesh to suddenly become super rational or anything. Yeah. We simply changed it a little
01:22:05.340
bit. We gave, if you will, headwind to the poor ideas and tailwind to the good ones. We also did
01:22:11.420
that in Haiti, and we're now doing this in India together with Tata Trusts. I just came back, so that's
01:22:16.380
why I'm a little jet-lagged, a little bleary. I just came back from Ghana, where
01:22:21.740
we're going to do our next project. So we want to bring this to Africa. Great. Great. So you can do
01:22:26.100
it globally and locally. Yes. And I think we'll have a lot more opportunity to actually get people's
01:22:32.080
attention when you talk specifically to the nation states where you're actually going to be making
01:22:37.040
these decisions. But obviously if it's true in Ghana, it's probably also true in the neighboring
01:22:41.680
states. You're going to get the overlap. Yeah. Exactly. So I think I would love to,
01:22:46.180
you know, have a lot of extra sort of PR ability, but I think what
01:22:58.480
we really need is just simply to be able to come up with all these great ideas. And I think we
01:22:58.480
are reasonably successful, but, but the problem in some ways is we're advocates of all the boring
01:23:04.300
stuff. You know, all the stuff that gets the attention are the ones that have the crying babies
01:23:09.080
and cute animals. Yeah. But God, the thing is, it's not true. What you're doing
01:23:13.160
isn't boring. It's absolutely exciting beyond belief. I mean, I don't believe that it is dry, and I don't
01:23:19.500
believe that it doesn't have a powerful narrative message. It's just that there are reasons that the
01:23:24.420
narrative message is obscured. And I think putting your finger on the gap between the local and the
01:23:30.100
global is a good one. That's smart because you, you see the same problem starting to emerge,
01:23:35.360
for example, with the EU where the overarching bureaucracy is so distant from the people on
01:23:40.700
the ground that there's a disconnect in identification. And so to harness the latent
01:23:46.320
power of the nation state, its patriotism and, what would you call it, its rather local
01:23:53.560
economic structure, seems to me to be a really smart solution. I really was thrilled when I
01:23:59.820
came across your work. And I mean, on an emotional level, because I thought, well, wow, if we really
01:24:04.840
wanted to do good, if, if that was the goal and we wanted to do it intelligently and carefully and,
01:24:10.660
and agnostically to some degree, um, and in a non-ideologically self-serving manner, which would
01:24:16.920
be a lovely thing to manage if it was possible, then this seemed to be an incredibly exciting approach.
01:24:21.520
And I believe that there's every reason to assume that, with proper publicity, people
01:24:28.660
could really get behind the idea that we wouldn't have to have tuberculosis in 10 years, you know,
01:24:33.760
and that some of these things could really happen. And, you know, I've talked to people on my
01:24:38.160
lecture tour about the things that we could do to make the world a better place. And there's no
01:24:42.040
shortage of enthusiasm for that. It's just that there's a fair bit of cynicism, but there's a lot
01:24:47.740
more ignorance. It's like, well, we just don't know what to do. But your prescriptions say, well,
01:24:53.840
here's 10 things; you could do one of them, and it would work. And so you've got a bit of a
01:24:58.100
choice there. You wouldn't have to take their number one priority as your own; you could take number
01:25:02.860
five and you'd still do a lovely job, and it might be in accordance with your own motivations.
01:25:07.600
Okay. Can you get me PDFs, for example, of whatever you think is relevant,
01:25:16.020
whatever would inform people in short order? I will post them in the description of the video.
01:25:22.600
Yeah. So that people can get access to that. So whatever material you'd like to have disseminated
01:25:27.340
as a consequence of this conversation, just get that to me and I'll post it and any blurb you want
01:25:32.780
me to put in the description, then I'll also do that. When do you need it by?
01:25:37.820
As soon as you can get it, because I'll put this up very soon, like in the next day.
01:25:42.240
Okay. Yes. All right. Okay. Is there anything else that you would like to say that we haven't
01:25:48.940
covered? I mean, first of all, I think it's wonderful the way that you're talking
01:25:53.940
about this as sort of an outsider, that it's actually exciting. Because we probably
01:25:59.300
make the mistake of thinking of ourselves as a little bit sort of technocrats and, you know,
01:26:04.100
fiddlers at the margin kind of thing. And actually, you're right. It is exciting,
01:26:08.500
because we could actually spend our resources four times as well as what we're doing right
01:26:15.240
now. So instead of doing all the things that are politically correct and make us all feel
01:26:19.140
warm and fuzzy, actually doing the things that perhaps are not top of our mind, not top
01:26:24.660
of the agenda, but would just do an amazing amount of good. Why the hell
01:26:29.460
are we not doing that? So yes, thank you very much for, for getting me back in the groove and
01:26:34.200
actually saying this is incredibly useful. Yeah. Yes. I think, I think it's a mistake
01:26:40.860
not to view what you're doing as an exciting adventure. I think it's a form of,
01:26:45.120
what, unhelpful humility. Yeah. And I'm Danish, uh,
01:26:55.460
by background, and one of the things we're very, very good at is self-effacement. And so,
01:27:00.620
yes, you're probably right. So you're calling people, you're calling people to a great adventure
01:27:05.620
here. Yes. You're saying, look, we have enough resources so that we could deflect a small
01:27:11.100
fraction of them and do an unbelievable amount of good. And, and there would be no downside
01:27:16.580
to that. There would be no downside to anyone. There would be nothing but upside, even with
01:27:20.960
some error and there's going to be error. And so of course there is. And I think we'll find
01:27:25.500
that people will respond to this video in a very, very positive way. And because there,
01:27:30.100
I do believe that, as people become more aware that we are becoming richer and that we do
01:27:36.900
have more resources at our disposal, and that that's genuinely true, and that with a bit of intelligent
01:27:42.080
consideration we could make things a lot less dismal for a lot of people, I think that they
01:27:48.520
will start to view that as something that's part of a great national and global adventure.
01:27:54.200
And, and because you're agnostic and because you've done the legwork, then people can also,
01:28:00.320
I think, get behind that without any real cynicism. They can say, look, our money spent here is not
01:28:05.480
going to be wasted. We can be reasonably assured that if we're charitable in this direction, we're
01:28:09.860
going to do something positive. And so, so anyways, like I said, I found it incredibly
01:28:14.400
interesting. Wonderful. And let me, let me just say, again, we've said a couple of times that
01:28:19.720
something that I've done, I need to say, you know, I'm just a sock puppet that talks about all these
01:28:25.220
things. This is the work of an incredible number of really, really smart economists,
01:28:30.020
some of the world's top economists, you know, seven Nobel laureates. Those are the guys
01:28:34.360
who've actually done the work and who are giving the credibility to all of this research. And
01:28:39.680
that's why we can say with a great amount of certainty, this is not just sort of a whim,
01:28:44.900
but something that is probably the best we know now. It's probably, as you point out,
01:28:49.120
not entirely true, but it's certainly better than anything else we have right now.
01:28:52.920
Right. So good. That's it. Well, and that's actually the right comparison. It's like,
01:28:57.560
it's not absolutely true. Yeah. But no one's got a better idea. Right. So we go with the best idea
01:29:03.560
that we have. Exactly. Right. Right. All right. Well, look, it was a great pleasure talking to
01:29:09.380
you. And I'm hoping that, you know, a million people will watch this, and that we'll get another
01:29:13.960
million podcast listens out of it, and that this will help disseminate what you're doing broadly. I think
01:29:20.100
that would be lovely. You bet. Thank you. And it was great also meeting you. And I hope our paths
01:29:27.900
will cross again soon. And, you know, we should, we should make a habit of this. Yes, we definitely
01:29:32.620
should. Well, when you start to develop some more of the national indices, have you worked on any
01:29:37.300
Western countries like Canada, the United States? So, I wanted to do that. But we haven't
01:29:42.720
done it in the US, just simply because it's too dysfunctional in so many different ways. We've seen
01:29:46.740
a couple of other people trying to do this without our involvement. And one of the things we're really,
01:29:51.900
really good at is that, you know, you get all these economists to do all this stuff. And the first
01:29:56.240
draft of the paper says, this is really hard, we need 10 years and a lot of research money, and then we'll
01:30:01.340
probably get you something. And, you know, we say, look, that gets us nowhere. You've
01:30:05.380
got to, you know, give it your best shot: use the information that's out there now and
01:30:09.500
give us your best knowledge, because politicians are going to make decisions next year. And
01:30:14.100
that's what has happened in those two places. So they did one in Holland. They did one in, it might actually
01:30:20.240
be Canada, but that was like 10 years ago. It ends up very much like a very interesting anthology
01:30:26.480
of points and directions, but with no sort of consistency, unlike what we're
01:30:32.620
trying to achieve. What would it cost? What would it cost to have something like this done
01:30:37.420
for Canada? Uh, so the short, the short version is it costs about two and a half million dollars.
01:30:43.880
Simply it's, it's, it doesn't really scale well. Uh, so doing it for, for Ghana is the same
01:30:49.220
thing as doing it for Canada. How long would it take? Uh, 18 months. God, that's such a good
01:30:55.660
idea. So two and a half million dollars, 18 months. Yeah. Okay. Well, I'm going to wrap
01:31:00.620
that around and maybe we can figure out how to raise the money. We should, we should try
01:31:05.480
and do that. That would be fantastic. And I think there'd be a lot of interest in Canada
01:31:09.160
and Canada would be a great way to also get a sort of bridge point into the US without doing
01:31:13.800
it in the US. Yep. Yep. Well, that would be good, because we're flailing about politically
01:31:18.100
and it would be nice to. Yes. Okay. So that's fodder for
01:31:24.060
another conversation. We, I've got a couple of other conversations that I'd like to have
01:31:27.320
with you about policy development and also about marketing, but we'll save those for another