The Jordan B. Peterson Podcast - December 30, 2021


213. Don't Climate Panic - An Investigation into The Proposed Solutions to Climate Change


Episode Stats

Length

1 hour and 11 minutes

Words per Minute

172.5

Word Count

12,383

Sentence Count

636

Misogynist Sentences

5

Hate Speech Sentences

16


Summary

Dr. Jordan B. Peterson has created a new series that could be a lifeline for those battling depression and anxiety. With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way. In his new series, he provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward. If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better. Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson's new series on depression and anxiety. Let this be the first step towards the brighter future you deserve. In the previous compilation episodes, we looked at the progress of the human race. Today, we're tackling the topic of climate change. Are we experiencing an increasingly worsening climate? Is it possible that rising temperatures and tides will kill us all? What's the truth? Thankfully, there are a growing number of reputable scientists and authors presenting a much more optimistic story, using the same data set as many of the climate alarmists. This episode is brought to you by Schwank Grills, an American-made infrared grill that heats up to a whopping 1500 degrees Fahrenheit. Get $150 off an American-made Schwank grill by using promo code JP150 at checkout at schwankgrills.com. Enjoy the episode.
Season 4, Episode 70 of The Jordan B. Peterson Podcast is available on Daily Wire Plus.


Transcript

00:00:00.000 Hey everyone, real quick before you skip, I want to talk to you about something serious and
00:00:05.560 important. Dr. Jordan Peterson has created a new series that could be a lifeline for those
00:00:10.560 battling depression and anxiety. We know how isolating and overwhelming these conditions can
00:00:15.700 be, and we wanted to take a moment to reach out to those listening who may be struggling.
00:00:20.080 With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you
00:00:25.520 might be feeling this way in his new series. He provides a roadmap towards healing, showing that
00:00:30.400 while the journey isn't easy, it's absolutely possible to find your way forward. If you're
00:00:35.700 suffering, please know you are not alone. There's hope, and there's a path to feeling better. Go to
00:00:42.100 Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety. Let this be
00:00:48.080 the first step towards the brighter future you deserve. Welcome to the Jordan B. Peterson podcast,
00:00:56.660 season four, episode 70. In the previous compilation episodes, we looked at the progress of the human
00:01:02.760 race. Today, we're tackling the topic of climate change. Are we experiencing an increasingly worsening
00:01:09.000 climate? Is it possible that rising temperatures and tides will kill us all? What's the truth?
00:01:13.660 Thankfully, there are a growing number of reputable scientists and authors that are presenting a much
00:01:18.700 more optimistic story, using the same data set as many of the climate alarmists. People like Bjorn
00:01:25.460 Lomborg, Michael Schellenberger, and Matt Ridley explain that while climate change is real, it's not the
00:01:31.980 threat that we've been told it is. This episode is brought to you by Schwank Grills, hands down the best
00:01:38.380 grill I've ever used. As you probably know, my family is very experienced at cooking meat. Very.
00:01:43.980 So I would say I'm a trustworthy source of information about how to do that well. And Schwank Grilling is
00:01:48.340 up there. Schwank Grills are made in America. They're portable infrared grills that heat up to a
00:01:53.480 whopping 1500 degrees Fahrenheit. Being able to do that at home is something that you can't usually do.
00:01:59.480 That exceptional quality and taste where the steak's crispy on the outside, but rare or medium rare and
00:02:05.020 juicy on the inside. A Maillard reaction. This grill does all that, but in your backyard or patio or even out
00:02:11.680 camping. It doesn't smoke, it's super easy to clean, and it gets you high quality steakhouse worthy steaks
00:02:17.380 like you get at Morton's, Cut 432, Capital Grille, etc. These grills are the best I've seen on the market.
00:02:24.860 Steaks take around three minutes depending on thickness, or you can do chicken wings, fish, hamburgers,
00:02:29.520 vegetables, lamb chops. My favorite. It even comes with a pizza stone if you're a pizza eater.
00:02:34.600 I know. I know that most people are pizza eaters, so I thought I'd throw that in there.
00:02:39.020 Try their special offer by heading to schwankgrills.com right now. That's S-C-H-W-A-N-K
00:02:46.560 for $150 off an American-made Schwank 1500 degree grill when using promo code JP150 at checkout.
00:02:55.140 Important note, you must have the Schwank grill and grill cover in your cart for the code to work.
00:02:59.500 Again, that's $150 off Schwank grills with promo code JP150 at checkout. Enjoy the episode.
00:03:06.740 Making poor people richer is an extremely intelligent environmental move for a variety of reasons.
00:03:30.320 I mean, the first is perhaps that once you get people above a certain level of income,
00:03:35.460 they can start buying fuels that are cleaner than the fuels they use now.
00:03:42.100 Dung and wood and that kind of thing.
00:03:45.480 But also that as people move up the economic hierarchy,
00:03:49.520 they have time to be concerned about things that are more abstract,
00:03:53.220 like what the environment is going to be like for their children,
00:03:57.400 which they're not going to be, or when they go on holiday, for example,
00:04:01.680 you know, or even where they live as they have some options to choose where to live.
00:04:06.920 And so it could be, you know,
00:04:10.680 we often construe the relationship between the economy and the environment as a zero-sum game, right?
00:04:17.640 And the biologists in particular, broadly speaking,
00:04:21.160 the political biologists have a proclivity to do that.
00:04:24.480 That as the economy grows, we sacrifice the environment to it.
00:04:28.840 But it could be the case that we get the best environmental bang for the buck
00:04:34.660 by making the poor rich as fast as we possibly can around the world.
00:04:39.880 And if we make poor economic decisions,
00:04:42.420 because we're catastrophizing a certain kind of environmental calamity,
00:04:46.400 we're inviting, we're actually increasing the risk of environmental degradation
00:04:51.900 in the medium and the long term.
00:04:54.500 Do you think that's reasonable?
00:04:57.240 Yes, absolutely so, and in a number of different ways.
00:05:00.180 So I think it's funny how we don't recognize how terrible it is to be poor.
00:05:06.740 If you're poor, you're vulnerable in all kinds of ways.
00:05:10.360 You're very clearly incredibly vulnerable to global warming.
00:05:14.200 So if you remember, there was a big hurricane, Haiyan, hitting the Philippines back in 2013.
00:05:22.040 It was made a big deal out of as global warming.
00:05:25.420 It hit this very, very poor city where most of its citizens live under corrugated roofs.
00:05:33.700 Not surprisingly, having a Category 5 hurricane is terrible when you live under a corrugated roof.
00:05:39.340 The best way to help these people, obviously, would be to lift them out of poverty.
00:05:43.720 What we actually see is, back in the early part of the last century,
00:05:48.740 a similar hurricane hit and eradicated about half the city.
00:05:53.000 This time, it was only about a 20th of the city.
00:05:56.740 So much, much better because the city was much richer.
00:05:59.900 But if we focused on making them even richer,
00:06:02.500 they would be much better off just simply from the point of view of being more protected from hurricanes.
00:06:09.740 So, you know, fundamentally, there's something weird about us saying,
00:06:13.400 oh, those poor people in the Philippines, we should help them by not driving our car today.
00:06:19.180 What?
00:06:19.780 No, you should help them by becoming rich, becoming part of the integrated global economy,
00:06:25.000 making sure that their kids would be better educated,
00:06:28.540 not die from easily curable infectious diseases and so on.
00:06:31.380 So not only would it be better environmentally,
00:06:34.440 but it would obviously also be better for them educationally,
00:06:37.400 for them health-wise and all these other things.
00:06:40.360 It would simply generate much, much better lives in the Philippines.
00:06:46.180 But as you also pointed out, as you get richer,
00:06:49.700 you're actually cleaner in almost all ways.
00:06:52.220 You don't use dung and cardboard and wood to cook inside,
00:06:56.660 but also you stop cutting down forests.
00:06:59.820 You move to the city instead, you become a web designer or something else that's very,
00:07:04.840 very little related to actually clearing out forest land.
00:07:09.460 You do a lot of things in cities that are much more ecologically sustainable.
00:07:14.560 And of course, in the long run, you will actually also say,
00:07:17.480 I would like to make sure that we have better regulations,
00:07:19.920 so we have less air pollution,
00:07:21.640 so we have many of the other things that drive environmental benefits.
00:07:25.200 So absolutely, by getting people out of poverty,
00:07:29.220 we fix most environmental problems.
00:07:31.820 But, and this is the important but, we don't fix global warming.
00:07:35.160 As you get richer, you just simply emit more and more CO2,
00:07:38.600 because these guys will then start flying around the world.
00:07:41.540 They'll start consuming a lot more meat.
00:07:44.000 They'll be doing a lot of other things because they're richer.
00:07:46.460 That's wonderful for them, but it will mean higher emissions of CO2.
00:07:50.820 So we do need to have a conversation about how we're going to fix that problem.
00:07:55.780 Okay, so why don't you lead us down that path?
00:07:58.260 Okay, let me comment a bit on what you just said,
00:08:00.740 and then let's go down that pathway.
00:08:02.780 Okay, so to swallow what you just said and to believe it,
00:08:08.080 there's a set of beliefs that you have to have already in place.
00:08:12.160 You have to believe that the current economic system isn't fatally flawed
00:08:16.480 and basically works, or at least works better than any hypothetical alternatives
00:08:21.880 that have been tried or that we can dream up.
00:08:24.500 So it basically works.
00:08:25.700 And works means, as it runs, it tends to lift people out of absolute poverty.
00:08:30.580 There's still a maintenance of relative poverty,
00:08:34.840 but absolute poverty tends to disappear.
00:08:36.880 And there seems to be really good evidence for that,
00:08:39.040 especially across, well, since the Industrial Revolution,
00:08:41.860 but it's really taken off in the last 30 years,
00:08:44.340 maybe non-coincidentally with the demise of communism,
00:08:48.500 which was a competing economic theory
00:08:52.920 and produced all sorts of bad economic decisions.
00:08:55.620 In any case, you have to buy the hypothesis that the current system works
00:09:00.000 and that extending it is going to be better.
00:09:02.800 And so you don't get to adopt a stance of revolutionary criticism
00:09:10.120 of the Western capitalist hierarchy.
00:09:14.000 So that's a big sacrifice if your thinking is oriented in that direction.
00:09:19.160 Now, I don't know really what to make of that
00:09:21.720 because you'd think the evidence that the poor
00:09:26.860 had been lifted out of poverty at an unbelievable,
00:09:29.980 like an astonishing rate since the year 2000,
00:09:32.800 not just in China, but all over the world,
00:09:36.260 would be essentially irrefutable evidence
00:09:40.080 that the current system works.
00:09:43.580 And then if you look at China
00:09:44.900 after they adopted free market policies
00:09:48.000 compared to before they adopted free market policies,
00:09:50.520 there's absolutely no comparison with regard to growth.
00:09:53.960 And so it isn't obvious to me
00:09:56.520 how if you were truly concerned with the poor,
00:09:59.380 you'd be able to deny the sorts of propositions that you put forward.
00:10:06.120 I don't understand that.
00:10:08.020 Maybe it's partly because people just don't know
00:10:11.120 how much better things have gone in the last 20 years and why.
00:10:15.840 Because it has been difficult news to bring forward
00:10:18.640 and it's difficult to market.
00:10:20.480 If I can just...
00:10:23.660 Yes.
00:10:24.020 Yes.
00:10:24.740 So one of the things I think people don't recognize,
00:10:27.860 if you look at a graph over the last 200 years,
00:10:31.700 200 years ago, almost everyone in the world
00:10:33.820 was absolutely poor in the sense of less than a dollar a day.
00:10:37.220 Less than a dollar a day.
00:10:38.120 Yeah.
00:10:39.240 95% of humanity was below that level.
00:10:42.620 And we've just seen a dramatic decline.
00:10:44.860 As you mentioned, we're now down below 10%.
00:10:47.800 Even despite COVID,
00:10:49.860 which, as a lot of people have pointed out,
00:10:51.340 has actually made more people poor,
00:10:54.540 we've gone from seven up to about 9%.
00:10:57.340 And so we've delayed the benefit for a couple of years.
00:11:01.760 That's terrible.
00:11:02.400 And I would rather not have had that happen,
00:11:04.340 but it doesn't change the long-term trajectory.
00:11:06.700 That's amazingly downwards in the sense
00:11:09.220 that we have many, many fewer people that are poor.
00:11:12.620 One of my favorite guys, who runs the Our World in Data website,
00:11:18.040 he points out that every year for the last 25 years,
00:11:23.260 the headline of every newspaper around the world
00:11:25.820 could have been over the last 24 hours,
00:11:28.500 138,000 people have been lifted out of poverty.
00:11:32.660 138,000 people every day for the last 25 years.
00:11:37.380 But of course, it's not news because it happened every day.
00:11:41.100 It was not, you know, some, oh, this day it happened.
00:11:44.960 We don't get this good news.
00:11:47.180 And I think we need to get them in order to be able to understand
00:11:50.540 the magnitude of what we're talking about.
00:11:53.780 Well, you know, the problem with accepting that good news
00:11:56.280 or a problem with it is that it pretty much eradicates
00:11:59.580 the romantic rebel, you know,
00:12:02.780 because it all of a sudden makes it very difficult
00:12:05.420 for you to be cool, to find something cool
00:12:08.800 to stand up against and to resist.
00:12:12.460 You know, you have a benevolent,
00:12:16.540 relatively benevolent society that's getting incrementally better.
00:12:20.280 It's not a villain that you can heroically resist.
00:12:22.740 And that is, and I'm not being cynical about that.
00:12:26.260 That is actually a problem
00:12:27.440 because resisting arbitrary authority is a good story
00:12:33.680 and it's served people well for a very long time.
00:12:36.960 And if you don't have that to catalyze your identity,
00:12:40.680 you have to search for something perhaps equally grand.
00:12:44.980 And that's difficult,
00:12:47.020 especially when you also don't have to go out
00:12:48.960 and contend with the brute force of Mother Nature
00:12:51.720 to anywhere near the degree that you once had to.
00:12:55.460 But if you look at it,
00:12:57.020 there's plenty of other things you could stand up to.
00:12:59.540 And that was what we were talking about.
00:13:00.840 Instead of being the romantic hero
00:13:02.980 that stands up against society,
00:13:04.460 why aren't you the romantic hero
00:13:05.920 that stands up against tuberculosis
00:13:07.580 or the one that stands up against maternal death
00:13:10.800 or the one that stands up for free trade
00:13:12.840 or the ones that stand up for all these other things
00:13:15.360 where we know for very little money,
00:13:17.460 we can make a tremendous benefit.
00:13:19.520 So again, I get why it's not as sexy.
00:13:22.000 That's a really hard question, man.
00:13:23.260 I mean, I think it might have something to do also
00:13:25.420 with the inability to utilize your resentment.
00:13:29.320 You know, if you're resentful about things
00:13:30.960 and you oppose the capitalist state,
00:13:34.100 you can easily identify an enemy.
00:13:36.040 But if you stand up against tuberculosis,
00:13:39.340 like obviously tuberculosis is bad.
00:13:41.580 It doesn't make you look good by comparison.
00:13:43.880 Environmentalism is depressing.
00:13:49.860 It's actually bad for mental health.
00:13:52.580 I think that's now being proven quite dramatically
00:13:54.740 with rising levels of anxiety and depression
00:13:57.160 and reports by school children around the world
00:13:59.500 that they were having nightmares about climate change.
00:14:02.120 You may know that half of all people surveyed
00:14:04.120 say that they think climate change
00:14:05.780 could result in the extinction of humankind.
00:14:08.080 My views have evolved over the years,
00:14:09.960 but I've always viewed apocalyptic environmentalism
00:14:13.500 as a problem for people that care about saving nature,
00:14:18.000 for people that, for everybody.
00:14:20.080 And so those awards came from that prior book.
00:14:23.820 Yeah, well, the environmental activism issue
00:14:26.980 is interesting because, at least in part,
00:14:29.220 because it also, it seems to me,
00:14:31.040 interferes with sensible policymaking.
00:14:33.100 So it's actually self-defeating in a profound sense.
00:14:36.240 I mean, first of all, it gets people hyper-worried
00:14:38.380 about extremely vaguely formulated problems,
00:14:42.640 distracts them from what the prioritized issues might be.
00:14:47.780 And, well, it's hard to think clearly
00:14:51.000 about what steps to take to move forward
00:14:52.880 when you're panicking in a vague and unpleasant manner.
00:14:56.280 So, and that, you do not do that in Apocalypse Never.
00:14:59.220 That's one of the things I really liked about it
00:15:00.980 was that in each sub-chapter,
00:15:03.340 you drill down, at least to some degree,
00:15:05.540 to the level of actually, actual implementable policy.
00:15:08.900 So you start with a story about this group.
00:15:11.120 And, no, I should ask you first,
00:15:12.340 who are you exactly to write such a book?
00:15:14.340 Like, why do you know this?
00:15:15.400 And why should people listen to you?
00:15:17.840 Sure.
00:15:18.740 So I've been an environmental activist for 25 years.
00:15:21.760 I've also been, I'm also an environmental journalist.
00:15:24.220 I write a column for Forbes.
00:15:26.700 This is, that's my, Apocalypse Never is my second book.
00:15:30.540 You know, I don't have any formal qualifications.
00:15:32.640 I was a cultural anthropologist.
00:15:35.520 I quit my PhD program in the 1990s
00:15:37.540 because the program had become too postmodern and abstruse.
00:15:41.280 The first big essay I wrote was called
00:15:43.540 The Death of Environmentalism.
00:15:45.060 And then I mentioned the book Breakthrough.
00:15:47.360 I mean, you may find interesting that, you know,
00:15:49.000 my father is a very humanistic psychologist
00:15:52.120 in the same tradition of work that you are in,
00:15:54.740 or I see us in.
00:15:55.660 And I knew that environmentalism was making me depressed,
00:16:00.220 like climate change was depressing me.
00:16:03.040 And that, and so one of the famous lines
00:16:05.060 from The Death of Environmentalism,
00:16:06.440 which was an essay in 2004,
00:16:07.780 was Martin Luther King didn't give the
00:16:09.980 I Have a Nightmare speech.
00:16:11.500 He gave the I Have a Dream speech.
00:16:13.580 And we wrote that because I was reading,
00:16:16.180 I would read books about the civil rights movement,
00:16:18.680 and I would feel inspired by these stories
00:16:20.620 of heroic overcoming.
00:16:22.080 And then I would read books by Bill McKibben
00:16:24.900 and other environmentalists,
00:16:25.820 and I would feel depressed.
00:16:27.480 And I thought, you know,
00:16:28.540 something that makes you feel depressed
00:16:29.960 is probably not very motivating
00:16:31.200 to make positive social change.
00:16:33.300 Yeah, you kind of wonder, you kind of wonder too,
00:16:35.920 and this is, since we're talking about
00:16:37.280 psychological issues, is that it's possible too
00:16:40.860 that that kind of apocalyptic thinking
00:16:42.860 is much more difficult for people to escape
00:16:46.140 when they are in fact depressed.
00:16:49.180 And so it's very difficult to separate out
00:16:51.320 political beliefs from, let's say, emotional state.
00:16:56.020 And so that's an interesting issue in and of itself.
00:17:00.080 You know, people might object,
00:17:01.440 well, you know, the crisis is so gloomy
00:17:04.700 if you're a realist that, of course, you're depressed.
00:17:07.800 And it should be the case because, you know,
00:17:10.760 look how depressing the facts are.
00:17:12.540 But that strikes me as, well,
00:17:16.100 it kind of puts the cart before the horse in some sense.
00:17:18.300 It's like, are you sure the crisis is of that proportion?
00:17:21.320 And then are you sure that depressing people
00:17:23.820 is precisely the way to go about it?
00:17:25.400 And then last thing there may be is,
00:17:27.720 I couldn't shake the suspicion,
00:17:30.100 especially in relationship to environmentalism,
00:17:32.420 that it's contaminated quite badly
00:17:35.860 with like historical shame and guilt
00:17:38.380 and a certain kind of profound anti-humanism.
00:17:40.920 And so, and I mean contaminated by that.
00:17:44.120 You know, I've heard environmentalists say something like,
00:17:47.220 well, the planet would be better off
00:17:48.840 as if it was a being in some sense,
00:17:51.400 if there were no people on it.
00:17:53.220 It's like, yeah, well,
00:17:54.700 I'm not so sure I trust people
00:17:57.440 who say things like that and then don't notice.
00:17:59.760 So, yeah, I mean, I was,
00:18:03.080 I, one of the things I stumbled across,
00:18:04.500 I mean, I think at the end of Apocalypse Never,
00:18:05.960 in the False Gods for Lost Souls chapter,
00:18:08.600 I talk about how I myself was depressed
00:18:11.880 at a period when I was drawn
00:18:13.300 towards apocalyptic environmentalism.
00:18:15.460 So I think there's an interesting question of,
00:18:17.340 is apocalyptic environmentalism depressing
00:18:19.180 or is, or are depressed people
00:18:22.360 attracted to apocalyptic environmentalism
00:18:24.320 or both, of course.
00:18:25.980 I stumbled across the work of Aaron Beck,
00:18:28.320 who, you know, the founder
00:18:29.700 of Cognitive Behavioral Therapy,
00:18:31.980 one of the founders.
00:18:33.520 And I was struck that the three structures
00:18:35.740 of depressed people that he identified,
00:18:38.620 I'm a terrible person,
00:18:39.920 the world is a terrible place
00:18:41.440 and the future is bleak,
00:18:43.240 that that's the exact same three structures
00:18:45.880 of every environmental narrative.
00:18:48.800 So every environmental narrative
00:18:49.880 is that humans are terrible,
00:18:51.620 cancer on the planet,
00:18:53.440 the world is going to hell in a handbasket
00:18:56.000 and the future is not,
00:18:57.580 you know, the end is nigh.
00:18:59.380 And that struck me really-
00:19:00.640 Yeah, well, so that's a very interesting observation,
00:19:02.440 especially in relationship
00:19:03.540 to your comments about school children.
00:19:05.580 And so perhaps driving those three axioms home,
00:19:10.540 you know, emphatically and forcefully,
00:19:12.960 isn't the wisest thing to be doing
00:19:14.440 to young children.
00:19:15.820 And the fact of that overlap
00:19:17.260 with depressive thinking,
00:19:18.360 I mean, Beck's no small figure
00:19:19.820 in the history of psychological thinking.
00:19:22.080 He's also extraordinarily practical
00:19:23.760 as is cognitive behavioral therapy.
00:19:25.760 And it also has a,
00:19:27.380 what would you say,
00:19:28.220 as a psychological philosophy
00:19:30.420 or as a branch of medicine even,
00:19:34.020 one of the things
00:19:34.680 the cognitive behaviorists
00:19:35.800 are really, really good at,
00:19:37.600 and I did this in my clinical practice,
00:19:39.340 is to take those vague
00:19:40.900 depressive apprehensions
00:19:42.600 and then break them down
00:19:44.420 into micro-problems
00:19:46.100 that can actually be addressed.
00:19:47.640 And that's much less depressing.
00:19:50.420 It's like, well,
00:19:50.880 exactly why is the future so depressing
00:19:53.420 as far as you're concerned?
00:19:54.600 Like, in some detail,
00:19:55.900 not vague.
00:19:57.280 Now, look,
00:19:58.540 if you're going to run away from something
00:20:00.020 because it hurts
00:20:01.200 and it's dangerous,
00:20:02.200 it doesn't really matter
00:20:03.160 if you have a vague conception of it,
00:20:05.320 right?
00:20:06.140 But if you're going to face it
00:20:07.400 and confront it
00:20:08.260 and solve it,
00:20:09.320 let's say,
00:20:09.700 then you can't be vague about it.
00:20:11.800 And that's also good
00:20:12.980 for your mental health.
00:20:13.940 That approach,
00:20:16.300 orientation,
00:20:17.780 is directly linked
00:20:18.960 biochemically
00:20:19.940 and neurophysiologically
00:20:21.000 to positive emotion.
00:20:22.180 So the process
00:20:22.860 of decomposing
00:20:23.720 these terrible abstract problems
00:20:25.320 into solvable micro-problems
00:20:27.040 actually facilitates
00:20:28.160 positive emotion
00:20:28.900 and suppresses anxiety.
00:20:30.840 And so there is
00:20:31.260 a very interesting overlap there.
00:20:33.120 And it's worth thinking about.
00:20:35.100 I viewed writing,
00:20:35.960 I viewed Apocalypse
00:20:36.640 Never as cognitive behavioral therapy,
00:20:38.880 both for myself
00:20:40.300 and for other people.
00:20:41.560 And in fact,
00:20:42.360 the highest praise
00:20:43.240 I received from people
00:20:44.180 is people who told me
00:20:45.100 that they were very depressed
00:20:46.260 about the environment
00:20:47.600 and then they read Apocalypse Never
00:20:48.900 and they felt much better.
00:20:50.520 And so I think
00:20:51.180 you have to do both things.
00:20:52.320 Like, as you pointed out,
00:20:53.580 cognitive behavioral therapy
00:20:54.600 requires,
00:20:56.040 you know,
00:20:56.200 Beck's therapy was
00:20:57.140 you have to be
00:20:58.380 very concrete
00:20:59.340 about why you're
00:21:00.340 a good person,
00:21:01.300 why the world
00:21:01.820 is a good place
00:21:02.600 and why the future is bright.
00:21:03.880 You have to be
00:21:04.200 very specific about it.
00:21:05.720 It has to be very,
00:21:07.060 it has to be evidence-based.
00:21:08.340 It can't be fantasy land.
00:21:10.960 It has to be actionable as well.
00:21:12.560 Yeah.
00:21:12.760 Oh, that's so interesting
00:21:13.840 because I wouldn't have,
00:21:14.720 I certainly didn't get that sense
00:21:16.320 reading the book,
00:21:17.260 you know,
00:21:17.520 that it,
00:21:17.920 that although
00:21:18.460 you could also,
00:21:20.080 although
00:21:20.420 illuminating
00:21:22.560 the fact that
00:21:23.900 the problems
00:21:24.920 that beset us
00:21:26.180 globally
00:21:26.680 and individually
00:21:27.460 are actually
00:21:28.320 actionable
00:21:29.300 and aren't so dismal
00:21:30.900 when you look at them
00:21:31.660 in detail
00:21:32.080 and are also complex
00:21:33.360 in weirdly interesting ways.
00:21:34.920 It's not surprising
00:21:36.980 that that has positive
00:21:37.880 psychological consequences.
00:21:39.540 I mean,
00:21:39.720 I certainly was pleased,
00:21:41.220 for example,
00:21:41.720 by your discussion
00:21:42.680 of plastics.
00:21:44.720 You know,
00:21:44.960 I've been following
00:21:45.820 the work of
00:21:46.420 this Dutch kid,
00:21:47.760 I don't remember his name,
00:21:48.960 but he's built this gadget
00:21:50.080 for gathering plastic,
00:21:51.340 which is quite cool.
00:21:52.720 And,
00:21:52.920 and I didn't know
00:21:55.400 that the evidence
00:21:56.380 for the decomposition
00:21:57.680 of plastics
00:21:58.500 was as robust
00:21:59.620 as you describe
00:22:00.940 in the book.
00:22:01.420 So I thought,
00:22:02.060 hey,
00:22:02.220 isn't that good?
00:22:02.940 That's a positive
00:22:05.560 thing to see.
00:22:06.260 And I saw many examples
00:22:07.440 of that in the book,
00:22:08.440 that,
00:22:08.760 that things aren't
00:22:10.380 as bad as we think.
00:22:11.320 So let's go through that.
00:22:12.300 Let's start.
00:22:12.820 You start talking
00:22:13.660 about this group,
00:22:14.680 I think it's a UK group,
00:22:15.980 Extinction Rebellion.
00:22:17.120 And I kind of see them
00:22:18.440 in some sense
00:22:19.080 as the forerunners
00:22:20.220 of where we might go
00:22:21.600 if we regard
00:22:22.460 the impending
00:22:23.620 climate catastrophe
00:22:24.920 as a doom
00:22:26.120 and gloom
00:22:26.660 laden
00:22:27.700 existential crisis.
00:22:28.920 It's like,
00:22:29.760 man,
00:22:30.100 half the people
00:22:30.760 on the planet
00:22:31.260 are going to die.
00:22:32.900 No solution
00:22:35.080 is too drastic.
00:22:36.860 Okay,
00:22:37.240 so that's Extinction Rebellion
00:22:39.040 in some sense.
00:22:39.780 So maybe you could
00:22:40.800 tell the story
00:22:41.360 about that.
00:22:43.260 Going online
00:22:44.180 without ExpressVPN
00:22:45.140 is like not paying
00:22:46.260 attention to the safety
00:22:47.100 demonstration on a flight.
00:22:48.760 Most of the time
00:22:49.440 you'll probably be fine,
00:22:50.840 but what if one day
00:22:51.980 that weird yellow mask
00:22:53.200 drops down from overhead
00:22:54.220 and you have no idea
00:22:55.720 what to do?
00:22:56.520 In our hyper-connected world,
00:22:57.980 your digital privacy
00:22:58.940 isn't just a luxury.
00:22:59.920 It's a fundamental right.
00:23:01.640 Every time you connect
00:23:02.420 to an unsecured network
00:23:03.600 in a cafe,
00:23:04.540 hotel,
00:23:05.080 or airport,
00:23:05.900 you're essentially
00:23:06.460 broadcasting your
00:23:07.340 personal information
00:23:08.160 to anyone with
00:23:09.120 the technical know-how
00:23:10.040 to intercept it.
00:23:10.960 And let's be clear,
00:23:11.820 it doesn't take
00:23:12.560 a genius hacker
00:23:13.220 to do this.
00:23:14.160 With some off-the-shelf
00:23:15.140 hardware,
00:23:15.740 even a tech-savvy teenager
00:23:17.140 could potentially access
00:23:18.280 your passwords,
00:23:19.300 bank logins,
00:23:20.060 and credit card details.
00:23:21.560 Now,
00:23:21.960 you might think,
00:23:22.780 what's the big deal?
00:23:23.660 Who'd want my data anyway?
00:23:25.200 Well,
00:23:25.600 on the dark web,
00:23:26.360 your personal information
00:23:27.480 could fetch up to $1,000.
00:23:29.820 That's right,
00:23:30.360 there's a whole underground economy
00:23:31.900 built on stolen identities.
00:23:33.880 Enter ExpressVPN.
00:23:35.640 It's like a digital fortress,
00:23:37.100 creating an encrypted tunnel
00:23:38.280 between your device
00:23:39.300 and the internet.
00:23:40.320 Their encryption is so robust
00:23:41.900 that it would take a hacker
00:23:42.940 with a supercomputer
00:23:43.780 over a billion years
00:23:45.180 to crack it.
00:23:45.920 But don't let its power
00:23:46.940 fool you.
00:23:47.820 ExpressVPN is incredibly
00:23:49.120 user-friendly.
00:23:50.140 With just one click,
00:23:51.120 you're protected
00:23:51.580 across all your devices.
00:23:53.140 Phones,
00:23:53.640 laptops,
00:23:54.020 tablets,
00:23:54.760 you name it.
00:23:55.360 That's why I use ExpressVPN
00:23:56.700 whenever I'm traveling
00:23:57.840 or working from a coffee shop.
00:23:59.480 It gives me peace of mind
00:24:00.520 knowing that my research,
00:24:01.860 communications,
00:24:02.620 and personal data
00:24:03.600 are shielded from prying eyes.
00:24:05.460 Secure your online data today
00:24:07.000 by visiting
00:24:07.620 expressvpn.com
00:24:09.080 slash jordan.
00:24:10.200 That's
00:24:10.520 e-x-p-r-e-s-s-v-p-n
00:24:12.840 dot com
00:24:13.320 slash jordan
00:24:13.960 and you can get
00:24:14.600 an extra three months free.
00:24:16.620 expressvpn.com
00:24:17.740 slash jordan.
00:25:30.160 I was going to say,
00:25:31.120 they say no solution
00:25:32.540 is too drastic
00:25:33.280 unless it's nuclear energy,
00:25:34.720 in which case
00:25:35.200 they're against it.
00:25:36.280 Or in case it's fracking,
00:25:37.880 in which case
00:25:38.260 they're against that too.
00:25:39.440 And I get at that right away,
00:25:40.580 which is: why are the people
00:25:41.440 who are the most apocalyptic
00:25:42.460 the most dead set
00:25:44.480 against the things
00:25:45.440 that have reduced
00:25:46.220 carbon emissions,
00:25:47.440 natural gas and nuclear,
00:25:49.520 by far the two things
00:25:50.580 that have reduced
00:25:51.000 carbon emissions the most,
00:25:52.420 instead they're in favor
00:25:53.280 of things that don't work,
00:25:55.160 adding a lot of unreliable
00:25:56.800 renewables onto your grid,
00:25:58.860 making electricity expensive,
00:26:00.900 making societies
00:26:02.220 less resilient to climate change.
00:26:04.480 Those are all high priorities
00:26:06.020 for the apocalyptic
00:26:07.000 environmental movement.
00:26:08.120 So it's not just that.
00:26:08.820 So why, why, why, why?
00:26:09.920 What's going on?
00:26:11.340 Like, well, I mean,
00:26:11.940 it was interesting.
00:26:12.340 It's so interesting.
00:26:13.820 Yeah.
00:26:14.460 Well, I mean,
00:26:14.940 you were on my mind a bit
00:26:16.160 when I was working,
00:26:16.900 particularly towards
00:26:17.700 the last chapter.
00:26:18.680 I go through three core motivations.
00:26:20.520 One is there's certainly
00:26:21.220 powerful financial interests
00:26:22.560 that work,
00:26:23.460 renewable energy companies.
00:26:25.020 I document how fossil energy
00:26:26.700 companies have financed
00:26:27.820 anti-nuclear campaigns
00:26:29.000 for 50 years.
00:26:31.040 I also have the third,
00:26:33.040 it's the chapter,
00:26:34.080 there's chapters 10, 11, 12,
00:26:35.580 the last three chapters
00:26:36.360 of the book
00:26:36.720 look at the motivations.
00:26:38.220 Chapter 11 is more
00:26:39.200 on kind of will to power,
00:26:40.880 a desire for status,
00:26:42.660 for feeling important,
00:26:44.200 particularly places like Europe,
00:26:45.400 which are becoming irrelevant
00:26:46.720 with the rise of China,
00:26:48.600 wanting to assert their power
00:26:50.100 over the developing world.
00:26:52.140 You know,
00:26:52.280 it's no coincidence,
00:26:53.020 I think,
00:26:53.620 that as Europe's power
00:26:54.660 has faded,
00:26:55.440 they've become more demanding
00:26:56.860 to take control
00:26:57.740 of the international economy
00:26:59.000 in the name of climate change.
00:27:00.960 And then the third chapter
00:27:02.180 kind of says,
00:27:02.860 you know,
00:27:03.040 those are both
00:27:03.560 important motivations,
00:27:04.620 but there's something else
00:27:05.720 going on,
00:27:06.480 which is that
00:27:07.240 apocalyptic environmentalism
00:27:08.580 is clearly
00:27:09.200 a religious movement.
00:27:10.500 Everything about it,
00:27:11.340 the guilt,
00:27:13.740 the original sin,
00:27:15.420 the apocalypse,
00:27:17.260 the obsession with food,
00:27:20.500 you know,
00:27:20.760 various things about it
00:27:21.960 are clearly a religion
00:27:22.880 and I'm hardly the first
00:27:23.860 to make that observation.
00:27:25.940 I document,
00:27:26.720 in fact,
00:27:26.960 there's actually good
00:27:27.480 empirical work
00:27:28.160 documenting that.
00:27:30.040 And so I see the,
00:27:31.160 you know,
00:27:31.440 rising secularization,
00:27:33.200 what Nietzsche called
00:27:34.000 the death of God
00:27:34.920 and the nihilistic vacuum
00:27:36.440 that would be created
00:27:37.140 in its wake
00:27:37.800 as really the underlying engine
00:27:40.060 for apocalyptic environmentalism.
00:27:42.360 It's a way to give meaning
00:27:44.360 to the world.
00:27:46.160 So, you know,
00:27:46.880 I'm writing a new book
00:27:48.140 which is going to be called
00:27:49.280 We Who Wrestle With God
00:27:50.800 and obviously,
00:27:54.760 we're thinking along the same lines
00:27:56.320 and for some of the same reasons
00:27:57.720 in that
00:27:58.100 there's this
00:27:59.740 adage in the New Testament
00:28:01.980 that warns people
00:28:03.880 that they should deliver
00:28:04.960 unto God
00:28:05.460 that which is God's
00:28:06.260 and unto Caesar
00:28:07.000 that which is Caesar's
00:28:08.080 and, of course,
00:28:09.180 on that statement
00:28:09.860 is built the notion
00:28:10.860 that separation of church
00:28:12.500 and state
00:28:12.840 is actually appropriate.
00:28:13.920 But I also think
00:28:14.860 that's true psychologically
00:28:16.000 and this is part of the problem
00:28:17.520 I have with the new atheist movement
00:28:19.080 which is that
00:28:20.060 if you don't have a domain
00:28:22.240 that's sacred
00:28:22.940 and rituals
00:28:26.280 and some understanding
00:28:28.000 that there are deepest values
00:28:29.780 and that's the domain
00:28:31.000 of the sacred
00:28:31.540 whether you like it or not,
00:28:33.300 you obliterate that
00:28:35.080 in the name of rationality
00:28:36.300 and all that happens
00:28:37.000 is that things
00:28:38.180 that are Caesar's
00:28:39.420 now become contaminated
00:28:40.560 with the religious
00:28:41.540 and that's really
00:28:42.780 not a good thing.
00:28:43.860 It's seriously
00:28:44.460 not a good thing.
00:28:45.780 So it's interesting
00:28:46.400 to see you
00:28:47.020 close the book
00:28:48.300 with that kind of,
00:28:49.800 you know,
00:28:50.020 with thinking
00:28:50.420 that's along the same
00:28:51.300 sort of line.
00:28:52.700 And so did you see that
00:28:53.580 working in you personally?
00:28:55.860 Yes.
00:28:56.460 Okay, how?
00:28:59.300 Yeah,
00:28:59.820 I mean,
00:29:00.220 when I was apocalyptic
00:29:01.180 about climate change,
00:29:02.700 you mean?
00:29:02.920 Yes,
00:29:03.400 for sure.
00:29:04.380 And I came back
00:29:05.520 to my Christianity
00:29:06.900 in writing Apocalypse Never,
00:29:09.460 but I also became convinced,
00:29:11.840 by Jonathan Haidt
00:29:13.460 and others,
00:29:14.380 that having faith
00:29:15.660 was rational.
00:29:18.340 So,
00:29:19.100 you know,
00:29:19.600 that it's actually
00:29:20.180 psychologically healthy
00:29:21.100 to have a faith
00:29:22.000 and so I had to get over
00:29:23.020 my own
00:29:24.120 demonization
00:29:26.120 of spirituality
00:29:27.060 or demonization
00:29:27.800 of faith
00:29:29.200 and that unlocked it;
00:29:30.920 I couldn't finish
00:29:31.520 Apocalypse Never,
00:29:32.320 actually, until I had
00:29:32.920 done that.
00:29:32.920 No,
00:29:33.740 I wouldn't have guessed
00:29:34.900 that again
00:29:35.380 from reading the book
00:29:36.900 because
00:29:36.900 that isn't obvious
00:29:38.660 just as the psychological
00:29:39.800 issue wasn't obvious
00:29:40.900 and that,
00:29:41.300 I think that's a really
00:29:42.060 good thing by the way
00:29:42.960 that should all be
00:29:43.660 implicit in the book
00:29:44.520 rather than explicit.
00:29:46.160 It makes for a better,
00:29:47.600 a less cluttered book
00:29:48.660 let's say.
00:29:49.920 Yeah,
00:29:51.020 I mean,
00:29:51.740 some of my best allies,
00:29:53.520 Steven Pinker,
00:29:54.840 Michael Shermer,
00:29:55.460 are in the New Atheist movement,
00:29:57.140 and I really regard them
00:29:59.040 as friends,
00:29:59.540 I love them,
00:30:01.880 and Steve also
00:30:02.860 blurbed my new book,
00:30:04.160 San Fransicko,
00:30:05.820 and then I'm doing
00:30:06.260 a third book afterwards
00:30:07.280 and all three books
00:30:08.320 are basically
00:30:08.940 about the threats
00:30:11.360 to civilization
00:30:12.140 from within
00:30:16.560 and they all conclude,
00:30:17.740 San Fransicko looks at
00:30:20.560 the secular religion
00:30:21.480 of compassion
00:30:23.100 and how it's gone
00:30:24.360 completely crazy
00:30:25.360 to basically result
00:30:27.380 in greater victimization
00:30:28.660 in the name
00:30:29.220 of rescuing victims
00:30:30.360 and so I'm definitely
00:30:32.580 after,
00:30:33.420 I think we're after
00:30:33.980 the same big prey here
00:30:35.400 which is,
00:30:36.600 you know,
00:30:37.100 the threats to civilization
00:30:38.860 are coming from
00:30:39.840 the most civilized
00:30:40.800 members of society
00:30:42.180 who are also
00:30:44.260 the most secular members
00:30:45.460 or they think
00:30:46.260 they're the most secular
00:30:46.840 members of society
00:30:48.720 and they're projecting
00:30:49.600 their needs,
00:30:50.480 they're constructing
00:30:51.440 new religions.
00:30:52.820 Yeah,
00:30:53.000 well,
00:30:53.180 they're also,
00:30:53.920 so,
00:30:54.280 you know,
00:30:54.520 with the death of God
00:30:55.560 and this is Nietzsche
00:30:56.620 through Jung,
00:30:57.460 I suppose,
00:30:57.980 because Jung was
00:30:58.640 a great student
00:30:59.320 of Nietzsche
00:30:59.840 as much as he was
00:31:01.560 a student of Freud's,
00:31:03.240 for sure,
00:31:05.280 and Jung was really
00:31:06.520 trying to solve
00:31:07.420 the problem
00:31:08.060 that Nietzsche posed
00:31:09.160 and that was
00:31:10.020 his life's work
00:31:10.840 and I think in many ways
00:31:12.300 he actually managed that
00:31:13.600 pointing out,
00:31:14.760 first of all,
00:31:15.200 that we cannot create
00:31:16.220 our own values,
00:31:17.060 that's actually not possible,
00:31:18.300 we're not wise enough,
00:31:19.320 smart enough,
00:31:19.800 we don't live long enough,
00:31:21.420 we just don't have
00:31:22.260 that much intellectual,
00:31:24.900 spiritual capacity,
00:31:25.840 we have to depend
00:31:26.640 at least to some degree
00:31:27.500 on tradition
00:31:28.020 and that brings up
00:31:28.720 all sorts of problems
00:31:29.640 and that guilt
00:31:30.320 you talked about,
00:31:31.100 like that religious guilt,
00:31:32.420 I was watching Guy Ritchie's
00:31:34.040 King Arthur the other day
00:31:35.500 and when the king-to-be,
00:31:38.140 Arthur,
00:31:39.620 puts his hands
00:31:40.760 on the sword,
00:31:41.980 he has this unbearable vision
00:31:43.880 of his uncle
00:31:45.140 killing his father,
00:31:46.760 the evil uncle,
00:31:47.480 and the evil uncle
00:31:48.440 is a very standard
00:31:49.800 archetypal trope,
00:31:50.960 you see it in
00:31:51.560 The Lion King,
00:31:52.600 for example,
00:31:53.240 with Scar
00:31:53.700 and the evil uncle
00:31:54.980 is often the tyrannical aspect
00:31:56.980 of the patriarchy,
00:31:58.320 let's say
00:31:58.780 and, you know,
00:31:59.860 we all exist in relationship
00:32:01.240 to that
00:32:01.880 because we all exist
00:32:02.800 in relationship
00:32:03.420 to this patriarchal
00:32:05.160 social structure,
00:32:06.080 history,
00:32:06.680 because we're
00:32:06.960 historical creatures
00:32:07.760 and then we all
00:32:08.860 do have this guilt
00:32:10.000 that overwhelms us
00:32:11.000 about the blood
00:32:12.080 and gore
00:32:12.680 and catastrophe
00:32:13.480 that got us
00:32:14.540 to where we are,
00:32:15.460 our unearned privilege,
00:32:16.940 you know,
00:32:17.160 to take a phrase
00:32:17.880 from the radical leftists,
00:32:19.360 it's part of our
00:32:20.200 existential burden
00:32:21.180 and the existential psychologists
00:32:23.920 who were followers
00:32:25.140 mostly of,
00:32:27.300 I can't remember
00:32:30.000 the philosopher's name
00:32:30.880 momentarily,
00:32:31.600 the one who wrote Being and Time.
00:32:32.500 Heidegger,
00:32:33.220 Heidegger,
00:32:33.900 you know,
00:32:34.140 Heidegger talked about
00:32:35.020 being thrown into the world,
00:32:36.640 so you're arbitrarily
00:32:38.000 put somewhere,
00:32:39.360 parents are arbitrary,
00:32:41.160 you're subject to society
00:32:43.620 and you have these
00:32:44.220 existential concerns
00:32:45.180 that will never go away
00:32:46.160 and one of them is
00:32:46.940 the terrible corrupt weight
00:32:48.880 of history
00:32:49.460 and how are you related
00:32:51.120 to that as an ethical being
00:32:52.940 and the radical leftists
00:32:54.340 are definitely wrestling
00:32:55.480 with that,
00:32:56.460 you know,
00:32:56.720 but in their depression,
00:32:58.160 let's say,
00:32:58.620 they can only see
00:32:59.600 the negative aspect
00:33:00.580 of the patriarchal figure
00:33:02.300 and not the positive aspect
00:33:03.600 and that's a real catastrophe
00:33:04.960 because,
00:33:06.100 well,
00:33:06.640 makes you ungrateful
00:33:08.080 for one thing,
00:33:09.420 which is not a good idea
00:33:10.460 in a modern state.
00:33:11.940 So,
00:33:12.540 okay,
00:33:13.180 well,
00:33:13.560 let's go back
00:33:14.160 to Extinction Rebellion
00:33:16.620 and so,
00:33:17.380 you talk about
00:33:18.180 this activist group
00:33:19.720 that's highly motivated
00:33:21.260 to point out the crisis
00:33:22.780 and to take whatever
00:33:23.860 steps are necessary,
00:33:26.020 but they won't do
00:33:26.980 practical things.
00:33:28.100 Nuclear energy,
00:33:29.320 for example,
00:33:30.000 that's a really interesting one
00:33:31.300 and so,
00:33:31.920 why not?
00:33:33.040 Is that part
00:33:34.480 of the contamination
00:33:35.480 of the environmentalist movement
00:33:36.920 with anti-capitalism per se
00:33:38.780 or what's going on there?
00:33:40.240 It's like...
00:33:40.640 Well,
00:33:40.780 yeah,
00:33:40.940 for sure.
00:33:41.520 I mean,
00:33:41.660 I think you,
00:33:42.500 yes.
00:33:43.060 So,
00:33:43.340 and this is also in my new book,
00:33:46.380 which is,
00:33:46.860 why are the main advocates
00:33:49.460 for action on the issue
00:33:51.560 opposed to the obvious solutions,
00:33:54.380 the solutions that have worked,
00:33:55.500 that have proven to work?
00:33:57.260 And so,
00:33:57.800 yes,
00:33:58.100 for sure,
00:33:58.620 because their motivation
00:33:59.800 is to destroy the whole system.
00:34:01.360 They view the system
00:34:02.200 as the cause of the problem
00:34:03.600 and they view anything
00:34:04.280 that distracts attention
00:34:06.600 from destroying
00:34:07.600 what they view
00:34:08.320 as an evil system
00:34:09.260 as,
00:34:10.520 in some ways,
00:34:10.860 participating in the system.
00:34:12.300 So,
00:34:12.480 that's definitely going on.
00:34:13.380 Right, right.
00:34:13.780 Yeah.
00:34:14.180 Yeah,
00:34:14.400 I've seen that sort of thinking
00:34:15.780 really destroy people too.
00:34:17.420 Like,
00:34:17.580 I've seen people literally
00:34:19.080 take their own lives
00:34:20.600 because they thought that way.
00:34:21.920 They felt they were so corrupt
00:34:23.760 that any ambitious achievement
00:34:26.720 whatsoever in the service
00:34:28.220 of this evil structure
00:34:29.380 was ethically forbidden.
00:34:31.740 And so,
00:34:32.540 it's kind of,
00:34:33.260 it's like the ultimate
00:34:34.140 in pessimistic,
00:34:35.320 nihilistic Buddhism.
00:34:36.160 And it's also another example
00:34:37.600 of that global thinking,
00:34:39.580 global vague thinking
00:34:40.480 that does,
00:34:41.240 in fact,
00:34:41.660 characterize clinical depression.
00:34:44.560 Yeah,
00:34:44.740 I mean,
00:34:44.920 it's interesting.
00:34:45.640 I mean,
00:34:45.780 one of the things,
00:34:46.700 I talk about how Greta Thunberg,
00:34:48.100 the Swedish youth climate activist,
00:34:51.000 condemned nuclear power
00:34:52.220 as dangerous,
00:34:54.220 unnecessary,
00:34:55.000 and too expensive.
00:34:56.500 Well,
00:34:56.840 since when does she care
00:34:57.760 about too expensive?
00:34:58.820 I mean,
00:34:59.020 she's demanding basically
00:35:00.380 that we,
00:35:01.340 you know,
00:35:02.060 grind economic growth
00:35:03.240 to a halt
00:35:03.780 in order to reduce
00:35:04.700 carbon emissions.
00:35:06.700 You know,
00:35:07.400 she condemns basically
00:35:08.920 any modest progress
00:35:10.240 as inadequate,
00:35:11.680 and yet she comes out
00:35:12.580 against the source of power,
00:35:14.460 the zero carbon source
00:35:15.920 of power
00:35:16.240 that provides 40%
00:35:17.320 of the electricity
00:35:17.920 in her own country.
00:35:19.720 When our allies
00:35:21.220 in Germany,
00:35:21.780 who have been speaking out
00:35:22.740 to stop Germany
00:35:23.760 from shutting down
00:35:24.440 its last six nuclear reactors,
00:35:26.500 reached out to her
00:35:27.180 to get her to say something,
00:35:27.880 she wouldn't do it.
00:35:29.420 So the problem is
00:35:30.580 solving the problem
00:35:31.440 gets in the way
00:35:32.200 of the alarmism.
00:35:33.400 The alarmism isn't just,
00:35:35.300 I think journalists
00:35:36.220 and others misunderstand
00:35:37.440 the alarmism.
00:35:38.860 They think it's a tactic
00:35:39.920 to achieve some end.
00:35:41.260 And so one of the things
00:35:42.020 I would get from journalists
00:35:43.760 is they would say,
00:35:44.600 come on,
00:35:44.920 Michael,
00:35:45.160 don't you think
00:35:45.580 that it's important
00:35:46.160 to exaggerate climate change
00:35:48.500 a little bit
00:35:48.900 in order to get action?
00:35:49.760 Well, first of all,
00:35:50.400 there's no evidence
00:35:51.100 that exaggerating
00:35:51.920 the problem
00:35:52.320 gets more action.
00:35:53.520 Yeah, the answer
00:35:54.040 to that is no.
00:35:55.840 Let's not lie, okay?
00:35:57.800 I don't care
00:35:58.440 what the reason is here.
00:35:59.600 No lying,
00:36:00.900 especially about
00:36:01.520 something important.
00:36:02.700 I mean, it's notable
00:36:03.280 that it comes from journalists
00:36:04.440 who have become
00:36:05.520 propagandists, effectively.
00:36:07.980 And so the alarmism
00:36:09.220 is the goal.
00:36:10.500 Like, the goal
00:36:11.180 is the alarmism.
00:36:12.260 Yeah, well,
00:36:14.320 okay, so let's
00:36:15.280 dig down here
00:36:16.180 a little bit.
00:36:16.940 And so part
00:36:19.280 of what Nietzsche
00:36:23.660 described
00:36:24.760 and predicted was
00:36:25.660 that the death
00:36:26.680 of God meant
00:36:27.340 the collapse
00:36:28.100 of the highest
00:36:28.880 unifying value.
00:36:30.700 Okay, so it's
00:36:31.500 become pretty evident
00:36:32.360 to me that
00:36:33.060 we literally
00:36:34.620 perceive the world
00:36:35.980 through a hierarchy
00:36:36.760 of value,
00:36:37.280 and we certainly
00:36:38.300 organize our social
00:36:39.380 communities
00:36:40.060 inside a hierarchy
00:36:41.760 of value.
00:36:42.320 And there has to be
00:36:43.080 something at the top
00:36:43.980 to unite us.
00:36:44.820 Now, it isn't obvious
00:36:45.640 what should be
00:36:46.160 at the top.
00:36:46.800 In fact, it's so
00:36:47.520 not obvious
00:36:48.140 that we probably
00:36:49.020 can only think
00:36:49.980 about that
00:36:50.580 in images.
00:36:51.880 We're not
00:36:52.340 philosophically astute
00:36:53.580 enough to actually
00:36:54.420 conceptualize it.
00:36:55.880 And a lot of the
00:36:56.440 religious enterprise
00:36:57.240 is the attempt
00:36:58.180 to conceptualize
00:36:59.100 that thing at the top.
00:37:00.560 Now, let's say
00:37:01.220 it dies because
00:37:02.060 it's God
00:37:02.620 and it got too abstract.
00:37:04.260 Mircea Eliade,
00:37:05.120 the historian of religion,
00:37:06.280 said that that happened
00:37:07.220 many times in our history,
00:37:08.660 that the top value
00:37:09.720 got so abstract
00:37:10.660 it got disembodied
00:37:11.700 and people didn't
00:37:12.600 know what it was
00:37:13.140 anymore or how to
00:37:13.780 act it out
00:37:14.240 or what it meant
00:37:14.840 and so it floated
00:37:15.620 away and then
00:37:16.380 collapsed into
00:37:17.820 competing claims
00:37:20.060 about what should
00:37:21.280 be the highest value.
00:37:23.180 Well, let's say
00:37:23.840 diversity,
00:37:25.420 equity,
00:37:27.820 compassion.
00:37:28.820 Well, why shouldn't
00:37:29.560 compassion be
00:37:30.340 the highest value?
00:37:31.440 Well, you know,
00:37:32.680 that's a reasonable
00:37:33.480 thing to argue about.
00:37:34.620 I think there's
00:37:35.720 some credibility
00:37:36.740 in the claim
00:37:37.360 that love should
00:37:38.040 be the highest value,
00:37:39.820 perhaps.
00:37:40.340 There's truth
00:37:40.920 and beauty,
00:37:41.480 many other issues.
00:37:43.200 Okay, so
00:37:43.740 the highest value
00:37:45.080 collapsed,
00:37:45.640 we're not united
00:37:46.300 anymore.
00:37:46.840 Well, then we're
00:37:47.320 motivated to argue
00:37:48.360 about what the
00:37:49.080 highest value
00:37:49.640 should be.
00:37:50.660 And since it's
00:37:51.260 about the highest
00:37:51.940 value now,
00:37:52.760 now I have an idea
00:37:53.900 it's saving the
00:37:54.700 environment,
00:37:55.140 that's the highest
00:37:55.720 value.
00:37:56.960 Well, when you
00:37:57.680 attack that,
00:37:58.560 then you attack
00:37:59.500 my claim to
00:38:00.480 embody the
00:38:01.680 highest ideal.
00:38:03.460 And so you
00:38:03.920 threaten me
00:38:04.680 psychologically,
00:38:05.500 because that's
00:38:05.920 where I found
00:38:06.520 some refuge
00:38:07.220 and some ethical
00:38:08.640 guidance.
00:38:09.580 And so I'm
00:38:11.240 not going to
00:38:11.680 listen to your
00:38:12.220 practical solutions
00:38:13.200 either.
00:38:13.700 And then I
00:38:14.040 haven't examined
00:38:14.760 what other
00:38:15.280 motivations I
00:38:16.180 might have,
00:38:16.840 like, well,
00:38:17.460 this anti-capitalism
00:38:18.580 issue, that's a
00:38:19.440 terrible contamination
00:38:20.860 for the environmentalist
00:38:22.200 movement.
00:38:23.300 So, because you're
00:38:24.660 just not going to
00:38:25.240 solve both of those
00:38:26.040 problems at the
00:38:26.760 same time.
00:38:27.240 You want to
00:38:27.640 dispense with
00:38:28.480 capitalism,
00:38:29.580 invent an entire
00:38:30.400 new economic
00:38:31.060 system and save
00:38:32.400 the planet.
00:38:32.900 Part of the
00:38:36.100 problem is that
00:38:36.480 they're not
00:38:36.740 actually sincere
00:38:37.540 about it.
00:38:38.100 So they would
00:38:38.560 suggest nature
00:38:39.280 is the highest
00:38:39.880 value.
00:38:40.500 But when you
00:38:40.820 say, okay,
00:38:41.780 well, here's
00:38:42.060 what you could
00:38:42.420 do to save
00:38:42.980 nature,
00:38:44.240 fertilizer,
00:38:45.140 irrigation,
00:38:45.760 and tractors
00:38:46.260 for poor
00:38:46.680 countries,
00:38:47.120 so they can
00:38:47.800 take the
00:38:48.400 pressure off
00:38:48.860 the forests,
00:38:49.460 which is where
00:38:49.820 the gorillas
00:38:50.740 and the nature
00:38:51.300 is, using
00:38:52.720 oil rather than
00:38:53.760 whale oil to
00:38:54.680 save the whales,
00:38:55.520 and using nuclear
00:38:56.240 power and natural
00:38:56.900 gas.
00:38:57.380 No, no, they
00:38:57.860 don't want to
00:38:58.160 do any of those
00:38:58.780 things.
00:38:59.720 So there is a
00:39:00.280 nihilism there
00:39:01.180 in the sense
00:39:01.640 that the goal
00:39:02.760 is power
00:39:03.620 itself.
00:39:04.700 No, there's
00:39:05.020 also no such
00:39:05.840 thing as
00:39:06.300 nature.
00:39:07.140 Like, think
00:39:07.600 about that.
00:39:08.300 It's like when
00:39:09.240 you refer to
00:39:10.040 France as an
00:39:12.480 entity, as a
00:39:13.280 person, so
00:39:14.440 you're personifying
00:39:15.380 it, or maybe
00:39:16.040 you're deifying
00:39:16.760 it to some
00:39:17.380 degree.
00:39:18.220 Well, that's
00:39:18.660 what happens
00:39:19.160 with nature.
00:39:20.020 It's like,
00:39:20.480 nature, what
00:39:21.380 is that exactly?
00:39:22.500 Well, everyone
00:39:23.200 knows it's like
00:39:24.180 an old growth
00:39:24.940 forest or
00:39:25.520 something.
00:39:25.920 There's some
00:39:26.320 vague set of
00:39:27.440 images, but
00:39:28.160 nature, conceptualized
00:39:30.440 in that manner,
00:39:31.080 is actually
00:39:31.560 a deity of
00:39:32.940 sorts, an
00:39:33.840 unexamined
00:39:34.500 deity, and
00:39:35.180 God only
00:39:36.080 knows what it
00:39:36.720 means.
00:39:37.120 I mean, you
00:39:37.680 look at what
00:39:38.260 happened in
00:39:38.600 Nazi Germany
00:39:39.260 before the
00:39:39.860 Nazis took
00:39:40.440 power, because
00:39:41.280 they were
00:39:41.520 allied pretty
00:39:42.000 tightly with
00:39:42.520 certain kinds
00:39:43.040 of environmentalist
00:39:43.960 thinking.
00:39:44.660 Purity, for
00:39:45.560 example, very
00:39:46.800 big pushback
00:39:48.000 against invasive
00:39:48.900 species, for
00:39:50.040 example, which
00:39:50.660 is quite
00:39:50.940 interesting.
00:39:51.940 It's like, well,
00:39:52.680 there is this
00:39:53.320 worship of
00:39:54.080 whatever it is
00:39:54.820 that nature
00:39:55.360 signifies, and
00:39:56.180 symbolically it
00:39:57.680 signifies something
00:39:58.580 like, well, the
00:39:59.920 maternal as put
00:40:01.480 against the
00:40:02.100 patriarchal, so
00:40:03.360 that's in there.
00:40:04.240 The warm embrace
00:40:05.540 of mother, that's
00:40:07.000 all in that
00:40:07.480 symbolic realm.
00:40:08.460 There's a great
00:40:08.900 book about that
00:40:09.660 called The Great
00:40:10.340 Mother by Erich
00:40:11.560 Neumann.
00:40:12.040 Best book ever
00:40:12.840 written on that
00:40:13.440 in the 1950s, an
00:40:14.540 absolute classic, and
00:40:15.560 it outlines the
00:40:16.300 entire domain of
00:40:17.560 symbolism of the
00:40:18.440 positive feminine.
00:40:19.780 And so you do
00:40:20.420 see this religious
00:40:21.240 struggle between
00:40:22.160 those who are now
00:40:23.820 advocates of the
00:40:24.900 positive feminine,
00:40:25.760 and
00:40:26.920 detractors of the
00:40:28.180 negative masculine.
00:40:29.540 But it's very
00:40:30.000 unbalanced, you
00:40:30.860 know, because
00:40:31.180 there's a negative
00:40:31.780 feminine and there's
00:40:32.640 a positive masculine
00:40:33.660 as well.
00:40:34.500 So we're all
00:40:35.320 tangled up in that.
00:40:36.240 We don't
00:40:36.480 understand it.
00:40:37.900 I mean, one of
00:40:38.380 the interesting
00:40:38.880 shifts that's
00:40:39.560 occurred even in
00:40:40.360 my own career as
00:40:41.640 an environmentalist
00:40:42.560 is that all of
00:40:43.920 the stuff from
00:40:44.520 the ecotopia, the
00:40:46.100 utopianism, the
00:40:47.240 green utopianism, the
00:40:48.620 renewal, I mean, the
00:40:49.840 harmony with nature,
00:40:51.120 the kind of we're
00:40:51.560 all going to live in
00:40:52.220 these small self
00:40:53.020 sustaining kind of
00:40:53.820 anarchist
00:40:54.340 communities, the
00:40:55.260 Ewok village sort
00:40:56.500 of picture, that's
00:40:57.820 gone now.
00:40:58.680 I mean, Greta
00:40:59.060 Thunberg actively
00:41:00.080 says that that's not it;
00:41:02.200 they literally will
00:41:02.780 say now, we're just
00:41:04.040 trying to prevent it
00:41:05.020 from being as
00:41:05.880 terrible.
00:41:06.520 We're trying to
00:41:06.880 make it less
00:41:07.480 terrible.
00:41:08.580 So the utopianism,
00:41:09.980 it's still there.
00:41:11.740 I'm not saying it's
00:41:12.500 totally gone.
00:41:13.460 You certainly see
00:41:14.060 with renewables, the
00:41:15.180 picture of renewables
00:41:16.120 is somehow harmonizing
00:41:17.120 us with the natural
00:41:17.740 world, but it's
00:41:18.700 nothing like what it
00:41:19.560 was in the 70s.
00:41:20.920 Earth Day, for instance, was actually mostly positive.
00:41:23.380 I have a lot of
00:41:23.740 criticism of Earth
00:41:24.440 Day, but it was
00:41:25.220 a mostly positive
00:41:26.100 picture.
00:41:27.120 So what's striking
00:41:27.740 to me is the
00:41:28.380 disappearance of even
00:41:29.320 that positive
00:41:29.960 picture from
00:41:30.840 apocalyptic
00:41:31.720 environmentalism.
00:41:32.720 I wouldn't have
00:41:33.500 predicted that
00:41:34.820 apocalyptic
00:41:35.240 environmentalism could
00:41:36.340 sustain itself with
00:41:37.780 such a single
00:41:38.560 polarity without
00:41:39.900 this much more
00:41:40.720 positive, romantic
00:41:42.200 utopianism, which was really there even 15 or 20 years ago, but is somehow gone.
00:41:49.000 So you don't get
00:41:49.560 that picture from
00:41:50.500 Greta Thunberg.
00:41:51.480 Well, depression
00:41:52.820 can be all
00:41:54.140 consuming, you
00:41:55.020 know, and
00:41:56.240 another thing
00:41:58.040 Jung pointed out
00:41:58.860 very blatantly, he
00:42:00.140 said, well, what's
00:42:01.300 really going to
00:42:01.920 threaten us, he
00:42:02.500 wrote about this in
00:42:03.540 the 1950s, is
00:42:05.500 unexamined psychic
00:42:07.100 epidemics, and he
00:42:08.540 meant psychological
00:42:09.260 epidemics, and their
00:42:10.760 effect on the
00:42:11.380 political structure,
00:42:12.120 because he thought,
00:42:12.900 well, we've become
00:42:13.500 the most powerful
00:42:14.380 force on the
00:42:15.020 planet, and now
00:42:16.200 our unrecognized
00:42:17.360 psychological, what would you call them, our illnesses, if that's a good enough word, are going to manifest
00:42:26.020 themselves in all
00:42:26.820 sorts of ways that
00:42:27.480 are going to be
00:42:27.980 extraordinarily
00:42:28.420 dangerous, given
00:42:29.340 our power, and
00:42:30.280 so.
00:42:30.980 You know, you
00:42:31.580 grow crops, you
00:42:33.420 use up land for a
00:42:34.520 year to grow crops,
00:42:35.400 and then to process
00:42:36.040 them, and then to
00:42:36.700 turn them into fuel,
00:42:37.900 when you can just
00:42:38.400 get fuel straight out
00:42:39.320 of the earth, and
00:42:39.940 much cheaper and of much better quality, that
00:42:43.900 doesn't degrade the
00:42:44.580 car, and that
00:42:45.740 doesn't destroy the
00:42:46.420 soil, and as I
00:42:48.880 was studying that,
00:42:50.020 I was, on the one
00:42:51.080 hand, I was struck
00:42:52.540 by the impossible
00:42:54.140 complexity of the
00:42:55.080 question, and then
00:42:56.240 on the other hand, I
00:42:57.660 was noticing the
00:42:58.620 inescapable conclusion
00:43:00.800 that everything that
00:43:02.620 they're trying to do
00:43:03.760 here is just making
00:43:04.600 things worse.
00:43:06.100 In fact, you know,
00:43:07.180 the idea was that
00:43:08.260 we're going to save
00:43:08.960 the earth from
00:43:09.800 climate change by
00:43:11.000 replacing fossil fuels
00:43:13.680 with ethanol and
00:43:15.580 biodiesel. But the reality was
00:43:17.740 that you were
00:43:18.120 chopping down the
00:43:19.260 Amazon and
00:43:20.380 Indonesian palm
00:43:22.120 forests to turn them into sterile farming land. The Midwest and the Mississippi Valley, all over the U.S., were essentially degraded of their soil, and then the
00:43:35.580 fertilizer that is
00:43:37.360 just running off from
00:43:38.160 the Mississippi is
00:43:39.560 causing all this enormous damage to
00:43:40.700 the Gulf of Mexico.
00:43:41.500 It's insane how much
00:43:44.920 damage is being done
00:43:46.020 from this attempt to
00:43:47.640 sort of centrally plan
00:43:48.680 things from above, and
00:43:50.460 Hayek has a saying
00:43:52.080 where he says, you
00:43:52.540 know, we used to
00:43:53.120 suffer from problems, we
00:43:54.200 now suffer from
00:43:54.960 solutions.
00:43:56.160 And this...
00:43:56.660 So let me ask you
00:43:57.660 about that.
00:43:58.280 Like, look, I've been
00:43:59.120 struck by exactly this
00:44:00.660 problem, right?
00:44:01.460 This is the problem of
00:44:02.380 unintended consequences
00:44:03.640 and the irreducible
00:44:05.800 complexity of things.
00:44:07.000 So, like, we can talk
00:44:08.500 about climate change, the
00:44:09.900 problem of climate
00:44:10.940 change, but there
00:44:11.880 isn't...
00:44:12.820 Those words are
00:44:13.800 unbelievably deceiving
00:44:15.500 because there isn't a
00:44:17.760 problem of climate
00:44:18.920 change.
00:44:19.540 There's 150,000 problems
00:44:22.840 of climate change, and
00:44:24.200 every single one of
00:44:25.420 those is an unbelievably
00:44:26.820 difficult problem, and
00:44:28.740 the solutions to those
00:44:29.940 problems could well
00:44:31.220 exist in terrible
00:44:32.200 contradiction to one
00:44:33.160 another.
00:44:34.300 And so, but you get
00:44:35.000 deluded, you think, well,
00:44:36.200 climate change, well, I
00:44:37.140 can understand that.
00:44:38.060 The climate, that's
00:44:39.120 easy to say.
00:44:39.760 I must be able to
00:44:40.580 conceptualize that.
00:44:41.740 Of course, you can't.
00:44:43.580 And then it hides this understructure of complexity.
00:44:49.580 Now, you got a glimpse
00:44:50.500 of that, and that
00:44:51.280 instilled in you this
00:44:54.140 resistance to this fatal
00:44:55.440 conceit.
00:44:56.060 It instilled in you some
00:44:57.100 humility.
00:44:58.100 Why doesn't that happen
00:44:59.220 to other people?
00:45:00.320 And, you know, there are lots of things that we're not doing that we could do, which you also touch on.
00:45:05.740 One of the things
00:45:06.660 that strikes me as
00:45:08.920 somewhat catastrophic
00:45:10.740 is the tragic
00:45:13.120 underdevelopment of
00:45:14.080 nuclear power.
00:45:15.400 I've spoken with a
00:45:16.420 number of people about
00:45:18.060 the possibilities of
00:45:19.120 nuclear power, and you
00:45:20.460 point out, I think it's in How Innovation Works, actually, that there is no shortage of
00:45:27.540 plans for much smaller
00:45:29.720 nuclear reactors that
00:45:30.900 don't use water as the
00:45:32.120 primary coolant, that
00:45:33.120 use certain salts or some other substance like that, and that
00:45:37.080 if they fail, they
00:45:40.020 actually shut down rather
00:45:41.300 than melting down.
00:45:43.160 And so that's another
00:45:45.340 example, I think, of
00:45:46.460 where the environmentalists,
00:45:49.540 which is a broad brush, but
00:45:50.700 the environmentalists got
00:45:51.620 things seriously wrong and
00:45:52.780 are still doing so.
00:45:53.660 Because as far as I can tell, the question is, what do you want?
00:45:58.680 Like, if you want cheap
00:45:59.700 power of the sort that
00:46:01.060 would make people rich
00:46:02.280 enough to start caring
00:46:03.460 about the environment, it
00:46:04.620 seems to me that you
00:46:05.520 would be a nuclear power
00:46:06.580 supporter rather than a
00:46:07.800 supporter of solar or
00:46:09.000 wind power, which I think
00:46:10.720 only still accounts for
00:46:11.940 about 3% of total energy
00:46:13.760 needs.
00:46:15.360 That's true.
00:46:17.020 People say, oh, no, no,
00:46:18.280 that's wrong.
00:46:18.700 It's more than 10%.
00:46:19.500 You find they're referring
00:46:20.420 to electricity, but
00:46:21.320 electricity is only about
00:46:22.280 25% of energy at the
00:46:23.980 moment.
00:46:24.300 So around 3% comes from solar and wind.
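The reconciliation of the two figures is simple arithmetic; a quick sketch, using the rough shares quoted in the conversation rather than precise statistics:

```python
# Solar and wind supply ~10% of electricity, but electricity is only
# ~25% of total energy use, so their share of total energy is the
# product of the two shares. Both figures are the rough ones quoted here.

electricity_share_of_energy = 0.25
solar_wind_share_of_electricity = 0.10

solar_wind_share_of_energy = (electricity_share_of_energy
                              * solar_wind_share_of_electricity)
print(f"{solar_wind_share_of_energy:.1%}")  # 2.5%, i.e. roughly the 3% figure
```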
00:46:27.800 But the real problem with
00:46:30.100 solar and wind versus
00:46:31.100 nuclear, nuclear is still
00:46:32.860 horribly expensive because
00:46:34.100 of the way we've regulated
00:46:35.220 it and driven up its
00:46:36.060 price.
00:46:36.580 So our problem is how to
00:46:37.960 get the price down.
00:46:39.420 But the real problem is
00:46:40.700 the amount of land that
00:46:41.880 solar and wind use because
00:46:43.260 they're very low density
00:46:44.180 sources of energy.
00:46:45.580 So you have to have a lot
00:46:46.440 of land and you need more
00:46:47.840 land than there is.
00:46:49.000 You know, I mean, even
00:46:49.760 Canada has hardly got
00:46:50.780 enough land to produce
00:46:52.220 renewable energy for its
00:46:53.280 population.
00:46:55.240 And frankly, that's going
00:46:57.900 back to a medieval
00:46:58.820 economy where you had to
00:47:01.460 use the landscape to
00:47:02.560 produce energy.
00:47:03.200 You had to dam the rivers
00:47:04.300 and grow the crops, and cut down the forests, you know, to burn.
00:47:11.380 Well, it's not obvious
00:47:12.540 either that wind farms
00:47:13.900 aren't a blight on the
00:47:15.020 landscape.
00:47:16.060 I'm afraid they are.
00:47:16.920 They're terrible for birds.
00:47:17.940 I'm a keen bird watcher.
00:47:18.940 I don't like the idea of
00:47:20.120 these birds being
00:47:21.940 devastated by onshore and
00:47:23.340 offshore wind.
00:47:24.920 And, you know, a wind farm
00:47:26.600 spends the first seven,
00:47:28.600 eight years of its life
00:47:29.500 earning back the energy that
00:47:31.520 went into building the wind
00:47:33.080 turbine.
00:47:34.320 You know, and only after
00:47:35.720 that is it net positive.
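The payback claim here is just a ratio of embodied energy to annual output; a minimal sketch with hypothetical numbers chosen only to match the seven-to-eight-year figure asserted in the conversation (published turbine payback estimates vary widely):

```python
# Energy payback period = energy used to build the turbine divided by the
# net annual energy it generates. The numbers below are hypothetical,
# picked to illustrate the 7-8 year payback claimed in the conversation.

embodied_energy_mwh = 15_000  # assumed energy cost of manufacture and install
annual_output_mwh = 2_000     # assumed net yearly generation

payback_years = embodied_energy_mwh / annual_output_mwh
print(payback_years)  # 7.5
```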
00:47:38.820 And even then, it's a huge
00:47:41.360 investment of capital that
00:47:43.640 could be doing something
00:47:44.540 else.
00:47:44.900 You know, the point about
00:47:45.560 energy is that it's the
00:47:46.760 master resource.
00:47:47.900 It's the thing that
00:47:48.500 everybody else needs to
00:47:49.640 use.
00:47:50.160 So you want to make it as
00:47:51.140 cheap and as reliable as
00:47:52.620 possible.
00:47:53.080 Yes, exactly.
00:47:53.700 And that should be said
00:47:54.520 over and over that if you
00:47:55.820 were, it seems to me that
00:47:57.440 if you were truly concerned
00:47:59.560 about the planetary fate,
00:48:02.860 let's say, or even more
00:48:04.220 precisely the fate of the
00:48:05.700 people on the planet, that
00:48:07.280 you would do everything you
00:48:08.440 could to drive the cost of
00:48:10.380 energy, including the
00:48:12.980 externalized costs to
00:48:14.300 something as low as
00:48:15.240 possible, because it's the
00:48:16.980 prerequisite for everything
00:48:18.220 else.
00:48:18.780 And starving people aren't,
00:48:20.620 we already talked about
00:48:21.840 this, but starving people
00:48:23.020 aren't good planetary
00:48:24.520 stewards.
00:48:25.960 So even if you...
00:48:26.960 But you'll notice, Jordan,
00:48:28.740 you and I have now slipped
00:48:29.780 into a slightly pessimistic
00:48:31.580 mood in that we're finding
00:48:34.660 the energy policies of our
00:48:35.920 countries rather stupid.
00:48:38.360 Yeah, it's probably because
00:48:39.180 we're old enough so that a
00:48:40.380 90-minute discussion starts to
00:48:41.980 become tiring.
00:48:42.800 Well, there's that.
00:48:45.860 You do promote CO2 emission
00:48:51.760 amelioration strategies in False Alarm.
00:48:53.080 And you did just point out
00:48:54.220 that although we should be
00:48:55.860 striving to make the poor
00:48:57.760 around the world as much less
00:48:59.360 poor as we possibly can, as
00:49:00.820 quickly as we can, so everyone
00:49:02.760 wins, including us, just like
00:49:06.100 Henry Ford won when he paid his
00:49:07.700 workers enough to buy his cars,
00:49:09.440 the cars they made, they are
00:49:12.820 going to increase their rate
00:49:15.120 of carbon dioxide emission.
00:49:17.100 And for some people, that
00:49:18.140 would be enough reason to
00:49:19.420 scrap the whole enrichment
00:49:21.420 process.
00:49:23.080 But you have some strategies
00:49:24.820 that you think are wise to
00:49:26.320 ameliorate the problems that
00:49:27.760 would be associated with that.
00:49:29.920 Yes.
00:49:30.280 So I talk about five different
00:49:32.260 solutions in the book.
00:49:33.380 So the first one is a carbon
00:49:35.600 tax.
00:49:36.680 Any economist would say, you
00:49:38.680 know, look, you have a
00:49:39.880 problem, you emit CO2, but you
00:49:41.860 don't actually take it into
00:49:43.200 consideration because it's
00:49:44.680 free to emit.
00:49:45.900 That's how we think about it: the polluter pays.
00:49:48.840 You put a price on carbon.
00:49:51.440 In principle, you should do
00:49:53.100 this across the world.
00:49:54.180 You should do it so that it
00:49:55.300 slowly rises with time.
00:49:56.960 It's the most efficient way to
00:49:58.500 deal with it.
00:49:59.180 There's two things we need to
00:50:00.480 recognize with it.
00:50:01.380 One is it turns out to be very,
00:50:03.280 very hard because it makes it
00:50:05.120 very explicit to people that
00:50:06.980 tackling global warming is
00:50:08.120 actually costly.
00:50:09.440 Secondly, we know that
00:50:10.840 politicians are just really,
00:50:12.380 really bad at doing
00:50:13.560 something for a long time,
00:50:15.240 very consistently across all
00:50:16.680 areas.
00:50:17.320 What politicians typically end
00:50:18.680 up doing is they'll put it
00:50:20.300 on some things.
00:50:21.240 So, you know, in many places
00:50:22.500 in Europe, for instance, you
00:50:23.620 have enormously high taxes on
00:50:25.600 cars and you have enormously
00:50:27.340 low taxes on people who are
00:50:30.260 good at lobbying their
00:50:32.280 governments for their
00:50:33.140 particular interests.
00:50:34.100 So, you know, greenhouse growers don't have to pay
00:50:40.080 the carbon tax because that
00:50:42.700 would make it really hard for
00:50:43.780 them to grow their, you know,
00:50:45.360 tomatoes or whatever.
00:50:46.600 And you can see how this
00:50:48.280 happens across a wide range
00:50:50.200 of areas.
00:50:50.820 So that's one part of the
00:50:53.260 problem.
00:50:53.720 The other part is that even if
00:50:55.180 you do this really, really well,
00:50:56.540 it'll only solve a smaller part
00:50:58.400 of the problem.
00:50:59.400 So you should do this.
00:51:00.880 We should focus on a carbon
00:51:02.820 tax, but we should also be
00:51:04.220 realistic.
00:51:04.860 This is not what's going to fix
00:51:06.460 climate change.
00:51:07.340 This will fix a smaller part of
00:51:09.780 climate change.
00:51:10.480 So it's part of the solution,
00:51:11.680 but it's not the most important
00:51:12.940 part.
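The Pigouvian logic behind the carbon tax described above can be sketched in a few lines; the prices and quantities are illustrative assumptions, not real figures:

```python
# A carbon tax puts the external damage of emitting back into the
# emitter's own cost calculation. All numbers are illustrative.

def private_cost(tons_emitted: float, tax_per_ton: float) -> float:
    fuel_cost_per_ton = 50.0  # assumed cost of the underlying activity
    return tons_emitted * (fuel_cost_per_ton + tax_per_ton)

# With no tax, emitting is "free": the damage never enters the decision.
print(private_cost(100, 0))   # 5000.0
# With a tax, the polluter pays, and cleaner alternatives can compete.
print(private_cost(100, 20))  # 7000.0
```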
00:51:13.600 The second part, and that's
00:51:15.260 where I think we actually have
00:51:16.580 the biggest opportunity, is
00:51:18.300 innovation.
00:51:19.600 So if you talk to Matt Ridley,
00:51:21.800 this is certainly also his
00:51:23.480 ballpark, but it's basically
00:51:25.520 recognizing that most things
00:51:27.420 that we've solved in this world
00:51:28.700 are about innovation.
00:51:30.620 So you rarely get people to
00:51:32.740 solve a problem by saying,
00:51:34.280 I'm sorry, could you please
00:51:35.440 not do all that cool stuff
00:51:37.320 that you like?
00:51:38.300 Could you please stop feeling
00:51:39.760 good about all of that?
00:51:41.100 That rarely works out as a
00:51:43.060 political strategy.
00:51:44.280 Unfortunately, that's typically
00:51:45.600 what we say.
00:51:46.340 Could you please not fly, not
00:51:47.760 eat meat, not do all these
00:51:49.640 things?
00:51:50.040 Could you please have it a
00:51:50.960 little hotter in the summer
00:51:52.400 and a little cooler in the
00:51:53.740 winter?
00:51:54.080 That's really, really hard to
00:51:55.580 sell to most people.
00:51:56.580 What you need is innovation.
00:52:00.360 And let me just give you an
00:52:01.600 example.
00:52:02.640 Back in the 1950s, Los Angeles
00:52:05.280 was one of the most polluted
00:52:06.800 places on the planet because
00:52:08.840 there are lots and lots of
00:52:09.820 cars and they have the special
00:52:11.600 sort of geographical situation
00:52:13.340 that just leaves all of the
00:52:14.560 pollution inside this little
00:52:15.920 basin of Los Angeles.
00:52:17.660 It was terrible to live there
00:52:19.980 in many ways.
00:52:21.440 And obviously, the simple
00:52:22.740 answer is to tell people most
00:52:24.580 of this came from cars.
00:52:25.540 So the simple answer would
00:52:27.020 be to say, stop driving your
00:52:28.840 car.
00:52:29.400 Of course, if you've ever met
00:52:30.880 someone from Los Angeles, you
00:52:32.140 know that that's not a
00:52:33.480 solution that's actually viable
00:52:34.760 to them.
00:52:35.360 Well, there aren't even any
00:52:36.400 sidewalks.
00:52:37.680 No, it's not really viable for
00:52:40.500 anyone in any city.
00:52:42.740 What did solve the problem was
00:52:44.520 the innovation of the
00:52:45.840 catalytic converter.
00:52:47.000 This little thing that cost
00:52:48.660 money, you put on the exhaust
00:52:50.260 pipe and then basically you have
00:52:52.040 much, much cleaner cars.
00:52:53.600 That made it possible for
00:52:55.940 people to keep their cars,
00:52:57.660 drive a lot and have much,
00:52:59.800 much cleaner air in Los
00:53:01.400 Angeles.
00:53:01.980 Now, I'm not saying everything
00:53:02.920 is perfect in Los Angeles and
00:53:04.400 there's still air pollution
00:53:05.520 problems, but it made it a lot
00:53:07.820 better for very little money.
00:53:10.000 That's the way that we need to
00:53:11.700 solve global warming.
00:53:12.920 If we could innovate the price of
00:53:15.340 green energy down below fossil
00:53:16.860 fuels and this green energy could
00:53:18.440 be nuclear, it could be fusion
00:53:20.300 energy, it could be solar or wind
00:53:22.500 with batteries, or lots of other possible solutions.
00:53:27.160 If we could innovate one or a
00:53:28.880 few of these solutions down below
00:53:30.440 fossil fuels, everyone would
00:53:32.620 switch.
00:53:33.000 You wouldn't need sort of a Paris
00:53:35.280 Accord where you have to twist
00:53:36.760 everybody's arm.
00:53:38.240 Let me ask you about that for a
00:53:39.300 minute.
00:53:39.580 So it's not a straightforward matter
00:53:42.420 to set up governmental policy to
00:53:44.600 support innovation.
00:53:49.660 I mean, innovation is a very
00:53:50.580 abstract idea and I've seen much
00:53:53.580 evidence of failure at the
00:53:55.120 governmental level here in Canada
00:53:56.760 when governments have set out to
00:53:59.900 foster entrepreneurship and to
00:54:03.080 seed, you know, the development of
00:54:05.380 high-tech industry, for example.
00:54:07.160 Generally, it's a cataclysmic failure.
00:54:09.740 I mean, obviously, it's self-evident in
00:54:12.020 some sense that a good idea is good
00:54:15.320 because it solves a complicated
00:54:16.780 problem and the more good ideas we
00:54:18.600 have, the better.
00:54:19.940 But do you think that it's like it
00:54:22.140 seems on the face of it, unless you
00:54:24.460 dig down into the details, it seems
00:54:26.920 like hand-waving.
00:54:27.980 Obviously, we should have better ideas
00:54:30.080 to solve our problems.
00:54:31.980 But what do you think
00:54:33.960 constitutes concrete, realistic,
00:54:37.440 evidence-based solutions to the
00:54:40.160 problem of fostering innovation?
00:54:42.460 Do you think it's actually possible
00:54:43.660 to set up policy that does that?
00:54:46.180 Yes.
00:54:46.800 So the short answer is yes.
00:54:48.320 And the reason is that what's
00:54:51.960 lacking is mostly long-term
00:54:55.020 investment.
00:54:56.140 So investment that will only generate
00:54:58.640 the solutions in 20, 30, 40 years.
00:55:01.240 Remember, this is why we invest a lot
00:55:03.840 of money in healthcare, basic research
00:55:08.140 that then eventually becomes research
00:55:10.640 that, you know, for instance,
00:55:11.960 pharmaceuticals can make into products
00:55:14.000 that they can make money off of.
00:55:15.940 There's always too little investment
00:55:19.280 societally in things that you can't
00:55:22.560 monetize right away.
00:55:24.500 So it's very hard to invest in things
00:55:26.660 that you can't monetize right away.
00:55:28.260 Yes.
00:55:28.640 If I make an innovation that then in
00:55:30.920 20 years, say, will help us generate
00:55:34.220 this enormously beneficial breakthrough.
00:55:36.660 Unfortunately, I won't get any money
00:55:38.700 because my patent has run out.
00:55:40.580 That's why most companies will not be
00:55:42.400 investing in these long-term developments.
00:55:45.620 What happens is that you then have a
00:55:48.380 dearth of investment into these sorts of long-term innovations,
00:55:55.600 unless you have the public invest in them.
00:55:58.040 And I'll get back to how we do that
00:55:59.660 smartly.
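The underinvestment argument is essentially about discounting and appropriability; a minimal sketch under assumed numbers (the discount rate, payoff, and the firm's post-patent share are all hypothetical):

```python
# Why firms underinvest in very long-term R&D: discounting shrinks a
# distant payoff, and once the patent has expired the firm captures only
# a sliver of it. All numbers are hypothetical.

def present_value(cashflow: float, years: int, rate: float = 0.08) -> float:
    return cashflow / (1 + rate) ** years

social_payoff = present_value(1_000, 25)          # what society gets
private_payoff = present_value(1_000 * 0.05, 25)  # firm's ~5% post-patent share
research_cost_today = 40

print(social_payoff > research_cost_today)   # True: socially worth funding
print(private_payoff > research_cost_today)  # False: privately not worth it
```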
00:56:00.200 Okay.
00:56:00.960 But we do that in medical research
00:56:03.220 for many reasons.
00:56:04.280 People recognize this is part of the
00:56:06.460 place where we need to produce lots of
00:56:08.640 professors, lots of medical Nobel laureates.
00:56:11.220 And then, you know, eventually the
00:56:12.900 pharmaceuticals will take over and
00:56:14.900 actually make products out of this.
00:56:16.720 That's a great setup.
00:56:18.080 We don't do this in energy for a variety
00:56:20.820 of reasons.
00:56:21.740 It is one of the places where we spend
00:56:23.920 very, very little money, partly because
00:56:26.740 it doesn't feel like you're solving global
00:56:28.520 warming because you're not solving it right
00:56:30.040 now.
00:56:30.520 You're only solving it in 20 or 40 years.
00:56:32.940 That feels like you didn't really care.
00:56:35.020 But the reality is, this is the only way
00:56:37.920 that we're going to get these sorts of
00:56:39.800 long-term breakthroughs.
00:56:41.880 Now, one reason why politicians often
00:56:45.000 screw this up is because they are not
00:56:47.560 willing to invest in these long-term
00:56:49.280 investments.
00:56:50.240 They'll say, we want a, you know, a
00:56:52.340 Silicon Valley in Canada in three years.
00:56:55.560 Yeah.
00:56:56.100 That makes sense if you need to get
00:56:57.380 reelected in four, but you can't do that.
00:57:00.100 And so you shouldn't be trying to do
00:57:03.060 this in a very short-term way.
00:57:05.180 Another way is that you end up giving this
00:57:07.380 away to companies and companies, of
00:57:10.320 course, are just going to spend it on the
00:57:11.980 product that they were going to do next
00:57:13.560 year anyway.
00:57:14.740 But hey, thanks for the money.
00:57:16.520 So the point here is, you need to do
00:57:18.860 this carefully in a way that will generate
00:57:21.360 long-term innovation.
00:57:24.020 This is not easy.
00:57:25.200 You are going to waste a lot of money, but
00:57:27.340 we know that governments around the world
00:57:29.620 have done this in a variety of different
00:57:31.420 ways.
00:57:32.340 We know, for instance, the internet, the transistor,
00:57:37.920 fracking in the U.S. There are a number of
00:57:41.540 places where this has been successful.
00:57:43.140 And all we have to do is to spend lots of
00:57:46.420 money.
00:57:46.600 And I'd love to talk more about
00:57:48.160 specifically how we should set this up, how
00:57:50.940 we should evaluate, and we should be
00:57:52.280 careful about it.
00:57:52.940 But fundamentally we should do this in a
00:57:55.240 way that we say we want to generate a lot
00:57:57.100 of knowledge that we believe in the long
00:57:58.640 run can deliver benefits that'll actually
00:58:01.860 help companies produce energy that will
00:58:04.640 be viable, but we are not going to try and
00:58:07.040 do this for the next three or five years.
00:58:08.820 So we've got to stop that panic mode and
00:58:11.420 start this long-term thinking.
00:58:13.480 We do have realistic knowledge both
00:58:16.640 that we're investing very little compared to
00:58:18.500 typically almost all other areas,
00:58:21.080 and that more investment here would make it
00:58:24.240 more plausible that we would get
00:58:27.000 cheaper green energy faster.
00:58:28.700 So, okay.
00:58:29.600 So in Canada, there's a medical research
00:58:31.520 council and a social sciences research
00:58:33.480 council and natural sciences and engineering
00:58:35.720 research council.
00:58:37.180 That information might be a bit dated, but
00:58:39.600 essentially that's how it's been set up.
00:58:42.660 But there isn't an energy innovation research
00:58:46.000 council.
00:58:46.500 And, you know, I'm thinking that way because
00:58:50.240 I'm an academic and I've seen these granting
00:58:51.960 agencies, I've seen how they work and they're
00:58:54.600 set up to provide funds for basic research.
00:58:59.260 And something like that doesn't exist.
00:59:01.660 So why aren't we funding
00:59:06.740 research into energy, into the generation
00:59:12.440 of cheap and clean energy?
00:59:15.600 What's gotten in the way?
00:59:17.180 Every year, we want to spend the money on solar panels
00:59:21.480 that make us feel like we're doing something
00:59:23.420 right now.
00:59:24.700 The surprising thing is, in 2015,
00:59:27.980 when all countries signed the Paris climate
00:59:32.020 agreement on the sidelines of that event, Obama
00:59:36.620 and 20 other global leaders, Bill Gates and lots
00:59:39.780 of billionaires actually signed another agreement
00:59:42.240 that I'm happy to say we were a tiny part of
00:59:44.980 pushing, which was, we're going to double our
00:59:48.180 investment into green energy research and
00:59:50.440 development.
00:59:51.380 So all countries both promised the thing that
00:59:54.360 you heard about, namely, we're going to cut our
00:59:56.360 carbon emissions, but they also promised to
00:59:59.000 double their green energy investment in five years.
01:00:03.600 So by 2020, they had done quite a bit of the
01:00:07.980 carbon-emission cutting, but nothing of the increased
01:00:11.920 spending on green energy R&D.
01:00:14.440 And I think fundamentally, because it doesn't feel
01:00:16.780 like a solution, it doesn't feel like something
01:00:18.720 urgent, it feels like something you can do next
01:00:20.980 year.
01:00:21.440 It feels like something that's nice to have, but
01:00:24.120 this, you know, putting up the solar panel is
01:00:26.120 urgent and we need to do it.
01:00:27.800 The reality is, the excessive worry about global warming
01:00:31.680 that we have, because, you know, we have this
01:00:34.020 existential feeling that this could be the end of the
01:00:36.240 world, surprisingly not only is wrong, but it
01:00:39.880 also leads us down the wrong path, namely the path
01:00:42.680 where we say, let's do anything that just makes it
01:00:44.920 look like we're doing something next year, rather
01:00:47.980 than actually laying the groundwork for fixing this
01:00:51.020 problem. Now, obviously, and some people will say, well,
01:00:53.320 we should have done this 20 years ago. And yes, that
01:00:55.500 would be wonderful. We should have done that. But we
01:00:57.460 didn't, you know, it's sort of too late to do something
01:01:00.080 about what we should have done 20 years ago. But we can do
01:01:03.840 something about what we're going to spend our money on in
01:01:06.220 2021. And if you look, for instance, on Biden's proposal
01:01:09.580 to fix climate change, he's thinking about
01:01:13.020 spending $2 trillion, though he'll probably not get to spend all
01:01:16.200 that money, on a vast array of things, many of which are not
01:01:20.200 going to be very effective. But he's also saying he wants to
01:01:23.860 dramatically increase, actually I think probably too much, but
01:01:27.640 certainly by a very, very large amount, American
01:01:31.200 spending on R&D. This is what he should be focusing on. But I
01:01:35.500 do worry that he's going to end up having much more success
01:01:38.600 with all his other much less effective proposals, simply
01:01:41.780 because they are more glamorous.
01:01:44.160 All right, so you don't seem to be an admirer of the Paris
01:01:50.160 Accords. And so my sense of your argument is that the
01:01:56.800 proposals that are part of that accord are extremely expensive.
01:01:59.920 And they're not cost effective, especially when viewed in this
01:02:03.340 larger framework that encompasses a whole host of
01:02:06.760 problems instead of focusing just on climate change. And so
01:02:10.040 maybe, if you don't mind, could you
01:02:15.660 lay out your critique of the Paris Accords for us?
01:02:19.860 Yes. So, two things. The Paris Agreement is really just an
01:02:24.020 extension of what we've been trying, and
01:02:26.580 failed, to do for the last 30 years. Namely, let's try to do
01:02:30.060 something that's really hard, that costs a lot of money, that
01:02:32.720 will have a little bit of impact in 100 years, and try and see if we
01:02:36.360 can't get everybody to do it. Not surprisingly, that's a really,
01:02:39.540 really hard thing to get done. And to do what, exactly? So
01:02:43.560 basically, get Canada, get the US, get Denmark, get everybody else to
01:02:47.840 cut their carbon emissions, which privately for them is going to be
01:02:51.700 costly, they have to, you know, reduce their use of cheap energy
01:02:56.140 and use a little bit more expensive energy, sometimes less reliable
01:03:00.380 energy. Basically, it puts a slight dampener on their economic
01:03:06.900 growth. That's always going to be hard. That's always going to be
01:03:10.160 unpopular. You're basically asking people, could you please pay some
01:03:13.460 more and use a little bit less? That's a hard sell. Not
01:03:17.880 surprisingly, you do a little bit of it, you typically don't do a lot
01:03:21.540 of it, you don't live up to all of your promises. But even if you do,
01:03:25.220 so let's just take the Paris Agreement, even if everyone did
01:03:28.380 everything they promised to 2030, that would cut as much CO2 that if
01:03:33.960 you run it through a climate model, it would cut temperatures by
01:03:38.040 0.025 degrees centigrade by the end of the century. So literally
01:03:44.400 nothing; we wouldn't be able to measure it. The expected magnitude of increase is
01:03:49.020 about four degrees of temperature rise, and we've already seen one, so about
01:03:53.360 three degrees more. So this would be a trivial
01:03:58.200 reduction. Now, it would be a reduction, it would mean we would have fewer
01:04:01.940 problems because global warming is a problem. So we estimate there
01:04:05.960 would be benefits. But there would also be huge costs because you'd
01:04:10.500 actually have to pay for this. So if you look at how much you're
01:04:14.300 going to pay, which is in the order of one to two trillion US dollars
01:04:18.020 per year in 2030, for every dollar spent, you will avoid climate
01:04:23.880 damages across the centuries worth about 11 cents. That's a very poor
01:04:29.300 way of spending money, paying a dollar and actually achieving 11 cents,
01:04:33.840 you could just have paid out the dollar and done, you know, almost 10
01:04:37.580 times as much good in the world. So the reality here is, the Paris
01:04:41.800 Agreement is a really well intentioned agreement, but it will fail just
01:04:45.520 like all the other agreements. So you know, Rio, Kyoto, and all the
01:04:50.380 other national policies that we've done, it'll mostly fail. But even if it
01:04:53.380 succeeded, it would be a very expensive way of achieving very
01:04:57.520 little. And this, of course, is the big problem of the climate
01:05:01.520 conversation: because we're so worried, we've decided,
01:05:04.380 yeah, we're not going to spend all that much
01:05:08.040 money on all these other problems in the world, tuberculosis, all this
01:05:10.920 other stuff. But we are going to spend one to two trillion dollars.
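The figures quoted here — one to two trillion dollars a year, eleven cents of avoided climate damage per dollar — can be sanity-checked with a few lines of arithmetic. A minimal sketch; the roughly $100 trillion global-GDP figure is an assumption added here to match the "one to 2%" claim, not a number from the episode:

```python
# Back-of-envelope check of the cost-benefit figures quoted in the conversation.
annual_cost_usd = 1.5e12        # midpoint of the quoted $1-2 trillion per year
global_gdp_usd = 100e12         # assumed rough global GDP, consistent with "1 to 2%"
benefit_per_dollar = 0.11       # 11 cents of avoided climate damage per dollar spent

cost_share_of_gdp = annual_cost_usd / global_gdp_usd
net_per_dollar = benefit_per_dollar - 1.0     # net loss on each dollar spent
alternative_multiple = 1.0 / benefit_per_dollar  # "almost 10 times as much good"

print(f"Cost as share of global GDP: {cost_share_of_gdp:.1%}")
print(f"Net result per dollar spent: {net_per_dollar:.2f}")
print(f"Good achievable by spending the dollar directly: {alternative_multiple:.1f}x")
```

The last line recovers the "almost 10 times as much good" comparison: a dollar returning 11 cents implies the direct alternative does about 9 times better.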
01:05:14.660 Remember, it's not going to bring us to the poorhouse, but it's a lot of
01:05:17.560 money. That's one to two percent of global GDP on something that will basically not
01:05:23.380 have any measurable impact in 100 years. That's a bad deal. That's why we need
01:05:29.860 to do better. You know, every action has a cost. This is the thing about
01:05:33.740 how you learn from economics. Everything has a cost. Everything has an
01:05:36.300 opportunity cost. And people like to present this idea as if, you know,
01:05:39.780 there's an evil cabal of oil producers that are out there forcing our
01:05:43.960 governments to make our lives dependent on these oils. And what we need to do
01:05:48.520 is to just have the political will to transition to these more advanced forms
01:05:51.740 of energy that are going to save the planet. And it's just completely
01:05:55.000 ridiculous. Well, people don't understand. Like, people don't
01:05:58.840 understand things as simple as the fact that there's a finite amount of solar
01:06:02.300 energy that falls on a square yard of territory. Even at that basic level of
01:06:08.400 physics, the response is just, well, the sun is an inexhaustible source of energy. It's like,
01:06:11.960 well, that's true. But practically speaking, that's not exactly the issue.
01:06:16.740 Exactly. Because what you need is the power. What you want is to convert that
01:06:19.360 large quantity of energy into high power, which is a lot of energy delivered over a
01:06:23.400 short period of time. And to do that, this is really understanding that what
01:06:28.400 humans look for is not energy, because energy is everywhere. There's sun and
01:06:31.440 there's wind and all of that stuff. But being able to channel that in order to
01:06:35.720 use it in high power applications, that's what makes everything that we value
01:06:39.500 possible. That's what makes surviving the winter a breeze for the vast majority
01:06:44.300 of us. This is why we have our modern life. This is why we have transportation.
01:06:48.160 It's because we have high power. And the way that we've managed to secure high
01:06:51.920 power, the way that we've built our world, is modern hydrocarbons. And so the thing
01:06:57.080 that people who are afraid of carbon dioxide need to present is they need to make the case
01:07:03.540 that stopping hydrocarbons is going to have such a noticeable effect on the climate that
01:07:11.940 it outweighs the benefits of taking humanity back to the 1500s.
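The energy-versus-power distinction above can be made concrete with rough numbers. The insolation, capacity-factor, and panel-efficiency values below are generic illustrative assumptions, not figures from the conversation:

```python
# Illustrative sketch: abundant energy does not mean abundant power.
# All three input values are rough, assumed textbook figures.
peak_insolation_w_per_m2 = 1000   # sunlight on a square metre at noon, clear sky
capacity_factor = 0.15            # fraction of peak actually delivered over a year
panel_efficiency = 0.20           # typical commercial solar panel

# Average deliverable electrical power per square metre of panel.
avg_power_per_m2 = peak_insolation_w_per_m2 * capacity_factor * panel_efficiency

# How much panel area it takes, on average, to match one 2 kW electric kettle.
kettle_power_w = 2000
area_needed_m2 = kettle_power_w / avg_power_per_m2

print(f"Average deliverable power: {avg_power_per_m2:.0f} W per square metre")
print(f"Panel area to average one 2 kW kettle: {area_needed_m2:.0f} square metres")
```

The point of the sketch is the gap between the headline number (a kilowatt of sunlight per square metre) and the average power actually available, which is what high-power applications like heating and transport care about.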
01:07:15.240 Yeah, they're not going to do that. Because you look at how this works out is that we're
01:07:21.180 trying to calculate the economic consequences of climate change and the economic consequences
01:07:26.640 of our interventions over this. This has to do with the time preference over, say, 50 to 100 years.
01:07:31.640 But the errors in your measurements and your predictions grow and grow and grow as you move
01:07:36.320 out decades into the future. So those justifications will never be forthcoming.
01:07:40.580 They're technically impossible. We talked about carbon tax and innovation. Innovation is crucial.
01:07:48.200 You should also focus on adaptation. It's sort of a naughty word in much of the conversation
01:07:54.700 in global warming. But very clearly, adaptation is going to be one of the big ways that we're
01:07:59.080 going to fix many of the problems. It's going to happen to a large extent simply because people
01:08:03.240 do that. If you're a farmer, you're going to plant later or earlier, depending on the climate
01:08:08.380 changes and eventually, you might plant something else. You should also look at geoengineering.
01:08:14.060 We talked about that very briefly. But basically, the idea of saying, if there were to be a really
01:08:19.500 catastrophic impact, geoengineering is basically a way of making sure that you can restore the
01:08:26.620 temperature of the earth very quickly at fairly low cost. We should not just go ahead with it,
01:08:33.420 but we should certainly be thinking about it. And that's all I'm going to say about this right now.
01:08:38.100 The last bit, and we also talked extensively about that, is to make sure that prosperity is also a
01:08:44.740 big solution to climate change. Most of the things you're impacted with, you're impacted with because
01:08:51.100 you're poor. If you're really poor, everything hits you hard, but climate hits you hard as well. If you're
01:08:57.600 rich, you're much, much less impacted. And so very clearly, the question is, do we want to help
01:09:03.620 Bangladesh a little bit by cutting carbon emissions and basically then leaving them poor? But hey,
01:09:10.140 at least sea levels rose this much less by the end of the century? Or would we rather make sure that we
01:09:16.380 actually leave Bangladesh much richer, which means that they'll be much better able to handle hurricanes,
01:09:22.180 that they'll be much better able to handle sea level rise and so on? There is a very strong basis of
01:09:28.100 evidence that shows that prosperity is actually much better for most countries, not just because
01:09:33.200 it's wonderful in all kinds of other ways. You can avoid your kids dying and get them better
01:09:37.420 education and all these other things, but also for climate. So those were the five points and innovation
01:09:43.180 is by far the most important thing. I just want to say one last thing about, you know, because my book is
01:09:49.780 very much, we've talked a lot about all the big problems in the world. The reason why I talk about
01:09:55.120 global warming is because it is the one thing that I experience most people actually
01:10:01.140 talking about all the time is this existential threat. This is the big thing that we should
01:10:05.780 all be concerned about. Certainly a lot of people, the UN Secretary General, many others are telling us
01:10:11.740 this is the top priority for humanity, because if this is going to eradicate all of us, surely this
01:10:18.940 should be the thing that we focus on. I think that would make intellectual sense if it were true, but
01:10:25.800 that's not what the UN Climate Panel is telling us. It's not what the science is telling us. It tells us
01:10:31.140 this is a problem, but by no means the end of the world. And that is not only important because you can't
01:10:37.380 really get to all the other things we were talking about unless you stop believing this is the end of
01:10:43.220 the world. If this is the end of the world, you are going to set everything else aside. But also, of
01:10:48.160 course, it's the only way that you can actually get a better life. You know, when you see all these
01:10:53.220 kids being really worried about, am I going to have a future when I grow up? People believing literally
01:11:00.640 that humanity is going to end, that must be terrible. Now, if it was true, we should be telling
01:11:05.580 people, but it's not true. And therefore, being able to relieve yourself from that scare is also
01:11:13.280 really, really valuable on a personal level. So this book was written not just to make sure that you can
01:11:19.260 get rid of the scare, but also that you can start realizing this is a problem among many others. Now
01:11:24.180 let's think about how we prioritize. And that's what I'm hoping this conversation will help us do. So in a
01:11:30.620 sense, you could say, the false alarm book is the stepping stone to be able to have that, you know, more
01:11:36.040 general conversation, namely, what is it that the world should be prioritizing, if we're not scared
01:11:41.280 witless about global warming, but actually see it as it is: a problem among many problems.