The Jordan B. Peterson Podcast - January 13, 2025


514. How to Solve All of America’s Energy Problems | Alex Epstein


Episode Stats

Length

1 hour and 55 minutes

Words per Minute

197.5

Word Count

22,785

Sentence Count

1,400

Misogynist Sentences

8

Hate Speech Sentences

10


Summary

Alex Epstein is the author of two influential books: The Moral Case for Fossil Fuels and, the second one, Fossil Future. He has been beating the pro-human energy/environment drum for some 17 years, with increasing effectiveness. He is one of the people at the forefront of the dawning realization that restricting fossil fuels is making energy more expensive and unreliable, and simultaneously increasing our dependency on dictatorial governments. And so he laid out his ideas today in a manner that could be brought to bear for policymakers who are interested in developing exactly that kind of policy framework.


Transcript

00:00:00.000 There's much more practical things that we can do to keep people safe from climate change,
00:00:05.780 let's say, than making everybody poor by making fossil fuels impossible to access.
00:00:10.140 You know, we have this clear demand that fossil fuels are needed for,
00:00:13.820 and then we restrict fossil fuels some, and we start getting these big problems
00:00:17.340 when we were told we would get big wealth.
00:00:20.040 Well, why would we take off the table any potential source of innovation
00:00:24.660 that would make energy more plentiful and more reliable?
00:00:28.220 We haven't even reduced the supply of fossil fuels in the world.
00:00:31.160 We've just slowed the growth, and we're having all these problems.
00:00:34.040 There's no single town on the planet that runs entirely on renewables.
00:00:37.880 Poor ones do. They run on wood and dung.
00:00:39.240 Yeah, well, right. Well, yeah.
00:00:40.840 Even if it creates something like, you know, an air pollution challenge,
00:00:44.120 it can also create the technology that can filter the air,
00:00:47.660 and if anyone happens to get sick, it can also create the whole medical industry.
00:00:51.540 I really like your emphasis on the nexus between energy provision and human flourishing.
00:00:58.220 So I had the good fortune to speak again today with Alex Epstein,
00:01:14.660 who I spoke with two years ago, almost to the day.
00:01:18.860 Alex is the author of two influential books.
00:01:21.980 One, the first one, The Moral Case for Fossil Fuels,
00:01:24.700 and the second one, Fossil Future,
00:01:26.140 and Alex has been beating the pro-human energy slash environment drum for some 17 years.
00:01:34.620 And with increasing effectiveness, I would say,
00:01:37.660 he's one of the people at the forefront of the dawning realization
00:01:41.560 that impoverishing humanity and destroying the industrial infrastructure of the West,
00:01:47.780 while making energy spectacularly expensive and unreliable
00:01:54.220 and simultaneously increasing our dependency on, let's say, dictatorial governments,
00:02:01.080 is not really very wise policy, all things considered.
00:02:04.880 And Alex has been an icebreaker in that regard,
00:02:08.020 pointing out to everyone who will listen and listen carefully
00:02:10.680 that fossil fuels, all things considered, are obviously and overwhelmingly a net good.
00:02:17.840 And that if we want to move forward into a future of abundance,
00:02:20.640 that it's necessary to get that straight in our minds and stop playing foolish games.
00:02:26.600 And we had an opportunity to continue that conversation today
00:02:30.240 and to deepen it because Alex has spent the last several years
00:02:34.140 making his knowledge about the energy environment nexus
00:02:39.940 more and more detailed at the practical level
00:02:42.740 in a manner that enables policymakers to move toward an energy-rich, abundant, pro-human future.
00:02:50.740 And so he laid out those ideas today in our podcast
00:02:53.360 in a manner that is at least illustrative of the wealth of knowledge that he has
00:02:58.640 that could be brought to bear for policymakers who are interested
00:03:02.980 in developing exactly that kind of policy framework.
00:03:06.700 And so join us for that.
00:03:09.500 Well, I thought we might as well begin this
00:03:12.520 by briefly evaluating the change in the conceptual landscape since 2022.
00:03:20.280 I mean, I would say two years ago, the stance that you had been promoting,
00:03:26.800 like a positive stance towards fossil fuel, was not only, what would you say, controversial,
00:03:34.060 but could we say fringe?
00:03:38.520 And I don't think that's the case now.
00:03:41.140 And I think that has a fair bit to do with you, actually, which is quite cool.
00:03:45.580 And so that's my opinion, my sense, broadly speaking.
00:03:50.660 It's not like there still isn't all sorts of work to do to make the case for fossil fuels.
00:03:55.660 But how are you feeling about, you know, if you evaluate the landscape over the last two years,
00:04:01.060 how are you feeling about it?
00:04:01.980 Yeah, so I've been at this 17 years now.
00:04:05.880 And it's definitely at a peak in terms of enthusiasm and opportunity in this sphere.
00:04:11.980 And I think it's interesting to break down.
00:04:14.580 So maybe I'll do my own part last and the part of sort of people who think like me.
00:04:18.420 But there are a few other developments that are notable.
00:04:21.340 And they're all sort of intertwined.
00:04:22.520 But one that's one of the interesting ones that I take no credit for,
00:04:26.820 but is very fortuitous intellectually,
00:04:28.320 is the dramatic increase in electricity demand that is occurring right now.
00:04:33.280 Right, right, right.
00:04:34.320 In the world of-
00:04:35.060 Because of IT.
00:04:35.760 Yeah, right, exactly.
00:04:36.940 So specifically data centers and within that, AI.
00:04:40.040 And in particular, where you see it is with the very large digital tech companies
00:04:46.020 and what their role has been in the energy debate so far,
00:04:50.600 and then how it has drastically changed in the last year or two.
00:04:54.800 So if you look at even in 2022,
00:04:57.280 what's the posture of the big tech companies for the past years before that?
00:05:01.380 It's overwhelmingly a posture of,
00:05:04.740 we are 100% renewable and you should be too.
00:05:08.240 Yeah.
00:05:08.640 And then politically advocating the net zero by 2050 kind of goal,
00:05:13.540 which basically means rapidly eliminate fossil fuels.
00:05:17.160 And prosperity.
00:05:18.300 Yeah.
00:05:18.840 And so you have them and that you have to think of them
00:05:21.080 as they're just an incredible center of gravity in the culture.
00:05:24.440 And where they are is hard to move the culture away from
00:05:27.820 because it's just so much wealth and so much influence.
00:05:31.020 And people in many ways want to be like them.
00:05:33.520 And I think their posture was part of the Larry Fink era.
00:05:36.520 So, you know, Larry Fink, the head of BlackRock.
00:05:38.920 By the way, I'm in D.C.
00:05:39.740 Whenever I'm in D.C., this guy is always in D.C.
00:05:41.260 Like I always spot this guy in, like, the Hart Building and the Dirksen Building.
00:05:44.940 I mean, this guy.
00:05:45.700 But I think he seems nervous right now, whereas he was on top of the world.
00:05:48.960 So if you take two years ago and especially four years ago,
00:05:51.560 he was called the emperor, right?
00:05:53.440 Actually, first time I had Vivek on my podcast,
00:05:55.780 like he was fighting against Fink.
00:05:57.700 This was in about 2020, 2021, maybe.
00:06:00.120 And he's talking about him as the emperor.
00:06:02.980 Like, and you see now, like just to give a sense of Larry Fink,
00:06:06.540 Larry Fink went, of all places, to the World Economic Forum.
00:06:09.460 This is after telling the whole world you have to go net zero.
00:06:12.000 And specifically, you want to be 100% renewable.
00:06:14.240 He goes to the World Economic Forum and says,
00:06:16.240 hey, we have data centers.
00:06:17.320 We have AI.
00:06:18.260 There's going to be massive new demand for what he'll call dispatchable or reliable electricity.
00:06:23.520 So electricity available on demand.
00:06:25.040 You mean like the kind of electricity we got accustomed to with our systems that work?
00:06:29.300 Right, right.
00:06:29.780 Exactly, exactly.
00:06:30.800 Before people mucked around with them.
00:06:32.220 Yeah, exactly, exactly.
00:06:33.520 What we used to just call electricity.
00:06:37.000 And he said, and we cannot power this with solar and wind.
00:06:41.120 And we need dispatchable electricity like natural gas.
00:06:43.680 I'm like, whoa, where has this guy been?
00:06:45.460 Like, this was the leader of net zero, 100% renewable.
00:06:48.380 And you're seeing it with the tech companies, too, right?
00:06:50.920 Facebook, with Microsoft trying to resurrect Three Mile Island.
00:06:55.540 Everyone is admitting it.
00:06:57.340 I mean, Elon, his stance has been really interesting.
00:07:00.060 So he's been, I think, radically improving on oil and gas.
00:07:03.860 Radically improving.
00:07:05.600 Very much dampening on any sort of climate catastrophism.
00:07:09.720 At this point, I don't even think he's a climate catastrophist.
00:07:11.980 Which is, like, if you look at videos of him back when he's introducing the Powerwall,
00:07:15.860 there's a lot of climate catastrophism.
00:07:17.600 So it's just this fascinating development.
00:07:19.700 And he's using natural gas, like a lot of new natural gas, to power Grok.
00:07:23.520 So what we've had is there is just the economic reality.
00:07:27.480 Once you need a lot more electricity, you have to run into the reality that you need
00:07:33.300 more specifically natural gas.
00:07:35.240 Unfortunately, with nuclear, our policy is so bad.
00:07:37.400 We'll discuss in a minute how to fix that.
00:07:39.060 But it's so bad, we cannot rapidly scale up nuclear.
00:07:42.260 Solar and wind have limited scaling ability in terms of actually contributing to reliable
00:07:46.260 electricity because the storage.
00:07:47.500 As we see continually.
00:07:48.440 Yes, exactly.
00:07:49.360 The storage is just so prohibitively.
00:07:51.000 Norway's having a fit at the moment, right?
00:07:52.720 Because they're having to export their electricity because of the treaties they've signed.
00:07:57.200 And their power supply is so unreliable that they're having spot price hikes.
00:08:01.520 What?
00:08:02.020 Spot price hikes of up to $1,000 per kilowatt hour.
00:08:06.840 Something like that.
00:08:07.460 Yeah, I mean, Europe is sort of, has been a precursor in all of these dimensions.
00:08:11.820 I mean, Germany was, they used to tell us it's the model and now it's a joke.
00:08:15.060 Yeah, right.
00:08:15.900 Now they disavow Germany.
00:08:17.560 So you have everyone going.
00:08:19.260 And it's because when you really need a lot more electricity, you have to face economic
00:08:24.580 reality.
00:08:25.100 What was the case before is we had stagnant electricity demand and we could accommodate
00:08:30.400 a certain amount of intermittent electricity on the grid.
00:08:32.960 And we could get away with shutting down a little bit of reliable capacity, although we
00:08:37.180 were sort of bursting at the seams in terms of, you know, we have a polar vortex, the grid
00:08:41.180 almost crashes.
00:08:42.240 We see a crash in California.
00:08:44.000 We see a crash in Texas.
00:08:45.340 We see warnings across the country.
00:08:47.180 But now we have massive new demand.
00:08:50.040 And what the tech companies had to do is they had to go from, what they did before is they
00:08:53.120 just relabeled the fossil fuel electricity.
00:08:55.500 So they would use the fossil fuel electricity and pay the utilities to label it as green.
00:09:00.380 This is called renewable electricity credits.
00:09:03.780 And unfortunately, this is legally allowed, which is one of my recommendations to the
00:09:06.940 new administration, is they need to disallow this.
00:09:09.680 Explain that in more detail.
00:09:10.700 What are they doing exactly?
00:09:11.860 You are allowed to claim that you are 100% renewable, which everyone takes to mean you
00:09:16.760 are using 100% renewable electricity if you buy credit from somebody else to relabel your
00:09:23.540 fossil fuel electricity as renewable.
00:09:26.120 Is that part of the carbon offset?
00:09:27.480 It's a similar kind of thing, but it's a different version of it.
00:09:31.320 It's specifically labeling yourself renewable.
00:09:34.660 So you take like Apple in North Carolina, right?
00:09:37.060 So Apple is drawing from the grid in North Carolina.
00:09:40.080 When you draw from the grid, you draw an equal percentage of every source in the grid,
00:09:45.100 right?
00:09:45.320 They all become like this homogeneous thing.
00:09:47.400 So whatever, I don't know the exact state of the grid right now, but it has historically
00:09:50.580 had a bunch of coal, a bunch of gas, a bunch of nuclear.
00:09:54.000 But Apple wants to label themselves 100% renewable.
00:09:57.660 So how do they possibly do that?
00:09:59.880 Will they pay the utility to say, hey, you know, the coal and gas that Apple is using,
00:10:04.200 that's the responsibility of the home consumers.
00:10:06.660 They're actually using that.
00:10:08.380 And Apple gets credit.
00:10:10.160 Apple has the special electricity.
00:10:12.180 Yeah.
00:10:12.320 They just take the portion of the electricity.
00:10:15.380 Like, it's just a total figment, right?
00:10:17.140 I see.
00:10:17.540 I see.
00:10:17.720 And you can do different versions of this, like called power purchase agreements, which
00:10:21.280 Google does a lot of where like you'll buy a certain portion of the wind in Iowa, even
00:10:25.780 if you're not in Iowa.
00:10:27.540 But you claim that you're using it.
00:10:29.160 So it's all a fraudulent relabeling scheme.
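To make the relabeling arithmetic Epstein describes concrete, here is a toy sketch (all company names generic and all numbers invented; none come from the episode). The point is that credits reshuffle who may claim the renewable share on paper while the physical mix everyone draws stays identical:

```python
# Toy illustration of renewable-credit relabeling (all numbers invented).
# Physically, every consumer on the grid draws the same homogeneous mix.

grid_mwh = {"coal": 400, "gas": 350, "nuclear": 150, "wind": 100}
total = sum(grid_mwh.values())                        # 1000 MWh
physical_renewable_share = grid_mwh["wind"] / total   # 10% for everyone

# A company using 100 MWh buys credits covering 100 MWh of wind.
company_use = 100
credits_bought = 100

# On paper, the company now claims 100% renewable...
company_claimed_share = credits_bought / company_use  # 1.0

# ...so the remaining consumers must, on paper, absorb the entire
# fossil/nuclear share, even though physically everyone still drew
# the same 10%-wind mix.
others_use = total - company_use                      # 900 MWh
others_claimed_renewable = (grid_mwh["wind"] - credits_bought) / others_use  # 0.0

print(physical_renewable_share, company_claimed_share, others_claimed_renewable)
```

Nothing physical changes in this sketch; only the accounting labels move, which is the "figment" being described.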
00:10:33.000 But you can get away with that as long as you don't need much new electricity in the mix,
00:10:37.980 right?
00:10:38.200 But once you need new electricity, you're running out of reliable electricity to relabel
00:10:43.100 as green.
00:10:45.100 And so that's what's happened is, and a friend of mine who's a CEO of a major company was
00:10:50.020 telling me, he was at a conference and he was telling me about the shift in attitude
00:10:53.580 on electricity.
00:10:54.220 And he said, before, these tech companies said to us, if it's not 100% renewable, like
00:10:59.160 to different districts and stuff, don't even talk to us.
00:11:01.860 And now they said, like, they'll burn anything from bunnies to puppies to get electricity.
00:11:07.360 That's the shift.
00:11:07.920 Can you walk everybody through what has, what's at the base of the demand for the IT?
00:11:13.940 Like, I know it, I know it's associated with, with artificial intelligence and these massive
00:11:19.320 banks of compute they're producing, but I'm unclear about the details.
00:11:24.600 Like, what is it that's drawing such immense resources of power?
00:11:28.600 Well, I mean, this is, you know, the best person to talk to is an expert in large language
00:11:32.240 models, but let's just, I'll give it to you in the macro.
00:11:34.540 And especially, I'm very proud that I forecast this in Fossil Future.
00:11:38.200 So Fossil Future was completed in 2021, came out in 2022.
00:11:41.500 And the basic mechanism I talked about is there's really an unlimited human need for
00:11:46.860 energy.
00:11:47.460 And, you know, what energy is, it's machine food or machine calories.
00:11:51.620 And historically, the major use of energy has been to expand and amplify human physical
00:11:57.200 labor.
00:11:57.780 So by expand, I mean, via machines.
00:12:00.660 So we can do things that we couldn't do.
00:12:02.500 Like, we can power an incubator with energy.
00:12:04.840 We can't get five humans together and make an incubator, right?
00:12:07.840 We can't get a thousand humans and make a plane.
00:12:09.540 So one thing energy and machines do is they expand our productive abilities.
00:12:13.540 And then they also amplify.
00:12:15.440 So the example of, well, a modern combine harvester will make an agricultural worker able to reap and
00:12:20.680 thresh 1,000 times more wheat than he could on his own, right?
00:12:25.140 So that's the kind of classic thing.
00:12:26.880 So it expands and amplifies the abilities.
00:12:28.860 Historically, it's been primarily our physical abilities.
00:12:32.940 And what the AI does, and it's really better thought of as augmenting our intelligence, is
00:12:37.940 it's figuring out new ways to dramatically expand and amplify our mental abilities.
00:12:44.060 So as we're recording this, it's been about a week since ChatGPT Pro came out.
00:12:47.980 So ChatGPT Pro is a $200 a month product of OpenAI, which for certain businesses, I think,
00:12:56.220 is going to just be wildly cheap, including mine.
00:12:58.600 So any kind of, like, I'm in the business, including of creating energy policy and arguments
00:13:03.360 and this kind of thing.
00:13:04.480 And this is something modern AI, and specifically these large language models, has become incredibly
00:13:09.340 good at.
00:13:09.780 And with the Pro version, it's just, it's unbelievable in terms of just helping you make decisions,
00:13:14.580 helping you solve problems.
00:13:16.160 So I have, you know, I had a very complex accounting and legal question that I needed because, you
00:13:22.740 know, I'm in the world of politics, and there's always a question of what's lobbying and what's
00:13:26.620 not lobbying, and do you want a nonprofit structure or a for-profit structure?
00:13:30.380 And I can just lay this out, and it can give me, you know, the equivalent of 10 hours of
00:13:34.060 a lawyer, and then I can just run it by an actual lawyer for one hour to vet it, and I
00:13:39.160 save whatever it is.
00:13:40.480 Right, right, right.
00:13:41.060 $5,000 or something like this.
00:13:42.940 And it's, of course, much quicker.
00:13:44.140 The thing is on demand.
00:13:44.980 It doesn't get sick.
00:13:45.940 It doesn't make spelling errors, right?
00:13:47.680 It's this thing.
00:13:48.520 But the way in which we do this is, like, the computation involved is sort of incredibly,
00:13:55.760 to call it crude is not quite the right way to put it, but it's like, it's not nearly
00:14:00.560 as energy efficient as our brains.
00:14:02.120 I mean, it's not even remotely as energy efficient.
00:14:04.160 And basically, part of what it does is it just scans the entire sum, at least in terms of
00:14:09.460 words, of human knowledge and, like, everything that we've ever created to find patterns in
00:14:14.820 these very sophisticated ways.
00:14:15.920 And this is where I'm no longer an expert.
00:14:18.040 But the key thing is to amplify our mental abilities to our maximum capability right now
00:14:23.900 requires this incredibly energy-intensive thing that people are very, very excited about.
00:14:28.900 Yeah, so our brains are remarkable, not only for the fact that they're intelligent, but
00:14:32.620 for the fact that they're insanely energy efficient.
00:14:35.400 Right.
00:14:35.660 And so we don't have, like, you know, there's a lot of stuff in biology that's just insanely
00:14:39.460 efficient that we have not been able to replicate with non-biology.
00:14:42.660 But of course, the great thing about energy is we don't need to be as efficient as nature
00:14:48.800 at any given point, because, you know, for a human in the United States, you know, we
00:14:53.640 have 75 times more energy used by our machine servants than we do by our own bodies, right?
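The rough arithmetic behind a "machine servants" multiple like this can be sketched as follows. The figures below are ballpark assumptions (a ~2,000 kcal/day human metabolism and roughly 300 GJ/year of US per-capita primary energy), not numbers from the episode; depending on which totals you use, the ratio lands somewhere in the 75-100x range:

```python
# Order-of-magnitude check on the "machines use far more energy than
# our bodies" claim. All inputs are approximations, not transcript data.

KCAL_TO_J = 4184

# A human body runs on roughly 2,000 kcal/day.
body_j_per_day = 2000 * KCAL_TO_J      # ~8.4 MJ/day

# US per-capita primary energy use is roughly 300 GJ/year.
machine_j_per_day = 300e9 / 365        # ~820 MJ/day

ratio = machine_j_per_day / body_j_per_day
print(round(ratio))  # on the order of 100x with these round inputs
```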
00:15:00.440 So that's, and we're going to, but with the AI and with the need for knowledge, what we're
00:15:05.540 finding is there's no real end point to our desire to augment our intelligence.
00:15:11.080 Right, right.
00:15:11.600 And in particular, in the realm of medicine, and then more broadly, longevity, if you,
00:15:16.780 and this is going to be really paid, it's one of these-
00:15:18.480 Scientific discovery in general.
00:15:20.140 Yeah, but if you think about things like, what are billionaires going to be willing to
00:15:24.820 pay for?
00:15:25.600 Well, are they going to, how much are they, you know, if you have $100 billion, are you
00:15:29.320 willing to invest $10 billion with a 10% chance that you'll get a five-year longer life?
00:15:34.340 Right, right.
00:15:34.640 Probably.
00:15:35.340 Right, right, right.
00:15:35.760 I think that's a great thing, and that's going to benefit all of us tremendously.
00:15:38.740 But that's the kind of, I mean, there's also just the whole phenomenon of creating not
00:15:43.700 just, right now, AI is primarily advisors, right?
00:15:46.360 That's sort of giving us advice on what to do.
00:15:49.120 But as it becomes more of an agent model, then you can do more and more.
00:15:52.820 And of course, nobody knows exactly how successful these will be, how much they'll proliferate,
00:15:57.720 what their limits will be, what new capabilities they'll have.
00:16:00.900 But obviously, the world is very excited about it, particularly the digital tech world.
00:16:04.820 From a security perspective, it's, we view it as existential, which I think is a correct
00:16:10.240 read on it, given the power, like, in every sense of this, including metaphorical.
00:16:16.000 And the rate of change.
00:16:16.980 Yeah, it's just, this is the kind of thing you want to be very much on top of.
00:16:21.360 So for all of these reasons, there is huge urgency in, I think, proper urgency in the
00:16:26.940 digital tech world to-
00:16:28.240 Well, even to keep ahead of the Chinese, for example.
00:16:30.080 Yeah, right.
00:16:30.640 And you're seeing that.
00:16:31.500 And I think somebody like, you know, Burgum in the new administration, like, this is a
00:16:35.080 big focus of his, in particular.
00:16:36.920 It's like, he's very sensitive to the national security implications of AI.
00:16:40.560 Well, so it's lovely to see that when push came to shove, so to speak, that the big tech
00:16:48.160 companies in the United States returned to their own narrow self-interest and made the
00:16:53.840 right bloody decisions.
00:16:54.780 Yes, yes.
00:16:55.660 That's really, it's really something to see.
00:16:57.420 And that's not, I don't, I'm not even so cynical about that.
00:17:00.680 I mean, this is, I think, part of why sometimes populations will have better conclusions than
00:17:05.780 the experts at a given time.
00:17:07.900 Although I'm a big fan of consulting experts in a proper way, but you'll often have something
00:17:12.200 like-
00:17:14.040 So it's a new year, 2025.
00:17:16.240 And you're thinking, how am I going to make this year different?
00:17:18.340 How am I going to build something for myself?
00:17:19.960 I'm dying to be my own boss or see if I can turn that business idea I've been kicking
00:17:23.720 around into a reality, but I don't know how to make it happen.
00:17:26.600 Well, Shopify is how you're going to make it happen.
00:17:28.940 And let me tell you how.
00:17:29.880 The best time to start your new business is right now.
00:17:32.300 Shopify makes it simple to create your brand, open for business, and get your first sale.
00:17:36.300 Get your store up and running easily with thousands of customizable templates, no coding
00:17:40.260 or design skills required.
00:17:41.780 Their powerful social media tools let you connect to your channels and create shoppable posts to
00:17:46.380 sell everywhere people scroll.
00:17:47.600 Shopify also makes it easy to manage everything, shipping, taxes, and payments from one single
00:17:53.040 dashboard.
00:17:53.880 Well, what happens if you don't act now?
00:17:55.360 Will you regret it?
00:17:56.260 What if someone beats you to the idea?
00:17:57.920 Don't kick yourself when you hear this again in a year because you didn't do anything now.
00:18:01.520 Established in 2025 really has a nice ring to it, don't you think?
00:18:04.580 Sign up for your $1 per month trial period at shopify.com slash jbp.
00:18:08.980 That is all lowercase.
00:18:10.400 Head over to shopify.com slash jbp to start selling with Shopify today.
00:18:14.520 Again, that's shopify.com slash jbp.
00:18:17.600 People just sort of know, you know, it's not good, like with the whole fossil fuel thing.
00:18:25.860 So even just in terms of common sense.
00:18:27.520 So another development, so I mentioned one development, is the urgent need for more electricity
00:18:32.680 and the recognition that fossil fuel expansion is necessary for that.
00:18:36.660 But number two, and this is what you were alluding to with Norway, is the very conspicuous
00:18:41.860 failure of the net zero agenda, even when only barely implemented.
00:18:47.360 Yeah, right.
00:18:48.120 So it's important.
00:18:48.740 Well, one of the scandalous elements of that is that there's no single town on the planet
00:18:56.140 that runs entirely on renewables, right?
00:18:58.540 There's no micro projects, proof of concept.
00:19:02.060 Except poor ones do.
00:19:03.400 They run on wood and dung.
00:19:04.000 Yeah, well, right.
00:19:04.760 Well, yeah.
00:19:05.500 But they don't really run.
00:19:06.520 Well, that's the problem.
00:19:07.480 That's the problem.
00:19:08.380 They more die.
00:19:09.080 Yeah, yeah, exactly.
00:19:10.340 And so, and the fact that electricity prices spike toward the infinite as the wind stops blowing
00:19:16.100 and it's nighttime, which turns out to be a real problem if you happen to be like in the winter.
00:19:20.800 Yeah, yeah.
00:19:21.180 You know, so yeah, yeah.
00:19:22.960 And then you need the parallel.
00:19:24.620 The thing that's so bloody peculiar about that is that because these renewable sources
00:19:32.280 are sporadic and unreliable, you have to have a backup system that has the same capacity
00:19:39.320 as the renewable system when it falls to zero.
00:19:43.240 And so what you have is a new system built on top of the old system, being particularly
00:19:48.200 catastrophic in Germany, right, where they shut off their nuclear plants and now use
00:19:52.720 lignite-fired coal plants to augment their unreliable renewables.
00:19:59.580 I mean, it's complete insanity, right?
00:20:02.120 It drove up the price.
00:20:04.340 And so this is a case where I think the general public was much smarter than, say, the New
00:20:08.580 York Times, where the general public was like, wait a second, we were told to shut down
00:20:13.180 these reliable fossil fuel plants and they could be easily replaced.
00:20:16.360 And now we have all these electricity shortages and our electricity prices are higher.
00:20:20.980 Maybe these two are related.
00:20:22.740 And then the New York Times is like, no, no, no, no, no.
00:20:24.660 There's nothing to see here.
00:20:25.780 Renewables are actually cheaper, right?
00:20:27.500 They're actually cheaper, even though we added a lot of them and our electricity got more
00:20:30.620 expensive and less reliable.
00:20:32.200 They're really cheaper.
00:20:33.340 And we'll make up a number called levelized cost of electricity that tries to calculate
00:20:37.500 the cost of electricity if it doesn't have to be reliable.
00:20:40.320 And so we'll tell you it's so there's all this like mumbo jumbo, which I sort of debunk in
00:20:44.140 fossil future, like chapter six type stuff.
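The levelized cost of electricity being criticized here is a standard textbook metric: discounted lifetime costs divided by discounted lifetime output. The sketch below uses invented plant numbers; the thing to notice is that nothing in the formula prices availability, so a cheap-per-MWh intermittent plant and a dearer dispatchable one can't be compared by LCOE alone:

```python
# Levelized cost of electricity (LCOE), standard formula.
# Note what it does NOT include: any penalty for being unavailable when
# demand needs it, or the cost of backup/storage required to firm it up.
# All plant figures below are invented for illustration.

def lcoe(costs_per_year, mwh_per_year, discount_rate, years):
    """Discounted lifetime cost divided by discounted lifetime output, $/MWh."""
    cost_pv = sum(costs_per_year[t] / (1 + discount_rate) ** t for t in range(years))
    energy_pv = sum(mwh_per_year[t] / (1 + discount_rate) ** t for t in range(years))
    return cost_pv / energy_pv

years = 20
# Plant A: cheap per MWh but intermittent; Plant B: dearer but dispatchable.
a = lcoe([5_000_000] + [500_000] * (years - 1), [100_000] * years, 0.05, years)
b = lcoe([6_000_000] + [900_000] * (years - 1), [120_000] * years, 0.05, years)
print(round(a, 2), round(b, 2))  # LCOE alone can't tell you which keeps the lights on
```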
00:20:46.240 But this is another thing where the net zero agenda promised us we would be richer.
00:20:51.240 Yeah, right.
00:20:52.040 And then even just a very marginal implementation.
00:20:54.840 And I want to stress this because we haven't even reduced the supply of fossil fuels in the
00:20:58.620 world.
00:20:58.900 We've just slowed the growth.
00:21:00.340 Right.
00:21:00.620 And we're having all these problems.
00:21:01.920 So even just a very bare marginal attempt to slow the growth in the name of net zero has
00:21:07.460 been a disaster.
00:21:08.420 So that's number two.
00:21:09.360 And they're related because, you know, we have this this clear demand that fossil fuels
00:21:14.060 are needed for.
00:21:14.760 And then we restrict fossil fuels some and we start getting these big problems when we
00:21:19.120 were told we would get big wealth, basically.
00:21:22.860 And do you think there's any utility in the renewable energy sources?
00:21:27.580 I mean, yeah, you do.
00:21:29.460 OK, well, of course.
00:21:30.240 So, I mean, the obvious things are where they're already used in a free market.
00:21:33.560 So with their off grid kind of applications and that kind of thing, I think what we need
00:21:37.940 and this is this relates to some policy ideas is insofar as we're going to have electricity
00:21:42.200 markets, what you really need is some form of tech neutral reliability or dispatchability
00:21:47.580 standard where you allow the intermittent sources the chance to provide reliable electricity,
00:21:54.040 but you require them to.
00:21:55.320 So just to give you an example, like, let's just say, let's say like in five years, Elon
00:22:00.980 thinks, hey, you know what?
00:22:02.160 I can get solar and batteries to the point where I can provide dispatchable electricity.
00:22:06.720 Or maybe it's I can get solar and batteries and maybe I'll have a few gas peaker plants
00:22:11.360 as like a backup.
00:22:12.900 And I think I have this this system to do it.
00:22:15.960 I want to encourage that kind of thinking because you could imagine it could be possible,
00:22:20.600 but you also don't want to burden the grid with somebody's incorrect idea.
00:22:24.580 And most people's initial ideas are incorrect.
00:22:26.760 So the basic way you do this conceptually, the details become difficult, but you basically
00:22:31.860 say everyone on the grid has to meet a certain standard of dispatchability or reliability.
00:22:36.900 We don't care how you do it.
00:22:38.580 You can do it with whatever you want.
00:22:40.360 Right, right.
00:22:40.920 You're a black box and we just demand certain standards of performance of the black box.
00:22:46.000 I think that kind of model will allow you to have market discovery if any of these ideas
00:22:51.840 are true.
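The "black box" standard Epstein proposes could be checked in software along these lines. This is a hypothetical rule of my own construction, not anything from existing market design: the grid operator never asks how the power was made, only whether the delivered output met the committed firm capacity every hour.

```python
# Sketch of a technology-neutral dispatchability standard (hypothetical rule):
# a provider is a black box; we only verify that delivered output met some
# minimum fraction of its committed firm capacity in every hour.

def meets_standard(delivered_mw, committed_mw, min_fraction=0.98):
    """True if the provider delivered at least min_fraction of its
    commitment in every hour, regardless of generation technology."""
    return all(d >= committed_mw * min_fraction for d in delivered_mw)

# Gas plant: steady output around the clock.
gas_hours = [100] * 24
# Solar-plus-battery portfolio: fine by day, falls short overnight.
solar_battery_hours = [100] * 16 + [60] * 8

print(meets_standard(gas_hours, 100))            # True
print(meets_standard(solar_battery_hours, 100))  # False: overnight shortfall
```

Under a rule like this, a solar-battery-peaker combination that genuinely delivers would qualify, which is exactly the market-discovery property being argued for.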
00:22:52.360 But unfortunately, what's happened is people have made these crackpot claims that we can
00:22:56.420 just power it with solar and a little batteries.
00:22:58.520 And they've used all these false models that are, you know, people like me spend a lot
00:23:01.880 of time debunking.
00:23:02.780 But then they just ruin the grid because what they're really getting is they have the right
00:23:07.000 to sell unreliable electricity with no reliability guarantee at the same price.
00:23:12.840 And in fact, with subsidies, a far greater price than reliables.
00:23:16.780 And so this would be the equivalent of the government passes a law and says, you know, rental car companies
00:23:22.460 have to charge the same for a car that works all the time in a car that works a third of the
00:23:27.800 time and you don't know when.
00:23:28.980 And actually, you have to pay.
00:23:30.460 Actually, we're going to subsidize the car that works a third of the time and you don't
00:23:33.960 know when, which then actually is going to take money away from the reliables, which has
00:23:37.600 happened on our grid is we give, you know, whatever pool of utility payments and stuff
00:23:42.620 we have on the grid.
00:23:43.680 More and more of that goes to solar and wind, in part because of subsidies, because they
00:23:47.560 can always bid a negative price if they want. They can basically say, I'm going
00:23:52.080 to pay you to take our electricity, because we're giving them so much as taxpayers.
00:23:56.100 So even if they pay the grid, we pay them way more than that.
00:23:59.540 So it's just this totally screwed up system.
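The subsidy mechanics described here come down to simple arithmetic; the $26/MWh figure below is an assumed, roughly historical US production-tax-credit level, not a number from the conversation:

```python
# Hypothetical numbers showing why a subsidized generator can bid a
# negative price (pay the grid to take its power) and still profit.
# The $26/MWh subsidy is an assumed, roughly historical US production
# tax credit level, not a figure from the conversation.

def net_revenue_per_mwh(market_bid: float, subsidy: float) -> float:
    """Producer's net take per MWh: the market price it bid (possibly
    negative) plus the per-MWh subsidy paid for generating at all."""
    return market_bid + subsidy

SUBSIDY = 26.0  # assumed $/MWh production tax credit

# Bidding at -$10/MWh, the subsidized producer still nets $16/MWh...
assert net_revenue_per_mwh(-10.0, SUBSIDY) == 16.0

# ...while an unsubsidized plant making the same bid just loses $10/MWh,
# so it can't compete for dispatch without losing money.
assert net_revenue_per_mwh(-10.0, 0.0) == -10.0
```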
00:24:01.880 But I'm not one of these people who says we should just not consider
00:24:05.840 solar and wind.
00:24:07.580 But we need real markets.
00:24:09.640 Yeah.
00:24:09.920 Well, the fundamental question under that has got to be something like, well, why would
00:24:13.880 we take off the table any potential source of innovation that would make energy more
00:24:20.180 plentiful and more reliable?
00:24:21.840 Right.
00:24:22.200 Because we need it.
00:24:23.000 We wouldn't.
00:24:23.480 But with the grid, what we have is a monopoly situation.
00:24:26.380 So you have to think of it in that context.
00:24:29.640 Now, I think long term, we could argue that we don't even need to have a monopoly with
00:24:32.720 the electricity grid.
00:24:33.560 But in anything resembling the near future, when you have government monopolies, and
00:24:37.980 insofar as you have these electricity markets, which are not exactly markets, they're
00:24:42.780 more like government pricing schemes, you need to orient those so that they reward
00:24:48.100 and really value reliability.
00:24:49.740 And right now, they don't.
00:24:51.480 So people talk about "all of the above," which I think is a pretty bad term,
00:24:55.860 because you don't want everything.
00:24:57.820 Do you just want everything because it happens to exist?
00:24:59.660 Like, we don't want animal dung, right?
00:25:00.980 We don't want wood. Those are part of "the above."
00:25:03.240 You really want "best of the above."
00:25:05.240 And that's what you get in a real market.
00:25:07.260 So with electricity, we need to create the closest approximation we can, within a monopoly
00:25:11.940 system, of a real market, where the lowest-cost, most reliable solution is allowed to emerge
00:25:18.960 and be rewarded.
00:25:20.620 OK, so when we talked at the beginning here, when we were trying to structure this conversation,
00:25:25.340 I remembered that, you know, you had written these two books, The Moral Case
00:25:29.460 for Fossil Fuels and then Fossil Future.
00:25:31.460 And I asked you if you were writing another book, and you sort of said
00:25:36.660 kind of, but you're not.
00:25:38.100 You're focusing on energy policy, per se.
00:25:40.740 And you wanted to step through the you have five points.
00:25:44.020 Yeah, five kind of big objectives.
00:25:45.460 So let's I'd like you to go through those.
00:25:47.880 And one of the things I want to return to at some point, because I don't want to forget
00:25:51.420 about it, is how you view the role of nuclear power in this.
00:25:55.500 Well, that's going to be one of them.
00:25:56.460 OK, fine.
00:25:57.460 Let me lead into this, actually, by saying sort of the way in which I think I'm part
00:26:02.680 of this change in the culture, because it sort of relates to the relationship
00:26:07.600 between my work in the past and my work right now.
00:26:10.820 Yes.
00:26:11.000 So if you think of The Moral Case for Fossil Fuels and Fossil Future, they were
00:26:15.100 really, you know, designed to sort of create the ultimate guide to evaluating
00:26:21.000 energy sources.
00:26:22.380 Fossil fuels just happen to be the conclusion: that this needs to be
00:26:26.460 our dominant form of energy because there's nothing close in terms of cost, reliability,
00:26:30.960 versatility and scalability for the foreseeable future.
00:26:33.460 But it's really about how do you evaluate our energy situation and potential side effects
00:26:38.960 of energy, including climate side effects, which is the main thing people are concerned
00:26:42.300 about, in a pro-human way.
00:26:44.420 And the basic idea is that you need to be very even handed.
00:26:49.640 So you need to very carefully weigh both benefits and side effects of fossil fuels, particularly
00:26:54.780 with climate.
00:26:55.400 And within side effects, you need to be even handed.
00:26:57.260 So you can't just look at negative climate effects.
00:26:59.140 You also have to look at positive effects.
00:27:01.380 And part of being even handed is you need to be precise.
00:27:03.720 You mean like more plants.
00:27:04.880 Yeah, exactly.
00:27:05.620 Things like that.
00:27:06.500 Yeah, exactly.
00:27:07.440 Way more plants, as a matter of fact.
00:27:09.080 Right, right.
00:27:09.560 We talked a lot about that last time.
00:27:11.920 But then also, of course, open to things like more heat waves and, you know, expansion of
00:27:16.740 water and sea level rise.
00:27:18.140 And the idea is you need to be very even handed.
00:27:20.920 And so that's one of the core methodological things that I think I've encouraged people to
00:27:25.380 use and that I think you're seeing more and more of.
00:27:27.420 And people like Bjorn and Schellenberger and Koonin, I think, have also done this.
00:27:32.540 And then the other thing, which is a little bit deeper, is that we have to evaluate them
00:27:37.460 in a pro-human way and be aware that most people are either deliberately, I think in most cases,
00:27:43.680 inadvertently evaluating fossil fuels from an anti-human perspective.
00:27:47.820 And then within that, again, we talked about this last time, but just the sort of key ideas
00:27:52.080 are one is when we're thinking about the Earth and what our goal for it is, our goal, we need
00:27:58.640 to be clear, our goal could either be to advance human flourishing on Earth or to eliminate
00:28:03.020 human impact on Earth.
00:28:05.080 And that the dominant goal guiding our policy is this goal of eliminating human impact,
00:28:09.480 as evidenced by the fact that the number one cultural political goal in the world right
00:28:14.420 now is eliminating our impact on climate.
00:28:18.680 That's what net zero means.
00:28:19.680 So like our whole focus with the Earth is how do we eliminate our impact on it in general
00:28:23.520 and particularly with climate.
00:28:25.380 And my argument is that's an anti-human perspective in the same way that if our goal were to eliminate
00:28:30.200 bear impact, you'd think that's an anti-bear perspective.
00:28:34.040 Yeah, well, just get rid of the bears.
00:28:35.560 Exactly.
00:28:36.360 That's what, that's the logical.
00:28:37.900 There's too many of those damn bears.
00:28:39.300 Exactly.
00:28:39.840 That's obviously where this is trending, although they don't say that, right?
00:28:43.360 They used to say it, but they don't say it as much anymore.
00:28:46.340 They used to say we're against population growth and we're against technology, but that
00:28:50.420 didn't go over well.
00:28:51.740 So then they said, we're just against climate change, we're against climate impact.
00:28:56.580 And then they sort of, you sort of fill in the blanks.
00:28:58.620 Oh, wait, the way to get rid of that is regress technologically.
00:29:02.340 And have no people.
00:29:03.220 Yeah, have no people, right.
00:29:04.460 So there's this moral perspective of what I call your standard of evaluation.
00:29:08.940 So it's your standard advancing human flourishing on Earth or eliminating human impact on Earth.
00:29:13.920 And of course, I'm on team human here.
00:29:16.260 And then I think the part you are most interested in, which is your basic premise about the
00:29:21.640 nature of humans and our environment and what I call the delicate nurturer view, which is
00:29:26.800 the main view, that basically the unimpacted Earth is this nurturing mother that's stable,
00:29:35.440 so it doesn't change too much.
00:29:37.080 It's safe.
00:29:37.920 It doesn't endanger us.
00:29:39.160 It's harmonious.
00:29:39.960 Yeah.
00:29:40.220 And it's sufficient.
00:29:41.580 It gives us enough resources as long as we're not too greedy, right?
00:29:44.760 And then human beings are what I call parasite polluters.
00:29:47.820 So we just take from the Earth and we ruin the Earth.
00:29:50.300 And my view is, well, in reality, this is all just nonsense.
00:29:52.500 Like, it's total pseudoscience, even though many scientists believe it.
00:29:55.880 And in fact, human beings are producer improvers.
00:29:59.040 So many people who identify as scientists believe it.
00:30:01.460 Well, no, I think even many real scientists do, unfortunately, because many specialized
00:30:07.060 intellectuals are in the thrall of bad philosophy because they don't think about philosophy.
00:30:11.380 Yes, yes.
00:30:11.980 So I think we're producer improvers.
00:30:14.860 So we add value to the world.
00:30:16.540 That's why we have this amazing world now that's abundant and safe, even though the caveman
00:30:21.300 had nothing.
00:30:22.080 Like, if the world were abundant absent us, the caveman would be rich and we'd be poor
00:30:25.880 because there are so many of us.
00:30:27.020 But it's the opposite.
00:30:28.600 So we're producing and we improve our environment in many ways.
00:30:31.320 Like, we've rid it of all kinds of disease and disgustingness.
00:30:34.480 And then, of course, we give ourselves access to natural beauty.
00:30:37.080 We can decide to cultivate whatever species we want.
00:30:39.800 You know, if we love a species, we can make it plentiful.
00:30:42.140 And then the Earth is not this delicate nurturer.
00:30:44.780 It's actually, I call it, wild potential.
00:30:46.540 So it's not stable, it's dynamic, it's not safe, it's dangerous, and it's not sufficient,
00:30:52.840 it's deficient.
00:30:53.760 And we need to impact it a lot intelligently to make it abundant and a safe place.
00:30:57.800 And so when you think of fossil fuels in this even-handed way from a pro-human value
00:31:02.280 perspective, and you get rid of this anti-human view of humans and our environment, it's very
00:31:07.980 obvious that, well, this thing we've cultivated called fossil fuel is just this incredible
00:31:13.960 net benefit, because it just allows us to harness energy and therefore machine labor,
00:31:19.160 you know, all these machine servants like never before.
00:31:21.440 And one of the things about energy is it can solve any problem, including the problems it
00:31:25.940 creates.
00:31:26.860 So if energy creates a drought challenge, well, it can also create irrigation and it can also
00:31:32.020 create crop transport.
00:31:33.420 Even if it creates something like, you know, an air pollution challenge, it can also create
00:31:37.880 the technology that can filter the air.
00:31:40.220 And if anyone happens to get sick, it can also create the whole medical industry.
00:31:44.100 Yeah, and God only knows how much that'll be augmented by this electricity-demanding AI.
00:31:48.580 So it's, again, it's energy solving its own problems.
00:31:51.120 So I feel like I got, particularly with Fossil Future, I sort of got to a level where I felt
00:31:57.300 like I had fully, like, fleshed out how to think about this in a pro-human way.
00:32:01.520 And then to amplify that, we created this thing called Energy Talking Points, which people can
00:32:06.060 see at energytalkingpoints.com, and the idea there was, let's make this accessible, let's make
00:32:10.800 it easy for anyone to make and understand these arguments.
00:32:14.100 And I basically just broke every issue up into tweet-length points.
00:32:17.920 And our target was politicians, so we wanted to make it easy for politicians to talk about
00:32:21.920 this.
00:32:22.400 And what we saw is once we made it easy, like, once you make it easy for people to say the
00:32:26.520 right thing, they'll say it a lot more.
00:32:28.660 And so we saw even in this Republican presidential primary, you know, candidates like Ron DeSantis and
00:32:33.520 Vivek, making points like, we've had a 98% decline in climate-related disaster deaths over
00:32:39.200 the last century.
00:32:41.320 Right.
00:32:41.880 Well, there's a good practical lesson embedded in what you just said that everyone should
00:32:46.360 listen to very carefully when they're considering negotiating.
00:32:50.420 Like, if you want things to move in a particular direction, make it very easy for people to move
00:32:56.320 in that direction.
00:32:57.100 You want to do a lot of the work a priori that would be necessary to help them move in that
00:33:02.440 direction.
00:33:02.780 If you go to your boss with a problem, it's very useful to accompany that with a solution
00:33:08.520 that's thought through and already ready to implement.
00:33:11.760 It's much more likely to occur.
00:33:13.360 Yeah, as someone who employs about 10 people, I can tell you you'll be in the top 1% of
00:33:17.240 employees if you come with solutions.
00:33:19.060 Yes, yes, yes, yes.
00:33:20.560 Well, and if you have some idea about what a solution might be desirable for you, coming
00:33:25.680 armed with the strategy that would make that simply implementable, and some indication
00:33:31.060 that you've thought through the consequences radically improves your chances of success.
00:33:35.500 Otherwise, you're just a pain, the kind of messenger that gets shot.
00:33:39.200 And this, I feel, is going to relate to what I'm doing now, but even in
00:33:43.180 the realm of energy evaluation and messaging, I found it was a huge breakthrough to make it
00:33:48.220 easy to be my ally.
00:33:49.300 That was the sort of, that was a breakthrough.
00:33:52.420 And there's obviously tons more work to do here, but I felt like, I kind of think of
00:33:57.720 myself as either a practical philosopher or an intellectual engineer.
00:34:01.240 Like, I like engineering intellectual products that help people flourish.
00:34:05.180 And I sort of felt like my core work that I wanted to do here, like the most, there was
00:34:11.720 less innovation forward than there was behind me in terms of energy evaluation.
00:34:16.740 And I, of course, I build a team and there's a lot to do, but I feel like I had really,
00:34:20.760 to my satisfaction, answered all the arguments on the other side, taught people how to think
00:34:24.460 about this.
00:34:25.500 And I was trying to think of, okay, like what else, and it's going to take a long time
00:34:28.920 for this all to flesh out and stuff, and I'm going to keep working on it.
00:34:31.940 But sort of, what's the next frontier that I'm interested in?
00:34:37.880 And I do think that those of us, I call us energy humanists, I do think we've made a
00:34:42.380 big difference.
00:34:43.340 So like Bjorn Lomborg, me, Michael Schellenberger, Steve Koonin, and I think you've really taken
00:34:48.480 up this mantle as well.
00:34:50.060 And it's really helped a lot.
00:34:52.400 And I don't want to be complacent, because we need massively to spread it.
00:34:56.040 But in terms of what I personally wanted to do, I felt like there was a much bigger gap
00:35:01.720 now to fill, potentially in a very time-limited window.
00:35:07.340 Well, I really like your emphasis on the nexus between energy provision and human flourishing.
00:35:13.440 I mean, partly, you can make a pretty blunt case for that from an environmental perspective,
00:35:17.800 even if you're rather radically environmentally oriented, in that
00:35:24.940 if you impoverish people, which you certainly will do if you make energy expensive,
00:35:29.400 you make them desperate.
00:35:31.620 And desperate people are not investing in a green future.
00:35:35.380 That's for sure.
00:35:36.620 They're going to rampage through whatever resources are available to them in very short
00:35:40.920 order.
00:35:41.180 And so I got convinced of this, well, probably 15, 20 years ago, when I started to understand
00:35:49.560 the statistical data indicating that if you got people's average GDP up above
00:35:55.220 $5,000 a year in US dollars, they started taking a long-term view of the future ecologically.
00:36:00.960 It's like, well, of course that's the case.
00:36:03.240 And then I thought, that's so cool.
00:36:04.540 That means that you could work really hard to make energy inexpensive and people rich.
00:36:09.520 And one of the consequences of that would be that people would be much more attentive
00:36:13.940 to genuine ecological concerns locally and over time.
00:36:18.880 Well, yes, I think, you know, when I talk about advancing human flourishing on Earth, I think
00:36:22.800 of, I don't draw a distinction really between our economy and our environment.
00:36:27.840 I mean, I think it's actually all our environment.
00:36:29.820 And I think of environment in a very humanistic way.
00:36:32.320 So just take a bird, right?
00:36:33.760 Like, is a bird's nest a part of its environment?
00:36:36.860 Right.
00:36:37.080 I would say yes, right?
00:36:38.380 So I think a factory is our environment and the beach is our environment.
00:36:41.660 Yeah.
00:36:42.020 I think we're just uniquely good at reshaping our environment to be particularly conducive
00:36:46.940 to us.
00:36:47.420 So when you talk about ecological thinking, I really think of that as humanistic.
00:37:09.660 Humanistic, thinking about our environment as in how do we make sure that we have-
00:37:12.940 Yes, I was still using that dichotomous perspective.
00:37:14.260 So you're looking at it from an advancing human flourishing on earth perspective, but what
00:37:20.300 you're pointing out is the more resources you have and the more time you have, the more
00:37:24.440 you can think about that in a broad way, right?
00:37:27.960 When you're just freezing to death.
00:37:29.100 Yes, exactly.
00:37:29.600 You just cut down whatever trees are around you and you burn them and what else are you
00:37:34.200 going to do?
00:37:35.000 Versus you don't think as holistically about your environment, not because you wouldn't
00:37:38.760 care about those things.
00:37:39.560 It's just you have a sense of urgency.
00:37:41.080 Because once you're wealthy, you can think about things like, hey, what would
00:37:44.920 the ideal climate be?
00:37:46.360 I mean, let's leave aside, are we negatively impacting?
00:37:48.700 Like, how can we maybe make more places like California, right?
00:37:52.220 Or how can we optimize the species on this particular island for some
00:37:57.620 particular goal?
00:37:58.480 Or even, we really like, you know, at home, we have a dog, and it's like, how can we make
00:38:02.560 this dog really thrive?
00:38:04.180 Or how can we get rid of these mosquitoes?
00:38:05.980 Like, we don't like these malarial mosquitoes.
00:38:07.540 The polar bears, they're beautiful, but we want them cordoned off so they don't eat
00:38:12.260 us.
00:38:12.520 Like, we're really engineering the earth.
00:38:14.680 So when you talk about ecological stuff, I think about it as this very sort of long-term
00:38:19.740 and broad thinking engineering of the earth to advance human flourishing.
00:38:24.820 Whereas the anti-impact crowd, that's not how they think of it.
00:38:27.920 So if you made that argument to them, they're like, no, we don't want to impact it at all.
00:38:31.740 We don't want like 8 billion prosperous people who have nice gardens and clean air.
00:38:36.380 Like, that's way too humanized in earth.
00:38:39.320 We need, you know, back to the Pleistocene, as the, you know, Earth First, I think, used
00:38:43.440 to say.
00:38:43.860 So that's just to say, I don't like this idea of, oh, "from an environmental
00:38:48.720 perspective," because: is it a pro-human environmental perspective or an anti-human one?
00:38:52.540 And if it's anti-human, they just don't, they won't accept anything that involves human
00:38:56.460 success.
00:38:58.860 Correction accepted.
00:38:58.860 So thank you.
00:38:59.820 So let's go on to what I think the big opportunity is.
00:39:05.560 And so when I'm, you know, I'm very like, I wouldn't exactly say hedonistic, but like,
00:39:10.680 I'm very much an enjoying life and work person.
00:39:14.740 And like, I like doing things that are very beneficial to others that I really like doing.
00:39:20.040 I'm not like a good, like, I'm going to be miserable for 20 years and everyone else is
00:39:24.400 going to be happy.
00:39:25.060 Like, that doesn't appeal to me much.
00:39:26.400 So I think I do as much as anyone for energy, but I like to enjoy it.
00:39:31.440 And part of that is I like to, you know, for me, what's interesting is like an unsolved
00:39:36.440 problem that I think would be fun to solve that I'm not convinced anyone else is going
00:39:40.760 to solve unless I work on it, which again, people can say like that's megalomaniacal
00:39:45.800 or whatever.
00:39:46.220 But in this case, I think it was pretty clear there was an unsolved problem, which is there
00:39:50.660 was no real pro-freedom energy policy fully worked out in the event of a pro-freedom administration
00:39:58.360 and Congress.
00:40:00.020 And so, you know, like you look back a couple of years ago and I learned this particularly,
00:40:04.060 maybe we could start there on the issue of nuclear energy, because I'm just a huge, like
00:40:08.560 I've been interested in nuclear and enthusiastic about nuclear far longer than I've been enthusiastic
00:40:12.580 about fossil fuels.
00:40:13.520 Cause you know, I grew up in a liberal environment.
00:40:15.160 I'm like, I was afraid of climate change and this kind of thing.
00:40:17.740 Whereas with nuclear, I never really had the nuclear kinds of fears.
00:40:22.080 I mean, I know you have your own background in terms of like nuclear war, but I didn't
00:40:24.740 grow up in that era, right?
00:40:26.560 I mean, I was born in 1980; by 1989, we had the fall of the Berlin Wall.
00:40:30.080 I didn't really buy this idea that we're all going to be, you know, three-eyed fish and
00:40:33.820 whatever.
00:40:34.780 So...
00:40:35.260 And that's also a concern that is in many ways, importantly, separate from the issue of
00:40:39.460 nuclear power anyways.
00:40:41.000 It's totally right.
00:40:41.940 Because the nuclear power plants can't explode.
00:40:43.760 That's a very fundamental distinction.
00:40:45.620 The physics make it impossible to explode.
00:40:48.560 And so, um, with nuclear, my focus was: why is nuclear so stagnant?
00:40:55.500 Right.
00:40:55.980 I mean, we had this ideal of "too cheap to meter," what, back in the fifties, right?
00:41:00.020 It's going to be, the electricity is going to be so cheap.
00:41:01.660 You don't even need an electricity meter at your house because like, who cares?
00:41:05.180 It's like air.
00:41:06.180 Yeah.
00:41:06.440 It's going to be air.
00:41:07.180 Or even like, um, data on hard drives.
00:41:10.740 Like think about how much that's gone down in price, you know, in the last 30 years.
00:41:14.280 And yet nuclear had this boom in the sixties, and then starting in the seventies,
00:41:20.720 you know, and into the early eighties, it just totally stagnated, to the point
00:41:24.480 where, from the beginning of the Nuclear Regulatory Commission in 1975, we went 48 years, until
00:41:30.880 2023, until we had one plant go from conception to completion.
00:41:35.440 And these were the Vogtle plants in, uh, in Georgia.
00:41:39.020 And they, and they were just, you know, seven, eight, nine times over cost.
00:41:43.560 They were just catastrophically expensive.
00:41:45.100 So there was everyone who knows anything about nuclear knows the policy is a disaster.
00:41:49.880 Like you need to fix the policy.
00:41:52.080 And yet I would ask nuclear advocates, okay, what do we do?
00:41:55.520 Like if you were the president, what would you do?
00:41:57.700 And they'd always say, nuclear policy is so bad.
00:42:00.520 It's terrible.
00:42:00.920 I'm like, okay, but that's pretty low resolution; what would you do?
00:42:04.860 And then I started realizing, like, this is the problem: I don't really know what
00:42:10.940 to do.
00:42:11.920 And so, even if I help people evaluate energy in a better way, of course, there are some
00:42:16.740 things I know how to do, in terms of stop blocking these pipelines and stuff like that.
00:42:20.800 But even at the resolution of like, okay, what exactly should the air quality standards
00:42:25.680 be?
00:42:25.980 And how do you determine them?
00:42:27.000 You can say, oh, this recent thing on ozone is ridiculous because it's, you know, it sets
00:42:31.880 the level of ozone below the background level in some parts of the U.S.
00:42:35.200 So how are you going to possibly meet that, right?
00:42:36.920 If background ozone is greater than your standard? But, like, you could see these irrationalities, and
00:42:40.940 there's a question of, well, how do you actually come to the solution?
00:42:44.660 And I just kept seeing this with every issue.
00:42:46.680 And I just thought like, I don't know the answers.
00:42:48.980 And it's not that nobody knows the answers, but that the people who know the best answers,
00:42:54.760 there's in no way has this been put together in a usable, coherent way.
00:43:00.260 And at the same time, through Energy Talking Points, I had really proven to myself that,
00:43:04.200 hey, if you can make it easy for politicians to do something, they're going to be a lot
00:43:07.600 more likely to do it.
00:43:08.500 Like, a lot of these politicians want to do the right thing.
00:43:10.260 Well, someone might just come along and capitalize politically on your ideas.
00:43:13.840 Yeah, that's certainly a possibility.
00:43:15.820 But you need, they need to be developed, right?
00:43:17.440 You need to have something to hand them.
00:43:19.440 And I noticed when I would talk to them, it would be too much vaporware, it'd be too
00:43:22.740 abstract.
00:43:24.300 And so then I became really interested in, okay, let's say we have a new pro-energy
00:43:28.120 president and a pro-energy Congress.
00:43:30.980 How can we be prepared for that situation?
00:43:33.920 Because if you think about, you know, at a certain point, it became clear it'd be Trump.
00:43:38.020 But even if you take some of the others, like DeSantis and Vivek, and that field was
00:43:43.740 incredibly pro-energy compared to previous years.
00:43:46.880 And I hope this is something that I and the energy humanists have contributed to.
00:43:51.080 Certainly in terms of the arguments available, I'm certain I've helped contribute to it.
00:43:55.200 But we had this thing—
00:43:55.760 Well, you've definitely broken the ice for those arguments.
00:43:58.640 At minimum, right?
00:43:59.880 At bare minimum.
00:44:01.060 So then you see, like in many cases, I know like people are using the exact argument and
00:44:04.980 that kind of thing.
00:44:06.060 But that was this situation where, you know, even compared to like Romney, who by the way,
00:44:12.580 in many ways I admire, so I'm not trying to criticize Mitt Romney, but I'm just saying
00:44:15.320 like, if you look at where Republicans were in 2012, there was much less
00:44:20.940 positive enthusiasm for energy, certainly fossil fuels, and much more friction in terms of
00:44:26.660 let's hold back fossil fuels for climate reasons.
00:44:29.560 With the crop that we had and certainly ending up with President Trump, like that was not the
00:44:34.740 case.
00:44:35.300 And with Trump, we saw in the end, like, a central campaign thing of his was "let's unleash American
00:44:40.720 energy," which, you know, to my energy ears, that's music.
00:44:44.400 Like, oh, you want to actually do that?
00:44:46.220 Because that hadn't been—but then I feel this obligation of, well, we as a country,
00:44:50.780 we need to be ready for this situation.
00:44:54.160 And I don't think—I don't think we are.
00:44:55.960 I mean, we are in the sense of they did a lot of good things, the first administration,
00:44:59.660 so it's not like they'll do nothing good.
00:45:01.580 But if you have an opportunity like that, you want to do as much good as possible, right?
00:45:05.340 And as a citizen, I felt like, okay, what I can do is try to create like a very specific
00:45:10.820 platform and accompanying messaging so that we have—so that they at least have the option
00:45:17.900 of the policies.
00:45:19.000 Because I was never—I've always said I'm never going into politics, which I'm not.
00:45:21.920 Like, I'm not—I'm never going to have any control.
00:45:24.240 But at least I can be the ultimate resource if somebody wants to take advantage of the
00:45:30.120 resource.
00:45:30.480 And so, like, the last year and a half has been developing what I call the Energy
00:45:35.200 Freedom Platform.
00:45:36.540 And this is like a very step-by-step detailed guide.
00:45:38.960 So I thought we'd walk through the high level.
00:45:40.880 Do it.
00:45:41.220 But then we'll go into just some specifics because people can get a flavor.
00:45:44.740 Because what I don't want to do is say, well, you have to be specific, but then just be
00:45:47.800 high level.
00:45:48.380 Yeah.
00:45:48.660 But of course, like I sent you one of the internal documents I've shared with people the
00:45:51.920 other day.
00:45:52.140 It's like 125 specific policies.
00:45:54.420 So we're not going to get into that.
00:45:55.440 But I just want to—I'll give a few indications of some of the kinds of specific things.
00:45:59.820 Yep.
00:46:00.560 So it makes sense?
00:46:01.340 And then just ask questions and interrupt.
00:46:02.780 Absolutely.
00:46:04.120 So maybe let's just start off with the five key objectives.
00:46:07.860 And then maybe we can drill down in whatever you're most interested in, because any one
00:46:11.580 of these has numerous priorities and then numerous policies.
00:46:15.040 But I'd say at a high level, number one is liberating responsible domestic development.
00:46:21.260 So that includes like all the pipelines, all energy production, all sorts of stuff.
00:46:27.020 That's music to an Albertan's ears.
00:46:28.640 Yes, exactly.
00:46:29.440 So, you know, you have a sort of slightly different set of obstacles, but in many ways, the same
00:46:34.120 kinds of obstacles in Canada.
00:46:36.380 And, you know, Canada is a total tragedy that I'm also trying to work on at the moment in
00:46:39.860 terms of—
00:46:40.360 Yeah, yeah.
00:46:40.780 I mean, in some ways, even much more of an energy tragedy.
00:46:43.160 I mean, way more oil reserves than the United States and could be just providing so much—
00:46:47.040 But with way worse policies.
00:46:48.520 I know.
00:46:48.980 That's the thing.
00:46:49.840 It's mind-boggling.
00:46:50.240 And worse philosophy.
00:46:51.660 So we'll focus on the U.S. first.
00:46:53.100 So it's like liberating responsible development from anti-development policies.
00:46:58.300 So that's that sort of one bucket.
00:46:59.900 Yeah.
00:47:00.060 Number two is ending preferences for unreliable electricity, which we talked about a little
00:47:05.860 bit.
00:47:06.240 But fundamentally, our grid, our policy is just totally punishing reliability and rewarding
00:47:12.320 unreliability in the name of so-called renewability.
00:47:14.880 So there's that.
00:47:17.020 Number three is setting—this is a really important one that there's not been enough
00:47:22.740 work on to date, which is setting environmental quality standards based on cost-benefit analysis,
00:47:29.040 on real cost-benefit analysis, including objective health science, not health speculation.
00:47:34.400 That might be an interesting one to go into in terms of how that's done.
00:47:37.040 Number four is addressing climate danger through resilience and energy innovation, not punishing
00:47:45.760 America.
00:47:47.420 So the way we—our idea is we're going to keep ourselves safe from climate by destroying
00:47:51.880 fossil fuels, which, by the way, have made us way safer from climate.
00:47:55.580 And then suddenly the climate's going to be nice to us, even though the rest of the world
00:47:58.660 is not going to follow.
00:47:59.260 So like we have this insane thing if we punish ourselves by destroying our fossil fuel industry
00:48:03.700 and somehow—
00:48:04.120 We'll set an example for the rest of the world.
00:48:05.840 And the climate will be nice to us, right?
00:48:08.100 Yeah, or other—well, what you see in Canada, as far as I can tell, to the degree that there's
00:48:12.540 anything remotely like logic driving this, is that, well, Canada has a responsibility
00:48:17.060 to set the kind of moral example that other countries like China could follow, which—and
00:48:22.540 do not, in the least, follow.
00:48:25.060 And the same with India and their economies that are of such a scale compared to, say, the
00:48:29.780 Canadian economy, that that example is—it's essentially irrelevant.
00:48:33.600 Now, you know, you could argue that the Canadian fossil fuel industry is comparatively clean
00:48:40.020 in its approach, and maybe there's some benefit in that, but the idea that if Canada sets a
00:48:45.520 moral standard, China is going to follow suit is—it's egotistical beyond belief, and it's
00:48:51.640 utterly preposterous.
00:48:52.860 Plus, there's no evidence that it's happening.
00:48:55.140 So that's a major problem.
00:48:56.200 Yeah, but I would say to the point about—so the key is really the combination of resilience.
00:49:00.480 So the way you make yourself safe from climate is by becoming incredibly resilient.
00:49:04.160 That's what we've already stated.
00:49:04.740 That's only valid if you take that pro-human perspective that you described to begin with,
00:49:09.040 right?
00:49:09.200 Yes, yes.
00:49:10.040 There's much more practical things that we can do to keep people safe from
00:49:15.820 climate change, let's say, than making everybody poor by making fossil fuels impossible to access.
00:49:21.160 Yes, right.
00:49:21.900 And then the other thing, though, I mentioned energy innovation, countries can set an example
00:49:26.100 insofar as they want superior forms of energy if they can actually innovate a globally cost-competitive
00:49:31.600 alternative energy, which is really the only thing you can—the only way you can actually
00:49:35.960 address a global issue that's caused by the cheapest form of energy emitting CO2 in the
00:49:41.260 atmosphere, right?
00:49:42.360 The only way you can really address it in a remotely humane and practical way is come up
00:49:47.140 with a cheaper form of energy that doesn't emit—
00:49:49.000 That doesn't do it.
00:49:49.460 Yeah, that's all you can do.
00:49:50.340 And you need to be a wealthy and prosperous and free society to do that.
00:49:53.360 You're not going to run your economy into the ground and then innovate the new nuclear.
00:49:58.140 Absolutely.
00:49:58.740 So what was the fifth one?
00:49:59.780 And the fifth one is unleashing nuclear energy from the many pseudoscientific restrictions.
00:50:05.480 So, yeah, which one do you want to talk about?
00:50:07.500 Oh, I think we might as well just go through them in order.
00:50:09.880 I think they're all extremely interesting.
00:50:11.680 And you can go into them in as much detail as you see fit.
00:50:15.900 Yeah, so I'll just highlight some sort of priorities for all of them and where I think
00:50:20.060 the—and of course, by the way, if any politicians watching this or anything like that, email
00:50:24.060 me, alex@alexepstein.com.
00:50:25.540 I'm happy to help with the details of this.
00:50:27.880 I should say one thing about—by the way, because I think you're always good at drawing
00:50:31.100 lessons from things about how to compile this.
00:50:33.460 Because this is certainly not just me thinking in a room, although I do think a lot in a
00:50:39.060 room.
00:50:39.580 I mean, part of it has been trying to find the absolute smartest people who had already
00:50:45.560 figured out as much of this as possible.
00:50:47.900 So some of what I'm going to say has been me or often my team.
00:50:51.780 So I have a very brilliant team who works for me full time.
00:50:54.700 They're sort of all around the world.
00:50:55.800 I found them in these very—it's almost like the X-Men.
00:50:57.880 You just find them in these very obscure places like one—
00:51:00.860 Are the AI systems helping you?
00:51:02.380 Yes, they are.
00:51:03.760 But I think it's going to be—particularly they're good at the messaging part of it.
00:51:09.040 Like at—so I was doing something last night actually playing around with explaining why
00:51:14.260 I'm very suspicious of these CO2 capture schemes.
00:51:18.380 And I was getting it to do the math on how much—I wanted like a set of talking points
00:51:23.280 on how much we pay—I might as well tell you—like how much you pay for the coal and how much
00:51:29.400 we are subsidized to sequester the air, the CO2, right?
00:51:33.280 And because people have no idea.
00:51:34.860 But it's basically the math is one ton of coal generates between two and three tons of CO2.
00:51:42.640 So like it's more than its mass.
00:51:44.780 And it's similar with gas, but gas has twice the energy density per mass.
00:51:48.900 So with coal, that means if you have a $50 ton of coal, you get paid $85 a ton to sequester
00:51:55.160 the CO2.
00:51:56.240 So that means you're on the order of $200 to store the air.
00:51:59.780 So for $50 of coal, you pay $200 to put the air underground.
00:52:03.320 Is China going to do that?
00:52:04.640 Is India going to do that?
00:52:06.140 Yeah.
00:52:06.280 And so for gas, it's about half.
00:52:07.840 So for gas, you can think of it as paying about half that.
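The back-of-the-envelope math described here can be sketched as follows. The inputs are the figures from the conversation, treated as illustrative assumptions: roughly 2.5 tons of CO2 per ton of coal burned, a $50-per-ton coal price, and the $85-per-ton sequestration payment he cites (this matches the 45Q credit level under the Inflation Reduction Act, but the exact values will vary).

```python
# Back-of-the-envelope sequestration cost sketch, using the figures
# mentioned in the conversation (all inputs are illustrative assumptions).
COAL_PRICE_PER_TON = 50.0      # $ per ton of coal (assumed)
CO2_TONS_PER_COAL_TON = 2.5    # ~2-3 tons of CO2 per ton of coal burned
SEQUESTRATION_CREDIT = 85.0    # $ paid per ton of CO2 sequestered

def sequestration_cost_per_fuel_ton(fuel_price, co2_per_ton, credit):
    """Return the dollars paid to sequester the CO2 from one ton of fuel,
    and the ratio of that payment to the fuel's own price."""
    payment = co2_per_ton * credit
    return payment, payment / fuel_price

payment, ratio = sequestration_cost_per_fuel_ton(
    COAL_PRICE_PER_TON, CO2_TONS_PER_COAL_TON, SEQUESTRATION_CREDIT)
print(f"~${payment:.0f} paid to store the CO2 from $50 of coal "
      f"(about {ratio:.1f}x the fuel's own price)")
```

With these assumed inputs, the payment comes out on the order of $200 per $50 ton of coal, which is the "pay $200 to put the air underground" claim; halving the CO2-per-unit-energy figure gives the roughly half-as-bad gas case.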
00:52:12.900 So it's preposterous economically, and that's on top of the fact that the—what would you
00:52:19.180 say—a detached analysis of the cost benefits in relationship to carbon emission has not
00:52:25.880 been conducted properly.
00:52:26.920 Yeah, yeah.
00:52:27.100 One of the things I just can't figure out, and then we'll get back to these five points,
00:52:30.780 is like when I—like I've spent a lot of time looking at scientific data, and there's
00:52:36.520 a pattern to doing that that works to some degree across disciplines.
00:52:39.960 It's hard to know the details of the discipline if you're not an expert in it, but the pattern
00:52:44.720 of evaluation is similar.
00:52:47.520 When I look at the carbon dioxide data as a whole, and I think, well, what stands out
00:52:53.820 to me in this mess of consequences of carbon dioxide?
00:52:58.020 I would say the thing that stands out to me most is the magnitude of global greening.
00:53:02.880 It's overwhelming.
00:53:04.800 And then I think—and it's so—it's not only overwhelming in terms of its magnitude, right?
00:53:08.720 So, immense areas have greened.
00:53:10.960 It's also the case that the areas that have greened tend to be semi-arid areas.
00:53:17.240 And so, when I look at that, I think, well, if you were looking at this neutrally, at least
00:53:23.740 the question of whether or not that's a net good should arise.
00:53:27.180 Yeah.
00:53:27.420 Because—
00:53:28.000 The CO2 on its own.
00:53:29.180 Yeah, well—
00:53:29.360 Leaving aside the energy that's coupled with it.
00:53:31.220 Because if you're factoring the energy—
00:53:32.220 Just the CO2, right?
00:53:33.040 Exactly.
00:53:33.680 And that's being accompanied by about a 13% increase in crop productivity in consequence
00:53:39.440 of the additional CO2 emissions.
00:53:42.680 Right.
00:53:42.920 And so, well, that's just an example of the preposterousness of the claim that carbon dioxide
00:53:50.920 sequestration is something that makes sense.
00:53:53.480 It's like, well, actually, that stuff might be useful.
00:53:55.780 Plus, it's really, really expensive to sequester it.
00:53:58.200 Like, you already made the economic case.
00:54:00.300 That's devastating by any reasonable standard.
00:54:03.560 And this is actually a really important issue right now because there's—I think one of
00:54:07.100 the big challenges to what I call energy freedom policies is now, thanks to the Inflation
00:54:11.480 Reduction Act, which is the thing that really set these carbon capture prices incredibly high,
00:54:16.500 we have a huge portion of the oil and gas industry now lobbying to keep these very large
00:54:21.620 subsidies.
00:54:22.400 Right.
00:54:22.680 Of course.
00:54:23.140 The oil and gas industry was more consistently pro-energy freedom before, but now when it
00:54:28.400 comes to Congress talking about, you know, do we repeal the whole IRA or do we be part
00:54:32.060 of it, there's a lot of very influential people who are saying, no, no, of course we have to
00:54:35.280 keep the carbon capture stuff.
00:54:37.320 And I think that's—I think it's a wrong view, but also it's—people very much underestimate
00:54:43.340 how valuable it is to be known as a truth teller and to have intellectual integrity across
00:54:49.440 a lifetime.
00:54:50.180 And I found this out just as a sort of digression when I was, you know, many people sort of interacted
00:54:55.580 with the transition team, and I'm never officially on any kind of team, and, you know,
00:54:58.980 and I definitely wasn't there.
00:55:00.140 But I remember I was making some recommendations, and one person on the team told me, he's like,
00:55:04.380 you know, to make really good picks, we—so, by the way, I'm not taking responsibility
00:55:07.840 for any given pick.
00:55:08.780 I'm just saying, you know, I was one of the people consulted, and I just—he said, you
00:55:11.360 know, to make really good picks, we need outside experts that we can totally 100% trust,
00:55:16.960 and that's the way this person said, like, I feel about you.
00:55:19.380 And I thought, like, that's interesting, because I didn't even know this person.
00:55:21.960 I met him once a while ago, but, like, he's seen me think consistently for 17 years in
00:55:27.900 a way that's logical and not partisan or tribal or not, like, pro-fossil fuel industry
00:55:33.280 when the fossil fuel industry is doing something badly.
00:55:35.300 And when the fossil fuel industry tries to convince people, oh, well, it's good to subsidize
00:55:39.720 the hell out of CO2 reductions for carbon capture, but it's bad for solar and wind, how are people
00:55:45.840 supposed to take that?
00:55:48.120 So I think short-term, it feels like, oh, well, maybe we can keep these subsidies, and
00:55:52.680 like, we were planning on getting these subsidies.
00:55:54.960 But in the long term, you establish these anti-freedom things, and you diminish your credibility,
00:56:00.580 because what—
00:56:01.580 Well, I think you're pointing to the fact that there isn't a better medium to long-term
00:56:06.580 strategy than the truth.
00:56:08.400 Yeah.
00:56:08.780 Yeah.
00:56:09.160 And that's, it's like, it's kind of like quality is the best business model, like, in
00:56:14.040 the long term.
00:56:14.820 Yeah.
00:56:15.040 But it's one of these things where it's a lot easier for the missionary to figure that
00:56:19.860 out.
00:56:20.320 Like, sometimes people talk about, like, missionaries and mercenaries in business, and, like, often
00:56:25.000 the missionary, it's just, like, you can't even do anything else.
00:56:28.040 But for me, whatever, because I'm kind of an entrepreneur, but I'm really just, I like
00:56:33.060 thinking about what I think is, I like trying to figure out what I think is right, and then
00:56:36.500 convincing other people of it.
00:56:37.740 So just, psychologically, there's no appeal to me of, oh, I'm going to say something someone
00:56:43.040 else thinks is right, and that I think is wrong, and I'm going to make money.
00:56:45.980 Right.
00:56:46.220 I mean, that's just throwing my life away, and it's going to be, it's going to be miserable.
00:56:50.540 So that's sort of like, I didn't do it with the idea of, oh, in 17 years, I'm going to
00:56:56.940 be a trusted, like, political advisor.
00:56:59.200 But maybe people, if it takes that to convince other people, like, these trade groups, because
00:57:04.140 when I do energy talking points, now we advise on policy, and I just...
00:57:37.480 Kind of, people just roll out the red carpet for us.
00:57:40.660 They're so interested, and we do no lobbying, no public support.
00:57:44.040 I have no affiliation.
00:57:45.320 I give no money to anybody.
00:57:47.040 I don't endorse anybody.
00:57:48.220 Because what you did was you focused on addressing the problem.
00:57:51.540 Yes.
00:57:51.880 Right.
00:57:52.600 And that was the focus.
00:57:54.600 Yes.
00:57:54.980 And not the consequences of that.
00:57:56.900 Yeah.
00:57:57.100 And you're pointing out that the medium to long-term consequences of that really couldn't
00:58:01.320 possibly have been more positive.
00:58:03.040 Yeah.
00:58:03.260 Right?
00:58:03.440 It took a long time.
00:58:04.720 Yeah, exactly.
00:58:05.080 Like, it's a long-term investment strategy.
00:58:07.480 But, yeah, I found exactly the same thing.
00:58:09.400 It's exactly the same thing.
00:58:10.340 So, it's just this thing where, but I notice these trade groups, like, I was in a meeting
00:58:14.080 recently, and, like, the other, you know, another person in the meeting might be way
00:58:17.780 more famous and way richer than I am.
00:58:20.020 But I feel like everyone in the room trusts me more because they know, like, I'm saying
00:58:25.180 what I think is right, which doesn't mean they'll agree with me.
00:58:27.080 Right.
00:58:27.260 But they know that I'm—
00:58:28.140 It doesn't even mean that you're right, but it does mean you can be trusted.
00:58:31.260 Yeah.
00:58:31.440 In this case, I am right.
00:58:32.560 Yes.
00:58:33.000 Well, fair.
00:58:33.680 Yes.
00:58:34.220 There is also that.
00:58:35.420 No, no.
00:58:35.660 But, yeah, no.
00:58:36.360 It's like—and just knowing that you can—if the trade groups really came across, like,
00:58:41.700 we really believe this, and we're saying this not just because we're in oil or solar
00:58:45.680 or whatever.
00:58:46.580 We're in oil or solar because we think it's good, and we're going to only tell you what
00:58:49.580 the truth is, even when it causes us to lose some short-term subsidy, they would be so
00:58:55.380 powerful.
00:58:56.620 Yes.
00:58:57.000 Amen to that.
00:58:57.900 That's definitely the case.
00:58:59.480 Yeah.
00:58:59.500 So I forget—oh, yeah, I was just saying you're in the ChatGPT.
00:59:01.740 Yeah.
00:59:02.100 Like, it came up with this, and I just thought, oh, wow, this is going to be really fun in
00:59:06.460 the messaging because you can just—and even some of the Canadian stuff I've been doing,
00:59:11.340 I've been using it.
00:59:12.220 So, yeah, people need—if you're in any kind of intellectual thing or really anything that
00:59:16.880 relies on high-quality decisions, you need—here's my free advertisement, Sam Altman, like, you
00:59:22.040 should use something like chat.
00:59:23.420 You should just get whatever the cutting-edge thing is because it's so much cheaper than
00:59:27.560 people, and it really is replicating and replacing a bunch of human functionality.
00:59:31.520 Oh, yeah, well, let's return to that for one second before we go back through these.
00:59:34.980 So, a point you made very early is that energy transformed into work has substituted for
00:59:41.760 labor, and so we trade energy for labor.
00:59:44.940 And now we're trading energy for intelligence, and intelligence itself is a labor multiplier.
00:59:51.480 Yeah, exactly.
00:59:52.160 So, the question is, well, is trading energy for intelligence a good trade?
00:59:56.380 It's like, well, that's what we do.
00:59:59.240 That's what human beings did.
01:00:00.460 Is buying intelligence really cheaply a good thing?
01:00:05.000 Right, right, right.
01:00:06.280 Well, it's a good thing for every individual who buys it, so in the aggregate, it's probably
01:00:09.360 going to be pretty good.
01:00:10.200 Yes, yes.
01:00:10.860 Well, and it's what you want to do when you hire someone to do a complex job.
01:00:13.740 You're going to pay for intelligence.
00:59:16.860 Just as one final digression, people just don't yet realize how efficient it's going to be to use these things.
01:00:27.180 For not every application, by any means, but for many applications, they have all these
01:00:30.520 things like, oh, it's not always, it doesn't always get directions properly.
01:00:34.460 Oh, really?
01:00:34.940 All your human employees always get the directions correctly?
01:00:37.680 Right, right, right.
01:00:38.180 Like, at this point, ChatGPT Pro is better than almost any human you will ever employ in
01:00:43.560 terms of following directions.
01:00:45.780 Like, it is really, really good in terms of just, if you write out what you think, if
01:00:49.780 you say it in a circuitous way, it is pretty damn good.
01:00:53.120 And in terms of, like, the output, this is another thing I think people don't get is,
01:00:56.700 they're like, for instance, I'll give you an example.
01:00:58.740 I was running this CO2 thing, like, this natural gas CO2 thing, and I ran it with the
01:01:03.080 non-pro version of ChatGPT, and it gave me a false answer. I wanted to know: how does the volume of natural gas compare to the volume of CO2 generated?
01:01:17.020 Because one way to think about this idea of let's capture, let's use fossil fuels and
01:01:21.120 capture the CO2 is, you need to build a whole new industry for the captured CO2.
01:01:26.260 So, if it's a one-to-one ratio, right, if it's a one-to-one ratio of natural gas and
01:01:32.520 CO2, then you need to basically double the size of the industry, right, in terms of piping
01:01:37.120 it and putting it underground.
01:01:38.400 And I thought it was a one-to-one, and then I was in the airport and talking to ChatGPT
01:01:42.440 Plus, and it told me it was like 59 times.
01:01:45.500 I'm like, that sounds wrong, but if that's true, that's crazy, and I was sort of excited,
01:01:49.540 like, oh, wow, that's a blockbuster.
01:01:51.120 But then I was like, I ran it the other day with ChatGPT Pro, and it said, no, no, it's one-to-one,
01:01:55.300 which made more sense to me.
01:01:56.780 But it's like, okay, yes, sometimes it'll make a mistake like that, but I don't go to
01:02:00.340 print with that thing without confirming it by an expert.
01:02:03.900 But often, but even if it's right 95% of the time or 98% of the time, you can often get,
01:02:09.400 I don't know if the perfect words for this, but you can often get, like, the shape of
01:02:12.900 what the solution will look like, even if not every value is correct.
01:02:17.560 So, you can get the idea of, yeah, you do need to build a whole new parallel industry, and
01:02:21.140 it would cost a lot of money, even if your estimate is 50 times too high, you can help
01:02:24.840 think through it.
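The one-to-one figure he settled on can be checked from basic stoichiometry rather than taken from the model. A minimal sketch, assuming the natural gas is essentially methane and both gases behave ideally at the same temperature and pressure:

```python
# Check the natural-gas-to-CO2 ratio from stoichiometry.
# Methane combustion: CH4 + 2 O2 -> CO2 + 2 H2O
# One mole of CH4 yields one mole of CO2, and by the ideal gas law
# equal mole counts occupy equal volumes at the same T and P.
M_CH4 = 16.04   # g/mol, molar mass of methane (assumed pure natural gas)
M_CO2 = 44.01   # g/mol, molar mass of carbon dioxide

moles_co2_per_mole_ch4 = 1                # from the balanced equation
volume_ratio = moles_co2_per_mole_ch4     # volumes scale with moles (ideal gas)
mass_ratio = M_CO2 / M_CH4                # CO2 mass per unit mass of CH4 burned

print(f"Volume of CO2 per volume of CH4 burned: {volume_ratio}:1")
print(f"Mass of CO2 per mass of CH4 burned: ~{mass_ratio:.2f}x")
```

The volume ratio is 1:1, consistent with the "need to basically double the size of the industry" point, while the mass ratio of roughly 2.7x is why a ton of fuel produces more than its own mass in CO2.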
01:02:26.060 So, just the fact that it can make errors, A, humans can make errors too, but it can really
01:02:31.480 help you explore territory and develop ideas very, very quickly.
01:02:36.180 And then what you might find is that you make one or two errors at the end, but usually those
01:02:39.660 are not decisive errors that throw out the whole project.
01:02:43.880 Usually, because every kind of thought process involves lots and lots of little assumptions
01:02:48.700 and interrelationships.
01:02:51.360 And if you get 98% of that right, probably the overall thing is going to be right.
01:02:56.720 Probably a very small percentage of the time, it's going to be, oh, there was one false assumption
01:03:00.420 and the whole thing is wrong.
01:03:01.380 I have yet to find that.
01:03:03.340 It could happen.
01:03:04.060 It could happen.
01:03:04.860 But so, yeah, I think people are just underestimating how good this stuff is going to be.
01:03:09.300 Although not the people who are demanding more energy so that they can make faster artificial
01:03:14.760 intelligence systems.
01:03:15.880 They're not underestimating it.
01:03:17.700 And that's why they're driven to do it.
01:03:19.140 And they may be even overestimating certain things.
01:03:21.980 And it could be we're back in five years and yeah, there might even be a bubble in certain
01:03:25.880 ways.
01:03:26.160 But at this point, it's obvious that for a lot of people, it's going to help them.
01:03:30.540 And I would just say as an entrepreneur who runs a small business that's very intelligence
01:03:34.340 based, it obviously helps me think much, much better.
01:03:38.520 It helped me a lot when I was writing my last book.
01:03:40.960 I used the AI systems a lot.
01:03:43.440 Yeah.
01:03:44.060 Yeah, yeah.
01:03:44.600 To sketch out research domains, you have to check the references.
01:03:47.480 You have to make sure it's not lying to you.
01:03:48.840 Yeah, yeah.
01:03:49.080 But you can do that.
01:03:49.860 You have to be careful with it and you have to, you know, interact with it intelligently.
01:03:53.280 But yeah, and we've built specialized AI systems too that some of them based on my work
01:03:58.380 that I consult because, well, that's an extension of my thinking and that's been extremely helpful.
01:04:03.260 And so.
01:04:03.520 Oh yeah, I should mention, alexepstein.ai is now free.
01:04:06.360 So people should check that out.
01:04:07.600 Oh yeah, okay.
01:04:08.100 So that's now the latest version of Energy Talking Points is we're already having some
01:04:12.020 elected officials use this for like floor speeches is you can have alex.ai write you
01:04:16.220 a floor speech.
01:04:17.240 And it's so we've spent a lot of money customizing an AI.
01:04:21.120 And that's alexepstein.ai.
01:04:23.000 Yeah, .ai.
00:59:23.420 And what it's really done is it's engineered with prompting that is very much based on this even-handed and pro-human thinking.
01:04:31.600 Right, so you've built that ethos into it.
01:04:33.300 Yes, and really one thing it does really well is question assumptions.
01:04:37.320 So it scans everything for, does this question or statement have an assumption that alexepstein
01:04:42.200 would disagree with?
01:04:43.440 So for example, if you say, hey, alex.ai, how do you, how do we get to net zero by 2050
01:04:48.520 as quickly as possible?
01:04:49.720 It doesn't just try to manufacture your answer.
01:04:51.380 It says, would Alex agree that that's the right goal?
01:04:53.180 And he says, well, actually, I disagree.
01:04:54.480 I don't think this is the right goal.
01:04:56.000 I don't think our goal.
01:04:56.620 It questions the premise.
01:04:57.460 Yes, exactly.
01:04:58.260 That's very funny.
01:04:59.200 Yeah, but that's one thing we had to teach.
01:05:00.820 That means it accurately reflects you.
01:05:02.800 Yes, yes, exactly.
01:05:03.980 It's just as annoying in some circumstances.
01:05:06.620 All right, let's go back to these five points.
01:05:08.760 So in terms of liberating, so let's make sure to couple, at least one of the big things
01:05:12.900 for this.
01:05:13.920 So I would say with the liberating domestic development, one of the key things we need to do is address
01:05:19.360 what's called NEPA.
01:05:20.240 I don't know if you've heard this term.
01:05:21.880 You might have heard it.
01:05:22.360 It's National Environmental Policy Act.
01:05:24.280 So this is one of the early environmental laws.
01:05:26.840 And NEPA is the thing.
01:05:28.260 And I forget what the Canadian equivalent is.
01:05:30.160 I think you just passed a new version of this that's nuts.
01:05:33.320 But it's basically, it's a duplicative review process.
01:05:37.340 That's what NEPA is.
01:05:38.420 So it basically says like any agency that does anything, it has to also go through an additional
01:05:43.540 review for its quote unquote environmental impacts or impacts on the human environment.
01:05:48.460 I mean, it's worded something like any significant impact on the human environment.
01:05:52.940 And significant is not defined, human environment isn't defined.
01:05:55.940 So what it means is basically, and then it has to do with federal actions.
01:06:00.060 So it's like a major federal action.
01:06:02.600 But what is a federal action?
01:06:04.200 Is it anything where federal law applies?
01:06:06.040 So originally it was supposed to be, okay, the federal government is building like a giant
01:06:10.180 bridge, you know, that's a mile long or something like that.
01:06:13.380 Is it going to cause any kind of major damage or something like that?
01:06:17.420 And you write like a 10-page report.
01:06:19.080 Now it's become every project imaginable is covered and it can take 10 years.
01:06:24.360 And one of the major mechanisms is litigation.
01:06:27.680 So NEPA has no official authority to stop anything.
01:06:30.920 It's just a review thing, but people can sue, activist groups can sue, and they can say,
01:06:37.700 you left out this bird in your NEPA review, so it gets thrown back.
01:06:42.160 And then the judge says, yeah, you have to do this.
01:06:44.160 So in practice, this is the thing why we can't build any roads, why everything takes forever,
01:06:48.480 why a mine, you know, takes something like 15 years to permit.
01:06:51.440 If you have put your five points here into one of these AI systems, could you ask it,
01:06:58.360 for example, if we were moving in this conceptual direction, what policy changes should be implemented,
01:07:05.620 prioritized by their benefit to cost?
01:07:08.060 I'll have to look now.
01:07:09.160 When I tried it a year ago, it was horrible.
01:07:11.640 But it didn't have search back then.
01:07:13.320 It didn't have a sophisticated processing.
01:07:15.760 So with the search, it would be, it's a good thing.
01:07:18.460 I'll do it right after this and see.
01:07:19.960 Because it's a Pareto distribution issue, right?
01:07:22.300 There's going to be a couple of things you could change.
01:07:24.620 They're going to have a disproportionately positive effect.
01:07:27.300 And everything else is going to sort of...
01:07:28.360 Now, interestingly, with this kind of thing, it's often very efficient to just talk to a very smart lawyer.
01:07:34.360 So like lawyers that we pay, like at the top end, it's like over $1,800 an hour.
01:07:38.480 It's like we'll pay lawyers.
01:07:39.680 And it's worth it because you can just ask them, like an expert in NEPA or an expert in electricity,
01:07:43.740 like what really needs to change?
01:07:45.820 Because you often find that the thing people talk about isn't the thing.
01:07:49.540 Yeah, that's exactly my point.
01:07:51.060 So with NEPA, one of the big kinds of solutions is what's called limiting NEPA to agency discretion.
01:08:02.880 So basically, the agency can do the NEPA review, but it basically decides, okay, we've done the review and it can't be challenged.
01:08:10.000 Like something like that.
01:08:10.980 And it's fine because it's the agency's responsibility to review the thing.
01:08:14.860 You don't need to put everything in double jeopardy and just take forever and have outside people allowed to question your review.
01:08:20.640 And basically, if it's the...
01:08:21.580 That just makes it impossible.
01:08:22.880 Yeah, that's what it is.
01:08:24.120 So that's this kind of thing where when politicians will talk about NEPA, they'll often say something like, let's limit the length of the process, right?
01:08:32.140 Let's limit an environmental impact statement to two years or one year or what's called an environmental assessment to a smaller amount of time.
01:08:38.820 But they don't fully get that you can still have infinite litigation on that.
01:08:45.040 So even if you set a shot clock, if you have infinite ability to challenge it.
01:08:48.740 So that's the kind of example where, and there's like specific...
01:08:52.100 That's a fix that won't work.
01:08:52.980 Yeah.
01:08:53.420 Okay, I'm going to guide you through these one at a time.
01:08:56.060 So I think what we should do is let's go through them one at a time and you can hit the highest point.
01:09:01.520 Okay, I'll hit the highest point.
01:09:02.920 And then we can go back if we still have additional time.
01:09:05.220 Okay, so that's the highest point for liberating domestic development.
01:09:07.840 It's really limiting NEPA's ability to delay projects, but addressing the real core substance.
01:09:13.360 Right, so that's a red tape reduction process.
01:09:14.880 Yeah, but it's like the, it's the ultimate one.
01:09:16.620 So if you're talking about Doge, like you got to go after NEPA.
01:09:20.860 Yeah, okay, okay, okay.
01:09:22.660 Got it.
01:09:23.020 Because it produces this infinite, exponentially expanding network of litigation in the aftermath of the review.
01:09:29.460 Right, so activist groups can weaponize it.
01:09:31.580 Yeah, have.
01:09:32.560 Yes, of course.
01:09:33.080 It's their weapon, yeah.
01:09:33.680 Right, okay, great.
01:09:34.820 But the renewal subsidy issue.
01:09:38.480 Yeah, so the ending preferences for unreliable electricity.
01:09:43.160 So there's a lot of interesting ones here.
01:09:45.940 I would say the most important thing is that FERC, the Federal Energy Regulatory Commission, has to become laser focused on reliability.
01:09:55.600 Right, right.
01:09:56.260 And secondarily, cost.
01:09:56.680 That's a real conceptual switch.
01:09:58.100 Which it hasn't done.
01:09:59.080 And so one of the major, so right now it's focused on things like climate, right?
01:10:02.740 So it'll review a project and say, is this project climate friendly?
01:10:06.840 Now let me ask you, if you're approving a natural gas pipeline, and climate is a global issue and this is going to be de minimis, how the hell can you tell whether it's going to add more?
01:10:16.020 Like, it's not its job at all, it has no statutory right to discuss that kind of thing, but it's threatening all sorts of projects on the grounds of, I think this will lead to slightly more greenhouse gases in the world or slightly less.
01:10:27.720 Like, so it needs to get out of that.
01:10:29.200 So that's one thing is getting out of this whole set of issues.
01:10:33.400 So part of that is the focus.
01:10:34.780 Right, so is your point there that, I think it's related to the point that you made earlier, is that there's going to be criteria for acceptable sources of power, and one of the fundamental criteria is going to be reliability.
01:10:48.440 Well, that's actually, you're anticipating what I was going to say next.
01:10:51.280 So one thing is just get non-electricity concerns out of the mix, except for safety.
01:10:57.720 Like, it has a mandate of safety, like if, you know, your power lines are endangering people and this kind of thing.
01:11:01.720 But FERC should have nothing to do with climate or anything like this.
01:11:06.220 So, and this is going to be related to the climate thing, we have to get rid of the whole of government climate agenda.
01:11:11.560 But then to your point, yeah, so it means exclusively focusing on reliability and cost and also safety, but then it also means it needs to do new things.
01:11:21.420 And in particular, it needs to have some sort of national reliability standards, which it doesn't have.
01:11:26.320 And there's a lot of complexity as to why, because technically FERC is not allowed to regulate what's called generation.
01:11:34.480 But the way FERC oversees a set of institutions called RTOs and ISOs, that's Regional Transmission Organizations and Independent System Operators.
01:11:44.400 And they are these interstate entities.
01:11:48.300 So, you'll sometimes hear about like PJM, or ERCOT is not quite one of these, or like CalISO, that's what we have here in California.
01:11:56.780 But most of them, like MISO is a big one that'll cut across, say, Indiana and Iowa and multiple states like that.
01:12:03.100 And what's happening is these are electricity organizations that are supervising all of the, you know, all the electricity among all these states, but they're imposing no reliability requirements on the states.
01:12:14.960 And so, this is allowing certain states like Iowa can just build a whole bunch of unreliable generation and then parasite off Indiana's coal plants.
01:12:22.520 And what's happening, because there's no real oversight on reliability, is we're getting a nationwide decline in reliability.
01:12:28.220 And I used to dig deep into these things, and what happened is someone will put forward like what's called an IRP, an integrated resource plan for their electricity.
01:12:37.280 And their plan will be like, we're just going to build all this intermittent stuff, and we're going to get the excess from the grid.
01:12:44.160 But nobody's responsible for the grid being reliable.
01:12:47.140 So, everyone's making these plans to do unreliable stuff and saying, we're going to get the rest of it from the grid, but nobody's responsible.
01:12:52.940 Assuming its reliability.
01:12:53.480 But there is no grid.
01:12:54.520 Yes, exactly.
01:12:55.240 So, that's a key thing, is they need to have some sort, if they're going to, if we have this system, like this regional system that we have, they need real reliability standards there.
01:13:05.540 And that relates to the concept I mentioned earlier of like technology neutral, like dispatchability standards.
01:13:11.660 Right, right.
01:13:12.080 And that means you don't prohibit solar and wind, but you require them to be firm or reliable.
01:13:17.580 And then generators can meet that however they want.
01:13:20.540 If they think they could do a solar plus storage, if they think they could do a solar plus gas or solar wind, like let them experiment.
01:13:26.600 But don't allow people to sell unreliable electricity onto the grid and then have everyone else pay for it.
01:13:32.820 And compromise the reliability of the grid across time in a degenerating way.
01:13:37.020 Of course, yeah, pay for it in the financial sense, but ultimately in the reliability sense.
01:13:40.240 And the cost that we pay for electricity in dollars is nothing compared to the cost of unreliability.
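The technology-neutral "firm capacity" idea described above can be illustrated in a few lines. This is a hypothetical sketch, not an actual FERC or RTO rule; the function name `is_firm`, the 90% threshold, and the megawatt figures are all assumptions for illustration.

```python
# Hypothetical sketch of a technology-neutral firm-capacity requirement:
# any generator may participate, but it must guarantee deliverable power
# during peak demand, however it chooses to achieve that (storage, gas
# backup, etc.). The 0.9 threshold is an assumed value, not a real rule.

def is_firm(nameplate_mw: float, guaranteed_mw: float,
            required_firm_fraction: float = 0.9) -> bool:
    """Return True if the generator can guarantee delivery of the required
    fraction of its claimed (nameplate) capacity."""
    return guaranteed_mw >= required_firm_fraction * nameplate_mw

# Illustrative numbers: bare solar cannot guarantee output at night or
# under cloud, so it fails; solar paired with storage or gas backup passes.
print(is_firm(100, 25))   # False
print(is_firm(100, 95))   # True
```

The point is the standard, not the technology: a solar-plus-storage plant and a gas plant face the same deliverability test.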
01:13:46.720 I mean, the cost of unreliability is, I mean, you can see it's literally death in a case like the Texas freeze.
01:13:53.640 I mean, just imagine the grid goes out for a week.
01:13:55.720 Like, you don't pay that electricity.
01:13:57.500 It's a terrible thing for an Albertan to think about.
01:13:58.960 I mean, yeah.
01:13:59.520 I mean, in any kind of cold, like where we are today is, you know, in Hollywood Hills.
01:14:03.440 Yeah, but even there, your whole system depends on it. The way to think about it is that electricity is part of the environment, in the way I think about the environment: our environment is everything that affects our well-being.
01:14:16.260 Like there's no more important aspect of the human environment than electricity.
01:14:20.760 Right.
01:14:21.060 Like that is, I mean, that and maybe the roads and transportation system, like those two things, without those, your environment is destroyed, you regress, the world cannot support 8 billion people.
01:14:30.000 So, like any threat to reliability is just such a catastrophic cost in terms of lives and then in terms of industry, right?
01:14:38.140 Because if you start to have any kind of frequency of blackout, you just can't support any industry, which means your country becomes more.
01:14:45.400 Especially the kinds of industries that depend on being on 100% of the time.
01:14:49.740 Yes, which is why these tech companies, guess what they're doing?
01:14:52.260 They're not, they're starting to not build things on the grid.
01:14:54.660 Right.
01:14:54.860 They're starting to build things off the grid because they can't.
01:14:57.780 So, what's going to start to happen, and this is going to lead to outrage, is they're going to be partially on the grid, they're going to be sucking up a lot of electricity, consumers are going to have outages, and then they're going to learn that these companies are both taking reliable electricity from their grid while promoting solar and wind, and that they're building their own natural gas.
01:16:09.220 ...while touting 100% renewables.
01:16:12.480 So to avoid this PR nightmare, they should join me and pro-freedom people and be pro-electricity and pro-fossil.
01:16:18.720 Right, right.
01:16:19.220 So that's on the electricity side.
01:16:20.400 It's like the focus on reliability, including federal reliability standards over the RTOs and ISOs.
01:16:25.760 Environmental quality cost-benefit analysis.
01:16:29.180 I mean, this is such an interesting one.
01:16:31.380 I mean, the core thing is you need to do, you need to calculate very carefully the benefit.
01:16:38.600 So let's just start with the benefit.
01:16:39.840 So the cost, people can probably guess, you need to look at the full cost of a given policy.
01:16:44.540 So if you're talking about let's lower what's called PM 2.5, like let's lower it from, you know, 10 to 8,
01:16:51.900 or whatever level of micrograms per cubic meter they're talking about.
01:16:54.520 Like, you have to look at what is the cost of doing this throughout the economy.
01:16:58.440 And they totally fail to do this for a number of reasons.
01:17:00.940 But let's talk about the benefit side, because this is often something that trips people up,
01:17:04.600 where they think like, oh my gosh, I want clean air.
01:17:06.900 I don't want to die, right?
01:17:08.320 I don't want to choke.
01:17:09.180 So people are very, very sympathetic to incremental tightening of air quality standards.
01:17:15.720 But what you have to realize is, at certain thresholds, those have little or no benefit.
01:17:22.000 I mean, with any kind of-
01:17:23.080 And they take resources away from other things that might be useful.
01:17:26.100 Massive.
01:17:26.120 So that's the cost.
01:17:27.400 But just to give you a sense of how skewed the benefit calculations are,
01:17:31.460 the EPA calculates that the Clean Air Act gives us $15,000 per household per year in value.
01:17:40.220 Like, where the hell-
01:17:41.220 Think about how much a household makes.
01:17:42.780 Right.
01:17:43.120 But it's like a quarter of their income.
01:17:44.460 Like, how do they calculate this?
01:17:46.240 Well, they have this whole system of dramatically inflating the benefits of things.
01:17:51.460 And some of this is just using speculation, like very loose correlation as causation,
01:17:57.480 engaging in all sorts of speculation.
01:17:58.700 But the most obvious one, which I think people are afraid to address,
01:18:02.560 is what they call the value of the statistical life, which is an absolute scandal the way we do it.
01:18:07.740 So the way they do it is they'll say, every life, like, we're going to assign $10 million per statistical life.
01:18:13.600 And people feel like, oh, wow, that's a lot of money.
01:18:15.960 You must really value life.
01:18:17.460 Like, the higher the value you accord to life, you must be a really nice person.
01:18:22.080 But what does it mean to give higher value to that?
01:18:24.660 That just means you're willing to pay $10 million of cost per life saved.
01:18:30.420 But that means you're willing to take $10 million away from everyone else to save one statistical life.
01:18:35.660 So even that should be suspicious because the average person maybe has $1 million of productivity throughout their life.
01:18:41.760 So you're basically taking away the livelihood, like a 10x livelihood takeaway, if only it were that good.
01:18:47.420 Because how is a value of statistical life calculated?
01:18:51.220 It is calculated by literally any delay of death, even a day.
01:18:56.520 So if you delay, if you can claim via speculation, well, you know, I'll just use myself.
01:19:03.420 Like myself, I'm 44 now.
01:19:04.700 Let's say when I'm 88.
01:19:05.980 Like, if I have something that says Alex will die on Wednesday instead of Tuesday, then they count that as a $10 million benefit.
01:19:14.500 And therefore, I'm going to take away $10 million from everyone else, which is what they're doing, in effect.
01:19:19.040 So it just leads to this insane looting of the economy in favor of benefits that are both speculative and so tiny that no one would ever accept the trade-off for themselves.
01:19:31.000 That's the thing.
01:19:32.060 Like, you have to think of it as what would you accept for yourself in terms of what would you replace that with?
01:19:37.840 Like, can you tie that back to the energy issue that we're focusing on?
01:19:42.620 So the basic cost benefit, you're making the case that the basic cost benefit assumptions underneath many of the current policies are radically wrong and counterproductive.
01:19:52.880 Yeah.
01:19:53.180 So what would you replace it with?
01:19:54.320 Yeah, yeah.
01:19:54.740 Oh, well, so you need two elements.
01:19:57.920 So one is you need a statistical life year.
01:20:00.240 So you can't do life because what life does is it doesn't differentiate between a day and 100 years, right?
01:20:06.220 Which, of course, as humans, and ultimately you have to think about it like we would think of as individuals rationally.
01:20:12.000 And then if you have some sort of aggregate policy, then you need to think of it that way.
01:20:15.140 So one is you think of it, this happened with COVID, right, where people are like, oh, we saved a life, you know, somebody was about to die, but they can't see their grandchild because they might die a day earlier or something like, I want to see my grandchild.
01:20:26.880 Or even a lot of older people said, this is a perfect example, right?
01:20:29.440 Like, I'm willing to take the risk because it's worth it to me to be around my family.
01:20:34.140 Like, I'm willing to, people are willing to take serious risks, not like the risk of having-
01:20:38.480 Well, like driving to work, for example.
01:20:39.960 Yeah, well, yeah, but, and you, so, but here they're acting like nobody's willing to take the risk of inhaling like a tiny minuscule percentage of what, like, you know, like a couple cigarettes worth of PM, like over, you know, a decade or two or something like this.
01:20:53.940 It's just, it's just crazy.
01:20:55.620 So what you need is you need to measure in terms of life years, and then you need a value that's based on typical human productivity.
01:21:03.000 Because that's actually how we make decisions.
01:21:05.300 Like, if I'm deciding how much do I spend on medical care, it's based on my productivity.
01:21:09.240 I don't get to say, you know what, I really value my own life, so I'm going to allocate $50 million to keep myself alive for six extra months in the ICU.
01:21:17.860 Like, you don't get to say that because those resources don't exist.
01:21:21.060 Resources are potentially unlimited over time, but at any given moment, they are finite.
01:21:24.840 And so to take them from one person, like, to just allow people to have these irresponsible, like, tiny delays of death that they don't even ask for, and then wreck the economy and wreck the young, that's what's happening.
01:21:38.840 So that's, yeah, that's an example: you go from $10 million per death delayed to whatever lifetime productivity is, spread over a full life of life years.
01:21:50.660 And that already will just dramatically reduce the benefit calculation.
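The contrast between the two benefit calculations can be made concrete with rough numbers. This is an illustrative sketch only: the $10 million value of a statistical life and the roughly $1 million lifetime productivity figure echo the numbers mentioned in the conversation, and the 80-year life expectancy is an added assumption.

```python
# Contrast the two benefit methods discussed above: valuing any delay of
# death as a full $10M statistical life, versus valuing life-years at a
# rate derived from typical lifetime productivity. All figures are rough
# illustrative assumptions, not official EPA parameters.

VSL = 10_000_000                   # dollars per statistical life
LIFETIME_PRODUCTIVITY = 1_000_000  # assumed average lifetime productivity
LIFE_EXPECTANCY_YEARS = 80         # assumed full life, in years

def vsl_benefit(deaths_delayed: int) -> float:
    """Current-style method: each delayed death counts as one full VSL."""
    return deaths_delayed * VSL

def life_year_benefit(life_years_gained: float) -> float:
    """Alternative method: value per life-year from typical productivity."""
    value_per_life_year = LIFETIME_PRODUCTIVITY / LIFE_EXPECTANCY_YEARS
    return life_years_gained * value_per_life_year

# A rule that delays 1,000 deaths by one day each:
print(vsl_benefit(1000))                 # 10,000,000,000: $10 billion
print(life_year_benefit(1000 * 1 / 365)) # about 34,000: roughly $34,000
```

Same intervention, five orders of magnitude apart, which is the speaker's point: the accounting method, not the intervention, drives the result.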
01:21:54.640 To give you a sense, there's a really—
01:21:55.840 So you're basically making the claim, if I understand it properly, that the thorny problem of how to calculate the economic—like, the value of a life in economic terms is just to turn to productivity.
01:22:08.900 What you basically say is, well, the typical person has a productivity level of this amount, and that's how we're going to value their life from the—that doesn't mean that's what their life is worth.
01:22:18.080 Yeah, that's the same issue.
01:22:19.420 It's their life-sustaining ability.
01:22:20.980 Yeah, yeah, well, you can't—you can't—you can't translate that into economic terms.
01:22:24.720 Yeah, and so there are questions of, do you—how—do you always use this in every situation?
01:22:28.400 Are there alternative ways to use it?
01:22:30.320 Because you don't even need to do that.
01:22:31.300 I mean, you could also do other things like, hey, tell people what the general science is, and then have them vote on, like, hey, what do you think is a good standard?
01:22:38.580 So, insofar as you are—so I'm not saying this kind of economic calculation applies to every situation.
01:22:43.640 Right, right, right.
01:22:44.100 So, when you are using it, it needs to be—like, your productivity is your life-sustaining and saving ability.
01:22:49.960 Right, right, right.
01:22:50.640 And so, for every individual, that's how we rationally make decisions about delays.
01:22:55.560 Right.
01:22:55.820 So, that's what we can afford to spend on a life, typically speaking.
01:22:59.860 Yeah.
01:23:00.200 Which is not the same thing as what a life is worth.
01:23:01.680 But it's not even—it's really, like, how much are you willing to take away from people's life-sustaining ability in order to limit this risk, right?
01:23:09.820 Yes, yes, yes.
01:23:10.800 What people need to realize is that the biggest risk is depriving people of potential productive ability.
01:23:23.320 And one reason is that you're not only depriving them of, like, of what you know they can do, but you're depriving them of innovation.
01:23:23.320 And this, by the way, relates to the whole externalities fraud, where people are like, oh, fossil fuels have so many negative externalities, while ignoring the positive ones.
01:23:39.540 But everything that frees up human time—and that's really what energy does, right?
01:23:44.100 It frees up human time.
01:23:45.880 One of the things that frees up time for is innovation.
01:23:48.120 Yes, yes.
01:23:48.620 And innovation has an incalculably large positive externality to it because you don't know when it's going to lead to the internet, when it's going to lead to a cure for cancer.
01:23:57.880 That's the economist's riposte to the Malthusian biologist, right?
01:24:04.680 It's like it's not a zero-sum game because if you free up time for innovation, you've transformed the territory that would otherwise be zero-sum.
01:24:12.760 Yeah, well, there's a question of is it ever zero-sum?
01:24:15.760 I mean, the key thing to the Malthusians is they don't understand—they think resources are taken from nature, not created from nature.
01:24:21.880 Right, yes, yes, definitely.
01:24:22.920 And then innovation is basically you're expanding your resource—you're expanding and amplifying your resource creation ability through the discovery of new knowledge.
01:24:31.840 I call this in Fossil Future—what do I call this?
01:24:35.860 It's called—I call it like the—oh, my gosh, it's been so long since I've read my own book.
01:24:39.940 But it's like, oh, the—
01:24:40.640 You need Alex Epstein.ai.
01:24:42.780 He would totally know this immediately.
01:24:44.620 See, there you go.
01:24:45.320 I'm declining in some way, and he's still going strong on fossil fuels.
01:24:49.780 I can't run on fossil fuels, unfortunately.
01:24:51.880 But it's something like the—it's basically the vicious circle of like a low-energy life where you just have very little energy and you can produce very little and you have very little time and resource for innovation.
01:25:05.440 Yes, yes, of course.
01:25:06.300 Versus the virtuous circle, right?
01:25:07.840 What happens is like once you get—or I often call it like the hockey—you start to get that hockey stick thing because what happens is you free up time that teaches you how to become more productive, that frees up more time that teaches you how to become more productive and more productive and more productive.
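The vicious-versus-virtuous circle can be seen in a tiny compounding model. The growth rates here are arbitrary assumptions chosen only to show the hockey-stick shape, not empirical estimates.

```python
# Tiny compounding sketch of the vicious vs. virtuous circle: each year,
# freed-up time raises productivity, which frees up more time. The growth
# rates are arbitrary illustrative assumptions.

def productivity_after(years: int, growth_per_year: float,
                       initial: float = 1.0) -> float:
    """Compound productivity over a number of years."""
    p = initial
    for _ in range(years):
        p *= 1 + growth_per_year
    return p

print(productivity_after(50, 0.00))  # low-energy life: stuck at 1.0
print(productivity_after(50, 0.03))  # virtuous circle: about 4.4x
```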
01:25:21.540 So, in general, when we're doing environmental quality regulation, we need to—or standard setting, we need to be deathly afraid of anything that impedes productivity because impeding productivity is impeding health, including life expectancy.
01:25:35.460 And that's not the way people think of it at all.
01:25:37.140 They just think, the only thing that matters with health is I want to breathe in less smoke and then I'll be like a little healthier versus, no, you can—like, you want to create life-saving cures.
01:25:47.460 And by the way, you can create air filters and you can like—there's just—it's so stupid that we're just destroying our productive ability for these tiny little reductions in particulate that nobody would notice, which is different from the marginal benefit of if you have huge particulate pollution, right?
01:26:04.300 And you have the wealth to lower it, then you should definitely do that.
01:26:08.400 Or if you have a particular region that's very difficult, maybe you want to switch from diesel-powered trucks to natural gas-powered trucks in a port.
01:26:17.680 Like, there are real reasons to do this, but any calculation you're doing has to be based on rational things.
01:26:23.240 So that value of statistical life at $10 million, that's like a total killer for cost-benefit analysis.
01:26:30.580 So you just—I want people to know there are like 20 more of these.
01:26:33.120 It's so bad the way it's done.
01:26:35.680 But fortunately, I think we have solutions for all of them, but people just need to be aware of the solutions.
01:26:43.020 Yeah, well, you can see why the approach that you're taking is difficult, because as you cascade down the levels of abstraction to the detail, the details multiply.
01:26:53.840 Yeah.
01:26:54.100 And the complexity multiplies, right?
01:26:55.880 But that's part of what's fun about it is it's like a big—it's a big frontier.
01:26:59.760 But you also start to see commonalities.
01:27:01.720 But yeah, it is—I'm still in the mode of—I mean, we've figured out a lot, but I'm still—I feel like maybe two or three years from really feeling like I have a mastery of it.
01:27:12.220 Because there's also so many areas, right?
01:27:14.780 It's not just that, but what I find is there's always a level of understanding at which the details don't matter that much for the essential action.
01:27:29.520 It's like in energy.
01:27:30.540 There are a million things somebody could quiz me about about energy technology, like something specific about the workings of an internal combustion engine, and they could catch me on that.
01:27:38.760 But I still think I know everything I need to know about the workings of an internal combustion engine for purposes of evaluating energy, right?
01:27:47.140 And so there's the thing of what details do you need to know to guide policy.
01:27:53.200 And it's a lot, but it's not infinite.
01:27:56.020 And one of the things I think I do well is being, for better or worse, very purposeful in knowledge, so I'm not the most curious guy.
01:28:06.460 Like, I don't just, like, learn about the world and just study a lot on my own.
01:28:09.740 I have specific goals, and then I want to learn exactly as much as I need to achieve that goal and no more.
01:28:15.260 So I have these really weird gaps where if somebody catches me, it'll look like, how could you—I thought you were a smart guy.
01:28:20.460 Like, how could you not know that?
01:28:21.880 But then with the goals, I do feel like, oh, yeah, I either know what I need to know or I know what I still need to learn.
01:28:29.000 So at the moment, I have plenty of gaps, but I also know, like, oh, here's the direction to go.
01:28:33.700 But the direction is not, I'm going to become the world expert on every detail of the Clean Air Act.
01:28:37.860 But I can tell you the Clean Air Act is a piece of garbage and why it's a piece of garbage and how fundamentally it needs to be reformed.
01:28:45.040 Fundamentally, it does not allow you to consider the cost of air quality improvements.
01:28:52.020 So that's a big flaw.
01:28:53.280 Right, right.
01:28:53.740 That's a big flaw.
01:28:54.440 Because you could literally use the Clean Air Act to justify killing the entire population in the name of its health benefits.
01:29:01.240 Because you say, we're going to get rid of all this particulate emission by killing everybody, but we're not allowed to look at the cost of killing everybody.
01:29:07.140 So we don't want to program an AI with the Clean Air Act.
01:29:10.080 Exactly.
01:29:10.680 Yeah, right.
01:29:11.120 No kidding.
01:29:11.960 Not make it an agent anyway.
01:29:13.180 Right, right.
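The asymmetry being described, a statute that counts benefits but is not allowed to count costs, reduces to a one-sided decision rule. A minimal sketch, with hypothetical dollar figures:

```python
# Sketch of a benefits-only statute versus a full cost-benefit test.
# Dollar figures are hypothetical, for illustration only.

def benefits_only_test(benefit: float) -> bool:
    """A rule passes if it has any positive benefit; costs are ignored."""
    return benefit > 0

def cost_benefit_test(benefit: float, cost: float) -> bool:
    """A rule passes only if its benefit exceeds its full economic cost."""
    return benefit > cost

benefit, cost = 1_000_000, 50_000_000
print(benefits_only_test(benefit))       # True: passes with costs ignored
print(cost_benefit_test(benefit, cost))  # False: rejected once costs count
```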
01:29:14.020 Climate resilience.
01:29:16.020 Oh, so, I mean, this is kind of the one that's pretty easy in terms of what people would expect, I would say: unwinding the whole-of-government approach. And reforming the international institutions is a big one, too.
01:29:29.760 But maybe the one we'll focus on is, and then, of course, unleashing all energy innovation, which we'll talk about nuclear.
01:29:35.200 We talked about NEPA.
01:29:36.480 That's key to all energy innovation.
01:29:39.240 But let's see.
01:29:40.360 What was I?
01:29:40.680 Oh, maybe the key one to talk about is the resilience itself.
01:29:44.500 Because the broader term I use is called climate mastery.
01:29:48.740 So it's the ability of using energy and machines and technology and intelligence to neutralize climate danger and amplify or create climate opportunity or benefit.
01:29:59.580 So an example of the latter would be taking—
01:30:01.720 It's like a definition of civilization.
01:30:04.020 Yes.
01:30:04.400 Well, civilization is environmental mastery, right?
01:30:06.320 We've civilized nature.
01:30:07.680 People used to realize that.
01:30:09.200 They used to not worship the unimpacted environment.
01:30:11.680 They used to want to civilize it so that they could—
01:30:13.640 That's because they had to live in it.
01:30:15.320 Yes, exactly.
01:30:15.960 Yeah, that's for sure.
01:30:16.880 It's like if you live, right, people live in our mastered, you know, civilized environment, and they think everything they like about it is natural and everything they dislike is unnatural, is human created.
01:31:16.880 But, yeah, so the mastery element: we can do things like make a very snowy area into an expensive paradise, like Snowbird, Utah, where I'll go snowboard a lot, right?
01:30:40.380 That used to be a menace, but through climate mastery, we've made it a very expensive destination through the, you know, warm buildings and the synthetic clothing and stuff.
01:30:49.720 So one point about climate people need to get is there's not even really such thing as a climate negative depending on your level of mastery.
01:30:57.760 Mm-hmm.
01:30:58.460 Because anything, like, if you have enough mastery, you can sort of make use of anything.
01:31:01.660 I live in Scottsdale, right?
01:31:03.740 It's a desert.
01:31:04.640 Yeah.
01:31:04.880 And it's really nice to live there because there's water.
01:31:07.480 Right.
01:31:07.920 Right.
01:31:08.380 Yeah.
01:31:08.560 It'd be a rough place otherwise.
01:31:10.260 And at some point, you know, we'll be able to—people will customize the temperature there more and do all sorts of stuff.
01:31:15.660 And hurricanes, like, if you could harness the energy of a hurricane, you would be thrilled every time a hurricane came around, right?
01:31:20.720 So we have to have that mentality with climate danger that the higher your level of mastery, the higher your level of resilience, and any given challenge ceases to become a problem.
01:31:31.640 And it can, in fact, become a benefit.
01:31:33.620 Like, the snow can become a benefit or heat can become a benefit when it was a harm before.
01:31:38.380 And there—
01:31:38.720 Or endless sunshine in the desert for solar panels.
01:31:42.380 Yeah, but it's not really endless, unfortunately.
01:31:44.900 Well—
01:31:45.120 It's not like being out in space.
01:31:46.620 If it were out in space, then it would be a lot better—well, then you have to beam it to the ground, right?
01:31:50.780 But it's unfortunately not endless all the time.
01:31:53.160 Otherwise, it would be nice.
01:31:54.580 It would be a lot better.
01:31:55.220 We could make a lot more use of solar, which has certain advantages.
01:31:58.200 Like, it's really cheap to make the panels, but its fuel source is very problematic.
01:32:02.680 But if we take mastery, maybe an area to focus on is something like wildfires, because wildfires is an important area of mastery: it's the one where the green anti-development movement has most impeded us.
01:32:15.540 So with most things, like, we're not actually—people think we're more endangered from hurricanes, we're less.
01:32:20.860 They think we're more endangered from floods, we're less.
01:32:23.000 And a lot of that is we've at least semi-allowed ourselves to master our environment.
01:32:27.600 Now, the more governments are controlling these things, the more opportunity we're leaving on the table, including that we give people free flood insurance, so we reward them for living in disaster-prone areas.
01:32:39.960 Because policies like NEPA prevent you from being more resilient more quickly, so there's all sorts of—there's all sorts of ways in which we're not mastering our environment and making it climate resilient to the extent we could.
01:32:51.680 I'd say DeSantis in Florida is a good counterexample.
01:32:54.120 Like, he's very focused on the right kinds of things.
01:32:56.360 Like, hey, let's harden our grid.
01:32:58.280 Let's harden our infrastructure.
01:32:59.600 Let's lower the number of days we have the power down.
01:33:01.660 Like, he's—he's very good on that.
01:33:03.620 I think he has that kind of mentality, and he's—that state seems very open to that kind of leadership, but around the country, we don't have that as much.
01:33:10.500 But wildfires are this very conspicuous thing where, in many ways, they've gotten worse, right?
01:33:15.480 Like, where you'll see, certainly in California, we have these dangerous, out-of-control wildfires.
01:33:19.960 And, of course, people jump to, oh, well, it's Mother Nature punishing us for our sins.
01:33:23.740 Like, if only we hadn't used those evil fossil fuels, we would have a totally, you know, pristine, lush forest that never caught on fire and never endangered anyone.
01:33:32.220 So we just have to make a net zero pledge, and then we'll have no emissions, the rest of the world will have no emissions, and then the forest will like us again.
01:33:38.700 That's, like, Newsom's plan, more or less.
01:33:41.120 Basically, exactly his plan.
01:33:43.680 So—but actually, it's pretty easy to deal with dangerous, out-of-control wildfires.
01:33:50.320 Like, there are places that deal with this very well, because, really, they are dangerous and out-of-control if they have a certain fuel load, like, right, which is based on, like, the amount of dead wood and stuff that's allowed to accumulate.
01:34:03.620 We learned that in Canada when the town of Jasper burnt to the ground, because people ignored it. Well, the federal government did.
01:34:09.460 Yeah.
01:34:09.800 They ignored the fuel load that was gathering around the town, despite repeated warnings.
01:34:14.140 Yeah, and you've seen, I mean, there are books going back decades saying that, around the country, this is a problem.
01:34:20.720 And the way I think of California is we've engineered through, like, government-controlled land, federal and California-controlled land, and these green policies that basically say, thou shalt not interfere with nature.
01:34:31.220 You've just allowed this huge accumulation of fuel load.
01:34:34.280 You often have these huge, unbroken stretches of forest, which people think are a good idea, but that's just the ultimate environmental hazard.
01:34:42.420 Like, the California forest is the biggest environmental hazard in the United States.
01:34:46.040 If it were a company, it would not be allowed to exist, because it's such a big threat to just allow all this—it's just like—it's basically like building a forest bomb, the way we've set it up.
01:34:55.820 So we need to take advantage of fuel load management, including logging, which we used to be allowed to do.
01:35:01.600 Like, that's a hugely important thing.
01:35:03.740 You need to have things like fire breaks, and, like, we should really think of how do we engineer the forest so that it's really, really manageable.
01:35:10.920 But that requires this pro-human way of thinking about things.
01:35:14.640 And I think this is an issue where Trump's definitely on the right track about it, but he recognizes, you know, it's the forest management thing.
01:35:20.880 We need to be open to a lot of stuff, including, ultimately, how much of this can—I mean, I don't know if the current administration or Congress will do this, but, like, how much of this can be privatized?
01:35:31.080 Or how can you—but you really need to start to think of forests as an environmental danger.
01:35:35.820 You cannot just allow them to exist—you can't allow anything in a society to exist in a way that is a mortal danger to the human population.
01:35:43.580 Hmm. Well, or even to the forest itself. I mean, one of the problems, as far as I've been able to tell, is that if forests are managed in such a way that that undergrowth builds up and builds up, when they do burn, which they will, they can burn so hot that they burn the topsoil out.
01:35:58.540 Yeah.
01:35:58.980 Right? So that's obviously not good for the long-term viability of the forest itself.
01:36:04.300 Yeah, I mean, there's a question, like, when you're thinking, like, what exactly does that mean? Like, who is the forest? It's just a collection of things. So it's not like the forest is, like, one little—it's not a forest being.
01:36:12.420 But yeah, in terms of the forest, for any purpose you would want it for, right?
01:36:16.860 Yeah.
01:36:17.280 But it's just, we have this very, like, religious, unimpacted nature worship attitude toward forests in particular, I think. And that's not what our ancestors had.
01:36:28.320 That's where the unicorns hang out, you know.
01:36:30.120 I guess, I guess, I don't know. It's just, everyone loves these areas like Alaska and the California Forest, where they don't go, but they really want them unchanged.
01:36:38.040 And they're really willing to inflict a lot of harm on the local residents. Alaska is the ultimate example. Everyone claims to care about the Arctic.
01:36:44.760 Like, I don't know why you're so against development in a place where there's almost no life. Like, why are you so against that?
01:36:51.420 The reason is because there's not much there, and it's always easier to oppose progress in a new area than an existing area.
01:36:57.840 Right? That's why they focus on it. But it doesn't make sense. Why are you against drilling in the Arctic? There's no, there's, there's so little there. Like, you'd be much more against drilling.
01:37:07.460 Because it's white. It's white.
01:37:09.040 It's white, but they don't want to get dirty.
01:37:10.820 They don't.
01:37:11.680 Well, it's something like that. It's a purity violation.
01:37:14.840 But it's, it's always, it's to your point, they're not in it. They're not near it. So, they just have this fantasy in their head.
01:37:22.260 If they were in it, they would die, too.
01:37:23.980 Yeah. If you go outside there, you're like, yeah, maybe we do need some oil here to keep us alive.
01:37:29.680 Yeah, quickly.
01:37:30.500 Yeah.
01:37:30.760 Okay. So, let's close with comments on nuclear.
01:37:33.300 On the nuclear. Okay, yeah. So, nuclear, there's a bunch of different things.
01:37:36.480 But at the core of it, the biggest problem by far is how we make policy with regard to the allowable amount of radiation from a plant.
01:37:46.100 And this is a very important thing.
01:37:47.680 It's the same thing with the air, the air quality issue.
01:37:50.960 It's like, you need to set a standard that is, that is overall healthy for human life.
01:37:56.180 At minimum, the worst thing you can possibly do is set a standard so low that it has little or no benefit to human health, but a massive cost.
01:38:05.820 Right, right, right.
01:38:06.500 And nowhere has, because it's always hard to limit the kind of natural emission of something to near zero.
01:38:13.760 Right, right.
01:38:14.020 And in the case of nuclear, it's like radioactivity.
01:38:16.560 In particular with nuclear, it's radiation in the event of some sort of radiation release.
01:38:22.260 Because part of what they're doing is it's not just radiation under normal circumstances, but radiation in the event of like a meltdown or something like that.
01:38:30.060 And they do what are called probabilistic risk assessments in there.
01:38:32.640 But it's all, and when they do evacuations, it's all based on how do you measure the danger of radiation.
01:38:38.500 So you need very precise measurements of radiation, and then you need very rational policies for weighing those risks against any cost that, you know, to lower them, right?
01:38:50.340 And so on both counts, we're wildly irrational, because we measure the radiation risk 50 to 100 times too high through something called LNT, which is linear no threshold.
01:39:00.540 Then I'll talk about how we do the policy.
01:39:02.920 So with nuclear, think about sunlight, right?
01:39:06.600 Sunlight is like many different substances and phenomena.
01:39:10.100 It is healthy in certain quantities.
01:39:11.900 It's like benign or healthy in certain quantities, and it is deadly in certain quantities.
01:39:17.200 So if you go outside long enough, you'll get sunlight poisoning and die.
01:39:20.820 Does that mean that any amount of sunlight is deadly?
01:39:24.300 No, right?
01:39:25.040 Because there's a threshold at which it is benign or even, in the case of sunlight, beneficial, right?
01:39:29.980 Because if you get no sunlight, it's bad.
01:39:31.520 This is true.
01:39:32.540 This is definitely true for radiation more broadly.
01:39:35.620 There's a threshold at which it is benign and even arguably beneficial.
01:39:39.460 There are lots of interesting studies about places with higher levels of radiation that people seem to have less propensity to different kinds of cancers.
01:39:46.580 And it's very interesting, like the physics of it.
01:39:49.300 But part of it is that, you know, the radiation, like what's happening is it attacks your cells in a certain way, but then they repair, and it's kind of like muscle building.
01:39:59.080 Like, do they actually need to be stimulated to a certain extent to do it?
01:40:03.780 Like exposure to immunological agents in childhood.
01:40:09.280 Yeah, exactly.
01:40:09.900 There are different versions of this, but yes.
01:40:11.640 And so what's absolutely true is there's a threshold at which nuclear is safe.
01:40:18.180 But the model that we use to measure nuclear risk is called linear no threshold, which means there is no threshold at which radiation from nuclear is safe.
01:40:28.760 So what that means, to use an analogy of—
01:40:30.820 So that's another zero problem.
01:40:32.480 But it's a particularly—yeah, it's a really, really bad one because we set—what we do is we treat it as dangerous at any level, and then we set the level to be—the allowable level to be 50 times lower than that of nuclear workers, even though nuclear workers have zero sign of any damage whatsoever from their level.
01:40:54.160 So it's at least 50 times too low.
01:40:56.560 So think about that.
01:40:57.680 You have to make something 50 times—as your baseline, you have to make it 50 times lower.
01:41:02.660 So this just is the thing behind so much stuff.
01:41:04.940 You have to way overbuild it.
01:41:06.500 You have to have all sorts of backup scenarios to just prevent this.
01:41:09.760 So we need to change the LNT model.
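[Editor's note: the contrast between the two dose-response models discussed here can be sketched in a few lines of Python. The risk coefficient and the 0.1 Sv threshold below are arbitrary illustrative placeholders, not regulatory or scientific values.]

```python
def excess_risk_lnt(dose_sv, risk_per_sv=0.05):
    """Linear no-threshold (LNT): excess risk is proportional to dose,
    so *any* nonzero dose is modeled as carrying some risk."""
    return dose_sv * risk_per_sv


def excess_risk_threshold(dose_sv, threshold_sv=0.1, risk_per_sv=0.05):
    """Threshold model: doses at or below the threshold carry no excess
    risk (like sunlight, which is benign below some level of exposure)."""
    if dose_sv <= threshold_sv:
        return 0.0
    return (dose_sv - threshold_sv) * risk_per_sv


# At small doses, LNT predicts nonzero risk; the threshold model predicts none.
for dose in (0.001, 0.05, 0.5):
    print(f"{dose} Sv -> LNT: {excess_risk_lnt(dose):.4f}, "
          f"threshold: {excess_risk_threshold(dose):.4f}")
```

The policy consequence falls out of the shape of the curve: under LNT there is no dose at which further restriction yields zero modeled benefit, so there is no natural stopping point for regulation.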
01:41:11.860 And then on top of this, what's even more irrational is we have something called as low as—the way we make policy.
01:41:17.320 So we measure danger by this no-threshold model.
01:41:20.300 And we make policy by what's called ALARA, which is as low as reasonably achievable, which means we want it as like—and now you can ask, what is reasonable?
01:41:29.580 Yeah.
01:41:30.000 And that threshold would change as technology advances, too.
01:41:32.800 Exactly.
01:41:33.540 But it needs to be based on a scientific understanding of risk in the first place.
01:41:37.240 So if you make a reasonable standard based on a 50 times too high—50 times too low level, so you're 50 times off in terms of what's safe, you're not going to be very reasonable.
01:41:48.660 And in this case, what they do is they basically say, you can go even below.
01:41:53.020 So let's say the standard should be here.
01:41:54.720 It's totally safe.
01:41:55.500 There's no benefit in going below.
01:41:56.660 They put it down here, but then ALARA says you have to put it even lower because there's no threshold, right?
01:42:02.500 So you have to put it even lower if it's reasonable to do so.
01:42:05.560 How do they calculate reasonable?
01:42:07.260 If nuclear at any given point in time is cheaper than an alternative, then it is reasonable to increase its cost.
01:42:14.280 Oh, yeah.
01:42:14.780 Now let's look at what happens in the 1970s.
01:42:17.360 Well, how expensive is natural gas in the 1970s?
01:42:20.280 It is very expensive, right?
01:42:22.100 And oil, which is a major source of electricity, right?
01:42:24.560 We had an energy crisis with the Middle East and around the world.
01:42:29.700 So great, what do the regulators say?
01:42:33.080 Oh, well, nuclear is now too cheap.
01:42:35.920 It's now reasonable to make it more expensive.
01:42:39.220 So let's even lower the threshold more past the point of no benefit.
01:42:43.560 But then what happens when natural gas and oil and coal get cheap?
01:42:46.420 Do they ratchet back the nuclear regulation?
01:42:48.920 No, they do not.
01:42:49.540 So it's an infinite, like, ratcheting of totally useless regulation anytime nuclear has any advantage.
01:42:58.560 So it basically prevents nuclear from ever becoming cheaper and makes it prohibitively expensive.
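[Editor's note: the one-way ratchet described here can be caricatured as a toy model. The costs, units, starting limit, and tightening factor below are all arbitrary; the point is only the asymmetry—limits tighten when nuclear is cheap and never relax when alternatives get cheap again.]

```python
def alara_limit_update(limit, nuclear_cost, alternative_cost,
                       tighten_factor=0.5):
    """One-way ratchet: if nuclear is currently cheaper than the
    alternative, it is deemed 'reasonable' to tighten the allowable
    limit (driving nuclear's cost up); the limit is never loosened
    when the alternative becomes cheap again."""
    if nuclear_cost < alternative_cost:
        return limit * tighten_factor
    return limit


limit = 100.0  # arbitrary starting limit
# (nuclear cost, alternative cost) in successive periods, arbitrary numbers:
for nuclear, alternative in [(50, 200), (60, 180), (90, 40)]:
    limit = alara_limit_update(limit, nuclear, alternative)
    print(limit)  # tightens twice (100 -> 50 -> 25), then never recovers
```

Run over those three periods, the limit halves twice while nuclear holds a cost advantage, then stays pinned at the tightened level even after the alternative becomes far cheaper.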
01:43:02.980 So it's the fundamental risk model and the fundamental policymaking model are both broken.
01:43:07.720 And so we need, like, we need executive actions announcing that this needs to change.
01:43:12.760 And then the NRC needs to have a dramatic reform process, and/or the NRC needs to be scrapped and we need to have something new, or put it out of the Department of Energy.
01:43:23.320 So it's very—so what's interesting about talking to you?
01:43:27.120 Well, I think, first of all, it's a sense of relief, I would say, because it's quite remarkable watching you delve into the details.
01:43:34.300 You know, you differentiated the energy problem into five major tranches.
01:43:38.460 And then you've developed this very detailed knowledge of each of those tranches.
01:43:42.900 And it was obvious that we were just scraping the surface there.
01:43:45.760 And it's like, you can understand—it's easy to understand in consequence of talking to someone who's developed the kind of detailed knowledge that you've developed just exactly why it is that so much public policy fails, right?
01:44:01.440 There's a lot of work to be done at the level of detail.
01:44:04.040 It's very, very difficult to understand exactly what the obstacles are.
01:44:07.640 And it's very attractive intellectually and morally to hand wave at the highest level of abstraction possible, right?
01:44:15.420 And that hand-waving is part of the problem that causes all the impediments—produces all the impediments that you've been describing.
01:44:22.920 So now, you're hoping—we'll close with—
01:44:24.860 Wait, wait, wait.
01:44:25.120 Just one quick comment on this.
01:44:26.520 So I think one benefit that I've had is the combination of being a fairly well-known public intellectual and interested in working in the details.
01:44:36.840 Like, it's a very unusual thing.
01:44:38.600 Combination, definitely.
01:44:39.340 It's a very unusual thing because part of—at the beginning in particular, like, being a well-known public intellectual,
01:44:45.700 particularly in energy, made it easy to get in the door and to talk to people.
01:45:00.100 But then, you know, with most other people, their business is a Substack, or their business is speaking, or they're writing books.
01:45:00.100 Like, this is part of why I'm not writing a book right now.
01:45:02.200 I'm trying to write, you know, a new American energy policy, and eventually I make a book.
01:45:05.940 But I'm not thinking about a book right now because that's a totally—you know, that's a really hard focus and it's a really distracting focus if you have a book deadline or a book contract or that kind of thing.
01:45:16.840 But it's this unique position that I have where I'm well-known for the ideas and they have a certain kind of trust and interest in me.
01:45:24.660 But then I'm super interested in the weeds and I have a team that's interested in the weeds.
01:45:29.620 And so I can talk to any congressman or senator—and staffers. We have, like, 300 talking points, and 500 staffers who get our stuff and are part of our group in one way or another.
01:45:41.280 That's a really—in case anyone else wants to try it, you have to have the motivation because it's hard.
01:45:46.860 Oh, yeah.
01:45:47.520 And you are—there's a trade-off with, at least temporary, with public notoriety.
01:45:52.280 And it depends on how much pleasure one gets from the public notoriety.
01:45:57.060 And there's a lot that's great about public notoriety, including you get to meet a lot of interesting people.
01:46:01.460 But sort of my personal bent is I like the problem-solving the most.
01:46:05.220 So I'm quite happy—I like the public stuff, too, but I'm kind of happiest doing the problem-solving and figuring stuff out for myself.
01:46:13.120 So that, for me, works very well and might work well for others.
01:46:17.080 But I would just say that, yeah, if you're a well-known person publicly, there is this path of you can make a really big difference.
01:46:24.140 And one of the differences I can make is I can make the market for the best political ideas being adopted into policy much more efficient because it's been really easy for me to get to know a very large portion of the people in power and have a trusted relationship with them.
01:46:40.360 And I think that would have been impossible if I were not publicly known.
01:46:44.040 I think it would have been so hard.
01:46:45.480 Yeah, well, you're straddling this weird divide between public notoriety, which is one skill set, and your ability to delve into the details of policy at a micro level.
01:46:55.660 That's a very rare combination of interest and ability.
01:46:58.960 And the second one is definitely still in progress, and I need to give a lot of credit to the people that I work with.
01:47:05.100 But it is—I think that's—there are all these smart people out there that I have met, like these lawyers that I pay or sometimes I don't pay them.
01:47:12.620 These people have spent, you know, 30 or 40 years, and I'm the first person who's come to them and said, you know what?
01:47:21.360 There's a real chance that if you explain your really good idea to me, an exact solution, it might become policy.
01:47:27.680 And it sometimes takes a while because they're so cynical because they've spent their whole life, like, coming up with these good ideas and knowing that at least for some piece of the puzzle, they have the solution.
01:47:37.820 Right, but they don't have the public face.
01:47:39.440 How are they going to get it to them? It's not like Senator X will talk to them or whatever, but Senator X will usually talk to me, A, because they knew me, and then now I and my team are a resource.
01:47:50.800 That's what I'm—part of what I'm excited about is the ability to be like a clearinghouse for the best ideas.
01:47:56.820 And once you have that reputation, then the smart people will say, like, oh, you should really talk to Alex.
01:48:01.980 It's worth your time because he can really get the ideas here.
01:48:06.080 So I think that—
01:48:06.800 Okay, so if policymakers and other interested people are watching, well, they can contact AlexEpstein.ia?
01:48:13.880 No, no, no.
01:48:14.440 AI, sorry.
01:48:15.420 No, no, they could contact—no.
01:48:16.320 No, anyone can contact alexepstein.ai, but if you really want help, contact Alex Epstein HI, Human Intelligence, which is alex at alexepstein.com.
01:48:24.460 That's just my address.
01:48:26.940 And what about online resources apart from the AI system?
01:48:30.860 Oh, so there's energytalkingpoints.com, and I would say the most important thing for people is just my—I was talking about Substacks.
01:48:36.760 I just have a Substack.
01:48:37.540 It's free just to—it's just to share my latest stuff.
01:48:40.740 That's where they'll get this energy freedom platform.
01:48:43.400 So alexepstein.substack.com.
01:48:45.540 Okay, well, we'll put all that in the description of the video as well.
01:48:48.020 Yeah, and I would just say one more thing, so in case it helps people and in case people want to join us.
01:48:51.120 Like, one other benefit I have is that a few years ago, I switched from just a public intellectual model for the political work.
01:48:58.680 I basically got it subsidized by creating a membership group where a bunch of people would contribute $25,000 a year.
01:49:04.540 And they would like talking to each other and getting information from us, but they would agree to let me do whatever the hell I wanted with zero control.
01:49:12.900 So we now have, like, 120 people in that group, which gives us a lot of resources.
01:49:17.660 And on their deal, it is no lobbying, no representation, no control.
01:49:22.420 And that's a very fortunate arrangement to have.
01:49:25.140 Yeah, that's for sure.
01:49:25.800 Because that's what allows me to pay for all these really smart people, like pay a lawyer $1,850 an hour.
01:49:30.940 Why do they pay it?
01:49:31.740 Because they think it's right, I think.
01:49:35.580 And at this point, it's become, also these networks at a certain stage become a benefit just to be part of the network.
01:49:44.080 And now, like, we give them free access to our consulting, which nobody gets except politicians.
01:49:49.420 They like the inside.
01:49:50.380 Now it's sort of become like a profitable thing to be part of Energy Talking Points, like to give your $25,000 membership.
01:49:57.000 But at the beginning, it was, and I think this is why most people do it, it's just they believe in it.
01:50:01.460 And by the way, for half the people, they're contributing post-tax dollars, because it's not a 501(c)(3), because we interact so much with government.
01:50:08.520 So we have no restrictions on our activity.
01:50:10.740 But it's, yeah, people really believe it.
01:50:12.500 And then they see, like, this is a really efficient, like, it's an unusually effective group that we have.
01:50:19.220 And I just think, at this point, people who see what we're doing are like, wow, you guys really are making it easy for pro-freedom politicians to adopt great energy policies.
01:50:31.000 Okay, so let's review it for the people who are just going to be listening on audio.
01:50:34.980 Tell me, again, the right places of contact, your email address, and then the other proper places of further investigation online.
01:50:42.340 So alex at alexepstein.com.
01:50:44.080 Yeah.
01:50:45.600 alexepstein.substack.com to just get the latest talking points.
01:50:48.720 And then the two big repositories are energytalkingpoints.com and alexepstein.ai.
01:50:53.860 And so if you're, anyone can email me, but particularly if you're a politician who wants help on this, like, we're actively working on this right now.
01:51:01.000 And I'm, yeah, I work with a lot of people, but I just want, I want everyone to at least know about us as a resource.
01:51:06.680 Yeah.
01:51:06.860 And then if, yeah, and certainly if anyone wants to join our energy talking points group, you can email me as well.
01:51:12.040 So, okay, so for everybody who's watching and listening, we're going to switch over to the Daily Wire side now, as you, most of you know, that's going to happen.
01:51:20.700 That's another 30 minutes.
01:51:21.660 I'm going to talk to Alex about some things that are more personal.
01:51:25.660 I want to talk to him about how he learned to be an effective public speaker, because when he started doing this and started envisioning it, he was absolutely terrified by that proposition.
01:51:35.180 And so I'm very interested in how and why he overcame those initial hesitancies and inadequacies, let's say, or inabilities.
01:51:43.200 We want to talk about how to improve your information diet.
01:51:52.840 That's an idea Alex introduced to me just before we started the podcast.
01:51:52.840 And I presume that's a way of strategically approaching the problem of what sources of information you expose yourself to online and elsewhere.
01:52:02.220 And we're going to talk about parenthood as well, because, and how maybe you balance that with a productive career, because Alex has recently become a new father.
01:52:11.280 So if you want to join us on the Daily Wire side for that, we've got another half an hour to spend with Alex Epstein.
01:52:16.740 And in the meantime, thank you, sir.
01:52:19.520 Thank you.
01:52:20.020 I was glad.
01:52:20.640 It was a lot more fun to do it in person.
01:52:22.240 Yes, definitely.
01:52:23.240 Definitely.
01:52:23.680 It was.
01:52:24.120 Yeah.
01:52:24.260 Well, I'm also struck, you know, like, I really enjoyed talking to you the last time we talked, and I've enjoyed talking to you every time we've met.
01:52:32.180 But, you know, you seem to be operating at another plane of analysis at the moment, and it's really quite something to see.
01:52:39.240 I mean, you're a wealth of information that's not only very well thought through from the perspective of first principles on the philosophical side, but all of that's integrated with all this wealth of detailed information that you have at hand.
01:52:54.820 And that should be an invaluable resource for public policy makers who actually want to make a difference in this pro-human direction that you've described, which is crucially important to everyone, crucially important to everyone.
01:53:06.940 Yeah, thanks.
01:53:07.900 And I'd just say it's really fun to—I think it really suits me in particular to, like, actually have exactly what people should do.
01:53:15.440 Like, one of the challenges earlier in my career is, like, yes, here's how to think about the issue, but then what do you do?
01:53:21.440 And I was kind of jealous of those people who would be like, call your congressman and tell them to support Bill 4A.
01:53:25.720 Yeah.
01:53:26.040 And now it's sort of like, oh, I can tell you here's exactly what to do.
01:53:29.780 There's a satisfaction that most people in business have that I think some—I think you probably—I mean, you certainly have this, I'm sure, in psychotherapy.
01:53:37.620 Behavioral psychotherapy in particular.
01:53:39.360 Yeah, and in advising people in general, right?
01:53:41.500 Like, you tell them what to do on some level of abstraction, and then they get the benefit from it, which is—it wasn't quite that way just telling people how to think about energy.
01:53:51.520 Yeah, well, there's something to be said for clarifying things conceptually, and there's something else to be said for differentiating the conceptual into the behavioral.
01:53:59.580 Here's something you could actually do that will serve these ends.
01:54:03.020 Yeah, yeah.
01:54:03.620 That's kind of an optimized approach, isn't it?
01:54:06.380 Because you clear up the conceptual, and you lay the groundwork for a practical movement.
01:54:10.960 Yeah, it's almost as a means to doing it, and it ultimately ends in some action that will lead to a lot of benefit for the person or the society.
01:54:16.780 Yeah, right, which is meaningful action.
01:54:18.300 Yes, exactly.
01:54:18.900 Yeah, yeah, yeah.
01:54:20.020 Good to see you, man.
01:54:20.820 Likewise.
01:54:21.520 Yeah, yeah, great talking to you, yeah, and great listening, for sure, yeah.
01:54:25.260 Anyways, thank you, everybody, for your time and attention today, and check out the web resources that Alex described, and you can—especially if you're politically minded and in a position of political authority,
01:54:37.040 because there is a tremendous amount of work to be done in the domain that Alex is describing that could have nothing but, you know, an endless stream of positive benefits, so we want to get right on that.
01:54:47.800 Thank you.
01:54:48.600 Thank you.