#154 - Steve Levitt, Ph.D.: A rogue economist's view on climate change, mental health, the ethics of experiments, and more
Episode Stats
Words per Minute
191.9
Summary
Steven Levitt is an economist and coauthor of the bestselling book Freakonomics and its two sequels. He is the William B. Ogden Distinguished Service Professor of Economics at the University of Chicago, and a founding partner of TGG (The Greatest Good), a consulting firm. In 2003, Steve won the John Bates Clark Medal, which is one of the two highest honors someone can win in the field of economics (the other being the Nobel Prize). This award is given to the most outstanding American economist under the age of 40. In 2006, Steve was named one of Time Magazine's 100 People Who Shape Our World.
Transcript
00:00:00.000
Hey, everyone. Welcome to The Drive podcast. I'm your host, Peter Attia. This podcast,
00:00:15.480
my website, and my weekly newsletter all focus on the goal of translating the science of longevity
00:00:19.800
into something accessible for everyone. Our goal is to provide the best content in health
00:00:24.600
and wellness, full stop. And we've assembled a great team of analysts to make this happen.
00:00:28.880
If you enjoy this podcast, we've created a membership program that brings you far more
00:00:33.280
in-depth content. If you want to take your knowledge of this space to the next level,
00:00:36.840
at the end of this episode, I'll explain what those benefits are. Or if you want to learn more now,
00:00:41.740
head over to peterattiamd.com forward slash subscribe. Now, without further delay,
00:00:47.740
here's today's episode. My guest this week is Steven Levitt. Steve is an economist and coauthor
00:00:56.040
of the bestselling book Freakonomics and its two sequels. He is the William B. Ogden Distinguished
00:01:03.680
Service Professor of Economics at the University of Chicago. He is also the founding partner of TGG,
00:01:11.580
or The Greatest Good, a consulting firm. In 2003, Steve won the John Bates Clark Medal, which is one
00:01:20.720
of the two highest honors someone can win in the field of economics, the other being the Nobel
00:01:25.660
Prize. This award is given to the most outstanding economist in America under the age of 40. And in
00:01:32.900
2006, Steve was named one of Time Magazine's 100 People Who Shape Our World. He received his
00:01:38.980
undergraduate degree from Harvard and his PhD from MIT before heading to Chicago, where he is currently
00:01:45.520
a professor. In this episode, we talk about a lot of things, Steve's path through the world of
00:01:51.440
economics, why he chose it, and how unlikely a candidate he was for such study. And then we get
00:01:57.540
into the limitations of economics. We talk about one of the more controversial things that Steve has
00:02:03.500
written about along with his coauthor for Freakonomics, which was climate change and some of the confusion
00:02:09.100
around what they wrote and the implications of it and how things have changed in the decades since they
00:02:14.340
wrote it. We talk about some of the tragedies that Steve has endured in his life and how his very
00:02:20.420
unique disposition has allowed him to not only tolerate these things, but actually thrive through
00:02:26.660
them. We talk about Steve's recent appreciation for mental health and the importance that he believes it
00:02:34.320
plays in our education world. And we get off onto a couple of interesting tangents around things like
00:02:41.580
golf and horse racing, which even if you're not fond of either, I think you'll get a kick out of. So
00:02:47.160
without further delay, please enjoy my conversation with Steven Levitt.
00:02:56.100
Hey, Steve, it's awesome to see you today. Although I was really hoping we were going to be able to
00:03:01.780
figure out a way to do this in person if we waited long enough. But at some point, I just thought
00:03:05.400
we just need to sit down and do this, and video's almost as good. So anyway, thanks so much for making time.
00:03:10.820
So excited to be here. Really looking forward to it.
00:03:13.660
I feel like I haven't seen you in person and it's been too long. It's probably been four years.
00:03:18.460
I think it was, we drove around in that really noisy car of yours was the last time that I was
00:03:22.640
with you. The little tiny thing that made so much noise. You know what I'm talking about?
00:03:27.060
Didn't go very fast, but it made a lot of noise for going slow.
00:03:31.180
I think a lot of listeners probably are familiar with you. Of course, they're familiar with you through
00:03:35.960
obviously your books, podcasts, things like that. But maybe it's worth actually telling people a
00:03:40.540
little bit about how you got interested in the study of economics. And obviously it's through
00:03:45.420
that study that your work became available to a much broader swath of the population than just
00:03:51.180
sort of economists. So what got you interested in studying economics?
00:03:56.100
So I was never interested in economics. I'm probably still not interested in economics,
00:03:59.360
but the way I became an economist was really through the back door. I was the worst kind of student
00:04:06.140
in college. I mean, I got good grades. I got into Harvard. I got good grades at Harvard,
00:04:10.020
but I didn't care about learning at all. All I really cared about was gaming the system.
00:04:13.880
And I had learned that the best way to pick classes was to find the classes that everybody
00:04:19.720
took because it turned out those were both really good classes in general and really easy.
00:04:24.540
And it just happened that at Harvard, EC10, the introductory economics course was exactly one of
00:04:29.120
those courses. So I took it not because I was interested in economics, but just because my rule of
00:04:32.700
thumb was to take classes like that. And I remember roughly, I don't know, the fifth or the sixth
00:04:37.980
lecture was on this topic called comparative advantage. And as the teacher walked us through
00:04:44.240
it, I was just disgusted. And I thought, my God, this is the easiest topic in the entire world.
00:04:50.340
I have literally known about the concept of comparative advantage since I was five years old. I mean,
00:04:55.660
I understood it deeply in my soul. I didn't know the name, like he was putting a name on it.
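The concept Levitt is describing, comparative advantage, can be sketched with a toy calculation (all numbers here are hypothetical): even when one producer is absolutely better at everything, both sides gain by specializing where their opportunity cost is lower.

```python
# Toy illustration of comparative advantage (all numbers hypothetical).
# Country A is absolutely better at producing both goods, yet both
# countries still gain if each specializes where its opportunity cost is lower.
output_per_hour = {
    "A": {"wine": 6, "cloth": 3},  # A: 6 wine or 3 cloth per worker-hour
    "B": {"wine": 1, "cloth": 2},  # B: 1 wine or 2 cloth per worker-hour
}

def opportunity_cost(country, good, other):
    """Units of `other` forgone to produce one unit of `good`."""
    return output_per_hour[country][other] / output_per_hour[country][good]

# A gives up 0.5 cloth per wine; B gives up 2 cloth per wine.
# So A should specialize in wine and B in cloth, despite A's absolute advantage.
for c in ("A", "B"):
    print(c, "pays", opportunity_cost(c, "wine", "cloth"), "cloth per wine")
```
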
00:04:59.140
And I remember at the time going through and actually thinking, this is such a travesty.
00:05:04.700
My parents are paying this much per lecture at Harvard. It was a big number for this total
00:05:09.760
drivel. Like this is just an insult. And as I was walking out of the room, my best friend at college
00:05:15.780
was in the same class and he walked out next to me and he said, oh my God, that lecture was crazy.
00:05:22.300
I said, I know, how could they teach us such drivel? He said, no, that was the most confusing thing I've
00:05:26.660
ever heard. I don't think I could understand that in a million years. And I had this moment
00:05:30.580
of clarity where I said, wait a second, I actually think like an economist. And even though I've never
00:05:36.060
been interested in the topic, from that moment on, it became clearer and clearer to me that I was just
00:05:40.260
born thinking like an economist. And it's not advice I would give to other people. In general,
00:05:45.100
I don't think it's great advice to just do what you're good at. But I was so lazy and so
00:05:49.820
intellectually unmotivated that for me, it turned out not to be a bad path was just to say,
00:05:54.660
who cares what I'm interested in? Because I wasn't really that interested in anything.
00:05:58.100
Let's just go be an economist. And I didn't go willingly. I didn't. It took a long time for me
00:06:03.060
to be an academic economist. I graduated. I did management consulting. It was only my deep
00:06:08.920
disgust and hatred with management consulting, which in desperation, as I looked for any exit,
00:06:14.720
I thought, well, the only thing I can even think of to get out of here is to go get a PhD in economics,
00:06:18.200
which is what led me to get a PhD in economics. And if I had known what it meant to get a PhD in
00:06:24.340
economics, I never would have done it. It was only complete ignorance of what I was getting
00:06:28.060
myself into that led me down that path, which turned out to be a lucky path. I mean, things
00:06:32.280
could almost not have turned out better for me, but it was only a series of mistakes,
00:06:38.260
miscalculations, and ignorance that led me to this lucky place.
00:06:42.520
But Steve, there's got to be some challenge and pain and resilience that requires doing
00:06:47.980
everything you just described. So, okay, maybe econ 101 or whatever the course was,
00:06:52.580
was a breeze. But at some point, you're taking senior courses at Harvard in economics. Were they
00:06:58.920
challenging? Did you find them interesting? I mean, what was kind of going through your mind as
00:07:03.080
you're plotting your way through undergrad? I didn't ever intend to be an academic
00:07:09.880
economist. So I didn't approach it the way a normal, like someone who wants to be an academic
00:07:14.880
economist approached it, taking a bunch of math. So it turns out at the highest level,
00:07:19.040
economics is all about math. And I didn't take, I didn't take any math. I took exactly one math
00:07:24.040
class at Harvard. It was called Math 1A. It was because I got a two on the calculus AP exam and I
00:07:30.560
couldn't place out. And so it was really high school calculus. It was the lowest math class they
00:07:35.560
offered at Harvard. And I did do quite well in that class. It was essentially me and the football
00:07:40.120
team and the hockey team were the only people who placed into that. And I smoked those guys.
00:07:43.760
So I just had fun. And honestly, in college, I just had fun. And I liked the economics classes and
00:07:50.500
they were easy for me because the intuition, economic intuition has always come pretty easily,
00:07:55.520
especially in microeconomics, pretty easily. And the classes just weren't that hard. They didn't
00:08:00.400
require much math. I didn't know any math. And it was, I mean, college in general for me was just so
00:08:05.800
much fun. I had good friends and good times and it was easy and I had independence. And I just,
00:08:12.700
you know, for me, those were the best four years of my life and I didn't take school too seriously,
00:08:17.000
but I did well. So, you know, it was all, that was just a breeze. It was the shock of reality,
00:08:21.920
the real world. I had no idea how unfun the real world was going to be when I actually got into it,
00:08:26.820
but college was fun. That is so funny to hear you say that. I will forever maintain that college was
00:08:32.340
the worst four years of my life. And I hope I can say that now, meaning that there will be no four
00:08:37.120
year period that is still in front of me. That's going to be as bad as those four years were. But I,
00:08:40.920
I always enjoy hearing people talk about college being the best four years of their lives.
00:08:45.540
You mentioned microeconomics. Maybe now's as good a time as any to explain to people the difference
00:08:51.140
between micro and macroeconomics, because it is almost a disservice to talk about economics as
00:08:57.560
though it's one broad topic. Yeah, you're absolutely right in that. So let's start with
00:09:02.720
macroeconomics because that's what most people think of when they think of economics. They think about
00:09:06.360
inflation and banking, unemployment and economic growth. And those are really important fundamental
00:09:14.960
problems, but they are, they turn out to be incredibly hard problems because the degree of
00:09:20.480
complexity in the macroeconomy is almost infinite, right? So it's this complex system built up of
00:09:26.500
billions of individual actors and you've got companies, you've got them. Look, it's just a mess.
00:09:31.280
And it turns out, just making a long story very short, that we haven't made much headway in really
00:09:37.300
understanding it because we've, we've approached the profession with a kind of discipline, which says
00:09:42.180
every model of macroeconomics needs to start with foundations that are rational at the base,
00:09:48.580
micro foundations. And it turns out, I think the problems are just too hard. And that approach in
00:09:52.000
the end isn't going to win. Okay. But what is microeconomics? Microeconomics is instead the study of
00:09:57.660
individual decision-making. So in a, in a world in which there's scarcity and there's competition,
00:10:02.160
how do I decide how to spend my income? How do I decide what job to do and how many hours of work
00:10:10.040
I want to do? How do firms decide what to make and where to locate and things like that? So these are
00:10:15.100
much smaller decisions. And for me, much more intuitive decisions. And they're able to be analyzed both
00:10:22.820
with the formal models of economics, which historically have been very heavily predicated
00:10:27.880
on rationality simply because it makes the math easy, along with the last 40 years of behavioral
00:10:32.940
economics, where we've introduced a lot more psychology and mistakes and a lot more freedom
00:10:37.640
into the models that we use. But anyway, really they're almost distinct micro and macroeconomics
00:10:43.880
because macro in the end is really complicated, not to make all my macro colleagues incredibly angry
00:10:50.920
with me, but very self-referential models because the problems are so hard that you need to abstract
00:10:55.740
so greatly to try to deal with the macro problems that I think we don't have a good handle on them.
00:11:01.540
So I've really steered clear of the macro problems to focus on the individual decision-making, which is
00:11:06.740
I think both in my own life, much more relevant, but also I got something to say because I have some
00:11:12.580
intuition for it. Is there a hierarchy within economics the way, you know, so in physics, you sort of
00:11:18.720
have sort of the theoretical physicists and the experimental physicists and Nobel prizes will
00:11:23.740
go back and forth between the two of those. How does that work in economics? Do a disproportionate
00:11:28.640
amount of Nobel prizes or whatever the highest, I mean, I guess the Nobel, it's not technically a
00:11:33.400
prize, but it's close enough. When you look at the highest levels of excellence and awards that are
00:11:39.160
given in the entire field, do they disproportionately lie on one side of those or the other?
00:11:43.540
So historically, economics has been an entirely theoretical discipline. So the highest status
00:11:50.460
positions historically have been in theory, where I make the distinction between theory and the analysis
00:11:56.820
of data and econometrics and actually trying to take messy data and make sense of it. With the data
00:12:04.080
revolution and the emergence of econometrics, I think there's been a switch in the profession. So there are two big
00:12:10.920
prizes in economics. There's a Nobel Prize. And then there's something called the John Bates Clark
00:12:14.920
Medal, which is given to the most influential economist under the age of 40, American economist.
00:12:21.500
I did win that. Yeah. And so that more reflects what's going on in the present time. And so those
00:12:28.260
prizes, the Clark Medal has mostly gone in the last 20 or 30 years, not always, but often to people at
00:12:36.420
the intersection of theory and data, which I think really is where the action is in economics. It's
00:12:43.160
people who can make sense of data, but are looking at it through a lens of economics and pushing economic
00:12:49.900
thinking in that way. I don't know if anyone cares about this, but economics is really at a crossroads
00:12:54.960
because it turned out when data analysis really started to emerge and we could start to say things
00:13:01.360
about causality in ways we couldn't before. The empirical part of economics really took off,
00:13:09.020
but all of the easy stuff has been done. And so we've kind of estimated most of the parameters that
00:13:15.240
we care about. And so the question is what comes next? And what's happening in economics is that the
00:13:21.280
push has been in the direction of, okay, it's not enough to just estimate what's happened in the
00:13:28.280
world in the past. That's really what empirical economics is really good at is saying, I can
00:13:33.060
look at what's happened in the past, I can estimate a number, and then I can build models to try and say
00:13:37.340
why. Now the task has become greater, which is to say, I don't just want to understand the past.
00:13:43.460
I want to be able to build a model that has the flexibility to tell me what would have happened
00:13:49.100
if instead of the US being a democracy, we were a dictatorship. Okay, so things that are so far
00:13:56.000
out of the realm of what actually happened, that it's almost like science fiction. And so what
00:14:01.280
economists do now is they're asked not just to estimate parameters really well, but to embed
00:14:06.220
those parameters into models, which then have enough degrees of freedom that you can start to imagine if
00:14:11.660
I turn this dial to that dial, what would happen? And I actually personally think that's a terrible
00:14:16.780
direction, because I think what has made economics great is that it's easy to understand the estimates we
00:14:24.680
create. And you can say whether you like them or not based on fact or the approach or whatever.
00:14:30.620
But I think when we get into these more fanciful models, I think we're going to end up running into
00:14:36.400
a huge dead end, which is that they're really too complicated to be estimated well. And so we're
00:14:42.560
just going to get into fantasy land of creating things that are really not grounded in fact. So
00:14:48.540
there's a famous physicist, and it's been attributed to so many that I actually don't know who
00:14:52.760
really said it. But the spirit of the quote is, all models are wrong, some are useful. And I feel
00:14:58.480
like economics is certainly a discipline in which that applies and suffers greatly. Even something
00:15:05.760
that I think was much simpler to try to understand was the extent to which coronavirus was going to be
00:15:11.220
a problem. I mean, those initial models were out to lunch. I don't fault the people who tried to put
00:15:18.580
those models together, the epidemiologists who worked on these models. But the sensitivity
00:15:23.180
around so many variables was so great that you couldn't even begin to extrapolate. And when you
00:15:30.520
did, the only thing I think the models helped you understand was how impossible it would be to predict
00:15:35.600
what was going on. And if nothing else, it was going to show you what parameters mattered.
00:15:40.220
But that was about the extent of what you could extract from that, which was the rate at which this
00:15:44.780
thing spreads and the length of time that it can lay dormant, those things are going to matter a lot.
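The sensitivity Attia describes can be sketched with a toy projection (all parameters hypothetical; this is nothing like a real epidemiological model): small differences in the assumed spread rate compound into wildly different forecasts.

```python
# Toy projection showing parameter sensitivity (hypothetical numbers,
# not a real epidemiological model): cases multiply by r each generation.
def projected_cases(r, generations, initial=100):
    """Naive exponential projection of case counts."""
    return initial * r ** generations

# A seemingly modest difference in the estimated spread rate...
for r in (2.0, 2.5, 3.0):
    print(f"r={r}: {projected_cases(r, 10):,.0f} cases after 10 generations")
# ...moves the projection by well over an order of magnitude, which is why
# the early models mainly revealed which parameters mattered most.
```
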
00:15:50.380
And by the way, what we should have taken from that is earlier and more aggressive testing was a
00:15:55.380
better solution, as opposed to trying to pontificate or estimate whether 60 million Americans were going
00:16:01.160
to get it and 5 million were going to die versus 1 million were going to die. I mean, that was sort of
00:16:05.700
the wrong thing to be thinking about. But again, all of that's easy in hindsight. But I hear you in
00:16:10.340
terms of how that could be frustrating in this profession. Yeah. I mean, I have a very strong
00:16:16.220
view about the relationship between data and theory. And my view is that the sensible way to do things
00:16:24.960
is to make a really sharp distinction between the two, that we should really understand. You go to the
00:16:31.660
data and you just understand what the 3, 5, 9, 12 facts are. And in my view is that those
00:16:39.620
12 facts should be things that if any reasonable person looked at the data, you'd more or less
00:16:45.460
agree on those facts. And to just establish that that is the role of data analysis is to understand
00:16:50.720
those facts. And obviously, you want to get it in the direction of causality and whatnot. But you
00:16:55.840
never do that with the data alone. The data alone are never enough except in a randomized experiment to
00:16:59.920
tell you about causality. And rarely in economics do we get to do randomized experiments.
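Levitt's point that data alone can't establish causality outside a randomized experiment can be illustrated with a small simulation (purely synthetic data and a hypothetical treatment effect): a hidden confounder badly biases the naive observational comparison, while randomization recovers the true effect.

```python
# Minimal simulation of why observational data alone can't establish causality:
# a hidden confounder drives both treatment take-up and the outcome.
import random

random.seed(0)
TRUE_EFFECT = 1.0  # the causal effect we would like to recover

def estimate_effect(randomized, n=100_000):
    treated_outcomes, control_outcomes = [], []
    for _ in range(n):
        confounder = random.gauss(0, 1)
        if randomized:
            treated = random.random() < 0.5   # coin flip, as in an RCT
        else:
            treated = confounder > 0          # self-selection via the confounder
        outcome = TRUE_EFFECT * treated + 2.0 * confounder + random.gauss(0, 1)
        (treated_outcomes if treated else control_outcomes).append(outcome)
    return (sum(treated_outcomes) / len(treated_outcomes)
            - sum(control_outcomes) / len(control_outcomes))

print("observational estimate:", round(estimate_effect(randomized=False), 2))  # far from 1.0
print("randomized estimate:   ", round(estimate_effect(randomized=True), 2))   # close to 1.0
```
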
00:17:03.860
And then I think if you can all agree on the facts, then I'm all in favor of models going forward.
00:17:10.920
Because if I write down a model that is built off of the 12 facts I see, it turns out it's really easy
00:17:18.140
the kind of models that economists write down for other economists to say, OK, I understand whether
00:17:23.600
that's a good model or bad model, what it's sensitive to and whatnot. And if you make this bifurcation
00:17:28.260
between understanding the facts and then going into what those facts imply, because we really care
00:17:34.740
not about the facts themselves, but we care about the implications of those facts for the future or
00:17:38.640
what they tell us about causality or whatnot. I think it's a really powerful model. It's not the
00:17:43.340
model that we use in economics. And it's definitely not the model we use, say, in journalism or
00:17:49.060
in debate. It's often incredibly hard when a debate is going on
00:17:53.820
to understand, are these people disagreeing about the facts or are they disagreeing about the
00:17:58.920
implication of the facts? And so like if I had my dream about how we would do journalism,
00:18:04.100
every newspaper article would basically have a separate little feature that just said,
00:18:08.840
these are the five facts on which I am writing this article. And then it would be easy for me,
00:18:16.040
say, if it's an area I'm expert in, to say, oh, well, those facts don't make any sense or for a fact
00:18:19.920
check or different facts from them. Or in a debate to say, hey, before we debate, I would like each of
00:18:24.640
the participants to lay out the 10 facts that they believe to be true and see if they disagree with
00:18:30.200
each other's facts. If they disagree about facts, then we're in a very different situation than if
00:18:34.240
we disagree about the implications of those facts. And somehow I think that distinction has gotten
00:18:38.860
lost. I've never really heard anyone talk about it except for me, so maybe it's
00:18:43.100
not important, but it strikes me as being really, really important.
00:18:45.820
Let's use an example of that where I think you and Dubner have got into some crosshairs of people
00:18:54.300
is around climate change. I mean, you wrote about this, God, has it been 10 years ago now? I've lost
00:18:59.800
track. Was it Super Freakonomics or? Yeah, Super Freakonomics. So I think it's been about 10 years.
00:19:05.440
Let's start with some facts. What did you actually write? Because I think that's been very
00:19:09.680
misconstrued over time. Yeah. Okay. So one thing you can never trust is for people to tell you the truth
00:19:15.300
about what they've done in the past. Number one, because they're going to lie. But number two,
00:19:18.400
they don't even like, I don't really know what I've written. I haven't looked back at that book
00:19:21.840
in a long time either. But here's what I think we wrote. We basically wrote that at that time,
00:19:27.580
there was evidence that the planet was getting warmer, but certainly it was not completely open
00:19:33.280
and shut. And we said, but look, we're not scientists. And what we thought was that the people
00:19:40.820
talking about climate change were asking the wrong question, right? What people talking about climate
00:19:45.240
change were saying is, how do we reduce the amount of carbon we put in the air? Which actually wasn't
00:19:50.960
the right question. The real right question was, the planet is too hot. We're not happy about the
00:19:56.360
planet getting hotter. What is the best way to cool down the planet? To me, that was really the right
00:20:01.940
question. And like the example I give is, I have a lot of babies. I have six kids. So the kids,
00:20:07.580
they poop all the time, okay? And it's a problem, okay? And because if they poop on the floor all
00:20:14.780
day long, it's a real problem, okay? So you could say, well, to solve that problem, let's figure out
00:20:20.340
how to keep kids from pooping or how to stuff the poop back in the kid or whatever. It's like,
00:20:24.760
but the answer is like, how do you get the poop so it doesn't go on the ground? Well, it's a diaper,
00:20:28.260
okay? And it's like a very different answer than if you get worried about like, well, poop shouldn't
00:20:31.680
exist. So I think the same with carbon. Look, carbon's coming out and you don't necessarily have to
00:20:36.220
stuff the carbon back in. You can find another way to cool the planet. Starting from that premise,
00:20:40.560
we then talked to real scientists who knew something who were working on what's called
00:20:45.240
geoengineering. So these are ways to try to cool the planet, okay? And look, they're not perfect in
00:20:50.840
all sorts of ways, but these really smart guys have come up with a lot of clever plans that we
00:20:56.120
might potentially be able to cool the planet. So things like putting sulfur dioxide into the
00:21:00.760
stratosphere. Turns out super cheaply, couple hundred million dollars a year, we could put
00:21:06.000
sulfur dioxide in the stratosphere. It mimics, but in a much more efficient way,
00:21:10.720
what big volcanic eruptions like Mount Pinatubo have done. And that could cool the planet.
00:21:16.980
The best idea I think that was out there was it turns out that certain kinds of clouds that have
00:21:22.240
the right kind of reflectivity and the right level can actually reflect a bunch of sunlight back and
00:21:27.300
oceans are really dark. So they absorb lots of heat. Clouds are really light. They reflect a lot of
00:21:32.180
heat. So a scientist had come up with a plan, never been tested even in the 10 years since we did it,
00:21:37.140
where a bunch of solar powered dinghies would troll around in the ocean, controlled by GPS,
00:21:42.940
throwing up salt from the water, which then seeds the clouds, which then leads to a bunch of clouds
00:21:47.500
over the ocean, which his claim, their claim from simulations is that that would be enough.
00:21:52.600
10,000 of those dinghies would be enough to offset all of climate change, global warming. Look, all
00:21:57.260
these are probably short-term solutions relative to a real solution, a scientific solution or
00:22:02.780
behavioral solution to try and deal with climate change. So that's what we wrote about. And
00:22:07.180
admittedly, in retrospect, I regret that we wrote about it in such a lighthearted way. I didn't realize
00:22:13.020
that environmentalists had no sense of humor. And so you couldn't like poke fun at mistakes they'd
00:22:18.760
made in the past. And so we really aggravated a lot of people. But what we did was we unleashed
00:22:24.800
this incredible machinery. There was a whole machinery around the environmental movement,
00:22:32.640
which basically decided we were public enemy number one, and then set out to destroy us,
00:22:39.500
not really worried about what we actually said or whether we were right. I mean, I do think 10 years
00:22:45.600
later, almost everyone now thinks that geoengineering is likely to be part of the solution.
00:22:52.180
It was interesting. One of the magazines, I think it was the Atlantic Monthly, basically wrote
00:22:56.040
an article about us saying that we were the most evil, stupid morons who've ever existed in the history
00:23:02.560
of the planet. Four years later, the same magazine wrote essentially our chapter in our book with the
00:23:11.080
exact same examples we had used, with no citation at all of the fact we had written about it,
00:23:15.180
saying how geoengineering was going to be part of the future. So I think it's good not to worry about
00:23:19.620
whether you're right or you get credit for it. I think you just have to be happy that the movement,
00:23:24.540
although way too slow, has been in the right direction. I think what's inevitable is that
00:23:28.820
the problem of climate change is one which will not, I claim, ever be solved by behavior change,
00:23:36.700
because it's all about what economists call externalities, that my own behavior has almost no
00:23:42.240
effect on me and a negative effect on the rest of the world. And I would say there's never in the
00:23:48.560
history of mankind been a problem that is fundamentally about externalities, which has
00:23:54.260
been solved by telling people, you should just do the right thing and hoping that suddenly everyone's
00:23:58.980
just going to start doing the right thing. Just literally has never happened. I don't see any
00:24:02.720
reason to think it's going to start now. Yeah, that's interesting. Can you think of any examples
00:24:06.880
where humans behave in that way? Where they do the right thing to help other people?
00:24:11.760
Yes, even if it inconveniences them. Are there any? I mean, there are social norms on the roads.
00:24:19.440
People don't really cut in line very much, a little bit, but I don't think it's because-
00:24:24.720
But is that also because there's direct feedback?
00:24:27.200
Exactly, because people yell at you because it's uncomfortable. But there's very few things
00:24:32.720
where we have private benefit. And look, little things we do around the edges, and certainly we
00:24:38.460
do it for family members and whatnot. But honestly, if you look at every big problem that's been solved
00:24:44.700
for mankind, I challenge you to find one that wasn't solved by technology. And it goes from everything
00:24:50.820
from disease. You know much more about this than me. But if you think about the health benefits that
00:24:56.500
have come from behavior change versus technology, invention, discovery, like I got to believe,
00:25:03.800
you know, smoking being maybe the one example where in the face of overwhelming evidence,
00:25:09.640
it still took, what, 40 years for people to cut down on smoking and still a lot of people smoke.
00:25:16.520
Well, and I would argue that it wasn't really due to just a straight,
00:25:20.240
it wasn't the knowledge that tobacco killed that actually led to the reduction. In fact,
00:25:26.180
if you look at the timeline of this, it's an interesting case study in behavior change. So
00:25:30.360
I think the Surgeon General's report on smoking, which was the first really unambiguous declaration
00:25:37.200
that smoking killed, it had been suspected in the 40s and 50s. But it wasn't until I want to say about
00:25:42.980
65-ish that you couldn't argue it anymore. I mean, we're talking hazard ratios of 14X, right? Like
00:25:52.280
there's nothing in biology that produces a hazard ratio of 14X outside of parachuting without
00:25:58.140
parachutes, that this is going to end in a bad way. And there was really
00:26:04.460
no change in tobacco use after that for five, six more years. The first real dent in tobacco use
00:26:13.560
came with advertising. When a law was passed that said you couldn't advertise, a tobacco company
00:26:19.540
couldn't advertise tobacco unless the commercial was juxtaposed with an anti-tobacco ad. And it
00:26:25.420
turned out the anti-tobacco ads were so successful that the tobacco companies voluntarily stopped
00:26:31.100
advertising. And that was the first step in the reduction of tobacco. And then of course,
00:26:36.540
you threw in other things, which was laws that said, hey, you can't have tobacco sitting at the
00:26:40.460
counter. It has to be way behind. And obviously then you had excise taxes that were imposed and then
00:26:45.900
you had other environmental things. You can't smoke on airplanes. And so you could even make
00:26:49.880
the case that the reduction in smoking has been less about behavior change and more about a change
00:26:54.640
in the default environment around it, which just speaks more to your point. Yeah, no, I think that's
00:26:59.460
true. And I think also what's changed in the end is it used to be cool to smoke. And now look,
00:27:05.780
if you smoke, people think you're crazy, right? So, I mean, I think that those social norms,
00:27:11.240
again, speaking to your point, the costs and the benefits of smoking have changed dramatically.
00:27:16.720
And if anything, it's surprising how little behavior change has followed that as opposed to
00:27:21.460
how much. I mean, look at, I mean, again, we're in your territory and not mine, but look at diabetes
00:27:26.280
and how little behavior change there has been in response to diabetes. It's surprising in many ways.
00:27:33.180
Let's go back to this climate change issue because again, it's very interesting. And it's one of those
00:27:38.800
things I think like nutrition where there's a very gray area between true scientific expertise and
00:27:46.640
propaganda. What is it about the use of science as propaganda? I mean, frankly,
00:27:54.780
I think I'm somewhat sympathetic to people who are kind of losing faith in the scientific process a
00:28:00.940
little bit. I'm talking lay people, people who have never studied science, but who are sort of sick
00:28:06.060
and tired of hearing scientists stand up on TV and say something. And they're a little bit skeptical.
00:28:12.820
And so if you're sort of an elitist, you look at them and you say, well, they're morons. How can they
00:28:17.380
not believe in science? But the reality of it is they've kind of been a bit misled, haven't they?
00:28:22.480
And using science as kind of a weapon, I guess, has become a bit problematic, right?
00:28:29.000
I'm more favorable towards climate science than I used to be. So climate science suffers from the exact
00:28:34.480
same thing that macroeconomics does, which is that there's observations, but really much of it is
00:28:39.680
based on these models. And the models can't possibly have the kind of specificity because
00:28:44.620
it's such a complex system to understand. So like most of the leading models at the time we were
00:28:49.160
looking at it had carbon dioxide, but nothing about water vapor. And it turns out that many people
00:28:54.660
think water vapor, or the interaction between carbon dioxide and water vapor,
00:28:58.740
may be more important than anything. But for the models, the computing power and the complexity
00:29:02.200
was too much. But I give some credit to, look, there's a part of climate science which is focused
00:29:07.840
on measurement, measuring the temperature of the oceans, the rising of the oceans. And I think that's
00:29:12.580
been done well. Now, what you're talking about, though, is that there's such a blurring of the line
00:29:18.240
between the job of measuring climate change and advocating on TV or in the newspapers for what that
00:29:28.160
means for public policy. And those are very different tasks. And I think as a scientist, you want to be
00:29:37.060
very careful about moving from one role to the other. And I think what climate science, as an observer,
00:29:44.440
a pretty close observer, has done a very poor job on is helping people, lay people, understand
00:29:50.840
when one is an advocacy role versus when one is in a scientific role. And the same people have played
00:29:57.060
both roles. And I think really, it's an unusual branch of science, because the people who go into
00:30:02.160
climate science, almost always share this view that the climate's really important, and we should
00:30:08.140
protect it. And we should be doing things not to have climate change, which is different, I think,
00:30:13.680
from, say, physics, where physicists don't have an inherent belief about whether the universe should
00:30:21.340
be expanding or contracting or whether a particular particle should or should not exist. It's just like
00:30:27.960
that's a different thing. And the other thing I think which has been completely unrealistic and
00:30:33.880
really problematic for climate science is that they don't think about economics enough, and they don't
00:30:39.380
think about the reality of human behavior and how difficult it will be to have behavior change.
00:30:43.680
And they don't think well about the costs of changing our economy. So the costs of going to
00:30:52.720
a zero-carbon, a net-zero carbon world are really, really big. And we pay those costs right away. And I'm not
00:31:01.680
saying that we shouldn't necessarily try to do that. It might actually be a good idea to do it. But to
00:31:05.980
ignore the fact that it's an incredibly expensive solution that's being offered, I think, has been part of
00:31:13.060
the problem because it's been a mix of ignoring that or at the same time saying, look, those costs are
00:31:21.580
good because we're immoral. Humans are awful. Humans are ruining the planet. And we should suffer for the
00:31:28.340
fact. We should have an enormous contraction in the economy. Like Al Gore was upset that there were way
00:31:34.060
too many kinds of cereal being sold in the grocery store. That's sinful that there are so many choices
00:31:40.160
about cereal. But it's like a weird religious thing about we should suffer. And again, I think
00:31:45.800
that doesn't work well to people who maybe aren't convinced or who don't want to suffer. So honestly,
00:31:51.940
I think if I were able to do one thing in the world, and I tried to do it and I failed at it,
00:31:57.920
it would be to have a sensible approach to figuring out whether there is a cheap solution to climate
00:32:06.500
change. Because right now, all we're talking about are incredibly expensive ones, like complete
00:32:12.280
rethinking of capitalism, da-da-da-da-da-da. And I don't think that the capitalist structure has worked
00:32:18.760
well on trying to solve the problem. If it were up to me, if I were president, the first thing I would
00:32:24.160
do is I would have what I call a Manhattan Project for Climate Change. And I would try to convince
00:32:32.280
the 1,000 smartest scientists in the world to stop whatever they're doing and to work on climate
00:32:39.340
change and to put them in a place like Los Alamos and to give them resources. Because it's a hard
00:32:45.720
problem and it's an interdisciplinary problem. And to say, look, for five years, let's just work and
00:32:51.000
let's figure out whether or not there's going to be a great way to do carbon sequestration or to do
00:32:56.980
something, you know, I don't know, something out of the box to do it. Either I think we'll get a
00:33:01.220
solution in five years or we'll figure out, oh, crap, this is never going to work. Like we have
00:33:05.880
no way out of this other than stopping producing carbon or living with it. Look, and that's
00:33:11.760
important information to know, because if I knew that the only way out of this was we're going to
00:33:17.480
have to live with the carbon we put in there, then the willingness of public policy to impose enormous
00:33:23.920
costs on carbon producing behavior should go way up. But I kind of doubt that's true. I have a lot
00:33:29.140
of faith in science. I think if we had the best 1,000 scientists, within five years we would come
00:33:33.800
up with a reasonably cost effective way to deal with carbon. We'd pull it out of the air in some
00:33:39.060
way, shape or form. And we'd kind of like have the thing solved or we'd know we're never going to solve
00:33:45.340
it. And those are two important things to know. I don't disagree with you, Steve. I think where I'm
00:33:49.420
a little less optimistic than you and I love the idea of a Manhattan project for this. I'd back up one
00:33:55.180
step and reiterate what you said earlier, because I think it's so important. I phrase it in a somewhat
00:33:59.380
different way. But I like how you discussed it, which is we confuse the objective, the strategy
00:34:04.900
and the tactics all day long. I mean, that to me is like, if we were going to list the 10 human
00:34:10.420
failures, just as a species, it's that we are not wired to differentiate between objectives, strategies
00:34:15.880
and tactics. That's why 10 years ago, you were getting lambasted for talking about geoengineering,
00:34:22.300
which is a tactic, because the only tactic that was being discussed was the reduction of carbon.
00:34:28.600
And you were saying, well, wait a minute, we've lost sight of the objective. The objective is to
00:34:32.360
slow warming and warming is a much bigger problem than just how much carbon is in the atmosphere
00:34:37.460
that also deals with insolation. So let's look at different tactics, right? Okay. So notwithstanding
00:34:44.160
that problem, I think my biggest fear, Steve, is there's no question that in five years,
00:34:50.260
a well-staffed, well-resourced team could come up with what solutions might look like. I just don't
00:34:57.880
know that it could ever be implemented from a policy perspective. I mean, for example, I don't think
00:35:03.920
anybody who's really studied energy seriously disagrees that next-gen nuclear energy, and I'm not talking
00:35:12.580
about the nuclear reactors of 70 years ago. I don't think there's anybody who doesn't think that
00:35:17.400
next gen nuclear would play an important role in electricity generation, but it's just not a
00:35:24.460
politically palatable solution. I don't know where geoengineering falls on that spectrum either for
00:35:30.080
that matter. So that would be my fear is you'd come up with solutions, but then you'd still be at
00:35:35.240
kind of a gridlock in the implementation of these things.
00:35:39.060
Yeah. So that is a good point. And that is possible. I do think, however, that even in that
00:35:47.620
world, there is a real insurance value in this investment now. So we might not be able to come
00:35:55.720
up with a sensible policy now, but if in 20 years, all of a sudden the ice sheet on Greenland just like
00:36:02.960
begins to slide into the ocean and things are desperate. My prediction is we will do something
00:36:08.940
like my Manhattan Project for climate change. The question is, when will we do it? Will we do it
00:36:14.520
as we're staring down catastrophe or will we do it in advance and potentially either avoid catastrophe or
00:36:22.060
at least get there five years earlier? I mean, not that different than COVID, right? So for my podcast,
00:36:27.300
I interviewed Moncef Slaoui, and six or seven years ago, he proposed, well, why don't we have
00:36:32.920
in the bank a vaccine for like everything we'd ever want, but not make very much of it, just like have
00:36:39.800
a little bit around so we know how to do it. And, look, everyone laughed at him
00:36:43.580
and said, how can you suggest that? That would cost $500 million. We can't waste $500
00:36:50.160
million. And then you come to a world where the US government is bleeding $6 billion a day in
00:36:58.400
deficits. And that would have been like literally the best investment of all time, probably if we
00:37:02.760
had done it. So I think that kind of insurance policy, I think would be valuable even in a world
00:37:08.040
of gridlock and poor policymaking that we live in today. I agree completely. Because the one thing
00:37:12.600
that I think we haven't stated here, but I think we both agree on is this is such an asymmetric risk
00:37:19.120
that it has to be taken seriously. And that's probably the thing that bothers me the most about
00:37:24.960
the other argument, you know, the people on the other side of the argument that say, well,
00:37:29.300
we don't have enough certainty. Therefore we shouldn't, you know, it should be status quo
00:37:33.420
business as usual, ignore this. And it's true. I think the uncertainty, the error bars on these
00:37:38.540
projections are bigger than people lead us to believe, but it's so asymmetric. It's so nonlinear
00:37:45.300
if this goes wrong that you have to take that seriously. There are very few things in economics
00:37:50.160
that shocked me, but something that shocked me was an analysis done a long time ago, 10 years ago or
00:37:56.600
more, where an economist, and I'm embarrassed that I can't remember exactly who it was, just worked
00:38:04.240
through the kinds of utility functions, the basic models that economists use, and asked: in a world in
00:38:10.260
which bad things happen, how much are you willing to pay to avoid bad things? And it is
00:38:15.420
completely shocking. So I'm making these numbers up, but I think they're in the right ballpark. So
00:38:20.200
if you have something like a 1% chance of dying, like how much will you pay? How much would you pay
00:38:26.220
of your total wealth to avoid a 1% chance of dying today? Okay. And it turns out the answer isn't 1%.
00:38:32.480
It's more like 10 or 15%. And as it goes up to 5%, you're willing to pay like 50% of your wealth to
00:38:38.140
avoid it. Because once you die, you're done, and you lose this otherwise really long flow of utility.
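Levitt doesn't recall the exact paper, but the flavor of the calculation can be sketched with a simple log-utility model. The lifetime-utility parameter `L`, the convention that death is worth zero utility, and the specific numbers below are all illustrative assumptions, not figures from the episode:

```python
import math

def wtp_fraction(p, lifetime_utility):
    """Fraction of current wealth a log-utility agent would pay to
    eliminate a probability p of dying now.

    Illustrative setup: being alive with wealth w is worth
    log(w) + V, death is worth 0, and L = log(w) + V is total
    lifetime utility. Indifference between facing the risk and
    paying a fraction x of wealth to remove it gives
        (1 - p) * L = L + log(1 - x)
    so x = 1 - exp(-p * L).
    """
    return 1.0 - math.exp(-p * lifetime_utility)

# With an assumed lifetime utility of ~10.5, the willingness to pay
# is an order of magnitude larger than the risk itself:
for p in (0.01, 0.05):
    print(f"p = {p:.2f} -> pay {wtp_fraction(p, 10.5):.0%} of wealth")
```

With these assumed numbers, a 1% death risk is worth roughly 10% of wealth and a 5% risk roughly 40%, the same order of magnitude as the ballpark figures Levitt quotes.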
00:38:43.000
And it's very counterintuitive. It really caught me off guard. And it really changed my thinking
00:38:47.700
about the importance of dealing with climate change. Because I was kind of of the, look,
00:38:51.720
you know, climate change, it's all in the future. We discount the future. Why worry about it? But once
00:38:57.060
I saw that, it really changed my perspective on it. So I've come to think that this investment,
00:39:03.160
and the other thing to stress is that the cost of R&D is so low compared to the cost of disaster
00:39:12.240
and of implementation that to do the R&D is just a no-brainer, right? My guess is you could fund
00:39:18.180
this Manhattan Project for a billion dollars a year. This is like
00:39:22.560
the kind of money that actual philanthropists could do as opposed to even government having to do it.
00:39:26.740
Look, if you actually wanted to put whatever solution is discovered into practice,
00:39:30.260
it would probably be enormously expensive. But to figure out the answer is just like trivial.
00:39:36.280
And the same is true of COVID, like talking to people who did the vaccines. I think at Moderna,
00:39:41.760
it took them like a day or two to come up with the vaccine, you know, and all the costs are in the
00:39:46.220
testing and the production. But the actual R&D part is so cheap compared to everything else that
00:39:52.460
to not invest in it just seems like a real first-order mistake to me.
00:39:56.860
I agree. I'm glad we got off on that tangent because I actually did want to talk about that
00:40:01.140
at some point that it came up kind of quick. But I want to now go back to the beginning of
00:40:06.440
the story, right? So you graduate from college, you go off to work in management consulting,
00:40:11.180
which if I remember, you described as only slightly less uncomfortable than a root canal
00:40:17.160
without Novocaine. It's so painful that it in fact pushes you toward a PhD. So you go off to MIT.
00:40:24.400
So you basically just didn't leave Boston. You went from, you know, Harvard to consulting to MIT.
00:40:30.060
So you really have a penchant for these cold winters. And then what did you decide to study
00:40:34.360
in your PhD? Because now you actually, you got to get serious, right? You actually have to pick
00:40:37.600
a problem and work on it. So I have to say, I didn't like consulting, but I learned something
00:40:43.360
incredibly important in it. And I learned it from the guy who happened to have the cubicle next to me,
00:40:49.580
a guy named Jeff Thomas, who I'm still friends with. And he taught me how to look at the world
00:40:53.040
strategically. And it was weird. I never did that before. It never occurred to me to do it. But
00:40:56.860
then I got to grad school and I did have a fundamental strategic understanding while everybody else
00:41:02.440
was really busy working on problem sets. And essentially these were all people who've gotten
00:41:06.400
straight A's their whole life. And they figured, well, what worked for me before, that's what I got
00:41:10.760
to keep on doing in grad school because that's how I'm going to get ahead. And I looked at it and
00:41:14.800
partly it helped that I knew no math and I was completely overmatched. And I was like the worst,
00:41:19.540
I'm not exaggerating when I say that my classmates sat down about a month into it. And my friend,
00:41:25.460
Austan Goolsbee, who's now gotten famous as part of the Obama administration, later told me that they
00:41:30.340
went through the list of people in the class and tried to decide who was least likely to succeed.
00:41:35.140
And I was the unanimous choice to be the worst one. But I had a great approach,
00:41:40.500
which is I understood that you had to create research. You had to go from being a consumer
00:41:45.320
of knowledge to a producer of knowledge. And I also, I have to say, still
00:41:51.100
wasn't internally motivated. So what I did was I said, well, look, who's at the top of the
00:41:56.180
hierarchy? I might as well try to be that guy. And it turned out that that was macroeconomics
00:42:01.360
at MIT at the time. So for the first semester, I tried to be a macroeconomist. And it became so clear
00:42:08.880
to me so quickly that I was never going to be a macroeconomist because I always thought that nobody
00:42:13.720
had intuition for macroeconomics until I started talking to some of the people at MIT who actually
00:42:17.980
understood, oh, if the exchange rate between the euro and the dollar goes up, then that's going to
00:42:24.100
trickle through. And how is that going to affect immigration? Like, oh, my God, like it made no
00:42:27.900
sense to me. I was smart enough. I've always been smart enough to fail quickly. So I realized I
00:42:32.360
couldn't be a macroeconomist. The theorists were number two. And so I tried to be a theorist. And I
00:42:37.640
actually spent a little longer being a theorist. I actually wrote a couple of papers being a
00:42:40.580
theorist. And again, it became really clear to me that I didn't have what it took to be a theorist.
00:42:46.020
So then I said, well, kind of the only thing left is doing data analysis. And it turned out I should
00:42:51.460
have been smart enough to know that from the beginning because I was a weird kid. The only
00:42:55.620
thing I liked to do was sit in my room and essentially study data. Like, as I look back on it,
00:43:02.140
this is how weird I was. When I was maybe eight, I asked for and received a pocket calculator as my
00:43:09.200
birthday present. And I was overjoyed. When I was maybe like nine, I graduated to a scientific
00:43:16.920
calculator. And my pastime, because I was into baseball, was that I would go through and, by manually typing
00:43:24.420
into a scientific calculator, type in a column of, like, wins for a team. And then one by one,
00:43:30.600
I would type in each of the statistics like the team batting average or the team number of triples
00:43:36.060
and compute the partial correlation between those two. And when my dad
00:43:40.460
came home from work, he would ask me what the correlations were between the variables I computed
00:43:45.360
like that was how I spent my time. So it should have been obvious to me that's what I want to do.
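The kind of number-crunching he describes, running a column of team wins against a column of some team statistic, amounts to computing a correlation coefficient. A minimal sketch, using invented season totals rather than any real data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length columns of numbers,
    the kind of figure you could grind out keystroke by keystroke on a
    scientific calculator."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Illustrative (made-up) numbers: team wins vs. team batting average
wins = [95, 88, 81, 76, 70, 64]
batting_avg = [0.279, 0.271, 0.265, 0.262, 0.255, 0.250]
print(pearson_r(wins, batting_avg))
```

A true partial correlation would additionally control for a third variable; with just two columns, the plain correlation above is the computable quantity.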
00:43:49.740
And the other thing I did, I would use data to try and answer real problems. Like when I was in college,
00:43:53.600
I became obsessed with the racetrack and I would try to, you know, use data to win at the racetrack,
00:43:58.780
which wasn't very successful. But it should have been obvious to me that I should have
00:44:02.260
gravitated to doing empirical research. But but I didn't because I was still caught up in this idea
00:44:06.380
of, well, of status. So finally, I just said, look, I'll do what I'm good at. But honestly, I was
00:44:11.420
so far behind the people around me. It was an interesting experience. It's kind of a sideline,
00:44:17.840
but it's interesting. I'd been used to being really good at most of the things I did
00:44:23.140
academically. And it happens for almost everyone where you get to a point where you look around and
00:44:28.300
you realize you're the dumbest person in the room, and how you deal with that is really important.
00:44:33.960
And strangely, I don't know why that was roughly the greatest feeling I've ever had in my life.
00:44:38.860
When I looked around and I was the dumbest person in the room, it was just a sense of joy. It was really
00:44:44.120
interesting. Somehow, I think maybe I had felt pressure my whole life. Once I was the dumbest person
00:44:49.000
in the room, I felt like, wow, I can do whatever I want. It just
00:44:55.540
was such a joy to be around such talented people. And it freed me up to be myself. And instead of
00:45:01.940
always trying to follow the crowd, I started doing what I liked. And what I thought, well,
00:45:06.180
what do I like? And I said, well, I don't know. I like watching the TV show Cops. I love to show
00:45:11.460
Cops. I watch it every day. Like, well, why don't I do research on it? And so I somehow like found myself
00:45:17.100
and I began just studying the things I liked, even though no other
00:45:24.160
economists were really studying those questions. I really thought it was a one-way ticket
00:45:29.060
to getting a PhD and then doing something outside of academics. Because, you know, why would it be
00:45:35.180
the case that, if these were topics no one else researched, anyone would be interested in it? But I
00:45:40.260
knew I couldn't compete on a level footing with these amazing people around me.
00:45:44.520
So I had to compete in my own space. And to my shock, people liked it, you know, and they
00:45:49.540
were excited about it. It got published in good journals. Honestly, I look back and I say I'm
00:45:53.880
so lucky to have been so far below the bar that I didn't try to do what I think most people do and
00:46:00.980
what I had a tendency to do, which was to try to figure out what I should be because that's what
00:46:04.420
other people are. And I just was myself and it was such a great lesson. And I've really practiced
00:46:08.400
being myself ever since. Look, and sometimes it works and sometimes it doesn't. But the great thing
00:46:13.040
about life is when things don't work, you can take a different path. And so I just try to be
00:46:17.520
myself, and I stop doing stuff when being myself isn't a good recipe for it.
00:46:22.640
So when you got to Chicago, that must have been another difficult decision because Chicago is a
00:46:27.360
powerhouse in economics, right? You could have taken your sort of offbeat approach to economics and
00:46:33.560
gone someplace that is where you're not going to be surrounded by a bunch of Nobel laureates. So why
00:46:40.120
did you I mean, were you so secure in your off the beaten path approach that you said, hey, I might as
00:46:46.480
well go to another cold winter city and be around smart people, but not have to compete with
00:46:58.680
them. I mean, what was the decision like there? I had a lot of choices and I loved Cambridge and I'd
00:46:58.680
been at Harvard and MIT. And really, what had happened is I'd given a couple of presentations at
00:47:04.980
Chicago. And I had been shocked at how different the questions I got were. And I really went to
00:47:13.020
Chicago because I wanted to get to know the enemy at that time. So Gary Becker is one of the most
00:47:19.600
influential economists of the last 50 years and he's since passed away, but was a real mentor to me.
00:47:24.900
He was demonized. I mean, when I first met him, I expected him to be a monster and it turned out,
00:47:29.480
well, he wasn't. He was super smart and he asked great questions. And so I very self-consciously
00:47:35.300
said, I'm going to go to Chicago and I'm going to get to know the enemy from the inside for a couple
00:47:41.400
years before I go back to Harvard. And what I hadn't anticipated is that the thinking that goes on
00:47:47.500
at Chicago, it looks like it's really simple. It looks like one could learn it. I thought, look,
00:47:53.480
I'm good at economics. I can learn this. But I tell you, even 25 years later, I'm still not very good
00:48:00.560
at the kind of deep thinking that is done in Chicago economics. And I love it. And I think
00:48:06.560
it's powerful. But I'm still more of a spectator than an actual doer of it. I really got co-opted into
00:48:12.860
the spirit of Chicago. But in some sense, I will say that I've always been probably unreasonably
00:48:20.580
overly confident in my own ability. And I was very much at ease. I was going to do what
00:48:29.560
I was going to do. And I thought I could do better at Chicago than anywhere else. But I will say
00:48:34.200
it was unusual. So at that time, I think it must have been seven or 10 years since someone
00:48:41.980
who had a lot of options had chosen to go to Chicago economics over another place, including the
00:48:48.040
Chicago Business School, which was really getting most of the talent. And one of my advisors, so I
00:48:52.760
told one of my advisors at Harvard, I was going to go to Chicago. And he said, if you do that, I will
00:49:00.540
never speak to you again. And I said, why? And he said, because if you do that, it shows that you are
00:49:07.180
so effing stupid that you're not worth me wasting my breath. And it was interesting. That was how weird
00:49:13.360
it was to make that decision. But I did it. And he was happy to talk with me, you know, never stopped
00:49:18.000
talking to me. And I think he would agree. I probably did the right thing. It's funny now,
00:49:22.360
because it used to be there were these enormous differences between departments. Now, I think
00:49:26.460
they're all kind of the same: Chicago, Harvard, MIT, Stanford, Princeton. I mean, yeah,
00:49:29.960
there are all these great departments. We've all blurred the lines together, not so
00:49:34.440
different. But at the time, there really was a distinctive personality
00:49:39.880
that for me was really, you know, really special. I'm so glad I did it. And I think
00:49:44.560
I've been a huge beneficiary of it. Do you think the field is better now that the elite 10 programs
00:49:53.160
have become more homogeneous? Or do you think it did more for the field when you had these very
00:49:58.740
distinctive schools of thinking and Stanford had its way of doing it and Chicago had its way and Harvard
00:50:03.540
had its way? Such a hard question. I remember maybe five or 10 years into my coming to Chicago,
00:50:13.680
Milton Friedman came back and he was bemoaning the fact not just that Chicago was becoming more like
00:50:21.520
other places, but really that what was distinctive about Chicago, what's called Chicago price theory,
00:50:25.900
was basically dying out and how awful that was. And one of my colleagues, super smart colleagues,
00:50:31.700
Casey Mulligan said, Milton, I thought you believed in markets. Sounds to me like price theory is
00:50:37.960
losing. And I thought, wow, that is exactly right. And even Milton Friedman in the end didn't really
00:50:43.460
believe in markets when markets moved against him. It sort of used to be that way in medicine,
00:50:47.860
right? There was a very clear line between East Coast medicine and West Coast medicine. And there was a
00:50:55.720
very different way cardiac surgery was done in Minnesota versus Stanford versus Boston. I mean,
00:51:03.120
those were so different and they produced remarkable innovations.
00:51:08.460
I think you're right. Like, I think this diversity of approaches is useful because number one,
00:51:13.780
there's data, right? Let's take something like cardiac surgery, where you
00:51:18.280
get real data. Look, you know whether people are living or dying. And so you actually can figure out
00:51:23.040
whether one is better than the other. So in that world, for sure, I think this diversity of views
00:51:27.800
is really, really important. Now you might say, look, it doesn't have to be across departments.
00:51:32.220
It can be within department, right? You have one brilliant surgeon innovating in this way or that
00:51:35.940
way. But I do think in general that diversity of views is good. And it's especially good
00:51:41.960
in a dynamic world, right? Because in a world that's static, it kind of doesn't matter as much.
00:51:48.120
But when something radical happens, then often one model is much better situated than the other
00:51:55.320
to deal with whatever radical occurrence happens, some new disease or some, I don't know,
00:52:00.420
maybe radical things tend not to happen so much in medicine. But, you know, sometimes radical things
00:52:04.860
happen in the economy. Again, I don't know how interested your listeners are in macroeconomics,
00:52:09.940
but I think macroeconomics is a case where Chicago-style macro more or less won.
00:52:15.000
And so all of the world of macro now looks a lot like what was going on in Chicago
00:52:19.740
40 years ago. And I think that's been a problem because I think there isn't this diversity of
00:52:27.140
thinking that puts you maybe in a better situation as different kinds of macroeconomic problems arise
00:52:34.020
to have a range of possible approaches and solutions to it. So I think it probably would be
00:52:41.500
better if there were greater differences between the departments. But it's one of these things
00:52:47.700
where like the facts of life are that it's impossible to maintain that equilibrium because
00:52:52.480
there are all sorts of private forces that are pushing for this homogeneity, and there's no
00:52:59.460
easy way to fight it in some sense. And so even though you wish it would happen, it's hard to see how
00:53:05.560
Let's go on to talk a little bit about a colleague of yours outside of the world of economics,
00:53:11.980
Dubner approached me to write an article about me for the New York Times. This is
00:53:16.920
after I'd won the Clark Medal, but well before Freakonomics. And I really was incredibly hesitant
00:53:24.600
to accept the invitation because I just didn't, I didn't really like to be written about. There was
00:53:29.360
no, I didn't have anything to sell. So I didn't really, I wasn't in the business of trying to market
00:53:33.640
myself. In the end, though, my mom really likes it when I'm in the newspaper and on TV
00:53:40.160
and stuff. And so I really remember thinking, oh God, I'll take this hit for my mom because it'll
00:53:44.540
make my mom so happy if there's a piece about me in the New York Times magazine. And then Dubner came
00:53:49.500
out and it was unlike any experience I'd ever had with a journalist. I honestly think he had read
00:53:56.380
every academic paper I'd ever written. I mean, I'm talking about like 50 papers.
00:54:00.460
And he came out and he ended up interviewing me for, I don't think I'm exaggerating if I say maybe 25
00:54:07.980
hours over three days. And there was never a silence. He would ask me a question. I would
00:54:16.160
answer it. And as soon as I answered it, he would ask me another question. And this went on for 25
00:54:21.440
hours. And it was unbelievably painful to me because it's like, it was the last thing I wanted to do.
00:54:25.900
And I thought he was going to stay for three hours, not 25. And one of the notable things about it is
00:54:31.240
that I literally did not ask him a question for the first 24 and a half hours. And I only thought
00:54:38.840
of him as like this parasite that was like sucking the life out of me. And after 24 and a half hours,
00:54:45.100
I actually had the thought to say, well, like, who have you written about before? And he started
00:54:49.960
telling me these fantastic stories. He was the only guy who had been able to interview the Unabomber.
00:54:54.380
And he had been in a rock band, like, and he was really interesting. And I learned something there,
00:54:59.180
which is like, especially when you're tired of being asked questions, ask the questions yourself.
00:55:03.820
It's almost always more interesting to ask questions than to answer them. But still we parted. And I
00:55:08.860
would have said I would never see that guy again. I mean, we were not friendly in any way. There was no
00:55:14.340
like meeting of the minds or anything. But he did write a piece about me in the New York Times that
00:55:20.500
people loved. And he gets so pissed at me because I always say like, he created this,
00:55:24.020
this personality about me as this incredible wonder boy genius who you give me a problem,
00:55:29.720
I type away at my computer, I solve it an hour later. And people love that persona,
00:55:34.580
even though it was completely and totally off. I mean, it wasn't right at all. But look,
00:55:39.060
I've been riding that for the last 15 years. So I can't complain. Like I've milked it for everything
00:55:44.060
I could. And people loved it. And then, not through an easy path, we ended up deciding
00:55:50.300
to write Freakonomics together, not because we had some passion to tell our story, but just because
00:55:55.260
we both wanted to make some bucks and the publishers were willing to pay us to write this book.
00:56:00.300
And in the end, we had an amazing agent, Suzanne Gluck, who I interviewed for my podcast,
00:56:05.360
People I (Mostly) Admire. And like, you know, we relived that story, but she just, like, totally out
00:56:10.240
negotiated the publishers. And there we were with a super lucrative book deal
00:56:16.160
and no idea at all what we were going to write about. We like had no conception of what this book
00:56:20.980
was going to be. And I think part of the fun of it was, so we both assumed that we had just pulled
00:56:26.340
off like a bank heist and that like no one was ever going to read this book. And so we could do
00:56:32.000
whatever we wanted because it wasn't like my colleagues cared what I wrote about this book that
00:56:37.440
no one would read. And he was, like, a serious, you know, memoirist, and, you know, people would
00:56:42.620
understand he was just doing this to make some cash and they wouldn't hold it against him.
00:56:45.960
And it freed us up, I think, to write a book that was very different than what we would have written
00:56:50.280
if we actually had the fear that people were going to read it and judge us based on what we wrote.
00:56:54.760
So we had a lot more fun with it than we would have otherwise. And we kind of broke a bunch of rules
00:57:00.380
and we were super lucky, I think, to be in the right place at the right time
00:57:03.940
to have a book that ended up selling a whole bunch of copies.
00:57:10.000
I don't know. I have no idea. 05, I think probably 05.
00:57:15.420
I'm so good at dates. Usually I can almost remember. I mean, I remember reading it as soon
00:57:20.180
as it came out. I still remember where I first learned about it. I remember it's so odd that I
00:57:25.820
would remember this, but I was in the OR waiting for a patient to wake up, you know, waiting to come out
00:57:31.360
of anesthesia. So, you know, we'd finished operating and it was, you know, I was just sort of writing
00:57:35.020
the orders to get ready to take the patient to the recovery room. The chief resident said,
00:57:41.140
I just read this book, Freakonomics. You've got to read it. It is unbelievable. It's totally
00:57:47.220
incredible. And he just started raving about it. So I just went and picked it up and I couldn't put
00:57:53.780
it down. I mean, it was just, and it was very unusual for me to read anything outside of medicine.
00:57:58.480
You know, it was very hard for me to make time to do anything that wasn't immediately related to
00:58:05.120
sort of what I needed to do for work. So I'm trying to think which one was my favorite in there.
00:58:11.160
Which one did you guys get the most blowback on? Was it the seatbelt one, the car seat one? Did that,
00:58:16.120
did that create the most blowback? You know, we literally had like three lines on car seats.
00:58:21.100
We had like one paragraph on car seats, and that probably did create about as much
00:58:25.960
negative feedback as anything. Strangely, we really thought abortion was going to be the
00:58:30.840
lightning rod and everyone was going to get upset about it, abortion and crime. But
00:58:33.860
the backstory is that when I couldn't control the story, like when the media reported on my academic
00:58:41.480
results about abortion and crime. Which by the way, that was in the late 90s, wasn't it? Or early
00:58:45.760
2000s? Yeah, 2000, 2001. Yeah, in that decade. And we had no control over the way it was
00:58:52.160
portrayed. It was deeply misportrayed. It was simplified. And it was a nightmare.
00:58:56.620
It just triggered hostility from all corners of the right and the left. And it just kind of made
00:59:03.520
it look stupid because it was easy to parody it. Because if you didn't understand how simple and
00:59:10.420
obvious the things we were talking about were, and how powerful they were in the data, it kind of looked
00:59:14.040
like we were crazy people. And in part because, look, nobody talks publicly about abortion unless they
00:59:19.660
have a stake in it, right? So all of the conversation about abortion is either stridently
00:59:25.300
pro-life or stridently pro-choice. But like, we didn't have a stake. We were just doing something
00:59:29.600
really different, which is we said something simple, which is abortion was legalized in the US.
00:59:35.540
And it turns out that there's this really strong relationship with unwantedness: if a kid is
00:59:40.980
unwanted by his or her parents, he or she tends to have a hard life. And after abortion was legalized,
00:59:47.940
there appears to have been a dramatic decline in the number of unwanted children that were born.
00:59:52.540
And so just like in a simple empirical like statement of fact, fewer unwanted children
00:59:56.340
with legalized abortion should lead to less crime 18 years later. It's like, gotta be true. It's not
01:00:03.020
hard to see why that's true. It's just a matter of, is it a big number or a small number? And the
01:00:07.800
back-of-the-envelope calculations sort of said it could be a really big number based on other studies of unwantedness.
01:00:12.600
And when you looked at the data, it's not perfect. We obviously didn't have a randomized experiment.
01:00:16.780
We didn't get to choose who did and didn't get abortions. We looked at things like
01:00:21.240
what happened after Roe versus Wade. In some states it was easy to get an abortion.
01:00:26.440
Some states it was hard. And then you look 20 years later and you see that, well, for the first 15
01:00:31.320
years, the crime patterns look really similar between those states. And they only started to diverge
01:00:35.220
once the kids who were exposed to legalized abortion were old enough to, you know,
01:00:39.740
start committing crimes. So like, you know, it was a really simple, straightforward paper that
01:00:44.460
almost anyone could look at the data and make some sense of whether what we were doing was
01:00:49.180
reasonable and sensible. And so when we actually got to tell our story in Freakonomics,
01:00:54.840
nobody complained. And what was really interesting, just not just from a data perspective,
01:00:59.600
but from a storytelling perspective, is it was one of the great successes of storytelling
01:01:05.060
in that people who were pro-choice read that chapter and would pat me on the back
01:01:10.960
and say how amazing, like what a great chapter that was and how it pushed the pro-choice agenda.
01:01:16.940
And people who are pro-life would slap me on the back and say, you know, I'm glad someone finally
01:01:21.780
told the truth about pro-life. And it was interesting. The exact same chapter was read totally differently
01:01:26.800
and everybody liked it. It was, I got to say, of all the things I've ever done,
01:01:30.800
that was one of the weirdest and most unexpected things that ever happened is that we wrote something
01:01:36.040
that everybody liked when we expected everybody to hate it. So Freakonomics, we wrote all these
01:01:41.720
stories and people just liked it. And, you know, we kind of had this thing where we offended lots of groups.
01:01:47.700
Like we talked about how real estate agents were like the KKK, but like in a lighthearted way that
01:01:54.380
even most of the real estate agents didn't get that mad at us. And I got invited to speak at the
01:01:59.360
National Realtors Association. So it was like, everyone's like a good sport about it.
01:02:02.900
Then when we wrote a second book, people got super pissed off because I guess we hit a little
01:02:08.680
closer to home with stuff like climate change. Then we wrote a third book. And by that time we
01:02:13.060
had alienated everyone. It turns out that like there was someone-
01:02:18.740
Yeah. And so by that time there was like every person on the right and the left thought we were
01:02:24.440
jerks by the time we were done with the third one, which is more or less why we stopped writing books
01:02:27.900
because we had gone from, oh, these are these fun-loving economists who poke fun at everything, to
01:02:32.620
like, those jerks, they offended me deeply, and I'm never going to buy another book from them
01:02:37.420
again. And then I remember one night we were having dinner. This was at my house in San Diego years
01:02:43.480
ago, maybe six or seven years ago. And I think you were toying with the idea of writing a fourth book
01:02:50.080
about like the Freakonomics approach to golf. And you really spent the entire night trying to convince
01:02:55.780
me to be the protagonist of that book. Do you remember this discussion?
01:02:58.760
Oh my God. Yeah. So I actually wrote large chunks of a book about the Freakonomics of golf
01:03:04.940
and liked it and enjoyed doing it, because I love and loved golf. And I wasn't that good at it,
01:03:13.140
but at the age of 40, I really in a semi-serious way dedicated myself to becoming a professional
01:03:20.020
golfer and never got nearly good enough, but improved a lot. But I knew I couldn't do it
01:03:25.200
based on physical talent. I knew I had to do it based on being smart and using data in a particular
01:03:31.060
way. And so I've developed all of these sets of tools that I think could be really helpful to a
01:03:38.020
golfer. And what I needed was I needed someone who had amazing physical talent and was a complete
01:03:46.140
maniac, in the sense that I know you are, like, willing to put 10,000% into anything that you
01:03:51.760
started. And I thought, wow, what a great chapter would be to take someone who's never golfed before
01:03:56.720
and to see how good that person could get in a year. And you were my number one
01:04:05.020
We talked about it seriously. I mean, I really did consider it just out of curiosity,
01:04:08.960
which was if I devoted one year to this, like I had devoted one year to swimming,
01:04:13.580
like how far could you get in a year if you were willing to put in two to three hours a day
01:04:18.640
of very deliberate practice, especially with your guidance, which, as I understood it,
01:04:24.140
was going to involve a lot of shortcuts. But in the end, I think my fear was just the addiction
01:04:29.020
that would come of that. Like golf is one of those things that just, it could consume someone like me.
01:04:34.320
It would be a terrible waste of your talent as well. I did eventually take on one student
01:04:39.360
and it's a funny story. So Larry Summers, who was the secretary of the treasury and the
01:04:46.140
president of Harvard and whatnot. So he came to Chicago one time to give a speech.
01:04:52.220
And this is, you know, not too many years ago, when he was, and is, incredibly eminent.
01:04:57.500
And like, I got an email that was addressed to like five Nobel laureates, the president of the
01:05:04.700
university, and to me, demanding our presence when he came, to entertain him or, you know,
01:05:12.040
like, I don't know, so he could hold court or whatever. And I was completely befuddled
01:05:16.880
at how I got on this list of these folks. So I showed up, I was maybe the fourth one to arrive
01:05:23.060
and he was deep in conversation with a couple of our Nobel laureates and the president. And he literally
01:05:28.320
like stopped. I walked in the room and he like broke off the conversation and he said,
01:05:33.500
sorry, gentlemen, this is who I need to talk to. And I'm like, wow, like my stature in the
01:05:37.800
profession of economics is really going up. And he says, Steve, I mean, I don't even really know
01:05:42.980
him. I've talked to him like twice in my entire life. He says, Steve, I have been needing to talk
01:05:47.800
to you. I'm like, wow. I'm like thinking, what, what, what am I going to tell Larry Summers? And he
01:05:52.260
says, I've heard that you're the one guy in the world who can get my handicap from 20 down to five in
01:05:59.140
golf. How are we going to do that? And it was hilarious that the thing he cared about was
01:06:04.540
golf. And so I had failed with you, Peter. I thought, okay, what a great chapter this will
01:06:09.700
be. I'm going to take Larry Summers from a 20 handicap down to a five. And he turned out to be
01:06:15.900
the worst student that ever existed on the planet. He, like, wouldn't do anything. I sent him all
01:06:22.120
my stuff. You know, we were going to work out plans for talking, and, like, he wouldn't
01:06:27.200
practice. He was the exact opposite. Like the reasons I wanted you to be my student were the
01:06:31.980
exact reasons that Larry turned out not to be a very good student. Let's hold on to this. There,
01:06:36.680
there may be a day, you know, I don't know, maybe when my kids are in college or something and I have
01:06:40.840
a little bit more bandwidth, I might be willing to pick this up, but I, I, you, you have me very
01:06:45.100
intrigued by this. The title, Freakonomics, you got that from your sister, right? I did. So my sister
01:06:51.800
who unfortunately has since passed away from cancer, she was a force of nature. She, you know,
01:06:57.340
it's funny when you grow up, you're around a relatively small set of people and you kind of
01:07:02.880
gauge your own stature in the world relative to the people around you. And my sister was so amazing
01:07:08.640
that I always thought I must be like below average in terms of creativity and things because like
01:07:16.380
relative to her, I was awful, but she was also odd in the sense that she didn't really function that
01:07:23.760
well in the world. Like she wasn't that successful in traditional metrics because she was kind of,
01:07:30.280
she was a little bit like Robin Williams. Like, like the person most like her that I've ever seen
01:07:35.100
is Robin Williams, in that she was incredibly creative, but, like, I don't know, didn't
01:07:39.700
necessarily fit in that well with how the world worked. So anyway, we tried to write this book
01:07:43.480
about nothing, right? So this was a book where we, we didn't have a theme and we were just,
01:07:48.620
you know, we didn't have a title. That's for sure. We had the worst set of titles.
01:07:52.080
And I knew immediately, my sister was the only one who would come up with a title for this book.
01:07:55.980
And so I told her what we were doing and she came back a half an hour later with, I would say,
01:08:02.060
10 titles that were better than any title we had come up with. But she basically just said,
01:08:06.300
look, but the title of the book is Freakonomics. And I'm like, you're right. The title of the book
01:08:10.340
is Freakonomics. And even Dubner was like, okay, yeah, Freakonomics, that works. So we went to the
01:08:16.360
publishers and they just flipped. They're like, we paid way too much for this book to call it
01:08:22.080
Freakonomics. They like refused the title. They fought it. And it was months of back and forth
01:08:27.360
before they grudgingly said that they'd call this book Freakonomics.
01:08:30.640
What were they advocating for? What were some of the other titles? Do you remember?
01:08:32.860
Oh my God. You know, I always forget it. I think one of the leading
01:08:40.560
candidates, just to show you how badly off we were, was E-Ray Vision, like where E stood for
01:08:46.620
economics. And I think that title had not been ruled out before Freakonomics. But in the end,
01:08:54.880
I honestly think this same book with a different title might have sold no copies. Life in general,
01:09:01.720
publishing specifically, there's just a huge luck component to it. Like, I think with our book,
01:09:08.480
little things happened that made all the difference. We got a good review in the Wall
01:09:12.600
Street Journal. I went on the Daily Show with Jon Stewart and somehow he like made it seem really
01:09:19.180
cool. And like little things happen with books that kind of determine whether they fly or they
01:09:23.860
don't fly. And I think if we ran back time and put this book out again, I think
01:09:31.420
99 times out of 100, it probably wouldn't have sold very many copies. We were just, like, super
01:09:36.860
lucky in a lot of things that happened. I mean, I think it was a good book. I mean, I liked the
01:09:40.700
book. I'm not trying to say it's a bad book. But what is really amazing is if you actually sit down
01:09:45.060
and read books. And so people send me books all the time because they want me to blurb them.
01:09:49.180
It's incredible how good these books are. Like, you know, random books that I expect to be awful.
01:09:54.320
Well, there's so many good books and look, nobody reads them. So many good podcasts that nobody
01:10:00.660
listens to. But it's just really hard in the clutter of this world to break through. And I
01:10:07.140
think the tightness of the correlation between how good something is and how much attention it gets
01:10:12.100
is much lower than I think most people give it credit for. Steve, something you've been talking about
01:10:17.040
quite a bit lately. We spoke about it when I was on your podcast. I've heard you speak about it on
01:10:21.100
other podcasts, is kind of our shared appreciation of mental health and how underappreciated it is.
01:10:29.780
How long has this been something that's kind of been, at least to you, as important as I think it
01:10:35.020
is now? Is this recent or is it something that's always really mattered and you've only kind of come
01:10:39.380
to appreciate how underutilized these tools are? Look, I was raised by a father who was like,
01:10:50.220
no-nonsense, and the idea of therapy, I mean, such an embarrassment. Oh my God. I mean, like no real
01:10:57.420
man would ever think about it, certainly not crying. Like, you were not allowed to cry.
01:11:03.760
You had a grandfather that committed suicide. Was it your father's father or your mother's father?
01:11:08.780
My father's father. No, but it wasn't a mental, like, my grandfather's suicide was not a
01:11:13.800
mental health issue. My grandfather, who loved life more than anything and was, I don't know,
01:11:19.360
maybe 90 years old, his wife, who he loved dearly, had terminal cancer. And so he decided to commit
01:11:27.420
suicide, not out of unhappiness, just because he had lived a great life and he was satisfied and he
01:11:33.040
didn't want to be a burden on people. So it was actually in some sense the opposite. It was, in many
01:11:37.980
ways, he was extremely far along in the mental health domain because he like accepted death as
01:11:42.760
being a natural part of the cycle of life and was not troubled by it. I think honestly, for me, the
01:11:47.760
big change was I got divorced and I met my wife, Suzanne, who is much more in tune with these issues
01:11:58.000
and got me thinking about them in a way that I never had before, whether it's like kind of
01:12:05.420
more esoteric stuff in kind of the spiritual domain or purely about mental health. And I somehow shook
01:12:13.180
off like a lifetime of teaching, which told me that mental health was a joke and you just got to like
01:12:19.500
fight through it and came to appreciate it. You know, seven years ago, really, is when I started
01:12:23.260
thinking about these issues. And I think you're right in saying that I've come to believe now that
01:12:30.000
they are dramatically more important than our education system or our society has traditionally
01:12:38.360
given them credit for. You've spoken about this a little bit. I mean, you've spoken about it with
01:12:43.780
me and you've explained that you'd be even comfortable talking about it today. But one of your children
01:12:49.680
struggles with an eating disorder. How hard has that been for you? And how have you sort of navigated
01:12:56.760
your appreciation for mental health with both her struggle and your struggle? Because I think that's
01:13:03.820
hard for both of you in different ways, obviously. You know, it's interesting that I, if there's one
01:13:08.840
thing I'm really good at, it's, I forget the serenity, the serenity prayer. I'm really good at
01:13:15.000
understanding the difference between things I can and cannot control and not worrying very much about
01:13:21.200
the ones I can't control. And if there's one thing that was clear to me that I could not control
01:13:25.740
directly, it was my daughter's eating disorder and that no amount of trying to push or pull or
01:13:32.440
whatever was going to do it. So it was weird. I approached it with a real calm. I mean, it was
01:13:38.540
awful to watch, but I didn't struggle in the way that I think most parents struggle, which is the feeling
01:13:45.800
of like, I should be doing something about it. Look, these are often lethal. And she was incredibly
01:13:52.260
accomplished as an anorexic. She was extremely effective, you know, at torturing herself.
01:13:58.860
And in a weird way, I think that calm was helpful in letting her be comfortable with me about it. And
01:14:10.180
eventually, I think I played some very small role in her recovery. I mean, obviously 99.999% of
01:14:16.900
recovery was her own. It was odd. I mean, that's probably not the answer you expected me to give,
01:14:21.580
but in a strange way, it just was not a struggle. Maybe more generally with my adult children,
01:14:27.500
I really am maybe better than most parents at understanding that they're their own people
01:14:33.680
and that they live their own lives and that I'm just an observer and I can offer advice,
01:14:41.560
Where do you think that transition takes place age-wise? I mean, a friend of mine,
01:14:46.340
Rick Elias, who was on this podcast once said something that has stuck with me. I don't think
01:14:51.180
a day, maybe two days would go by that I don't think of what he said, which is raising children
01:14:56.480
is playing a game of tug of war that you have to lose. And he said it more eloquently, but it was
01:15:02.600
like you lose it gradually, obviously, right? So when you're holding the tug of war with your five-year-old,
01:15:08.860
you're still winning. And with your 13-year-old, you're really starting to slip. And by the time
01:15:14.660
the child's 18, it's their rope, right? Or whatever. You seem to have navigated that better
01:15:19.440
than most perhaps. I had a son who died when he was one and that was far and away, like a thousand
01:15:25.920
times harder than anything I've ever experienced. In part because your real job as a parent in some
01:15:34.040
sense, your biggest job is to keep your children safe. And it was so difficult for me to navigate
01:15:40.280
the knowing that I had not kept him safe. And he was my first son. And six kids later,
01:15:48.240
I think that really affected my parenting. And he didn't die because I was abusive or neglectful.
01:15:56.380
He died because of a terrible disease, meningitis, that came out of nowhere and I couldn't protect
01:16:03.920
him from it. And I think there's one of two ways you can go. You can become incredibly fearful.
01:16:09.480
Like I just happened to watch Finding Nemo yesterday with my kids. And so you can be like
01:16:14.720
Marlin, the dad in Nemo, who becomes incredibly protective and doesn't want Nemo to do anything.
01:16:19.100
Or you can just understand that the world is one of uncertainty and of loss and of risk.
01:16:27.980
And I really, for whatever reason, I went that second way. And I just said, look, I can't control
01:16:32.580
the world around me. And I'm just going to do the best I can knowing that that's true.
01:16:38.160
So even from a very young age with my kids, I have a lot of acceptance of their autonomy and their
01:16:44.800
independence and the fact that I can't mold them the way I'd like to or shape them.
01:16:51.300
So even with, look, do I want my four-year-old to do what I tell her to? Do I get super pissed off
01:16:57.620
when she doesn't go to bed when I tell her? Yes. But on the other hand, I put an enormous value
01:17:03.980
on that autonomy and that independence. And when I'm calm and quiet, I'm more or less,
01:17:10.780
for better or worse, putty in her hands and will do essentially whatever she wants
01:17:14.080
because I think a big part of her growing up in life is learning to have control over situations.
01:17:20.500
And what better way for her to learn about control than to be able to have some control
01:17:25.060
in her interactions with me, which is totally different than the way I was raised, I think,
01:17:28.560
and the way that maybe I would have raised kids absent the death of my son, Andrew.
01:17:35.160
So how deliberate was that process when he died that you made what sounds like a very conscious
01:17:43.580
or even a subconscious decision to accept it and move on without, as you said, basically,
01:17:52.020
well, look, I'd back up and say the following, right? I mean, I think one could say that many
01:17:57.560
parents who lose a child are never the same again. And I don't say that in a glib way. You're not the
01:18:02.620
same again. So let me rephrase that. No parent who loses a child is the same again. But you could
01:18:07.200
make the case that many parents are so damaged after that, that they're even externally never
01:18:13.380
the same again. Most people, I mean, I know this about you, but my guess is many people would
01:18:19.080
interact with you and never know that you've lost a son. So, I mean, what strikes me about it,
01:18:24.020
is that it's one of the core tenets of something called dialectical behavioral therapy, which I've
01:18:30.280
become very fond of, which is this idea called radical acceptance, which is super hard. But as
01:18:35.980
its name suggests, radical acceptance is basically just radically accepting things, not saying that
01:18:40.260
they're good. Radical acceptance doesn't mean it's good that my wife got cancer. It's I radically
01:18:45.780
accept that she got cancer. I work on this every day and it's the hardest thing in the world.
01:18:50.580
It's super hard. It's even hard with silly things, by the way. You know what I mean? Like it's even
01:18:56.280
hard with the most irrelevant things at times. So I want to understand that a bit more. How did you
01:19:02.100
possibly come to that type of profound radical acceptance of something so awful?
01:19:10.660
So it certainly wasn't conscious. It wasn't from deep thinking or anything. I just, I would say it
01:19:14.960
happened. And in many ways, I would say my hunch is that radical acceptance of big things must be easier
01:19:21.760
than radical acceptance of little things because it's so obviously beyond control. Look, my son had
01:19:29.480
died. There was definitely no undoing that. I have no real self-awareness of how it happened.
01:19:36.600
It did. And it wasn't, I didn't work at it. Let me put it that way. I didn't work at it. It wasn't
01:19:43.180
something like where I went to therapy and I came to eventually accept it. It just happened. I mean,
01:19:49.820
it was disjoint from the grief. I mean, the grief is one thing. The grief is real and is semi-permanent.
01:19:57.740
But how you behave in other situations, to me, that's just, you know, I don't know why,
01:20:03.460
but that's just how I reacted to it. If that makes some sense. It was really long before
01:20:11.280
I consciously invested any time, effort, resources into thinking about mental health or about more
01:20:21.520
spiritual things for lack of a better word. In the time that you've now come back and reflected
01:20:27.200
more heavily on these issues around mental health in the past, as you said, seven years,
01:20:31.680
have you learned anything new about that experience when Andrew died?
01:20:37.340
You know, I have literally not thought about it in those terms. I mean, obviously I think about
01:20:41.740
his death all the time, but I've never tried to make any sense of it. That's the answer. I've
01:20:48.580
literally haven't thought about the way in which I processed it. I haven't really talked about it. I
01:20:53.640
mean, other than talking to you right now, I'm not sure I've talked about his death and the
01:20:59.100
aftermath of it, but I've really never talked about, or in many ways thought much about, the
01:21:05.220
implications of it in a weird way. Probably seems strange, but it's true. What do you think would
01:21:11.200
be necessary to infuse this idea of self-care and mental health into education? You talked earlier
01:21:19.980
about how this might be even more important than education, which I don't think anybody would
01:21:25.140
argue is not itself important. How would you operationalize that? If you were education
01:21:30.440
czar, you know, whatever that might be, and you had a hold over the entire K through 12 system,
01:21:36.940
how would you infuse this type of thinking? Let me take a step back and just first make the case
01:21:41.040
for it, because I think that's worth doing. I think if we went back to square one and we thought
01:21:46.080
about what we should teach children about navigating the world, we would radically overhaul the
01:21:52.900
curriculum and like we'd change math and stuff, but much more fundamentally, I think we would try
01:22:00.080
to teach them the set of skills that will make them happy and able to resolve conflict with other
01:22:06.980
people and to just get along in the world. I think like, at least for my kids, I'm much more worried
01:22:13.380
about that than whether they're like really good at calculus. That matters less to me than that they have
01:22:18.740
good lives and they know themselves and they make good choices and all that. I think that wasn't
01:22:24.580
the spirit when we started school. And so our curriculum is very much built off of things from
01:22:30.000
a hundred years ago, and this wasn't in the air and maybe we didn't have the tools then.
01:22:34.700
So that's my case for why I think we should be teaching it. And that doesn't answer your question at
01:22:40.260
all about how to do that. It's just my belief that as a parent or as a school, our goal should be
01:22:45.900
that we raise well-adjusted kids who have a set of tools that are helping them cope with the world
01:22:51.100
around them. And that many adults like you and me have made a lot of investment in these tools later in
01:22:58.380
our lives. And maybe we could reduce the need for that if we had invested more earlier as a society in
01:23:05.360
that. How to do that? I have no idea. I mean, it's like impossible because it's impossible to change
01:23:10.120
anything. What's the problem? The problem is that time during the day is scarce in school. And every
01:23:16.520
minute you spend focusing on mental health or self-care, whatever, is a minute taken away from
01:23:22.920
something else. And that's really hard to do. The second thing is who in the world actually knows how
01:23:27.820
to do this well? I'm not sure I know how to do this well. And even if I did know a person who knew
01:23:35.700
how to do this well, do I know how to scale it? Absolutely not. I don't know what kind of training
01:23:41.520
you give teachers or who you'd hire. So look, I think it's very amorphous in my mind. It's a belief
01:23:46.800
that it's important. And let's just say we all agreed it should happen.
01:23:53.080
It will be a long, arduous process. Do I believe the right way to do it is for the Department of
01:23:58.380
Education to mandate, like we have for the Common Core in math, that we should have a Common Core
01:24:05.020
in mental health and that there should be these 17 things that are hit each year, you know,
01:24:09.720
different ones for each grade? Like, no. Obviously, it's the kind of thing where I think maybe you'd
01:24:14.540
want experimentation, right? So let a bunch of models run, see what seems to work, and grow into
01:24:19.640
it. I would be very much in favor of that, where we encouraged and subsidized school districts to try out
01:24:25.820
things and encourage innovation. But I think we're a long ways away from that. I had the privilege of
01:24:32.100
talking to the Biden transition team not too long ago about education. And I threw out three
01:24:37.880
or four ideas and they listened quite intently to each of the ideas, except for the mental health
01:24:43.200
ones. Like they could not get off that topic fast enough. It was like total silence in the room when
01:24:49.460
I suggested that as an important thing to do. Have you ever gone back and spoken with your dad?
01:24:54.380
Because your dad is still spry as can be. Doesn't he still work?
01:24:56.880
Yeah, my dad is still a practicing physician at the VA hospital.
01:25:01.160
So have you ever gone back to your dad and revisited this idea of mental health and
01:25:08.200
I haven't. You know, it's funny. So I'm actually thinking hard about interviewing my dad
01:25:12.900
for my podcast. It's funny. I feel like somehow doing it via podcast should be the opposite.
01:25:18.740
But I somehow feel like, well, if I'm doing it on the podcast, I can ask my dad all sorts of
01:25:22.800
questions that I couldn't ask him otherwise, which clearly, like logically makes no sense at all.
01:25:28.300
Why would recording something and playing it for hundreds of thousands of people put questions
01:25:33.520
on the table instead of taking them off the table? But in a weird way, I think somehow it would be
01:25:39.480
easier for me to ask my dad questions like this in the context of a podcast. We'll see what happens.
01:25:43.900
He's a very reluctant potential podcast guest, and I'm not sure he can do technology well enough
01:25:52.400
to hit the button to actually record it on his end. So we'll see whether it happens. But no,
01:25:57.520
I've literally never talked to my father about any of these issues ever. I've not talked to my father
01:26:03.140
about my sister's death or his father and mother's death. We're a family that doesn't
01:26:10.540
talk about stuff like that. It was a nice enough family. But the word love was never, never used in my
01:26:20.160
household, except with reference to the family dog. People love the dog, but you would never,
01:26:26.400
ever say you loved another family member. Now, how different is that with your current
01:26:30.420
children and your current family? Is that different? So I have self-consciously been very
01:26:34.720
different about that. I tell my kids and they tell me they love me every time we hang up the phone or
01:26:40.200
whatever, which I don't know if that matters for anything, but it is like, I don't do that many
01:26:44.300
things self-consciously in my life, but that is one that I very self-consciously tried to propagate
01:26:49.400
was the idea that you could express your feelings of love and affection in my family, the family where
01:26:56.240
I'm the father figure. So you're fond of decision-making. What's the best decision you've
01:27:02.260
ever made in your life? And what's the worst decision you've ever made in your life?
01:27:05.460
I'm probably going to give you a terrible answer to this. I think my worst decisions,
01:27:11.660
I'm too embarrassed to talk about. Okay. That's, that's fine. I probably wouldn't
01:27:15.580
be able to publicly admit my worst. But look, I will tell you: all
01:27:19.660
of my worst decisions have been about inaction, not action. I know that of the hundred worst
01:27:25.240
decisions I've made, I'm sure that 99 of them were not doing something as opposed to doing
01:27:29.900
something. There are very few really bad things I've done. By the way, I feel the opposite, Steve.
01:27:35.060
My worst decisions have been decisions of bad action, not inaction, but that's interesting.
01:27:40.900
And then what's the best decision you think you've ever made or a subset of the best decisions you've
01:27:44.980
ever made? I don't know if you'd actually call it a decision, but the decision to not care what
01:27:51.580
people think about me, it's not a decision in like, oh, I went back to school, I didn't go to school,
01:27:56.040
but I somehow just made a choice. I don't know if you'd call it a choice. I made a choice
01:28:00.440
to just not care what other people thought about me, which was a big choice for me because like most
01:28:06.280
high school kids, all I cared about was what other people thought about me. And at some point I
01:28:12.040
just broke with that and was happy to live with the consequences. And it's just a great way to live.
01:28:19.540
It just frees you up from so much burden. I think all my best decisions have relieved
01:28:27.720
burden. I'll give you another one, which is trivial in comparison, but important is I guess I used to
01:28:32.360
be super cheap. I used to worry about every transaction I made and like prided myself on
01:28:39.160
being frugal and thinking about, oh, you know, should I spend that dollar on a bottled water in
01:28:44.280
the airport or not? And at some point I just said, Hey, this is like, I could free up a lot of
01:28:49.580
mind space if I just didn't worry about decisions that were like under $5. And it was hard for me to
01:28:57.540
do, but I said, look, I'm going to stop. If it's under $5, when I start to go into this frenzy,
01:29:01.260
I'm going to forget about it. And then I moved it up to like 10 or 20 and like the highest, like for me,
01:29:06.560
it's like the higher, the better. And it is just so liberating for me that I just have this rule of
01:29:11.340
thumb that like, if it's under some ridiculously large amount of money, I just don't worry about
01:29:17.020
it. And look, it adds up. I probably, you know, waste tens of thousands of dollars, if not more, a
01:29:23.040
year because I'm not getting these decisions right. And that's awesome because I free up 10% of my time
01:29:32.420
to do whatever the hell I want. And I love doing the stuff I do. And it's just super liberating.
01:29:38.460
So I think many of the things I like are these things that have been liberating, where I stop
01:29:43.180
worrying about stuff. And maybe it's not that far off from your idea of radical acceptance. Like,
01:29:49.100
I don't know, it's like not radical acceptance, but it's like, it's just a total acceptance that
01:29:53.180
there's going to be these imperfections. I'm just not going to worry about getting things exactly right.
01:29:58.020
Do you think in general, we as a species are good at decision-making? Is there a way to evaluate this?
01:30:04.560
Good compared to what? I'm not sure. Good compared to optimal? No, we're terrible. I mean,
01:30:09.920
it's like behavioral economics of the last 40 or 50 years shows how bad we are at making decisions.
01:30:16.060
And I think the best evidence of this is the people who know the most about decision-making. So whether
01:30:24.160
it's these really prominent economists who've thought enormous amounts about backward induction and
01:30:32.340
optimized behavior in dynamic systems, or whether it's Danny Kahneman, who has spent more time
01:30:39.520
thinking about behavior than almost anyone. These are the worst decision-makers I've ever seen. I mean, Kahneman
01:30:44.780
even is like very forthright. He said, like, the reason I study decision-making is that I'm awful at it.
01:30:49.760
And with everything that he's learned, he's still awful at it. And so I don't think humans are good at
01:30:56.300
making decisions. And I think we can arm ourselves with tools. Like,
01:31:01.620
a lot of economics is really common sense codified into helping you make better decisions. I think a
01:31:07.840
fair amount of psychology and pop psychology is that as well. But look, there's no one who talks
01:31:16.060
more than me about how we don't quit enough, you know, and how people get stuck in bad situations.
01:31:24.180
I know that. And I have so much trouble quitting. I mean, it's stuff I should quit. I eventually do
01:31:30.880
quit it, but it takes me a while. On average, I know I should have quit it two years before I do
01:31:35.000
it. Look, and I understand all of the problems. I understand exactly what the situation is.
01:31:40.100
But to overcome the complexity of the brain and all these forces, it's super hard to make good
01:31:46.080
decisions. Do you think there's an evolutionary force behind that?
01:31:49.420
When we were evolving, the kind of choices you had to make were so different from the ones that we face
01:31:53.860
now that it's not a surprise that evolution would not necessarily be our friend in these kinds of
01:32:00.120
decisions. I mean, just, I don't know anything about it really, but my sense is that evolution
01:32:05.400
has not been our friend in the nutrition realm, right? Because we evolved in an environment of
01:32:11.240
scarcity and now we live in an environment of plenty, and all of the evolutionary
01:32:16.020
triggers we get are the wrong ones. Probably that's the same with decision-making, right?
01:32:19.740
We evolved in settings where we were facing life and death problems all the time. I mean,
01:32:26.620
I think one of the hugest problems that we face mental health-wise is trauma, and people
01:32:32.580
are stuck in a trauma mode and reacting to trauma, which evolutionarily is probably fantastic because
01:32:37.940
the kind of challenges we faced when we were evolving were short-term, incredibly intense
01:32:44.900
risks, risks that you needed to marshal everything in your body and your mind to fight. But now,
01:32:51.720
now those aren't typically the kind of risks or challenges we're facing. And so people end up
01:32:56.180
using the wrong set of tools; the body is just not configured well for the kind of chronic
01:33:02.480
settings we live in, of too many people, too much information, constant fears and threats all around
01:33:09.460
us. Like, I don't know much about it, but I think there's zero reason to think that
01:33:13.300
evolution is our friend when it comes to difficult, complex decisions, because we just didn't, like,
01:33:18.800
how complex could the decisions have been when we were evolving? I mean, I look back at, this is maybe
01:33:23.360
off track, but it was something that really struck me. I went to Australia for a while and was talking
01:33:27.480
with some guides as we went around Ayers Rock, Uluru, and talking about how basically the Aboriginal
01:33:35.240
life had hardly changed at all over the course of, I don't know, 5,000 years or something.
01:33:44.660
And it was interesting to think about: you don't need much innovation at all, right? If you
01:33:49.140
innovated even 1% per generation, over 5,000 years life would be completely transformed. So it was like,
01:33:54.820
there's literally no innovation going on. I mean, there are things like they didn't have the wheel
01:33:58.300
because in the bush, the wheel didn't do very much good. So like, it was interesting to hear about,
01:34:02.080
but it just, it really hit home for me, that observation, which is modern life and this idea
01:34:08.280
of progress and innovation and being better off than your parents. Like, that's a super, super modern
01:34:15.460
idea, not at all one associated with mankind's existence over its huge span, except for this tiny
01:34:24.380
sliver of kind of post-industrial life that we're in. It makes it almost impossible to fathom what
01:34:30.440
another hundred years looks like for exactly that reason. Again, the non-linearity of what
01:34:34.700
the acceleration of technology has been like in 100 years and what it looks like going
01:34:39.380
forward. Now, speaking of a lack of progress, this is totally off topic, but what you said kind of
01:34:44.840
made me think about this. Let's talk about horse racing for a minute. You're kind of a fan. Did you
01:34:50.200
get to see Secretariat at one point? I did. Yeah. So when I was in college,
01:34:56.120
I wrote my undergraduate thesis on thoroughbred breeding, and it just happened that the best
01:35:03.300
library was in Lexington, Kentucky. And that's where Secretariat was. And I didn't really understand
01:35:09.560
how the world worked very well. I just figured I could show up at Claiborne Farms and they would
01:35:13.700
invite me in to see him. So I showed up unannounced at Claiborne Farms and I said, I would like to see
01:35:19.320
Secretariat. And they looked at me like I was effing crazy. And like two seconds later,
01:35:25.000
one of the part owners of Secretariat happened to drive up behind me and like walk up to them and
01:35:30.520
say, Hey, we were just going to go say hi to Secretariat. And for whatever reason, instead of
01:35:35.700
like shooing me away, they said, well, this like total clown here thinks he should see Secretariat too.
01:35:39.920
Do you mind if he tags along? Like, okay, why don't you? So I did, I did get to meet him. Because
01:35:44.340
you know, it's funny. I was six when Secretariat won the Triple Crown. It's one of my first sports
01:35:49.600
memories. Absolutely. I remember where I was and I remember in part because for whatever reason in
01:35:57.000
the Belmont, I had chosen Secretariat as the horse that I thought would win, probably based on heavy
01:36:04.140
guidance from adults who had suddenly like led me to think he should win. And then that was the race
01:36:10.640
he won by 30 lengths or whatever it was. And I still remember that vividly. I'm not a very good sports
01:36:17.600
fan in the sense that I'm not very good at remembering names of players and whatnot or
01:36:22.160
having heroes, but Secretariat was always one of my heroes. And then he died. I don't know when
01:36:26.260
he died. It was, I would have gone there in 1988. Yeah. So he didn't live very much longer after that.
01:36:32.220
Yeah. He was only 19 when he died. I mean, you wrote about thoroughbred breeding. Why do you think
01:36:38.360
that Secretariat was the peak of thoroughbreds, and that no horse has ever approached
01:36:43.440
Secretariat's speeds. I mean, I think for the listener, just to put in context, what's
01:36:47.560
the Triple Crown? You've got the Kentucky Derby, the Preakness, and the Belmont; you have to win all three of
01:36:53.580
those. And the three of those take place over what, about six weeks. Yep. So to win these three
01:36:58.740
races, which get longer. I mean, technically, I guess it's a mile and a quarter, a mile
01:37:04.360
and three eighths, and then a mile and a half. For horses, that's a pretty big difference in distance.
01:37:09.340
So that's speed and endurance in a short period of time. I mean, it's very difficult to win all
01:37:14.740
three of those. Secretariat came along and won in 1973, and, you know, to this day
01:37:20.440
has the record for the fastest time in each. I think in the Kentucky Derby, what to me is
01:37:26.280
remarkable is negative splitting each quarter mile successively. So each of the five quarters
01:37:31.420
was faster and faster and faster and faster and faster. And then in the Belmont, not only
01:37:35.880
running 2:24, which is like six seconds faster than, you know, what American Pharoah won it in, but also
01:37:43.040
winning by whatever, 30, 31 lengths. No horse has come close to this. Why do you think that is?
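The split claims here are easy to sanity-check with a few lines of arithmetic. The quarter-mile fractions below are the widely cited figures for Secretariat's 1973 Kentucky Derby, included only as an illustration (they are not from the transcript itself):

```python
# Sanity-checking the negative-split claim for Secretariat's 1973 Kentucky Derby.
# These quarter-mile fractions (in seconds) are the commonly quoted figures;
# treat them as illustrative rather than authoritative.
derby_quarters = [25.2, 24.0, 23.8, 23.4, 23.0]

# Negative splitting: every successive quarter mile is faster than the last.
print(all(later < earlier
          for earlier, later in zip(derby_quarters, derby_quarters[1:])))  # True

print(round(sum(derby_quarters), 1))  # 119.4 seconds, the famous 1:59 2/5

# Belmont Stakes: 1.5 miles in 2:24 flat (144 seconds).
print(round(1.5 / (144 / 3600), 1))  # 37.5 mph average over a mile and a half
```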
01:37:50.240
So I don't know. A couple of things to think about. So I've actually dabbled with trying to
01:37:55.160
think about, could you in a smarter way do breeding of horses? The way they decide which horse to breed
01:38:03.020
to another is not extremely scientific. They kind of look at the horse and they have some ideas. And
01:38:09.220
so my hunch is that it's not particularly well done. Although just like with humans, there's an
01:38:15.520
enormous amount of assortative mating, right? So it's not like they mate the worst mares to the
01:38:20.540
best studs. I mean, they mate the best to the best, you know, in the same way that the best
01:38:25.840
basketball players end up getting married to the volleyball star. And then who knows what kind of
01:38:30.460
amazing offspring they're going to have? I think probably the best explanation for it
01:38:34.840
would be that, in contrast to human sports, I'm not sure there have been any advances in training
01:38:45.320
techniques for horses in the last 40 or 50 years. Now, maybe there have been, but they're not obvious
01:38:53.060
to an outside observer. I have a few little secret dreams. And one of my secret dreams, which sounds like
01:38:59.020
a Disney movie or something, is that I would buy a horse, an average thoroughbred, and I would train
01:39:06.220
that horse the way Peter Atiyah trains for ultra marathons. And the horse would actually turn out
01:39:13.280
to be really good. I mean, I don't really know much about horse training, but I don't think I'm
01:39:17.880
greatly exaggerating when I say that the way that horses are currently trained is they stand in their
01:39:24.080
stall for something like 23 and a half hours a day. And roughly once every day or two, they bring
01:39:33.360
them out of the stall and they trot around and they run. And sometimes they run fast, but they don't
01:39:38.160
tend to run fast for more than like three eighths of a mile, even though their actual races are going
01:39:42.980
to be a mile, a mile and a quarter. When they train, they only run three eighths of a mile.
01:39:47.320
And I think these horses spend almost no time swimming. So you might say, look, because their legs are
01:39:58.300
fragile, you can't like run them all the time. But like in my little
01:39:58.300
fantasy world, I've got a horse sized pool with a strong current going against it. And my horse Silver
01:40:04.900
or whatever, you know, whatever the horse is that I have in my Disney movie is swimming in the water
01:40:09.240
like four hours a day. And I've got him hooked up to oxygen monitors and he's doing whatever,
01:40:15.860
the equivalent of stretching or whatever. That's an interesting point, right? You'd put
01:40:19.360
some zone two training in and have him build some aerobic capacity, mitochondrial efficiency. I mean, that's
01:40:24.560
I mean, I don't know enough about it to say if what you're saying is correct or not, that there
01:40:28.940
haven't been advances in 50 years. It's hard to believe there haven't been, but I don't know.
01:40:35.080
I mean, it would be interesting. I think, though, let's forget about the golf project. I think
01:40:39.480
it would be super fun if you and I, you already know a lot about how humans work and I'm sure
01:40:45.720
you could figure out really quickly how horses work. I think it would be so much fun to claim one.
01:40:52.460
Okay. Cause these horses, when they run, you can claim them for like $5,000. Take some total
01:40:57.080
losing horse. Well, the problem is that for the triple crown, you gotta be a three-year-old. So you can't
01:41:00.100
really do it. You just got to take some completely unspecial horse. And what we'd really want to do is
01:41:05.660
we'd want to win the triple crown like four years in a row doing our aerobic training. Cause look,
01:41:10.960
if it were humans, like imagine, just imagine you go back to the time when,
01:41:17.320
you know much more about this than me, but like when Roger Bannister was running the four-minute mile. And
01:41:21.080
I think there was still, like, not very good training going on at all at that time. I mean, I may be
01:41:26.480
crazy, but is it not true that good high school runners in the U.S. routinely break four minutes?
01:41:32.100
I think in the case of humans, we can definitely say that the biggest difference between Roger
01:41:38.700
Bannister and a good collegiate runner today is knowledge of training much more than equipment
01:41:44.760
or nutrition, for sure. So it would be true that if you could magically take today's knowledge of
01:41:50.000
fitness and training and apply it to a relatively good athlete,
01:41:57.320
but nothing special. I think you could have won the Olympic gold medal in a range of sports.
01:42:02.100
That would be my conjecture. Take a good athlete, a good college caliber athlete.
01:42:07.180
And the difference, like swimming would probably be a good example. Like I think probably
01:42:10.580
the hundredth or the thousandth best swimmer today is probably better than the best swimmer in the
01:42:15.460
world 50 years ago in terms of times. I don't know, you would know that better than me, but anyway,
01:42:19.580
that's what I'm talking about with the horse. So you take a good, but not great horse.
01:42:23.640
And it seems like you could, if you could get the same kind of advances in training,
01:42:30.260
The other thing is how much of an outlier was Secretariat, right? I mean, I think there's still
01:42:33.820
some debate about his greatness, about how much of it was due to the size of his heart.
01:42:38.880
Unfortunately, his heart was not weighed at autopsy, but the same vet that did the autopsy
01:42:44.900
on Secretariat several years later did the autopsy on Sham, who was the horse that finished second.
01:42:51.080
It was basically the second best horse in the world in 1973. And Sham had a pretty large heart.
01:42:58.300
And he said Secretariat's heart was, you know, at least 15% larger, which basically put Secretariat's
01:43:04.560
heart at twice the size of a normal heart. And interestingly, that has been linked to the X
01:43:10.420
chromosome, which may partially explain why Secretariat's immediate male offspring weren't
01:43:17.240
that special, because his male offspring would have got his Y chromosome. His female offspring would have got the
01:43:22.640
X, and only if that X made it into a male two generations later would that have been a particularly
01:43:29.840
fast horse, which I think also gets to your point about the lack of sophistication around breeding.
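The X-chromosome logic described here can be sketched as a toy simulation. This is purely illustrative: the assumption that the large-heart trait rides on a single X chromosome, and names like `foal` and `big_heart_X`, are hypothetical, not anything stated in the episode:

```python
# Toy model of simple X-linked inheritance, under the (hypothetical) assumption
# that Secretariat's large heart rode on his X chromosome.
import random

def foal(sire_x, dam_xs):
    """Return (sex, x_chromosomes) for one foal.
    A sire passes his Y to sons and his X to daughters;
    the dam contributes one of her two Xs either way."""
    dam_x = random.choice(dam_xs)
    if random.random() < 0.5:
        return ("colt", [dam_x])        # son: Y from sire, X from dam
    return ("filly", [sire_x, dam_x])   # daughter: X from sire, X from dam

# Secretariat x ordinary mare: no colt can inherit the sire's X,
# so the trait can only resurface in a grandson via a daughter.
random.seed(42)
for _ in range(1000):
    sex, xs = foal("big_heart_X", ["plain_X", "plain_X"])
    if sex == "colt":
        assert "big_heart_X" not in xs
    else:
        assert "big_heart_X" in xs
```

Under this toy model every filly carries the sire's X while no colt does, which matches the observation that Secretariat's immediate sons were unexceptional.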
01:43:35.380
I don't know why I find this stuff so interesting. I think there's just something so beautiful about
01:43:39.300
these horses, but Secretariat in particular I've always been so enamored with. And in fact,
01:43:43.720
with our middle son, before he was born, I was really lobbying hard for the middle name Secretariat.
01:43:52.600
That didn't even get a prayer, but I lobbied really hard for it.
01:43:58.320
Maybe if you lived in communist Russia, you would have had a better chance with the
01:44:05.080
Okay, Steve, a couple of last questions for you here, because I've been taking up too much of your
01:44:11.160
time. What do you think is the most interesting question on your mind that you don't have a clue
01:44:17.460
of what the answer is? So this is just pure weird, but something I think about,
01:44:22.960
and actually I think of it with you because it's your influence that led me in this direction,
01:44:28.480
because you introduced me to Sam Harris's work. And there were a few paragraphs in Sam Harris's
01:44:35.500
book, Waking Up, which have intrigued me, which I don't really know anything about, but I find them
01:44:42.160
intriguing. So this is what I find really interesting. So he talks about how the left side of the brain
01:44:48.360
has the ability to speak and the right side of the brain doesn't have the capability to express
01:44:53.300
itself through speech. And I'm sure there's a lot more to it than that, but it's a starting
01:44:57.820
point, that simple observation, I find interesting because it got me thinking about the degree to
01:45:02.340
which our lives are mediated through speech. So obviously when I talk to you, I have to use speech,
01:45:08.240
but my own inner life is incredibly dictated through speech. So I'm, I'm putting words on
01:45:14.240
everything. Like, as I go through my life, it's not just this idea of chatter in the back of our
01:45:18.760
head. And like, is there some monkey on my back telling me I'm a loser or whatever. That's not really
01:45:23.340
what I'm talking about. It's much more that, in general, when I look at a computer,
01:45:30.020
I put the word computer on it. Okay. So then it just got me interested in the question of, well,
01:45:36.220
he kind of makes a little suggestion that it's possible that you could think of the right side
01:45:43.520
of the brain as being held as a slave dominated by the left-hand side of the brain. And like the
01:45:49.360
right-hand side of the brain has all sorts of interesting things it could do, but because it
01:45:54.100
can't speak, it's relegated to this awful subsidiary role. And so I just got interested in the idea of
01:46:02.600
how could I introduce myself to the right side of my brain? How could I actually get to meet that
01:46:10.480
part of my brain? I don't even, like, I know nothing about how the brain works. I don't even know
01:46:15.360
if that makes sense, but like given infinite time, which I don't have, it's actually, I think that's
01:46:22.100
one of the things I would pursue. So I've pursued it like in a little way, which is, I found it was
01:46:26.140
actually quite easy, with concentration, to train myself to be silent.
01:46:34.540
No, I want to think, I just don't want to think with words. Okay. So it's, I think, slightly like,
01:46:37.940
I don't understand meditation and all the different kinds of
01:46:41.220
meditation, but I think what I'm trying to do is slightly different than what most people are trying to do.
01:46:45.360
I'm not trying to quiet my brain. What I'm trying to do is I'm trying to observe the world with all
01:46:50.480
of my senses, but observe them free of language. Okay. So look, it's totally
01:46:58.220
possible as I, I don't know, peel a potato. Like I know how to peel a potato without calling it a
01:47:05.900
potato, without using the word peel. And almost everything we do in life, even, I think in some
01:47:11.260
sense, listening, well, I don't know about listening, but, but like most things I can do on my own,
01:47:14.740
I don't need language, but I constantly have language as my partner. And so it was an interesting
01:47:21.040
and surprisingly easy task to, on demand, be able to put words aside and be word-free. I
01:47:30.900
won't say that I can do it at will or perfectly, but I was better at doing that than I thought
01:47:35.340
I'd be. And what followed was kind of interesting and intriguing, and increased my interest
01:47:40.760
in the question as opposed to decreasing it, which is, I find that it's very difficult for
01:47:46.880
me to be unhappy or angry without words. Words are really critical to feeling victimized,
01:47:56.180
to feeling. It's not that I don't feel angry, but I feel very, very differently in the absence
01:48:01.320
of words than with words. And I can process it much better. So anyway, that's probably a crazy
01:48:06.300
answer to what you just asked, but it's something I have a hunch might be important.
01:48:13.320
I have no clue about how to actually do it. I'm sure there are probably gurus out
01:48:18.720
there who could maybe help me do it, but I've actually never really heard anyone else talk about
01:48:22.200
it explicitly in that way. It's the kind of thing where, when I try to go
01:48:29.060
to sleep at night, I spend maybe five minutes a day playing around with this. And I think it's been
01:48:35.900
good for me in a weird way. Well, I think that last observation, about how the
01:48:40.720
reduction of nomenclature around an emotion, not that anger is a
01:48:47.380
negative emotion, but a negatively valenced emotion at least, can reduce its impact. That,
01:48:52.180
that alone is super interesting as an observation. The other thing I'd say is that strikes me as
01:48:57.240
something that could be probably augmented by medication. There could be certain plants or
01:49:01.960
drugs or chemicals that would probably really augment that state. That's interesting. That's
01:49:07.720
really interesting. Maybe you can help me with some of those. Last thing I want to ask you, Steve,
01:49:13.020
you've mentioned numerous times that the randomized controlled experiment, you know, the RCT, the gold
01:49:20.220
standard of medicine is a tool that is virtually never available to the economist for obvious reasons,
01:49:26.680
the scale would be enormous. If you could put all of that aside for a moment, if you truly had
01:49:33.720
billions of dollars and all the time in the world, what experiment would you conduct?
01:49:44.280
So it's funny. Um, I don't actually have big ideas about economics. I think economics isn't where I would
01:49:52.360
go with something like that because in some fundamental way, I think we understand
01:49:57.400
what's understandable about the economy. I mean, it's not a lack of resources
01:50:05.000
that makes it hard to understand the macroeconomy; it's the fact that we have exactly one of them,
01:50:09.960
right? So absent a set of parallel universes, I don't think we're ever going to find out through
01:50:14.380
data or experimentation. So look, if you give me a different choice, which is if you could be God and
01:50:19.960
create an enormous number of parallel universes, then I would have a bunch of interesting macro
01:50:25.000
experiments that I might want to do. So for me, when I think about big problems, they're much
01:50:29.640
more one-off. They're rarely about randomization. I'll give you an example where maybe there's more
01:50:35.320
to really talk about, which is, I think we really blew it with COVID. So what have we done?
01:50:40.420
Well, we used randomized trials to figure out that we have really good vaccines. I fought with
01:50:47.660
Moncef Slaoui about this, but look, I think we were idiots not to do challenge trials where we
01:50:52.560
actually went out and learned much more quickly by giving people vaccines and giving them COVID and
01:50:58.440
seeing if it works. You know, like, I think we were foolish given the stakes not to do that,
01:51:03.080
but much more fundamentally, I think that medical ethics and societal views have
01:51:12.080
ruled out really sensible experiments. Like when we want to know the answers, the best way is a randomized
01:51:17.620
experiment. Okay. So there are so many, and you and I have talked about this a little bit on my podcast as
01:51:20.840
well. There's so many cases around COVID where we just don't know the basic facts about, like,
01:51:26.860
I think you made a big point about the six feet thing, like six feet away. Like somebody just made
01:51:30.620
that up and we live with it. Or another one that just came up is there's a great op-ed piece that said,
01:51:35.420
well, why is everyone still wearing terrible masks? Because like at the beginning, people,
01:51:41.320
you know, like we're told, well, don't wear a mask because a mask won't help you anyway,
01:51:45.860
which was obviously idiotic because we were really being told that because they didn't want us buying
01:51:49.920
up all the good masks, because they wanted to give them to the frontline health professionals. Of course,
01:51:53.480
if they didn't work, why are the frontline health professionals wearing them? The real missed
01:51:57.320
opportunity in COVID was that we only took the approach that I think is a hundred percent accepted,
01:52:04.340
which is that the best way we are able to learn about COVID, given the constraints of society, is by doing
01:52:11.160
clever little natural experiments where we look at some particular airplane and some particular
01:52:15.920
person we know got COVID and then we go in and do contact tracing to figure out how bad it was.
01:52:20.720
Look, the best way to figure out whether airplanes are dangerous is to put people on airplanes,
01:52:24.360
some who have COVID and some don't, and to figure out who gets it. I mean, and people are like,
01:52:28.280
oh, you can't do that. This brings it back to the field of economics. You would argue if you pay people
01:52:33.660
enough as volunteers, there's going to be a subset of people that are going to say,
01:52:37.120
yeah, my risk of getting COVID and something really bad happening is sufficiently low
01:52:41.540
that I'm willing to be one of the healthy volunteers on that airplane who risks getting
01:52:46.300
COVID. And of course we would argue, well, that might not be medically ethical, but you're saying
01:52:50.220
desperate times call for desperate measures. There's a way to solve these problems economically.
01:52:55.600
Yeah. And then Moncef Slaoui, who ran Operation Warp Speed, said, yeah, but that's no good because
01:53:00.480
you just have the healthy people and that might not tell you anything. But look, there are plenty
01:53:04.420
of people who are really, really sick and who are worried about their dependents and what's going
01:53:09.100
to happen to their family. Look, there are plenty of people who know their life expectancy is four
01:53:12.420
months who are going to be willing to get exposed to COVID if you say, well, take care of your family,
01:53:17.880
you know, if you die from it. So look, I know I'm completely out of step with mainstream
01:53:23.660
medical ethics on this, but I think medical ethics gets it really wrong because when they think about
01:53:29.880
money, they think about small amounts of money. And I think it's probably wrong. Like, if you're
01:53:34.760
willing to offer people a hundred dollars to do this, then the only people who are going to do it are
01:53:38.300
going to be drug addicts and uneducated people who don't understand. But look, when the stakes
01:53:44.160
are as high as COVID, you can offer people $10,000, a hundred thousand dollars. I mean, you can offer
01:53:49.020
enough money that everyone's going to be lining up to do it. And if everybody's lining up, then I think
01:53:54.000
medical ethics no longer has a real stake in this because it's just a different problem when everyone
01:54:00.460
is volunteering to do it. If they're volunteering because you're paying enough money, look, then almost every issue
01:54:04.920
of medical ethics fades away when you've got a surplus of people volunteering.
01:54:09.700
You know, I never thought about it that way until you said it, Steve. That is a fantastic point.
01:54:13.400
Institutional review boards, IRBs are adamant about making sure honorariums are very small
01:54:19.900
in studies to prevent coercion. Their big shtick is that you can't overpay people. Otherwise it's
01:54:25.780
coercive. But the irony of it is it is coercive if it's low and it disproportionately targets the most
01:54:32.360
vulnerable. Exactly. It's so true. And on organ donation, this has been something I've harped on
01:54:37.780
for a long time. People are really terrified of the idea of a market for organs, for, like, live
01:54:43.380
kidney donors. And every horror story they tell is a horror story that's embedded in the idea
01:54:49.840
that it's going to be exploitative. Look, and that's because the market price would be too low.
01:54:55.220
It's actually an odd case in which you want to enforce an artificial price, which is way above the
01:55:01.220
market price, because of the value to society. You know better than me the amount we spend on dialysis and
01:55:07.520
the amount of suffering associated with it, it's like some even like measurable percentage of GDP,
01:55:12.440
like one or two percent of GDP or some crazy thing like that. And the value of a really good,
01:55:19.700
healthy, live kidney to society is enormous. It's, you know, on the order of, I don't know,
01:55:25.500
you know, hundreds of thousands of dollars at least. So you could easily pay people a hundred
01:55:31.680
thousand dollars for a good kidney. And if you did that, I think you might have, I don't know,
01:55:37.280
might have 10 million, 50 million people who would sign up to be kidney donors in
01:55:42.280
that world. And in a world in which Wall Street stockbrokers are signing up to donate their
01:55:47.480
kidneys, well, not donate, I mean, not altruistically, they just want a hundred grand. Look, in that world,
01:55:52.940
the whole of medical ethics is turned upside down. And the weird thing is I cannot get anyone
01:55:58.160
to take that scenario seriously. Even though to me, it seems completely and totally obvious that if we
01:56:04.180
could get, I don't know, Guatemala or Singapore, Mozambique or someplace to just do that in one
01:56:10.220
place, I bet it would work great. And that one example could lead the rest of the world to follow.
01:56:15.800
But we are so far from any place being willing to try it. I mean, the closest is Iran. Interestingly,
01:56:21.320
ironically, Iran is the one place on the planet that pays for organs, but not the way I would
01:56:26.280
do it. Not that it's the most important problem in the world. That is a fantasy, like along with training that
01:56:32.060
horse, Silver, to win the Kentucky Derby. That's another of my fantasies is a world in which I
01:56:39.260
create this market for live donors. All right. So on that note, Steve, we've got three huge problems
01:56:43.880
to follow up on. We've got to, in order, figure out how to create a really good five-handicap golfer
01:56:50.860
in 12 months. We've got to figure out how to take a B-player horse as a one-year-old and get a Triple
01:56:58.720
Crown by age three. And we've got to figure out a market for organ donation with a really, really,
01:57:05.420
really high price premium paid for kidneys. Let me just add number four: we've got to figure
01:57:09.820
out how to do this Manhattan project for climate change. That's the most important one. I'm going
01:57:14.780
to personally spend more time thinking about that one than the other three. But that's awesome. To
01:57:19.760
come out of this podcast with four great problems to work on is more than we bargained for.
01:57:24.620
My prediction: the only one we're going to solve is going to be that five-handicap golfer, when you
01:57:29.120
get burned out and you get tired of doing everything else. The other three, I'm not too
01:57:33.140
optimistic about. Steve, thanks so much for sitting down today. It was awesome.
01:57:37.980
Thank you, Peter. It's awesome. It's always so much fun to talk.
01:57:41.080
Thank you for listening to this week's episode of The Drive. If you're interested in diving deeper
01:57:45.200
into any topics we discuss, we've created a membership program that allows us to bring you more
01:57:49.940
in-depth exclusive content without relying on paid ads. It's our goal to ensure members get back
01:57:55.360
much more than the price of the subscription. Now to that end, membership benefits include a bunch
01:58:00.600
of things. One, totally kick-ass comprehensive podcast show notes that detail every topic, paper,
01:58:06.440
person, thing we discuss on each episode. The word on the street is nobody's show notes rival these.
01:58:12.480
Monthly AMA episodes, or ask-me-anything episodes.
01:58:16.820
Access to our private podcast feed that allows you to hear everything without having to listen
01:58:22.600
to spiels like this. The Qualies, which are a super short podcast that we release every Tuesday
01:58:28.100
through Friday, highlighting the best questions, topics, and tactics discussed on previous episodes
01:58:32.840
of The Drive. This is a great way to catch up on previous episodes without having to go back and
01:58:37.880
necessarily listen to every one. Steep discounts on products that I believe in, but for which I'm not
01:58:43.640
getting paid to endorse, and a whole bunch of other benefits that we continue to trickle in
01:58:48.120
as time goes on. If you want to learn more and access these member-only benefits, you can head
01:58:52.440
over to peteratiamd.com forward slash subscribe. You can find me on Twitter, Instagram, and Facebook,
01:58:59.780
all with the ID peteratiamd. You can also leave us a review on Apple Podcasts or whatever podcast player
01:59:06.620
you listen on. This podcast is for general informational purposes only and does not constitute the practice of
01:59:12.560
medicine, nursing, or other professional healthcare services, including the giving
01:59:17.040
of medical advice. No doctor-patient relationship is formed. The use of this information and the
01:59:23.060
materials linked to this podcast is at the user's own risk. The content on this podcast is not intended
01:59:29.240
to be a substitute for professional medical advice, diagnosis, or treatment. Users should not
01:59:35.280
disregard or delay in obtaining medical advice for any medical condition they have, and they should
01:59:41.440
seek the assistance of their healthcare professionals for any such conditions. Finally, I take conflicts
01:59:47.400
of interest very seriously. For all of my disclosures and the companies I invest in or advise, please
01:59:53.460
visit peteratiamd.com forward slash about where I keep an up-to-date and active list of such companies.