#389 — The Politics of Risk
Episode Stats
Words per Minute
183.5
Summary
Nate Silver joins Sam Harris to talk about his new book, On the Edge: The Art of Risking Everything, and the final stretch of the 2024 presidential election. They discuss cultural attitudes toward risk, the erosion of trust in liberal institutions, polling and political narratives, the "river" and the "village" as rival camps of cultural elites, the influence of Silicon Valley, Peter Thiel, Elon Musk, Sam Bankman-Fried and the fall of FTX, Gell-Mann amnesia, Christopher Rufo, why Kamala Harris can't admit to having changed her views, a problem with strict utilitarianism, and AI and existential risk. They then turn fully to the election: what people misunderstand about election forecasting, which news events have affected the 2024 race, how the current polls might be misleading, public versus private polling, undecided and marginal voters, Gen Z, the gender divide, the likelihood that Trump won't accept the election results if he loses, election integrity in the swing states, the chance of a landslide, the prospect of public unrest, and other topics. The podcast runs no ads and is made possible entirely by listener support; full episodes are available to subscribers at samharris.org.
Transcript
00:00:00.000
Welcome to the Making Sense Podcast. This is Sam Harris. Just a note to say that if
00:00:11.640
you're hearing this, you're not currently on our subscriber feed, and will only be
00:00:15.580
hearing the first part of this conversation. In order to access full episodes of the Making
00:00:19.840
Sense Podcast, you'll need to subscribe at samharris.org. There you'll also find our
00:00:24.960
scholarship program, where we offer free accounts to anyone who can't afford one.
00:00:28.340
We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:32.860
of our subscribers. So if you enjoy what we're doing here, please consider becoming one.
00:00:45.080
Well, we've got 11 days left until the presidential election, and I must say I am concerned at this
00:00:52.780
point. I know the polls still have it more or less even, but it's not feeling that way somehow.
00:01:01.300
What has worried me about the Harris campaign from the beginning still worries me. For some reason,
00:01:08.340
I can't understand. She and her campaign have not found a way for her to explain her changes of
00:01:15.740
position. On border security, on crime and policing, on trans activism, and the result is that many
00:01:24.940
people don't know what her views are, or they think they do know, and they suspect that she's some kind
00:01:31.000
of woke Manchurian candidate who will govern like a far-left activist. Now, I'm not even slightly
00:01:38.600
concerned that that's true, but it really seems that there are millions and millions of people
00:01:44.580
who are worried about this, and she is not saying much to put them at ease. She can't put them at ease
00:01:54.780
unless she articulates how her views have changed. Whenever she's asked about a position she held in
00:02:02.620
2019 or 2020, she just looks evasive. She looks like she's, above all, trying to not acknowledge
00:02:11.020
that she changed her mind, and then she seems desperate to change the subject. This has produced
00:02:17.480
a long series of own goals that she seems condemned to score on herself again and again and again. I don't
00:02:25.500
understand it. Needless to say, there's not a lot of time left to make sense on these topics.
00:02:34.860
And we turn our attention to the election today, because today I'm speaking with Nate Silver.
00:02:40.960
Nate Silver has been a professional poker player for quite some time, but he's famous for being the
00:02:46.360
founder of FiveThirtyEight, which is an election forecasting website, and he's also a New York Times
00:02:53.160
best-selling author. He wrote The Signal and the Noise, and he has a new book titled On the Edge:
00:03:00.060
The Art of Risking Everything. And he also has a Substack where he publishes his most current
00:03:05.120
election model, and that is The Silver Bulletin. Anyway, Nate and I cover a lot of ground here.
00:03:11.740
We talk about cultural attitudes toward risk and the state of American politics. We discuss the
00:03:18.300
erosion of trust in liberal institutions, polling and political narratives, two different camps of
00:03:24.840
cultural elites, the influence of Silicon Valley, Peter Thiel, Elon Musk, Sam Bankman-Fried and the
00:03:31.920
fall of FTX, Gell-Mann amnesia, Christopher Rufo, why Kamala Harris can't admit to having changed
00:03:39.700
her views, a problem with strict utilitarianism, AI and existential risk. Then we turn fully to the
00:03:47.880
election. We talk about what people misunderstand about election forecasting, which news events have
00:03:53.860
affected the 2024 race, how the current polls might be misleading, public versus private polling,
00:04:01.220
undecided and marginal voters, Gen Z, the gender divide, the likelihood that Trump won't accept the
00:04:08.560
election results if he loses, election integrity in the swing states, the chance of a landslide,
00:04:14.560
the prospect of public unrest, and other topics. And now I bring you Nate Silver.
00:04:27.900
I am here with Nate Silver. Nate, thanks for joining me.
00:04:33.320
So we are two weeks out from the presidential election. I don't have to tell you that. You
00:04:37.600
started the FiveThirtyEight blog, the election forecasting blog, and now you have the Silver
00:04:43.740
Bulletin on Substack, where you have recently published an article titled 24 Reasons that Trump
00:04:49.100
Could Win. And I want to get into that. I definitely want to talk to you about the election. But first,
00:04:55.020
we should talk about your book, which is fascinating. The book is titled On the Edge:
00:04:59.980
The Art of Risking Everything. And it's a book about risk and cultural differences in risk
00:05:07.140
tolerance. It looks at those issues through the lens of poker and game theory and venture capital.
00:05:13.520
You talk about cryptocurrency and effective altruism and AI and other topics that are of interest to
00:05:19.240
me, certainly, and much of my audience. Let's jump into, I guess, some of the background concerns that
00:05:26.000
all of us feel, and which you discuss at various points in the book. We have this sense of real
00:05:32.860
fragmentation in our culture politically. There's a kind of secular stagnation, the economics of which
00:05:39.680
I guess can be somewhat debated, but you discuss it a little bit. You had this image that struck me
00:05:46.300
as interesting. It actually connects with your experience as a poker player and the time you
00:05:51.300
spent in Vegas. But you have this image that really kind of stratifies society across variables
00:05:57.540
like wealth inequality and status and opportunity and much of what people care about. And it's the
00:06:03.780
image of people walking into a casino. And there are those who get stuck on the slot machines and
00:06:08.980
addicted to the dopamine there. And they're just playing a bad game and gradually, but inevitably losing
00:06:15.080
their money. And then there are the people who walk right past those machines and go to the great
00:06:20.300
restaurants and see a show and maybe even play high-stakes poker. And there's just a very different
00:06:27.080
world within the same world. Before we get into your book, and I want to track through it, how concerned
00:06:36.440
are you about this moment for us in what should be a purely auspicious world of progress and relative
00:06:46.260
wealth? I mean, we just think of all the promise of America, speaking specifically now, but maybe the West
00:06:53.040
It seems somehow precarious. What's your view from 30,000 feet?
00:06:56.600
Yeah. I mean, it kind of depends on how far you want to zoom out. If you zoom way out compared to
00:07:01.040
how life was 100 years ago or 500 years ago, lifespans are greatly increased. Poverty is greatly
00:07:06.040
decreased. We have creature comforts we could never have imagined. If you look at a closer track,
00:07:11.740
maybe the past 10 or 20 years, it's a bit less clear. In Europe, for example, GDP growth per capita
00:07:18.040
is very slow. In the US, life expectancy is barely growing, especially for men. It's about the same as
00:07:24.100
it was 10 years ago, even though we've now mostly recovered from high COVID deaths. You could argue
00:07:29.940
that the pace of technological innovation has slowed. I'm not sure whether I buy that or not. You can
00:07:34.860
argue that democracy is on the decline. If you look at various democracy indices, then there are more
00:07:39.700
people living under authoritarianism than there were 10 years ago. So I do think there are a lot of
00:07:45.220
problems, not to mention political instability. Maybe it's a euphemism, right, in the US and other
00:07:51.780
countries. How do you view our loss of trust in liberal institutions and the market economy in
00:07:58.980
some sectors and even democracy? I mean, it seems like everything has been turned upside down. Again,
00:08:05.080
I'm looking at this through an American lens, especially. I mean, like, you know, we have a
00:08:10.400
Republican Party that now can't figure out what stake America has in helping to maintain global order.
00:08:18.780
You know, there's an America first approach to foreign policy that is reminiscent of Charles
00:08:25.880
Lindbergh that, you know, on the left, obviously, there's profound doubt about the viability of
00:08:30.660
capitalism. And everyone seems to have found reason to just distrust our institutions. As we get
00:08:37.160
into the book, we're going to meet various characters who have amplified and capitalized on
00:08:43.200
those sentiments. But how do you view that erosion? I mean, do you think it's a symptom of being too
00:08:51.020
online to see it as being this salient? Or do you think it's as problematic as many think it is?
00:08:57.100
That finding is very robust in the polling, actually. Gallup just came out with data showing
00:09:01.180
yet another decline in trust in the news media, which is now at its lowest level ever.
00:09:05.920
Basically, every institution except the military, ironically, but from the church to, you know,
00:09:11.300
higher education has seen its ratings decline a great deal. So has big business. So has the tech
00:09:16.900
world, for example. And in the book, I kind of look at it through the lens of the prisoner's dilemma,
00:09:21.320
which is a famous kind of thought experiment in game theory, where basically, if these two prisoners
00:09:26.880
are able to cooperate, they wind up collectively better off. And the game theory dictates that unless
00:09:31.580
they can communicate ahead of time and have some reason to trust each other, then they won't
00:09:35.780
cooperate, they'll be selfish and wind up collectively worse off. And so, you know, from an economist
00:09:40.440
perspective, when you have a loss of trust, you have deadweight loss. I mean, you can imagine in
00:09:45.040
neighborhoods where things are frequently stolen, that there's a lot of measures taken to prevent
00:09:49.680
crime, burglaries, robberies, et cetera, that cost money. I mean, hiring good police officers cost money,
00:09:55.240
or putting extra locks in your doors costs money. And I'm not quite sure how to reverse this,
00:10:00.640
in part because, you know, I'm someone who I describe myself as a liberal kind of in the way
00:10:05.080
that term is used classically more than meaning of the left. But I sometimes criticize the left too,
00:10:10.220
because, you know, Donald Trump has done many, many things to undermine trust in democracy and the
00:10:15.620
media. You know, January 6th was a horrifying event. And it's amazing that he may be on verge of
00:10:21.520
getting reelected again, despite that. But sometimes the autoimmune response from these institutions is
00:10:27.040
poor. So I thought, for example, that there was a lot of partisanship that motivated certain
00:10:31.640
behaviors, certain discussions of certain behaviors around COVID, for example, or I don't know,
00:10:37.560
you know, I think some of the loss of trust in higher education has been earned. You know,
00:10:41.860
I'm not a big, I'm not super into like discussing affirmative action and things like that, but the
00:10:46.500
way it was being implemented was often very cynical, where if you were, you know, if you were an Asian
00:10:50.660
applicant, for example, you were just kind of very obviously penalized in the name of some higher
00:10:55.720
goal. And so, you know, and the media too. I mean, I, I, I think, you know, two out of the three
00:11:00.620
stories they cover for Trump are perfectly fair. And probably one of those three doesn't go far
00:11:05.080
enough, but you know, the amount of column inches devoted and newscast footage to, to the Russia story,
00:11:12.100
which is not, I could articulate many problems I have with Donald Trump. I think the Russia stuff
00:11:17.000
was not among the foremost five or 10 most important problems. And so look, voters who aren't reading the
00:11:23.040
news every day and listening to podcasts and reading Substacks and things like that, they may not quite
00:11:28.820
have all the kind of detailed facts at their disposal, the way someone, you and I, you know,
00:11:34.760
our job is to consume information, but they have some, they have some intuition that like the elites
00:11:39.480
are not working in their best interest, which I think is probably basically correct with many
00:11:45.260
caveats and, and provisos. Now, I don't think Trump is, is working in their best interest either
00:11:50.120
necessarily, but I think I'm not surprised that people feel betrayed because when you have a lack
00:11:55.260
of trust, it creates a downward spiral where because you're less trusting, maybe you become
00:11:59.520
more partisan. You don't trust that, Hey, I'm going to just lay out information. You know, the New York
00:12:03.940
Times has been in trouble recently because they say, Hey, our job actually is to be a fourth estate,
00:12:09.720
not a partisan institution. And we still believe that if you just do accurate reporting and facts,
00:12:14.660
then that's good for the world. And I think some people don't believe that. Right. And in some ways,
00:12:18.580
the Times is, is saying, we trust that when there's more information that people will behave
00:12:23.280
more rationally or more generously or more smartly. And maybe that's not true. That's, that's,
00:12:28.560
you know, the notion that we get so much, this cornucopia of free information, basically all the
00:12:32.320
time, a public square like none other on Twitter. And it doesn't seem to have worked as, as well as
00:12:38.380
the idealist, I guess, including me might've hoped. Yeah. Well, there are a few reasons,
00:12:43.460
I think obvious reasons why it's working to fracture our view of the world rather than
00:12:49.600
getting everyone to converge on the same set of facts and the same interpretation of the facts.
00:12:54.400
I mean, one, one is just that the information is now endless and can be endlessly segmented in a
00:13:01.200
bespoke way so that you can, if confirmation bias is your thing and it's almost everyone's thing,
00:13:06.220
there's no end to it, right? You can just silo yourself and discount anything that seems to
00:13:12.720
go against the grain of your cherished opinions. And the level of tribalism that is, is amplifying
00:13:19.940
this segmentation to say nothing of the, the algorithms at the bottom of it that are born of
00:13:25.640
this perverse business model, which is, you know, not selecting for truth or, you know, social
00:13:32.040
health, but for just time on device, you know, at any really at any apparent cost,
00:13:38.400
it's pretty easy to see how we're, we're in a kind of doom loop with respect to our consumption
00:13:43.860
of information online. Yeah. I mean, look, I see this with respect to political polling because
00:13:49.460
polls relatively speaking are, are simple facts, you know, Trump plus two in Arizona or Harris plus
00:13:54.460
three in Pennsylvania or whatnot. And people's ability to like, Hey, just confirmation bias and
00:13:59.980
cherry-pick the data without even pretending not to, being very uncomfortable with any
00:14:04.400
information that doesn't fit their prior. So if you're a Harris fan, then any poll having
00:14:08.600
Harris behind in any state you might reasonably hope her to win has to be
00:14:13.540
disproven. It's kind of a very Victorian attitude. It's dirty. It has to be cast out somehow. And of
00:14:18.980
course, the Trump people are absolutely no better at this, maybe worse. So that, you know, just being
00:14:24.040
involved in that little thing where the facts are, I mean, we don't know who's going to win the election,
00:14:28.020
right. But you know, a poll is a fact unto itself. Um, even if not an accurate representation of the
00:14:33.600
electorate, the fact that people are so unwilling to just kind of look at a consensus and say, I mean,
00:14:39.020
that's the whole thing. My model just puts the numbers in an average and calculates the
00:14:43.860
error bands around them, which are often larger than what the margin of error in polls is, but we
00:14:47.840
don't have to get into a polling discussion. It's just been kind of very frustrating. Um, and even
00:14:53.320
thing, I mean, people will tweet stuff about me that is like literally conspiratorial. You know,
00:14:58.200
I got into like a spat with the actress, Bette Midler a few weeks ago. Cause she's like, who got
00:15:03.160
to Nate Silver? Uh, because at that point our forecast had Harris slightly behind. And so being
00:15:08.320
at the forefront of this, I mean, every year I kind of forget how bad election years are. And then,
00:15:13.140
but now that I'm in the thick of it, I don't know. I always say this is the last one. And then,
00:15:17.220
and then don't manage to quit. Well, I want to save our discussion of polling and your model and your
00:15:23.300
gut feelings and all the rest, uh, for the second half of this discussion when we, when we've cleared
00:15:28.060
the topic of your book, but just one question related to what you just said. Do you think that
00:15:33.980
there are, um, seemingly reputable entities that present polls, not as their best effort to accurately
00:15:42.900
depict public sentiment, but actually, you know, as efforts to message into
00:15:48.740
the election space and sway voters. I mean,
00:15:54.340
do you detect or have you detected in previous elections, dishonest polling?
00:15:59.340
So there's a bit of an asymmetry here, which is that Republicans tend to believe that if you win
00:16:04.200
the narrative and create momentum, then that helps you in real life. So in some sense, I think that
00:16:09.460
good polling results for Trump, reported on by the media, will translate
00:16:13.340
into more probability of actual success. I'm pretty skeptical of that for various reasons,
00:16:17.760
but there are some GOP aligned polls that are basically designed to, to influence the narrative.
00:16:24.460
I mean, one polling firm called Rasmussen Reports was actually caught in leaked emails coordinating
00:16:29.400
with the Trump campaign and they're supposed to be nonpartisan. So that's, that's quite bad.
00:16:34.020
Now the models have ways of adjusting for them. Basically, if you label a poll as a partisan poll,
00:16:38.920
then the model makes different assumptions about it. It weights it less and kind of assumes that
00:16:42.500
it's biased and tries to de-bias it. But yeah, that, I mean, absolutely. I mean, I, I think there
00:16:47.560
are, you know, some polls that are a lot more worthwhile than others.
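The adjustment Nate alludes to can be sketched in a few lines. This is a toy illustration, not the actual Silver Bulletin model; the half weight and the 2-point house-effect correction are invented numbers:

```python
# Toy poll average that down-weights and de-biases flagged partisan polls.
# Margins are (Harris - Trump) in points; all numbers are illustrative.

def adjusted_average(polls):
    """Weighted average of poll margins, correcting flagged partisan polls."""
    weighted_sum, total_weight = 0.0, 0.0
    for poll in polls:
        margin = poll["margin"]
        weight = 1.0
        if poll.get("partisan"):
            # Assume a partisan poll leans ~2 points toward its sponsor,
            # so shift it back toward neutral and trust it half as much.
            margin += 2.0 if poll["partisan"] == "R" else -2.0
            weight = 0.5
        weighted_sum += margin * weight
        total_weight += weight
    return weighted_sum / total_weight

polls = [
    {"margin": 1.0},                    # nonpartisan pollster
    {"margin": -0.5},                   # nonpartisan pollster
    {"margin": -3.0, "partisan": "R"},  # GOP-aligned pollster
]
print(adjusted_average(polls))  # the R-leaning outlier barely moves the average
```

A real model estimates each pollster's house effect from its historical deviation from the polling average rather than hard-coding a constant.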
00:16:52.540
And do you think this is just confirmation bias or do you think it's actually a nefarious
00:16:59.480
I think somewhere in between. I also think sometimes people start out saying,
00:17:03.560
LOL, ha ha. I'm just going to like troll people a bit, and then they
00:17:08.060
become true believers. Right. I mean, you've seen this adaptation with Elon Musk, for example,
00:17:13.300
where he kind of flirts with some more conservative right-wing ideas and then gets in all of a sudden
00:17:17.820
after a year, he's like literally jumping up and down on stage with, with Donald Trump and things
00:17:22.920
like that. Look, I think when you're doing a poll, just like building any type of model,
00:17:27.600
it's not as simple as just kind of calling people anymore on the phone and then recording what they say.
00:17:32.220
Most people don't answer polls at all. And so therefore you have to do a lot of
00:17:36.960
weighting, modeling, data massaging to get the poll to be at least somewhat accurate.
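A concrete, much-simplified example of that weighting step. Suppose, hypothetically, that under-45s are half the electorate but only 30% of the people who answered; each respondent gets weighted by population share over sample share:

```python
# Toy demographic weighting of a poll sample -- one piece of the
# "weighting, modeling, data massaging" mentioned above. Shares invented.

def weighted_support(respondents, population_share):
    """Candidate support after reweighting each group to its population share."""
    n = len(respondents)
    counts = {}
    for r in respondents:
        counts[r["group"]] = counts.get(r["group"], 0) + 1
    support, total = {}, 0.0
    for r in respondents:
        # Weight = share of the electorate / share of the sample.
        w = population_share[r["group"]] / (counts[r["group"]] / n)
        support[r["vote"]] = support.get(r["vote"], 0.0) + w
        total += w
    return {cand: s / total for cand, s in support.items()}

# 30% of the sample is under 45 (all Harris here); the electorate is 50/50.
sample = (
    [{"group": "under45", "vote": "Harris"}] * 3
    + [{"group": "over45", "vote": "Harris"}] * 3
    + [{"group": "over45", "vote": "Trump"}] * 4
)
result = weighted_support(sample, {"under45": 0.5, "over45": 0.5})
print(result)  # Harris ~71% weighted, versus 60% in the raw sample
```

Every such choice (which groups to weight on, what the target shares are) is a human judgment call, which is exactly where the pollster's priors can leak in.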
00:17:42.860
And anytime you have those decisions and you're inserting a human being in the process and their
00:17:47.500
political views or their sense of the narrative or the sense of the vibes can, can influence what
00:17:53.240
they say. Okay. Well, we'll come back to polling and the election, but let's talk about your book
00:17:59.500
for a bit here. You have a framing with respect to two cultures that does a lot of work in your book
00:18:06.620
and that you describe one as the river and the other as the village. I know you've spoken about
00:18:12.560
this a lot, but it's very useful to describe these two camps. What is the river and what is the village?
00:18:18.140
Yeah. So the river and the village are rival groups of elites who are competing for cultural
00:18:24.280
and economic, I suppose, superiority. The village is not an entirely original term. Other
00:18:29.020
authors have used similar terms to describe the establishment. So the New York Times and Harvard
00:18:34.400
University are kind of the quintessential institutions of the village. It's on the East Coast. It's politically
00:18:41.080
progressive and really more democratic party than anything else. It's very collectively oriented,
00:18:46.960
not necessarily collectivist, probably kind of center left on average, but in the sense that like
00:18:51.080
your standing within the community matters a lot and you might be afraid of cancellation
00:18:54.920
or ostracization. And it tends to be risk averse. These are people, for example, that were really,
00:18:59.520
really careful about COVID. And I've even turned a little bit on free speech because free speech is
00:19:05.660
risky. Free speech can upset people. You can think about Charlie Hebdo or things like that. And free speech
00:19:10.400
can have consequences and they tend to be more risk averse. The river, by contrast, is the world of
00:19:15.880
analytical risk takers. So Silicon Valley, Wall Street, and Las Vegas, both the casinos themselves
00:19:22.660
and some of the gamblers, probably not most. It's very risk on. It's very analytical. It's very,
00:19:28.160
very, very individualistic and usually capitalistic, very competitive. These are people who are playing
00:19:33.380
to win. And the argument of the book is that the river is becoming the dominant entity in terms of
00:19:39.160
the economy, at least, that finance and tech continue to grow as a share of the economy.
00:19:45.140
Meanwhile, gambling is not a huge, huge business. It's a medium, big business, but more money is
00:19:50.500
being gambled in the United States today than ever before, probably in the country's history.
00:19:55.940
And so, you know, I consider myself really kind of originally from the river in the same sense,
00:20:02.680
some people might say, oh, I'm from the South or whatever. It's kind of my cultural orientation.
00:20:05.260
When I meet a poker player, we have the same vocabulary. We almost assuredly can have a
00:20:10.520
good conversation about a few things, whereas I've never quite fit in with the village. You know,
00:20:15.880
I kind of cover politics because actually Congress basically passed a law to essentially outlaw
00:20:21.140
online poker, which was my main source of income in the early 2000s. And that's what got me into
00:20:26.460
politics. And so I take that kind of, I suppose, handicapper's, even gambler's mindset
00:20:30.920
to making forecasts, looking at things probabilistically kind of as a Bayesian rather
00:20:36.440
than as a political partisan, which, you know, look, I have preferences. I mean, I've said in
00:20:43.000
other shows I plan to vote for Kamala Harris, but that's not my goal. I'm not trying to do
00:20:47.480
partisan combat. I'm trying to give people accurate information.
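"Looking at things as a Bayesian" has a concrete meaning: start from a prior belief and let each new poll move it in proportion to how precise it is. A minimal sketch with invented numbers, using a normal-normal update:

```python
# Minimal Bayesian update of a belief about a race's margin.
# All numbers here are invented for illustration.

def bayes_update(prior_mean, prior_var, obs, obs_var):
    """Normal-normal conjugate update: a precision-weighted average."""
    precision = 1.0 / prior_var + 1.0 / obs_var
    post_var = 1.0 / precision
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

# Prior: roughly a toss-up, margin 0 with a 3-point standard deviation.
# Then a poll shows Harris +2 with a 2-point standard error.
mean, var = bayes_update(0.0, 3.0 ** 2, 2.0, 2.0 ** 2)
print(round(mean, 2), round(var ** 0.5, 2))  # belief shifts partway toward the poll
```

The point of the mindset is in that "partway": no single poll flips the answer, and no single poll gets to be dismissed.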
00:20:50.140
So how useful, I mean, it proves very useful in your book, but I'm wondering if you think there
00:20:58.160
are limits to this analysis. The frame of gambling and poker in particular, that you lay over this
00:21:05.840
conception of risk. I mean, in poker, you're using a strategy that is designed to work out over
00:21:13.580
many, many, many trials, right? And it's so it's, you know, you're guided by a concept of expected
00:21:20.700
value. And yet many of the people who are in the river, the real risk takers, don't seem to be
00:21:26.920
placing their bets with an especially sound poker mindset. I mean, first of all, there's just not that
00:21:31.980
many trials. I mean, even if you're Elon Musk, you can only start so many companies, right? And so perhaps
00:21:38.880
one way to track through this is to actually talk about some of the, the prominent people you
00:21:44.500
profile in the book and, and just, and you can tell me what you, you learned from each of them.
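Sam's objection here is statistical: a positive-expected-value strategy is only reliable once the law of large numbers has room to work. A small sketch (bet sizes and odds invented) of how the spread of the average outcome shrinks with the number of trials:

```python
import math

# A bet that wins +1 with probability 0.55 and loses -1 otherwise.
p = 0.55
ev = p * 1.0 + (1 - p) * -1.0                          # expected value per trial: +0.10
var = p * 1.0 ** 2 + (1 - p) * (-1.0) ** 2 - ev ** 2   # per-trial variance

for n in (1, 100, 10_000):
    sd_of_mean = math.sqrt(var / n)                    # shrinks as 1/sqrt(n)
    print(f"n={n}: EV per trial {ev:+.2f}, sd of average {sd_of_mean:.3f}")
```

At n = 1 the standard deviation swamps the edge; a founder making a handful of company-sized bets lives in that regime no matter how good the expected value is.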
00:21:49.700
I mean, let's, perhaps let's start with Peter Thiel. If looking at this landscape through Peter Thiel's
00:21:56.260
eyes, what did you see? So Peter Thiel is fascinating in that he is kind of actually anti-probabilist
00:22:03.280
and that makes him unusual in, in the river where, you know, Peter had a pretty religious
00:22:08.100
upbringing. Um, there's a good book by Max Chafkin of Bloomberg about him called
00:22:13.500
The Contrarian. And like, he grew up very conservative and I think remains very conservative.
00:22:18.580
He sometimes says he's libertarian, but, but somewhere on the right side of the spectrum,
00:22:22.360
I think beneath that libertarian, there's a Catholic, I think for sure. And the first question I asked him
00:22:28.080
was kind of like a, a softball question I've used to loosen people up, which is like, okay,
00:22:32.360
if you simulated the world a thousand times, how often would you wind up roughly in a position like
00:22:36.540
you're in now? Right. And most people do the politically correct response. Like, oh, I know
00:22:41.260
I'm very lucky, maybe once or twice out of a thousand, but I'm very fortunate. Right. Whereas
00:22:45.700
he like really objected to the question, right? He's like, if the world's deterministic, then a hundred
00:22:51.920
percent of the time, right? If it's probabilistic, then it's kind of a poorly formulated question
00:22:55.980
because just the question of how much are you perturbing? How many disturbances are you inserting
00:22:59.780
into this hypothetical? Right. And he thought it didn't really make a lot of sense, but like,
00:23:04.460
but you get the sense that like, I mean, first of all, if you are Peter Thiel or Elon Musk or Sam
00:23:09.400
Altman or one of these people or Sam Bankman-Fried, and you become one of the, you know, 10 most
00:23:14.480
influential or wealthy people in the world out of however many people have ever lived, I think 50 billion
00:23:18.300
or something. I mean, it's got to feel, it's got to feel pretty weird, right? If you wake up and
00:23:21.860
you're Donald Trump, it's got to feel pretty weird, I think. So I think some of these guys actually do
00:23:26.260
have, uh, you know, are spiritually, yeah. And so, you know, I mean, there's a simulation argument
00:23:33.840
that I'm sure you're familiar with that some people think, Hey, am I, is this just kind of
00:23:36.680
like a simulation design for me? Or am I like, kind of like the, the chosen one somehow, you know,
00:23:42.540
one of the areas where I agree with the left is that I don't think people realize how much the top
00:23:47.180
of the scale is still growing in wealth and, and, and influence, right. Um, that the top 10 people
00:23:54.160
in the world, not always the same top 10, but if you look at the top 10 richest billionaires in the
00:23:58.640
world, those are getting, uh, increasing by about twice every decade, right? So if you compound your
00:24:04.560
growth two X every decade, that really, really compounds, you see the influence on the, on the
00:24:09.260
election this year where you have like Elon Musk, perhaps illegally offering to just give a million
00:24:14.300
dollars every day to somebody who registers to vote in a particular way, um, or signs a petition,
00:24:19.560
I guess I should say among registered voters. So I, I know I do worry a lot about that, but
00:24:24.640
cause it used to be that a few years ago, people in Silicon Valley would kind of take pride in being
00:24:31.200
politically aloof, right? Um, Scott Alexander, the wonderful blogger called it like the gray tribe.
00:24:36.180
And you say, Oh, I'm apolitical. What's going on in Washington, DC doesn't really concern me.
00:24:40.860
We're doing interesting things here in Silicon Valley. And so, you know, hopefully won't be
00:24:44.820
over-regulated, but apart from that, we're not so concerned about politics. Although they probably,
00:24:48.640
you know, they also mostly just kind of vote democratic for cultural reasons. And now that's
00:24:53.120
all changed and I'm not sure that's good for Silicon Valley in the long run or for politics.
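As an aside, the "two X every decade" figure Nate cites a moment earlier compounds faster than it might sound; doubling each decade is about 7.2% a year:

```python
# "Two X every decade": the implied annual growth rate, and the
# compounded result over a few decades.

annual_rate = 2 ** (1 / 10) - 1
print(f"{annual_rate:.1%} per year")  # about 7.2%

wealth = 1.0
for decade in (1, 2, 3):
    wealth *= 2
    print(f"after {decade} decade(s): {wealth:.0f}x")
```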
00:24:59.100
Yeah. Well, it, it, it sort of matters which of these figures you're talking about because I think
00:25:03.200
Peter and Elon and Sam Bankman-Fried and Sam Altman is another one you talk about. These guys are all
00:25:10.580
impressively different in many respects. I mean, I've spoken to all
00:25:17.040
four of them and knew Elon fairly well. I can't say that I know him. I can't say that I quite recognize
00:25:24.800
him now and we've had a spectacular falling out. But, you know, I think certainly Peter and Elon
00:25:32.480
are as allergic to the wokeness and the identitarian moral panic that happened on the left
00:25:41.680
as anybody. So, you know, on some level, their economic
00:25:47.220
interests aside, their behavior with respect to Trump is almost perfectly foreseeable as a result
00:25:53.940
of that. For Elon, it's the trans issue and immigration. I mean,
00:25:58.740
these are just super-stimuli at this point. Whereas someone like Sam
00:26:03.380
Altman, with his aspirations to build out AI, I don't know what he
00:26:10.260
said about his political engagement this time around. I don't know who he's voting for, but
00:26:13.740
he's kind of a different object. Is there more to say about Peter's view of the
00:26:19.400
situation? No, look, these are impressive people. They are people who have accomplished a lot. And,
00:26:24.820
Peter in particular is a very intellectual guy who will give you these
00:26:30.900
half-hour-long, very well-considered answers, with lots of literary references, to what seemed
00:26:36.440
like throwaway interview questions. Look, there's another reading too, which is that it's just in
00:26:41.420
their economic self-interest. If you read some of the background on Elon, he was very annoyed
00:26:47.400
that Joe Biden was dealing Tesla out of some of these early electric vehicle initiatives
00:26:53.500
because they weren't sufficiently unionized, I guess. So he felt snubbed by Biden. And then,
00:26:58.760
like a lot of people, he has a strong reaction to the wokeness,
00:27:03.620
whatever you want to call it. Think of those two interests combining, right? And then
00:27:07.760
all of a sudden it's confirmation bias all the way down, where everything else looks like it's
00:27:11.980
rigged by the Democrats, et cetera. Well, in his defense, it was absolutely insane
00:27:18.580
that Biden held, I think it was, a summit at the White House for American automakers
00:27:26.240
celebrating the success of American ingenuity in the EV market. And Tesla was,
00:27:32.780
and is, the shining example of success on that front. And they weren't invited to the summit.
00:27:39.580
It's not really Elon's financial interests that were besmirched there. It's just
00:27:45.920
reputational. I mean, he took it personally, obviously. But it's
00:27:50.140
understandable, because it's completely outrageous and idiotic to have held such a
00:27:55.240
summit and not have invited Tesla. No, look, in some ways, if Kamala Harris loses,
00:28:00.900
we'll be looking at a lot of mistakes that were made early on in the Biden Harris administration.
00:28:06.960
Particularly, probably, some degree of overspending on the economy. I mean,
00:28:11.560
inflation has complex causes, and that's not all of it, but maybe it's part of the high inflation we had.
00:28:15.520
Having a lax border policy. And then, due to pettiness or pique, I guess, not inviting the richest
00:28:21.920
man in the world to your EV summit when that's kind of his main occupation. And then he,
00:28:25.320
you know, turns around and spends $300 million of his own money against your candidate in the
00:28:30.760
campaign. Yeah. And look, there are some bread-and-butter issues too. Lina Khan has been more
00:28:35.740
aggressive at the FTC about going after big tech. Taxation is very high in California. They're
00:28:41.420
winning some battles, by the way: the fact that Gavin Newsom vetoed the California bill on AI regulation
00:28:47.320
that even many people in the River thought was a very reasonable regulatory effort.
00:28:54.240
Even Elon did. He's been consistent about that, about genuinely being concerned about AI risk.
00:28:58.920
That bill was vetoed because people now fear the power of the tech space as a
00:29:05.180
political entity. Even the crypto people have spent a lot of money on this particular election
00:29:10.660
campaign. And so it's starting to have an influence very quickly.
00:29:13.620
Well, let's jump over Elon. I've spent a fair amount of time bashing him, and
00:29:18.980
it always makes me uncomfortable because he used to be a friend, but he's just
00:29:22.760
unavoidable at this point. You spent a fair amount of time with Sam Bankman-Fried. I only spoke to him
00:29:29.280
once on this podcast, before things completely unraveled for him, and really just focused on
00:29:35.720
effective altruism. What did you learn from his curious case study?
00:29:42.920
There are so many intersecting issues with him that are surprising.
00:29:48.560
You started speaking to him before there was any obvious sign of a problem, right?
00:29:54.360
Well, I was going to say just the opposite thing, actually. I think what's kind of remarkable
00:29:57.660
about Sam, Sam Bankman-Fried that is, not Sam Altman, is that he was saying all these crazy things and no
00:30:04.780
one was really flagging it and being sufficiently worried about it.
00:30:08.100
But in terms of the timeline, when you had your first conversation with him,
00:30:12.420
was there any common knowledge that the wheels were going to come off at FTX?
00:30:18.200
No, in fact. So this was in January or February 2022, so this is close to when Bitcoin
00:30:23.780
was at, I guess, its first all-time high, since surpassed. And he's kind of like, you know, Nate,
00:30:29.100
you know, there are two basic scenarios, one where Bitcoin continues to go up and then one where it
00:30:33.000
just stays at $50K per coin, right? And I'm like, well, I can think of a third scenario,
00:30:38.000
Sam: what if it goes down? And he seemed to not have really processed this very much.
00:30:44.200
Yeah, the irony is, he is not known as a terribly good assessor of risk, right? What I
00:30:49.680
heard from people who had worked with him is that his initial instincts are
00:30:54.080
pretty good and he's fast at calculating, but he's not a particularly deep thinker.
00:31:00.440
I think it was him who said, oh, no one ever needs to read a book,
00:31:04.200
just read articles and blog posts and tweets and things
00:31:07.840
like that. And honestly, that's one thing you run into with a lot of these guys.
00:31:13.520
Speaking about many of the people who are prominent in Silicon Valley,
00:31:18.120
everyone is bearing the scars of being an autodidact on almost every topic that they've
00:31:26.140
touched, right? I mean, these are not people who came out of the center of the village for the
00:31:30.160
most part, people who have institutionally acquired a lot of knowledge. A lot of people got
00:31:35.640
CS degrees or dropped out of college and read Ayn Rand and a lot of science fiction, and then
00:31:42.520
just took everything else on the menu of human understanding à la carte. And
00:31:50.440
you get these strange weightings of influence, right? Where somebody
00:31:54.440
will recommend a book that completely changed their worldview, and it's
00:31:59.280
some really peripheral tract. And then you've got all of these neo-reactionary quasi-lunatics who are
00:32:07.280
very influential underneath the hood here, people who I've never
00:32:11.960
spoken to on the podcast, people like Curtis Yarvin. And it's a mess, and it's an
00:32:18.880
anti-institutional mess and a contrarian mess. And then you dump hundreds of billions of dollars
00:32:23.960
into this clockwork and you get a lot of weird convictions that have a lot of energy
00:32:31.620
behind them. Yeah. Look, you can go case by case. I mean, I think there are some people in Silicon
00:32:36.640
Valley. Patrick Collison, the founder of Stripe, is a guy who is quite well-read, and I
00:32:42.680
think quite well-rounded, right? He strikes me as a very balanced thinker. Yeah. But less fun
00:32:48.720
for the book. I had a great conversation with him, but he's not saying these provocative
00:32:52.300
things that make for good soundbites necessarily. So there's a little bit of selection bias there
00:32:57.120
in who gets written up more or less. But for sure, I think they feel like,
00:33:02.840
hey, no one really has credibility or trust anymore, so therefore I can just
00:33:07.260
make it up as I go. Right. And look, you and I have both probably had issues where we've
00:33:18.020
differed from the mainstream. If you'd like to continue listening to this conversation,
00:33:18.020
you'll need to subscribe at samharris.org. Once you do, you'll get access to all full-length
00:33:23.920
episodes of the Making Sense podcast. The podcast is available to everyone through our scholarship
00:33:28.660
program. So if you can't afford a subscription, please request a free account on the website.
00:33:34.400
The Making Sense podcast is ad-free and relies entirely on listener support.