Making Sense - Sam Harris - October 07, 2024


#386 — Information & Social Order


Episode Stats

Length: 47 minutes
Words per Minute: 158
Word Count: 7,466
Sentence Count: 485
Misogynist Sentences: 1
Hate Speech Sentences: 11


Summary

Yuval Noah Harari is a celebrated historian and author. He has written Sapiens and Homo Deus, and his latest book is Nexus, a brief history of information networks from the Stone Age to AI. In the first part of the conversation, we discuss the book: the connection between information and social order, democracies and dictatorships as different information networks, the tension between truth and order, the naive view of information, social media and the breakdown of democracy, and the advice Yuval would give to Elon Musk and Mark Zuckerberg. We then take the framework Yuval has built and use it to talk about the U.S. election, the war in Ukraine, why so many Americans seem confused about the ethics and politics there, the state of politics in Israel, the contributions of Jewish religious fanaticism to the ongoing wars with Hamas and Hezbollah and a potential war with Iran, and whether a two-state solution is even conceivable. We don't run ads on the podcast; it's made possible entirely through the support of our subscribers, so if you're hearing this on the public feed, you'll only hear the first part of the conversation. If you enjoy what we're doing here, please consider becoming a subscriber at samharris.org.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast. This is Sam Harris. Just a note to say that if
00:00:11.640 you're hearing this, you're not currently on our subscriber feed, and we'll only be
00:00:15.580 hearing the first part of this conversation. In order to access full episodes of the Making
00:00:19.840 Sense Podcast, you'll need to subscribe at samharris.org. There you'll also find our
00:00:24.960 scholarship program, where we offer free accounts to anyone who can't afford one.
00:00:28.340 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:32.860 of our subscribers. So if you enjoy what we're doing here, please consider becoming one.
00:00:45.300 Okay, well, I'm about to go on a trip overseas, so I'm recording this housekeeping about a week
00:00:53.300 in advance of dropping this episode. This is just to say that if anything extraordinarily
00:00:59.560 important has happened in the world that you would expect me to comment on here, I do not
00:01:04.260 currently know about it, and no doubt we'll talk about it after I get back. Today I spoke to Yuval
00:01:11.240 Noah Harari, who's been on the podcast before. Yuval is a celebrated historian and author. He is almost
00:01:19.460 certainly the most popular historian on earth at the moment. He has written Sapiens and Homo Deus,
00:01:26.420 and his latest book is Nexus, a brief history of information networks from the Stone Age to AI.
00:01:33.540 And in the first part of the conversation, we talk about the book, and then we use it as a lens
00:01:38.220 through which to look at current events, of which there are far too many. On the day we recorded this,
00:01:44.120 Israel had just bombed Beirut in an operation that successfully killed the leader of Hezbollah,
00:01:50.960 Hassan Nasrallah. And based on much of the media coverage, you could be forgiven for thinking that
00:01:56.800 this was a pure act of aggression on the part of Israel. There was almost no mention of the fact
00:02:03.360 that Hezbollah has been launching rockets by the thousands, unprovoked, into civilian centers in
00:02:11.280 Israel since October 8th at the behest of Iran and in solidarity with Hamas. Also no mention of the
00:02:19.800 fact that Hezbollah's main contribution to human culture has been the perfection of the modern
00:02:26.220 tactic of suicide bombing, which they have used not only against Jews in Israel, but also against
00:02:32.540 Americans. They've even bombed Jewish centers in Argentina. This is a global terrorist organization,
00:02:40.260 but judging from the obituaries of Nasrallah, you would think Hezbollah was an entirely legitimate
00:02:47.240 resistance group attempting to throw off some colonialist power. Needless to say, there was
00:02:55.200 almost no mention of the fact that Nasrallah and Hezbollah helped Bashar al-Assad kill hundreds of
thousands of Muslims in Syria. The editors over at the New York Times and the Guardian and the BBC
00:03:11.040 seem to think that their readers don't care much about these Muslim deaths, and they're probably
00:03:18.060 right. Reading this coverage, you're likely to form the impression that Nasrallah himself was quite a
00:03:25.500 grandfatherly figure with a winning smile. He is described as charismatic and revered and as a great
00:03:34.340 organizer and an astute strategist and pragmatic and idolized by his followers. The New York Times in
00:03:42.360 their obituary makes it seem like he was a man of peace committed to a two-state solution, when in fact
00:03:48.840 he was a religious lunatic running one of the world's most lethal terrorist organizations, and he had an
00:03:56.260 immense amount of innocent blood on his hands. Anyway, Yuval and I wade into this ethical morass, and
00:04:04.020 generally we talk about the consequences of having an information landscape that is so skewed. We talk a lot
00:04:10.480 about the connection between information and social order. We discuss how democracies and dictatorships are
00:04:16.880 different information networks representing points on a continuum. We talk about the tension between
00:04:22.920 truth and order, and the need to have institutions that self-correct for both. The advantages of fiction
00:04:29.620 over truth, what Yuval calls the naive view of information, social media and the breakdown in
00:04:36.060 democracy, what advice Yuval would give to Elon Musk and Mark Zuckerberg, the rise of populism and the
00:04:43.040 breakdown of trust in institutions, inefficiency as an occasional feature rather than a bug of
00:04:49.440 information networks, the banning of counterfeit humans. And then we take the framework that Yuval has
00:04:55.700 built and use it to talk about the U.S. election and Trump and the war in Ukraine and why so many
00:05:02.520 Americans seem confused about the ethics and politics there. And then Yuval and I have a very interesting
exchange about the state of politics in Israel, the contributions of Jewish
00:05:14.720 religious fanaticism, the ongoing war with Hamas and Hezbollah, potential war with Iran, whether a two-state
00:05:22.820 solution is even conceivable at this point, and other topics. And now I bring you Yuval Noah Harari.
00:05:30.540 Yuval Noah Harari. Thanks for joining me again.
00:05:39.500 It's good to be here again in, you know, in a real space together.
00:05:43.160 Yeah, this is, you are my first interview in this studio, which is auspicious because we share so many
00:05:50.060 interests. We have struggled for years to talk about meditation and you think we might get there this
00:05:55.820 time. I'm skeptical.
I'm determined to say at least something about meditation this time.
00:06:00.520 Yeah. Well, it's, there's so much going on in the world that it's, uh, it's always a challenge to
00:06:05.060 get to that topic.
00:06:05.720 We can always connect it. I mean, we can talk about Nasrallah and meditation. I think there are
00:06:09.300 many links there worth exploring.
00:06:12.200 As counterintuitive as that sounds.
00:06:13.700 Yeah. If meditation is not related to the world and to reality, it's worthless.
00:06:20.880 That's true. Yeah. I guess I, well, I feel like I'm, I'm perhaps selling our audience short in
00:06:26.740 thinking that they don't have the bandwidth to think about it, given all the chaos and given,
00:06:32.180 given your expertise that is so relevant to so much of the chaos. You just mentioned Nasrallah. So
00:06:36.760 at the time we're recording this, it was just announced that Israel killed, uh, Hassan Nasrallah,
00:06:42.100 who's the, who was the head of Hezbollah, uh, which we'll talk about, uh, perhaps in the,
so I want to focus on what you've done in your recent book, Nexus, which is wonderful and here,
00:06:53.980 and perhaps you can say how this relates to your two other big books, Sapiens and Homo Deus. So what,
00:07:02.220 what is, how do you view the project that you've?
00:07:04.500 So it's, it starts basically where Sapiens and Homo Deus ended, uh, in Sapiens, I, I covered how this
00:07:12.700 insignificant ape from a corner of Africa took over the world. And in Homo Deus, I explored what
00:07:20.420 could be the potential future of, of us and of our products and creations here on earth. And, uh,
00:07:28.740 Nexus starts with a key question about both the past and the future, which is if humans are so wise,
00:07:36.760 why are we so stupid? Like, you know, we've reached the moon and we split the atom and we can decipher
00:07:43.240 DNA, but we are on the verge of destroying ourselves in so many different ways. It could be ecological
00:07:51.900 catastrophe. It could be a world war, a nuclear war. Uh, we are producing the most powerful
00:07:58.680 technology in history, AI, which might quite easily get out of our control and enslave or,
00:08:05.840 or destroy us. And we know all that, and yet we keep doing it. So what's happening? And so many
00:08:13.760 mythologies and theologies throughout history said that the blame is that something is wrong in human
nature, that we are flawed, deeply flawed. And I don't think that this is the right answer.
00:08:26.860 I think that the problem is not in our nature. The problem is in our information. If you give good
00:08:34.560 people bad information, they make bad decisions. It's, it's as simple as that. And, uh, so then the
00:08:41.180 question becomes, why are we flooded with bad information? Why is it that after thousands of
years of history of developing, you know, sophisticated networks of information and communication, our information is not getting any better. I mean, at the present
00:09:01.700 moment, you can say that humans have the best, the most sophisticated information technology in history,
00:09:07.580 and we are losing the ability to even talk with each other and to listen and to hold a reasonable
00:09:14.680 conversation. So, so what, what's happening that that's the key question of Nexus and it explores
00:09:21.400 the history of information and, and of information networks. It takes another look at history, but from
00:09:29.500 the viewpoint, not of, of humanity, but of information. And for instance, I look at, uh, or Nexus looks at
00:09:37.700 the history of democracies and dictatorships, not as we usually think about them as different ethical
00:09:44.720 systems that believe in different ideals, but as different information networks, how information
flows differently in a democracy and in a dictatorship. In a dictatorship, you know, there is one hub where all the decisions are being made. So all the information flows to and from that single center
00:10:05.960 and democracy is a distributed information system in which decisions are being made in many different
places. And much of the information never passes through the center. Like if you think about the United States, you have the center in Washington, but so many decisions are made elsewhere, like here in Hollywood, in Los Angeles, and most information never passes through the center.
00:10:32.740 And you can think about the historical struggle between democracy and dictatorship in terms of
00:10:39.680 different models of information flows.
00:10:42.740 Yeah. I want to pass over that ground again, because what you're saying is pretty counterintuitive and it's
00:10:47.860 very interesting. So it could, because people think about democracy and dictatorship as this kind of binary
00:10:55.340 that are just categorically distinct and you're placing them on a continuum of information flow.
00:11:01.380 Yeah. Yeah. And, and there, I mean, so let's add here this, what, what you call the naive view of
00:11:08.440 information because it's, there's a sense that more information is an intrinsic good, right? And we're
00:11:14.880 getting this now with the people who are running our, our social media regimes. It's just the idea that
00:11:20.280 if you could just let all ideas collide and remove every point of friction from the flow of information
and amplify anything however a market or the dynamics of any internet business chooses to
00:11:35.340 amplify it, there's, it's just the principle, you know, people have these phrases in their minds,
00:11:40.760 you know, sunlight is the best disinfectant, right? So let's just expose everything and we're going to be
00:11:45.220 fine, right? And any effort to steer this information flow is by its very nature, sinister. It is edging us
00:11:54.180 toward the totalitarian side of this information continuum. So how do you react to that?
00:12:00.740 This is so naive. This is so disconnected from reality, from history. You think that you flood
00:12:07.220 the world with information and the truth will just rise to the surface? It won't. It will sink to the
00:12:13.100 bottom. Information isn't truth. Most information is junk. It's like thinking that more food is always
00:12:20.240 good for you. The more you eat, the more healthy you will be. That's the same thing. No. I mean,
00:12:24.660 yes, you need some food to survive, but if you just keep eating, it will not be good for you,
00:12:29.520 especially if you keep eating junk food. And the world basically needs an information diet.
00:12:35.400 And, you know, the truth is a subset of information and a very small subset because the truth,
00:12:44.240 first of all, it's very costly. If you want to write or to produce a truthful account of something
00:12:51.480 of the Roman Empire, of the economic crisis, whatever, you need to invest a lot of time and
00:12:58.000 effort and money in looking for evidence and fact-checking and analyzing. Like, I don't know,
00:13:04.400 if you want to know something that happened in the Roman Empire. So, you know, historians,
00:13:08.300 they go to university to study for at least 10 years before they become professional historians.
00:13:14.040 You learn Latin and Greek and how to read ancient handwriting and how to do these archaeological
00:13:20.440 excavations. And even if you found a document from the Roman Empire and you know Latin and you can
00:13:25.940 read it, maybe it's just propaganda. Just because Caesar says that the enemy had 100,000 soldiers,
00:13:32.760 it doesn't mean they actually had 100,000 soldiers. So how do you evaluate information?
00:13:37.500 So the truth is very costly. Fiction, on the other hand, is very cheap. You just write the
00:13:44.360 first thing that comes to your mind. The truth is also very complicated or often it's complicated
00:13:50.780 because reality is complicated. Whereas fiction can make it as simple as you would like it to be.
00:13:56.980 And people tend to prefer, in most cases, simplicity over complexity. So this is another disadvantage.
00:14:03.700 So you're pointing out there an asymmetrical relationship between truth and fiction.
00:14:09.060 Absolutely.
00:14:09.520 Which redounds to the advantage of fiction in this friction-free environment.
00:14:14.240 And with the third advantage of fiction, that the truth is sometimes, not always, but the truth is
00:14:20.480 sometimes painful. You know, from the personal relationships that we often don't want to know
00:14:26.400 the truth about how we treat the other people in our lives. This is why we need to go to therapy
00:14:31.840 for many years to acknowledge the reality. You know, all the way to entire nations or entire cultures,
00:14:38.360 if you run for elections in the U.S. or in Israel or anywhere else, and you just tell people the truth,
00:14:44.660 the whole truth, and nothing but the truth. I mean, an Israeli politician who would just tell
00:14:50.320 the truth about the Israeli-Palestinian conflict is not likely to gain many votes that way. You need at least
00:14:58.080 some dose of fiction, of mythology to make it more attractive, more pleasant for the voters.
Well, it's already so unpleasant. I shudder to think what the truth is, but we'll get there.
00:15:09.260 So again, the truth is it's costly, it's complicated, it's sometimes painful, fiction is cheap, it's
00:15:18.760 simple, you can make it as attractive as you'd like it to be. So in a completely free market of
00:15:24.820 information, truth will lose. You have to tilt the balance in favor of truth by building institutions
00:15:32.400 like courts, like newspapers, like universities, like research centers, that make the effort to produce
00:15:42.340 and to protect the truth. And when people attack these institutions, they often claim that they are
00:15:50.320 liberating people from the yoke of these elite institutions and conspiracy and so forth. But no, when you
00:15:58.540 destroy all trust in these institutions, you're paving the way for dictatorship. Society needs
00:16:05.720 institutions. And democracy works on trust. But if you destroy all trust, the only alternative left
00:16:13.820 to hold society together is with terror, which is what dictatorships do. So this is the game of many
00:16:20.320 would-be dictators. They systematically destroy all trust in the institutions that are our main
00:16:28.400 access to truth and knowledge. And then when all these institutions are gone, then the only
00:16:35.860 alternative left is a dictatorship. So what would you say to someone who says that the institutions
00:16:41.000 have proven themselves to be untrustworthy, right? So we have the capture of the most elite academic
00:16:48.060 institutions, certainly in the United States, by a kind of woke moral panic, right? You have Hamas
00:16:54.560 supporters, not only among the students, but among the faculty. I'll be happy to talk about that.
00:16:57.980 And I have a lot of criticism of my own kind of disciplines and institutions that I'm sometimes,
00:17:04.180 you know, I'm, you hear things from people who went to study history for 10 years, and then they come
00:17:09.780 up with the most simplistic views of reality.
Okay. So let's stay at the 30,000-foot level. The experts in many
00:17:19.520 institutions have heaped shame upon their own heads in recent years.
00:17:24.440 But the reaction is not to destroy the institutions. I mean, this is why we need two things. First of
00:17:28.980 all, we need several institutions, not just one. So they keep each other in check. I mean, the basic
00:17:34.900 assumption is humans are fallible. Institutions are composed of humans. So all institutions are fallible.
00:17:41.340 They can make mistakes. They can be captured. They can be corrupted. And therefore you need
00:17:46.760 several institutions to keep each other in check. So if one institution is really corrupted,
00:17:52.000 you can go to the courts or you can expose it in newspapers or in other media or whatever.
00:17:57.000 And secondly, every institution needs a self-correcting mechanism. This is the sign of a good
00:18:02.320 institution, that it has mechanisms inside the institution to identify and correct its own
00:18:08.680 mistakes. This is again, a key difference between democracy and dictatorship. Dictatorship has no
00:18:14.600 self-correcting institution. There is no mechanism in Russia that can expose and correct Putin's
00:18:21.760 mistakes. But democracy is all about self-correction. That, you know, the basic mechanism of elections
00:18:28.400 is that every four years or so, you can say, oh, we made a mistake. Let's try something else.
00:18:34.780 But of course, every mechanism like this is itself fallible. Elections can be rigged. Like we just
00:18:41.080 had elections in Venezuela and the Venezuelan people said, okay, we made a mistake with Chavez and
00:18:46.400 Maduro. Let's try something else. But because Maduro is in power, he rigged the election. He said,
00:18:52.340 no, no, no, I won. And this is also, of course, very, very relevant to what's happening in the
00:18:57.320 upcoming elections in the United States. Because the greatest danger for a democracy,
00:19:02.240 in a democracy, you give power to someone for four years on condition that they give it back.
00:19:08.560 And there is always the danger, what if they don't give it back? So giving power to somebody that you
00:19:13.960 have good reasons to suspect that he will not give it back, very dangerous. So again, elections are not
00:19:21.360 enough. You also need free media. You also need free courts. Now people ask, okay, so what if all
00:19:27.120 these institutions are corrupted, then bad luck? I mean, nothing is perfect. If all the institutions
00:19:34.540 of your society has been corrupted and taken over and all the self-correcting mechanisms are
00:19:40.920 disrupted, dysfunctional, very bad news, society collapses. Hopefully we don't reach that point.
00:19:49.140 And it's very important to try. And the solution is not to lose trust. One of the key problems I see
00:20:00.360 today in the world is that you have an extremely cynical view of humans and of human societies
00:20:07.980 spreading both on the right, but also on the left. This is something that the extreme left and the
00:20:15.760 extreme right agree about. They have an extremely cynical view of humans and of reality. They say
00:20:22.760 that the only reality is power, that all social interactions, all human interactions are actually
00:20:29.920 power struggles, that all human institutions are just conspiracies to gain power, that journalists,
00:20:38.760 scientists, historians, judges, politicians, these are just conspiracies to gain power.
00:20:45.100 Whenever somebody tells you something, you shouldn't ask, is it true? Because nobody cares about the
00:20:51.660 truth. This is naive. They would tell you, no, this is a power play.
00:20:57.100 Who benefits.
00:20:58.040 Who benefits.
00:20:58.840 Whose privileges are being served? Whose interests are being served? This is something you hear from
Marxists and from Trumpists. This is something that Donald Trump agrees with Karl Marx on, at least:
00:21:10.380 that everything is just a power struggle. And if you think like that, all trust collapses. And the only
00:21:18.380 thing that is left standing, that can remain standing, is a dictatorship, which indeed assumes that
00:21:24.780 everything is just power. Now, the important thing to realize is this is not just extremely cynical, this is
00:21:32.440 just wrong. People are not these power crazy demons that care only about power. Even powerful people
00:21:42.880 really care about the truth, at least some of them.
00:21:46.440 Well, even, I would just add, I totally agree with you, but I would add as a footnote to that cynicism,
00:21:52.340 even if the cynical take were true, people's incentives are not perfectly aligned. So even in a rivalry of people
00:21:59.740 seeking power, the kinds of conspiracies and collaborations and Orwellian, you know, star chambers
00:22:07.620 rarely exist the way populists imagine, right? I mean, it's just, you can't get, you take...
00:22:13.140 You take Elon Musk and Donald Trump, you know, two people with a very high opinion of themselves
00:22:19.660 and with not necessarily the same goals in life and in the world. Even if they can ally themselves
00:22:28.340 for some time around a certain common interest, in the long run, it will be very, very difficult
00:22:34.480 to keep this alliance.
00:22:36.120 Yeah. Yeah. To say nothing of the people who are not aligned with them. There's a fascinating
00:22:41.340 tension between the self-correcting mechanism that would deliver truth and the self-correcting
00:22:47.780 mechanisms that would deliver order. There's this trade-off between truth and order that you
00:22:51.460 describe in the book. Yeah.
00:22:52.740 Let's cycle on that for a minute.
00:22:54.540 Okay. That's very important because, and if we think about human societies really as information
00:23:01.980 networks, so the question is, what do these networks produce? And to function, they need to produce
00:23:10.040 two different things, not just one. They need to produce both truth and order. A system that just
00:23:18.660 ignores the truth completely will collapse. It will sooner or later encounter reality and will collapse.
00:23:26.200 But also, a system which is only interested in the truth will not be able to maintain order
00:23:32.880 because it's easier to maintain order with fiction and with fantasy and with mass delusions.
00:23:41.000 So we just spoke about the tension between truth and fiction as though fiction were by definition
00:23:46.400 invidious and something to be canceled. What you're saying now is that we need certain fiction. It can't
00:23:53.920 be purely truth-seeking, right?
00:23:55.880 Fiction is very efficient in creating order. And the main thing is that it's complicated
00:24:02.820 to maintain a human society. It's complicated because you need to balance two things that
00:24:09.200 are pulling in different directions. You need to balance truth with order.
00:24:14.000 There's lots of trade-offs.
00:24:15.060 Yes. And I'll give one example of how it works. Think, for instance, that you want to produce an
00:24:24.740 atom bomb. So let's say that you are Iran and you want to produce an atom bomb. You need to know some
00:24:31.820 facts about the world. It's essential. If you just ignore the facts of nuclear physics, you will not be
00:24:38.340 able to produce an atom bomb. It's as simple as that. On the other hand, to produce an atom bomb,
00:24:44.960 just knowing the facts of physics is not enough. You need millions of people to cooperate on the project.
If you have a single physicist, even if she is the most brilliant physicist in history and she knows
00:24:57.020 that E equals MC squared and all the secrets of quantum mechanics and whatever, she cannot produce
00:25:03.140 an atom bomb by herself, working in her garage or something. You need people to mine uranium,
00:25:09.900 you know, thousands of kilometers away. You need engineers and workers to build the nuclear reactor.
00:25:15.420 You need people to produce food so that all these workers and physicists have something to eat.
00:25:21.580 How do you get millions of people to cooperate on the project? If you just tell them the facts of
00:25:26.240 physics, E equals MC squared, now get on with it, it doesn't work. So what? So just because E equals MC
00:25:33.860 squared, we should now work on building this atom bomb? No. This is where ideology and mythology
00:25:39.660 and fictions come into the picture. You usually convince millions of people to work on a project
00:25:45.920 together by telling them some ideology or some mythology. And here the facts don't matter so much.
00:25:52.800 If you try to build a very powerful ideology and you ignore the facts, your ideology is still likely to
00:25:59.760 explode with a very, very big bang. And in most cases in history, the people who are experts
00:26:07.280 in nuclear physics, for instance, get their orders from the people who are experts in Shiite theology
00:26:14.880 or in Jewish theology or in communist ideology or in Nazi ideology. It's the people who are experts in
00:26:23.040 the truth usually get orders from the people who are experts in order. And this is something that
00:26:30.800 very often scientists and engineers don't understand. That they work, I don't know, on AI and they think
00:26:39.720 that the scientists and the engineers will decide what to do with it. But no. Once you produce this
00:26:46.660 very powerful technology because you know the facts about the world, then you will get the people who
00:26:53.500 are experts in mythology and theology coming in and telling you, thank you very much for producing this
00:26:59.160 very powerful technology. Now we will decide what to do with it.
00:27:03.140 Is there a truly benign and wise version of this? Because what you seem to be describing is a yet
00:27:11.740 another reason for cynicism and distrust.
00:27:14.320 So this is important. Now, it's very hard to create large-scale societies without any fictions.
00:27:22.420 Even money is a fiction. Even, you know, think about what is the last thing that still holds American
00:27:27.100 society together. What is the last thing that Republicans and Democrats still agree on?
00:27:32.240 It's the dollar. And even this is under attack, you know, from cryptocurrencies and so forth.
00:27:36.980 But almost the last story that still holds the place together is that everybody agrees on the
00:27:41.960 value of a dollar, which is just a fiction.
00:27:44.260 I think we hit this in a previous conversation. There's something a little confusing about
00:27:47.780 your use of the word fiction here. Because fiction, in any kind of context where we're talking about
00:27:52.100 the truth sounds intrinsically pejorative, right? So like this is, there's truth and there's fiction.
00:27:57.920 It's not pejorative.
00:27:58.700 You're talking about conventions. Something that's conventionally constructed or socially constructed.
00:28:01.660 Something that comes out of the human imagination and not from reality.
00:28:04.700 Yeah.
00:28:04.940 I mean, the value of the dollar is purely an imaginary reality.
00:28:09.200 Exactly, yeah.
00:28:09.800 The paper, the dirty paper in your pocket.
00:28:12.040 And most dollars are not even paper. They're just digital tokens in computers.
00:28:15.840 Right.
00:28:15.960 They have no objective value.
00:28:18.240 Yeah.
00:28:18.520 They have value only in our imagination. In this sense, the dollar is a fiction.
00:28:22.080 It's a socially constructed reality.
00:28:23.600 Yes. And now the big question comes, so is everything just a conspiracy? Is everything
00:28:30.400 just a fiction? And the key thing is that fictions can be extremely valuable and positive provided
00:28:38.380 you acknowledge they are fictions. I don't think that the dollar is a bad thing. I don't think
00:28:44.560 that the fictions holding society together are a bad thing as long as you acknowledge the
00:28:49.860 reality that this is a man-made imaginary thing. And this is important because then you can correct
00:28:57.280 it. Then you can make amendments in it. Let's compare two texts that are foundation texts for
00:29:04.400 holding human society together. The Ten Commandments and the U.S. Constitution.
00:29:09.660 I have a preference.
00:29:10.280 Both are fictional in the sense that they came out of the human imagination. But one text refuses
00:29:18.740 to acknowledge this reality, and the other text is fully honest about it. So the Ten Commandments,
00:29:26.360 they start with, I am your Lord God. This text, it claims to be written by God, not by any human,
00:29:35.660 which is why it contains no mechanism for correcting its errors. And if some of our listeners are
00:29:44.120 outraged, errors in the Ten Commandments, what could possibly be wrong with don't kill and don't
00:29:49.460 steal?
00:29:49.900 I don't think I have those listeners. You give me too much credit.
Yes, but maybe for the one or two who are still left out there. Notice if you read, for instance, the Ten Commandments, that they endorse slavery. The Ten Commandments say that you should not covet your
00:30:05.940 neighbor's house or field or slaves, which implies that God is perfectly okay with people holding
00:30:14.000 slaves. It's just God doesn't like it when you covet the slaves of your neighbors. No, these are his
00:30:21.080 slaves. Don't covet them. Now, because the text doesn't recognize that this is the creation of
00:30:27.740 the human imagination, there is no 11th Commandment, which says, well, if you discover something wrong
in the previous Ten Commandments, by a two-thirds majority, you can vote on changing Commandment
00:30:40.740 number 10. No mechanism. So we still have the same text from the Iron Age until today. Now,
00:30:48.520 the U.S. Constitution, it's also a foundational text which gives instructions for people to how
00:30:54.800 to manage their society. It also came out of the human imagination. It's not an objective reality.
00:31:02.740 It's also, in this sense, a fiction. But it is honest. It starts with we the people. We the people
00:31:09.760 wrote this text. And because we are fallible human beings, maybe we made some mistakes, like endorsing
00:31:16.660 slavery. So we also include in this text a mechanism to amend it, to identify and correct
00:31:23.220 its own mistakes. It's not easy, but it can be done. And it has been done. So fictions can be
00:31:30.180 extremely valuable. We need them. We cannot have a large-scale society without them. But they should
00:31:35.920 be honest about their own fictional nature, which gives us the ability to identify and correct
00:31:44.820 our mistakes or the mistakes of our ancestors.
00:31:47.880 But there's something intrinsically conservative about this picture. Because admitting that something
00:31:54.080 is a fiction or a convention is not to say that you should want to revise it impetuously, right?
00:32:01.660 Absolutely not.
00:32:02.380 It's very good that very few people try to rethink the convention about whether to drive on the left
00:32:07.840 side or the right side of the road on a daily basis, right? I mean, it's clearly arbitrary,
00:32:11.620 but it's crucially important that we all, once we've decided, that we not keep rethinking it.
Yeah. And there are many things like that. And so with this distrust in institutions, what has
00:32:21.660 grown so corrosive is the sense that the appropriate response to each of the errors, however embarrassing
00:32:29.300 they have been of late, is to break fully with the institutions and in some sense reinvent
00:32:36.200 civilization on your own. You find some place to stand where you can reboot from. And that actually
00:32:42.260 seems to be, I mean, obviously there's a populist version of this, which we can talk about. I think
00:32:47.580 we should talk about populism. But this message seems to be coming from on high. I mean, you've
00:32:53.340 mentioned Donald Trump and Elon Musk as prime offenders. They're prime offenders on this very point.
00:32:59.020 They sow distrust in our institutions on such a fundamental level and at such scale.
00:33:05.480 It's coming from the right and from the left. It's the populist position. It's the Marxist
position. Let's destroy the old world and create a new world in its place. This is in the Internationale, the hymn, how do you call it, of communism. And it's also on the populist right. And, you know,
00:33:23.320 as a historian, I tend to be conservative in the deep sense of the word, in the Burkean sense of
00:33:28.980 the word. I mean, what you saw recently all over the world is the suicide of conservative parties
in that they abandoned conservative values and became revolutionary parties. Like the Republican Party today in the United States is a revolutionary party. Revolutionary in the sense-
00:33:44.780 It's a personality cult. Yeah.
00:33:45.420 Not just that. It says that, you know, all the institutions are rotten. They cannot be reformed.
00:33:50.340 We just need to destroy all of it and start again. Which, I mean, people can say, this is true.
00:33:55.640 This is false. Let's leave that aside. But just look at the structure of it. It's, this is what
00:34:01.840 revolutionary parties look like. They say things are so corrupted. Things are so out of order that
00:34:08.620 the only thing left to do is to destroy all the existing institutions and start from scratch,
00:34:14.040 which was the Leninist position, the Bolshevik position a hundred years ago. And now it's the
position of many so-called conservative parties. The traditional insight of
00:34:25.180 conservatism is that yes, institutions are flawed. Institutions can be corrupted, but it takes
00:34:31.500 generations to build a functioning society, a functioning institution. Humans don't really
00:34:38.780 have the capacity to understand the full complexity of reality and to invent a perfect society from
00:34:46.880 scratch. It just can't be done. Every time we try to do it, it leads to disaster. Even worse than the
things that we try to correct. You cannot create a perfect society. So move more
00:35:01.220 slowly, be more respectful of the existing institutions and traditions. They need correction. They need
00:35:09.340 amendment. They need improvement, but just destroying them completely and starting from scratch.
00:35:14.880 I mean, you have hundreds of years of kind of previous corrections from real things that happened
00:35:22.120 in history that are baked into the system. Be very careful before you throw all of it out
00:35:28.560 and try to start from scratch. Okay. So what do we do about social media? Given that picture,
00:35:34.920 what, what advice do you have for correcting the obvious pathology we see here? I mean,
00:35:39.900 if you could give advice to Elon Musk or Mark Zuckerberg, I mean, one thing, two things,
00:35:45.800 corporations should be liable for the actions of their algorithms and only humans have freedom of
00:35:54.740 speech. Bots and algorithms do not have freedom of speech and they should never masquerade as humans.
So these are the two main things that are needed to correct social media.
00:36:07.900 So they're publishers. These platforms should be viewed as publishers and their algorithm,
00:36:12.960 the tuning of the algorithm is an editorial choice.
Absolutely. It's strange to think about, but one of the first jobs that was fully
00:36:22.680 automated was not taxi drivers. It was not textile workers. It was editors, news editors. It's amazing
00:36:29.320 to think about it. It was automated. The job that once belonged to Lenin and Mussolini is now being
00:36:35.820 done by algorithms. I mean, Lenin, before he was Soviet dictator, he was editor of a newspaper,
00:36:41.760 Iskra. And Mussolini also, he rose to power from being editor of the newspaper Avanti. So this was
00:36:48.800 like the promotion scale, the promotion ladder, editor of a newspaper, dictator of the country.
00:36:53.320 And this is for a good reason. I mean, the editors of, of news, they sit at one of the most important
00:36:59.840 junctions in society. They shape the conversation. They decide what people would be aware of, what
people would be discussing. And this is one of the most important positions in society. And
00:37:14.240 now it's been taken over by algorithms because again, in, in Iskra, it was Lenin who decided
00:37:21.580 what would be the top story of the day, what would be the main headline. And on Facebook,
it's an algorithm deciding what is at the top of your feed, of your news feed.
00:37:33.240 But both of those sound bad. So you're, if you're going to give people a choice between
00:37:36.440 Lenin and an algorithm, they're going to take the algorithm.
00:37:38.480 So the thing is that in, in, not in a dictatorship, in a democracy, newspapers and other news outlets
are liable for their decisions. Like if the editor of the New York Times decides to publish some conspiracy theory or fake news on the front page of the New York Times, he cannot hide or she cannot hide behind the argument, "but free speech, there are some people who believe it's true, so I put it on the front page of the New York Times." No, your job as editor is not just to put something random there or something that would please people. Your job is to fact check and to take responsibility for these decisions and to make sure that if you publish something on the front page of the New York Times, you better be sure that this is accurate and this is responsible. And if you
00:38:31.680 don't know how to do it, then you're in the wrong job. And what, again, I will tell to, you know,
00:38:37.340 Facebook, to Twitter, to TikTok, be very, very careful before you censor human users. I mean, if a human
00:38:45.440 user decided to publish even a lie, even a fake news, even a conspiracy theory, I would be extremely
careful before I shut down their account or censor them. But if my algorithm as the
00:39:00.580 corporation then decides to promote this particular piece of fake news or this particular conspiracy
00:39:08.080 theory, this is on me. This is on the company, not on the user.
00:39:12.280 The promotion, it's the amplification, not the fact that it exists on the network in the first
00:39:16.720 place because they can't possibly prevent, and they have billions of pieces of content arriving
00:39:22.200 every day, right? So they can't, they can't guarantee that there'll be no malicious lies or even child
00:39:27.740 pornography on their, on their network. And they need, it's the same way that, you know, people send
letters to the New York Times every day. So it's not the job of the New York Times to censor them, but don't publish them on your front page unless you're sure that you did, you know,
00:39:42.160 and sometimes, okay, sometimes you would make mistakes, but still your job is to fact check
and to think about the implications of this story and then take a very
00:39:54.440 responsible decision about what you choose to promote. The other thing is that social media
00:40:01.480 should reserve freedom of speech to human beings, not to bots and to algorithms. And in particular,
00:40:09.040 we should ban fake humans, counterfeit humans. If a bot pretends to be a human, we should know about
00:40:17.520 it. Like if you see that some story on Twitter gains a lot of traction, a lot of traffic, and you,
00:40:25.660 you think to yourself, oh, lots of humans are interested in this story. This must be important.
00:40:32.400 I also want to know what everybody's talking about. And you also start, you click on it,
00:40:37.540 you also start commenting on it, but actually the entities that at least originally pushed this
00:40:46.120 story to the top of the conversation, they were not humans. They were bots working in the service of,
00:40:53.380 of Putin or whoever. This is wrong. We should not have fake humans shaping the human conversation.
00:41:02.060 You know, democracy is a conversation. Imagine it as a group of people standing in a circle,
00:41:08.360 talking with each other. Suddenly a group of robots join the circle and talk very loudly,
00:41:16.100 pretending to be humans.
00:41:16.880 Very persuasively. They pretend to be humans and you don't know. You can't tell who are the humans
00:41:22.520 and who are the robots. The conversation breaks down.
00:41:25.880 To be clear, you're not against bots of various kinds. You just think they should be declared as bots.
00:41:31.040 Absolutely. Again, if you have a medical bot and you want to consult with that bot about some
00:41:36.840 medical condition, I mean, soon we'll have AI doctors with capabilities far beyond human doctors.
00:41:42.620 I'm not against that. They can improve healthcare dramatically. They can help provide better healthcare
00:41:48.880 for billions of people. But when I talk with a non-human entity, I want to know that this is a
00:41:54.920 non-human entity, that I'm not talking with a human being. They are welcome to join the conversation
00:41:59.620 on condition that they don't masquerade as humans.
00:42:04.240 What you're arguing for, essentially, and I think this is a phrase you use in the book,
00:42:08.620 that what we need are benevolent networks that have a fiduciary responsibility to their users.
00:42:15.420 Yeah, and it's a very old principle. I mean, we don't need to invent anything new in this respect.
00:42:19.380 Like, if you think about your doctor or your therapist or your accountant or your lawyer,
00:42:25.640 for centuries, we already had these regulations and understanding that they have access to extremely
00:42:33.280 private information, to potentially explosive information about us that could maybe ruin our
00:42:40.240 life. And they have a fiduciary duty to use that information only for our interests,
00:42:46.920 except in very extreme circumstances when there is a crime or something. But our doctor,
00:42:52.380 for instance, cannot take my personal information and sell it to third parties for profits. And the
00:42:59.420 same principle should hold with our relationship with the high-tech giants. I mean, they should have
00:43:05.260 the same responsibilities.
00:43:06.920 How do you think about this trade-off between efficiency and inefficiency? Inefficiency sounds like
00:43:13.080 it's a bug. But as you point out in the book, there are places where it's a feature because it's a
00:43:17.340 bulwark against totalitarianism. And yet we want a certain kind of efficiency so as to be able to
00:43:23.440 find malicious actors and terrorists, et cetera. So how do you view that in a reasonably well-functioning
00:43:30.880 democracy that has institutions that are error-correcting both with respect to truth and with respect to
00:43:37.440 order? How would you, if you could get your hands on the dial of efficiency, how would you tune it?
00:43:44.100 I mean, that's the democratic conversation. We avoid the extremes and find the middle path,
00:43:48.900 and you're bound to make mistakes. So keep correcting your mistakes. It's not like there is a magic bullet
00:43:55.420 that solves it once and for all. So, you know, what is the right level of surveillance? What is the
00:44:02.020 right level of immigration? You know, this is what we have the democratic debate for. If you go for an
00:44:07.260 extreme position that, you know, humans have a right to immigrate to anywhere they like in as huge
numbers as they like, this is completely unfeasible. Open borders, yeah. So again, how many
immigrants a country wants to absorb and under what conditions, let's discuss. Different people have
00:44:29.580 different views. I don't think that people who want a more strict immigration policy, this immediately
00:44:35.280 turns them into fascists and Nazis. And similarly, people who want more lenient immigration policies,
00:44:41.720 less restrictive, that doesn't turn them immediately into traitors who want to destroy the country.
00:44:48.400 Let's have a conversation and try this policy and try that policy. It should not be a kind of all-out
00:44:55.640 war between good and evil. And the same goes for the level of surveillance and the same goes for the
00:45:01.500 level, again, of free speech. I mean, in all these cases, we need to find the middle path and it's
00:45:07.680 difficult. And we need to start with the assumption that we are not infallible and that other people
00:45:14.800 might have good ideas about these questions. Okay. So let's take this general framework that you've
00:45:20.420 sketched in your book and look at a few current events. I mean, there really is too much to talk
00:45:25.400 about, but we have the U.S. election. We have the ongoing war in Ukraine. We have the ongoing war
00:45:31.500 between Israel now on at least two fronts, Israel and her enemies. Let's start with the U.S. election.
00:45:39.220 How do you view our circumstance here? I mean, there has been, we really are the poster child for a lot
00:45:46.140 of the dysfunction you describe more generically in your book. I mean, there's just a pervasive sense
that I think social media doesn't fully explain it, but it certainly has amplified the
00:45:57.120 problem. There's a pervasive sense that we have, we've lost the capacity to speak to one another
00:46:02.740 about rather fundamental issues. And we're just hurtling towards some political catastrophe here.
00:46:10.260 So we have an, we have an election, which however it goes, it's quite plausible to imagine that half
00:46:17.380 the country won't believe the results, right? Given, given what has happened in recent years. So how do
00:46:23.400 we pull back from the brink here? So historically there are two big dangers for the survival of
00:46:28.640 democracies. And you can see both of them now in the U.S. One big danger is what we discussed earlier.
Democracy is this system where you give power to a person, to a party, to try out some policies
00:46:45.180 on condition that they give the power.
00:46:48.500 If you'd like to continue listening to this conversation, you'll need to subscribe at
00:46:52.900 samharris.org. Once you do, you'll get access to all full-length episodes of the Making Sense
00:46:58.200 podcast. The podcast is available to everyone through our scholarship program. So if you can't
00:47:03.400 afford a subscription, please request a free account on the website. The Making Sense podcast
00:47:08.440 is ad-free and relies entirely on listener support. And you can subscribe now at samharris.org.