Yuval Noah Harari is a historian and meditator. His new book, 21 Lessons for the 21st Century, is about the present and the near future: AI, automation, liberal democracy, and the stories we live by. In this episode, I talk to Yuval about how he thinks about the past and the present, and why he thinks that the past has something to teach us about the future. We also talk about the role that meditation plays in his intellectual life and in understanding the difference between fiction and reality. And, as always, thank you for listening to the Making Sense Podcast. This podcast is made possible entirely through the support of listeners; please consider becoming a subscriber.
00:03:19.720So, Yuval, you have these books that just steamroll over all other books, and I know because I write
00:03:29.940books. So, you wrote Sapiens, which is kind of about the deep history of our species.
00:03:38.940Which is really about the history of humanity. And then you wrote Homo Deus, which is about our far
00:03:44.640future. And now you've written this book, 21 Lessons for the 21st Century, which is about the
00:03:50.780present. I can't be the only one in your publishing world who notices that now you have nothing left to
00:03:55.880write about. So, good luck with that career of yours. So, how do you describe what you do? Because
00:04:05.120you're a historian. I mean, one thing that you and I have in common is that we have a reckless disregard
00:04:10.220for the boundaries between disciplines. I mean, you just touch so many things that are not
00:04:15.340straightforward history. How do you think about your intellectual career at this point?
00:04:20.940Well, my definition of history is that history is not the study of the past. It's the study of change.
00:04:28.180How things change. And yes, most of the time you look at change in the past. But in the end, all the
00:04:36.140people who lived in the past are dead. And they don't care what you write or say about them.
00:04:42.940So, if the past has anything to teach us, it should be relevant to the future and to the present
00:04:49.100also. But you touch biology and the implications of technology. I follow the questions. And the
00:04:59.040questions don't recognize these disciplinary boundaries. And as a historian, maybe the most
00:05:06.160important lesson that I've learned as a historian is that humans are animals. And if you don't take
00:05:12.600this very seriously into account, you can't understand history. Of course, I'm not a biologist.
00:05:19.560I also know that humans are a very special kind of animal. If you only know biology, you will not
00:05:25.620understand things like the rise of Christianity or the Reformation or the Second World War. So you
00:05:32.860need to go beyond just the biological basis. But if you ignore this, you can't really understand
00:05:41.220anything. Yeah. And the other thing we have in common, which gives you, to my eye, a very unique
00:05:48.020slant on all the topics you touch, is an interest in meditation and a sense that our experiences in
00:05:54.320meditation have changed the way we think about problems in the world and questions like just what
00:06:02.300it means to live a good life or even whether the question of the meaning of life is an intelligible
00:06:09.200one, or a valid one, or one that needs to be asked. How do you view the influence of the contemplative
00:06:17.700life on your intellectual pursuits? I couldn't have written any of my books, either Sapiens or Homo Deus or 21
00:06:25.400lessons without the experience of meditation, partly because of just what I learned about the human mind
00:06:33.900from observing the mind, but also partly because you need a lot of focus in order to be able to
00:06:43.020summarize the whole of history into like 400 pages. And meditation gives you this kind of ability
00:06:52.580to really focus. I mean, my understanding of at least the meditation that I practice is that the
00:06:59.520number one question is what is reality? What is really happening? To be able to tell the difference
00:07:07.800between the stories that the mind keeps generating about the world, about myself, about everything,
00:07:15.980and the actual reality. And this is what I try to do when I meditate. And this is also what I try to do
00:07:23.560when I write books to help me and other people understand what is the difference between fiction
00:07:31.400and reality. Yeah, yeah. And I want to get at that difference because you use these terms in slightly
00:07:38.780idiosyncratic ways. So I think it's possible to be confused about how you use terms like story
00:07:45.920and fiction. For instance, just the way you talk about the primacy of fiction, the primacy of story,
00:07:54.180the way in which our concepts that we think map onto reality don't really quite map onto reality,
00:08:02.300right? And yet they're nonetheless important. That is, in a way that you don't often flag in your
00:08:10.540writing, a real meditator's-eye view of what's happening here. I mean, it's like you're
00:08:16.460giving people the epiphany that certain things are made up, like the concept of money, right? Like
00:08:23.080the idea that we have dirty paper in our pocket that is worth something, right? That is a convention
00:08:28.140that we've all agreed on. But it's an idea. It only works because we agree that it works.
00:08:35.100But the way you use the words story and fiction rather often seems to denigrate these things a
00:08:44.440little bit more than I'm tempted to do when I talk about it. I don't say that there is anything wrong
00:08:49.340with it. Stories and fictions are a wonderful thing, especially if you want to get people to
00:08:55.900cooperate effectively. You cannot have a global trade network unless you agree on money. And
00:09:03.540you cannot have people playing football or baseball or basketball or any other game unless you get
00:09:09.180them to agree on rules that quite obviously we invented. They did not come from heaven. They did
00:09:16.460not come from physics or biology. We invented them. And there is nothing wrong with people agreeing,
00:09:23.160accepting, let's say for 90 minutes, the story of football, the rules of football, that scoring
00:09:29.540a goal is the goal of the whole game, and so forth. The problem begins only when people
00:09:36.960forget that this is only a convention, this is only something we invented. And they start confusing it
00:09:46.060with, kind of, this is reality, this is the real thing. And in football, it can lead
00:09:52.820to hooligans beating each other up or killing people because of this invented game. And on a higher
00:10:00.000level, it can lead to, you know, to world wars and genocides in the name of fictional entities like
00:10:07.500gods and nations and currencies that we've created. Now, there is nothing wrong with these creations
00:10:15.600as long as they serve us instead of us serving them. But wouldn't you acknowledge that there's a
00:10:23.480distinction between good stories and bad stories? Yeah, certainly. The good stories are the ones that
00:10:30.740really serve us, that help people, that help other sentient beings live a better life. I mean,
00:10:36.240it's as simple as that. I mean, of course, in real life, it's much more complicated to know what will be
00:10:42.520helpful and what won't, and so forth. But a good starting place is just to have this basic ability
00:10:49.040to tell the difference between fiction and reality, between our creations and what's really out there,
00:10:58.000especially when, for example, you need to change the story. Or a story which was very adapted to one
00:11:07.440condition is less adapted to a new condition, which is, for example, what I think is happening now
00:11:14.240with the story of liberal democracy. It was probably one of the best stories ever
00:11:22.720created by humanity. And it was very adapted to the conditions of the 20th century. But it is less
00:11:31.480and less adapted to the new realities of the 21st century. And in order to kind of reinvent the
00:11:39.080system, we need to acknowledge that to some extent, it is based on stories we have invented.
00:11:48.800Right. But so when you talk about something like human rights being a story or a fiction,
00:11:54.400that seems like a story or a fiction that shouldn't be on the table to be fundamentally
00:12:01.900revised. Right. Like that's where people begin to worry that to describe these things as stories
00:12:07.680or fictions is to suggest tacitly, if I don't think you do this explicitly, that all of this stuff is
00:12:15.900made up and therefore it's all sort of on the same level. And yet there's clearly a distinction,
00:12:22.240one you make in your book, between dogmatism and the other efforts we make to justify
00:12:29.680our stories. Right. There are stories that are dogmatically asserted, and religion has more
00:12:34.640than its fair share of these. But there are political dogmas, there are tribal dogmas of all kinds,
00:12:39.920you know, nationalism can be anchored to dogma. And the mode of asserting a dogma is to be
00:12:47.080doing so without feeling responsible to counterarguments and demands for evidence and reasons
00:12:54.860why. Whereas with something like human rights, we can tell an additional story about why we value
00:13:01.360this convention. Right. Like, it doesn't have to be a magical story. It doesn't have
00:13:06.080to be that we were all imbued by our creator with these things. But we can talk for a long
00:13:13.280time, without saying it's just so, to justify that convention. Yeah. I mean, human rights is
00:13:21.060a particularly problematic and also interesting case. First of all, because it's our story. I mean,
00:13:27.360we are very happy with you discrediting the stories of all kinds of religious fundamentalists and all
00:13:33.680kinds of tribes somewhere and ancient people, but not our story. Don't touch that. It depends what you
00:13:39.640mean by we. So I guess, for most of the people, I don't see anybody here, it could be just empty
00:13:45.880chairs and recordings of laughter, but I assume that for the people here, most of them, this is our
00:13:53.280story. The second thing is that we live in a moment when liberal democracy is under severe attack.
00:14:01.740And this was not so when I wrote Sapiens. I felt much freer writing these things back in 2011, 2012.
00:14:09.640And now it's much more problematic. And yes, I find that one of the difficulties of living
00:14:16.280right now as an intellectual, as a thinker, is that I'm kind of torn apart between the
00:14:24.800imperative to explore the truth, to follow the truth wherever it leads me, and the political realities
00:14:32.880of the present moment and the need to engage in very important political battles. And this is one of
00:14:41.680the costs, I think, of what is happening now in the world, that it restricts our ability or our freedom
00:14:52.400to truly go deep and explore the foundations of our system. And I still feel the importance of doing it, of
00:15:06.260questioning even the foundations of liberal democracy and of human rights, simply because I think that as we have
00:15:15.640defined them since the 18th century, they are not going to survive the tests of the 21st century.
00:15:23.640And it's extremely unfortunate that we have to engage in this two-front battle: at the same moment, we have to defend these ideas from people who look at them from the perspective of nostalgic fantasies.
00:15:43.640They want to go back before the 18th century. And at the same time, we have to also go forward and think about what the new scientific discoveries and technological developments of the 21st century really mean for the core ideas. What do human rights mean when you are starting to have superhumans?
00:16:10.040Do superhumans have superhuman rights? What does the right of freedom mean when we have now technologies that simply undermine the very concept of freedom?
00:16:25.540When we created this whole system, not we, somebody, back in the 18th and 19th centuries,
00:16:35.180we gave ourselves all kinds of philosophical discounts, by not really going deeply enough into some of the key questions. Like, what do humans really need? And we settled for answers like, just follow your heart.
00:16:57.800This is Joseph Campbell. I blame Joseph Campbell for follow your bliss.
00:17:01.140No, but follow your heart. The voter knows best. The customer is always right. Beauty is in the eye of the beholder.
00:17:08.300All these slogans were a kind of cover for not engaging more deeply with the question of what human freedom really is and what humans really need.
00:17:22.320And for the, for the last 200 years, it was good enough.
00:17:28.000But now to just follow your heart is becoming extremely dangerous and problematic when there are corporations and organizations and governments out there that for the first time in history can hack your heart.
00:17:44.820And your heart might now be a government agent, and you don't even know it.
00:17:51.040So telling people in 2018, just follow your heart, is much, much more dangerous advice than in 1776.
00:18:00.920Yeah. So let's drill down on that circumstance.
00:18:04.340So we have this claim that liberal democracy is one, under threat, and two, might not even be worth maintaining as we currently conceive it, given the technological changes that are upon us or will be upon us.
00:18:21.400Well, it is worth maintaining. It's just becoming more and more difficult.
00:18:24.980Well, presumably there are things about liberal democracy that are serious bugs and not features, in light of the fact that, as you say, it's all a matter of putting everything to a vote, and we are all part of this massive psychological experiment where we're gaming ourselves with algorithms, written by some people in this room, that not only confuse us with respect to what's in our best interest,
00:18:53.620but also hijack the very tool we would use to decide what's worth wanting.
00:19:01.720It's one thing to be wrong about how to meet your goals.
00:19:05.240It's another thing to have the wrong goals and not even know that.
00:19:09.300It's hard to know where ground zero is for cognition and emotion if all of this is susceptible to outside influence,
00:19:19.180which ultimately we need to embrace because there is a possibility of influencing ourselves in ways that open vistas of well-being and peaceful cooperation that we can't currently imagine or we can't see how to get to.
00:19:35.360So it's not like we actually want to go back to when there was no, quote, hacking of the human mind.
00:19:40.460Every conversation is an attempted hack of somebody else's mind, right?
00:19:44.260So we're just getting, it's getting more subtle now.
00:19:47.360Yeah, it's, you know, throughout history, other people and governments and churches and so forth,
00:19:54.980they all the time tried to hack you and to influence you and to manipulate you.
00:20:01.280They just weren't very good at it because humans are just so incredibly complicated.
00:20:07.940And therefore, for most of history, this idea that I have an inner arena, which is completely free from external manipulation,
00:20:00.000that nobody out there can really understand what's happening within me, was more or less true in practice.
00:21:51.940Similarly, with the whole idea of shifting authority from humans to algorithms.
00:22:01.760So I trust the algorithm to recommend TV shows for me.
00:22:06.040And I trust the algorithm to tell me how to drive from Mountain View to this place this evening.
00:22:12.700And eventually I trust the algorithm to tell me what to study, and where to work, and whom to date, and whom to marry, and who to vote for.
00:22:23.180And then people say, no, no, no, no, no, no.
00:22:35.980And if the yardstick is that, in order to trust the algorithm, to give authority to the algorithm, it needs to make perfect decisions, then yes, it will never happen.
00:22:50.420The algorithm just needs to make better decisions than me about what to study, and where to live, and so forth.
00:22:58.120And this is not so very difficult, because as humans, we often tend to make terrible mistakes, even in the most important decisions in life.
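To make the yardstick concrete, here is a minimal sketch in Python of the threshold argument being made here; the function name and the accuracy figures are invented purely for illustration, not drawn from the conversation:

```python
# Minimal sketch of the "better than me, not perfect" yardstick.
# The accuracy figures below are hypothetical, purely for illustration.

def should_delegate(algo_accuracy: float, human_accuracy: float) -> bool:
    """Delegate a class of decisions to the algorithm if and only if its
    expected accuracy beats the human baseline; perfection is not required."""
    return algo_accuracy > human_accuracy

human_accuracy = 0.55  # humans often choose badly, even on major life decisions
algo_accuracy = 0.70   # far from perfect, but better than the baseline

print(should_delegate(algo_accuracy, human_accuracy))  # True
```

On this framing, the bar for handing a decision to an algorithm is comparative, not absolute, which is why "algorithms make mistakes" is not by itself a counterargument.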
00:24:11.820It takes a lot of things that you just don't have.
00:24:14.660And it's not just a bug of liberal democracy.
00:24:17.180It's true of any socio-economic or political system: you could not build a communist regime in 16th-century Russia.
00:24:27.520I mean, you can't have communism without trains and electricity and radio and so forth.
00:24:34.060Because in order to make all the decisions centrally, if the slogan is that you work, they take everything, and then they redistribute according to needs, each one working according to their ability and getting according to their need, then the key problem is really a problem of data processing.
00:24:54.100How do I know what everybody is producing, how do I know what everybody needs, and how do I shift the resources, taking wheat from here and sending it there?
00:25:07.460In 16th century Russia, when you don't have trains, when you don't have radio, you just can't do it.
00:25:14.820So as technology changes, it's almost inevitable that the socio-economic and political systems will change.
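As a toy illustration of this data-processing framing, consider the following Python sketch, with invented regions and tonnages: the arithmetic of redistribution is trivial once the data exists; the hard part, in a world without trains, radio, or telegraph, is collecting the production and needs figures at all, and then physically moving the grain.

```python
# Toy sketch of central planning as a data-processing problem.
# Regions and tonnages are invented for illustration.

production = {"Volga": 120, "Urals": 80, "Moscow": 30}  # wheat produced (tons)
needs      = {"Volga": 60,  "Urals": 70, "Moscow": 90}  # wheat needed (tons)

# Step 1: the planner must first KNOW these numbers, which is the part
# a 16th-century state could not do at scale or in time.
surplus = {r: production[r] - needs[r] for r in production}

senders   = [(r, s)  for r, s in surplus.items() if s > 0]
receivers = [(r, -s) for r, s in surplus.items() if s < 0]

# Step 2: route grain from surplus regions to deficit regions (greedy match).
transfers = []
for src, available in senders:
    for i, (dst, shortfall) in enumerate(receivers):
        if available == 0:
            break
        moved = min(available, shortfall)
        if moved > 0:
            transfers.append((src, dst, moved))
            available -= moved
            receivers[i] = (dst, shortfall - moved)

print(transfers)  # [('Volga', 'Moscow', 60)]
```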
00:25:51.960I mean, we had a moment in the sun that seemed, however delusionally, to be kind of outside of history.
00:26:00.600You know, it's like the first moment in my life where I realized I was living in history was September 11, 2001.
00:26:06.320But before that, it just seemed like people could write books with titles like The End of History.
00:26:12.360And we sort of knew how this was going to pan out, it seemed.
00:26:17.340Liberal values were going to dominate the character of a global civilization, ultimately.
00:26:23.660We were going to fuse our horizons with people of however disparate background.
00:26:30.620You know, someone in a village in Ethiopia was eventually going to get some version of the democratic, liberal notion of human rights and the primacy of rationality and the utility of science.
00:26:46.760So religious fundamentalism was going to be held back and eventually pushed all the way back, and irrational economic dogmas that had proved merely harmful would be pushed back too.
00:26:58.900And we would find an increasingly orderly and amicable collaboration among more and more people.
00:27:07.100And, like I say, we would get to a place where war between nation-states would be less and less likely, to the point where, by analogy, it would make no more sense than a war between states internal to a country like the United States, a war between Texas and Oklahoma, right?
00:27:24.420But how is that possibly going to come about?
00:27:37.720There's a xenophobic strand to our politics that is just immensely popular, both in the U.S. and in Western Europe.
00:27:46.880And this anachronistic, nativist reaction, as you spell out in your most recent book, is being kindled by a totally understandable anxiety around technological change.
00:28:03.040I mean, we're talking about people who are sensing, it's not the only source of xenophobia and populism, but there are many people who are sensing the prospect of their own irrelevance, given the dawn of this new technological age.
00:28:19.400What are you most concerned about in this present context?
00:28:23.340I think irrelevance is going to be a very big problem.
00:28:27.300What already fuels much of what we see today with the rise of populism is the fear, the justified fear, of irrelevance.
00:28:37.680If in the 20th century, the big struggle was against exploitation, then in the 21st century, for a lot of people around the world, the big struggle is likely to be against irrelevance.
00:28:49.040And this is a much, much more difficult struggle.
00:28:51.760So, a century ago, you felt, if you were the common person, that there were all these elites that exploit me.
00:29:02.600Now you increasingly feel, as a common person, that there are all these elites that just don't need me.
00:29:23.940First of all, because you're completely expendable.
00:29:27.620If, a century ago, you mount a revolution against exploitation, then you know that, when worse comes to worst, they can't shoot all of us, because they need us.
00:30:04.860And again, our vision of the future is often colored by the recent past.
00:30:10.660The 19th and 20th century were the age of the masses, where the masses ruled.
00:30:17.080And even authoritarian regimes, they needed the masses.
00:30:21.060So you had these mass political movements like Nazism and like communism.
00:30:26.520And even somebody like Hitler or like Stalin, they invested a lot of resources in building schools and hospitals and having vaccinations for children and sewage systems and teaching people to read and write.
00:30:44.040Not because Hitler and Stalin were such nice guys, but because they knew perfectly well that if they wanted, for example, Germany to be a strong nation with a strong army and a strong economy,
00:30:58.040they needed millions of people, common people, to serve as soldiers in the army and as workers in the factories and in the offices.
00:31:07.900So some people could be expendable and could be scapegoats like the Jews, but on the whole, you couldn't do it to everybody.
00:36:36.120But universal and basic, they are ill-defined.
00:36:40.580Most people, when they speak about universal basic income, they actually have in mind national basic income.
00:36:48.740They think in terms of, okay, we'll tax Google and Facebook in California and use that to pay unemployment benefits or give free education to unemployed coal miners in Pennsylvania and unemployed taxi drivers in New York.
00:37:05.060The real problem is not going to be in New York.
00:37:08.420The real problem, the greatest problem, is going to be in Mexico, in Honduras, in Bangladesh.
00:37:13.920And I don't see an American government taxing corporations in California and sending the money to Bangladesh to pay unemployment benefits there.
00:37:24.240And this is really the problem with the automation revolution.
00:37:29.440They're clapping to stop us from paying.
00:37:32.940Those are the libertarians in the audience.
00:37:37.100We've built, over the last few generations, a global economy and a global trade network.
00:37:44.480And the automation revolution is likely to unravel the global trade network and hit the weakest links the hardest.
00:37:53.020So you will have enormous new wealth, enormous new wealth created here in San Francisco and Silicon Valley.
00:38:02.100But you can have the economies of entire countries just collapse completely because what they know how to do, nobody needs that anymore.
00:38:12.340And we need a global solution for this.
00:38:16.160So universal, if by universal you mean global, taking money from California and sending it to Bangladesh, then yes, this can work.
00:38:25.440But if you mean national, it's not a real answer.
00:38:30.580How do you define what are the basic needs of human beings?
00:38:34.880Now, in a scenario in which a significant proportion of people no longer have any jobs, and they depend on this universal basic income or universal basic services,
00:38:50.420whatever they get, they can't go beyond that.
00:38:55.080This is the only thing they're going to get.
00:38:56.900Then who defines what their basic needs are?
00:39:14.280Is it just, I mean, if you're looking 50 years to the future and you see genetic engineering of your children and you see all kinds of treatments to extend life,
00:39:24.100is this the monopoly of a tiny elite, or is this part of the universal basic package?
00:39:51.520Well, so let's imagine that we begin to extend the circle, coincident with this rise in affluence.
00:40:00.620And because on some level, if the technology is developed correctly, we are talking about pulling wealth out of the ether, right?
00:40:11.260So with automation and artificial intelligence, there's more wealth; the pie is getting bigger.
00:40:16.040And then the question is how generously or wisely we will share it with the people who are becoming irrelevant because we don't need them for their labor anymore.
00:40:25.320Let's just, let's say we get better at that than we currently are.
00:40:30.140But I mean, you can imagine that we will be fast to realize that we need to take care of the people in our neighborhood, you know, in San Francisco.
00:40:38.380And we will be slower to realize we need to take care of the people in Somalia.
00:40:43.700But maybe these lessons will just be learned the hard way.
00:40:46.860We'll realize that if we don't take care of the people in Somalia, a refugee crisis, unlike any we've ever seen, will hit us in six months, right?
00:40:56.080So there'll be some completely self-serving reason why we need to eradicate famine or some other largely economic problem elsewhere.
00:41:06.920But presumably we can be made to care more and more about everyone, again, if only out of self-interest.
00:41:15.540What are the primary impediments to our doing that?
00:41:28.440I think we need for a number of reasons to develop global identities, a global loyalty, a loyalty to the whole of humankind and to the whole of planet Earth.
00:41:41.500So this is a story that becomes so captivating that it supersedes other stories that seem to say, Team America.
00:42:31.580So I'm not saying let's abolish all other identities, and from now on we are just citizens of the world.
00:42:37.880But we can add this kind of layer of loyalty to the previous layers.
00:42:44.760And this, you know, people have been talking about it for thousands of years.
00:42:48.040But now it really becomes a necessity, because we are now facing three global problems, which are the most important problems of humankind.
00:43:00.320And it should be obvious to everybody that they can only be solved on a global level, through global cooperation.
00:43:07.380These are nuclear war, climate change, and technological disruption.
00:43:11.040It should be obvious to anybody that you can't solve climate change on a national level.
00:43:19.320You can't build a wall against rising temperatures or rising sea levels.
00:43:26.520No country, not even the United States or China, is ecologically independent.
00:43:33.280There are no longer independent countries in the world, if you look at it from an ecological perspective.
00:43:38.680Similarly, when it comes to technological disruptions, the potential dangers of artificial intelligence and biotechnology should be obvious to everybody.
00:43:50.620You cannot regulate artificial intelligence on a national level.
00:43:55.340If there is some technological development you are afraid of, like developing autonomous weapon systems, or like doing genetic engineering on human babies,
00:44:07.300then if you want to regulate this, you need cooperation with other countries.
00:44:12.680Because like the ecology, also science and technology, they are global.
00:44:18.860They don't belong to any one country or any one government.
00:44:22.920So if, for example, the United States bans genetic engineering on human beings,
00:44:30.460it won't prevent the Chinese or the Koreans or the Russians from doing it.
00:44:34.940And then a few years down the line, if the Chinese are starting to produce superhumans by the thousands,
00:44:41.660the Americans wouldn't like to stay behind.