Making Sense - Sam Harris - May 02, 2025


#412 — Better Things & Better People


Episode Stats

Length

26 minutes

Words per Minute

182.3

Word Count

4,850

Sentence Count

252

Misogynist Sentences

4

Hate Speech Sentences

4


Summary

Rutger Bregman is a Dutch historian, journalist, and author of Utopia for Realists and Humankind. He's also the founder of the School for Moral Ambition, which focuses on a new kind of activism: a call to action.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast. This is Sam Harris. Just a note to say that if
00:00:11.640 you're hearing this, you're not currently on our subscriber feed, and will only be
00:00:15.580 hearing the first part of this conversation. In order to access full episodes of the Making
00:00:19.840 Sense Podcast, you'll need to subscribe at samharris.org. There you'll also find our
00:00:24.960 scholarship program, where we offer free accounts to anyone who can't afford one. We don't run
00:00:29.340 ads on the podcast, and therefore it's made possible entirely through the support of our
00:00:33.120 subscribers. So if you enjoy what we're doing here, please consider becoming one.
00:00:45.420 I am here with Rutger Bregman. Rutger, thanks for joining me.
00:00:49.200 Thanks for having me, Sam.
00:00:50.440 Yes, nice to finally connect with you. I've been seeing your stuff for a while and just read your
00:00:55.900 book, your newest book, which is Moral Ambition, which is a little bit of a departure in tone,
00:01:01.340 but you've also written Utopia for Realists and Humankind. This is much more of a call to action,
00:01:07.520 and I want to talk about the call. You've also started the School for Moral Ambition,
00:01:13.020 which I want to talk about. But before we jump into the book, how would you summarize your focus
00:01:17.860 as a historian and just as someone who comes to all these topics we're going to talk about?
00:01:25.300 So my whole career, I've been fascinated by history. I studied history at Utrecht University
00:01:29.980 in the Netherlands, and initially I was a bit frustrated by academia. You know, it seemed so
00:01:36.020 insulated. I had this dream once of becoming a professor, and then maybe when I was 50 or 60,
00:01:42.100 I would finally be allowed to write about the big, interesting questions of history, like,
00:01:46.080 why have we conquered the globe? Why did the Industrial Revolution start in England, in the
00:01:51.840 West? Why not in India or China, for example? Those were the kind of books that I really loved,
00:01:56.760 you know, Jared Diamond, for example, Guns, Germs, and Steel. But it started to dawn on me that
00:02:02.520 I would probably have to, you know, specialize first and, you know, spend four years of my life
00:02:09.340 writing a PhD, which on the one hand seemed really interesting. But then on the other hand,
00:02:12.780 I looked at all the PhDs that had recently been published at Utrecht University, and I found all
00:02:18.020 of them really boring. So I thought, you know what, let's go into journalism. But then I found
00:02:22.900 that to be quite frustrating as well. You know, the relentless focus on breaking news, on what happens
00:02:29.380 today instead of what happens every day. And then when I was 25, I got my lucky break. It was a new
00:02:35.680 journalism platform that was founded in the Netherlands called The Correspondent. And these guys,
00:02:41.620 the founders had a bit of a different news philosophy. They wanted to unbreak the news.
00:02:46.580 And they said, Rutger, you can come and work here and write about whatever you want and focus
00:02:51.680 more on the structural forces that, you know, govern our society. So finally, I could write about all
00:02:57.940 kinds of hobbies of mine. For example, universal basic income. That was something that had long
00:03:02.640 fascinated me. It seemed to me a really exciting idea that moves beyond the traditional political
00:03:08.120 divide of the left versus the right. So as I said, that was my lucky break. That's how I got
00:03:12.680 started. And ever since then, The Correspondent was my platform, my little laboratory where I could
00:03:18.500 develop my ideas. So that's one of the benefits of not being a native speaker is that you have your
00:03:25.160 own focus group, a tiny country that no one gives a shit about. And you can test out ideas, see what
00:03:30.180 works, see what doesn't. And so that's how I've been writing my books for the past decade.
00:03:34.680 Uh, first as essays for Dutch readers and then, yeah, uh, reiterating, learning, changing my mind.
00:03:41.740 And then at some point you're like, yeah, this is a book. Let's write it.
00:03:46.240 So again, I think we're going to mostly talk about moral ambition, but, um, big picture,
00:03:51.600 how would you describe the state of the world from your point of view? And I mean, there's so much
00:03:56.980 happening in American politics, and it has so many global implications that, to my eye, we've
00:04:03.260 basically created an emergency for much of the world, at least
00:04:08.820 optically. It remains to be seen what's going to happen. You probably finished
00:04:13.480 this book about a year ago, I would imagine. What's your view of the current situation?
00:04:18.920 So the first line of my very first book, Utopia for Realists was that in the past,
00:04:23.400 everything was worse. You know, when we zoom out, we see that we've made tremendous progress in
00:04:29.620 many respects. I mean, you know this, right? The massive decline of child mortality, of extreme
00:04:34.760 poverty. Especially since the 1980s, progress has been speeding up. So that is wonderful news. And
00:04:40.700 this was more than a decade ago when I was a bit frustrated that it seemed we had arrived at the
00:04:46.360 end of history. And most of my friends on the political left, they mainly knew what they were
00:04:51.500 against, against growth, against austerity, against the establishment, but they didn't really
00:04:55.620 know what the next big thing was going to be. So in that book, I wanted to say like, come on,
00:05:00.480 let's, let's think about what could be the new utopian milestone. There's this beautiful quote
00:05:05.820 from Oscar Wilde, who once wrote that, you know, a map without utopia on it is not worth even glancing
00:05:11.900 at because it leaves out the one island where humanity is always landing. Now, I guess I got what
00:05:17.140 I wished for. Uh, things are not boring anymore, but not really the direction I had hoped for,
00:05:22.860 I guess. So, um, I've always loved this statement from Max Roser from Our World in Data, uh, you
00:05:29.640 know, the fantastic website that collects all the data on, on the state of the world, basically.
00:05:34.360 And, um, I think it's just correct that on the one hand, yeah, the world is really bad. We could
00:05:39.520 do so much better. The world has become better. That's also true. We have made progress and, um,
00:05:44.920 yeah, it's all of that at the same time. I would say I, just like you, I'm really,
00:05:49.600 really terrified of what's going on in the United States right now. Things are also happening
00:05:54.340 quicker than I expected. And yeah, it's one of the big lessons of history, right?
00:06:00.380 There's nothing inevitable about the way we've structured our society right now. It can
00:06:05.000 radically change, and sometimes quite quickly, both for the better and for the worse.
00:06:10.120 Yeah. Well, we'll come back around to existential concerns, because I think
00:06:16.420 the "things are always getting better" analysis has left people dissatisfied. I'm thinking
00:06:23.100 in particular of the kinds of criticism and distortion Steven Pinker had to face when he
00:06:28.980 released his books on this topic. I mean, Steven certainly was not arguing that progress is
00:06:34.080 inevitable. He was just asking us to acknowledge how much progress we've obviously made, very much
00:06:38.800 based on the kinds of data you referenced. But many of us perceive more and more acutely
00:06:46.320 how much potential energy is stored up in the system and how destructive it could be on so
00:06:52.140 many fronts. I mean, you know, AI is the latest wrinkle here, but the idea that we could just
00:06:56.720 needlessly destroy the possibility of building something like a utopia. I mean,
00:07:03.180 it certainly seems within reach if we could just iron out our political problems and sideline a few
00:07:09.320 prominent sociopaths. But we do seem on the verge of screwing a lot of it up, you know, quite needlessly.
00:07:16.320 So we'll talk about that. I'll come back around to that.
00:07:18.780 Yeah. I guess if I can say one thing about that, Sam. So the shape of history is just
00:07:23.360 really, really weird. So in my new book, Moral Ambition, I have this one
00:07:28.000 graph where I asked this simple question, what was the most important thing that happened in
00:07:32.240 all of human history? And there are a couple of candidates, right? Maybe it was the birth of the
00:07:37.260 Buddha or Jesus or Muhammad. Maybe it was the rise and fall of the great empires, you know,
00:07:42.020 the Roman empire, the Aztec empire. Maybe it was the invention of the wheel. Maybe it was the
00:07:46.860 invention of the compass. I mean, there are so many candidates, but then you just look at some
00:07:50.960 simple graphs, growth of GDP, decline of extreme poverty, growth of carbon emissions. And all these
00:07:57.920 graphs have basically the same shape, right? You see the hockey stick that starts in 1750 and it's a
00:08:03.800 rocket that has been launched ever since. And it seems to be the case that we are, you know,
00:08:09.040 looking at a movie or actually we're participating in a movie and we are nearing the climax, you know,
00:08:13.580 when the music is swelling and we have no idea how this is going to end. It could be that the
00:08:19.220 rocket totally crashes and that the story will be over quite soon, or we will break out and
00:08:26.500 colonize the Milky Way. And maybe we will be able to build some kind of utopia. And then our descendants
00:08:34.040 will look back on us and say, gosh, these people were the ancients, right? So what is so weird about
00:08:42.260 being alive today is that we basically have a front row seat to the greatest show in all of human
00:08:48.380 history. And we don't know how it's going to end. Yeah. This is a point you make toward the end of
00:08:52.620 the book when you point out, you know, quite accurately that the chronocentrism of past generations,
00:09:00.180 the idea that every generation imagines that it's living at an especially significant time, has
00:09:05.500 almost always been delusional. And yet at this moment, it's very hard to persuade ourselves that
00:09:12.140 something isn't unique about this moment. I mean, again, AI is the development in recent years
00:09:18.860 that has sharpened that up especially, but even prior to that, the pace of change and the kind of
00:09:25.300 asymptotic nature of it. And again, with reference to the graphs you just cited, the
00:09:30.820 difference between getting things close to right and getting them catastrophically wrong in this
00:09:35.900 generation seems especially important. Yeah, absolutely. I guess I find hope in the knowledge
00:09:43.440 that we've been in really scary times in our history and also really immoral times in our history
00:09:49.360 when there was a countercultural revolt of elites against the prevailing immorality of their
00:09:55.340 time. So in the book, I write a lot about the British abolitionists of the late 18th century, who
00:10:00.580 revolted against the elites who were in power back then. So this was a time of huge alcoholism in
00:10:08.560 Parliament, you know, politicians slurring their speeches. One in five women in London was a prostitute.
00:10:14.800 You had the Prince of Wales who was an extraordinary prick, even by royal standards.
00:10:21.180 And then there was a movement of people like Thomas Clarkson and William Wilberforce who said,
00:10:25.180 we are going to make doing good fashionable once again. And abolitionism was just a part of that.
00:10:30.720 That was one of the main projects. I think we've seen something similar in the United States with the
00:10:35.260 move from the Gilded Age to the Progressive Era. You know, again, the Gilded Age, extraordinary
00:10:41.540 inequality. These robber barons who had made insane amounts of money with their monopolies
00:10:46.320 in railroads, for example, and they started spending the money in the most crazy ways. You know,
00:10:51.980 the Vanderbilts, for example, built these huge mansions on Fifth Avenue in New York. There was
00:10:57.620 this one mansion where they recreated Venice inside the mansion with the canals, etc. Really bizarre.
00:11:04.200 But then again, there was a countercultural movement against it of elites, actually. People like
00:11:08.960 Theodore Roosevelt, the progressive president, or people like Louis Brandeis, who became the
00:11:14.120 people's lawyer and ended up on the Supreme Court. One of my favorite persons from this era is a woman
00:11:20.200 called Elva Vanderbilt, who married into this Vanderbilt family and initially really wanted
00:11:26.240 to become part of the 400 in New York, like the richest 400 families in New York who spent the most
00:11:32.580 money on the most silly things. But then, yeah, she divorced. She had a lot of money and became
00:11:38.560 a pretty radical suffragette, an advocate for women's rights, and donated a huge amount of
00:11:43.980 money to the women's rights movement, almost a little bit like MacKenzie Scott is doing today,
00:11:49.380 the ex-wife of Jeff Bezos. So I guess that's what I'm calling for in this new book: that, again,
00:11:55.040 we need a countercultural movement, especially now that things are getting a bit dark and we see so
00:12:01.680 many examples of just blatant immorality. I mean, in the U.S., the whole Republican Party is basically
00:12:08.420 in a state of moral collapse. You know, I've got two young kids, and for me it's not really
00:12:15.500 left versus right anymore. When I think about how I want to raise my kids, it's pretty much the
00:12:19.380 opposite of how these people in power are behaving, so nasty and basically like bullies all the
00:12:24.380 time. But as I said, we've been here before and there have been cases in history when we overcame it.
00:12:29.560 Don't you know they're making America great again? What about that project don't you like,
00:12:33.760 Rutger?
00:12:36.980 Well, it depends on, yeah, what particular reference you have. I mean, as you know,
00:12:43.400 I'm an advocate of tax fairness. I think it's quite unfair that billionaires around the globe
00:12:50.460 have lower effective tax rates than working class people and middle class people. I think that this
00:12:56.500 can be fixed and that there are beautiful examples in history, actually,
00:13:01.340 in the 1950s and the '60s, when we had a much more reasonable system of taxation and actually also
00:13:06.160 higher growth rates. So, yeah, make America great again. Yeah, I see some inspiration there in the
00:13:15.200 past, definitely.
00:13:16.840 Well, let's talk about what is aspirational here. I mean, one of the points you make in the book is that
00:13:21.920 moral ambition is contagious, right? What you want is to find a mode of life that is not just
00:13:31.280 masochistic, right, and merely moralistic, but you want something that people aspire to because
00:13:38.640 it's just obviously good. I mean, it seems to me that the whole point of our being here ultimately is
00:13:45.580 to make life worth living. And once we've done that, to continue to refine it and safeguard it and
00:13:53.280 just make the possibilities of human happiness more and more beautiful and to spread the wealth
00:13:57.980 around, obviously, right? I mean, the thing that is so excruciating is the level of inequality in our
00:14:03.900 world. And, you know, whatever delusions you take on board with respect
00:14:11.080 to being self-made, any five-minute analysis of really any one situation
00:14:17.040 reveals that, at bottom, it really is all a matter of luck. I mean, people are just extraordinarily
00:14:23.080 lucky not to be born in some failed state where they have the opportunity only to, you know, get
00:14:31.060 killed at an early age or spectacularly injured or to die of some, you know, tropical disease that
00:14:37.000 we haven't suffered from in the developed world for quite some time. So, so much of your discussion
00:14:42.680 here is focused on being motivated by these disparities, to find them morally intolerable,
00:14:49.760 very much in the spirit in which someone like Peter Singer has argued. I mean, you acknowledge in
00:14:56.120 the book that you can't merely castigate people and demand that everyone sacrifice. There's something
00:15:03.600 aspirational about this. And I think we need to focus on that because, you know, even
00:15:08.020 some of your past pronouncements, I mean, the moments for which you became famous,
00:15:12.580 I think probably the biggest one was when you were at Davos castigating the billionaires for
00:15:17.840 having, you know, all flown there on private jets. I think you said that something like
00:15:21.380 1,500 private jets had flown into that meeting. And then they cry when they see David Attenborough's
00:15:27.260 film, right? Right. Yeah, exactly. It's quite a funny experience. I was in the auditorium.
00:15:31.420 It's on the menu. Yeah. But my concern there is that you can be read or heard as
00:15:37.580 merely demonizing wealth. You know, in the limit, in success, what we want is the wealth to
00:15:45.840 be spread around such that the poorest people on earth live the way the richest people do now,
00:15:51.900 you know, a hundred years from now. I mean, something like that, whatever is compatible
00:15:56.080 with physics, is something we want to aspire to. So I don't think we want to be saying at the end of
00:16:00.880 the day that wealth is the problem. Yeah, I couldn't agree more. And the left used to
00:16:06.240 understand that. So, social democracy: I see myself as an old-fashioned social democrat. So
00:16:11.220 I think in the sixties and the seventies, the left was the party of progress, right? It was the party
00:16:15.580 of growth, of innovation, of building. Today you have ideologies like degrowth, for example,
00:16:21.520 that to me seem to demonize wealth or luxury or whatever. And I'm like, no, like we're way too
00:16:29.220 poor. We should become much richer. And then indeed, as you say, spread it around. The very
00:16:34.600 first essay I ever wrote was when I was 16 years old, I had this epiphany as the son of a preacher.
00:16:42.460 You know, I grew up in, in the church and you know, this is an age when you start thinking about
00:16:47.420 what do I actually believe? Do I agree with all the dogmas that are served to me? And I wrote this
00:16:52.700 essay about free will, and came to the conclusion that it doesn't make sense at all. Like, surely
00:16:57.940 it can't exist. And I guess that argument will resonate with you. And I guess ever
00:17:04.340 since that young age, it's always been something that has driven me. Whenever we talk
00:17:09.240 about inequality, I think it's especially important to zoom out, right? If you live in a rich country
00:17:15.560 like I do in the Netherlands (though I'm currently living in New York), you're already part of the
00:17:20.000 richest 3.5% in the world. So when we talk about inequality, we mainly have to talk about
00:17:26.580 global inequality and the world needs so much more growth in that respect. Right. And, um,
00:17:34.340 I'm, I'm pretty optimistic that we, that, that we can make that happen and that we have already
00:17:39.260 made quite a bit of progress in the last couple of decades. But yeah, I, I can't agree more that this
00:17:43.460 idea that, I don't know, it's so anti-human in a way that this is quite dominant, maybe also in
00:17:50.460 environmental circles, the idea that humans are a plague or something like that, that we are a virus,
00:17:55.360 that we are just bad. And that is just something I've always deeply, deeply disagreed with.
00:18:00.760 Well, so let's get into the details, because I suspect my tolerance for inequality is more
00:18:07.900 capacious than yours, at least by tendency. I mean, it's not clear to me that we want, if we
00:18:15.000 could spread the wealth around completely and immediately, that would be the right
00:18:20.360 solution. I mean, this is one of the arguments, really the only argument, for open
00:18:24.200 borders: the idea that national borders and the inequalities they enshrine are
00:18:29.360 totally unjustifiable ethically. And so people should be free to move everywhere. And when I look at
00:18:36.180 the consequences of that, what I imagine would happen is that, okay, people would move more or
00:18:41.320 less everywhere until there was no reason to move anywhere because everywhere was just as mediocre
00:18:46.500 as everywhere else. And again, I come back to this notion of aspiration. I do think we
00:18:52.400 want societies that are wealthy enough so as to sustain scientific advancement and, you know, artistic
00:19:00.040 expression at the highest level and, you know, everything we celebrate as technological
00:19:06.900 and cultural success in the developed world when we're not distracting ourselves by pointless
00:19:12.560 conflict. So the question is, if we agree that we want to maintain that, you know, if we want
00:19:18.040 New York City to be a beautiful, high-functioning city, right? And yet Peter Singer's analysis wouldn't
00:19:25.940 allow us to prioritize anything in New York today because life in sub-Saharan Africa is so bad. All
00:19:32.600 of those resources should obviously go there. How do you square that? I mean, if you could
00:19:36.780 just start allocating funds where they should go, would you follow a Peter Singer or would you have
00:19:44.140 a different calculus?
00:19:45.940 Uh, quite different. So on the one hand, I deeply admire the man. He's one of the great philosophers
00:19:50.520 of our time. And there's also a lot to like about the movement that he co-founded, you know, these
00:19:56.860 effective altruists. They've gotten a lot of bad, bad press recently, especially since the
00:20:02.800 SBF fiasco. But on the other hand, there's a lot to admire about them. I guess as someone who comes
00:20:10.400 from the political left, what I like most about them is their moral seriousness, the willingness to
00:20:14.820 actually practice what they preach. So if I go to, I don't know, a conference of a bunch of leftists,
00:20:21.600 I don't see a lot of people giving a lot of money away. I see a lot of people talking about the need
00:20:28.120 for systemic change and overthrowing the patriarchy and, you know, destroying capitalism or whatever.
00:20:34.740 But very often they don't take a lot of individual responsibility. But if you go to an effective
00:20:39.960 altruist conference, you will meet a lot of people who have donated kidneys to random strangers.
00:20:45.340 Now, I got to admit, I still have both of my kidneys, sorry to say, but I admire the people
00:20:52.140 who do that and who give a really substantial part of their income to highly effective charities.
00:20:58.300 I think just like you, I became a member of Giving What We Can. And that has been a pretty
00:21:02.220 transformational experience for me personally. It really changed my outlook on life when I started
00:21:08.820 donating a much more substantial part of my income and the money that I had made with my books.
00:21:14.340 So that's what I really admire. What I don't really like is, I guess, the focus on guilt. I think EA got
00:21:21.720 started in the 2010s when a lot of people who I like to describe as born altruists, people who were
00:21:28.060 basically always that way already when they were young, they turned vegan and gave away, you know,
00:21:32.280 the money they got from their parents to charity. They basically discovered each other in that era when
00:21:38.280 social media got started. And that's how the movement got going. And I think that's beautiful,
00:21:43.920 but it's not for most people. So I couldn't take most of my friends to an EA conference because it's
00:21:48.780 just too weird, right? It's a lot of people who are somewhere on the spectrum or at least
00:21:54.800 neurodiverse, which is great, right? EA should just continue being EA.
00:21:59.660 But I think there's a lot of room for a different kind of movement that taps into
00:22:04.700 different sources of motivation. I'm personally a pluralist. I care about many things in life.
00:22:10.620 I'm motivated by, well, altruism and empathy, definitely, but also motivated by other things,
00:22:17.140 maybe enthusiasm, maybe even a bit of vanity. And I think that's fine to be motivated by multiple
00:22:24.060 things. What we're trying to do with our organization, the School for Moral Ambition,
00:22:28.740 and also what I'm calling for in the book, is to once again make doing good high status, to basically
00:22:34.480 say: if you are one of the most talented, ambitious people in the world, then you shouldn't
00:22:39.800 work for McKinsey. You shouldn't work for Goldman Sachs. You should be working on the most pressing
00:22:43.880 issues we face as a species. And we are trying to ground this movement, not in guilt, right?
00:22:49.040 We don't want to see drowning children everywhere. You know, the famous thought experiment from Peter
00:22:54.220 Singer, where he said, yeah, the shallow pond. Well, I guess most of your listeners will know
00:23:00.540 about that. So I won't repeat the story. But yeah, I've never really liked that. It always came across
00:23:05.600 as moral blackmail to me. Like now suddenly I'm supposed to see drowning kids everywhere when I
00:23:09.800 take a sip of my coffee, right? Which I probably shouldn't have bought because it was too expensive.
00:23:14.320 Yeah, I've never really liked that. I would prefer to be part of a movement that is grounded
00:23:18.720 in enthusiasm and excitement about, yeah, just the simple fact that we can make this world a wildly
00:23:25.760 better place and that it's just really cool to be part of a small group of very dedicated idealists
00:23:30.860 who want to take on some of these challenges. All right. So let's take the extreme case here.
00:23:35.160 Let's take somebody like Bill Gates, who obviously lives extraordinarily well. He flies around in a
00:23:42.080 private plane, which he almost certainly owns. He probably has more than one. And
00:23:47.900 he spends a fantastic amount of money on himself. He has homes all over the place. Again, I can only
00:23:53.460 presume. I don't actually know Bill. But assuming he lives like most billionaires, he spends a lot of
00:23:58.800 money, you know, more than thousands of people in the developing world, maybe more than tens of
00:24:04.660 thousands of people in the developing world on himself. The question is, how much should we begrudge
00:24:10.560 him or anyone living that way for having amassed those kinds of resources? I mean,
00:24:18.020 in the case of the prototypically selfish billionaire, I think we can get to
00:24:24.240 begrudging pretty quickly. But in Bill's case, he's been really probably the most philanthropic person,
00:24:31.060 not merely of his generation but of any generation. You know, his personal quirks aside,
00:24:36.120 again, I don't know him, I just know what I read. He's done a tremendous amount of good in the world,
00:24:41.040 and when I think about what is optimal for Bill, it's hard for me to see how the sight of
00:24:49.180 him struggling to figure out how to check his luggage at, you know, the Southwest counter of
00:24:53.200 an airport is optimal. So do you think he should be flying commercially?
00:24:59.540 Or do you think that if he saves time flying private, where he's free to think about the
00:25:06.800 next thing he wants, the next disease he wants to cure; if he found flying commercially as onerous as
00:25:13.180 many people do; if he would be reluctant to travel to that conference where he might meet
00:25:19.780 the person whose project he would fund, et cetera, et cetera. You see the knock-on effects here. I mean,
00:25:24.300 my intuition is we want Bill being Bill as freely and as happily as possible in a way that's
00:25:30.800 commensurate with him being as inspired as possible to help the world in all the ways he's been helping
00:25:36.060 it of late. So there's a lot to say about this. A lot of people indeed will know me for saying some
00:25:41.140 nasty things about billionaires when I went to Davos and also being quite critical of billionaire
00:25:45.500 philanthropy. And I think there's a good reason for that. A lot of philanthropy is just really
00:25:49.880 unimpressive. You know, it's boring people giving a lot of money to have their name on an already
00:25:54.620 well-funded museum or a university or whatever, you know, let's give Harvard more money. And I've
00:26:00.440 always found that pretty sad. At the same time, as a historian, I know that there are beautiful
00:26:06.080 exceptions. If you'd like to continue listening to this conversation, you'll need to subscribe at
00:26:13.820 samharris.org. Once you do, you'll get access to all full length episodes of the Making Sense
00:26:19.100 podcast. The podcast is available to everyone through our scholarship program. So if you can't
00:26:24.300 afford a subscription, please request a free account on the website. The Making Sense podcast
00:26:29.340 is ad-free and relies entirely on listener support. And you can subscribe now at samharris.org.