#349 — Generosity, Cynicism, and the Future of Doing Good
Episode Stats
Length
1 hour and 2 minutes
Summary
Chris Anderson has been the curator of the famous TED conference since 2001. His new book is Infectious Generosity: The Ultimate Idea Worth Spreading. In this conversation, we talk about the new spirit of cynicism that seems to surround any notion of doing good in the world in tech and finance at the moment, the problems with diversity, equity, and inclusion (DEI), and the controversy that enveloped Coleman Hughes when he spoke at TED last year. We then get into the topic of Chris's book proper: the science of generosity, the leverage offered by the internet, the false opposition between selfishness and selflessness, mixed motives in giving, results versus reward, digital business models (including Sam's own for this podcast), the economics of TED and TEDx, wealth inequality, the ethics of billionaires, philanthropy at scale, the power of pledges, the arguments of Peter Singer, the Sam Bankman-Fried scandal and the problems with effective altruism, how to improve our digital lives, and other topics. Chris has been a journalist, a writer, and an impresario; he also describes what it was like to run a billion-dollar company, lose nearly all of it in the dot-com crash of 2000-2001, and arrive at where he is today.
Transcript
00:00:00.000
Welcome to the Making Sense Podcast. This is Sam Harris. Just a note to say that if
00:00:11.640
you're hearing this, you're not currently on our subscriber feed, and will only be
00:00:15.580
hearing the first part of this conversation. In order to access full episodes of the Making
00:00:19.840
Sense Podcast, you'll need to subscribe at samharris.org. There you'll also find our
00:00:24.960
scholarship program, where we offer free accounts to anyone who can't afford one.
00:00:28.340
We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:32.860
of our subscribers. So if you enjoy what we're doing here, please consider becoming one.
00:00:45.140
Today I'm speaking with Chris Anderson. Chris has been the curator of the famous TED conference
00:00:51.600
since 2001. The tagline of TED is Ideas Worth Spreading, of which there have been many
00:01:00.000
first shared at TED. The title of his new book is Infectious Generosity, the Ultimate Idea Worth
00:01:06.800
Spreading. And that is in part the topic of today's conversation. We talk about the new
00:01:11.900
spirit of cynicism that seems to surround any notion of doing good in the world, in tech and
00:01:17.240
finance at the moment. The problems with diversity, equity, and inclusion, DEI. The
00:01:23.440
controversy that enveloped Coleman Hughes when he spoke at TED last year. And then we get into
00:01:29.640
the topic of Chris's book proper. We talk about the science of generosity, the leverage offered
00:01:34.900
by the internet, the false opposition between selfishness and selflessness, mixed motives in
00:01:41.340
giving, results versus reward, trying to see the good in people, digital business models, including
00:01:48.800
my own business model for this podcast, the economics of TED, TEDx, wealth inequality, the ethics of
00:01:56.900
billionaires, philanthropy at scale, the power of pledges, the arguments of Peter Singer, the Sam
00:02:05.400
Bankman-Fried scandal, and the problems with effective altruism, how to improve our digital lives,
00:02:11.920
and other topics. Is there a more important topic than trying to figure out how to inspire
00:02:18.080
the luckiest among us to do more good in the world? Given the implications, I'm not sure.
00:02:34.760
I am here with Chris Anderson. Chris, thanks for joining me.
00:02:39.260
So you have a new book, Infectious Generosity, The Ultimate Idea Worth Spreading, which I want to
00:02:45.580
talk about. It seems like now is the moment for such a book. It's really great that you've
00:02:50.740
written this. But most people will know you as the owner and curator and impresario of the TED
00:02:57.140
conference. What did you do professionally before TED?
00:03:01.140
I was a journalist out of university. I think I originally wanted to be a physics professor,
00:03:09.120
but I found physics very hard at Oxford and switched course partway through to politics and
00:03:14.740
philosophy and ended up as a journalist. And then fell in love with computers in the early 1980s and
00:03:21.940
started a magazine publishing company for, you know, hobbyist, nerdy computer mags, which turned out to be
00:03:28.500
very fortunate timing. And, you know, the company did well, came to America eventually, continued to
00:03:33.580
grow it, and then had a horrible tangle with the dot-com crash of 2000, 2001. Ended up leaving the
00:03:42.280
company, which had gone public by this stage, and did the sideways move into TED, which was
00:03:47.360
a conference I'd first gone to in 1998 and kind of fell in love with, and had a chance, surprisingly,
00:03:54.340
to buy it from the owner. I didn't have any money because of the dot-com bust, but I did have a
00:04:00.680
foundation. So it was the foundation that bought TED, and so TED found itself inside a nonprofit. And
00:04:06.140
that's been most of my time since then, really since 2001.
00:04:11.360
When you say you had a horrible tangle with the dot-com bust, were you one of these people who
00:04:17.560
had a billion dollars on paper and then watched it get halved every 15 minutes until it vanished
00:04:23.900
over the course of a day? I mean, was it that kind of experience or what exactly happened?
00:04:28.640
Pretty much. I'm not sure I quite did. Maybe for five minutes after one company went public, we
00:04:33.920
hit the billion mark, but it very quickly fell away. And yeah, for 18 months, I basically lost on
00:04:39.540
average a million dollars a day. And, I mean, it really eats at your
00:04:45.880
sort of sense of self-worth, because I'd told this fantasy to myself that
00:04:51.100
I was this, you know, successful entrepreneur; everything had always gone up and to the right. And,
00:04:57.280
you know, for years the company kind of doubled in size every year, and it just felt,
00:05:00.880
you know, easy and great. And then this happened, and it was a real lesson, really: don't tie up
00:05:07.860
too much of your happiness and sense of self-worth into your business or what you do, or, you
00:05:14.300
know, it's a recipe for disappointment. So I went through a very difficult 18 months
00:05:19.040
and kind of survived it, I think, by reading and remembering how amazing
00:05:26.640
the world of ideas is and how much cool science was happening. I hadn't really discovered evolutionary
00:05:32.940
biology. There were so many things that I got into for the first time. And that
00:05:37.480
made the prospect of a move into TED, and sort of living in that world, incredibly appealing.
00:05:44.460
I mean, at the time it was an annual conference, nothing else, but hey, all these
00:05:48.200
interesting people. And it felt like a real respite from the ugliness of, um, you know,
00:05:54.420
the dot-com crash, where, you know, I had 2,000 people at one point, and we had to let go half of them.
00:05:58.040
And it was just so painful. It was a horrible time.
00:06:02.280
Well, I want to talk about wealth and, um, philanthropy and all these
00:06:07.380
these intersecting issues. I mean, TED has given you a front-row seat to see the social phenomena
00:06:13.040
I want to discuss, many of which relate to the topic of your book, which is,
00:06:18.360
briefly, you've written a book about how we can do more good in the world through leveraging new norms
00:06:25.320
around generosity, and especially as it becomes possible in the age of the internet. I mean,
00:06:30.600
it really changes the possibility of being generous in interesting ways. But before
00:06:35.160
we jump into that, I want to acknowledge that there's this new spirit of cynicism in the air,
00:06:41.520
you know, wherein it seems that almost any conspicuous effort to do good in the world
00:06:47.120
now seems suspect. And this relates to various degrees to things like ESG, you know, environmental,
00:06:54.260
social, and governance, investing, effective altruism, DEI, you know, diversity, equity,
00:07:00.100
and inclusion efforts. And there's this sense, especially among influential people in tech
00:07:06.280
and in finance, that all of this stuff amounts to little more than a sanctimonious scam.
00:07:12.700
It's all just virtue signaling. It's just elites marketing to elites. And, you know, I've been
00:07:18.220
critical of many of these things, and I'm actually especially critical of DEI at the moment,
00:07:23.480
but I'm also worried that what we have here is a situation in which some of the luckiest and
00:07:29.240
smartest people in the world appear to have drawn the wrong lesson from some specific recent
00:07:35.860
embarrassments. And they appear now to think that altruism and generosity and compassion
00:07:41.440
are basically bogus. And we're all just condemned to live in a world where we play this game for
00:07:47.540
ourselves. And, you know, perhaps our family and a few friends make it into the lifeboat with us.
00:07:53.320
But otherwise, we should just be narrowly selfish without apology. And in fact, there's an attraction
00:07:59.460
to this. I mean, really, the apotheosis of this is someone like Trump, where, you know,
00:08:03.760
where you have half the country idolizing a man who makes no pretense of being other than
00:08:11.140
malignantly selfish, right? And if you're just nakedly selfish and merely selfish,
00:08:19.060
one of the superpowers you acquire is that it's impossible to be a hypocrite. And it's the moments
00:08:25.020
of hypocrisy that many people have found so despicable in our efforts to do good. So you have
00:08:29.660
someone like Sam Bankman-Fried, or you have the DEI bureaucrats who are apparently unable to condemn
00:08:36.440
calls for genocide against the Jews. It's just very easy to see what is wrong here. I mean,
00:08:42.120
I'm thinking of one with respect to ESG now: you had the spectacle of, you know,
00:08:49.140
lots of right thinking people flying on their private jets to a climate conference, right? And so
00:08:56.200
people see this, and now they've begun to default to a new norm of cynicism and selfishness.
00:09:03.480
And so I want to talk about the problems, such as they are, with ESG and effective altruism
00:09:08.840
and DEI and these other issues. But I'm wondering if you share my concern that the pendulum is in
00:09:14.520
the process of swinging back into something like an Ayn Randian and fairly psychopathic ethic of selfishness.
00:09:24.160
Yeah, I think you're right to be concerned. I'm certainly worried about it. I think different
00:09:29.820
people would probably tell the story in different ways. But there's no question that the techno
00:09:37.000
optimism of, say, the early 2000s, when it really seemed like you could frame the internet, for example,
00:09:43.600
as bringing the world together. Lots of people had the narrative that, wow, this is wonderful. We can
00:09:48.860
see people on the other side of the world. Maybe this exciting new technology can spread things like
00:09:55.760
freedom and democracy and some of the ideas that we care about. And the narrative of the last 10
00:10:01.580
years has been the opposite of that. And I've felt it through my ringside seat
00:10:07.340
at TED, if you like, of this growing sort of sense of crushing disappointment that the internet was not
00:10:13.260
doing what we thought it would do. It was actually helping engender the opposite. And yeah, somewhere
00:10:19.640
around the rise of social media, the election of Donald Trump, the world became very, very, very
00:10:26.800
divided. And it was almost as if the only thing you could do was pick a tribe and then fight and be really
00:10:35.120
annoyed at the other side. And those were the most sort of fervent conversations. I mean,
00:10:46.660
I think, and I hope, that a lot of people have got really sick of how mean the world has become.
00:10:54.460
And actually, you're right that there is a lot of cynicism and exhaustion. The way I would frame it,
00:11:02.120
Sam, I think, is almost the language that each side has sort of fallen into. So the language of DEI
00:11:08.960
has become its own sort of sing-song thing that is almost invisible to the people who are in it.
00:11:15.400
And it just sounds so exhausting and mind-numbing and annoying to people who aren't. And it makes it
00:11:22.480
impossible to actually have a conversation about the real things that are underlying it. You know,
00:11:27.720
the reason why DEI happened originally was that there were injustices in the
00:11:33.680
world and that there were groups of people who hadn't been given a fair shot and there was a need and
00:11:37.700
a desire from good-thinking people to try to do something about that. But it, you know,
00:11:44.160
warped into tribalistic language. That's the shocker. So that you couldn't even have a discussion about the
00:11:50.340
actual merits of the case. It was just, oh, you use that word. I know who you are. I can't talk to you.
00:11:55.020
You make me sick. And, um, I think, and I hope, that there are a lot of
00:12:00.640
people who have got really weary of that world, and that there's a desire, certainly
00:12:07.200
in the generation coming through, but I think in our generation as well, a sense that
00:12:12.980
this can't go on. And, you know, part of my sense of urgency about this is, if we can't
00:12:20.260
figure out how to start listening to each other, talking to each other, working together,
00:12:25.280
collaborating, we're not going to be able to solve any of the giant problems that the future
00:12:32.080
is sort of throwing up. And, you know, whether it's climate, whether
00:12:36.880
it's wars around the world, whether it's artificial intelligence and the challenges
00:12:41.480
that that poses, we're not going to be able to solve any of this stuff. And so I just
00:12:47.120
see that there is no way forward other than at least to try to tackle this. And, um,
00:12:55.200
I guess the framing that I've come up with, and that I'm excited by,
00:12:59.700
is that just as nasty things have gone viral online in the last decade, especially fear,
00:13:07.800
resentment, disgust, anger, it is also possible for really good things to go viral.
00:13:13.800
There is no reason why they shouldn't. And with a bit of a nudge and a bit of a rethink and,
00:13:18.060
an effort by people of goodwill, I think there just might be a chance to turn the
00:13:24.660
tide. At least that's the ambition. And if we don't do this, I don't know what else we do,
00:13:29.120
because the future is just going to be so ugly. Well, I had thought to save a discussion of DEI
00:13:35.320
to the end, but since we've touched on it on the fly, maybe we'll just take a brief
00:13:41.080
detour and talk about it, because it is a bit of a sidebar discussion
00:13:45.060
with respect to the other topics that relate directly to generosity and philanthropy
00:13:50.840
and wealth, et cetera. You know, I've always viewed you before I came to know you at all
00:13:57.720
as a kind of prisoner of sorts of TED with respect to DEI. I mean, perhaps you had a bit of
00:14:03.740
Stockholm syndrome too. I don't know. Actually, the first moment I
00:14:09.400
noticed this was after I gave my first TED talk, where I took a pretty hard line against
00:14:13.700
traditional Islam, in particular, the compulsory veiling of women. And you came up on stage
00:14:19.260
afterwards and tried to perform a bit of an intervention. And I mean, people can watch that
00:14:24.460
on YouTube and draw their own conclusions. And you and I have since talked a fair amount about Islam
00:14:29.280
and my approach to criticizing it. And as well as my approach to criticizing religion in general.
00:14:33.820
And, you know, we've hashed that out on your podcast and, and in private. So we don't have
00:14:38.820
to rehash that here. But, you know, feel free to say anything you want and edify me in front
00:14:44.000
of my audience here, but it's been useful to talk to you about all that. But what I want to talk about
00:14:49.080
is a recent event that had somewhat the same character but was notable in that it revealed
00:14:56.720
how much had changed in the intervening years. So when I, when we first met, when I gave that first
00:15:01.920
TED talk, I think that was 2010. And so now, a decade-plus hence, you have an event which
00:15:08.440
produced much more controversy. And yet the target of the blowback was, to my eye, far more anodyne
00:15:16.960
than I was when giving that first TED talk. In fact, it made me think that there's no way I could have given
00:15:22.260
that TED talk now, or at least last year. So, needless to say, I'm talking about the case of
00:15:28.800
Coleman Hughes. What happened there? What was your perception of what happened there? I wasn't
00:15:33.660
actually at that conference. You know, I wasn't there to see the
00:15:39.780
immediate dominoes fall, but I just heard about it after the fact. What was your experience of that?
00:15:45.940
So just two words on your own TED talk. I actually really liked your TED talk. When I
00:15:50.460
came up on stage there, it was nothing to do, I think, with DEI. This was a
00:15:56.220
person who had lived for years as a kid growing up in Afghanistan and Pakistan with a lot of Muslims.
00:16:02.080
And I do dislike huge aspects of, actually, lots of religions. And I think living for,
00:16:09.480
you know, paradise rather than the current world is incredibly, incredibly dangerous. I think the
00:16:15.700
whole inshallah thing, you know, can be a very disempowering way to live, where people
00:16:22.000
give up trying because they just, well, you know, God will do whatever he will do. There's lots that
00:16:26.320
I don't like. But what I do like is that I just
00:16:31.840
got to know many genuinely spiritual Muslims, and many Muslim women who actually liked aspects of the
00:16:40.740
conservatism of Muslim culture and preferred it to, you know, the excesses of
00:16:47.440
the West, if you like, the excessive public sexuality of the West. So it was a
00:16:51.880
kind of ex-missionary kid, if you like, coming and speaking and challenging you. But
00:16:57.540
the core of what you said in that talk was incredible, Sam. I mean, the notion
00:17:01.300
of building morality from the ground up, based on science, based on what we know about
00:17:08.160
human nature. I mean, I actually believe that, because that's what opens
00:17:13.440
the door to moral progress, which I think is a real thing. I think cultural relativism
00:17:17.960
is a disastrous philosophy. So we were probably more in line than you thought,
00:17:24.920
but I was probably aware of, you know, some Muslims in the audience and just wanted to look
00:17:30.080
out for them. Okay. So Coleman Hughes. The narrative online differs, you know, depending
00:17:36.200
on whether you look on the left or on the right. One narrative is that we, you know,
00:17:41.140
invited a controversial man to TED to give a talk in which he made the case for colorblindness.
00:17:46.240
On the left, this is seen as super controversial, and in fact deeply upsetting to some
00:17:52.960
people because on the left, the view is that colorblind policies over decades have been proven
00:18:00.060
not to work. You know, we're still here with a lot of racial injustice and that you cannot make
00:18:04.860
enough progress without proactively making decisions based on race. In some cases,
00:18:10.080
there's a very good TED talk, I think by Mellody Hobson, arguing exactly for this: be color brave,
00:18:14.860
not colorblind. And, um, she, you know, actually persuaded me to be more conscious in
00:18:20.900
how we hire. It's not about tokenistic hiring. It's about just making extra effort to find great
00:18:27.700
people, and that with a more diverse employee base you actually get a lot more done, and it's just
00:18:33.140
healthier all around. So I get that argument from the left, and I also get why, you
00:18:40.020
know, what happened when Coleman gave his talk is that, um, some people heard it as: you no longer
00:18:48.240
care about my identity. And so some people in the community were really upset by it. It's true
00:18:53.500
to say that we had some people internally who were troubled at the notion of pushing it out online
00:19:00.120
and, uh, wondered whether it really was an idea worth spreading. And they felt
00:19:04.880
that some of the issues sort of really needed further debate given how sensitive it was. So
00:19:09.000
Coleman kindly agreed to do a debate, and I thought he did very well in it, and that was posted
00:19:14.720
and that was fine. And then we posted his talk and, um, all well and good so far. The talk was not
00:19:22.060
included in one of our super popular podcasts for a while. I think the team were uncomfortable
00:19:27.400
with it for whatever reason. I actually wasn't really aware of this, but it meant that
00:19:33.080
the view count on Coleman's talk was low compared with other talks. And, um, someone pointed this
00:19:39.160
out to him. And so he got upset and posted this piece saying, why is TED afraid of colorblindness?
00:19:44.220
Okay. And so the main pushback we got then was from people right of center who were
00:19:49.480
outraged that this brilliant person, and Coleman is a brilliant person, was being suppressed
00:19:56.080
by the woke, you know, TED team. And, um, I was told, well, you know, if you believe
00:20:03.460
in Coleman and you want to champion him, fire your woke staff, and so forth. And so there
00:20:09.460
was, you know, as is common in the current moment, you had incredibly different narratives on
00:20:14.500
left and on right. I think on the right, the narrative is, you know, this proves that TED
00:20:20.420
is woke and, you know, is suppressing very smart centrist ideas, articulately
00:20:28.660
espoused by a very good person. But an alternative framing of it is that TED fought to bring
00:20:35.580
to the stage someone who actually three or four years ago, five years ago, in the midst of the
00:20:39.940
political division that was there, we would never have invited to the stage. We brought him on. A lot
00:20:45.160
of people in the community noticed and were excited that we were opening the tent wider than
00:20:49.640
has been typical in recent years. And, um, last time I checked, you know, that talk now has
00:20:55.140
800,000 views. It has been on the podcast. So my own framing of it,
00:21:02.040
Sam, is this: I think it's true that at the height of the
00:21:08.200
sort of political division, when we did stuff that was political, most of it came from the progressive
00:21:12.580
side of the equation. And, you know, we had talks with some of that language that I refer to that
00:21:17.340
ends up being very irritating to people on the other side. We are fighting to address that first
00:21:23.360
of all, just on the language. It is pointless having a talk about social justice if the people
00:21:27.180
you want to persuade aren't going to hear it because of the language. And so we have to get
00:21:32.920
the language right. But more generally, TED is nonpartisan. And I feel
00:21:39.280
strongly that there are people in the center and so forth who we need to listen to more and bring
00:21:44.980
to TED. And I think when people see the lineup for TED 2024, they're going to realize that we
00:21:50.480
are absolutely committed to a broader, open tent. We're unafraid of controversy.
00:21:56.260
And, um, as well as ideas worth spreading, there are ideas worth debating. You often don't know whether
00:22:01.880
ideas are worth spreading in the current environment. You have to really hammer it out. So that's
00:22:07.400
where I'm at right now. I've been a fan of Coleman for a long time.
00:22:11.520
You know, I get why some people were upset by this talk. I think he's
00:22:17.540
a brilliant thinker on issues outside race. When I've heard him, I've always been wowed and I'm
00:22:24.340
very glad he came to TED. And, um, I think ultimately the story is that TED, after a lot of
00:22:31.540
internal debate, offered its platform to a voice who never would have been offered it a few
00:22:37.180
years ago. Well, that's interesting. So yeah, I mean, my reaction to that, I don't
00:22:44.000
think, will surprise you. It may sound like a mere confession of my own bias here, but
00:22:49.420
when I look at, you know, Coleman, who he is and what he's said over the years and written, and, um,
00:22:55.980
as you say, he's quite brilliant, he's quite inspiring. He was this prodigy that
00:23:01.240
many of us were very happy to discover. I don't know how many years ago it was, but, uh, as an
00:23:06.600
undergraduate at Columbia, he was writing really useful and, um, beautiful essays that
00:23:13.480
were like arrows shot directly to the heart of our, uh, various social problems at that time. And,
00:23:19.320
you know, quite uncharacteristically of someone his age, he did not overwrite at all. I mean,
00:23:23.780
it was really quite beautiful. And yet, of course, for me to spell out his eloquence, uh,
00:23:30.360
at length is considered a microaggression given the color of his skin among many people over there
00:23:35.700
at TED. And so to hear your description of this was to hear the description of largely a pathology.
00:23:42.680
I mean, for you to say that Coleman wouldn't have been invited a few short years ago because of how
00:23:47.600
controversial it would have been then. And his invitation this time around didn't
00:23:52.860
escape controversy, as you just described. And when you look at what his talk was, it was,
00:23:58.540
as I said, truly anodyne. It was just this reboot of Martin Luther King Jr.'s dream that we
00:24:06.340
get past the superficial characteristics that seem to divide us based on this insane obsession
00:24:12.260
with race and get to caring about people on the basis of their actual contributions to
00:24:19.120
society and the content of their character, et cetera, et cetera. That was never controversial
00:24:24.440
until it suddenly became controversial. And if the only issue was that we have learned in the
00:24:30.160
intervening years that it simply doesn't work, you know, colorblindness, while it might sound
00:24:35.560
ethically wonderful, it's quixotic and it's ineffective and we need to make more muscular
00:24:42.400
efforts to promote people, uh, even past their point of competence, right? Even, you know, even just
00:24:48.360
the kind of affirmative action that many people deride now, even, let's say we want to make an
00:24:53.200
argument for that. The thing that was so bizarre about the response Coleman received, and again,
00:24:59.560
I don't know from how many people at TED, was that his utterly straightforward and unemotional and, again,
00:25:09.300
traditional civil rights talk was received as an attack, as an insult, as just pure opprobrium
00:25:19.160
heaped on a vulnerable constituency, right? And these people perceive themselves to be under threat
00:25:26.380
in the face of the least threatening guy and the least threatening message, you know,
00:25:31.640
Sam, I get that response from you, and I get why a lot of people
00:25:39.360
see it that way. It does seem anodyne, it does seem obvious. You know, at the conference,
00:25:46.300
there was a standing ovation for the talk from some people. Some people were really
00:25:51.320
delighted. You know, TED is a lot of different things, and some people were super delighted
00:25:55.720
at the talk. I think the piece that you, and some of the people who think,
00:26:02.720
oh, this is just obvious, may have missed is just how deep and complex this debate
00:26:09.060
has become. There are people who have spent their whole lives, you know, fighting to address
00:26:14.980
issues of racial injustice, and they have come to believe that straightforward colorblindness
00:26:22.340
has not worked. It has not worked. And I don't think that the alternative to that
00:26:27.000
is just sort of tokenism, you know; it's not necessarily affirmative action. It may well be that
00:26:33.300
there are many reasons why people of color may not be in a position to, let's say,
00:26:40.220
apply for a job in the same numbers and in the same way as others. A color-brave policy might well
00:26:48.540
be just to look that much harder to find the brilliant people. They may well be out
00:26:52.760
there. And Sam, with respect, you yourself aren't always colorblind. I will bet
00:26:57.060
anything that there are times when you look at your podcast lineup and say, I can't just have a lineup
00:27:03.180
of white guys. You know, you feel that. I bet you feel that.
00:27:07.280
No, actually, no. Let me
00:27:13.720
just be completely transparent with you. I mean, it's been pointed out to me that, you know,
00:27:18.960
I don't have enough people of color, I don't have enough women, et cetera, et cetera. And
00:27:23.400
so I've heard that criticism, but the truth is I simply reach out for the next topic of interest
00:27:31.480
or the next person who has caught my attention. And obviously I can't say that I'm colorblind
00:27:37.860
because, I mean, take the case of Coleman. What I hope for in our world and in my own life
00:27:43.880
and, you know, for Coleman, is to get to a place where no one cares that he's black.
00:27:50.820
He doesn't have to care about it. I don't care about it. Of course, given our current situation,
00:27:56.740
given the topic of this conversation, it is highly relevant that he's black for a whole,
00:28:01.600
a host of reasons. First of all, had a white guy given that same talk, had I had the temerity to
00:28:06.400
come back and give a talk on colorblindness, you would have received much more controversy and I
00:28:12.320
would have received much less defense than Coleman did, purely based on the color of our skins.
00:28:18.180
And that makes no sense because the argument stands or falls on its merits. But I just think
00:28:24.280
it's unfortunate and it will soon become a travesty if a man of Coleman's intelligence
00:28:31.780
has to spend all of his time, most of his time, even much of his time talking about race simply
00:28:39.640
because he's black. He has so much more to contribute on so many other topics. It's a massive
00:28:45.180
opportunity cost. And when I'm with Coleman, I very quickly forget that he's black because
00:28:52.240
it doesn't matter to me. It only matters to me when suddenly we're talking about this issue
00:28:57.180
and I have to turn to him and say, listen, you have a superpower based on how you look. You should
00:29:02.800
be doing something that I'm not doing over here. And that's a totally rational conversation to have
00:29:09.040
with him. But I view it as an opportunity cost and I view it as something that we have to outgrow.
00:29:14.680
I mean, the heuristic here, and I'd be interested to know if you actually
00:29:20.960
don't share this vision, I hope that someday skin color is like hair color or eye color, right?
00:29:28.980
Where we get to a place where no one would even think to ask how many blondes got into Harvard last
00:29:36.120
year, right? Or how many green-eyed people are cardiologists, and whether that perfectly matches
00:29:42.100
their representation in the rest of society? Who cares, right? We have to get to the who cares
00:29:47.860
moment. That's what success will look like. And the people who reacted so badly to Coleman's talk
00:29:53.220
are people who not only don't think we can get there fast enough by pretending to be colorblind,
00:30:00.500
they don't even want to get there, apparently. They want to enshrine these differences among people
00:30:06.960
as indelible, as true for all time, and as important for all time. And I just think
00:30:12.460
the only other people who want to do that in our society are white supremacists, right? They have
00:30:17.400
the same logic and it's toxic. So it may be true that some people want that permanently, but I don't
00:30:24.600
think that is the general picture. And I think this is actually where we can find common ground.
00:30:30.900
I did a podcast interview with Mellody Hobson, who'd made the argument against colorblindness,
00:30:37.380
and I actually asked her this very question. I said, in the long term, do you dream of a world
00:30:44.360
where race as an issue goes away and that we just relate to each other as individuals?
00:30:49.000
And, you know, there was a long pause. And she said, that's incredibly hard to articulate,
00:30:55.940
just because I see so many issues in the immediate term of why that approach isn't working. But
00:31:03.120
long term, at least what I heard her say is that that is the dream. And I think most of the people
00:31:09.900
at TED who were upset by the talk, and I include a couple of close friends of mine,
00:31:15.700
would absolutely say that long term, we want a colorblind society. That is the dream.
00:31:23.640
And that in their view, the way to accelerate that is to look specifically at some of the issues
00:31:29.760
around race and try and tackle them directly. So look, to me, this is a really important debate,
00:31:36.460
and I'm very glad we had it at TED. And, you know, I think Coleman was embraced by many,
00:31:43.160
many, many people at TED. And as a whole, you should know, the world should know that we are committed
00:31:50.940
to having an open stance toward important, often controversial issues, and to not be owned by
00:31:59.340
one tribe. And I think the tribalism is the biggest problem, Sam. It's the fact that so few people are
00:32:08.380
willing to explore and live in the very uncomfortable space between the tribes. I think
00:32:15.700
that actually, most people in America, in Europe, in the world, are not at the extreme ends of the
00:32:25.080
spectrum here. I think most people actually would, if they could find a way, like to embrace a more
00:32:30.800
centrist position. The trouble is that if you put your head above the parapet, you get it blown off by
00:32:35.800
both sides. And that is difficult. Actually, I speak about this a bit in the book: one of the forms
00:32:40.720
of generosity that I think is most important for the era that we're in is bridging. It's the ability
00:32:47.320
to, and willingness to listen to people from the other tribe with respect, and to try to understand
00:32:56.080
them, and to be willing to try to find language, you know, that can find common ground. If we don't find
00:33:04.520
people who are willing to do that, we're screwed. And so I think it's actually one of the most
00:33:09.240
important forms of generosity that there is. I tell the story in the book of this African-American
00:33:14.280
Daryl Davis, who, puzzled why people hated him because of his skin color, you know, invited the
00:33:20.600
local leader of the Ku Klux Klan to a meeting, ended up forming a relationship with him. He went to
00:33:27.760
KKK rallies, and they somehow built a friendship. And this guy eventually
00:33:33.860
left the KKK. And his story, you know, became widely reported and inspired many
00:33:41.920
others. I mean, people like him, I think, are rare modern heroes that we need so many more of.
00:33:49.420
And it's just a very hard space to be in. I'm doing a terrible job of trying to be in that space,
00:33:53.600
but I'm absolutely determined to try and be in that space for TED. We're going
00:33:57.480
to get it wrong a lot of the time. We're going to be laughed at by people actually on both sides,
00:34:03.480
but that's okay, because we have to be there. Ideas are supposed to be able to leap
00:34:10.880
from one tribe to another. That's the whole reason they're powerful. And if we, if we give up
00:34:15.420
that connectivity, we're giving up one of humanity's most important superpowers.
00:34:20.080
Yeah. Well, I applaud your efforts to continue that conversation and, um, I encourage you to
00:34:26.740
keep at it despite the pain it may cause you, because this is an enormous problem
00:34:32.540
which is making so many other political impasses in our society just impossible to navigate.
00:34:40.280
It touches everything. I mean, as we just saw, it touches the immediate response to the war in
00:34:46.620
the Middle East and, uh, the repute or disrepute in which our, um, most important academic
00:34:53.980
institutions are held. It's touching everything. And,
00:34:59.760
you know, to my eye, many people are just very confused about the ethics and politics here. And
00:35:03.720
there's some low-hanging fruit in terms of right answers that we can get our hands
00:35:08.160
around. And, you know, I happen to think Coleman is in possession of at least one there,
00:35:12.380
but we're not going to resolve it here. And I want to bend this conversation back to, um,
00:35:17.560
something that I know is closer to your heart at the moment, which is the topic of your
00:35:21.240
book, generosity. How do you think about generosity at this point? And how has the internet changed your
00:35:29.100
thinking? Yeah. So there are two pieces, really, that came together for this. One is just the science
00:35:35.580
of generosity. I think it's actually really, really interesting. You know, my background was
00:35:40.640
religious. And, um, one thing that kept me in the church for a long time was the belief that
00:35:49.100
there was no basis for generosity or kindness or goodness or altruism outside a belief in God.
00:35:55.320
I thought that outside that, what you had was an evolved animal, you know, that was evolved to
00:36:01.680
survive, and that there was no basis for conscience or for being kind to others. And
00:36:09.100
it was the most thrilling discovery to realize that, um, actually it was possible for selfish genes
00:36:17.460
to evolve unselfish people, to build unselfish people. And that unselfishness
00:36:25.780
was actually a brilliant way of surviving if it could spread, you know, across a species. And, um,
00:36:34.100
you know, there are scientific arguments about whether that was a group selection thing or
00:36:37.920
individual selection, you know, quite how that worked. But the fact is that humans, along with several
00:36:42.500
other species, have developed this as a kind of superpower. And it's that that has enabled
00:36:49.240
basically everything that we've done, by being the cooperating species that learned to,
00:36:56.020
for example, you know, share the bounty of hunting or whatever back in the day, and learned mechanisms
00:37:02.880
for trust and belief in each other. It's that, I think, that is at the heart of everything
00:37:08.000
that we have built. It's at the heart of civilization. Um, so that's one piece
00:37:13.120
that's cool. A second piece, again, you know, biological, is new evidence around just how
00:37:21.260
wired we are to respond to generosity. You know, I got this ringside view of a crazy experiment,
00:37:28.360
called the mystery experiment, where 200 people on the internet were given $10,000 out of the blue,
00:37:35.900
told no strings attached. You just have to tell us what you spend it on. And they ended up spending
00:37:41.920
two-thirds of that money generously. Like, this is not, you know, what the rational
00:37:48.760
agent theory of economics would predict, I think.
00:37:51.480
To be clear, this is 200 people getting $10,000 each.
00:37:54.040
So $2 million. Yeah, that's right. And, um, I had the chance to speak with them after this
00:38:00.160
experiment was done. It was done in partnership with Elizabeth
00:38:04.160
Dunn at the University of British Columbia. Um, they published on it, you know, it's, it's the
00:38:09.960
biggest experiment of this kind. You know, past experiments have been done where,
00:38:13.980
like, psychology students were given 20 bucks, and yeah, they tended to be
00:38:18.500
generous with that. But, um, the fact that this held across seven countries and different
00:38:23.480
income levels and so forth is really surprising and kind of wonderful. It shows that
00:38:29.380
there is this mechanism in there whereby ripple effects can happen from generosity. There's a
00:38:34.680
third biological piece, um, which is the feeling we get when we see other people
00:38:41.540
being generous. There's a sort of feeling of uplift that also increases levels of generosity.
00:38:46.420
So put those things together with the fact that we're in this connected age now, and it's
00:38:53.320
actually much easier to give away things that are hugely valuable to people at unlimited
00:39:00.620
scale. So those ingredients in principle create a sort of playbook whereby acts of generosity
00:39:08.260
could absolutely send ripples across the world in a great way. You've discovered this
00:39:14.540
by the way, in your own work here on this podcast, like you used to write books and sell them and
00:39:19.200
so forth. And you took a risk at some point and put all your time and effort into,
00:39:23.400
you know, giving. I mean, people can support your podcast, but you also give it away.
00:39:28.600
And what you discovered was that your impact on the world increased by at least an order of
00:39:34.000
magnitude, I think. And, you know, okay, is that self-promotion or is
00:39:39.360
that generosity? Well, it's both. I think both are there. And I think your
00:39:43.500
willingness to spend all this time, you know, offering reason and insight to so many people,
00:39:49.880
I view that as a fantastic gift to the world that was not possible a couple of decades ago.
00:39:57.360
And it's amazing that we're in a world where that can happen, where from your mind
00:40:02.220
we can get all this stuff, and I can listen to you every week or whatever. I think
00:40:06.940
that's incredible. There are so many examples of this under the radar that I think
00:40:12.600
it's worth putting a label on it and trying to imagine how we tweak it and dial it up. And so
00:40:19.640
the label I put on it is infectious generosity. And I think there's a pathway where it
00:40:27.800
can allow us to reclaim the internet as a force for good in the world, giving us at
00:40:34.300
least a shot at a more hopeful future. We could talk about the various digital business models
00:40:41.400
here and how they interact with this concept of generosity, because, you know, I have
00:40:47.440
had a fairly unique experience here and I'm happy to get into that if it's of interest, but to your
00:40:53.200
first point about the common misunderstanding of our biological selfishness, right? The phrase
00:40:59.960
is the selfish gene, which, you know, was almost an afterthought.
00:41:05.400
I mean, Dawkins could have named it the immortal gene or the eternal gene. I forget which;
00:41:09.880
I think the immortal gene was what he might have named it. But, um, it doesn't mean that we're not
00:41:16.040
capable of altruism, obviously, or selflessness even, and self-sacrifice. And
00:41:22.040
there's an evolutionary rationale for why we would be, from the gene's-eye view. I mean, there's,
00:41:26.680
you know, kin selection and other properties of biology there. But on the psychological
00:41:32.460
side, there's just this fact, which you point to, that generosity and even classic, you know,
00:41:40.440
selflessness just feels good, right? Which is to say that caring about others is a very good strategy
00:41:47.380
for caring about yourself, right? And so this opposition between selfishness and selflessness
00:41:53.280
is fairly spurious. I mean, certainly as you climb the hierarchy of needs towards something like
00:42:00.040
self-actualization, your actualization entails this capacity to genuinely care about other people
00:42:08.760
and to genuinely love them and feel rewarded by your moral concern for them. And so
00:42:15.680
there's just this thing that we might call, you know, wise selfishness, which contains all of the
00:42:23.260
pro-social and even self-sacrificing attitudes and norms that we would want it to, and which are
00:42:31.020
traditionally celebrated and championed under the banner of one or another religion. I mean,
00:42:38.140
Buddhism is very articulate on this but, you know, as is Christianity. And yet this is not
00:42:44.140
biologically mysterious, and it's not psychologically mysterious. And yet many people have drawn the
00:42:50.940
opposite lesson: that not only are selflessness and altruism and self-sacrifice
00:42:56.940
false norms, they're basically illusions, right? If you prick any
00:43:04.300
one of those balloons, what you find in the middle of it is just pretension, you know, as I said at the
00:43:11.100
top, just that people are virtue signaling;
00:43:15.580
it's just another way of being self-regarding. And you touched on this a little bit in your book,
00:43:22.300
where you deal with the Kantian, uh, notion of the purity of one's motives with respect to
00:43:29.580
generosity, and the necessity that they not be mixed with any desire on one's own part
00:43:36.060
to feel better or to be better or to burnish one's reputation. How do you think about the
00:43:41.180
mixture of motives that can inspire giving? And this notion, which I think you
00:43:47.580
and I are going to agree is a false ideal of moral purity, comes from Kant and elsewhere.
00:43:53.420
I mean, there's also kind of the Christian notion here where, you know, you shouldn't be calling
00:43:58.140
attention to your acts of piety or your acts of generosity. Therefore the best form of giving,
00:44:05.020
many people in our society believe, is, almost by definition,
00:44:10.300
anonymous. If you're giving anonymously, well, then we know it's pure, but if you're giving in
00:44:14.860
a way that calls attention to the giving, well, then there has to be a mixture of motive. But in my
00:44:20.380
view, you actually can do much more good when you are candid about how important philanthropy is to
00:44:27.820
you and how rewarding it is to you, and how consequential it is in the world. So just give me your thoughts on that.
00:44:35.260
I think there's a simple mental shift that we all need to make, which is instead of looking for the
00:44:42.780
bad in each other's motives, to look for the good. You know, as a philosophy student,
00:44:48.380
I literally would lie on the floor of my room for hours at a time, trying to figure this one out,
00:44:54.780
because of this notion that, like, I wanted to be a good person, but to be good without external
00:45:02.460
motivation, to be purely good; I couldn't see how that happened. Like, even if you were just trying
00:45:09.500
to obey the call of conscience, wasn't it true that obeying the call of conscience in a sense
00:45:14.620
felt good? And so wasn't that therefore, in a way, selfish? And so I think
00:45:20.380
the truth is that people have always done generosity for a reason.
00:45:26.540
And, uh, there's always been a little bit of selfishness in the mix. Now, the amount
00:45:32.940
of selfishness may vary in different people. But if someone does something
00:45:39.500
knowing secretly that it is in their long-term self-interest that if they do this, they're going
00:45:45.420
to feel deeply happy at the end of the day, or that it may actually enhance their reputation in the
00:45:52.860
world, is that a bad thing? No, it's not. We should embrace it. And so I think
00:45:58.460
one of the core things I argue for here is that we have to let go of this
00:46:04.060
idea of perfect generosity. It never was perfect. And in the modern era, there are actually more
00:46:10.140
reasons than ever to look at the effects of generosity. Generosity is actually an amazing
00:46:16.380
strategy for any person, any organization, any company. You give away stuff and, you know,
00:46:23.540
incredible things can happen as a result. It can boost your reputation. I think rather than
00:46:28.620
criticizing people for that, we should celebrate it because we want more of that. We don't want less
00:46:33.420
of that. We want more of that. And so the simple shift of saying, stop, you know, just stop this
00:46:39.080
ridiculous cynicism about someone's motives, or trying to second-guess: you know, if someone gives
00:46:44.780
away money, were they really doing this to enhance their reputation? That's not
00:46:50.240
a bad thing. You know, it's good. It's good that they gave away the money for a good cause;
00:46:55.000
celebrate it, and give other people permission to do it. Just that simple shift
00:47:02.560
would unlock so much extra kindness because I think a lot of people are fearful of doing it or
00:47:08.160
certainly fearful of saying anything for precisely the reason that they're going to get shot at. It's
00:47:13.200
crazy. It's really crazy. We've got to get past that and, uh, look at the world
00:47:18.560
more realistically. And then, just connected to that: I mean, you said
00:47:24.780
it was sort of obvious, in a way, that, um, you know, giving can be good for
00:47:30.760
you, can make you happy, and so forth. I actually think for a lot of people it's not that
00:47:34.640
obvious. I think that the psychology around this is really quite confusing to me and quite
00:47:39.320
weird, because it's true that there are indelible, profound links between generosity
00:47:45.420
and happiness, but I think they're often hidden. In the moment when we're thinking about
00:47:50.460
being generous, what we're also aware of is the fear of loss, and loss aversion
00:47:55.960
is a really powerful thing. And it's entirely possible for someone to go through a month
00:48:00.900
without really thinking about opportunities to be kind, because, you know, life is hard and
00:48:06.900
you're focused on your work, or on, you know, how to get the next
00:48:11.720
paycheck in, or whatever it is. And, um, those feelings are much more
00:48:17.420
pressing, shall we say. And so I think part of the playbook here, part of the life
00:48:23.840
hack, if you like, is to remind ourselves that actually, if you take a gamble on being generous,
00:48:31.920
even if you may not feel it with any certainty in the moment, you can be pretty sure that
00:48:36.740
afterwards you'll get payback. There will be a feeling of fulfillment. And even
00:48:42.620
if there's no payback, if you like, from the people you gave things to, which there may be
00:48:49.140
as well, there will be payback in just, you know, the sort of feeling of, oh, I can be that
00:48:55.100
person. Everyone at some level wants to be their better self, I think. Yeah. Well, I think we
00:49:02.420
perhaps should linger on this point of, um, the difference between intentions and results.
00:49:08.300
I guess there's a third factor here, which we haven't named but have already
00:49:13.540
mentioned, which is just the way we feel as a result of giving. So I think intentions and reward,
00:49:20.880
uh, from giving, and results are all separable, and they all matter, but they matter
00:49:27.800
differently. And I think it's important to optimize for all of them and just to be aware of
00:49:34.740
the trade-offs here. And this is really the main thing that I found
00:49:39.540
so valuable in effective altruism. And we'll talk about the, um, bad odor that surrounds
00:49:46.900
this phrase now as a result of Sam Bankman-Fried, but, um, I've never formally considered myself an
00:49:54.460
effective altruist, you know, because it's always seemed a little too online and a little
00:49:59.820
too cultic and a little too shot through with Asperger's or something. But I, you know, I've
00:50:05.380
talked to the principal founders of it, you know, Will MacAskill and Toby Ord and Peter
00:50:11.100
Singer as well, and admire those guys immensely. And the real contribution it's made to my own life
00:50:18.180
is to differentiate these factors so that I can recognize that the reward I get from
00:50:24.920
giving is separable from the results. And there are certain things I can give to
00:50:30.040
where the results in the world are as good as they could possibly be, but it's just not that sexy a
00:50:37.420
cause. And I don't find it that interesting or rewarding, frankly. It's just not the
00:50:42.920
thing that really tugs at my heartstrings. And so what I've decided to do is just optimize for results
00:50:49.920
insofar as I can do that, through people advising me and doing research, and then
00:50:56.520
consider the good feels, and the search for good feels, to be an additional project that is
00:51:03.860
almost like a guilty pleasure. So, I mean, you know, I took the 10% pledge
00:51:08.060
to give, you know, 10% of all pre-tax dollars to the most effective charities.
00:51:14.120
And so I do that, but then I give more than that. And I consider those gifts, you know,
00:51:20.520
to the one person with a GoFundMe who has some truly tragic story, you know, that wouldn't survive
00:51:27.440
an EA analysis, but it's something that I really want to do. And the beautiful thing
00:51:32.620
is that it has made it even more rewarding to do those things. It's truly almost
00:51:38.040
like a noble form of greed that gets satisfied by helping people in those specific ways
00:51:46.500
that fall outside of the optimized results analysis that comes from an EA perspective.
00:51:53.640
And then, finally, intentions matter, you know, apart from the way in which they
00:51:59.040
color our minds moment to moment and really dictate, you know, who we are in relation to other
00:52:05.220
people, they matter just because, you know, if someone's giving to a cause simply to
00:52:11.400
burnish their reputation, and that's all they care about, and they don't really care about alleviating human
00:52:16.040
suffering or doing any other good in the world, well, then you can
00:52:21.000
predict that in the future they're going to be blown around by
00:52:25.480
whatever incentives aim at burnishing their reputation. Right? And they're not going to be
00:52:30.460
reliable partners in the project of reducing human suffering, because that's not what they care
00:52:35.400
about. So that's why intention matters. But I just think we can keep all of those specific
00:52:41.780
dials in view and tune them as correctly as we can so as to have both the best possible personal
00:52:49.620
experience and also do the most good in the world. Yeah. So there's a lot there. I agree with you on the
00:52:57.880
intention side. If you knew for certain that the only reason someone was
00:53:01.600
doing something that looks generous was to burnish their reputation, then that doesn't really
00:53:07.520
count as generosity. But we actually never know that. And it's still good; it achieves good.
00:53:12.900
If some sociopath is going to give you a hundred million dollars to build a hospital,
00:53:17.440
you still get a hospital. But I don't think you have to describe that as generosity.
00:53:21.840
But I think the truth is that in almost every circumstance, we don't know. I think people's
00:53:26.760
intentions are almost always mixed. I think most people, their main focus is the giving away of
00:53:32.880
the money and the hope that it will do something. And, you know, they're happy that
00:53:38.100
their reputation may benefit as well, but it may not be the main
00:53:43.680
driver. The point is, from the outside, you don't know. And therefore, what business do we have to
00:53:47.720
take the cynical view? I think it's really important just to look for the good in the intention
00:53:54.020
rather than the bad. But in terms of, you know, what you were saying about
00:53:59.700
that, there's this really important dance, I think, between our generous instincts and our
00:54:05.380
reasoned response to those instincts. I think the reason part is really important. Traditionally,
00:54:13.140
people often, you know, when they give, it is on impulse. You know, you see there's some
00:54:18.820
tragedy in the world and you think, oh, I must, you know, text in my contribution or
00:54:23.400
whatever. And that feels good. Or we just, you know, feel great empathy for
00:54:30.260
someone close to us and they have a need and we'll support that. Paul Bloom, you know, wrote this book
00:54:35.480
Against Empathy, which was kind of trying to argue that too much of that, you know, doesn't get
00:54:42.040
us where we need to go, because there are only so many resources in the world, and it really is
00:54:47.080
important that we spend them wisely. And so I think there's a really important dance that each of
00:54:51.940
us has to do, which is to try to bring our reason to bear and to say, okay, I care about
00:54:58.400
this. Stop. Don't write a check right away. Just think for a minute. What is the
00:55:03.240
most effective thing you can do? And, you know, to go to effective altruism, if you just ask
00:55:10.420
the question, do we want our altruism to be effective? Well, yes, I think we do. You know,
00:55:16.620
we'd like it to be smart. We'd like it to be reason-based. We'd like the money to be spent
00:55:22.120
wisely. And, uh, that's why I'm with you. You know, I've got huge respect for people like
00:55:28.340
Will MacAskill and the other founders of effective altruism. I think if you view it as a journey,
00:55:35.940
and I think Will would say this now about it: any individual recommendation they make at any
00:55:41.620
point in time is provisional. What they're anchored in is the desire to figure out what is a wise way
00:55:48.860
to give. And when you put it that way, that is absolutely the right question to ask. What is a
00:55:54.260
wise way? Because it's not obvious. There are, you know, orders of
00:55:58.220
magnitude of difference between spending money smartly and spending it stupidly. And, uh, I think,
00:56:04.360
you know, I talk a lot in the book about trying to find the leverage point. When you spend
00:56:08.960
money, you're looking for what it is that makes this good value for money. You know,
00:56:14.820
can you leverage technology? Can you leverage government policy? Can you leverage the internet?
00:56:19.640
Can you leverage education and knowledge? And I think when you find something where you
00:56:27.420
go, wow, that organization is being really smart about how they're spending their money, it's
00:56:32.640
very exciting to give to it. And I agree with you that it doesn't have the immediate feels of, you know,
00:56:38.960
here's a family in need, you write them a check, they cry, you cry, lovely. But there
00:56:44.860
is actually something deeply satisfying, I think, about taking the time to have a plan,
00:56:52.140
to think about what you can afford to spend, to commit to spending it, and to get it
00:56:58.700
to a point where you feel good about it. It really makes you feel different. I mean,
00:57:03.300
I know, you know, I had a foundation, and for years I couldn't figure out what to
00:57:07.460
spend the money on that got me excited. Coming to TED, I suddenly found, wow, okay,
00:57:12.440
I get this. I'm leveraging the power of ideas. I love that. And it didn't give
00:57:17.820
me the kind of, here's a family crying and I can do something about it, kind of feels, but it
00:57:21.560
gave me a different kind of satisfaction to have found what was, for me, the right way of spending
00:57:26.980
the foundation's money. So that's the dance, I think: start with the
00:57:33.460
biological instinct and the realization that for you to feel good about
00:57:37.500
yourself, you know, generosity should be part of who you are, but then take the time to think it
00:57:43.400
through and figure out what is the wise way to go. Because there are just such
00:57:48.540
different answers to that. Well, let's talk about business models because I have a fairly unique
00:57:55.640
one, which you alluded to, and it might be slightly different than you imagine. And
00:57:59.840
also you at TED have a fairly unique one, and there are actually some similarities
00:58:03.640
between the two and I just would love to get your take on this. So originally on the podcast, I had
00:58:08.860
the kind of NPR/PBS/Patreon-style business where, you know, it was free for everyone to
00:58:16.320
listen to. People expect podcasts to be free. And, you know, for a variety of reasons,
00:58:21.960
I didn't want to take ads. So I just said, you know, if you enjoy what is happening over
00:58:26.980
here, you can support it, and here's how. And I think I was very successful in doing that.
00:58:33.080
And it was, you know, a real business, and it was quite stable. And I had about
00:58:39.400
probably two years of running it that way where, I mean, the income was perfectly
00:58:45.880
reliable. I mean, maybe it moved around by two or three percent in any given month, but it
00:58:51.140
seemed completely steady. I had found the floor and the ceiling, and it was
00:58:56.160
just humming along. And nothing seemed broken about it except for two things. One,
00:59:02.860
I released my meditation app alongside this podcast and that was structured differently.
00:59:09.780
That was an app in the App Store and it was paywalled. I mean, you could use
00:59:13.860
it for free for some time, but then you had to subscribe. And if you couldn't afford it,
00:59:18.080
you could just send an email and we would give it to you for free. And that's still the case. And
00:59:21.640
it's still the case on this podcast. But I had these two experiments in business running side by
00:59:28.160
side. One had a paywall; one was sort of opt-out and one was opt-in, right? And it was really
00:59:34.120
just, you know, MP3 files on both platforms. And I noticed that by comparison with the app,
00:59:40.560
the podcast business was broken. And I didn't do anything about it until I stumbled upon a
00:59:48.140
conversation on Reddit, which truly bowled me over. And I think you might find this interesting
00:59:53.700
because this revealed to me that as a business model, there's something wrong with the spirit
01:00:01.240
of generosity and the spirit of philanthropy. I spectated on this thread, which was
01:00:07.340
something like, you know, do you support Sam's podcast? And if so, why or why not? And someone
01:00:13.460
said, well, I would support the podcast if I knew what he was going to do with the money. And another
01:00:19.120
person said, well, I would support the podcast if I knew how much it costs to run a podcast.
01:00:25.080
And another person said, well, you know, I would support the podcast if I knew how much I thought a
01:00:30.640
person should earn from a podcast, right? And I looked at those three statements and my head
01:00:37.440
practically fell off my shoulders because I realized at a glance there was something deeply
01:00:42.720
wrong here because these are the kinds of considerations that would never have occurred
01:00:46.860
to a person when they were thinking about buying my next book, for instance. I mean,
01:00:50.800
there's literally no person on earth who's ever thought the thought, well, I would buy his next book
01:00:55.760
if I knew how much I thought an author should make from writing books, or I would buy his next book if
01:01:01.800
I knew what he was going to do with the money, right? I mean, these are just not the kinds of
01:01:05.220
thoughts people think. Either you want to read the book or you don't, and you buy it or not. And so
01:01:10.120
I decided to make the podcast emulate the app, in that I made the paywall compulsory:
01:01:20.100
you know, it's now paywalled, and if you're not behind the paywall, you're only hearing
01:01:24.060
the first part of every podcast. But if you can't afford it, you just have to send an email and we
01:01:29.760
give it to you for free, right? And I staff, literally full-time, something like 30 or 35 people
01:01:35.800
in customer service, and 95% of what they do is just minister to free accounts. And there were
01:01:41.520
days during COVID where a thousand people would send that email, right? And, you know, many
01:01:46.660
hundreds send that email on any given day for the app or the podcast or both, and that's fine. So I'm very
01:01:52.760
committed to money never being the reason why someone can't get access to what I'm doing because
01:01:58.260
it's important that that not be the reason. And it being a digital product, you know, allows for that.
01:02:04.060
But the crucial thing to realize here is that when I changed the paywall on the podcast,
01:02:10.860
first of all, when I announced that I was going to do this, many people predicted that it was going
01:02:14.120
to be a disaster, that, you know, my audience would shrink to nothing and that no one would pay
01:02:19.660
because everyone expects podcasts to be free. Unlike apps, everyone expects podcasts to be
01:02:24.400
free. And, you know, most of them are ad-supported. And what happened was
01:02:29.060
if you'd like to continue listening to this conversation, you'll need to subscribe at
01:02:34.320
samharris.org. Once you do, you'll get access to all full-length episodes of the Making Sense
01:02:39.600
podcast. The podcast is available to everyone through our scholarship program. So if you can't
01:02:44.800
afford a subscription, please request a free account on the website. The Making Sense podcast
01:02:49.840
is ad-free and relies entirely on listener support. And you can subscribe now at samharris.org.