#271 — Earning to Give
Episode Stats
Length
1 hour and 15 minutes
Words per Minute
172.9
Summary
Humanitix is an event ticketing platform, founded by Joshua Ross and Adam McCurdie, that donates 100% of its profits to children's charities. It works like Eventbrite or Ticketmaster, but it is also a registered charity: the service is free for free events, its fees for paid events are lower than most ticketing services, and all profits from ticket sales fund social impact projects for disadvantaged kids. Unlike a normal charity, it never has to ask for donations and isn't sitting on an endowment to cover its costs; it simply runs a successful business. Sam describes this as a genuinely new model, a path toward what might be called compassionate capitalism or ethical consumption, and congratulates the founders. He then thanks the Making Sense audience, whose generosity toward charities mentioned on the podcast has amounted to millions of dollars, and discusses the Giving What We Can pledge, which at a minimum commits one to giving at least 10% of pre-tax income to the most effective charities each year; Waking Up was the first company to take the analogous company pledge. Finally, he speaks with Sam Bankman-Fried, founder and CEO of the cryptocurrency exchange FTX, about earning to give, the crypto ecosystem, wealth inequality and redistribution, norms of generosity among the ultra-wealthy, pandemic preparedness, and how ambitious we should be in doing good.
Transcript
00:00:00.000
Welcome to the Making Sense Podcast, this is Sam Harris.
00:00:25.280
Okay, as we get to the end of the year here and into the holidays, I guess the theme of
00:00:34.120
the moment for me is doing good in the world, how to think about that, how to do more of
00:00:41.900
it, how to appropriately show gratitude for one's good luck, and share that luck with
00:00:52.860
those who need it. So at the top here, I am going to produce an ad. As you know, I don't
00:01:01.520
run ads on the podcast, but this is an ad for a very good thing, with which I have no
00:01:08.440
direct affiliation. A couple months ago, Peter Singer, the Australian philosopher and one of
00:01:16.360
the patriarchs of the effective altruism movement, told me about a company called Humanitix, which
00:01:22.720
was founded by Joshua Ross and Adam McCurdie. They started in Australia and New Zealand,
00:01:29.660
but now they are global, and they've just set up their first U.S. office in Denver.
00:01:35.560
And this struck me as an extraordinarily cool idea. Humanitix is an event ticketing platform
00:01:42.540
that donates 100% of its profits to children's charities. So they operate like a for-profit
00:01:50.960
business, in that they don't ask anyone for donations. They just sell tickets. It's like
00:01:56.540
Eventbrite or Ticketmaster. However, they're also a registered charity, and 100% of their
00:02:03.540
profits go to social impact projects for disadvantaged kids. And their service is free for free events,
00:02:09.900
and for paid events, their fees are actually lower than most ticketing services, and they offer
00:02:15.820
a special discount for non-profits and schools, etc. And again, all the profits go to charity.
00:02:23.600
So if you're organizing an event and selling tickets, it seems to me this is just a totally
00:02:28.680
straightforward and ethical way to make a tangible contribution to the world. And it's just a very
00:02:36.100
cool model. I think it has the potential to massively scale the amount of money that gets
00:02:41.280
allocated to philanthropy. Because what they've done is they've effectively replicated the venture
00:02:47.100
capital model in the charity sector, which is to say they've built a real business, and in this case
00:02:53.240
they're taking on businesses like Eventbrite or Ticketmaster, but they're doing it 100% for charity.
00:02:59.840
But unlike normal charities, they never have to ask for donations, and they're not sitting on a large
00:03:05.040
endowment that's covering their costs. They're just operating a successful business.
00:03:09.100
This does strike me as genuinely new. It opens up a path for what might be called compassionate
00:03:18.780
capitalism or ethical consumption that has not been altogether obvious, but all of a sudden,
00:03:26.360
here it is. So I think this is very exciting, and I want to congratulate Josh and Adam for doing this.
00:03:33.920
Great idea. Best of luck to you both. And if any of you want more information,
00:03:38.680
if you're looking for a job and want to work for an inspiring company, or you just want to sell
00:03:44.420
tickets, please check out Humanitix.com. That's H-U-M-A-N-I-T-I-X dot com.
00:03:54.660
Okay. Well, continuing with that theme, I want to say something which I believe I've said before
00:04:00.780
on the podcast, but it's an epiphany I keep having again and again. And it's about the generosity
00:04:08.020
of the Making Sense audience. All of you guys and gals. I keep hearing from charities which I've
00:04:17.900
mentioned on the podcast, or perhaps I've interviewed someone involved. These are charities like the
00:04:23.460
Plowshares Fund, which is working to reduce the threat of nuclear war. I mentioned them a couple of
00:04:30.380
times, in particular when I spoke to William Perry about his book on the topic, or the Good Food
00:04:36.860
Institute, or the Bard Prison Initiative, or GiveWell.org, which recommends a wide range of
00:04:44.200
effective charities. These and other organizations keep contacting me just to say how astoundingly
00:04:51.460
generous my audience is. And I'm wondering if you can appreciate what an amazing feeling that is to be
00:05:00.440
on the receiving end of that kind of information. It seems that you all have given millions of dollars
00:05:06.780
to various causes here. And it's just remarkable to see. So sincere thanks to all of you for that.
00:05:15.140
This has become one of the amazing and unexpected pleasures of doing this job.
00:05:21.700
Now, as many of you know, at some point in 2020 I took the Giving What We Can pledge, which exists in
00:05:29.180
various forms, but the basic pledge is to give at least 10% of one's pre-tax income to the most
00:05:37.240
effective charities each year. And this is the minimal pledge. Some people give much more than that.
00:05:45.140
And that's over and above anything one gives to any other causes, whether it's your church or
00:05:50.720
synagogue or your children's school or to your university, or perhaps it's some GoFundMe campaign
00:05:58.140
that inspires you. The Giving What We Can pledge stands on its own over and above all of this.
00:06:05.840
And the main criteria there is to target what you can rationally understand as the most effective
00:06:11.360
ways of minimizing human and animal suffering and mitigating the most catastrophic risks.
00:06:19.600
So there's often long-term thinking built into many of the charities that effective altruists tend to
00:06:24.820
support. And once again, givewell.org is a great source for recommendations. And Waking Up,
00:06:32.220
my meditation app, was the first company to take the Giving What We Can pledge.
00:06:38.140
At my urging, they created a pledge for companies, which is analogous to the personal pledge.
00:06:46.280
Here you commit to giving 10% of profits to charity each year. And if you want to see some of the
00:06:52.840
organizations we've supported so far, you can go to wakingup.com forward slash foundation.
00:06:59.540
But I wanted to say a little more about taking the Giving What We Can pledge because, as you'll hear,
00:07:05.880
it's relevant to today's podcast. Taking this pledge is psychologically much more interesting
00:07:13.780
than I realized. And it's interesting wherever you sit on the economic spectrum. For instance,
00:07:20.280
if you don't make a lot of money, you might think, well, I need all the money I make. I spend more or
00:07:26.040
less every penny. And if I don't spend every penny, I need to save something for the future. So I
00:07:32.760
certainly can't afford to give a minimum of 10% of my money away every year. But the interesting thing
00:07:38.560
is that as you begin to earn more money, and even a lot of money, you begin to think, well, 10% of what
00:07:46.460
I'm earning now is quite a bit of money to be giving away every year. It's a lot more money than most
00:07:52.800
people at my level give away. And so what's interesting is that you can find a way to be
00:07:58.600
uncomfortable with this pledge at any level of earning. But once you take it, some very interesting
00:08:07.140
things happen. Speaking for myself, it really has become a source of great satisfaction because it's
00:08:15.780
just an amazing privilege to support great causes. And to know that whatever else I'm doing with my
00:08:22.580
time, and however mixed my motives might be in any moment, by making this decision, I've taken all the
00:08:30.240
psychological friction out of my being generous and effective in the world. Because I've decided in
00:08:37.600
advance that I'll support these very good causes to this degree. And my giving here is no longer
00:08:44.460
vulnerable to my moods, or my rethinking anything. I mean, the only freedom I have is to give more
00:08:51.760
than 10% away, or to give to other things that wouldn't count toward my minimum of 10% that goes
00:08:57.640
to the most effective charities. As I told Will MacAskill in one of our conversations, it is an amazing
00:09:04.000
feeling to be giving money to a children's hospital or to a women's shelter, and for it to feel like a
00:09:11.420
guilty pleasure. It's like you're splurging on something that you really want selfishly.
00:09:17.100
The pledge just inverts the usual psychology around generosity in a fascinating way. Anyway,
00:09:23.660
I'll have more to say about that in the new year. We have a project over at Waking Up that's relevant
00:09:29.040
here. But in the meantime, if you're at all interested in taking the pledge in any of its variants,
00:09:38.100
please go to givingwhatwecan.org for more information.
00:09:43.980
Okay, and now for today's podcast. Today I'm speaking with Sam Bankman-Fried. Sam is the founder
00:09:52.180
and CEO of FTX, a cryptocurrency exchange, and he's also the CEO of Alameda Research,
00:09:59.700
a quantitative cryptocurrency trading firm. Forbes described him as the richest person in crypto,
00:10:06.280
and one of the richest people under 30 in history. I believe he's made about $29 billion in the last
00:10:14.020
few years. What is more remarkable than that is that he set out to make all this money explicitly
00:10:20.760
for the purpose of giving almost all of it away to the most effective charities, and to thereby do as
00:10:27.500
much good in the world as he possibly can. Needless to say, he's an early adopter of the
00:10:32.640
Giving What We Can pledge, and as you might imagine, he's one of the most prominent people
00:10:37.140
in the Effective Altruist community. Sam is also the son of two Stanford law professors,
00:10:42.520
and he received a degree in physics from MIT. So in this episode, we talk about how Sam became
00:10:49.880
as wealthy as he has, how he got into cryptocurrency. We have a brief discussion about that space
00:10:55.940
to bring you all up to speed. We talk about the Giving What We Can pledge, and about how Sam thinks
00:11:01.880
about using his resources to do the most good. We talk about not stigmatizing wealth, wealth
00:11:08.520
redistribution, the norms of generosity among the ultra-wealthy, pandemic preparedness, the impact
00:11:15.840
we can have through lobbying, how ambitious we should be in doing good. Anyway, it seemed like a great
00:11:22.400
topic to close out the year on, and I wish you all a happy holiday and a happy new year.
00:11:28.460
This episode is yet another PSA, so there's no paywall. And as always, thanks to all of you who
00:11:33.800
are supporting the show through your subscription. You are in fact what makes all of this possible.
00:11:41.720
Sam Bankman-Fried. I am here with Sam Bankman-Fried. Sam, thanks for joining me.
00:11:53.040
So there's a lot to talk about. My general interest in speaking with you is your all-too-novel
00:12:01.440
interest in effective altruism. But before we jump into that topic, let's talk about your
00:12:06.660
background a little bit. You are now quite famously referred to as, I believe, unless something has
00:12:13.560
changed based on the volatility of crypto since I began this sentence, the wealthiest self-made
00:12:20.560
billionaire under the age of 30, something like that. Is that still approximately true?
00:12:28.520
And how did that happen? I guess before we get into your crypto experience, maybe
00:12:34.240
summarize your background before that. You're only 29, so there's not that many years to run
00:12:41.620
through. But how do you describe your intellectual interests before you jumped into the world of
00:12:50.500
Yeah, totally. So I grew up in Stanford, California, went to MIT after that, and really had no clue what
00:12:59.100
I was going to do with my life there. I sort of like half-heartedly thought maybe I'd be a physics
00:13:03.560
professor for kind of no good reason. And quickly at MIT learned that I didn't really like research,
00:13:11.400
and I probably wasn't really built for it, and that that was sort of like not going to happen.
00:13:18.200
And around the same time, started thinking for the first time about what I should do with my life.
00:13:24.020
And I think that started out basically just coming from a utilitarian standpoint of,
00:13:29.880
you know, what would maximize ultimate well-being of the world. I hadn't thought about it very
00:13:35.100
carefully, but when I finally confronted this, as opposed to just sort of like hiding it somewhere
00:13:42.280
in my mind, it sort of quickly became clear that at least there can be some things I could do that
00:13:46.780
would have real impact. And, you know, that one of those was going to be earning to give.
00:13:53.580
I just basically thought of trying to make what I can so I can donate what I can. And at the time,
00:13:59.700
I think I was sort of most involved with animal welfare organizations. And, you know, basically
00:14:05.500
went to them and said, hey, like, would you prefer my time or my money? And they said, definitely your money.
00:14:11.080
Mm-hmm. So, but you've jumped into the mode of already earning enough to be of help to anyone.
00:14:18.760
So what was the transition from physics at MIT to finance of some sort?
00:14:25.820
Yeah. So I sort of jumped to the point of thinking about it before I actually got involved in anything
00:14:33.080
that would actually make money. It was very much a sort of like, theoretically, I probably could
00:14:37.560
type thing and, you know, figure out how to do that.
00:14:41.080
And that was sort of the tentative plan. And, but yeah, I hadn't actually yet figured out how.
00:14:48.920
And around that time, I met Will MacAskill and Ben Todd and a few others from sort of the
00:14:54.540
nascent EA movement who were visiting Cambridge and talked to them about what I was going to do
00:15:01.040
with my life. And they, you know, very much thought that the earning to give plan made sense.
00:15:06.040
They also said a bunch of things I hadn't thought about before around what,
00:15:11.080
causes I could ultimately give to. And also confirmed as I'd sort of been thinking that if
00:15:16.100
I was going to earn to give that, like, probably Wall Street was like a good place to look for that.
00:15:21.920
So you met Will MacAskill. Did you also, did you meet Toby Ord as well?
00:15:25.520
I hadn't met him yet. I met him later, but he was not in that particular excursion to the States.
00:15:31.400
Right. So did, did Will give a talk at MIT? Is that where you met him?
00:15:34.360
I think he, he gave a talk at Harvard, but I had lunch with him beforehand in Harvard Square.
00:15:39.860
Right. Yeah. Will's, Will's fantastic. And he, he really is my gateway drug to effective
00:15:45.480
altruism as well. He's been on the podcast a bunch and on the waking up app and Toby has
00:15:50.920
subsequently. And it is fairly thrilling when the scales fall from your eyes and you realize that
00:15:58.920
there's an opportunity to systematically do good in a way that is, it's just like, there's such a clear
00:16:06.640
view of, of ethical daylight in this direction that interestingly becomes uncoupled from the usual
00:16:15.120
things that drive altruism. You just, you know, the good feels of a very compelling story. It's not to
00:16:21.700
say that the good feels aren't important. We want to be as rewarded as possible by the good things we do
00:16:28.200
in life. But there's this other layer of rational acknowledgement that sometimes the most effective
00:16:35.820
ways of benefiting the world are not necessarily the sexiest or not necessarily the, the ones that
00:16:42.760
are effortlessly most enthralling to people. And it's just, just to get a very clear eyed view of all
00:16:49.520
of that. And then to prioritize doing the most good is just a, it's an amazing game to find out
00:16:57.220
that, you know, even exists much less to get involved in. Yeah, I totally agree. And I think
00:17:03.040
that, you know, prior to college, I sort of had somewhere in the back of my mind, oh, maybe I could
00:17:07.760
try and do something with my life that would have impact. And then some sort of part of me is like,
00:17:12.200
oh, I don't know, that sounds kind of hard. I don't even know what that would imply. Maybe I'll just
00:17:16.520
sort of ignore that, you know, who knows what that would mean. And I think like when I started
00:17:21.980
thinking harder about it, and then met, met Will and others and actually sort of dove into the
00:17:27.220
effective altruism community. Yeah, I think one of the first things that really stood out was like,
00:17:31.080
all right, here's like a few concrete proposals, which aren't necessarily the single best thing to do,
00:17:37.140
but are clearly incredibly compelling, and are clearly like massively better than anything sort of
00:17:43.860
like accidental or random that I would have done. And, and that were sort of like, a really
00:17:50.320
convincing case of like, you know, if you think carefully about this, and really do focus on your
00:17:59.560
impact, rather than as you said, on, on sort of just the, you know, that reverberation of the impact
00:18:06.400
back onto you, that you can really get massive, massive numbers.
00:18:12.560
So we've, I noticed we've pitched already into the topic of interest, effective altruism,
00:18:18.240
skipping over the world of crypto. Let's go back for a second, because I would be remiss in not
00:18:25.140
extracting any insights you might have on that topic. Well, first, how did you get into crypto? And
00:18:31.940
what is actually your contribution at this point to that space?
00:18:36.900
Yeah, so I, I went to Wall Street when I when I graduated college and worked as a quant trader at
00:18:44.140
Jane Street Capital for three and a half years. And I had a really, really good time there in a lot of
00:18:49.820
ways. It was a great environment. It was a great fit for me as a job. And it seemed like a really
00:18:55.560
compelling earning to give path. And, you know, they were really good to me there. I just kind of
00:19:01.200
thought that's what my life was going to be. And I was pretty happy with that. And as is, you know,
00:19:06.560
in New York, and then late 2017 came around. And for the first time in three years, I sat down and
00:19:14.280
forced myself to go through an exercise like what I had done in college, where I sort of, you know,
00:19:20.360
drafted down ideas of what I could do, tried to estimate like how much impact could I have through
00:19:25.200
each of them. And shortly after starting that, it became clear what the sort of conclusion of that was
00:19:30.200
going to be, which was that I don't know what I should do with my life that there are actually a
00:19:35.640
lot of things that could have large impact that I wasn't sure which of those was going to be the best
00:19:41.340
and that the only way really to find out was going to be to try them. And it was sort of, you know,
00:19:48.200
either don't try anything and just optimize for this path, or leave and and try a bunch of things and,
00:19:54.760
and that the second in expectation was probably going to be the the better one. So I left this
00:20:02.320
was, you know, late 2017. And I did try a few things I worked briefly for a Center for Effective
00:20:08.540
Altruism. I also started looking at crypto. And this sort of original thesis with crypto was a pretty
00:20:15.200
clear one, which was, it seems like there might be good arbitrages here. Maybe that's true. Maybe that's
00:20:22.820
not true. If that's true, these numbers might be huge. Let's check that out. And so I basically just
00:20:29.800
dove in. And I, you know, created some accounts on some exchanges and like tried to do one example
00:20:36.520
transaction to see like, will this even work? And so what, in fact, were you arbitraging
00:20:42.480
at that point? So at that point, I mean, it was all over the place. But the clearest one was literally
00:20:48.680
just bitcoins against dollars. Like you look at bitcoins on one US crypto exchange against
00:20:54.060
bitcoins on another US crypto exchange. And you know, they'd be trading for $10,000 on Coinbase
00:20:59.500
and $11,000 on Bitstamp. And in theory, one could then buy a Bitcoin on Coinbase for $10,000,
00:21:08.740
send it to Bitstamp, sell it for $11,000 and make $1,000. And then sort of like rinse and repeat.
00:21:13.860
And, you know, whenever you see something like that, you should wonder, like, is this data real
00:21:19.360
or is it garbage? And in particular, the numbers were hilariously big. Unlike in real trading on
00:21:26.340
Wall Street, where if you can make 2% of a percent on a trade, so two basis points, that's a good trade.
00:21:34.480
You know, most firms would be like, yeah, do that trade. You know, if you can do more of that trade,
00:21:38.400
do more of that trade. That's pretty good. Well done. You know, like, not unheard of,
00:21:42.740
but just like a really solid trade. And here we were seeing things like 2%. So 100 times as big.
00:21:51.080
And that's almost always fake when you see it. But it was 100 times as big. And the volume wasn't
00:21:56.600
trivial. You know, it was trading a few billion dollars a day globally. And so in theory, you can
00:22:01.760
sort of do this naive calculation of like, well, let's say you made 1% on every trade. And let's
00:22:07.780
say you did 1% of volume. And so that means, you know, $10 million of volume. And a percent
00:22:14.040
on that is $100,000 per day of revenue. And so, you know, $35 million a year or whatever.
00:22:22.420
And that's a pretty substantial number. You know, and obviously, that was like just some
00:22:28.600
complete bullshit calculation that I did with no idea what a Bitcoin even was. No idea if any of
00:22:34.400
this, you know, any of these numbers are real. But it was enough to convince me that like, maybe
00:22:39.280
there's something good to do here. And that is worth trying out. And so I just sort of created
00:22:44.000
accounts on all the exchanges and started trying to like, you know, go send in the money, buy the
00:22:48.040
cheaper Bitcoins, sell them in other places and see if I could make money doing that.
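To make the arithmetic in that answer concrete, here is a minimal sketch of the back-of-the-envelope estimate described above. The prices, the 1% edge, and the 1% share of roughly a billion dollars of daily volume are the rough figures quoted in the conversation; everything else is illustrative, not real trading data.

```python
# Back-of-the-envelope sketch of the arbitrage numbers quoted above.
# All inputs are the rough figures from the conversation, not real data.

buy_price = 10_000    # BTC price on the cheaper exchange (Coinbase, in the example)
sell_price = 11_000   # BTC price on the richer exchange (Bitstamp, in the example)
spread = (sell_price - buy_price) / buy_price
print(f"Spread per coin: {spread:.0%}")  # ~10%, versus ~0.02% (two basis points) for a good Wall Street trade

# The naive revenue estimate: capture 1% per trade on 1% of global daily volume.
daily_volume = 1_000_000_000   # order of magnitude of "a few billion dollars a day"
share_of_volume = 0.01         # trade 1% of that volume, i.e. about $10 million per day
edge_per_trade = 0.01          # make 1% on every dollar traded

daily_revenue = daily_volume * share_of_volume * edge_per_trade
print(f"Daily revenue:  ${daily_revenue:,.0f}")        # $100,000 per day
print(f"Annual revenue: ${daily_revenue * 365:,.0f}")  # ~$36.5 million, "$35 million a year or whatever"
```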
00:22:51.440
Did that turn out to be real? I mean, what explains that inefficiency of pricing?
00:22:56.980
Yeah, so half of it turned out to be fake. About half the cases, it turned out the data that was
00:23:01.220
reported was just misleading in one way or another. A classic example of this is that they would call
00:23:06.840
it a quote, Bitcoin USD market. But the USD, there would not really be US dollars. It would be like
00:23:13.560
dollars on some sketchy third party payment site running through Russia that cost 25% to get
00:23:21.320
money in and out of and you could only get to a Brazilian bank account. So, OK, that was not a
00:23:27.140
real trade. But we got to get you. We got to get you earning to give first. This is your path is
00:23:32.180
blocked by a labyrinth of sketchy tax. Exactly. Yeah. And then you look at like, OK, how about the
00:23:38.440
legitimate data? And it was sort of like a scaled down version of the same issues where, all right,
00:23:43.440
the Bitstamp Coinbase ARBs were sometimes real. But what would happen when you tried to do them?
00:23:49.340
First of all, you'd pay half a percent in fees, all things considered. Second of all, you have to
00:23:55.980
start by getting dollars from your bank account to Coinbase, right? And you send that wire transfer
00:24:00.120
and then you get a notice from your bank that they shut down your account because they didn't have a
00:24:04.020
compliance policy yet for crypto. And now you no longer have a bank account. And then I guess if you
00:24:10.140
want to do the trade a second time, you need a new bank account. And then you get the funds on
00:24:14.480
Coinbase and you buy Bitcoin and they tell you you can withdraw $100 per day. And so you're sitting
00:24:19.340
there and being like, well, I guess I can make $1 per day doing this trade 1% on the $100. And you
00:24:26.140
reach out to them and be like, hey, can I have higher withdrawal limits and get an automated message
00:24:30.340
back saying, sorry, the queue for getting support from us is three months long right now. And you just
00:24:36.860
start running into all these logistical issues that sort of were reflections of the fact that the
00:24:41.600
ecosystem was incredibly new and incredibly unwieldy and not very well developed. The infrastructure was
00:24:50.640
all broken. And it wasn't impossible to do these trades. It was just like really hard and annoying.
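A small extension of the same sketch, showing how the frictions just described eat into the naive edge. The half-percent fee figure and the $100-per-day withdrawal limit are the ones quoted above; the rest is illustrative.

```python
# How the practical frictions described above shrink the naive arbitrage edge.
# The fee level and withdrawal limit are the figures quoted in the conversation.

gross_edge = 0.01              # the ~1% spread when the Bitstamp/Coinbase arb was real
fees = 0.005                   # "half a percent in fees, all things considered"
net_edge = gross_edge - fees   # roughly 0.5% left per dollar traded after fees

withdrawal_limit = 100         # "you can withdraw $100 per day"
profit_at_limit = withdrawal_limit * gross_edge
print(f"Net edge after fees: {net_edge:.1%}")                            # 0.5%
print(f"Profit capped by withdrawal limit: ${profit_at_limit:.0f}/day")  # ~$1 per day
```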
00:24:59.100
I'm guessing you now probably have Brian Armstrong's cell phone number.
00:25:02.020
Yeah, I do. And that was, you know, going through that sort of like step-by-step process of like
00:25:10.960
getting an account manager at these exchanges, finding a bank that was comfortable with the
00:25:16.260
cryptocurrency ecosystem and willing to allow us to send transfers to and from crypto exchanges,
00:25:22.180
things like that, you know, getting automated trading systems hooked up to these exchanges,
00:25:26.760
some of which didn't even have APIs. And then eventually looking overseas and saying, well,
00:25:32.620
here's a big arbitrage between American and Japanese exchanges. How do we do that? I guess
00:25:38.500
we have to replicate this entire setup in Japan with a Japanese bank. And that was like the hardest
00:25:44.380
part back in 2017, 2018 of actually doing these trades.
00:25:48.560
So then when did you graduate to building your own exchange?
00:25:54.040
Yeah. So after about a year of this, it's, you know, there are clearly still good trades to do
00:25:58.680
in crypto, but just as clearly like the ecosystem was a mess and, and the ecosystem in crypto really
00:26:06.480
means the exchanges in a way it doesn't in traditional finance. Like if I asked you, like,
00:26:10.840
what are the five most important finance companies? You probably wouldn't just start listing off like
00:26:16.180
New York Stock Exchange, NASDAQ, CME, ICE, CBOE, like maybe one of those would make the list.
00:26:22.800
Probably not. You know, I'm guessing you'd name like, I don't know, like Goldman or JP Morgan or
00:26:28.000
something or Robinhood, maybe. Crypto is different. And the reason crypto is different is that the
00:26:33.200
entire financial stack is collapsed into one product and that product is the exchange. And so when,
00:26:39.380
when you go to buy a stock, you're going through 12 companies, you're going from Robinhood to some
00:26:43.920
payment for order flow firm. There's some stock clearing, there's some dollar clearing system in
00:26:48.120
there, some stock loan desk, you go to an, uh, you know, dark pool, another, you know, PFOF firm
00:26:53.100
eventually end up at an exchange. And then the whole other side of that, you know, on the selling side
00:26:57.560
in crypto, the only people involved in the average transaction are the buyer, the seller, and the
00:27:02.380
exchange. And all of those functions, from clearing, settlement, risk, compliance, know your customer,
00:27:08.260
um, mobile app, API, matching engine, all of that is collapsed into the exchange. And so they really
00:27:15.240
are the backbone of, of the trading ecosystem in crypto. And so if you wanted to address the
00:27:21.540
infrastructure, that's where you went and boy, did the infrastructure need addressing.
00:27:32.200
So we launched FTX in spring 2019. And basically the thesis was like, on the one hand, these businesses are
00:27:38.620
making a billion a year collectively. They seem like fairly, like fairly understandable businesses.
00:27:44.220
Like we understand their core function pretty well and could build that. They're online products,
00:27:50.400
um, which are easier to launch. And they're just shit shows all over the place. Like they're losing
00:27:57.520
a million dollars per day of customer funds to incompetent risk controls. Their customer support
00:28:03.160
departments were nearly non-existent. Many of them basically didn't have compliance departments.
00:28:08.040
Many of them didn't have banking. And it just seemed like, you know, boy, if this is the barrier,
00:28:13.640
like we can do better than that, you know, like that we can build a better product than.
00:28:18.960
On the other hand, I had no idea how to get a customer. And that was sort of our biggest worry
00:28:22.520
about this was like, sure, maybe we build a good product and just no one ever knows about it.
00:28:26.620
No one ever uses it. And I didn't even know where to start with getting users. But even if we said
00:28:31.860
there's an 80% chance of failure from that, like 20% of that upside was still a lot and enough to
00:28:38.080
convince us to go for it. I'm tempted to take a slight detour in describing somewhat what is,
00:28:48.140
I mean, I got to think most of our audience at this point understands what Bitcoin is.
00:28:53.360
I'm happy to give like a one minute version of it.
00:28:56.020
Yeah, let's do it. I mean, you know, there are many people listening to us who
00:28:58.980
have listened to me and Balaji for four hours. So they've gotten an eyeful or an earful,
00:29:05.660
but certainly from a crypto maximalist. But give us yours. How would you describe it to someone's grandmother?
00:29:15.480
Yeah. And when I got involved, by the way, in crypto, I had no idea what it is other than like
00:29:19.200
a number that went up and down that you could trade. But the core of crypto is basically like,
00:29:24.660
you know, you want some system where you can like send, you know, money back and forth and assets
00:29:31.240
back and forth and information back and forth between each other. And that means you all need
00:29:36.100
to agree on the protocol for it. And you need to agree on like, who decides, you know, ultimately,
00:29:42.580
which transactions happened, who records that. And with Gmail, the answer is Google does,
00:29:47.960
you know, everyone tells Google, they want to send an email, and then Google,
00:29:51.680
you know, records that and sends it along. And, you know, with the New York Stock Exchange,
00:29:57.720
the answer is, well, the New York Stock Exchange, you send your orders to it, and then it spits out
00:30:02.980
what happened. It sort of is like the controller of this database. With cryptocurrencies, you know,
00:30:10.240
generally, the way it works is, there's some decentralized group of parties that together
00:30:16.020
are effectively voting on what happens step by step. And anyone in the world can submit transactions
00:30:23.100
to them. You know, so I could submit transactions saying, I'm going to send, you know, $30 to my
00:30:29.020
brother, Gabe, or, you know, a third of a Bitcoin or whatever. And I also submit, you know, basically
00:30:35.460
a proof that I have the password to this account. And then you have this sort of global group of
00:30:39.940
validators, sometimes it's miners, sometimes it's staking validators, depending on the blockchain,
00:30:46.140
that sort of get together, you know, vote that yes, this is a legitimate transaction,
00:30:50.000
you know, the necessary information was submitted, and record that. And that's a block. And then they
00:30:55.060
iterate on that, adding block after block after block, which sort of adds a new set of transactions
00:31:00.740
onto this like growing ledger of the whole history of the blockchain. And the goal in the end is to
00:31:06.240
create, you know, a system of payments and sending information and money back and forth that doesn't
00:31:12.460
rely on one central party or government to ultimately be like, you know, this source of truth on what's
00:31:19.680
happened. Right, right. So there's no trusted third party; it's distributed across thousands of
00:31:26.220
computers. And because it's distributed, it's transparent to everyone. And there's a consensus
00:31:34.120
algorithm that ensures that no one can cheat, or it becomes so expensive to cheat that
00:31:40.160
it's effectively impossible to cheat. Yeah.
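For anyone who wants to see the "growing ledger" idea from that explanation in code, here is a toy, illustrative sketch of an append-only chain of blocks in which each block commits to the previous one by its hash. It is not how any real blockchain is implemented; in particular, it omits the decentralized validators, signatures, and consensus voting described above.

```python
# Toy append-only ledger: each block records transactions plus the hash of the
# previous block, so tampering with history breaks every later link.
# Purely illustrative; real chains add decentralized consensus, signatures, etc.
import hashlib
import json

def block_hash(block: dict) -> str:
    # Deterministically hash a block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, transactions: list) -> None:
    # A new block points at the hash of the previous block (or zeros for the first one).
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def verify(chain: list) -> bool:
    # Anyone can re-check the whole history: the ledger is transparent.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

ledger = []
add_block(ledger, [{"from": "sam", "to": "gabe", "amount_usd": 30}])
add_block(ledger, [{"from": "alice", "to": "bob", "amount_btc": 0.33}])
print(verify(ledger))   # True
ledger[0]["transactions"][0]["amount_usd"] = 3000   # try to rewrite history
print(verify(ledger))   # False: the chain of hashes no longer matches
```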
00:31:49.180
So this whole space has a Wild West component to it. And it is, you know, as I said, incredibly volatile. And you know,
00:31:57.120
the upside is extraordinarily high, as you've demonstrated, but, you know, we have not seen,
00:32:04.380
I guess there have been micro busts in crypto, I'm sure people got in at the peak and then lost
00:32:10.120
a lot at various moments. But it's generally been a very quickly rising upward trend. So more or less
00:32:18.720
everyone, if they got in early and are still in, feels like a genius. How does this go wrong? What's the
00:32:25.540
probability of this going wrong in your view? And, and if it does, what would account for that?
00:32:33.820
I mean, you could also bring in, I guess, I'm sure, regulatory concerns are top of mind. I know
00:32:38.920
you recently testified in front of Congress. So you can bring in that part of the picture as well.
00:32:44.480
Totally. So, you know, I think that there is, it's a volatile asset, and it might go down,
00:32:50.420
and it might go down a lot. We've seen it have 50% movements in both directions in a few day period
00:32:55.180
before, a number of times. And I definitely wouldn't want to promise that that won't
00:33:00.380
happen. I think that like each year, the odds have gone down substantially that it's going to
00:33:05.140
go away. If you rewind to March 2020, I think there's a real risk of that. You know, Bitcoin
00:33:10.940
dropped down to $3,500 per token, less than 10% of what it is today. And there was just very little
00:33:18.440
liquidity in it. There was no buyer of last resort that was obviously coming out there. The whole space
00:33:23.940
was sort of teetering on the edge and COVID had just hit. The world was a mess. You know, fast
00:33:29.600
forward to today, the amount of institutional capital getting involved in crypto is massively
00:33:34.500
higher. The number of important financial institutions that are purchasing themselves
00:33:39.360
or on behalf of their customers is massively higher. And all of that just means that there's
00:33:45.620
just a lot more, there's a lot more, I think, sort of, you know, power behind what's going on
00:33:54.460
in this space and a lot more people who, if things did drop enough, would be willing to jump in and
00:34:00.540
backstop on the liquidity side. So I think that that risk has gone down substantially. And a number
00:34:06.960
of institutions have basically decided that they are going to get involved one way or another.
00:34:11.000
On the regulatory side, which I do think is one of the bigger risks, I think that risk has also
00:34:16.740
gotten a little bit less big, maybe substantially less big over the last year or two. It's at this
00:34:23.020
point, crypto is too big for people to just go out and ban it en masse. You know, I don't think that
00:34:27.920
you're going to see, you know, major governments, at least not many of them, hard ban crypto.
00:34:33.240
Which is essentially, didn't China, I mean, China banned crypto mining, correct?
00:34:37.500
China banned crypto mining. It's a little bit complicated, exactly what it has and hasn't
00:34:42.120
banned. And I think that the stories on it are lacking a bit of nuance. You know, they've
00:34:48.160
banned mining, they've nationalized some of the industry.
00:34:52.540
Not that much. It hurt it a bit. But it's, I mean, if anything, I think that there is
00:34:56.680
some advantage in, you know, the sort of forced decentralization geographically of it.
00:35:01.940
Hmm. But, you know, I think it's also the case that like, you know, I think it's basically also
00:35:07.340
the case that if you look at, I mean, you know, the Chinese government intervened in a number of
00:35:13.300
sectors domestically over the last year, this is one of them. Almost no other world governments have
00:35:18.400
been trying to ban crypto recently, although many of them are trying to regulate it. And so I think
00:35:24.020
what you're going to see instead is, you know, a messy step by step process jurisdiction by jurisdiction,
00:35:28.860
as countries try and decide what the regulatory framework should be for crypto.
00:35:33.280
I think that's going to be messy. It's not going to follow a really clear, you know,
00:35:38.460
sort of a really clear progression. It's not going to be consistent. There's going to be missteps in
00:35:42.680
both directions. And I think that, you know, about six months ago, my biggest worry probably
00:35:47.800
was that for whatever reason, regulations end up forcing crypto out of major jurisdictions,
00:35:54.800
because the regimes just don't end up being workable. But I think there's been a lot more
00:36:00.220
education and a lot more excitement and willingness on the behalf of regulators and lawmakers to engage
00:36:07.380
on that over the last six months. And hopefully that the industry has done a better job of, you know,
00:36:13.780
communicating reasonably about this with regulators, because I don't I don't think the industry had
00:36:18.900
always been very good about that. OK, so now you're you're earning to give out there in the
00:36:26.120
in crypto paradise. And I noticed you also took the Giving What We Can pledge. I think you took
00:36:33.320
it some years before I did. This is the pledge that was started by Toby Ord and Will MacAskill.
00:36:38.860
And there are various bespoke versions of it. I think that the generic one is to give at least
00:36:45.480
10 percent of one's lifetime income to effective charities. Waking Up was actually the first
00:36:50.920
company to take the pledge. They didn't they didn't have a company based pledge until I twisted
00:36:56.760
their arms. So I'm happy they did that. And so for the for companies, it's giving a minimum of 10 percent
00:37:04.220
of profits each year to effective charities. But what what what pledge did you actually take? How do you
00:37:11.260
think about the pledge you took and how are you implementing it? Yeah. So I guess I've taken a
00:37:17.320
couple over the course of the years in a few different places. I've taken the Giving What We Can
00:37:22.680
pledge more generally, but also I pledged to give way more than 10 percent away. And, you know, my my
00:37:32.180
actual goal is to give away almost everything one way or another. I think it's worth caveating that I don't
00:37:38.020
know what form all of that is going to take. And I don't know if all that's going to be, you know,
00:37:42.320
501(c)(3)s. But in the end, you know, other than the sort of amount that I'm I'm living
00:37:48.800
on, my goal is to use all the resources that I have to to, you know, do as much good as I can.
00:37:56.400
I'm not particularly planning to. I think it just isn't something that
00:38:03.400
had ever been extremely exciting for me. And I think, you know, it's never been a priority of
00:38:11.600
mine, but also I think it's not how I want to spend my life and my time.
00:38:19.100
I think it will just as a as the older guy on the podcast, the caveat I would add there is that you
00:38:25.300
really are young enough, I think, to not be surprised by a sea change in your in your view on
00:38:32.480
this topic. I mean, when I was your age, I really I think I would have said more or less the same
00:38:37.140
thing. I certainly had zero plans and was somewhat skeptical around the prospect of having a family.
00:38:43.860
But that changed. And so but its relevance to this topic is, you know, many people I actually
00:38:51.240
like this is the larger topic of just how you view wealth and generational wealth in the context
00:39:00.180
of effective altruism. In my conversations with Will, I've wanted to find a line here where
00:39:08.840
wealth itself is not stigmatized. I think I think we want to live in a world where people grow wealthier
00:39:15.660
and wealthier. And, you know, the pie gets bigger and bigger and bigger so that, you know, even the
00:39:20.340
poorest person 100 years from now would be unrecognizably wealthy by today's standards. I mean, that that's
00:39:27.740
what real success would look like, you know, within whatever constraints the laws of nature impose on
00:39:33.880
us as a species. So wealth itself can't be the problem. And yet so much talk about philanthropy
00:39:42.300
does set up a zero sum contest between living a certain kind of lifestyle and being a good person,
00:39:51.220
essentially. And we have a you know, there are concerns about wealth inequality, which I
00:39:57.360
completely share now, you know, in the US and elsewhere, I think it's, there are just obscene
00:40:04.680
attitudes toward redistribution that you often, you know, meet among the most fortunate people in our
00:40:11.980
society, where there's just there's very little concern about the common good or a very little apparent
00:40:17.280
concern about the common good. Once one has read a sufficient amount of Ayn Rand in Silicon Valley.
00:40:26.160
And so it's, I'm just wondering how you think about things like wealth inequality, generational wealth,
00:40:32.440
and redistribution. I mean, I guess the other piece I'd put in here is I think it's as much as I think we
00:40:38.740
need to engineer a tide that raises all boats, you know, recent proposals of, you know, taxing unrealized
00:40:46.280
gains on billionaires, just seem, whatever the ethics, they seem practically unworkable. So I'm
00:40:54.240
just wondering what you're, you know, as a fantastically wealthy person, who is committed to
00:40:59.540
the common good, how do you view things like wealth inequality, redistribution and, and the rest?
00:41:06.700
Totally. And I think I agree with, you know, a lot of what you said there, where it seems to me like
00:41:12.720
there's way, way too little focus in the world on, on doing good. And, and that what you see a lot of
00:41:21.780
instead is this weird, sort of hybrid thing, in sort of large centers of wealth, which is, it's not
00:41:28.600
exactly trying to use, you know, it's not trying to redistribute, it's not trying to use the resources or
00:41:36.420
wealth to have positive impact on the world. It's sort of, I don't know, almost this weird constrained
00:41:43.640
problem of like, doing things which seem kind of good ish. And also are sort of like, weird brand
00:41:53.220
building exercises almost. And, and I think ends up being a confusing combination of things. And I think
00:41:59.600
that like, the sort of classic type of like, kind of do-gooder thing would be like, endowing a university
00:42:06.420
building or something like that, which is, it's sort of like is kind of trying to do good, but it's
00:42:12.720
also kind of trying to like, build your personal brand, maybe, or maybe not. But but I guess, it sort
00:42:19.960
of seems hard to me for that not to be part of what it is, given that's like, a big part of the impact
00:42:25.380
of it. And it's sort of is kind of focused on like, like, I don't know that there's like a really
00:42:33.680
coherent theory of top universities are underfunded is the biggest problem in the world right now.
00:42:40.560
And so I think that that form is like, quite popular right now. And I think it's not what you
00:42:47.720
would do if you're actually just trying to like, you know, do what was best for the world or anything
00:42:51.680
like that. And and I think that, you know, this just like, incredibly clear that beyond a moderate
00:42:58.720
amount of wealth, there's just not really anything that you can do, that's going to have much impact
00:43:06.400
on your life, even if it weren't the case that you could have absolutely massive impact on the rest of
00:43:12.860
the world. But in fact, you can have absolutely massive impact on the rest of the world. And, you
00:43:17.940
know, impact that's way outsized compared to what you're sort of putting into it. And, and I think
00:43:23.460
that that is incredibly important. And, and I think that's been sort of one of the key pillars of
00:43:30.820
effective altruism. So I basically agree with, with all that. And, but but then looking at sort of
00:43:37.440
another thing you brought up about, like, proposals to address this, I sort of also agree that I think
00:43:42.420
a lot of the ones we've seen recently have not really, they seem weirdly not trained on, you
00:43:49.340
know, doing it in an efficient or effective way. And I guess what I mean by that is like, you know,
00:43:55.840
you look at the, you know, unrealized cap gains tax. And I think there's like really compelling
00:44:00.840
arguments for increasing tax rates on the very wealthy. I think that it should almost be your prior,
00:44:08.200
I think that that like, probably it's correct to consider doing that, at least. And, you know,
00:44:14.860
I think there's certainly arguments on addressing a lot of loopholes in the tax code as well. I think
00:44:21.320
like that particular approach is probably not the right approach because, and this is probably what
00:44:27.660
you're getting at, it's a total mess, from an operational perspective, where you end up
00:44:32.840
taxing people for more money than they actually have. And, you know, assessing a tax that they
00:44:38.500
literally can't pay. And there's sort of a question of like, what next? Like, what, what, what does that
00:44:43.760
mean? And, and so I think that that that sort of is, is like, probably not the right instantiation of
00:44:50.060
it. And probably came, you know, in some senses, more from a direction of like, decreasing wealth for
00:44:57.900
the wealthy, rather than thinking about how to have like, positive impact for others. I think it's
00:45:02.960
almost how sometimes society treats these things, is them sort of optimizing for the wrong piece of
00:45:09.020
it. Well, yeah, because there's this moralistic layer to it, which is demonizing extreme,
00:45:17.800
understandably extreme disparities in wealth, in a world where there's obvious suffering that could
00:45:24.000
be addressed by money. But in stigmatizing those disparities, you know, the, the, just the shocking
00:45:31.740
inequalities in the world, we wind up stigmatizing wealth itself. And so there's, you have people like
00:45:37.600
Elizabeth Warren and Bernie Sanders, who did, they don't even attempt to conceal it. What they are
00:45:43.600
communicating is contempt for people of sufficient wealth. I mean, you know, I think they've even said
00:45:51.560
it outright that there's just, there's no way to become a billionaire legitimately, right? I mean,
00:45:57.400
it's like, if the system were as it should be, it would be impossible to be that wealthy. And so
00:46:03.840
there's, there's moral opprobrium attached to having succeeded to the degree that, that you and, you know,
00:46:10.700
something like, I guess, 3000 others on earth at the moment have. I think they're about, I think
00:46:18.120
there's something like 3000 billionaires. And that's, that part seems completely wrong because
00:46:23.360
it's just, you know, I mean, obvious, you know, one, we have to just, on a first principles basis,
00:46:28.760
we just have to acknowledge there's going to be some degree of inequality. And our real interest
00:46:33.680
is in canceling the most painful extremes at the bottom, right? We don't want to cancel the top.
00:46:42.840
We want to cancel the, we want to raise the bottom so that, you know, the poorest among us
00:46:48.560
still have all that normal people, you know, actually need to live lives of, of real integrity
00:46:57.320
and wellbeing. And that seems possible, right? And that we should, that's the thing we should
00:47:02.740
engineer without, and, and we certainly shouldn't want to create any incentives that make it harder
00:47:09.180
to generate wealth in the first place. I totally agree.
00:47:13.300
So how, I mean, you, you view this terrain from an unusually high perch here. What would you recommend
00:47:21.460
our policies be here? I mean, just given that it's possible to incentivize the wrong things and
00:47:28.200
disincentivize the right things and that there's a lot of confusion about just what the, and just basic
00:47:37.960
uncertainty about what the outcomes would be if we, if we rigged the system very differently.
00:47:43.800
If you could tinker with it, what would you recommend that we do?
00:47:48.880
Yeah, totally. And, you know, I'm not an expert, I think on, on tax policy and all of these are just
00:47:54.500
sort of guesses at it. But in the end, I think that sort of in line with what you said, like I would put
00:48:01.460
the focus here on the good that we're trying to accomplish or on the problems we're
00:48:08.640
trying to stamp out. And so focus on like, you know, those that are in extreme poverty, like what
00:48:15.120
can we do to get them out of poverty? And, you know, the actual amounts that it would take sometimes to
00:48:19.940
do this, they're significant, but I think they're not as gigantic as some people would sometimes think
00:48:26.520
they would be, you know, I think that for, you know, given the amount of resources that we have as a society,
00:48:32.520
we should have plenty, if we're good at targeting, what we need to do in the ways that, you know, places like
00:48:38.380
GiveDirectly, the Against Malaria Foundation, and, you know, others have done at addressing that, I think looking at, you
00:48:45.080
know, the suffering of factory farmed animals, it's not in a good place right now. And, you know, looking at, I mean,
00:48:51.220
obviously pandemic preparedness, whatever, there, there's sort of a lot of areas where we clearly
00:48:54.700
need to make progress as a society, but I would refocus the conversation on those and on what we
00:49:01.460
can do to address those. And I think, you know, a concerted effort could get a lot of progress on
00:49:07.220
them. What about norms around philanthropy among the very wealthy? I mean, the truth is when you're
00:49:14.920
talking about the wealthiest people, you know, the one-tenth of one-tenth of one percent, the kinds
00:49:21.500
of donations that get newspaper articles written about them really are, you know, so someone writing
00:49:30.600
a check for $50 million to a hospital, say, right? I mean, that is a, an astounding act of generosity
00:49:38.840
when you measure it against most people's wealth, but when somebody has $100 billion, you know, it is
00:49:46.360
really a rounding error. They, they couldn't even estimate their wealth within $50 million on any
00:49:51.540
given day, and they don't have to be in cryptocurrency for it to be that obscure. I think we would be in a
00:49:57.560
different situation with respect to the reality of human suffering and, and animal suffering and, um,
00:50:05.420
the, the perception of wealth if people, more people in your situation and beyond had your attitude toward
00:50:17.240
the amount of wealth they were giving away or planning to give away. And it wasn't just about getting your
00:50:22.920
name on, on a university building for a comparatively tiny amount of money as large, as expensive as a building
00:50:30.060
on the Johns Hopkins or Harvard or MIT campus might be. Uh, it's still, when you're talking about how much
00:50:36.860
wealth people privately hold, these, these are in fact crumbs, you know, albeit self-aggrandizing
00:50:44.160
crumbs in, in many cases. So what, what about spreading this meme to the ultra wealthy that virtually no one
00:50:53.320
is doing enough and to have it be, um, I mean, uh, you know, guilt is probably not the best motivator
00:51:00.380
here, but at a certain point, I think we would reach a tipping point where it would just be,
00:51:04.840
it's what Pete, well, it's the only thing that will seem decent to do in the end is to be much
00:51:10.840
more generous than people are tending to be. Yeah, I totally agree. And I think that like,
00:51:15.720
you know, first of all, there's just absolutely massive amounts that could be done by this.
00:51:19.600
And, and, and as you said, like it, the meme has been successfully spread amongst sort of ultra
00:51:25.400
wealthy that, you know, the sort of like, you know, the right thing to do is to give. But as you said,
00:51:31.940
I think on the amount to give, it's like pretty arbitrary right now. It's sort of like, I don't
00:51:35.660
know, you know, like a significant amount, right. But, but what is the significant amount in a lot of
00:51:40.780
cases, a significant amount might be a pretty small fraction of what someone actually has. And,
00:51:46.880
you know, that doesn't mean it all has to happen tomorrow. And, you know, we can get into sort of
00:51:50.060
all the caveats about like, how to do this strategically, but that doesn't change sort
00:51:54.560
of the, the high level thing of like, you know, you should be trying to find ways to do that.
00:52:00.720
And, and I don't think people are in a lot of cases, I think that there's a lot of cases where
00:52:05.260
people are basically just not at all trying to find ways to do good for the world. And so,
00:52:11.480
yeah, I, I agree with all that. And I think that like, that, that really spreading the notion of
00:52:16.880
like, you should try and do as much as you can, you should try and give most of what you have.
00:52:21.980
And when doing it, you should be focusing on how much good you can do, rather than focusing on,
00:52:29.100
you know, a sort of diverse series of goals, many of which are kind of self serving in the end,
00:52:35.280
and are basically just consumption, like, you know, getting your name. I mean, you know, we've,
00:52:42.320
my, my company has paid to get our name on various things. That is not charity.
00:52:46.340
Like you're not, you know, that, that's not going into the, into the charity budget.
00:52:50.300
And, and, and I think that sort of something along with that is that when you think about,
00:52:55.400
about doing good, like think about it from the perspective of the people you're helping
00:52:59.080
rather than necessarily from like your own perspective. And I mean, from that perspective,
00:53:04.500
like, you know, they're not in it for your warm fuzzies, right? Like from their perspective,
00:53:12.240
like it's not relevant to them who's giving. Like, the relevant
00:53:19.020
part for the actual impact you're having is the actual impact you can have and how many people's
00:53:25.220
lives you're having that impact on. And I think that, you know, from that perspective, right, you might get
00:53:31.460
all the warm fuzzies that you need from $50 million of donations, but, but that's not the
00:53:37.000
relevant thing. The relevant thing was helping people. And there's a lot more help to give.
00:53:43.140
Yeah. This is interesting. It's interesting to navigate this space of ethical norms and pseudo
00:53:49.700
ethical norms and questions of pragmatism. So this is something I've spoken about with,
00:53:55.020
with Will at one point. It used to be thought, as it is still widely thought, that the highest form of
00:54:03.760
giving is anonymous giving. Because there you can be absolutely sure that your, you know, ulterior
00:54:11.360
concerns about your own reputation are not in play. You're not just, you know, this is not virtue
00:54:17.140
signaling. It's not vanity. It's not getting your name on the building. You're just prioritizing the good
00:54:22.620
you can do with the money. But, you know, I've come to believe, and Will, at least for the purposes
00:54:28.600
of this conversation, agrees, that at least in certain cases, anonymity, you know,
00:54:36.320
practically and ultimately ethically speaking, is not the highest ideal, because there's so much good
00:54:42.580
to be done by persuading people to follow this example. You know, so for you to be, I mean, you could
00:54:48.780
be anonymously giving, you know, having taken no pledges publicly, and that would be great. But I
00:54:55.100
think it is much better for you to be modeling to your fellow ultra-rich people that this is a value
00:55:02.640
you hold and that one can hold, and that, you know, even there, and there may even be great social
00:55:08.980
rewards for promulgating this value, because, you know, what we want to do is to spread
00:55:15.960
this attitude as far as we can. And I can just say, you know, personally, whenever I talk about
00:55:21.480
these things on the podcast, and especially whenever I mention any specific charity, what
00:55:27.760
happens is my audience reliably supports that charity, and, you know, to a degree that is
00:55:33.380
fairly astounding. And that's a wonderful thing to inspire that kind of generosity in thousands of
00:55:41.920
people. And none of that good gets done if you just hide your light under a bushel and do it
00:55:47.780
anonymously, content that you're, you have not been contaminated by the sin of your own vanity.
00:55:54.840
Yeah, I agree. It could be a complicated balance, because obviously, you want to do that. You want to
00:56:00.920
be able to, to, to spread that meme around. I think that's one of the most, you know, in the end,
00:56:07.060
that's, for most people, that's the biggest impact that they can have. You also want to obviously make
00:56:12.120
sure while doing it that, you know, you don't sort of lose, lose focus on what mattered in the first
00:56:18.260
place. And that the publicity doesn't become the goal in and of itself, except to the extent that
00:56:25.260
that goal is for spreading, you know, spreading the meme and encouraging others rather than sort of
00:56:29.760
self-satisfaction. But I think that, contingent on being comfortable that you can weather that
00:56:37.560
and stay committed, yeah, I agree that, you know, probably the biggest thing that you can
00:56:43.720
do is to, you know, help others get to that point where they're optimizing for what impact they can have.
00:56:54.100
Hmm. So what is your approach to giving at this point? What, what are you, what are you doing
00:56:59.400
currently? And what do you plan to do? And, um, is it just personal or is it, or does FTX give a
00:57:06.380
certain amount of profit away? I mean, how do you approach this?
00:57:10.080
Yeah. So I guess I'll start with the last part, which is the easiest, which is that, you know,
00:57:14.500
the bulk of it is personal. FTX is also giving some; FTX is giving 1% of what it makes, but that's not,
00:57:20.540
those aren't the big numbers here. Like that's not where I expect most of this impact to come from.
00:57:25.800
I expect most of it to come from, uh, from what I personally give. And, you know, in terms of what
00:57:31.980
it looks like in the end, and I think there's something I think is really, really important
00:57:37.040
is that in the end, there is no metric other than what, what, you know, has the most positive
00:57:43.480
impact on the world. And, you know, if it starts to look like something that I hadn't previously
00:57:48.460
thought was important is the most important thing, I think it's important to recognize that
00:57:53.600
and pivot to that and to keep iterating and not to get sort of stuck in the mindset of one particular,
00:58:00.720
one particular path, but, you know, kind of putting aside, you know, long-term hedge words and
00:58:07.640
focusing on like, okay, but sure. How about like today? You know, I think that the things that I've
00:58:13.080
been looking at the most, there's some amount of giving that I do to a variety of places for
00:58:18.600
a sort of set of reasons that maybe I don't think are ultimately the most important for the bulk of
00:58:24.740
the money, but that I think is quite valuable to do with some of it. And I think that like,
00:58:29.920
you know, examples of things there are, you know, basically like, you know, making sure that I'm
00:58:37.820
giving at least some periodically, no matter what, even if I can't find something that seems like
00:58:43.480
the best place to give to me, to make sure that I'm in the habit of it, making sure that I am
00:58:48.420
supporting causes that I think are good and that I want it to be known I'm supporting and that I want,
00:58:53.940
you know, there to be more supporters of. And so I'm doing sort of some of that, but for sort of the
00:59:00.660
biggest parts of this, you know, and maybe I'll say, like, you know, on those fronts, like
00:59:06.260
I think, you know, I've been giving some to global poverty causes each year. I've been giving some to
00:59:11.780
animal welfare causes each year. I've been giving some to effective altruism, community building
00:59:17.740
charities like center for effective altruism each year. I think the things though, that like I've ended
00:59:23.480
up giving the most to recently and thought were the most interesting, probably fall in a few buckets.
00:59:29.480
One of which is pandemic preparedness stuff. And, you know, it sort of falls, I think, in
00:59:38.580
this really dangerous middle category right now where it's, you know, decently likely to happen, and, you know,
00:59:46.700
pandemics have the potential to be massively more deadly than something like global warming is. But on
00:59:52.560
the other hand, they're actually kind of shovel ready in, in the bad sense. And I think that's something
00:59:58.360
that we've learned over the last few years is that global pandemics can happen. This isn't like a
01:00:04.360
theoretical concern. And I think we've, it's also become super clear that we have no ability as a
01:00:13.040
society to react to them, that we have no, no idea what we're doing. And, you know, we're, we're flying
01:00:19.900
blind here. And, and that's not great. You know, I think like almost no countries would I
01:00:27.260
give more than like a B minus over the last two years for handling COVID. And we got lucky with
01:00:33.240
COVID. Yeah. We got lucky because it, it's nasty, but it's not, it's not deadly in the way SARS is.
01:00:39.280
Right. Like this isn't something that has like a 30% mortality rate.
01:00:44.720
Yeah. And this was a dress rehearsal that we clearly failed. You know, in our defense, in
01:00:49.980
large part because, or at least this is a possible alibi, we were just in the uncanny
01:00:55.460
valley with respect to the lethality of the virus. It just was not lethal enough to get our attention
01:01:01.300
or get everyone's attention. And so now we have debates about whether COVID is even, you know,
01:01:07.340
worse than the flu or whether you should get vaccinated for it. I mean, the one success here
01:01:11.920
is that we did develop the vaccines very quickly, but we can't even get half of our society to agree
01:01:16.660
that they should be vaccinated in the first place. One could imagine that's because this isn't MERS or,
01:01:23.880
or SARS or something that's far more lethal. You know, one can only hope that if there were bodies
01:01:29.300
being piled in the streets, you know, with an order of magnitude worse lethality, you wouldn't have
01:01:36.020
the same conspiracy thinking and sheer lunacy that is causing us to fail this test of, of cooperation
01:01:43.200
and coordination. I would hope so. I, I wish I felt more confident in that than I do.
01:01:49.560
I'm with you. In my darker moments, I'm entirely with you. I think a maniac like Alex
01:01:56.080
Jones and anyone who would listen to him is capable of being just as crazy in the presence of the bubonic
01:02:02.340
plague. I think that's mostly right. And, and I think some of it is, as you said, like,
01:02:07.180
we don't, as a society, have a good understanding of some of this. And we're really bad
01:02:11.840
at sort of addressing middles, like at saying, this is like clearly more deadly than the flu
01:02:17.240
and clearly less deadly than SARS. Like that's just not, it's not in our lexicon. Our lexicon is
01:02:22.700
like, it's fine or it's terrible, you know? And I think similarly, like we're really bad at
01:02:29.380
strategically addressing things and saying like, this intervention seems to have like 80% reduction
01:02:35.540
for like, not that much cost. This intervention seems to have 25% reduction at enormous cost.
01:02:41.520
Like it is absolutely worth it to do number one and probably not to do number two. And like,
01:02:46.280
instead you just like, I find myself in like a shockingly small set of people who, you know,
01:02:52.720
think that like vaccines are great for this, but that like, it's not clear we should be shutting
01:02:59.120
down society forever for it. So, so yeah, we don't know what we're doing. And if you look at sort
01:03:03.980
of the takeaways that society has had from this, I almost want to say the biggest thing that I
01:03:09.760
noticed is that there isn't a clear takeaway. What's, you know, what moral has society taken
01:03:15.920
from COVID? It's a very confused lesson. I mean, the lesson that many of us have drawn is what you,
01:03:21.340
what you just stated, that we are painfully unprepared for the real thing because we have botched this
01:03:28.840
so fully again, you know, modulo the, um, the vaccine development,
01:03:33.700
which was, well, some parts of that, I think, are a good story where we now have the
01:03:38.120
ability in 24 hours to make an mRNA vaccine, which is absolutely fucking amazing. And it's a superpower.
01:03:43.960
On the other hand, it still took a year from COVID appearing to people getting vaccinated.
01:03:50.380
And so I think when it comes to, I mean, obviously detection, like we're really bad,
01:03:55.120
but also like it then took us eight months after having the vaccine to get it through the process
01:04:00.480
to start giving it to people. That was eight months when people were dying and COVID was spreading and,
01:04:06.240
you know, we know, we know that people will volunteer for challenge trials. So if we can
01:04:12.280
just articulate the ethics of that more clearly, I think the next time around we can probably get
01:04:18.800
challenge trials approved and hopefully that, yeah, that would speed it up. Yep. Yeah. So anyway,
01:04:24.400
going back to like what, what I've been giving to recently, some of it is various pandemic
01:04:28.960
preparedness related things. A lot of this actually is lobbying. A lot of this is information for
01:04:34.680
lawmakers and trying to get, you know, the government to take seriously its role in preparing for
01:04:42.920
pandemics. And I think you can have extreme leverage doing that. If you look at, you know,
01:04:49.040
just sort of the ratio of what it costs, you know, versus how much impact you could potentially
01:04:55.680
have there. I think it's super compelling. Now that doesn't mean it's necessarily going to get
01:05:00.780
there, right? Like it, you could absolutely imagine a world where all that is for naught. And I don't,
01:05:06.300
I don't want to say that that's definitely not happening, but that's, I think one of the things
01:05:09.800
I've been doing that I actually feel sort of weirdly best about, and that I think has been
01:05:15.680
super high leverage. And then on the side, I've also been having a lot of conversations with people
01:05:20.700
about what infrastructure do we need to be developing as a world to be better prepared
01:05:26.800
for the next pandemic and what we can do to fund some of that infrastructure, whether it's, I mean,
01:05:32.740
there's sort of, you know, one idea, which I've seen thrown around a few times and I think it's
01:05:36.780
like fairly compelling is, you know, what if we just went out and funded a giant vaccine stockpile,
01:05:44.480
right? And there are variants on that. So that's one direction that I've been, that I've been going
01:05:49.020
with. And that I think is just like really high upside if it can be pulled off well. You know,
01:05:57.080
I think these are threats that could be potentially, you know, existential and if not existential,
01:06:01.060
at least like really fucking bad. And then, you know, I've been doing more generally a bunch of
01:06:07.880
policy work in DC and, you know, electoral work there. And I think there's also just some numbers
01:06:14.240
there that don't quite, that are sort of shockingly out of line with what you might predict. You know,
01:06:19.580
if you just look at like how much is spent on elections, how much is spent on lobbying, it's sort
01:06:25.880
of a big amount, but it's actually a really small amount compared to the scope of, you know, the
01:06:31.000
impact that the government can have and that it has had on, you know, among other things, just like
01:06:36.620
global discourse. Well, that was always amazing to me. I mean, they've lost their influence, I think,
01:06:42.600
a little bit of late, but just to look at how little an organization like the NRA needed to spend
01:06:50.140
every year to completely dominate our politics, right? To become an unmovable object in the middle
01:06:55.960
of American politics for as long as I've been alive. It's a trivial amount of money when you're
01:07:01.900
talking about the resources of even one very wealthy person. Yeah, it's absolutely right. And
01:07:07.320
it's pretty wild. Like at some point, you know, there's all this pushback on their lobbying, but
01:07:12.500
I think some of the answer is like, geez, like that's how much they gave, like how, I don't know,
01:07:18.440
like there are a lot of people who could be looking to have impact in DC. And I think like,
01:07:24.360
you know, I think people sometimes have the wrong takeaway from that
01:07:30.440
lobbying. And the takeaway is like, everything is fucked instead of being like, we're getting outplayed
01:07:35.520
here. Right. Well, it's just that not all causes are equivalent, right? It's like, yes, everybody
01:07:41.440
thinks they're the good guys, even when they're the bad guys, but there are some good guys, right? And
01:07:46.660
there are, you know, there are at least benign causes, and there are, you
01:07:51.740
know, truly malicious and destructive causes. So it's, um, yeah, no, I'm, I'm with you there.
01:07:58.840
Do you have people advising you at this point on philanthropy? I mean, that's like, I got to
01:08:03.640
think this is not just you doing Google searches. You must have smart people who are in your ear.
01:08:08.860
Yeah, I've been building that up. And, and I mean, it's something I, I would really want
01:08:13.740
to spend more time on than I can and regret that I can't, or at least haven't just because
01:08:18.720
work takes up so much of my life, but I've been growing out the team that's, that's working
01:08:23.840
with me and advising me on that quite a bit. I recently hired, I don't know if you've ever
01:08:27.780
talked with Nick Beckstead, but he's one of sort of the original, uh, hardcore EA, um,
01:08:34.540
I don't think I've, it's possible we've exchanged emails, but the name's not ringing a bell.
01:08:39.920
Yeah. He's a great guy, and I've recently brought him on to help lead our foundation.
01:08:45.760
Okay. I'm working with a number of people in DC who have way more knowledge than I do
01:08:51.140
of that arena. A number of people who have way more knowledge about bio than I do. And,
01:08:55.840
and in general, like trying to, you know, build out subject matter experts on everything that
01:09:00.180
we're working on, in addition to sort of a core group of people focusing on, on what our direction
01:09:06.160
should be. And I think it's worth noting, also, that this isn't all me. I'm,
01:09:11.080
you know, really fortunate to have started FTX up with, with a number of, of other effective
01:09:16.800
altruists who I have, you know, a ton of respect for and who have been working with me on all of
01:09:21.900
this. Nice. Well, if, um, I can ever profit from your research, um, I'd love to do that. I mean,
01:09:28.220
I've got various effective altruists advising me. Um, I don't know if you know, um, Natalie Cargill
01:09:34.360
from Longview Philanthropy and, uh, the Founders Pledge and, uh, other groups there that are, um,
01:09:41.840
you know, that I've connected to through Will, but yeah, if at any point, your summary of
01:09:49.060
what you're doing in any given year can be exported to my brain, I'd love to see what you're doing and
01:09:54.080
perhaps just follow your lead. Absolutely. We should, we should absolutely stay in touch with
01:09:58.540
that. Cause I mean, there's a ton that we've been doing and I think a lot of it is, is super cool.
01:10:02.900
And, and obviously we'd love to, to also just get your thoughts on all of it as well.
01:10:07.640
Nice. Well, Sam, it's been great to speak with you. Is there anything
01:10:10.760
that we didn't cover that you think we should touch on?
01:10:14.000
The only other thing maybe I'd touch on briefly is something around how, when you're trying to do
01:10:19.640
good, how ambitious I think it makes sense to be, where, you know, if you're just optimizing for
01:10:26.440
your personal wellbeing, because you just sort of cap out pretty quickly with like a really comfortable
01:10:32.720
life, I think there are a lot of incentives to be not super ambitious on that. But I think that if
01:10:38.760
you're optimizing for impact on the world, I think that really changes the story
01:10:44.320
because all of a sudden you're looking at something that can really scale, something where there really
01:10:52.560
isn't this sense of like, Oh, you know, you've had a fair bit of impact. You can't really have much
01:10:57.320
more impact than that. I think instead the answer is generally like, no, you can have absolutely
01:11:01.520
massive impact. It just keeps going and going and going. And, and I think that means that it,
01:11:06.300
it makes a lot of sense to shoot really high with it. And I think one piece of that, that we've touched
01:11:10.860
on is like, not just thinking about how can we do good, but thinking about how can I maximize the
01:11:16.720
amount of good that I'm doing with the resources I have, you know, not just giving away 5%
01:11:21.320
eventually, but giving away 95% eventually, and not just trying to give it to something good,
01:11:26.540
but thinking hard about what would be better, what would be the best that we could give to.
01:11:31.260
I think another side of that though, is also, you know, trying to think about, I mean, if you think
01:11:36.140
about like how much impact you've had with your life, I'm actually, I don't know how you thought
01:11:40.280
about what you're going to do with your life earlier, but you've had absolutely enormous impact
01:11:44.820
on a huge number of people, I think like massively outscaling what I think most people
01:11:52.320
would think of with their careers. And I think that thinking about ways that you can have not
01:11:57.520
just some impact, but absolutely enormous positive impact with your life and your career and what
01:12:01.900
that would imply you should be doing is actually like really important and powerful.
01:12:06.320
Yeah. And again, it is in fact separable from the, the good feels component of it. And again,
01:12:14.340
I don't want to diminish the importance of good feels because I mean, that really is the driver
01:12:19.600
for a lot of people, but it's also just the, it's the moment to moment substance of what it's like
01:12:24.220
to be you, you know, it's the difference between whether you are smiling or not. But the reality is,
01:12:30.740
is that there are things each of us can do that can affect the lives of, you know, literally millions
01:12:38.040
of people positively. They're almost entirely out of sight and out of mind, even when we're
01:12:44.200
consciously engaging them. You know, it's like, I mean, you could cut a check for tens of millions
01:12:49.460
of dollars, hundreds of millions of dollars used as effectively as possible. And it could take you
01:12:54.920
five minutes. And the thing that's going to leave a much bigger residue on how
01:13:01.260
you feel that day is the interaction you had with some stranger in a Starbucks, right?
01:13:06.640
There's a paradox of sorts there. You know, I think we want to optimize both of those things.
01:13:12.220
And I think it's good to reflect in a way on the, you know, the more ambitious good we do,
01:13:18.760
so as to internalize the psychological rewards of doing that good. But I think it is just a fact
01:13:25.640
that certain things will always be more salient than other things. And it just, you'll, I mean,
01:13:32.340
this is the conversation I had with Will about, it's like, if you could run into a burning building
01:13:37.640
and save a child, you know, on your deathbed, that will be the thing you remember as the best
01:13:42.980
thing you ever did in your life, perhaps. But the reality is, is that if you just use your
01:13:49.100
resources at all compassionately, you could be saving children in analogous situations by the
01:13:56.260
thousands and thousands, you know, every day of your life at this point.
01:13:59.720
And it just, but it's not going to feel the same. And that, that's okay. You know,
01:14:03.640
you can get your good feels elsewhere. And this is what rationality is for, is to uncouple us from
01:14:10.660
being entirely pushed around by our emotions and to get more of a bird's eye view of the ethical
01:14:18.560
terrain so that we can actually do the most good that's there to be done.
01:14:23.360
Completely agree. And I think, you know, as you said, it's, it doesn't mean denying the existence
01:14:28.720
of the feelings that you have, but acknowledging that, while it's really important not to lose
01:14:35.600
sight of that and not to forget that the goal in the end is to make people feel better and not to
01:14:42.320
lose track of what that feels like, when you start to scale things, that's not the
01:14:48.280
thing that scales. The thing that scales is the more direct impact that you have on other people's
01:14:55.060
lives. Nice. Well, Sam, to be continued, I look forward to meeting you someday. And please keep
01:15:01.180
me in the loop as you learn more and more about how to do good in the world. Absolutely. And you know,
01:15:06.080
if you're ever thinking about coming down to the Bahamas, we'd love to have you here.