#361 — Sam Bankman-Fried & Effective Altruism
Episode Stats
Length
1 hour and 25 minutes
Words per Minute
171
Summary
Will MacAskill joins me to discuss the collapse of FTX and its impact on the effective altruism movement. Will is an associate professor in philosophy and a research fellow at the Global Priorities Institute at Oxford University, and the co-founder of three non-profits based on EA principles: Giving What We Can, 80,000 Hours, and the Center for Effective Altruism. He is also the author of several books, including "Doing Good Better: Effective Altruism and a Radical New Way to Make a Difference" and "What We Owe the Future." In this episode, we do a post-mortem on the career of Sam Bankman-Fried and the implosion of FTX, and discuss the effect the collapse has had on the EA community, potential problems with EA long-termism, a brief sidebar on AI risk, the effects of the collapse on Will personally, and his views on EA and the movement as a whole. There is no paywall for this episode. As always, if you want to support the podcast you can subscribe at Samharris.org, and if you can't afford a subscription you can request one for free; such requests are never declined, so if you requested one and didn't receive it, check your spam folder and simply request again.
Transcript
00:00:00.000
welcome to the making sense podcast this is sam harris okay just a little housekeeping here
00:00:30.000
over waking up we just introduced playlists which has been our most requested feature
00:00:37.360
took a while to do that but that seems like a very auspicious change you can create your own retreats
00:00:43.680
you can create playlists for any purpose despite the name of the app there's a lot of content there
00:00:51.440
that is very good for sleep so you could create a sleep playlist many of us fall asleep to audio
00:00:57.840
these days so thanks to the team over waking up for producing that feature among many others
00:01:05.200
the app is continually improving and what else if you haven't seen coleman hughes on the view
00:01:14.320
promoting his book that is worth finding on youtube coleman was recently on the podcast
00:01:20.240
discussing the book the end of race politics he went on the view to do that as you might expect he
00:01:28.080
was bombarded with a level of moral and political confusion that is genuinely hard to deal with in a
00:01:36.480
confined space when one is short on time and i have to say he really did a perfect job i mean it was
00:01:44.720
absolutely masterful so it's worth watching in case there was any doubt in your mind about coleman's
00:01:51.680
talents if ever there were a commercial for the equanimity that can be achieved through
00:01:57.120
mindfulness that was it so bravo coleman in the last housekeeping i acknowledged the death of
00:02:05.200
danny kahneman i went back and listened to my podcast with him recorded at that event at the beacon
00:02:11.760
theater in new york about five years ago i was pleasantly surprised it's often the case that live
00:02:17.680
events don't translate into the best podcasts i really thought this was a great conversation and
00:02:24.400
danny was really worth listening to there so that was episode 150 if you want to revisit it i really
00:02:30.160
enjoyed hearing it again okay today i'm talking to will macaskill will is an associate professor in
00:02:38.240
philosophy and a research fellow at the global priorities institute at oxford university he is
00:02:44.160
one of the primary voices in a philanthropic movement known as effective altruism and the
00:02:49.280
co-founder of three non-profits based on ea principles giving what we can 80 000 hours
00:02:57.120
and the center for effective altruism he is also the author of several books including doing good
00:03:03.200
better effective altruism and a radical new way to make a difference and most recently what we owe the
00:03:08.720
future however today we don't talk much philosophy rather we do a post-mortem on the career of sam
00:03:17.440
bankman-fried and the implosion of ftx and look at the effect that it's had on the effective altruism
00:03:24.320
movement when we recorded last week sam had not yet been sentenced but he has since and he was
00:03:31.120
sentenced to 25 years in prison which is not as much as he could have gotten but certainly more
00:03:38.800
than the minimum um i must say that strikes me as too long a sentence you'll hear will and i struggle
00:03:46.560
to form a theory of mind of sam in this podcast we discuss the possibilities at some length but when
00:03:54.320
you look at some of the people who don't get 25 years in prison for the malicious things they do
00:04:01.920
i don't know it does not strike me as a fair sentence perhaps i'll talk about that more some
00:04:06.400
other time anyway will and i talk about the effect that this fiasco has had on effective altruism the
00:04:14.000
character of the ea community potential problems with long-termism we have a brief sidebar discussion on ai
00:04:21.120
risk we discuss the effects of the ftx collapse on will personally and other topics there's no paywall
00:04:29.280
for this one as i thought everyone should hear what will has to say on this topic as you'll hear
00:04:34.880
despite the size of the crater that sam bankman-fried left on this landscape i consider the principles of
00:04:41.760
effective altruism untouched and while i've always considered myself a peripheral member of the community
00:04:48.800
you'll hear me discuss the misgivings i have with it once again here i've discussed them on previous
00:04:53.760
podcasts as well i just think the backlash against ea is thoroughly wrong-headed and will and i talk
00:05:01.200
about that as always if you want to support the podcast there's one way to do that you can subscribe
00:05:06.640
at samharris.org and if you can't afford a subscription you can request one for free occasionally i hear rumors
00:05:14.560
that someone has requested a free subscription and didn't get one that should never happen so check
00:05:20.800
your spam folder something has gone wrong just request again if that happens to you we don't decline
00:05:28.560
any of those requests and now i bring you will macaskill
00:05:33.200
i am back here with will macaskill will thanks for joining me again thanks for having me on so
00:05:45.440
we have a lot to talk about i've been wanting to do a post-mortem with you on the sam bankman-fried ftx
00:05:53.440
catastrophe i don't think that's putting it too strongly at least in ea circles so we're going to talk about
00:06:00.080
what happened there your your perception of it uh what it has done to uh the optics around effective
00:06:07.280
altruism and perhaps effective altruism itself where should we start here i mean perhaps you could
00:06:13.200
summarize uh what sam bankman-fried's position was in the ea community before the wheels came so fully
00:06:23.680
off where did you meet him and and what you know he he certainly seemed like a promising young man who
00:06:30.160
was going to do great things give us you know maybe perhaps we should take it from the top sure i'm happy
00:06:35.200
to and yeah he did from my perspective seemed like a promising young man even though that's yeah very
00:06:41.840
much not how it turned out so i first met sam all the way back in 2012 i was giving talks uh for a
00:06:50.880
new organization i'd set up 80 000 hours which were about how you can do good with your career
00:06:57.120
and i was going around college campuses uh speaking about this and i can't remember who but someone put
00:07:04.720
me and sam in touch i think he had been quite active on a forum for people who are interested in
00:07:12.000
utilitarian philosophy and so and ideas like earning to give had been discussed on that forum and as a
00:07:19.120
result we met up for lunch and he came to my talk uh he was interested in a number of different
00:07:25.760
career paths at the time so politics earning to give was one perhaps we should remind people what
00:07:30.800
earning to give means because it's is really the proper framing for everything sam was up to sure so
00:07:36.400
earning to give was the idea that rather than say directly working for a charity instead you could
00:07:43.040
deliberately take a career that was higher paying something you're perhaps particularly good at in order to
00:07:48.800
donate a significant fraction of your earnings where depending on how much you made that might be 50
00:07:55.200
percent or more and the core idea was that well you could say become a doctor in the developing world
00:08:03.360
and you do a huge amount of good by doing that or you could earn more and donate enough to pay for
00:08:10.480
many doctors working on the same cause and thereby perhaps do even more good again and this
00:08:16.320
was one of the things i was talking about at the time and he found the ideas compelling you know
00:08:21.840
we discussed it back and forth at the time i next met him something like six months later at a vegan
00:08:28.480
conference and he told me we hadn't been much in touch in that period but then he told me that he'd
00:08:35.040
got an internship at jane street which is this quantitative trading fund and you know that was very impressive i
00:08:41.680
thought he seemed just this very autonomous very morally motivated person um animal welfare was his
00:08:48.160
main focus at the time uh he said later he'd also asked some animal welfare organizations would they
00:08:54.000
rather have his time would they rather he work for them or would they rather that uh he go make money in
00:09:00.000
order to donate it to them and uh they said we'd rather have the money and so he went and did that at
00:09:06.800
jane street but then subsequently left and set up a trading firm called alameda research that was a
00:09:15.200
cryptocurrency trading firm and then a couple of years later an exchange as in a platform where
00:09:21.520
others could trade cryptocurrency called ftx in 2019 those seemed to be incredibly successful so by the
00:09:29.280
the end of 2021 uh he was worth tens of billions of dollars the company ftx was worth 40 billion dollars
00:09:38.400
and he seemed to be living up to his kind of claims he was saying he was going to donate
00:09:45.440
everything essentially everything he earned 99 percent of his wealth something like that and uh through the course of
00:09:52.720
2022 um had actually started making those donations too had you know donated well north of a hundred
00:09:59.840
million dollars but then as it turned out in november it seemed like the company was not all that it
00:10:06.560
seemed there was you know what you could call a run on the bank except it wasn't a bank so there was a
00:10:14.080
loss of confidence in ftx a lot of people started withdrawing their money but the money that customers had
00:10:19.760
deposited on the exchange that should have been there was not there and that should not have been
00:10:25.760
possible it must have been the case therefore that sam and the others leading ftx had misappropriated
00:10:33.520
that money in some way all the while saying that the assets were perfectly safe that they were not
00:10:39.120
invested that led to the complete collapse of ftx a number of people so three other people who are
00:10:46.000
high up at ftx or alameda so caroline ellison gary wang and nishad singh they all pleaded guilty to
00:10:51.920
fraud a couple of months after the collapse um sam did not plead guilty but there was a trial
00:10:56.720
at the end of last year and he was found guilty and what is his current state he's he's in jail is he
00:11:03.920
awaiting an appeal what's do you have up to the minute information on his progress through the criminal
00:11:10.160
justice system yeah so he's in jail and he's awaiting sentencing which will happen next week i think
00:11:17.600
so i guess one thing we should talk about is a some theory of mind about sam uh you know what his
00:11:26.720
intentions actually were insofar as we can guess about them because it really there are really two
00:11:31.920
alternate pictures here which give a very different ethical sense of him as a person and just the
00:11:38.800
the situation so many people were in in giving him their trust i mean perhaps we can just jump
00:11:46.640
there do you think this was a conscious fraud i mean perhaps there are other variants of this but
00:11:52.720
i'll give you the two that come to mind for me either this was a conscious fraud where he was quite
00:11:59.200
cynically using the concepts of effective altruism but his heart was never really in that place and he was
00:12:07.280
just you know trying to get fantastically wealthy and famous and misappropriating people's funds to
00:12:14.240
that end and it all blew up because just bad luck on some level so he was kind of a bernie madoff style
00:12:22.240
character running something like a ponzi scheme or some unethical variant of you know misappropriating
00:12:29.840
people's people's funds or alternately and and i think quite differently he was somebody who
00:12:37.120
based on what he believed about the actual ethics of the situation and probability theory
00:12:44.240
he was taking risks that he shouldn't have taken uh obviously in the end given the outcome but
00:12:50.720
they may well have paid off and you know he was taking these risks because he wanted to do the
00:12:56.720
maximum amount of good in the world with as much of the resources available that he could get his
00:13:03.600
hands around and he just was just placing in the end some silly bets and you know bets that you know
00:13:12.960
that he was allowed to place in a totally unregulated space and it catastrophically failed but it was um
00:13:19.920
by no means guaranteed to fail and he was a on some level a good guy who was ruled by some bad or at least
00:13:29.680
unrealistic expectations of just how many times you can play um a game of roulette and win perhaps
00:13:37.600
there there's some you know middle position between those two cases but well what's your sense of
00:13:42.960
his actual intentions throughout this whole time yeah so this is something that i've now spent many
00:13:52.160
months uh over the last year and a half really trying to understand i didn't know about the fraud or
00:13:59.200
have suspicions about the fraud at the time uh so my understanding of things here is really me trying
00:14:05.200
to piece together a story on the basis of all that's come out as a result of the trial and media
00:14:13.040
coverage over the last year and a half one thing i'll say kind of before we talk on this it's very
00:14:19.440
easy once you start getting into trying to inhabit someone's mental state to start saying things where
00:14:26.880
you know it sounds like you're defending the person or something and so yeah i just want to be clear on
00:14:30.560
on just how bad and how harmful uh what happened was so you know a million people lost money the
00:14:38.080
scale of this is just unbelievable and actually recently the prosecution released twitter messages
00:14:45.040
that sam had received during the collapse and they're really heartbreaking to read like one is from
00:14:51.280
a ukrainian man who had fled ukraine and in order to get his money out put the money on ftx um another
00:14:58.240
person who was going to be made homeless had four children he needed to feed and so on that point
00:15:04.080
we'll just linger for a second yeah i had heard that a lot of the money was getting recovered it would
00:15:09.680
do you know where that process is and how much has been recovered yeah so as it turns out all
00:15:18.400
customers will receive all of the money they put on the exchange as measured
00:15:25.760
in terms of the value of what they put on the exchange
00:15:32.320
in november 2022 so the the value as of that date as opposed to the amount they initially put in
00:15:39.840
at whatever date yes but also as opposed to the amount today right so often people will say putting
00:15:45.680
bitcoin on the exchange bitcoin is now worth more right than it was then the standard narrative of this
00:15:51.680
has been that that has happened because crypto has risen and uh anthropic a particular investment
00:15:58.320
that was made um has done well my best understanding is that actually that's not accurate that has helped
00:16:05.360
but even putting that to the side kind of even as of september last year when there had not been
00:16:11.840
a kind of crypto rise customers would have been made whole so the issue was not that money was
00:16:18.480
taken and then just lost in the sense of like spent or just lost on bad trades or something instead
00:16:25.120
that the money was illegally taken and invested into assets that couldn't be liquidated quickly
00:16:33.200
so already we're i think a far distance from someone like bernie madoff right who was whatever the
00:16:41.520
actual origins of his behavior whether he was ever a legitimate investor for the longest time
00:16:48.240
he was making only sham investments uh and just lying to everyone in sight and running a proper ponzi
00:16:54.160
scheme yeah that's right bernie madoff was committing the fraud for about eight years and uh yeah he was
00:17:01.920
needing any time a customer or sorry any time a client of his wanted to withdraw money he would raise more
00:17:09.040
money in order to give it back right to maintain the fiction you know to give what should have been there to
00:17:13.920
the customers which is a ponzi scheme yeah alameda and ftx this is one of the things that's so
00:17:22.080
bizarre about the whole story and even tragic is just that the companies themselves were making money
00:17:28.320
in fact large amounts of money so in that sense they were not
00:17:35.360
ponzi schemes but the customer assets held on ftx that should have been there should have been you
00:17:40.960
you know bank vaulted separate from everything got used by alameda research the trading firm in a way
00:17:47.520
that should not have been possible should not have even been possible right and so you were asking on
00:17:52.080
how to interpret the story and you gave two interpretations one was that effective altruism was
00:18:01.040
a sham his commitment to that was a sham he was just in it for his own power his own greed the second
00:18:06.480
was that it was some carefully calculated bet that you know maybe was illegal had good intentions
00:18:14.160
though or perhaps twisted intentions but didn't pay off my personal take is that it was neither of those
00:18:20.000
things and obviously i'll caveat you know i've followed this a lot because i've really tried to
00:18:26.400
resolve the confusion in my mind about what happened but i'm not an expert in this it's extremely
00:18:30.480
complicated but i think there's a few pieces of evidence for thinking that this just wasn't a rational
00:18:37.920
or calculated decision no matter what utility function sam and the others were following it did
00:18:45.120
not make sense as an action and one piece of evidence is actually just learning more about
00:18:52.400
other white collar crimes so bernie madoff being one example but the enron scandal and many others
00:18:59.760
too so there's this harvard business professor eugene soltes who's written this really excellent book
00:19:05.840
called why they do it about white collar crime and he argues and it's on the basis of interviews with
00:19:13.600
many of the most famous white collar criminals and he argues quite strongly against uh the idea that
00:19:20.720
these crimes are a result of some sort of careful cost benefit analysis mainly you know in part because
00:19:28.080
the cost benefit analysis just does not make sense often these are actually really quite wealthy really
00:19:32.800
quite successful people who have not that much to gain but everything to lose but then secondly looking
00:19:39.360
at how the decisions actually get made the word he often uses is mindless you know it's like people
00:19:44.080
aren't even paying attention it might be that you know this was true for the ceo of mckinsey gets off
00:19:51.120
a call from the board of goldman sachs i think and immediately 23 seconds later calls a friend
00:19:58.080
to tell him about what happened in the board meeting in a way that was illegal insider trading this was not
00:20:03.760
a carefully calculated decision it was just it was irrationality it was it was a failure a failure of
00:20:09.280
intuition rather than kind of reasoning and that's my best guess at what happened here as well where
00:20:16.240
yeah i think what happened it seemed is that i mean yeah well there were so many actually let me uh
00:20:22.880
layer on a few points that could certainly bias some people in the direction of kind of more calculation and
00:20:30.320
less impetuosity than than that because i think both within within ea circles more widely and and
00:20:38.080
certainly it seems within sam's brain there were some ideas that will strike people as as strange but
00:20:44.640
nonetheless difficult to refute you know rationally or logically which is to say it all it all hinges
00:20:51.120
around a topic you and i believe have discussed before on the podcast which comes up in any discussion of
00:20:57.600
of of what's called long-termism which i think we'll get to but it comes down to just how to
00:21:03.920
integrate uh rationally any notion of probability especially probabilities where
00:21:12.240
one side of the decision tree uh represents some extraordinarily large possible gains right so
00:21:20.720
i believe sam at one point was accused of believing i mean he may have said something along these lines
00:21:28.240
that if the expected value is such you know that you could i forget if it was in terms in positive
00:21:33.920
terms or or or negative terms in terms of avoiding you know extinction but it sounds like he was willing to
00:21:40.720
just toss a coin endlessly with the risk of ruin on one side uh with a sufficiently large you know expected
00:21:48.720
value you know of positive outcome on the other right it's just like if you you know if you have
00:21:53.520
a chance to win a million dollars on one side and lose you know a hundred thousand on the other uh
00:22:01.520
you should just keep tossing that coin because you know your expected value is 50 percent of a million
00:22:09.440
on one side and 50 percent of losing a hundred thousand on the other so your expected value is you know
00:22:14.800
450 000 every time you toss that coin but of course if you only have a hundred thousand dollars to lose
00:22:21.280
you toss the coin you can lose everything on your first toss right
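A minimal sketch in Python of the coin-toss arithmetic above, for readers who want to check it: the stakes match the figures Sam uses (win a million dollars or lose a hundred thousand per toss, with only a hundred thousand on hand), while the simulation itself is purely illustrative and not from the episode.

import random

def play(bankroll=100_000, win=1_000_000, loss=100_000, max_tosses=1_000):
    # toss the coin repeatedly until the player is broke or max_tosses is reached;
    # returns True if the player was ruined at some point
    for _ in range(max_tosses):
        bankroll += win if random.random() < 0.5 else -loss
        if bankroll <= 0:
            return True
    return False

# expected value per toss, as quoted above: 0.5 * 1,000,000 - 0.5 * 100,000 = 450,000 dollars
print("expected value per toss:", 0.5 * 1_000_000 - 0.5 * 100_000)

# yet with only 100,000 on hand, about half of all simulated players are ruined on the very first toss
trials = 10_000
ruined = sum(play() for _ in range(trials))
print("fraction of players ruined despite the positive expected value:", ruined / trials)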
00:22:26.400
and so he just seems to be someone who was looking at the expected value proposition somewhat naively and looking at it
00:22:33.280
with everyone else's money on the line or at least that's you know certain things i've heard said of him
00:22:39.840
or by him suggested that was the case so perhaps bring in your beliefs about how one should think
00:22:46.320
about probability and a kind of ends justify the means thinking that many people believe has
00:22:54.720
corrupted ea more generally and sam is just kind of the ultimate instance of a cautionary
00:23:01.760
tale there sure so yeah one thing that's just absolutely true is that sam seemed unusually
00:23:10.160
risk tolerant and was unusually risk tolerant and at the outset like when the collapse happened i
00:23:17.200
absolutely was worried that perhaps what had gone on was some sort of carefully calculated fraud
00:23:25.040
carefully calculated willingness to break just to break the law in the service of what he thought was
00:23:31.040
best you know i was worried that maybe there would come out a spreadsheet that you know did a little
00:23:37.520
cost benefit analysis of fraud and was clear for all to see i think yeah that yeah i think there are
00:23:44.080
just good reasons for thinking that's not what happened and let's i'll discuss that first and then
00:23:50.720
let's come back to the important points about attitudes to risk and ends justify the means reasoning
00:23:58.800
but just briefly on why that's i think not what happened so one is just the overall plan
00:24:04.160
makes like so little sense if it was a long con like a con kind of from the very start so if that
00:24:10.800
was the case then why would they be trying so hard to get regulated why would they be so incredibly
00:24:18.800
public about the uh what they were doing and in fact very actively courting press attention or in
00:24:26.160
fact having michael lewis one of the world's leading financial writers following them around having
00:24:31.920
access uh why would they be associating themselves with ea so much as well if that was also what they
00:24:38.800
seemed to care about and then the second thing is it's just absolutely agreed by everyone that
00:24:46.240
the companies were a shambles there was not even the most basic accounting not the most
00:24:52.240
basic corporate controls in june of 2022 there was a meeting between the high ups where they were
00:25:02.480
very worried because it looked like there was a 16 billion dollar loan from ftx to alameda and they
00:25:09.600
thought that was really bad it turned out that was a bug in the code
00:25:13.920
and actually it was only an eight billion dollar loan and they were apparently elated at the time
00:25:19.920
so they didn't know how much their assets were to within ten billion dollars and in fact it was at
00:25:26.240
that time that they discovered that they had been double counting eight billion dollars so customers for ftx
00:25:34.800
one way they could put money on the ftx exchange was by sending money to alameda that alameda then
00:25:42.640
should have given to ftx and it seems that in june of that point of that time they realized that had
00:25:50.640
not been happening alameda had thought that money was within alameda legitimately i guess and ftx had
00:25:59.280
thought that money was there and now i'm not going to claim i know that wasn't like conveniently overlooked
00:26:04.400
or something but at least caroline on the stand testified that you know she at least
00:26:09.680
didn't know that sam knew that that money prior to that point in time uh was in alameda
00:26:17.600
whereas it should have been in um ftx and that's the way almost all the money flowed from ftx to alameda
00:26:25.680
there was also like a lending program which was really focused on by the prosecution
00:26:30.720
but that actually yeah we can go into that in more detail as well i think that's actually not
00:26:36.960
where the action was in terms of how the money how the money moved and if you're just like really get
00:26:43.920
into the heads of the people at this time okay let's now suppose they're the most ruthless
00:26:49.200
consequentialists ever and they want to just make as much money as possible let's say it's just
00:26:57.440
dollars raised they're not risk averse at all which wasn't even true but let's even assume that
00:27:03.440
why on earth would they take that money and then invest it 5.5 billion of it into illiquid venture
00:27:11.520
investments so companies basically it was obviously posing enormous risks on them and the gains were
00:27:17.360
really quite small compared to the potential loss of not only everything in the company but also
00:27:24.560
the huge harm that it would do to the rest of the world to the effective altruism movement itself
00:27:30.720
it really just makes no sense from a utilitarian perspective at all and that's why when i try and
00:27:39.920
inhabit this mode of them making these kind of carefully calculated rational decisions i think
00:27:45.600
it just there's too many facts that seem in tension with that or inconsistent with that for that to be
00:27:51.840
the kind of best at least my best guess at what happened so what do you think was actually going
00:27:57.120
on did you ascribe this in the end to some version of incompetence combined with a dopaminergic
00:28:06.560
attachment to just winning at i guess some kind of gambling task yeah i mean i see the deep kind of vice
00:28:14.960
that ultimately drove this all as hubris where they were not very experienced they were very smart
00:28:22.080
they grew a company in a what was for a while an impressive way and sam in particular i just think
00:28:29.040
thought he was smarter than everyone else and this was something that i didn't like about sam and noticed
00:28:33.840
like during the time he was he would not be convinced of something just because other people even if
00:28:40.080
everyone else believed x and he believed y that wouldn't give him any pause for
00:28:47.040
doubt and so i think he got kind of corrupted by his own success i think he felt like he had made these
00:28:53.040
bets that had paid off in spite of people being skeptical time and again and so he just thought he was
00:29:00.880
smarter and that means that very basic things like having good accounting having the kind of adult
00:29:10.480
professionals come in who could do risk management actually point out what on earth was going on
00:29:17.120
with where different stashes of money were because this is another thing that shows just how
00:29:21.520
insane the whole venture was at the time of collapse they just didn't know where most of their assets
00:29:26.400
were you know there would be hundreds of millions of dollars in a bank account somewhere and they
00:29:30.800
wouldn't even know it existed the bank would have to call them to tell them by the way you've got these
00:29:35.120
assets on hand again if it was a carefully calculated ploy you would want to know where
00:29:39.680
all of your assets were in case there was a mass withdrawal of customer deposits and so i think yeah
00:29:46.480
that hubris also kind of not a risk calculation but maybe an attitude to risk where many people i think
00:29:54.560
when they're in the position of you know really quite rapidly running a multi-billion dollar company
00:29:59.680
would think holy shit i should really get some experienced professionals in here whereas that
00:30:07.040
was and be quite worried about you know have i attended to everything have i basically just have
00:30:14.480
i got this company under control and i think at that point they did you know that was not at all
00:30:20.080
how sam and the others were thinking and then the final thing to say just is that this isn't
00:30:24.160
saying that they didn't commit fraud uh from june onwards uh after this hole has been discovered
00:30:31.040
i think then it becomes pretty clear that you know there are just brazen lies to try to to try to get
00:30:37.600
out of the position that they've put themselves in i think there are also other cases of things that
00:30:42.160
seem like clearly fraud though they are not of the kind of eight billion dollars scale and this was fraud
00:30:49.440
you think to conceal the hole in the boat that was putting everything at risk or this was fraud
00:30:55.680
even when things were appeared to be going well and there was no uh risk of oblivion evident
00:31:03.680
what just what was the nature of the fraud do you think uh yeah i mean again flagging that there's
00:31:09.680
probably lots that i'm saying that are wrong because this is you know it's it's complex and i'm not
00:31:14.400
confident there's like lots of different stories my guess is both like in the trial one thing that
00:31:20.400
came up and i'm surprised didn't have more attention was that ftx advertised it had an insurance fund
00:31:28.000
so if it's liquidation engine basically a bit of technology that meant that even if a customer was
00:31:35.120
borrowing funds on the exchange in order to make basically a bet on the exchange with borrowed money
00:31:41.840
you know on other exchanges you could easily go negative by doing that and that would mean other
00:31:47.120
other users would have to pay to cover that loss ftx had this automatic liquidation engine
00:31:55.440
that was quite well respected but they said even if that fails there's this insurance fund
00:32:00.400
that will cover any losses however the number that was advertised on
00:32:05.440
on the website seems to have been created just by a random number generator so that seems like really
00:32:14.880
quite clear fraud and i don't know it hasn't been discussed very much but seems totally inexcusable and
00:32:22.160
was applied even when the going was good but then the big fraud the eight billion dollars it seemed like
00:32:29.360
that was that really started kicking in from june of 2022 onwards though i'll also say like you know
00:32:36.560
we can talk about my interactions with the people there it seemed like looking back i think it seems
00:32:41.920
to me like they did not know just how bad the situation they were in was but yeah
00:32:48.000
yeah well let's talk about your interactions and uh let's focus on sam to start i mean you bring up the
00:32:55.840
the vice of hubris i mean that you know i only spoke to him once i believe it's possible i had a
00:33:01.840
call with him before i did a podcast with him but uh he was on the podcast once and this was very much
00:33:07.520
in the moment when he was the the darling of the ea community i think he was described as the i guess the
00:33:15.040
he was the youngest you know self-made person to reach uh something like 30 billion dollars i think he
00:33:21.600
was you know 29 and had 29 billion dollars or something at the time i spoke to him and again
00:33:26.800
the purpose of all of this earning was to do the maximum amount of good he could do in the world he
00:33:32.800
was just earning to give as far as the eye could see you know i didn't really encounter his um now
00:33:40.320
famous arrogance in my discussion with him maybe he just seemed you know smart and well-intentioned um
00:33:47.280
and i had no reason to i knew nothing about the details of ftx apart from what he told me and you
00:33:53.520
know i think anyone in in our position of of talking to him about his business could be forgiven for not
00:34:01.520
immediately seeing the fraudulence of it or the potential fraudulence of it given that you know he
00:34:07.200
he had people invest with him you know for quite sophisticated you know venture investors
00:34:13.120
you know early on and um you know they didn't detect the problem right and we were not in the
00:34:19.280
position of of investing with him but it in the aftermath there are details of just how he behaved
00:34:25.280
with people that's just struck me as um arrogant to the point of insanity really i mean just like he's
00:34:32.240
you know in these investor calls apparently he is while describing his business and soliciting i think
00:34:38.880
it was hundreds of millions of dollars at a minimum uh from firms like sequoia he is simultaneously
00:34:45.440
playing video games uh and this is you know celebrated as this delightful uh affectation but
00:34:52.800
i mean clearly he is someone who thinks you know he need not give people 100 percent of his attention
00:34:58.960
because uh you know he's got so much bandwidth he can just play video games while while uh having
00:35:04.320
these these important conversations and there were some things in michael lewis's book that that
00:35:09.680
revealed uh or at least seemed to reveal that he was quite a strange person and you know someone who
00:35:15.840
claimed on his own account like at least to lewis that he didn't know what people meant when they said
00:35:21.360
they experienced the feeling of love right like like he so he's neuro atypical at a minimum and um
00:35:28.720
perhaps i just you know offer this to you as just a a series of impressions but how peculiar a person
00:35:36.960
is he and shouldn't there have been more red flags or you know earlier on you know just in terms of his
00:35:46.400
integrity ethically or just his capacity for ethical integrity given you know i mean if someone tells me that
00:35:53.840
they have no idea what anyone means when they say they love other people you know that that is a
00:36:01.520
an enormous red flag i mean it's something that you know i would i would feel compassion for you know
00:36:07.040
that the person is is obviously missing something but as far as collaborating with this person or putting
00:36:14.320
trust in them it's an enormous red flag and so i you know i don't you know i don't know at what point
00:36:19.680
he told lewis that but what was your impression or is your impression of sam as a
00:36:26.160
person and in retrospect were there signs of of his um unreliability ethically you know far earlier than
00:36:36.720
when the emergency actually occurred sure so there's um yeah a lot to say here and briefly on the
00:36:46.160
not feeling love so yeah this my descriptions of sam and feelings about sam are quite varied and
00:36:52.400
very gated on his ability to not feel love it's you know that wasn't something that seemed striking or
00:37:00.480
notable to me like after the michael lewis book and lots of things came out it seemed like he had just
00:37:07.040
emotional flatness across the board and whether that's a result of depression or adhd or autism
00:37:14.960
is like not really that clear to me but that wasn't something that seemed obvious at the time
00:37:21.200
at least i guess i interact with people who are you know relatively emotionally flat quite a lot i
00:37:26.720
certainly wouldn't have said he's a very emotional person he did seem like a very thoughtful incredibly
00:37:32.080
morally motivated person all the way back to 2012 i mean his main concern was for the plight of
00:37:38.400
non-human animals on factory farms for most of that time it's kind of an unusual thing to care
00:37:44.160
about if you're some sort of psychopath or something like that yeah i you know when i first reached
00:37:50.960
out to sam after ftx had been so successful i talked to him um about you know okay you've started
00:37:58.400
this company it's a crypto company isn't crypto like you know pretty sketchy like what hey how much
00:38:04.720
have you thought about risks to the company and so on and there was a narrative that came from him
00:38:09.040
and then was echoed and emphasized by nishad singh who in my experience was really a kind of crucial
00:38:15.520
part of the story like really crucial part of my interface with that world and the story i got told
00:38:22.560
was ftx is trying quite self-consciously to be much more ethical than the standard crypto exchange or
00:38:33.520
anything going on in the crypto world and there are two reasons why we need to do that even putting
00:38:39.280
aside the you know intrinsic desire to act ethically one is because they were trying to get regulated so
00:38:46.240
you know they were very actively courting uh regulation in the us because they thought that was a way in
00:38:52.400
which you know by being you know they were these center-left people they were not the libertarians that
00:38:57.760
populate crypto normally they thought you know that's how they could get the edge over the
00:39:02.800
competitors was by being much more open and open to regulation and then secondly because they planned
00:39:09.040
to give the proceeds away they knew that they would get you know they would face a higher bar for
00:39:13.280
criticism and that claim got yeah made to me over and over again where not just sam but uh nishad you
00:39:22.640
know sam was very busy so you know i spoke to him a number of times like half a dozen times or
00:39:27.440
something one-on-one more times um in kind of group settings but i talked to nishad and nishad really
00:39:34.320
came across like i mean this is the thing that like maybe breaks my heart the most about the whole
00:39:39.680
story where he came across just as this incredibly thoughtful you know morally motivated careful just
00:39:48.880
kind person and i would ask him kind of okay so why are you in the bahamas and there would be an answer
00:39:54.080
which is that that's where they were able to get licensed or i'd ask kind of why is your apartment
00:40:00.960
so nice and they would say well we can't really get kind of mid-level property in the bahamas we just
00:40:07.600
need somewhere that we can create like a campus feel and so yeah it is nicer than we'd like hopefully
00:40:12.640
we can move over time to something a bit less nice so over and over again or other kind of ethical issues in
00:40:18.000
crypto we can go into and yeah over and over again he was painting uh that picture and something that
00:40:24.080
was just so hurtful and confusing is just was he lying to me that whole time like was that just all
00:40:32.000
false or was he just like a gullible fool i haven't followed the trial in sufficient detail to know
00:40:38.960
where what his role was revealed to be in all of this i mean where is where is he and has he been
00:40:45.760
prosecuted and what do you think of of his actual intentions at this point yeah i mean so he pled
00:40:52.960
guilty i think for fraud among other things and he testified in these pleadings do you think this was
00:40:58.960
you know just kind of a classically perverse prisoner's dilemma situation where you have people
00:41:05.440
given the the the shadow of prosecution and prison hanging over them they they're willing to testify
00:41:12.480
to things that the government wants to hear but which are not strictly true i mean what what do you
00:41:16.960
what's your theory of mind for the people who pled guilty at this point yeah i mean again this is
00:41:23.280
something that comes up in eugene soltes's book and he talks about where it's a very strange aspect of
00:41:30.000
the u.s legal system like it's not something that happens in the uk where the government will reward
00:41:37.040
people literally with their lives for going on the stand like you know because they will the other
00:41:43.760
people probably will get no jail time they will reward people in that way for going on the stand
00:41:48.320
and testifying and so that just does mean you know they can't tell lies or not verifiable lies
00:41:55.280
but there are very strong incentives to present things in a certain way and again i don't want to
00:42:01.840
this was all sounding much more defensive of sam than i want to be but the soltes book talks about
00:42:07.600
some people who were just you know they would rehearse with their lawyers for hours and hours and
00:42:12.720
hours in order to seem you know display appropriate contrition and so on and so the view of this that
00:42:18.160
kind of michael lewis took is just you know people understand they will have just said true things
00:42:25.520
throughout but the kind of tone of it is maybe a little different than it really was where there
00:42:34.160
was a lot of you know the co-conspirators talking about how bad they felt and how um they knew what
00:42:40.320
they were doing was wrong at the time they were really torn up that you know seems quite inconsistent
00:42:46.480
with my experience of them but maybe they were just incredible liars one question about that so so
00:42:51.440
they knew what they were doing was wrong could mean many things it could mean that they knew that
00:42:56.720
they were taking risks with people's funds that were unconscionable you know given the
00:43:02.640
possibility of losing money that customers thought was safely on the exchange but that's not the
00:43:09.760
same thing as stealing money and misappropriating it in a way that is purely selfish right it's not like
00:43:16.160
we took money that was not ours and we bought you know luxury condominiums in the bahamas with it
00:43:22.480
and hoped no one would notice right it's like that that's one style of fraud this you tell me is it
00:43:28.400
is it possible that they thought they were they were going to you know wager this money on on other
00:43:33.600
real investments you know however shady some of these crypto properties were but they they actually
00:43:39.120
expected enormous returns as a result of that that misappropriation and and that money would come back
00:43:45.120
you know safely into ftx and no one would no one would lose anything in the end if everything worked
00:43:51.840
yeah i mean in terms of how things seem to me i just think they didn't think the company was at risk
00:43:58.560
not at serious risk and here's a couple of there's a few reasons why i mean one this is kind of how
00:44:04.960
i felt like why i was so confused this whole time like you know i visited the bahamas a number of times
00:44:10.160
in 2022 i never saw any kind of change in attitude from them over that time like you would really
00:44:16.720
think if you're engaging this major fraud that something would seep out some sort of flags maybe
00:44:23.520
i'm a fool but i did not see that and in fact even so in september my last trip to the bahamas
00:44:30.560
i heard from michael lewis that sam had been courting funding for ftx from saudi arabia and other
00:44:38.000
places in the middle east and i do not love the idea of taking money from saudi arabia um i have
00:44:44.560
issues with that and it also just struck me as kind of odd and i was aware there was a crypto downturn
00:44:50.080
so i talked to nishad and i asked like look is there anything up with the company like are you
00:44:54.880
in trouble and he says no and we talk about this for you know it's not a passing comment for some time
00:45:00.560
and that by any account is like past the point when he you know allegedly had learned about
00:45:09.200
the huge hole that the company faced similarly michael lewis on what was the same time asked
00:45:15.600
both caroline and nishad as a kind of fun question like oh what could go wrong with the company like
00:45:21.040
if this all goes to zero what happens and again there was like no indication of stress upon hearing that
00:45:29.200
question they had fun with it they were like oh maybe crypto is just a lot of hot air and everyone
00:45:34.480
gets turned off or maybe sam gets kidnapped that was kind of one of the big worries but nothing that
00:45:39.920
kind of leaked out there so given his guilty pleading and his testimony what what's your belief about
00:45:46.720
your conversation with nishad at that point do you think he was unaware of the risk or do you
00:45:53.120
think he was lying to you so i think another thing that came out during the trial though i'm not sure
00:46:00.400
if it was admissible as evidence was that nishad commented to the government kind of immediately
00:46:07.120
upon pleading guilty that in that period he kind of still thought that ftx would last
00:46:15.120
for years so and yeah and in terms of just giving an indication of what nishad's personality was like
00:46:21.760
when the collapse happened he really he had to be watched because he was on the verge of suicide
00:46:26.880
um he was so distraught about what happened to the customers and i think was really quite close to
00:46:31.840
taking his own life so then what sort of fraud was he pleading guilty to if he's the kind of person
00:46:38.480
who's suicidal when the wheels come off as though he had no idea that this was this was in the cards
00:46:47.120
right is he i mean you think he's just responding to all of the opprobrium and disgrace aimed his way
00:46:53.040
in the aftermath or or do you think he was actually surprised fundamentally by the risk that was being
00:46:59.520
run and if and if the latter in what sense does he claim to be culpable for a conscious fraud yeah i mean
00:47:06.400
so yeah so i don't know about whether nishad at this time was ignorant as in just like really did
00:47:15.680
not know the risks they were running i don't know if he was just like delusional so again this is a
00:47:22.160
thing that soltes talks about is just you know the capacity for humans to create a narrative in
00:47:30.880
which they're still you know the good guys i don't know perhaps he thought that yes this was bad
00:47:36.320
but they would get out of it and so it was a little bit bad but it'll be fine maybe maybe that
00:47:41.600
was there too oh yeah one thing that he does is he buys this three and a half million dollar property
00:47:46.720
for himself in october again it's just not the action of someone who again on the stand said that
00:47:52.880
you know how distraught he was and he was talking to sam about this so all of these things are possible
00:47:58.720
to me as for you know him pleading guilty well i mean whichever of these is true i think it would
00:48:05.760
make sense to plead guilty if you're in that situation where you know there are huge costs
00:48:13.440
to going to jail and like i don't know there are various stories but plausibly he just thought look yes
00:48:20.000
i knew what i was doing was
00:48:24.640
bad i thought it was only a little bit bad actually i was wrong it was very bad it was extremely bad
00:48:30.800
i'm willing to just fess up and take the hit i should get like there are various possible
00:48:35.360
explanations there right there was a lot made i don't know if this appeared in the trial but in
00:48:41.280
in the court of public opinion there was a lot made of a text exchange that sam had with somebody i
00:48:47.120
think it was a journalist or a quasi journalist in the immediate aftermath of of the scandal where
00:48:53.360
he seemed to admit that all the effective altruism uh lip service was just that it was just the thing
00:49:01.280
you say to liberals to make them feel good i forget the actual language but it just it seemed like he
00:49:07.440
was copping to the fact that that part was always a ruse honestly when i read those
00:49:13.280
texts i didn't i didn't know how to interpret them but it was not obvious to me that they were the
00:49:18.160
smoking gun they appeared to be in so many minds who were ready to bury effective
00:49:24.000
altruism as a scam do you uh know the thread i'm referring to and do you have yeah yeah i know the i
00:49:31.280
know the thread and yeah i mean in a way like from the perspective of the brand of effective altruism
00:49:38.800
maybe it would have been better if that part had actually been a big con too but no i think
00:49:46.480
he i think he believed in these ideas i think there he was deferring to kind of what you might call like
00:49:51.920
like sort of corporate ethics that are really kind of pr so you know companies will often make these big
00:49:58.560
charitable donations to their local community and so on and i mean in this case yeah exactly everyone
00:50:06.480
knows like this is marketing and i guess i don't know the details but presumably ftx was you know
00:50:11.840
doing stuff like that in the same way other companies do and that my interpretation of those
00:50:16.720
texts is that that's what he was referring to actually it reminds me that the one concern i did have about
00:50:22.400
sam before the scandal broke i don't know if this was contemporaneous with my conversation with
00:50:29.120
him on the podcast but i i just remember thinking this when i heard how much money he had given away
00:50:34.640
and you referenced it earlier it was something north of a hundred million dollars you know i'm always
00:50:41.040
you know quick to do the math on that and and i recognize what a paltry sum that actually is if you
00:50:46.960
have thirty billion dollars right i mean so it's an enormous amount of money out in the real world where
00:50:52.400
people are grateful for whatever you give them but you know it's it's analogous to somebody who has
00:50:57.680
thirty million dollars giving a hundred thousand dollars away right it's not a sacrifice it's a rounding error on
00:51:04.240
their actual wealth and it's certainly not the sort of thing that i would expect of someone for whom
00:51:10.480
the whole point of becoming fantastically wealthy is to give all of it away and i so i think and i
00:51:16.960
forget if i asked him a question about the pace of his giving during that podcast but it's just i i know
00:51:23.440
that some people think well it's you know what the best thing for me to do is to use these assets to
00:51:27.360
make more assets in the meantime and then i'll give it all away later on but given the the urgency of so
00:51:33.360
many causes and given the the real opportunity to save lives and mitigate enormous suffering you
00:51:40.160
know every day of the week now you know starting now it just you know my spidey sense tingles when
00:51:47.040
when i hear you know a fantastically wealthy person deferring their giving you know to the far future
00:51:53.280
and so i'm wondering what you think of that sure yeah i think that issue is i think that wasn't really
00:52:00.720
an issue so a couple of reasons one is just you know his net worth is basically entirely in ftx
00:52:07.680
there's no way of converting that so if you're in a startup and all your wealth is in the equity of
00:52:14.160
that startup there's not really any way of converting that wealth into money the sort of thing that you
00:52:19.920
could donate you have to basically keep building the company until you can have an exit so have get
00:52:25.120
acquired or sell the company and then you can become more liquid and then the second thing is just at
00:52:31.360
least relative to other business people he was very unusual in wanting to give more and give quickly
00:52:40.240
so i mean i advised on the setup of his uh foundation and it got a lot of criticism for giving for
00:52:47.760
scaling up giving too quickly so going from kind of zero to you know one to two hundred million in a
00:52:53.440
year is like a very big scale up and it's actually just quite hard to do and so i guess i like you
00:53:00.960
know if you were asking me yes it's a tiny fraction and i agree with the point in general that when
00:53:07.360
someone who's a centibillionaire then gives you know a hundred million dollars to something that is
00:53:12.560
just really not very much at all especially once they've had that money for decades and they can
00:53:18.240
really kind of distribute it but in that case the way it seemed to me at the time and i guess still
00:53:24.240
does just seem to me was like basically consistent with someone trying to scale up their giving as fast
00:53:28.880
as they can and in fact in a way that you know plausibly should have been paying more attention to
00:53:33.920
the business right and not getting distracted by other things yeah so so what is this what if anything
00:53:39.920
does this say about effective altruism i mean i guess there's a additional question here what has
00:53:45.840
been the effect uh as you perceive it on ea and um the public perception of it the fundraising toward
00:53:54.960
good causes has it forced a rethinking of any principles of effective altruism that you know whether
00:54:02.240
it's earning to give or or a focus on long-termism uh which we haven't talked about here yet but you
00:54:09.280
and i've discussed before and uh you know how large a crater has this left and what has been touched by
00:54:16.000
it and is there any good to come out of this just give me the picture as you see it of ea at
00:54:22.400
the moment um yeah i mean huge harm huge harm to ea where you know at the time of the collapse i put it
00:54:31.680
like 20 percent or something that the ea movement would just die that this was a killer blow and so obviously in
00:54:38.480
terms of hit to the brand you know so many people think ill of ea now or are critical of ea now in all
00:54:46.080
sorts of different ways driven by this in a way that's not surprising it was this horrific horrific
00:54:51.520
thing i don't think it happened because of ea i think it happened in spite of ea i think like ea
00:54:59.280
leaders and communicators have been very consistent on the idea that the ends do not justify the means
00:55:04.480
really since the start i mean and really this goes back centuries go back to john stuart mill
00:55:11.360
and actually even sam knew this so again as part of just trying to figure out what happened i did some
00:55:17.440
facebook archaeology so there's an essay by eliezer yudkowsky called the ends do not justify the
00:55:22.400
means among humans basically making the classic point that you are not a god at calculation even if you're
00:55:30.160
a hundred percent consequentialist which i don't think you should be but even so follow heuristics
00:55:36.320
that are tried and true including heuristics not to violate side constraints and this was shared in
00:55:41.920
a kind of discussion group and this was you know well before ftx sam's response was like
00:55:48.000
why are you even sharing this this is obvious everyone already knows this so yeah this was in my view in
00:55:53.520
spite of ea not because of it but yes the damage is huge also internal damage as well morale was very
00:55:59.680
very low trust was very low the thought being well if sam and the others did this then who knows what
00:56:06.880
other people are like and there has just been an enormous amount of self-reflection self-scrutiny
00:56:13.040
whether that's because of this catastrophe itself or just if there's any point in time for self-reflection
00:56:19.920
i think it was in the aftermath of that and so there's a whole bunch of things that have changed
00:56:24.800
over the last year and a half not in terms of the principles because you know what is effective
00:56:30.240
altruism it's the idea of using evidence and reason to try to make the world better that principle is
00:56:36.160
still good like i still would love people to increase the amount by which they are benevolent towards
00:56:42.560
others and increase the amount by which they think extremely carefully and are like really quite intense
00:56:48.720
about trying to figure out how they can have more positive impact with their money or with their
00:56:53.760
time that's just still as true as ever and the actions of one person in no way undermine that
00:57:01.360
i mean take any ideology take any moral view that you can imagine you will find advocates of that
00:57:08.160
ideology that are utterly repugnant yeah this is the hitler was a vegetarian principle exactly well
00:57:14.960
and sam was too and so um vegetarians having a bad time yeah exactly exactly but there have been a lot
00:57:22.080
of changes to the kind of institutions within effective altruism so it has essentially entirely
00:57:27.360
new leadership now at least on the organizational side uh so center for effective altruism open philanthropy
00:57:33.760
um 80 000 hours and the boards of at least some of these organizations are like really quite refreshed this
00:57:40.720
is partly just a lot of people had to work exceptionally hard as a result of the fallout
00:57:46.400
and um got really quite burned out um in my own case you know i've stepped back from being on the
00:57:53.760
boards of any of these main organizations and i won't do that again really for quite a while you know
00:57:58.800
one thing was just that i wasn't able to talk about this stuff in the way i really wanted to for you know
00:58:05.680
over the year like i spent again like literal months kind of writing blog posts and rewriting
00:58:10.640
them and having them then kind of knocked back because there was uh a kind of investigation
00:58:15.680
being held by effective ventures one of the charities and the law firm doing that really didn't
00:58:21.280
want me to speak while that was um ongoing but then also because i think a healthier effective
00:58:26.960
altruism movement is more decentralized than it was and there was an issue when um the collapse happened
00:58:34.640
that i was in the roles of being on the board of the charity also being if anyone was a
00:58:41.200
spokesperson for ea but also having advised sam on the creation of his foundation and that
00:58:47.040
meant i wasn't able to kind of offer guidance and reassurance to the community at that time of crisis
00:58:53.440
in a way that i really wanted to and wish i'd been able to and so i do think like a healthier
00:58:59.280
ea movement um you know has greater decentralization in that way and there are some other things happening
00:59:05.200
in that direction too so various organizations are kind of separating or projects like separating out
00:59:11.200
legally and becoming their own entities yeah i mean in the aftermath i was um certainly
00:59:19.440
unhappy to see so many people eager to dance on the grave of effective altruism and in the worst
00:59:27.040
cases these are people who are quite wealthy and cynical and simply looking for an excuse to
00:59:35.440
judge the actual good intentions and real altruism of others as just you know patently false
00:59:43.920
and it was you know there was never any there there no everyone's just in it for themselves
00:59:48.960
and therefore i rich ayn randian type should feel a completely clear conscience in being
00:59:56.640
merely selfish right it's all a scam right and i just think that's an odious
01:00:03.360
worldview and a false one right it's not that everyone is just in it for themselves it's not
01:00:09.040
all just virtue signaling there are real goods in the world that can be accomplished there are real
01:00:16.400
harms that can be averted and you know being merely selfish really is a character flaw and
01:00:23.120
it is possible to be a much better person than that and we should aspire to that and i say this as
01:00:29.760
someone who's been to some degree always somewhat critical or at least leery of ea as a movement and
01:00:37.200
as a community i mean i think i'm you know one of the larger contributors to it just in you know
01:00:42.720
personally and just how much money i give to ea-aligned charities and in how much i have spread the
01:00:48.640
word about it and inspired others to take the pledge and to also um give money to givewell and
01:00:55.680
and similar organizations but i've you know i've always been and i've spoken to you about this and
01:01:00.560
i've said as much on this podcast and elsewhere you know i feel like as a movement
01:01:06.880
it's always struck me as too online and for some reason attractive to you know in the most
01:01:15.760
comedic case you know neuroatypical people who are committed to polyamory right i
01:01:20.800
mean it's just there are silicon valley cult-like dynamics that i've detected if not in
01:01:26.080
the center of the movement certainly at its fringe that i think is evinced to some degree in the
01:01:31.760
life of sam bankman-fried too and we haven't talked about just how they were living in the bahamas but
01:01:36.880
you know there are certainly some colorful anecdotes there and it just seems to me
01:01:42.000
that there's a culture that you know i haven't wanted to endorse without caveat and yet
01:01:51.200
the principles that you know i've learned from my conversations with you and you know in reading
01:01:57.200
books like your own and you know toby ord's book uh the precipice the ideas about existential risk
01:02:04.880
and you know actually becoming rational around the real effects of efforts to do good rather than
01:02:12.800
the imagined effects or the hoped-for effects divorcing a rational understanding of mitigating
01:02:21.440
human suffering and risk of harm and the good feels we get around you know specific stories and specific
01:02:31.280
and kind of triggers to empathy right and just performing conceptual surgery on all of that
01:02:37.760
so that one can actually do what one actually wants to do in a clear-headed way guided by compassion and
01:02:45.440
a rational understanding of the effects one can have on the world and it's you know we've talked about
01:02:50.560
many of these issues before in previous conversations i think all of that still stands i mean none of that
01:02:55.680
was wrong and none of that is shown to be wrong by the example of sam bankman-fried and so yeah i just
01:03:02.480
you know i do mourn any loss that those ideas have um suffered you know in public perception
01:03:11.120
because of this so yeah i mean do with that what you will but that's where i've netted out
01:03:16.720
at this point yeah i mean i think it's part of the tragedy of the whole thing it's just you know
01:03:22.800
giving what we can has over 9 000 people who are pledging to give at least 10
01:03:28.640
percent of their income to highly cost-effective charities aiming for 10 000 people this year
01:03:35.040
for those people you know generally living like pretty normal lives middle class maybe they're you know
01:03:41.920
or maybe they're wealthier like in what way do the actions of sam and the others invalidate that
01:03:50.640
and the answer is not at all like that is just as important as ever and yeah one of the things
01:03:57.440
that's so sad is like maybe fewer people will be inclined to do so not for any good rational reasons
01:04:03.840
but just because of the you know bad odor that surrounds the idea now and that's just a little
01:04:09.040
tragedy i think donating a fraction of your income to causes that effectively help other
01:04:15.440
people i still think that's a really good way to live you talk about yeah the kind of
01:04:21.680
online cult-like and shot through with asperger's kind of side of the movement i do want to say
01:04:29.920
that you know ea is many things or like the ea movement is many things and also of course you can
01:04:35.600
endorse the ideas without endorsing anything to do with the movement but i definitely worry that
01:04:41.120
you know there is a segment that is extremely online and perhaps unusually weird in its culture
01:04:50.240
or something and it's a bit of a shame i think if people get the impression that's kind of what
01:04:58.000
everyone within the ea movement is like on the basis of whoever is kind of loudest on the internet
01:05:05.040
and you know people can be poly um if they want and no moral objection uh to that um
01:05:13.120
at all fine way to live people can have all sorts of like weird beliefs too and like maybe some of
01:05:18.480
them are correct like i think ai risk was extremely weird for many years and now people are taking it
01:05:23.520
really seriously so i mean i think that's important but i think the vast majority of people within the
01:05:29.360
effective altruism movement are like pretty normal people they're people who care a lot they're people who are
01:05:35.120
willing to put their money or their time where their mouth is and because they care they're really
01:05:39.840
willing to think this through and you know willing to go where the arguments or the evidence lead them
01:05:46.400
and you know i'm not someone who's naturally kind of on the internet all the time i find twitter and
01:05:52.160
internet forums you know quite off-putting and when i meet people in person who are engaged in the
01:06:00.160
project of effective altruism it just feels very very different than it does if you're just kind of
01:06:06.800
yeah hanging out on twitter or on some of the forums online or something hmm so is there anything that
01:06:15.600
has been rethought at the level of the ideas i mean the one other issue here which um i don't think it
01:06:23.280
played an enormous role in the coverage of the ftx collapse but it's come under some scrutiny and
01:06:30.560
and um become a kind of ideological cause for concern the emphasis on long-termism which you brought
01:06:38.320
out at book length uh in your last book was that part of the problem here and is there any rethink
01:06:45.760
because that certainly brings in this issue of probability calculus that turns our decisions
01:06:52.960
into um you know a series of trolley problems wherein ends justify the means thinking at least becomes
01:07:00.800
tempting which is to say that if you thought a decision you made had implications for
01:07:07.920
the survival of humanity not just in the near term but out into an endless future you know where
01:07:14.320
trillions upon trillions of lives are at stake and they hang in the balance well then there's a lot
01:07:19.440
you might do if you really took the numbers seriously right is there anything that you
01:07:25.920
have been forced to um revise your thinking on as a result of this yeah so i mean i really think
01:07:33.600
long-termism wasn't at play i mean again like i said i feel like what happened at ftx was
01:07:40.000
not a matter of some rational calculation in pursuit of some end i think it looks
01:07:44.320
dumb and immoral from any perspective i also just think like if your concern is with uh hundreds of
01:07:51.120
millions of people in extreme poverty or the tens of billions of animals suffering in factory farms
01:07:57.280
the scale of those problems is more than enough for the same kind of worries to
01:08:03.600
arise and in fact we have seen um like in the animal welfare movement on the fringes people taking
01:08:09.920
violent actions even in the pursuit of what they regarded as the kind of greater good long-termism
01:08:15.760
if anything kind of actually shifts against it because this argument about oh you should be willing
01:08:21.360
to take more risk if you're using your money philanthropically than if you're willing to just spend the
01:08:27.600
money on yourself that argument applies much less strongly in the case of long-termism than it does for
01:08:32.800
global health and development for example because you know if i have five dollars to spend that can
01:08:37.520
buy a bed net if i have a billion and five dollars to spend that final five dollars still buys
01:08:44.160
a bed net global health and development can just absorb huge amounts of money without the cost
01:08:49.520
effectiveness going down very much the same is not true for long-termism
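[editor's note: the marginal-returns point above can be made concrete with a toy calculation. the $5-per-bed-net figure is taken from the conversation; the "saturating" alternative cause and all other numbers are hypothetical, chosen only to contrast a cause that absorbs money at roughly constant cost-effectiveness with one whose cheapest opportunities run out.]

```python
import math

# Toy illustration only: made-up numbers, not real cost-effectiveness estimates.

def flat_nets(dollars, cost_per_net=5.0):
    """Bed-net-style cause: every $5 buys roughly one more net at any scale."""
    return dollars / cost_per_net

def saturating_nets(dollars, cost_per_net=5.0, cheap_pool=10_000_000):
    """Hypothetical contrast cause whose cheap opportunities run out as funding grows."""
    return cheap_pool * math.log1p(dollars / (cost_per_net * cheap_pool))

for already_spent in (0, 1_000_000, 1_000_000_000):
    extra = 5.0  # the "final five dollars"
    flat_gain = flat_nets(already_spent + extra) - flat_nets(already_spent)
    sat_gain = saturating_nets(already_spent + extra) - saturating_nets(already_spent)
    print(f"after ${already_spent:>13,} spent, the next $5 buys "
          f"{flat_gain:.3f} nets (flat) vs {sat_gain:.3f} (saturating)")
```

[on these toy numbers the flat cause still buys a full net with the last $5 even after a billion dollars, while the saturating cause's last $5 buys almost nothing; that flatness is what makes the "take more risk with philanthropic money" argument bite for global health in the way described here.]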
01:08:55.520
just to follow a further point along those lines my concern with long-termism has been the way in which it can seem to devalue the opportunity to
01:09:03.200
alleviate present harms and present suffering because if you can tell yourself a story that
01:09:08.320
the one billion people suffering now are you know their interests are infinitesimal compared to the
01:09:15.360
trillions upon trillions who may yet exist if we play our cards right so it's an argument for
01:09:20.960
perhaps overlooking the the immediate suffering of the present out of a concern for the unrealized
01:09:28.400
suffering of the future yeah and i think that's you know in what we owe the future i was very careful
01:09:36.400
to defend only what i call the weak form of long-termism that positively impacting the long-term future
01:09:44.560
is a moral priority of our time not claiming it's the only one nor claiming it's of overwhelming
01:09:50.640
importance either like i think we should be uncertain about this right and you know in the
01:09:55.760
new preface i suggest a goal a kind of way of operationalizing that of rich countries putting
01:10:01.680
one percent at least one percent of their resources to issues that distinctively impact future generations
01:10:08.240
because at the moment they currently put close to zero percent and i do think the mode of
01:10:14.720
operating in which you think oh a present catastrophe is nothing compared to the unparalleled goods that
01:10:22.480
may come in a trillion years time like i think that's a really bad way of thinking even just from
01:10:26.960
a pure long-term perspective um i think it doesn't have a good track record and it's really not how i would
01:10:34.320
want people to think there has been a different line of criticism that i got from within ea from the
01:10:40.800
publication of what we owe the future onwards that i think has had a lot of merit and that line of
01:10:46.320
criticism was that i was misunderstanding actually how near term the risks that we are
01:10:52.800
talking about were so in particular the risk from ai where the risks we face uh from really advanced
01:11:01.040
artificial intelligence even artificial general intelligence these are coming in the next decade at
01:11:07.280
most the next couple of decades and secondly the scale of the problems that are imposed by
01:11:16.320
technological development like ai are so great that you don't need to think about future generations at
01:11:22.000
all even if you just care about the eight billion people alive today imposing the size of risks that we
01:11:29.120
are imposing on them via these technologies is more than enough for this to become one of the top problems
01:11:34.320
that the world should face today and over the last few years since the publication of the book i just
01:11:40.080
think that perspective has been getting more and more vindicated and so i'm now much more worried about
01:11:48.640
really very advanced very fast advances in ai in a very near time frame as in literally the next
01:11:58.000
five six years or the next decade much more worried by risks from that than i was even just a few years ago
01:12:04.880
this is a sidebar conversation but um it's interesting are your concerns mostly around the prospect of
01:12:13.920
unaligned agi or are they the more piecemeal and nearer term and actually already present concerns around
01:12:22.960
just the misuse of ai at whatever capacity it exists to essentially render societies ungovernable and
01:12:31.280
more or less guarantee you know malicious use at scale that becomes quite harmful to what degree are you focused on
01:12:38.240
one versus the other i think i want to say i'm focused on
01:12:41.520
both but also other things too so i think misalignment risk i think it's real and i think it's serious and i think we should be
01:12:48.880
working on it uh and working on it much more than we currently are i am an optimist about it though as in i think
01:12:55.520
very probably it will either turn out not to be an issue because it's just really quite easy to make
01:13:00.560
advanced ai systems that do what we want them to or we'll put in a big effort and be able to solve
01:13:07.200
the problem or we'll notice that the problem is not being solved and we'll actually just
01:13:12.400
hold back put in regulations and other controls for long enough to give us enough time to solve the
01:13:18.000
problem but there should still be more work however i think that ai will pose an enormous array of
01:13:25.200
challenges that haven't really been appreciated and the reason i think this is because i find it
01:13:32.320
increasingly plausible that ai will lead to much accelerated rates of technological progress so
01:13:39.440
imagine there's a kind of thought experiment all the technologies and intellectual developments that you
01:13:45.200
might expect to happen over the coming five centuries everything that might happen there
01:13:51.280
and now imagine all of that happens in the course of three years would we expect that to go well so
01:13:56.880
in that period of time okay we're developing new weapons of mass destruction we now have an automated
01:14:03.840
army and an automated police force so that in principle all military power could be controlled by a single
01:14:10.240
person we now have created beings that plausibly have moral status themselves what economic rights
01:14:16.720
welfare rights political rights uh should they have potentially i mean you talked about misinformation
01:14:22.640
but potentially now we have the ability to just have superhuman persuasive abilities far far better than
01:14:30.160
even teams of the best most charismatic lawyers or politicians in the most targeted ways possible and i think
01:14:39.600
more challenges too like over this period we'll probably also have new conceptual insights and
01:14:45.200
intellectual insights or you know radically changing the game board for us too and all of that might
01:14:50.800
be happening over the course of a very short period of time why now like why might it happen in such
01:14:56.000
a short period of time well the classic argument goes back to ij good in the 50s
01:15:02.000
which is once you've got the point in time when ai can build better ai you've got this tight feedback
01:15:08.080
loop because once you've built the better ai that can help you build better ai and so on and that
01:15:13.120
argument has been subject to really quite intense inquiry over the last few years building it into
01:15:19.440
leading growth models really looking at the input output curves in existing ml development for how much
01:15:26.480
of a gain you get for an increase in input and it really looks like the argument is checking out
01:15:32.240
and then that means that it's not long from the point of time that you've got your first ai that
01:15:37.520
can significantly help you with ai research to then trillions upon trillions of ai scientists that are
01:15:44.560
driving progress in all sorts of scientific domains forward and that's just a really quite dizzying
01:15:50.720
prospect i think misalignment misinformation are some of the challenges we'll have to face but it's really
01:15:57.040
just like it's like facing all of technological progress at once and doing it in an incredibly
01:16:03.520
short period of time such that i think the default outcome is not that we handle that well handling it
01:16:09.840
well or not i think we just birthed another topic for a future podcast well sure there's a lot
01:16:15.920
there's a lot to talk about there okay so finally where has all of this controversy and confusion landed for
01:16:26.160
you where does it leave you personally and in terms of what you're now doing and
01:16:33.840
what's your view optimistic pessimistic or otherwise about ea going forward yeah i mean so the collapse
01:16:43.120
was yeah i mean it was extremely hard for me and there's just been no doubt at all it's the hardest
01:16:51.120
year and a half now of my life both you know so many reasons just the horror of the harms that were
01:16:57.200
caused um the incredible damage it did to organizations and people um that i loved and so i found that just
01:17:05.600
yeah very tough very tough to deal with um and i was you know in really quite a dark place like for the
01:17:11.520
first time in my life actually i had this chunk of time where i kind of lost the feeling
01:17:17.680
of moral motivation i like didn't really know if i could keep going so i did actually even think
01:17:21.920
about just stepping back like really just giving up on ea as a project in my life because it just
01:17:27.920
it felt kind of tainted and that was weird i mean that was weird not having that motivation
01:17:32.160
what would have produced that effect is it just the public perception of ea becoming so negative is it
01:17:42.480
practically speaking uh fewer funds going into ea organizations that need those funds what was it
01:17:49.360
funds that were getting clawed back because sam bankman-fried or ftx had given those funds and now
01:17:55.120
those became you know legally challenged what was the actual impact yeah i mean the reason for
01:18:02.960
me thinking you know maybe i was just going to give up was nothing practical like that it was you know
01:18:08.800
psychological like after you know i really felt like i'd been punched or stabbed or something like
01:18:15.360
you know i felt like i'd been building this thing for 15 years and i'd really worked you know
01:18:20.720
unsustainably hard on that i was tired and it had just been blown up in one swoop and it
01:18:29.280
just been you know blown up by sam's actions and yeah it was just hard to then think okay i'm gonna get
01:18:37.440
back up off the ground and go into the shallow pond and rescue another child or something when you
01:18:44.720
say it's been blown up though so you're talking essentially about the brand damage to ea yes with
01:18:51.120
which you have been so closely associated as really one of its progenitors exactly and i was talking
01:18:57.440
about just what was going on in my mind and not about what in fact happened so we can
01:19:03.280
talk about what the hit was to ea so yeah huge brand damage i think people were definitely keen to
01:19:09.360
disassociate themselves in my own case i actually got surprisingly little in the way of being cancelled
01:19:15.680
but uh it definitely means it's harder for people especially in the aftermath to go out and advocate
01:19:22.800
but i'm now feeling much more optimistic so it was really tough like the year afterwards it was low
01:19:29.120
morale um i think it was just harder for you know people associated with ea to just do the work they
01:19:35.280
wanted to do but a few months ago there was a conference among many of the leaders of the kind
01:19:41.520
of most calling organizations and i honestly just found that really inspiring because for so many people
01:19:48.240
the principles are still there the problems in the world are still there like sam's actions do not
01:19:53.840
change in any way how serious the problems we're facing are and how important it is to take action
01:20:00.160
and how important it is to ensure that our actions have as big an impact as possible so there really
01:20:06.800
was a sense of people rolling up their sleeves and wanting to get back to it and then in terms of the
01:20:11.840
brand like i think once you go off twitter people again are like quite understanding people understand
01:20:16.800
that the actions of one person don't reflect on an entire movement and ultimately just not that
01:20:23.840
many people have heard of either ftx or effective altruism uh so there's still kind of
01:20:30.000
plenty of room to go and so i think we're starting to see things change again and you know it's just
01:20:36.400
made me reflect on the good things that the effective altruism movement is accomplishing and has continued to
01:20:42.720
accomplish so just recently givewell has now moved over two billion dollars to its top recommended
01:20:51.520
charities it's just so insane for me to think that it could be that amount compared to where i was 15
01:20:57.680
years ago it's like hundreds of thousands of people who are alive that would have otherwise been dead
01:21:02.960
like i live in oxford which has a population of 120 000 people if i imagine like a nuclear bomb went off and
01:21:09.760
killed everyone in oxford like that would be world news for months and if a kind of group of
01:21:16.160
altruistically motivated people had managed to come together and prevent that plot saving the city of
01:21:22.960
oxford that would be huge news people would write about that for decades and yet that's exactly you know
01:21:29.920
not in nearly as dramatic a way perhaps but um that's what's happened actually several times over
01:21:36.320
and so now i think yeah we're getting back to basics basically back to the core principles of effective
01:21:43.120
altruism back to the idea that you know all people have moral claims upon us we in rich countries have
01:21:51.680
an enormous power to do good and if we do just devote some of our money or some of our time through
01:21:59.120
our careers to those big problems we can make a truly enormous difference that message is as
01:22:05.040
inspiring to me as ever hmm yeah well i certainly could say the same for myself as i've described
01:22:12.720
i've always viewed myself to be on the periphery of the movement but uh increasingly committed to
01:22:20.080
its principles and um you know those principles have been learned directly from you more than from
01:22:27.040
any other source i think as i said in a blurb to your um your most recent book i wrote that no living
01:22:35.040
philosopher has had a greater impact upon my ethics than will macaskill and much of the good i now do
01:22:40.320
in particular by systematically giving money to effective charities is the direct result of his
01:22:44.960
influence that remains true and i remain immensely grateful for that influence and so i just
01:22:52.480
you should not read into the noise as a result of what sam bankman-fried did any lesson that um detracts
01:23:00.960
from you know all the good you have done that many of us recognize you have done and
01:23:06.960
all the good you have inspired other people to do it's extraordinarily rare to see
01:23:13.280
abstruse philosophical reasoning result in so much tangible good and such an obvious increase in
01:23:21.280
well-being in those whose lives have been touched by it so you should keep your chin up and just
01:23:29.040
get back to your all-too-necessary work it really remains inspiring and you know while you may no
01:23:34.800
longer be the youngest philosopher i talked to uh you still are nearly so so uh just keep
01:23:42.080
going that's my advice well yeah thank you so much sam that really yeah it really means a lot that you feel
01:23:47.520
like that um especially because yeah your advocacy has just it's really been unreal in terms of the
01:23:53.520
impact it's had so i was saying that giving what we can has over 9 000 members now over 1 000 of them
01:24:00.160
cite you cite this podcast as why they have taken the pledge that's over 300 million dollars
01:24:08.240
of pledged donations and so i guess it just goes to show that yeah the listeners of this podcast you
01:24:15.200
know they just are people who are taking good ideas seriously and not you know you might think
01:24:21.600
people who listen are just interested in the ideas just for their own sake you know they find them
01:24:27.200
intellectually engaging but no actually people are just willing to put those ideas into practice and
01:24:33.920
do something like take the giving what we can pledge and that's just yeah that's amazing to see that's great
01:24:40.880
well will we have another podcast teed up let me know when our robot overlords start um making uh
01:24:48.800
increasingly ominous noises uh such that they're now unignorable and we'll have a podcast
01:24:54.320
talking about all the pressing concerns that ai has birthed because um yeah i share your uh
01:25:01.200
your fears here and um it'll be a great conversation sure i'm looking forward to it