Making Sense - Sam Harris - November 15, 2022


#303 — The Fall of Sam Bankman-Fried


Episode Stats

Length: 20 minutes
Words per Minute: 152.03
Word Count: 3,078
Sentence Count: 155
Misogynist Sentences: 2
Hate Speech Sentences: 5


Summary

Sam Bankman-Fried was one of the most successful traders in crypto. He built one of the world's largest cryptocurrency exchanges, FTX, and was held up as the most prominent example of "earning to give" in the effective altruism community. He now appears to be guilty of enormous financial malfeasance, having used customer assets to fund risky investments through his trading firm, Alameda Research, and losing something like $16 billion. In this episode of the Making Sense Podcast, I try to make sense of what happened: whether Bankman-Fried was Bernie Madoff from the beginning or someone who panicked in a financial emergency; why nothing in my earlier interview with him (episode 271) strikes me as insincere, even in hindsight; and why his fraud, however damaging, says nothing about the legitimacy of effective altruism or of consequentialism. I close by reflecting on the wider human cost of the collapse, not least for his parents.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:23.760 This is Sam Harris.
00:00:25.140 Okay, well, I want to say something about the Sam Bankman-Fried fiasco.
00:00:32.620 As many of you know, I spoke to Sam once on the podcast.
00:00:36.300 We also put that conversation on the Waking Up app.
00:00:39.460 We've since removed it from the app because it really no longer belongs there, given what happened.
00:00:45.620 But it's still on the podcast. It's episode 271.
00:00:49.420 So you can find it there if you're interested.
00:00:51.580 I'm not going to give a full account of what Bankman-Fried did to destroy his reputation and his wealth
00:00:59.580 and the wealth of many investors and customers, seemingly in record time.
00:01:03.140 Many details are still coming in.
00:01:08.720 And this is all being very well covered by the press with schadenfreude and cynicism to spare.
00:01:15.620 And that's a point I will return to.
00:01:17.580 But briefly, Bankman-Fried had made tens of billions of dollars trading cryptocurrency
00:01:25.300 and he had built one of the world's largest crypto exchanges, FTX,
00:01:32.400 along with his own investment entity called Alameda Research.
00:01:36.720 And the connection between FTX and Alameda was always unclear.
00:01:43.140 And this seems to have been, in retrospect,
00:01:45.680 something that investors and the business press should have been more interested in.
00:01:50.440 But my interest in Bankman-Fried was entirely due to his stated commitment to effective altruism.
00:01:56.660 He appeared to be the world's greatest example of what has been called earning to give in the EA community,
00:02:02.580 which is setting out to make a lot of money for the express purpose of giving most or all of it away.
00:02:08.540 And at the time I interviewed him, everything suggested that he was doing this.
00:02:13.340 In any case, what seems to have happened last week is that concerns over the financial health of FTX
00:02:20.020 and its links to Alameda Research triggered essentially a run on the bank.
00:02:26.420 FTX customers tried to withdraw their assets en masse.
00:02:30.320 And this revealed an underlying problem at FTX.
00:02:34.920 Unfortunately, Bankman-Fried appears to have used customer assets
00:02:37.860 that should have been safely stored there to fund risky investments through Alameda
00:02:43.940 in a way that seems objectively shady and unethical.
00:02:47.980 Whether or not it was also illegal, I don't think we know.
00:02:52.220 All this happened outside of U.S. jurisdiction,
00:02:55.060 as a majority of crypto trading, in fact, does.
00:02:58.580 So whatever the legality or illegality,
00:03:02.180 Bankman-Fried seems guilty of enormous financial malfeasance.
00:03:06.800 He appears to have lost something like $16 billion in customer assets.
00:03:12.340 And then there were clearly moments where his public comments
00:03:15.580 about the state of the business amounted to lies.
00:03:19.980 These were lies that seemed calculated to reassure investors and customers
00:03:23.700 that everything was fine when everything was really falling apart.
00:03:27.800 Now, I have no idea when all the shady and unethical and probably illegal behavior started.
00:03:34.420 Was he Bernie Madoff from the beginning?
00:03:37.740 Or did he just panic in a financial emergency?
00:03:41.020 Thinking that he could use customer funds just this one time,
00:03:44.260 and then everything would be okay again.
00:03:46.420 The truth is, I don't even know whether Bernie Madoff was Bernie Madoff from the beginning.
00:03:51.440 I don't know whether he was a legitimate investor who was making his clients a lot of money,
00:03:54.780 and then got underwater, and then went Ponzi in an attempt to get back to dry land,
00:04:00.340 and then found that he never could.
00:04:03.160 That wouldn't be good, obviously.
00:04:05.300 But it's a very different picture than of a man who was an evil liar from the very beginning.
00:04:11.860 Just a pure sociopath who knew that he would be bankrupting vulnerable people every step along the way.
00:04:19.700 Perhaps we know what's true about Madoff.
00:04:22.260 I just haven't read very deeply about his case.
00:04:24.780 But the point is, I have no idea who Sam Bankman-Fried was, or is, really.
00:04:32.840 And this morning, I went back and listened to the interview I did with him.
00:04:36.320 And even with the benefit of hindsight, I don't detect anything about him that strikes me as insincere.
00:04:44.700 I don't have any retrospective spidey sense that makes that conversation appear suddenly in a new light.
00:04:51.600 Now, maybe I'm just a bad judge of character.
00:04:55.880 That is totally possible.
00:04:58.380 I've had people much closer to me than Sam Bankman-Fried, who I've never met and only spoken to twice, once being on that podcast.
00:05:06.740 I've had people much closer to me, people I've worked with, and people who are actual friends, behave in very strange ways that have totally surprised me.
00:05:17.380 And some of these people have large public platforms, and have done things in recent years that I consider quite harmful.
00:05:24.660 I've commented about some of them, and I've held my tongue about others.
00:05:29.060 And frankly, I'm still uncertain about the ethics here.
00:05:33.760 What sort of loyalty does one owe a friend, or a former friend, when that person is creating great harm out in the world?
00:05:42.380 Especially when you yourself have a public platform from which to comment on that harm, and are even being asked to comment.
00:05:51.500 I mean, is it appropriate to treat them differently than one would treat a stranger who is creating the same sort of harm?
00:05:57.240 I don't know.
00:05:59.900 You can probably guess some of the names here.
00:06:02.740 But COVID alone has caused several of my friends, and former friends, to say and do some spectacularly stupid things.
00:06:13.900 And I haven't known what to do about that.
00:06:16.320 However, my point is that I've been very surprised by much of this behavior.
00:06:20.920 So perhaps I am a bad judge of character.
00:06:22.880 However, in the case of Sam Bankman-Fried, I had no prior exposure to him.
00:06:30.200 That podcast conversation was the first time I ever spoke to him.
00:06:34.680 He was simply someone who had been ripped from the pages of Forbes magazine,
00:06:38.540 and promoted by the most prominent people in the effective altruist community.
00:06:42.240 And I had no reason to suspect that he was doing anything shady behind closed doors.
00:06:47.020 And it appears that many of the executives at FTX didn't know what he was doing.
00:06:52.560 And the venture capitalists, like Sequoia, and Lightspeed, and SoftBank, and Tiger Global, and BlackRock,
00:07:00.600 who gave him $2 billion, clearly didn't know what he was doing.
00:07:06.180 And again, when I listen to my interview with him now,
00:07:08.920 I don't detect anything that should have been a red flag to me.
00:07:13.240 Of course, none of this diminishes the harm that Bankman-Fried has caused.
00:07:19.300 As far as I can tell, the accounting is still pretty murky,
00:07:22.240 so it's not clear where the money actually went.
00:07:24.920 But, as I said, he appears to have lost something like $16 billion of customer funds.
00:07:31.460 That's a lot of money.
00:11:33.220 And it's quite possible that some people who listened to my podcast with him
00:07:36.440 could have been so impressed by him and his story
00:07:39.600 that they invested in or through FTX.
00:07:43.240 I would certainly be unhappy to learn that that had happened.
00:07:46.920 And I would deeply regret any role that my podcast played there.
00:07:51.100 But again, I just listened to the conversation this morning,
00:07:54.920 and I still don't hear anything that should have caused me to worry
00:07:57.660 that Sam Bankman-Fried or FTX was not what they seemed.
00:08:01.140 Now, beyond the immediate financial harm he caused,
00:08:06.220 Bankman-Fried has done great harm to the reputation of effective altruism.
00:08:11.120 The revelation that the biggest donor in the EA universe was not what he seemed
00:08:16.760 has produced bomb bursts of cynicism throughout tech and journalism.
00:08:23.780 All of them quite understandable, but also quite unwarranted.
00:08:28.600 First, let me be clear about my own relationship to effective altruism.
00:08:33.740 I've said this before, but I view EA as very much a work in progress,
00:08:38.980 and I have never been identified with the movement or the community,
00:08:43.400 and I've only interacted with a few people in it,
00:08:46.660 mostly by having them on my podcast.
00:08:49.000 And as I said to Will MacAskill,
00:08:50.920 the movement has always seemed somewhat cultic and too online for me to fully endorse.
00:08:57.080 Some of its precepts seem a little dogmatic to me.
00:09:01.920 What I've taken from effective altruism has really been quite simple
00:09:05.740 and can be distilled to two points.
00:09:09.020 The first is that some ways of helping to reduce suffering
00:09:11.760 are far more effective than others,
00:09:15.320 and we should care about those differences.
00:09:18.100 For instance, it's quite possible
00:09:19.740 that trying to solve a problem by one method
00:09:22.520 will be ten or even a hundred times more effective
00:09:25.600 than trying to solve it by another.
00:09:28.480 And in fact, some ways of trying to solve a problem
00:09:30.420 will only make the problem worse.
00:09:33.300 Now, this is such an obvious point
00:09:35.380 that it seems insane
00:09:37.080 that you would need a movement
00:09:38.560 to get people to understand it.
00:09:40.940 But prior to EA,
00:09:43.820 most philanthropy seemed to be basically blind
00:09:47.360 to any rational metrics of success.
00:09:51.200 In fact, many charities are governed by perverse incentives.
00:09:55.020 They can't afford to solve the problem
00:09:57.280 they're ostensibly committed to solving
00:09:58.880 because then they would go out of business.
00:10:02.000 Effective altruism, at least in principle,
00:10:05.220 represents a clear-eyed view
00:10:06.960 of what it takes to do good in the world
00:10:08.980 and to prioritize the most effective ways
00:10:11.580 of doing that good.
00:10:13.420 The second principle,
00:10:14.980 which more or less follows directly from the first,
00:10:18.020 is that there's a difference
00:10:19.200 between causes that make us feel good,
00:10:22.660 that are sexy and subjectively rewarding to support,
00:10:25.920 and those that reduce suffering and death most effectively.
00:10:30.200 As I've discussed many times on the podcast,
00:10:32.720 we are easily moved by a compelling story,
00:10:36.480 particularly one with a single, sympathetic,
00:10:38.980 protagonist,
00:10:39.940 and we tend not to be moved at all by statistics.
00:10:44.140 So if you tell me that one little girl fell down a well,
00:10:47.480 and I can do something to save her right now,
00:10:50.640 well, I'm going to do more or less whatever it takes,
00:10:53.780 especially if that girl lives just a few blocks away from my home.
00:10:57.420 But if you tell me that 10,000 little girls
00:11:00.760 fall down wells every year,
00:11:03.560 and most of them live in Tanzania,
00:11:05.520 I'm not sure what I'm going to do,
00:11:07.060 but it's not going to sweep aside all my other priorities for the day.
00:11:11.640 I might just go get some frozen yogurt in an hour.
00:11:14.860 And if you do convince me to support a charity
00:11:17.160 that is working in Tanzania to save these little girls,
00:11:20.280 however big a check I write,
00:11:22.440 the act of writing it
00:11:23.840 is not going to be as subjectively rewarding
00:11:26.480 as my helping my neighbors save one little girl.
00:11:30.500 It might be 10,000 times more effective,
00:11:33.940 but it will be 10,000 times less rewarding.
00:11:38.340 This is a bug,
00:11:39.460 not a feature of our moral psychology.
00:11:42.520 We are just not built
00:11:43.980 to emotionally respond to data.
00:11:47.740 And yet the data really do indicate
00:11:50.080 the relative magnitude of human suffering.
00:11:53.080 So on this point,
00:11:54.600 effective altruism has convinced me
00:11:56.440 to give in a way that is unsentimental,
00:12:00.780 in that it can be divorced
00:12:01.960 from the good feelings I want to get
00:12:04.180 from supporting causes I'm emotionally connected to.
00:12:07.220 It's caused me to commit in advance
00:12:09.640 to give a certain amount of money away every year.
00:12:12.780 In my case,
00:12:13.460 a minimum of 10% of my pre-tax income.
00:12:16.500 And in the case of Waking Up,
00:12:17.700 a minimum of 10% of our profits.
00:12:20.100 And to give this money
00:12:20.980 to the most effective charities we can find.
00:12:23.780 As I've said before,
00:12:25.860 I even said this in the intro to the episode
00:12:27.580 with Sam Bankman-Fried,
00:12:29.360 committing in advance
00:12:30.720 to giving a certain percentage away
00:12:32.480 to the most effective charities,
00:12:34.060 and then giving whatever I want
00:12:35.200 to less effective causes,
00:12:36.660 to which I might be more emotionally attached.
00:12:39.580 This has really transformed
00:12:41.500 my ethical life for the better.
00:12:44.220 Because I now experience giving money
00:12:45.840 to a children's hospital,
00:12:47.680 or to some person's GoFundMe page,
00:12:50.280 more or less like a guilty pleasure.
00:12:52.620 It's like I'm splurging on myself.
00:12:56.340 That money could be going
00:12:57.640 to the most effective charities.
00:12:59.460 But selfish bastard that I am,
00:13:01.800 I'm just helping a single individual
00:13:03.620 who may have been the victim
00:13:05.120 of an acid attack in Pakistan.
00:13:07.820 Right?
00:13:09.120 It's hard to capture the psychology of this
00:13:11.200 until you experience it.
00:13:13.160 But it has transformed my thinking
00:13:14.520 about many things.
00:13:16.080 About wealth,
00:13:17.660 and charity,
00:13:18.560 and compassion.
00:13:19.220 And exactly none of this
00:13:23.620 is put in jeopardy
00:13:25.400 by the discovery that Sam Bankman-Fried
00:13:27.680 was guilty of some terrible
00:13:29.580 investment fraud.
00:13:31.620 Again, I'm still not sure
00:13:32.740 what happened with him.
00:13:34.060 Whether he was exactly what he seemed,
00:13:36.280 but then freaked out
00:13:37.280 in the midst of an emergency
00:13:38.280 and started lying and stealing
00:13:39.920 in an attempt to get out of it.
00:13:41.560 Or whether he's a pure con man.
00:13:44.420 But neither of these possibilities
00:13:46.300 has any implication
00:13:48.340 for what I've taken
00:13:49.920 from effective altruism.
00:13:52.040 And the same would be true
00:13:53.040 if there were unhappy revelations
00:13:54.380 yet to come from other prominent people
00:13:56.060 in the EA community.
00:13:57.720 This would be very depressing,
00:13:59.400 but it would have no effect
00:14:00.920 on how my thinking about ethics
00:14:02.960 has changed for the better.
00:14:05.360 Here's an analogy
00:14:06.260 that might make this a little clearer.
00:14:08.380 Imagine a very prominent scientist
00:14:10.500 or a group of scientists
00:14:12.340 is found to have faked their data.
00:14:15.500 This happens from time to time.
00:14:17.740 Imagine that the scientist in question
00:14:19.200 has been so influential
00:14:21.200 and the fraud was sustained for so long
00:14:24.140 that a Nobel Prize
00:14:25.700 now has to be rescinded
00:14:27.200 and an entire department
00:14:28.840 at Harvard shuttered.
00:14:31.160 Would this say anything
00:14:32.440 about the legitimacy of science itself?
00:14:35.500 Of course not.
00:14:37.140 However, I'm now seeing
00:14:38.540 a lot of people
00:14:39.480 respond to the Bankman-Fried debacle
00:14:42.300 as though it reveals
00:14:44.000 that the very idea
00:14:45.480 of effective altruism
00:14:46.920 was always a sham.
00:14:49.640 It's as if people are concluding
00:14:50.800 that no one is ever sincere
00:14:52.600 in their attempts to help others
00:14:54.360 or that there are no better
00:14:56.100 and worse ways
00:14:56.940 of doing good in the world.
00:14:58.460 Certainly not where charity
00:14:59.520 is concerned.
00:15:00.960 It's all just virtue signaling
00:15:02.300 and reputation laundering
00:15:04.040 and sanctimony
00:15:05.760 and lies.
00:15:07.160 There appear to be
00:15:08.980 many very selfish people
00:15:10.920 who didn't much like
00:15:12.380 hearing about someone
00:15:13.180 giving most or all
00:15:14.280 of his wealth away
00:15:15.060 who now feel
00:15:16.420 totally vindicated
00:15:17.940 in their selfishness.
00:15:20.180 They seem to be thinking
00:15:20.820 all those effective altruists
00:15:22.340 were just pretending
00:15:23.660 to be better than me
00:15:25.160 and they're not.
00:15:26.600 They're actually worse
00:15:27.760 because they can't be honest
00:15:29.820 about their selfishness
00:15:31.440 and hypocrisy.
00:15:33.060 Over here in the real world,
00:15:34.500 we're just all in it
00:15:35.540 for ourselves
00:15:36.180 and that's fine
00:15:37.560 because that's all
00:15:38.680 that's possible.
00:15:40.360 Anyway,
00:15:41.280 this is all just morbid
00:15:43.240 and obtuse.
00:15:45.540 Just as you can know
00:15:46.940 that science
00:15:47.560 is a legitimate enterprise
00:15:49.060 because you can actually
00:15:50.660 do science,
00:15:52.440 you can know
00:15:53.380 that effective altruism
00:15:54.960 is legitimate
00:15:55.920 because you can actually
00:15:57.760 do it
00:15:58.360 for the right reasons.
00:16:00.060 As damaging as fraud
00:16:02.600 can be
00:16:03.180 in both domains,
00:16:05.080 no case of fraud
00:16:06.460 can put either domain
00:16:08.780 in question.
00:16:10.540 Fraud has no logical
00:16:11.880 relationship
00:16:12.540 to either enterprise.
00:16:15.360 Scientific frauds
00:16:16.520 are not science.
00:16:18.200 They're frauds
00:16:19.000 and the corrective to them
00:16:20.900 is real science.
00:16:23.500 Ethical frauds
00:16:24.780 are not ethics.
00:16:26.820 They're frauds.
00:16:27.740 The corrective to them
00:16:29.720 is real ethical behavior.
00:16:32.780 The other crazy thing
00:16:34.020 I'm seeing
00:16:34.480 is that people
00:16:35.240 are linking
00:16:35.780 Bankman-Fried's behavior
00:16:37.180 to the ethical philosophy
00:16:38.720 of utilitarianism
00:16:40.680 as though a belief
00:16:42.300 in this philosophy
00:16:43.120 was bound
00:16:44.280 to produce
00:16:45.400 such behavior.
00:16:46.880 And everything
00:16:47.600 I've heard said
00:16:48.620 or seen written
00:16:49.640 about this
00:16:50.280 is pretty confused.
00:16:52.680 I'm sure I'll talk
00:16:53.320 more about this
00:16:53.980 on future podcasts
00:16:55.320 because it's important
00:16:56.600 to get straight.
00:16:57.740 But the short point
00:16:58.780 I'll make here
00:16:59.520 is that,
00:16:59.940 in my view,
00:17:00.900 the claim
00:17:01.540 that utilitarianism,
00:17:02.800 or more properly
00:17:04.080 consequentialism,
00:17:05.260 is bad
00:17:06.560 amounts to a claim
00:17:07.960 that it has bad
00:17:08.660 consequences
00:17:09.360 of some sort.
00:17:11.200 But of course
00:17:11.680 if these consequences
00:17:12.620 are worth
00:17:13.300 caring about
00:17:14.120 they can be included
00:17:15.660 in any fuller picture
00:17:17.020 of consequentialism.
00:17:18.900 My point,
00:17:19.720 and I made this
00:17:20.320 at length
00:17:21.140 in my book
00:17:21.720 The Moral Landscape,
00:17:22.740 which remains
00:17:23.560 the most misunderstood
00:17:24.720 book I've ever written,
00:17:26.040 my point
00:17:27.140 is that
00:17:27.500 everyone really
00:17:28.520 is some form
00:17:29.640 of consequentialist
00:17:31.060 even if they think
00:17:32.060 they're not.
00:17:33.100 You just have to
00:17:33.780 listen to them
00:17:34.420 talk long enough
00:17:35.560 and they start
00:17:36.480 telling you
00:17:36.860 about the consequences
00:17:37.740 they really care about.
00:17:39.840 Deontology,
00:17:40.440 for instance,
00:17:41.180 a version like
00:17:42.280 Kant's
00:17:42.780 categorical imperative,
00:17:43.840 or virtue ethics,
00:17:45.840 right?
00:17:46.180 I think these
00:17:46.800 positions
00:17:47.540 entail
00:17:48.820 covert claims
00:17:50.080 about the consequences
00:17:51.520 of certain ways
00:17:52.900 of thinking
00:17:53.400 or acting
00:17:54.420 or being
00:17:55.420 in the world.
00:17:57.120 Anyway, I'll speak
00:17:57.720 more about that
00:17:58.400 at some point,
00:17:59.020 but suffice it to say
00:18:01.180 that if Sam Bankman-
00:18:02.700 Fried
00:18:03.240 stole his
00:18:04.400 customers' money
00:18:05.160 and gambled it away,
00:18:06.860 whatever rationale
00:18:08.160 he may have had
00:18:08.740 in his head,
00:18:09.300 it would have been
00:18:10.340 very easy to argue
00:18:11.800 that this was wrong
00:18:13.340 from the point of view
00:18:14.980 of consequentialism.
00:18:16.620 Just look at the
00:18:17.660 consequences:
00:18:18.260 they're disastrous.
00:18:20.120 They're disastrous
00:18:21.740 for him,
00:18:22.380 and for his
00:18:23.840 investors
00:18:24.340 and customers,
00:18:25.720 and for anyone
00:18:27.300 in his life
00:18:28.140 who cares about him.
00:18:30.400 I mean,
00:18:30.600 just think about
00:18:31.820 the consequences
00:18:32.680 for his parents.
00:18:35.300 His parents are both
00:18:36.260 Stanford law professors.
00:18:37.580 Now, assuming that
00:18:39.160 neither had any idea
00:18:40.240 what he was up to,
00:18:41.020 and so they're not
00:18:41.920 at all culpable
00:18:42.660 for what happened
00:18:43.560 (and who knows,
00:18:44.720 they might actually
00:18:45.600 have been involved;
00:18:46.520 I know nothing
00:18:47.280 about them,
00:18:47.740 but I suspect not),
00:18:49.420 imagine
00:18:50.840 how their lives
00:18:52.300 changed
00:18:52.780 last week.
00:18:53.740 They have spent
00:18:55.060 the last decade
00:18:55.960 being congratulated,
00:18:57.500 I suspect
00:18:58.700 endlessly,
00:18:59.480 for having produced
00:19:01.180 such a marvelous
00:19:02.740 and marvelously
00:19:04.000 successful
00:19:04.840 son.
00:19:05.560 Can you imagine?
00:19:07.420 He got a degree
00:19:08.940 in physics
00:19:09.560 from MIT,
00:19:10.280 and then went on
00:19:11.720 to make
00:19:12.120 30 billion dollars
00:19:13.760 for the purpose
00:19:14.840 of giving it
00:19:15.600 all away
00:19:16.260 to charity,
00:19:16.880 and then expanded
00:19:18.480 his influence
00:19:19.340 into politics
00:19:20.320 and among celebrities
00:19:21.680 so that he could
00:19:22.900 do even more good.
00:19:24.060 Ultimately,
00:19:24.800 he's hanging
00:19:26.040 with Bill Clinton
00:19:26.920 and Tony Blair
00:19:27.920 and Tom Brady
00:19:28.900 and Gisele.
00:19:29.600 But rather than
00:19:30.980 being just a star fucker
00:19:32.320 or an ordinary rich guy,
00:19:34.000 he is laser focused
00:19:35.760 on making money
00:19:36.640 to solve the world's
00:19:37.740 most pressing problems.
00:19:39.020 Can you imagine
00:19:41.100 how proud
00:19:42.660 his parents were?
00:19:43.520 And now their son
00:19:45.420 is the next
00:19:46.020 Elizabeth Holmes,
00:19:46.900 or worse.
00:19:48.340 I mean, you just have
00:19:49.760 to consider the lives
00:19:50.760 of these two people,
00:19:51.940 and this is a fucking
00:19:53.580 Greek tragedy.
00:19:54.580 But of course
00:19:56.440 the harm has spread
00:19:57.620 much further
00:19:58.680 than that.
00:19:59.460 Anyway,
00:20:01.040 those are my thoughts
00:20:01.960 at the moment.
00:20:02.680 I will definitely
00:20:04.140 speak more about
00:20:04.800 these issues
00:20:05.320 on future podcasts,
00:20:06.460 and I'd be happy
00:20:08.260 to hear suggestions
00:20:08.960 for anyone
00:20:09.560 who I might speak with
00:20:10.500 who could take the conversation
00:20:11.960 in new directions.
00:20:13.000 Thanks for listening.