Making Sense - Sam Harris - August 14, 2022


#292 — How Much Does the Future Matter?


Episode Stats

Length

2 hours

Words per Minute

158.71057

Word Count

19,167

Sentence Count

53

Misogynist Sentences

6

Hate Speech Sentences

14


Summary

Trump's Mar-a-Lago home is raided by the FBI, and Salman Rushdie is attacked on stage by a crazed lunatic. And yet again, I am accused of being soft on Islam and of pandering to Islamic extremists. What's the problem with this? Is it racism or xenophobia? Or is it Islamophobia? And why do we care so much about what other people think about it? And what does it have to do with freedom of speech and freedom of the press? And who is the real culprit here? All that and more on today's episode of the Making Sense Podcast with Sam Harris. Subscribe to Making Sense for immediate access to all the latest episodes on your favorite podcast platform. Learn more about your ad choices.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:23.700 This is Sam Harris.
00:00:26.860 Well, it's been a big week for news.
00:00:30.000 I think I'm going to wait for a few more shoes to drop
00:00:32.860 before touching any of it.
00:00:37.120 Earlier in the week, Trump's home at Mar-a-Lago.
00:00:42.920 Is that really his home?
00:00:45.100 That golf club that looks like a dictator's palace
00:00:48.520 and what he might call a shithole country?
00:00:51.180 It was, quote, raided by the FBI
00:00:56.440 in search of misappropriated documents
00:01:01.280 that may or may not contain secrets about nuclear weapons
00:01:05.800 or nuclear policy or something nuclear.
00:01:10.140 Again, I think I'll wait for more information to surface
00:01:13.380 before I comment on this, but
00:01:15.060 needless to say, it would not surprise me
00:01:18.640 if Trump had done something terrifyingly inept
00:01:23.000 and self-serving and even nefarious.
00:01:27.300 So, let's just wait and see what happens there.
00:01:31.060 I share many people's concern
00:01:33.240 that the execution of this search warrant
00:01:35.560 was inflammatory enough
00:01:37.860 that if the facts, in the end,
00:01:41.060 don't seem to merit it,
00:01:42.240 it will be yet another thing
00:01:44.700 that bolsters Trump's political stature
00:01:47.840 in the personality cult
00:01:50.180 that has subsumed the Republican Party.
00:01:53.660 And that would be a bad thing.
00:01:55.800 Once again, this is an asymmetrical war of ideas
00:01:58.780 and any misstep that seems to suggest
00:02:04.120 that our institutions have become politicized,
00:02:07.000 in this case, the FBI,
00:02:10.800 run, we should note, by a Trump appointee,
00:02:14.860 Christopher Wray,
00:02:16.040 not the fittest object for allegations
00:02:18.540 of anti-Trump partisanship, really.
00:02:21.480 But any apparent bias here
00:02:25.500 makes any further discussion of facts
00:02:28.840 more or less impossible.
00:02:31.400 So anyway, let's wait for the dust to settle.
00:02:34.100 Well, let's see what was actually in those boxes.
00:02:38.740 And maybe there'll be more to say.
00:02:42.820 And then yesterday,
00:02:45.840 the author Salman Rushdie
00:02:47.100 was attacked on stage
00:02:49.480 by a man wielding a knife
00:02:52.700 under conditions where
00:02:54.680 he obviously did not have enough security.
00:02:58.040 And it sounds like he is on life support now.
00:03:01.980 So that obviously is terrible.
00:03:04.640 And I share everyone's horror on his account.
00:03:09.060 I said as much on Twitter,
00:03:11.380 not so many hours ago,
00:03:12.840 where I wrote,
00:03:14.240 like many of you,
00:03:15.080 I'm thinking about Salman Rushdie now.
00:03:17.640 The threat he's lived under for so long,
00:03:19.920 which was so horrifically realized today,
00:03:22.200 was the product not merely of the hatred
00:03:24.180 and zeal of religious fanatics,
00:03:26.100 but of the cowardice and confusion of secularists.
00:03:29.860 Everyone in arts and letters
00:03:31.080 should have stood shoulder to shoulder
00:03:32.500 with Salman in 1989,
00:03:34.840 thereby distributing the risk.
00:03:36.900 And the fact that so few did
00:03:38.240 is a moral scandal
00:03:39.240 that still casts its shadow over the present.
00:03:41.360 And now I notice in my
00:03:44.360 at mentions that I am being attacked
00:03:47.260 by many people
00:03:49.840 for being soft on Islam
00:03:53.000 and for using the phrase
00:03:54.860 religious fanatics
00:03:56.400 out of an apparent unwillingness
00:03:58.980 to name Islam
00:04:00.580 and its theology
00:04:01.960 as the culpable party here.
00:04:05.640 All I can say is I'm
00:04:06.800 surprised and
00:04:08.360 somewhat amused by this.
00:04:10.000 I mean, these are people
00:04:10.720 who clearly
00:04:11.400 don't know my history.
00:04:12.620 I have been accused
00:04:14.380 of Islamophobia
00:04:15.260 even more than I've been accused
00:04:17.400 of having
00:04:17.820 Trump derangement syndrome.
00:04:19.580 It might surprise
00:04:20.360 some of you to learn.
00:04:22.220 Anyway, my conscience
00:04:23.180 on that score is
00:04:24.280 quite clear.
00:04:26.920 And I invite those of you
00:04:27.840 who doubt my bona fides
00:04:29.240 as a critic of Islam
00:04:31.160 to do some googling.
00:04:33.660 I even wrote an op-ed
00:04:34.960 with Salman
00:04:35.860 in defense of our mutual friend
00:04:38.020 Ayaan Hirsi Ali
00:04:38.960 I think in 2007.
00:04:41.560 That's on my blog
00:04:43.240 somewhere.
00:04:44.980 I believe the title is
00:04:46.320 Ayaan Hirsi Ali
00:04:47.540 Abandoned to Fanatics.
00:04:50.520 It was published
00:04:51.480 in the Los Angeles Times
00:04:52.580 but I think it's behind
00:04:53.300 a paywall there.
00:04:55.380 Anyway, I don't actually
00:04:56.480 know Salman
00:04:57.780 despite the fact that
00:04:59.420 we wrote that op-ed together.
00:05:01.180 That was accomplished
00:05:02.220 by email.
00:05:03.580 We've never met
00:05:04.760 in person
00:05:05.560 which surprises me.
00:05:08.100 We've always had
00:05:08.780 several good friends
00:05:09.760 in common.
00:05:10.740 Ayaan being one
00:05:12.500 Hitch being
00:05:14.640 another.
00:05:16.800 Salman was
00:05:17.400 one of Hitch's
00:05:18.560 best friends.
00:05:19.980 I was merely a friend.
00:05:22.160 Anyway,
00:05:23.180 thinking about him
00:05:24.160 and
00:05:25.600 I would just say
00:05:27.480 by way of spelling out
00:05:28.460 further
00:05:28.880 the point I was making
00:05:30.560 on Twitter
00:05:31.380 my use of the phrase
00:05:33.120 religious fanatics
00:05:34.240 was not at all
00:05:36.060 an instance of my
00:05:37.280 mincing words
00:05:38.380 with respect to Islam
00:05:39.920 although I can see
00:05:41.580 how someone might
00:05:42.600 have thought that
00:05:43.580 if they were unfamiliar
00:05:44.920 with my work
00:05:45.660 I was just assuming
00:05:47.300 enough familiarity
00:05:48.680 with what I have said
00:05:49.920 and written on this topic
00:05:50.880 that it simply
00:05:51.820 never occurred to me
00:05:53.120 that I could be accused
00:05:54.560 of pandering
00:05:56.040 to Islamic extremists
00:05:57.760 here
00:05:58.300 or to the confused
00:05:59.880 liberal secularists
00:06:01.380 who would support
00:06:02.420 such pandering
00:06:03.080 I just assumed
00:06:04.860 my reputation
00:06:05.780 preceded me here
00:06:06.780 anyway
00:06:08.560 the real point
00:06:09.280 I was making here
00:06:09.940 was that
00:06:10.380 the problem has
00:06:11.520 always been larger
00:06:12.980 than mere
00:06:14.360 Islamic extremism
00:06:16.200 and
00:06:17.120 its intolerance
00:06:18.880 to free speech
00:06:19.840 this extremism
00:06:21.460 has been given
00:06:22.400 an immense amount
00:06:23.820 of oxygen
00:06:24.520 by confused
00:06:25.920 and cowardly
00:06:26.860 and
00:06:27.140 in some cases
00:06:28.500 well-intentioned
00:06:30.100 people
00:06:30.500 who
00:06:31.900 worry that
00:06:33.260 any criticism
00:06:33.940 of Islam
00:06:34.700 will give energy
00:06:36.200 to racism
00:06:37.440 and xenophobia
00:06:38.600 Salman
00:06:40.340 was attacked
00:06:41.120 by such people
00:06:42.200 from the moment
00:06:43.500 the Ayatollah
00:06:44.620 pronounced
00:06:45.420 his fatwa
00:06:46.620 and the list
00:06:48.100 of people
00:06:48.640 who
00:06:49.960 rushed
00:06:51.100 to heap
00:06:52.220 shame
00:06:53.060 upon their heads
00:06:54.040 by
00:06:54.940 criticizing
00:06:56.240 a novelist
00:06:57.600 forced into hiding
00:06:59.320 out of
00:07:00.560 deference
00:07:01.120 for the
00:07:01.640 sensitivities
00:07:02.840 of religious
00:07:04.000 lunatics
00:07:04.660 is an awful
00:07:06.040 list of names
00:07:06.820 and so it is
00:07:08.160 with more recent
00:07:08.840 atrocities
00:07:09.560 like the
00:07:10.440 Charlie Hebdo
00:07:11.160 massacre
00:07:12.020 the people
00:07:13.740 who
00:07:14.180 explicitly
00:07:14.920 or implicitly
00:07:16.060 took the side
00:07:17.180 of the
00:07:17.460 jihadist
00:07:18.140 murderers
00:07:18.880 in criticizing
00:07:20.020 the butchered
00:07:20.740 cartoonists
00:07:21.600 for their
00:07:22.100 alleged
00:07:22.500 insensitivity
00:07:23.380 these are people
00:07:24.560 who are still
00:07:25.020 in good standing
00:07:25.840 in many cases
00:07:27.360 and I'm not aware
00:07:28.900 that any
00:07:29.440 have apologized
00:07:30.620 for having
00:07:31.460 got those
00:07:32.000 moments of
00:07:32.520 moral and
00:07:33.180 political
00:07:33.520 emergency
00:07:34.080 so disastrously
00:07:35.760 wrong
00:07:36.220 I'm doing my
00:07:37.520 best not to
00:07:38.060 name names
00:07:38.580 here
00:07:38.840 but anyone
00:07:40.060 interested
00:07:40.480 can see
00:07:41.520 who took
00:07:42.360 the side
00:07:42.820 of religious
00:07:43.980 barbarians
00:07:44.740 there
00:07:45.560 and there have
00:07:47.140 been so many
00:07:47.660 moments
00:07:48.100 like that
00:07:49.120 since the
00:07:49.940 fatwa
00:07:50.360 Salman was
00:07:51.160 first
00:07:51.600 pronounced
00:07:52.240 and many
00:07:53.980 of you
00:07:54.320 who are
00:07:54.780 living by
00:07:56.100 podcast
00:07:56.580 and
00:07:57.060 sub-stack
00:07:57.580 newsletter
00:07:58.080 might
00:07:59.100 take a look
00:08:00.340 because some
00:08:01.660 of your
00:08:01.880 modern
00:08:02.320 free speech
00:08:03.140 heroes
00:08:03.520 were people
00:08:04.560 who thought
00:08:05.600 that just
00:08:06.060 maybe those
00:08:06.640 cartoonists
00:08:07.420 had it coming
00:08:08.460 to them
00:08:08.820 anyway
00:08:10.440 this is an
00:08:11.040 awful thing
00:08:11.740 I hope
00:08:13.200 Salman recovers
00:08:14.100 and
00:08:15.780 whatever
00:08:17.460 happens there
00:08:18.240 if there's more
00:08:19.200 to say
00:08:19.600 I'll do a
00:08:20.340 podcast
00:08:20.840 on it
00:08:21.440 okay
00:08:24.140 today I'm
00:08:26.020 speaking with
00:08:26.620 Will MacAskill
00:08:27.440 Will is an
00:08:29.220 associate professor
00:08:29.920 in philosophy
00:08:30.540 at the University
00:08:31.260 of Oxford
00:08:31.780 and a
00:08:33.340 TED speaker
00:08:34.140 a past
00:08:35.920 Forbes
00:08:36.600 30 under
00:08:37.240 30 social
00:08:38.260 entrepreneur
00:08:38.720 he also
00:08:40.220 co-founded the
00:08:41.020 Center for
00:08:41.380 Effective Altruism
00:08:42.260 which has raised
00:08:43.660 over a billion
00:08:44.280 dollars for
00:08:45.040 charities
00:08:45.500 and he's the
00:08:47.260 author most
00:08:47.860 recently of a
00:08:48.680 wonderful book
00:08:49.380 titled
00:08:50.500 What We
00:08:51.100 Owe the
00:08:51.400 Future
00:08:51.720 and that
00:08:53.140 is in part
00:08:53.700 the topic
00:08:54.640 of today's
00:08:55.440 discussion
00:08:55.960 I'll just
00:08:57.560 read you the
00:08:57.960 blurb I
00:08:58.760 wrote for
00:08:59.060 that book
00:08:59.500 because it'll
00:09:00.600 give you a
00:09:00.920 sense of how
00:09:01.280 important I
00:09:01.800 think Will's
00:09:02.660 work is
00:09:03.420 no living
00:09:05.420 philosopher has
00:09:06.240 had a greater
00:09:06.700 impact upon
00:09:07.460 my ethics
00:09:08.120 than Will
00:09:08.700 MacAskill
00:09:09.220 in What We
00:09:10.600 Owe the
00:09:10.840 Future
00:09:11.960 MacAskill has
00:09:11.960 transformed my
00:09:12.800 thinking once
00:09:13.400 again by
00:09:14.660 patiently
00:09:15.080 dismantling the
00:09:16.020 lazy intuitions
00:09:17.060 that rendered
00:09:17.840 me morally
00:09:18.440 blind to the
00:09:19.900 interests of
00:09:20.420 future generations
00:09:21.220 this is an
00:09:22.340 altogether thrilling
00:09:23.240 and necessary
00:09:23.980 book and that
00:09:26.100 pretty much says
00:09:26.980 it the book is
00:09:28.740 also blurbed by
00:09:29.980 several friends
00:09:31.640 Paul Bloom
00:09:32.460 Tim Urban
00:09:33.340 Stephen Fry
00:09:34.640 as you'll hear
00:09:35.940 Will has been
00:09:36.440 getting a lot of
00:09:36.960 press recently
00:09:38.040 there was a
00:09:39.340 New Yorker
00:09:39.860 profile that
00:09:41.360 came out a
00:09:41.780 couple of days
00:09:42.260 ago also the
00:09:43.700 cover of Time
00:09:44.220 Magazine was
00:09:45.140 devoted to him
00:09:45.900 and the
00:09:46.580 effective altruism
00:09:47.260 movement
00:09:48.440 and all of
00:09:49.100 this is great
00:09:49.580 to see
00:09:49.960 so Will and
00:09:51.620 I get into
00:09:52.020 the book
00:09:52.380 we talk about
00:09:53.560 effective altruism
00:09:54.540 in general
00:09:55.040 and its
00:09:55.800 emphasis on
00:09:56.360 long-termism
00:09:57.180 and existential
00:09:58.480 risk
00:09:59.000 we talk about
00:10:00.340 criticisms of
00:10:01.320 the philosophy
00:10:02.280 and some of the
00:10:04.260 problems with
00:10:04.880 expected value
00:10:05.780 reasoning
00:10:06.240 we talk about
00:10:08.120 the difference
00:10:08.520 between doing
00:10:09.180 good and
00:10:09.660 feeling good
00:10:10.360 why it's so
00:10:11.620 hard to care
00:10:12.200 about future
00:10:12.820 people
00:10:13.180 how the
00:10:14.540 future gives
00:10:15.040 meaning to
00:10:15.520 the present
00:10:16.020 why this
00:10:17.260 moment in
00:10:17.740 history is
00:10:18.260 unusual
00:10:18.700 which relates
00:10:19.940 to the pace
00:10:20.560 of economic
00:10:21.220 and technological
00:10:22.080 growth
00:10:22.580 we discuss
00:10:24.780 bad political
00:10:25.620 incentives
00:10:26.240 value lock-in
00:10:28.040 the well-being
00:10:29.200 of conscious
00:10:29.680 creatures as a
00:10:30.400 foundation for
00:10:31.140 ethics
00:10:31.500 the risk of
00:10:32.820 unaligned artificial
00:10:33.960 intelligence
00:10:34.660 how we're bad
00:10:36.040 at predicting
00:10:36.620 technological change
00:10:38.060 and many other
00:10:39.520 topics
00:10:39.960 no paywall
00:10:41.420 for this one
00:10:41.980 as I think
00:10:42.820 it's so important
00:10:43.460 as always
00:10:44.460 if you want to
00:10:44.880 support what
00:10:45.300 I'm doing here
00:10:45.900 you can subscribe
00:10:47.380 to the podcast
00:10:48.260 at samharris.org
00:10:49.840 and now I bring
00:10:51.780 you Will MacAskill
00:10:52.980 I am back
00:11:00.180 once again
00:11:00.660 with the great
00:11:01.120 Will MacAskill
00:11:01.920 Will thanks for
00:11:03.080 joining me again
00:11:03.600 thanks so much
00:11:04.640 it's great to be
00:11:05.120 back on
00:11:05.540 so you are
00:11:07.000 having a bit
00:11:08.260 of a wild ride
00:11:09.000 you have a new
00:11:09.920 book
00:11:10.260 and that book
00:11:11.360 is What We
00:11:11.940 Owe the Future
00:11:12.580 and I got an
00:11:14.040 early
00:11:14.420 very early
00:11:15.720 look at that
00:11:16.380 and it's really
00:11:17.800 wonderful
00:11:18.380 and it's
00:11:19.080 I don't think
00:11:20.260 we'll cover
00:11:20.720 all of it here
00:11:22.100 but we'll
00:11:22.460 definitely get
00:11:23.120 into it
00:11:23.540 but before we
00:11:24.520 do that
00:11:25.100 I want to
00:11:26.220 talk about
00:11:27.140 effective
00:11:28.280 altruism
00:11:28.980 more generally
00:11:30.080 and in
00:11:31.340 particular
00:11:31.600 talk about
00:11:32.180 some of the
00:11:32.960 pushback
00:11:34.060 I've noticed
00:11:35.280 of late
00:11:36.120 against it
00:11:36.980 and some
00:11:37.940 of this
00:11:38.200 echoes
00:11:38.620 reservations
00:11:40.020 I've had
00:11:40.760 with the
00:11:41.140 movement
00:11:41.460 and some
00:11:42.080 of which
00:11:42.340 we've
00:11:42.680 discussed
00:11:43.200 and some
00:11:44.220 of it
00:11:44.500 strikes me
00:11:45.460 as frankly
00:11:46.080 crazy
00:11:46.940 so
00:11:47.920 yeah
00:11:48.700 let's just
00:11:49.320 get into
00:11:49.640 that
00:11:49.840 but
00:11:50.040 before we
00:11:51.500 do
00:11:51.620 how are you
00:11:52.200 doing
00:11:52.400 how's
00:11:52.740 life
00:11:53.240 I mean
00:11:53.800 things are
00:11:54.180 very good
00:11:54.540 overall
00:11:54.900 I definitely
00:11:56.000 feel a little
00:11:57.500 bit shell-shocked
00:11:58.460 by the wave
00:11:59.740 of attention
00:12:00.420 I'm currently
00:12:01.280 going through
00:12:01.280 and
00:12:02.060 I don't
00:12:04.340 know
00:12:04.540 I'm
00:12:04.860 alternating
00:12:05.960 between
00:12:06.620 excitement
00:12:07.360 and anxiety
00:12:08.040 but
00:12:09.040 overall
00:12:09.460 it's
00:12:09.680 very good
00:12:10.020 indeed
00:12:10.260 and I'm
00:12:10.640 happy
00:12:10.860 these ideas
00:12:11.360 are getting
00:12:11.600 more airtime
00:12:12.160 am I right
00:12:13.100 in thinking
00:12:13.580 that I
00:12:14.480 just saw
00:12:15.700 EA
00:12:16.180 and a long
00:12:17.620 interview
00:12:17.860 with you
00:12:18.280 on the
00:12:18.900 cover of
00:12:19.360 Time magazine
00:12:20.020 that's exactly
00:12:20.740 right
00:12:21.020 so that
00:12:21.360 that was
00:12:22.180 the news
00:12:22.520 of today
00:12:22.880 that came
00:12:23.320 out this
00:12:23.640 morning
00:12:23.920 and I
00:12:24.800 think it's
00:12:25.120 a beautiful
00:12:25.460 piece
00:12:25.800 by Naina
00:12:26.240 Bajekal
00:12:26.740 and yeah
00:12:27.880 I'd encourage
00:12:28.280 anyone to
00:12:28.720 read it
00:12:28.980 if they
00:12:29.200 want to
00:12:29.400 get a
00:12:29.640 sense
00:12:29.860 of
00:12:30.100 what
00:12:30.720 the
00:12:30.840 EA
00:12:31.000 movement
00:12:31.300 is
00:12:31.520 like
00:12:31.740 from
00:12:31.920 the
00:12:32.020 inside
00:12:32.300 that's
00:12:33.380 great
00:12:33.660 that's
00:12:34.040 great
00:12:34.200 well
00:12:34.340 congratulations
00:12:34.900 I'm very
00:12:35.660 happy to
00:12:36.020 see
00:12:36.260 that
00:12:36.840 attention
00:12:37.280 being
00:12:37.760 bent
00:12:38.040 your
00:12:38.200 way
00:12:38.520 and
00:12:39.120 perhaps
00:12:39.540 we
00:12:39.880 can
00:12:40.060 bend
00:12:40.340 more
00:12:40.680 of
00:12:40.820 it
00:12:40.940 here
00:12:41.260 there's
00:12:42.000 a lot
00:12:42.300 we've
00:12:42.780 already
00:12:43.260 said
00:12:43.540 about
00:12:43.740 this
00:12:43.960 but
00:12:44.140 I
00:12:44.360 think
00:12:44.580 we
00:12:44.700 have
00:12:44.840 to
00:12:44.940 assume
00:12:45.240 that
00:12:45.560 many
00:12:46.280 people
00:12:46.460 in
00:12:46.560 the
00:12:46.580 audience
00:12:46.880 have
00:12:47.060 not
00:12:47.360 heard
00:12:47.740 the
00:12:48.480 many
00:12:48.880 hours
00:12:49.320 you
00:12:49.600 and
00:12:49.720 I
00:12:49.880 have
00:12:50.100 gone
00:12:50.660 round
00:12:50.920 and
00:12:51.080 round
00:12:51.320 on
00:12:51.780 these
00:12:51.940 issues
00:12:52.200 already
00:12:52.540 so
00:12:52.860 at
00:12:53.400 the
00:12:53.540 top
00:12:53.800 here
00:12:54.080 can
00:12:55.060 you
00:12:55.200 define
00:12:55.840 a few
00:12:56.500 terms
00:12:56.920 I
00:12:57.160 think
00:12:57.320 you
00:12:57.460 should
00:12:57.600 define
00:12:58.200 effective
00:12:59.540 altruism
00:13:00.180 certainly
00:13:01.300 what you
00:13:01.720 mean
00:13:02.080 or you
00:13:02.460 think
00:13:02.700 we
00:13:03.140 should
00:13:03.360 mean
00:13:03.700 by
00:13:04.300 that
00:13:04.520 phrase
00:13:04.960 and
00:13:05.580 also
00:13:06.680 long
00:13:07.160 termism
00:13:07.880 and
00:13:08.420 existential
00:13:08.940 risk
00:13:09.540 because
00:13:09.900 those
00:13:10.080 will be
00:13:10.360 coming
00:13:10.600 up a
00:13:10.880 lot
00:13:11.020 here
00:13:11.220 sure
00:13:11.880 so
00:13:12.860 effective
00:13:13.340 altruism
00:13:13.840 is
00:13:14.760 a
00:13:15.240 philosophy
00:13:15.580 and
00:13:15.860 a
00:13:15.960 community
00:13:16.320 that's
00:13:17.460 about
00:13:17.700 ask
00:13:18.200 the
00:13:18.360 question
00:13:18.760 how
00:13:19.420 can
00:13:19.600 we
00:13:19.700 do
00:13:19.820 as
00:13:19.940 much
00:13:20.100 good
00:13:20.260 as
00:13:20.400 possible
00:13:20.780 with
00:13:21.620 our
00:13:22.040 time
00:13:22.400 and
00:13:22.600 with
00:13:22.740 our
00:13:22.860 money
00:13:23.080 and
00:13:23.720 then
00:13:23.880 taking
00:13:24.160 action
00:13:24.520 to
00:13:24.880 put
00:13:25.580 the
00:13:25.740 results
00:13:26.100 of
00:13:26.260 that
00:13:26.440 or
00:13:26.860 answers
00:13:27.100 to
00:13:27.280 that
00:13:27.420 question
00:13:27.780 into
00:13:28.040 practice
00:13:28.480 to
00:13:28.760 try
00:13:28.940 to
00:13:29.080 make
00:13:29.240 the
00:13:29.380 world
00:13:29.560 better
00:13:29.900 long
00:13:31.360 termism
00:13:31.820 is
00:13:32.480 a
00:13:32.960 philosophical
00:13:33.400 perspective
00:13:33.940 that
00:13:34.940 says
00:13:35.460 at
00:13:36.820 least
00:13:37.020 one
00:13:37.300 of
00:13:37.420 the
00:13:37.540 key
00:13:37.720 priorities
00:13:38.140 of
00:13:38.380 our
00:13:38.500 time
00:13:38.820 is
00:13:39.580 to
00:13:39.780 make
00:13:40.100 the
00:13:40.660 long
00:13:40.860 term
00:13:41.080 future
00:13:41.500 of
00:13:41.660 humanity
00:13:41.960 go
00:13:42.200 well
00:13:42.460 that
00:13:43.360 means
00:13:43.580 just
00:13:43.780 taking
00:13:44.080 seriously
00:13:44.640 just
00:13:45.080 the
00:13:45.440 sheer
00:13:45.640 scale
00:13:46.000 of
00:13:46.180 the
00:13:46.300 future
00:13:46.620 how
00:13:47.240 high
00:13:47.480 the
00:13:47.640 stakes
00:13:47.900 could
00:13:48.100 be
00:13:48.260 in
00:13:48.400 shaping
00:13:48.660 it
00:13:48.920 the
00:13:49.760 fact
00:13:50.000 that
00:13:50.200 there
00:13:50.360 really
00:13:50.580 might
00:13:50.820 be
00:13:51.160 events
00:13:51.820 that
00:13:51.980 occur
00:13:52.280 in
00:13:52.460 our
00:13:52.620 lifetimes
00:13:53.220 that
00:13:53.880 could
00:13:54.040 have
00:13:54.280 an
00:13:54.400 enormous
00:13:54.840 impact
00:13:55.500 for
00:13:56.080 good
00:13:56.260 or
00:13:56.460 ill
00:13:56.720 in
00:13:57.280 our
00:13:57.420 own
00:13:57.600 lifetimes
00:13:58.160 and
00:13:58.920 taking
00:13:59.120 seriously
00:13:59.500 that
00:13:59.720 we
00:13:59.900 can
00:14:00.060 help
00:14:00.380 steer
00:14:00.940 those
00:14:02.360 events
00:14:02.780 in a
00:14:03.080 more
00:14:03.200 positive
00:14:03.540 direction
00:14:04.060 helping
00:14:04.780 set
00:14:05.120 humanity
00:14:05.820 on a
00:14:06.100 better
00:14:06.240 path
00:14:06.660 to
00:14:07.180 improve
00:14:08.240 the
00:14:08.360 lives
00:14:08.540 of
00:14:08.640 our
00:14:08.740 grandkids
00:14:09.140 and
00:14:09.280 their
00:14:09.420 grandkids
00:14:09.860 and
00:14:10.000 their
00:14:10.160 grandkids
00:14:10.620 and
00:14:10.880 so
00:14:11.060 on
00:14:11.260 then
00:14:11.880 the
00:14:12.020 final
00:14:12.260 term
00:14:12.560 existential
00:14:13.060 risks
00:14:13.540 are
00:14:14.620 a
00:14:15.140 category
00:14:15.540 of
00:14:15.780 issues
00:14:16.180 that
00:14:17.040 I
00:14:17.820 and
00:14:17.960 others
00:14:18.260 have
00:14:18.600 increasingly
00:14:20.060 focused
00:14:20.580 on
00:14:20.820 or
00:14:20.960 started
00:14:21.340 thinking
00:14:22.080 and
00:14:22.280 worrying
00:14:22.500 about
00:14:22.920 and
00:14:23.800 these
00:14:24.040 are
00:14:24.180 risks
00:14:24.520 that
00:14:24.940 could
00:14:25.580 be
00:14:25.760 enormously
00:14:26.220 impactful
00:14:26.820 not
00:14:27.400 just
00:14:27.720 in
00:14:27.860 the
00:14:27.960 present
00:14:28.280 but
00:14:29.080 in
00:14:29.620 fact
00:14:29.840 for
00:14:29.980 the
00:14:30.080 whole
00:14:30.240 future
00:14:30.640 so
00:14:31.880 some
00:14:32.760 of
00:14:32.880 these
00:14:33.020 risks
00:14:33.400 are
00:14:33.660 risks
00:14:33.940 that
00:14:34.180 threaten
00:14:34.540 extinction
00:14:35.260 such
00:14:36.140 as
00:14:36.540 the
00:14:37.080 use
00:14:37.400 of
00:14:37.960 next
00:14:39.100 generation
00:14:39.340 of
00:14:39.540 bioweapons
00:14:40.180 in
00:14:40.940 World War
00:14:41.520 Three
00:14:41.760 with
00:14:42.720 viruses
00:14:43.340 of
00:14:43.660 unprecedented
00:14:44.200 lethality
00:14:44.840 and
00:14:44.980 infectiousness
00:14:46.280 that could
00:14:47.040 really result
00:14:47.620 in the
00:14:48.200 collapse
00:14:48.480 of
00:14:48.640 civilization
00:14:49.160 or
00:14:49.680 the
00:14:50.380 extinction
00:14:50.820 of
00:14:51.180 everyone
00:14:51.900 alive
00:14:52.180 today
00:14:52.460 in
00:14:52.660 the
00:14:52.760 worst
00:14:52.960 case
00:14:53.300 or
00:14:54.620 very
00:14:55.000 rapid
00:14:55.300 advances
00:14:55.740 in
00:14:56.020 artificial
00:14:56.380 intelligence
00:14:57.080 where
00:14:57.960 that
00:14:58.820 could
00:14:59.140 result
00:14:59.620 in
00:14:59.940 as
00:15:00.160 you've
00:15:00.480 argued
00:15:00.920 the
00:15:01.460 AIs
00:15:01.840 being
00:15:02.180 just
00:15:02.440 far
00:15:02.760 more
00:15:02.940 capable
00:15:03.260 than
00:15:03.500 human
00:15:03.760 beings
00:15:04.220 such
00:15:05.000 that
00:15:05.340 it
00:15:06.020 merely
00:15:06.320 the
00:15:06.740 AI
00:15:07.220 systems
00:15:07.660 merely
00:15:08.000 see
00:15:08.280 humans
00:15:08.600 as
00:15:09.100 an obstacle
00:15:09.620 to
00:15:09.840 their
00:15:10.000 goals
00:15:10.380 and
00:15:11.460 we
00:15:12.100 humans
00:15:13.940 are
00:15:14.140 disempowered
00:15:14.680 in light
00:15:15.040 of that
00:15:15.460 or
00:15:16.540 in fact
00:15:17.420 again
00:15:18.360 just
00:15:19.100 made
00:15:19.680 extinct
00:15:20.000 or
00:15:20.160 eliminated
00:15:20.600 by such
00:15:21.380 systems
00:15:21.840 or
00:15:23.260 existential
00:15:24.340 risks
00:15:24.920 can
00:15:25.300 lead
00:15:25.900 to
00:15:26.260 permanent
00:15:27.700 loss
00:15:29.260 of value
00:15:29.780 or future
00:15:30.240 value
00:15:30.700 not just
00:15:32.220 via
00:15:32.560 extinction
00:15:33.080 so a
00:15:33.720 perpetual
00:15:34.100 dystopia
00:15:34.840 that was
00:15:35.940 perhaps
00:15:36.940 perhaps
00:15:36.960 it's
00:15:37.300 a
00:15:37.480 little
00:15:37.620 better
00:15:37.940 than
00:15:38.240 extinction
00:15:38.960 but not
00:15:39.360 much
00:15:39.620 better
00:15:39.920 with
00:15:40.640 a
00:15:40.780 lock
00:15:41.020 in
00:15:41.220 of
00:15:41.460 fascist
00:15:42.040 values
00:15:42.500 that
00:15:43.080 would
00:15:43.220 count
00:15:43.420 as
00:15:43.540 an
00:15:43.660 existential
00:15:44.100 catastrophe
00:15:44.660 as
00:15:44.980 well
00:15:45.160 and
00:15:46.580 increasingly
00:15:47.280 I've
00:15:47.780 been
00:15:47.900 getting
00:15:48.100 to know
00:15:48.560 the
00:15:48.960 leading
00:15:49.860 scientists
00:15:50.440 who think
00:15:50.980 that
00:15:51.220 the
00:15:51.720 scale
00:15:52.060 of
00:15:52.240 such
00:15:52.440 existential
00:15:52.960 catastrophe
00:15:53.500 it's
00:15:53.760 actually
00:15:54.000 not
00:15:54.220 small
00:15:54.540 it's
00:15:55.360 not
00:15:55.600 like
00:15:55.920 in
00:15:56.660 any
00:15:56.920 way
00:15:57.080 like
00:15:57.320 this
00:15:57.620 one
00:15:58.200 in a
00:15:58.400 million
00:15:58.680 chance
00:15:59.520 of an
00:15:59.740 asteroid
00:16:00.120 strike
00:16:00.540 in the
00:16:00.840 next
00:16:01.060 century
00:16:01.380 but
00:16:02.220 much
00:16:02.500 more
00:16:02.800 like
00:16:03.060 the
00:16:03.200 risk
00:16:03.420 of
00:16:03.540 dying
00:16:03.740 in
00:16:03.880 a
00:16:03.980 car
00:16:04.160 crash
00:16:04.460 in
00:16:04.600 one's
00:16:04.860 lifetime
00:16:05.280 where
00:16:06.140 these
00:16:06.540 risks
00:16:06.940 are
00:16:07.420 certainly
00:16:08.280 in
00:16:08.640 the
00:16:08.780 multiple
00:16:09.040 percentage
00:16:09.500 points
00:16:09.940 maybe
00:16:10.240 even
00:16:10.500 tens
00:16:10.760 of
00:16:10.880 percentage
00:16:11.200 points
00:16:11.640 just
00:16:12.280 in
00:16:12.440 our
00:16:12.580 lifetime
00:16:12.920 alone
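
For a rough sense of the scales being contrasted here, the sketch below puts the two figures side by side: the one-in-a-million-per-century asteroid risk mentioned in the conversation versus an everyday, car-crash-scale lifetime risk. The ~1-in-100 lifetime figure is an illustrative assumption standing in for "multiple percentage points," not a number taken from the episode.

```python
# Order-of-magnitude comparison of the two risk levels contrasted above.
# The one-in-a-million asteroid figure comes from the conversation; the
# ~1-in-100 lifetime car-crash figure is an illustrative assumption for a
# "multiple percentage points" everyday risk, not a number from the episode.

asteroid_strike_per_century = 1 / 1_000_000    # ~1e-6
car_crash_scale_lifetime_risk = 1 / 100        # ~1e-2 (illustrative)

ratio = car_crash_scale_lifetime_risk / asteroid_strike_per_century
print(f"A car-crash-scale risk is roughly {ratio:,.0f}x "
      f"the one-in-a-million asteroid risk.")
# -> A car-crash-scale risk is roughly 10,000x the one-in-a-million asteroid risk.
```
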
00:16:13.280 so
00:16:15.460 in
00:16:16.180 the
00:16:16.320 definition
00:16:16.780 of
00:16:17.080 effective
00:16:17.480 altruism
00:16:18.260 I think
00:16:19.020 one
00:16:19.300 could
00:16:19.500 be
00:16:19.980 forgiven
00:16:21.120 for
00:16:21.860 wondering
00:16:22.440 what
00:16:23.260 is
00:16:23.480 novel
00:16:23.980 about
00:16:24.920 simply
00:16:25.880 wanting
00:16:26.400 to do
00:16:26.860 as
00:16:27.040 much
00:16:27.260 good
00:16:27.560 as
00:16:27.840 one
00:16:28.240 can
00:16:28.660 right
00:16:29.220 how
00:16:29.560 is
00:16:30.080 that
00:16:30.420 a
00:16:30.980 new
00:16:31.200 contribution
00:16:31.860 to
00:16:32.400 the
00:16:32.520 way
00:16:32.620 we
00:16:32.780 think
00:16:33.160 about
00:16:33.520 doing
00:16:33.980 good
00:16:34.200 in
00:16:34.300 the
00:16:34.400 world
00:16:34.640 hasn't
00:16:35.160 everyone
00:16:35.560 who
00:16:35.980 has
00:16:36.240 attempted
00:16:36.520 to
00:16:36.720 do
00:16:36.840 good
00:16:37.180 wanted
00:16:38.100 to
00:16:38.300 do
00:16:38.500 as
00:16:38.720 much
00:16:38.900 good
00:16:39.120 as
00:16:39.240 they
00:16:39.380 could
00:16:39.560 get
00:16:40.060 their
00:16:40.200 hands
00:16:40.440 around
00:16:40.700 what
00:16:41.340 do
00:16:41.460 you
00:16:41.580 view
00:16:41.840 as
00:16:42.040 being
00:16:42.320 novel
00:16:42.860 in
00:16:43.420 this
00:16:43.700 movement
00:16:44.020 around
00:16:44.300 that
00:16:44.520 so
00:16:45.600 I
00:16:46.580 am
00:16:47.120 certainly
00:16:47.420 not
00:16:47.620 claiming
00:16:47.940 that
00:16:48.160 effective
00:16:48.500 altruism
00:16:48.960 is
00:16:49.160 some
00:16:49.380 radically
00:16:49.720 new
00:16:50.000 idea
00:16:50.340 I
00:16:50.820 think
00:16:51.020 you
00:16:51.160 can
00:16:51.280 see
00:16:51.640 its
00:16:52.600 intellectual
00:16:53.240 roots
00:16:53.960 going
00:16:54.280 back
00:16:54.620 decades
00:16:55.160 I
00:16:55.440 actually
00:16:55.680 argue
00:18:01.200 consequentialist
00:18:01.820 moral
00:17:02.080 philosophers
00:17:02.540 the
00:17:02.860 Mohists
00:17:03.280 but
00:17:04.700 I
00:17:05.480 actually
00:17:05.840 think
00:17:06.200 that
00:17:06.620 this
00:17:07.680 mindset
00:17:08.160 of
00:17:08.460 trying
00:17:08.700 to
00:17:08.860 do
00:17:09.000 as
00:17:09.140 much
00:17:09.320 good
00:17:09.500 as
00:17:09.640 possible
00:17:10.020 is
00:17:10.860 at
00:17:11.140 least
00:17:11.380 somewhat
00:17:12.520 unusual
00:17:12.960 where
00:17:14.000 in
00:17:14.480 general
00:17:14.920 when
00:17:15.340 people
00:17:15.900 try
00:17:16.120 to
00:17:16.260 do
00:17:16.420 good
00:17:16.700 and
00:17:16.900 this
00:17:17.040 was
00:17:17.180 certainly
00:17:17.460 true
00:17:17.700 for
00:17:17.880 me
00:17:18.060 before
00:17:18.320 I
00:17:18.540 started
00:17:18.760 thinking
00:17:19.060 about
00:17:19.360 this
00:17:19.700 they
00:17:20.820 are
00:17:22.180 like
00:17:22.700 looking
00:17:23.040 at
00:17:23.240 what
00:17:23.640 causes
00:17:24.100 have
00:17:24.260 been
00:17:24.380 made
00:17:24.600 salient
00:17:25.000 to
00:17:25.180 them
00:17:25.380 where
00:17:25.860 in
00:17:26.040 my
00:17:26.260 case
00:17:26.560 that
00:17:26.780 was
00:17:27.140 extreme
00:17:27.900 poverty
00:17:28.300 I
00:17:29.120 remember
00:17:29.420 as a
00:17:29.720 teenager
00:17:30.060 reading
00:17:30.380 about
00:17:30.640 how
00:17:30.860 many
00:17:31.140 people
00:17:31.580 died
00:17:31.840 of
00:17:31.980 AIDS
00:17:32.220 each
00:17:32.460 year
00:17:32.700 and
00:17:33.020 just
00:17:33.180 thinking
00:17:33.500 oh my
00:17:33.880 god
00:17:34.100 this
00:17:34.300 is
00:17:34.440 an
00:17:34.580 atrocity
00:17:35.160 how
00:17:35.820 can
00:17:36.020 we
00:17:36.120 not
00:17:36.260 be
00:17:36.400 taking
00:17:36.640 more
00:17:36.860 action
00:17:37.080 on
00:17:37.240 this
00:17:37.540 but
00:17:38.520 what
00:17:38.700 I
00:17:38.840 wasn't
00:17:39.160 doing
00:17:39.440 at
00:17:39.560 the
00:17:39.680 time
00:17:39.980 was
00:17:40.240 sitting
00:17:41.000 back
00:17:41.320 kind
00:17:41.580 of
00:17:41.660 taking
00:17:42.000 a
00:17:42.700 step
00:17:42.860 back
00:17:43.160 and
00:17:43.320 thinking
00:17:43.660 of
00:17:44.400 all
00:17:44.800 the
00:17:45.000 things
00:17:45.380 all
00:17:45.720 the
00:17:45.880 many
00:17:46.200 many
00:17:46.480 problems
00:17:46.840 in
00:17:47.080 the
00:17:47.180 world
00:17:47.480 what
00:17:48.460 are
00:17:48.580 the
00:17:48.700 things
00:17:49.180 what's
00:17:49.580 the
00:17:49.700 thing
00:17:49.860 I
00:17:50.140 should
00:17:50.340 be
00:17:50.460 most
00:17:50.700 focusing
00:17:51.060 on
00:17:51.320 if
00:17:51.920 I
00:17:52.060 want
00:17:52.240 to
00:17:52.360 have
00:17:52.620 the
00:17:52.940 biggest
00:17:53.140 impact
00:17:53.560 I
00:17:53.740 can
00:17:54.000 what's
00:17:54.720 the
00:17:54.820 best
00:17:55.100 evidence
00:17:55.680 saying
00:17:57.960 about
00:17:58.200 this
00:17:58.460 and
00:18:00.760 really
00:18:01.080 thinking
00:18:01.420 that
00:18:01.640 through
00:18:01.920 and
00:18:02.940 so
00:18:03.300 I
00:18:03.620 actually
00:18:03.960 think
00:18:04.200 that
00:18:04.400 mode
00:18:04.620 of
00:18:04.740 reasoning
00:18:05.000 is
00:18:05.200 somewhat
00:18:05.480 unusual
00:18:05.880 and
00:18:06.200 that's
00:18:06.440 what's
00:18:06.720 distinctive
00:18:07.180 about
00:18:07.940 effective
00:18:08.280 altruism
00:18:08.640 yeah
00:18:09.540 so let's
00:18:09.860 deal
00:18:10.020 with
00:18:10.160 some
00:18:10.320 of
00:18:10.420 this
00:18:10.640 pushback
00:18:11.520 before
00:18:12.140 moving
00:18:12.920 on
00:18:13.180 there
00:18:13.300 was
00:18:13.440 a
00:18:13.680 I'm
00:18:14.620 sure
00:18:14.760 there's
00:18:14.940 been
00:18:15.140 much
00:18:15.520 more
00:18:15.700 that
00:18:15.840 I
00:18:15.940 haven't
00:18:16.160 noticed
00:18:16.480 but
00:18:16.700 in
00:18:16.800 the
00:18:16.900 last
00:18:17.080 few
00:18:17.240 days
00:18:17.460 I
00:18:17.640 noticed
00:18:18.160 a
00:18:19.000 Wall
00:18:19.300 Street
00:18:19.500 Journal
00:18:19.800 article
00:18:20.220 that
00:18:21.520 really
00:18:22.720 did not
00:18:23.040 have much
00:18:23.480 substance
00:18:24.160 to it
00:18:24.520 it
00:18:24.700 seemed
00:18:24.920 to have
00:18:25.100 been
00:18:25.200 written
00:18:25.340 by
00:18:25.520 someone
00:18:25.740 who
00:18:25.920 just
00:18:26.220 doesn't
00:18:26.700 like
00:18:26.900 paying
00:18:27.160 taxes
00:18:27.620 and
00:18:28.340 didn't
00:18:28.980 seem
00:18:29.160 to like
00:18:29.520 Sam
00:18:29.820 Bankman
00:18:30.740 Fried's
00:18:30.740 politics
00:18:31.420 I
00:18:32.200 didn't
00:18:32.380 really
00:18:32.580 detect
00:18:33.120 any
00:18:33.780 deep
00:18:34.660 arguments
00:18:35.080 against
00:18:35.620 effective
00:18:36.620 altruism
00:18:37.120 there
00:18:37.420 I
00:18:37.660 guess
00:18:37.820 his
00:18:37.980 one
00:18:38.420 salient
00:18:38.780 point
00:18:39.160 was
00:18:39.420 that
00:18:39.700 creating
00:18:40.920 economic
00:18:41.620 value
00:18:42.360 you know
00:18:43.160 i.e.
00:18:43.480 doing
00:18:43.760 business
00:18:44.660 is
00:18:46.160 way to
00:18:46.500 help
00:18:46.700 people
00:18:47.040 you know
00:18:47.940 at
00:18:48.140 scale
00:18:48.520 and
00:18:48.780 reduce
00:18:49.700 unnecessary
00:18:50.440 suffering
00:18:50.940 and
00:18:51.540 i guess
00:18:51.740 i would
00:18:52.000 i think
00:18:52.720 we could
00:18:53.200 easily
00:18:53.680 concede
00:18:54.120 that
00:18:54.340 insofar
00:18:54.880 as
00:18:55.080 that's
00:18:55.380 true
00:18:55.820 you know
00:18:57.200 and
00:18:57.320 wherever
00:18:57.780 it
00:18:58.040 happens
00:18:58.360 to be
00:18:58.680 true
00:18:59.000 it
00:18:59.620 would
00:18:59.740 be
00:18:59.880 very
00:19:00.180 easy
00:19:00.540 for us
00:19:00.900 to
00:19:01.020 say
00:19:01.220 okay
00:19:01.460 well
00:19:01.640 then
00:19:01.900 sure
00:19:02.660 let
00:19:02.940 business
00:19:03.300 solve
00:19:03.640 our
00:19:03.820 problems
00:19:04.260 and
00:19:04.960 so
00:19:05.120 there's
00:19:05.460 you know
00:19:06.000 and i
00:19:06.460 don't
00:19:06.580 think
00:19:06.760 you
00:19:06.880 would
00:19:07.040 resist
00:19:07.400 that
00:19:07.660 conclusion
00:19:08.140 wherever
00:19:08.820 it's
00:19:09.280 true
00:19:09.600 i mean
00:19:09.800 i
00:19:09.880 think
00:19:10.020 one
00:19:10.740 could
00:19:10.920 certainly
00:19:11.140 argue
00:19:11.480 that
00:19:11.780 someone
00:19:12.640 like
00:19:12.860 elon
00:19:13.240 musk
00:19:13.740 by
00:19:14.180 you know
00:19:14.800 you know
00:19:15.060 building
00:19:15.420 desirable
00:19:16.020 electric
00:19:16.420 cars
00:19:16.840 has
00:19:17.280 done
00:19:17.780 if
00:19:18.560 not
00:19:18.760 more
00:19:19.080 good
00:19:19.520 a
00:19:20.080 different
00:19:20.400 species
00:19:20.860 of
00:19:21.140 good
00:19:21.500 than
00:19:22.480 many
00:19:23.180 forms
00:19:24.140 of
00:19:24.300 environmental
00:19:24.820 activism
00:19:25.400 have
00:19:26.100 to
00:19:26.380 move
00:19:27.120 us
00:19:27.420 away
00:19:28.180 from
00:19:28.460 a
00:19:28.720 climate
00:19:29.380 change
00:19:29.720 catastrophe
00:19:30.260 so
00:19:31.340 you know
00:19:31.720 if
00:19:31.860 that's
00:19:32.320 true
00:19:32.640 let's
00:19:33.480 bracket
00:19:33.900 that
00:19:34.220 and
00:19:34.640 it's
00:19:35.040 hard
00:19:35.200 to know
00:19:35.440 what
00:19:35.720 exactly
00:19:36.080 is
00:19:36.220 true
00:19:36.380 there
00:19:36.560 but
00:19:36.740 i
00:19:36.940 certainly
00:19:37.160 have
00:19:37.300 an
00:19:37.400 intuition
00:19:37.780 that
00:19:38.200 there's
00:19:38.980 a
00:19:39.080 place
00:19:39.480 for
00:19:39.940 business
00:19:41.040 and
00:19:41.500 even
00:19:41.680 a
00:19:41.960 majority
00:19:42.960 place
00:19:43.500 for
00:19:43.680 business
00:19:44.200 to
00:19:45.240 affect
00:19:46.440 our
00:19:46.880 climate
00:19:47.280 outcome
00:19:47.840 and that
00:19:48.700 it might
00:19:49.020 be
00:19:49.240 greater
00:19:49.680 than
00:19:50.080 charity
00:19:50.800 he
00:19:51.540 seemed
00:19:51.760 to
00:19:51.880 think
00:19:52.020 that
00:19:52.180 was
00:19:52.320 a
00:19:52.720 real
00:19:53.060 you know
00:19:53.600 some
00:19:53.780 kind
00:19:53.940 of
00:19:54.140 coup
00:19:54.340 de
00:19:54.680 grâce
00:19:54.680 against
00:19:55.240 effective
00:19:56.000 altruism
00:19:56.520 and
00:19:56.680 even
00:19:56.900 philanthropy
00:19:57.580 in
00:19:58.100 general
00:19:58.460 but
00:19:59.320 i
00:19:59.920 think
00:20:00.060 that
00:20:00.280 blow
00:20:00.860 just
00:20:01.120 passes
00:20:01.480 right
00:20:01.880 by
00:20:02.120 us
00:20:02.320 doesn't
00:20:02.520 it
00:20:02.760 yeah
00:20:03.640 so
00:20:04.040 the
00:20:04.860 thing
00:20:05.060 that
00:20:05.200 i'll
00:20:05.360 say
00:20:05.660 that's
00:20:06.280 the
00:20:06.700 grain
00:20:07.640 of
00:20:07.800 truth
00:20:08.060 and
00:20:08.220 that
00:20:08.360 criticism
00:20:08.760 is
00:20:09.440 that
00:20:09.760 under
00:20:10.200 some
00:20:10.580 in
00:20:11.500 some
00:20:11.760 circumstances
00:20:12.400 markets
00:20:13.660 can be
00:20:14.480 extremely
00:20:15.000 good
00:20:15.300 ways
00:20:15.740 of
00:20:16.160 allocating
00:20:16.880 resources
00:20:17.860 so
00:20:18.920 take
00:20:20.060 the
00:20:20.220 problem
00:20:20.500 of
00:20:21.120 people
00:20:21.500 having
00:20:21.860 access
00:20:22.280 to
00:20:22.860 the
00:20:23.540 internet
00:20:23.920 and
00:20:24.540 to
00:20:24.980 computing
00:20:25.660 technology
00:20:26.380 in the
00:20:26.860 United
00:20:27.140 States
00:20:27.700 at least
00:20:28.500 for
00:20:28.720 people
00:20:28.940 who
00:20:29.080 are
00:20:29.200 middle
00:20:29.360 class
00:20:29.700 or
00:20:29.860 above
00:20:30.220 then
00:20:31.140 you
00:20:32.080 know
00:20:32.200 businesses
00:20:32.640 are
00:20:33.000 dealing
00:20:33.400 with
00:20:33.580 that
00:20:33.760 like
00:20:34.000 at
00:20:34.120 least
00:20:34.280 relatively
00:20:34.720 well
00:20:35.040 there's
00:20:35.280 great
00:20:35.480 incentives
00:20:35.880 to
00:20:36.760 give
00:20:37.580 this
00:20:37.740 product
00:20:38.060 that
00:20:38.220 are
00:20:38.320 benefiting
00:20:38.660 people
00:20:39.020 people
00:20:39.300 are
00:20:39.420 willing
00:20:39.560 to
00:20:39.700 pay
00:20:39.840 for
00:20:40.020 it
00:20:40.260 and
00:20:40.820 there
00:20:40.940 aren't
00:20:41.160 major
00:20:41.580 externalities
00:20:42.400 however
00:20:43.700 as has
00:20:44.100 been
00:20:44.220 well
00:20:44.460 known
00:20:44.700 in
00:20:44.880 economics
00:20:45.280 for
00:20:45.800 I
00:20:46.520 think
00:20:46.660 100
00:20:46.900 years
00:20:47.260 now
00:20:47.560 there
00:20:48.200 are
00:20:48.280 very
00:20:48.480 many
00:20:48.720 cases
00:20:49.060 where
00:20:49.280 markets
00:20:49.780 don't
00:20:50.420 produce
00:20:50.800 the
00:20:51.280 ideal
00:20:51.680 social
00:20:52.240 outcome
00:20:52.620 which
00:20:57.700 absolute
00:20:59.660 paradigm
00:21:00.120 of this
00:21:00.560 example
00:21:00.920 where
00:21:02.320 if
00:21:02.900 someone
00:21:03.220 mines
00:21:03.820 coal
00:21:04.120 and
00:21:04.300 burns
00:21:04.540 it
00:21:04.780 that
00:21:05.440 can
00:21:05.600 be
00:21:05.720 good
00:21:05.860 for
00:21:06.000 them
00:21:06.180 because
00:21:06.760 they're
00:21:06.860 creating
00:21:07.180 energy
00:21:07.540 but
00:21:08.300 it
00:21:08.540 has
00:21:08.740 this
00:21:08.940 negative
00:21:09.280 effect
00:21:09.680 on
00:21:10.080 third
00:21:10.380 parties
00:21:10.800 that
00:21:11.420 were
00:21:11.560 not
00:21:11.880 privy
00:21:12.600 to
00:21:12.760 that
00:21:12.900 decision
00:21:13.280 they
00:21:14.040 had
00:21:14.220 no
00:21:14.380 influence
00:21:14.780 over
00:21:15.000 it
00:21:15.180 because
00:21:15.680 it
00:21:15.840 warms
00:21:16.040 the
00:21:16.160 climate
00:21:16.400 and
00:21:16.680 it
00:21:16.780 makes
00:21:16.980 those
00:21:17.200 third
00:21:17.440 parties
00:21:17.740 worse
00:21:18.280 off
00:21:18.540 and
00:21:19.340 so
00:21:19.500 the
00:21:19.660 standard
00:21:19.920 solution
00:21:20.340 there
00:21:20.640 and
00:21:21.580 this
00:21:21.980 is
00:21:22.140 endorsed
00:21:22.620 by
00:21:23.100 economists
00:21:24.660 on
00:21:25.120 the
00:21:25.240 right
00:21:25.460 as
00:21:25.620 well
00:21:25.760 as
00:21:25.880 the
00:21:26.000 left
00:21:26.340 it's
00:21:26.720 just
00:21:26.860 uncontroversial
00:21:27.660 is that
00:21:28.400 you want
00:21:28.700 some amount
00:21:29.140 of government
00:21:29.540 involvement
00:21:30.000 there
00:21:30.360 at least
00:21:31.260 in the
00:21:31.900 form of
00:21:32.300 taxation
00:21:32.860 so
00:21:33.540 you know
00:21:33.860 putting a
00:21:34.220 tax on
00:21:34.620 carbon
00:21:34.900 there's
00:21:35.280 other
00:21:35.420 things
00:21:35.620 you can
00:21:35.820 do
00:21:35.960 as
00:21:36.100 well
00:21:36.260 but
00:21:36.480 that
00:21:36.760 would
00:21:36.840 be
00:21:36.940 one
00:21:37.100 way
00:21:37.340 in
00:21:38.280 order
00:21:38.520 to
00:21:38.780 guide
00:21:39.260 market
00:21:39.940 behavior
00:21:40.360 to
00:21:40.660 better
00:21:41.240 outcomes
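
Since burning coal is offered as the canonical externality, here is a toy sketch of the logic: the burner weighs only private benefits and private costs, so an uncorrected external cost gets ignored, while a tax set near that external cost changes the private calculation. The dollar figures and the function are made up purely for illustration, not taken from the episode.

```python
# Toy illustration of the externality argument above: a coal burner's private
# benefit vs. the cost imposed on third parties, and how a Pigouvian-style tax
# changes the decision. All numbers are made up for illustration.

private_benefit_per_ton = 50.0   # value of the energy to the burner ($/ton, illustrative)
external_cost_per_ton = 80.0     # climate damage borne by third parties ($/ton, illustrative)

def burns_coal(tax_per_ton: float) -> bool:
    """The burner acts only on private benefit minus private costs (here, the tax)."""
    return private_benefit_per_ton - tax_per_ton > 0

# With no tax the burner ignores the $80 external cost and burns anyway,
# even though society as a whole loses $30 per ton on net.
print(burns_coal(tax_per_ton=0.0))                    # True
# A tax set at the external cost makes the burner internalize that damage.
print(burns_coal(tax_per_ton=external_cost_per_ton))  # False
```
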
00:21:42.240 however
00:21:42.960 I would
00:21:43.340 go
00:21:43.480 further
00:21:43.780 than
00:21:44.000 this
00:21:44.220 as
00:21:44.400 well
00:21:44.620 and
00:21:44.920 say
00:21:45.320 you know
00:21:46.380 there
00:21:46.640 are
00:21:46.800 not
00:21:47.000 only
00:21:47.320 cases
00:21:47.720 of
00:21:48.020 market
00:21:48.320 failure
00:21:48.740 there
00:21:49.240 are
00:21:49.360 also
00:21:49.580 cases
00:21:55.460 living
00:21:55.760 in
00:21:56.380 a
00:21:57.160 country
00:21:57.440 with
00:21:57.680 poor
00:21:57.900 political
00:21:58.820 institutions
00:21:59.560 perhaps
00:22:00.120 even
00:22:00.340 a
00:22:00.460 dictatorship
00:22:00.860 they
00:22:02.100 do
00:22:02.360 not
00:22:02.560 have
00:22:02.900 democratic
00:22:03.500 say
00:22:03.920 over
00:22:04.380 the
00:22:04.540 policies
00:22:05.080 that
00:22:05.640 rich
00:22:05.920 countries
00:22:06.300 like
00:22:06.540 the
00:22:06.700 UK
00:22:06.940 and
00:22:07.200 the
00:22:07.300 US
00:22:07.580 are
00:22:07.740 enacting
00:22:08.180 and
00:22:08.760 so
00:22:08.900 they
00:22:09.140 will
00:22:09.340 not
00:22:09.560 be
00:22:09.780 given
00:22:10.020 adequate
00:22:10.440 representation
00:22:11.200 neither
00:22:12.220 in
00:22:12.440 the
00:22:12.580 market
00:22:12.880 nor
00:22:13.880 in
00:22:14.360 political
00:22:14.960 decision
00:22:15.360 making
00:22:15.680 or
00:22:16.500 if
00:22:16.640 we
00:22:16.720 look
00:22:16.860 at
00:22:16.960 non-human
00:22:17.340 animals
00:22:17.720 they
00:22:18.060 have
00:22:18.240 no
00:22:18.440 say
00:22:18.640 at
00:22:18.800 all
00:22:19.040 or
00:22:19.880 if
00:22:20.000 we
00:22:20.080 look
00:22:20.220 at
00:22:20.320 future
00:22:20.540 people
00:22:20.880 those
00:22:21.380 who
00:22:21.580 will
00:22:25.460 they
00:22:26.140 have
00:22:26.820 no
00:22:27.120 say
00:22:27.440 over
00:22:27.800 they
00:22:28.380 have
00:22:28.540 no
00:22:28.700 participation
00:22:29.080 in
00:22:29.400 the
00:22:29.520 market
00:22:29.780 so
00:22:30.180 they
00:22:30.260 have
00:22:30.380 no
00:22:30.500 influence
00:22:30.860 there
00:22:31.160 they
00:22:31.780 have
00:22:31.980 no
00:22:32.140 say
00:22:32.340 in
00:22:32.540 political
00:22:33.360 decision
00:22:33.740 making
00:22:34.000 either
00:22:34.340 and so
00:22:35.080 it's
00:22:35.400 no
00:22:35.560 surprise
00:22:35.960 I
00:22:36.160 think
00:22:36.360 that
00:22:36.540 their
00:22:36.780 interests
00:22:37.280 are
00:22:37.860 systematically
00:22:38.320 neglected
00:22:38.780 and we
00:22:39.460 have
00:22:39.600 no
00:22:39.800 guarantee
00:22:40.160 at all
00:22:40.620 that
00:22:40.820 business
00:22:41.800 as usual
00:22:42.520 for the
00:22:42.900 world
00:22:43.240 would
00:22:43.960 lead to
00:22:44.400 the best
00:22:44.720 outcomes
00:22:45.080 possible
00:22:45.500 for
00:22:45.700 them
00:22:45.920 yeah
00:22:46.680 yeah
00:22:47.640 okay
00:22:48.400 so then
00:22:48.740 there was
00:22:49.140 this
00:22:49.460 article
00:22:50.180 I
00:22:50.380 noticed
00:22:50.800 in
00:22:51.860 this
00:22:52.200 online
00:22:52.780 journal
00:22:53.220 maybe
00:22:53.800 it's
00:22:54.000 also
00:22:54.180 a
00:22:54.360 print
00:22:54.600 journal
00:22:55.080 called
00:22:55.900 current
00:22:56.240 affairs
00:22:56.660 which
00:22:57.220 I
00:22:57.900 hesitate
00:22:58.380 to
00:22:58.620 mention
00:22:58.840 because
00:22:59.120 the
00:22:59.720 editor
00:23:00.180 over there
00:23:00.960 strikes me
00:23:01.500 as someone
00:23:01.840 who is
00:23:02.280 truly
00:23:03.280 mentally
00:23:03.780 unwell
00:23:04.580 given some
00:23:05.220 of the
00:23:05.400 insane
00:23:05.660 things
00:23:05.960 he's
00:23:06.140 written
00:23:06.320 about
00:23:06.560 me
00:23:06.880 and
00:23:07.200 the
00:23:07.620 length
00:23:07.920 at which
00:23:08.240 he's
00:23:08.480 written
00:23:09.020 them
00:23:09.220 but
00:23:09.820 this
00:23:10.380 article
00:23:10.800 not
00:23:11.680 written
00:23:11.900 by him
00:23:12.340 written
00:23:12.700 by somebody
00:23:13.060 else
00:23:13.380 raised
00:23:14.820 some
00:23:15.200 points
00:23:16.220 that I
00:23:16.480 think
00:23:16.700 are
00:23:16.860 valid
00:23:17.520 as well
00:23:18.340 as many
00:23:18.740 that
00:23:19.020 weren't
00:23:19.560 and
00:23:20.520 the
00:23:20.840 valid
00:23:21.120 ones
00:23:21.400 are around
00:23:21.780 the
00:23:21.980 issue
00:23:22.440 of
00:23:23.000 how
00:23:23.800 we
00:23:23.960 think
00:23:24.140 about
00:23:24.340 expected
00:23:24.860 value
00:23:25.520 and
00:23:26.720 kind
00:23:27.660 of
00:23:27.740 a
00:23:27.840 probability
00:23:28.320 calculus
00:23:28.960 over
00:23:29.600 various
00:23:30.540 possibilities
00:23:31.220 in the
00:23:31.880 future
00:23:32.220 and this
00:23:32.600 is really
00:23:33.080 integral
00:23:33.480 to
00:23:34.360 your
00:23:34.880 new
00:23:35.100 book
00:23:35.420 and how
00:23:35.700 we
00:23:35.860 think
00:23:36.080 about
00:23:36.440 safeguarding
00:23:37.320 the
00:23:37.440 future
00:23:37.740 so I
00:23:38.640 think
00:23:38.800 before we
00:23:40.840 dive into
00:23:41.400 your
00:23:41.680 thesis
00:23:42.360 proper
00:23:42.900 let's
00:23:43.980 talk
00:23:44.300 about
00:23:44.800 the
00:23:45.760 concept
00:23:46.540 of
00:23:47.380 prioritizing
00:23:49.000 to
00:23:49.260 whatever
00:23:49.540 degree
00:23:50.140 the
00:23:51.100 well-being
00:23:51.840 of
00:23:52.340 people
00:23:52.880 who
00:23:53.360 don't
00:23:54.380 yet
00:23:54.580 exist
00:23:55.140 when
00:23:55.880 people
00:23:56.100 say
00:23:56.320 well
00:23:56.540 we
00:23:56.980 should
00:23:57.540 prioritize
00:23:58.060 the
00:23:58.380 future
00:23:58.760 to
00:23:59.480 some
00:23:59.680 significant
00:24:00.060 degree
00:24:00.480 it's
00:24:01.140 very
00:24:01.300 easy
00:24:01.480 for them
00:24:01.760 to
00:24:01.980 do
00:24:02.300 this
00:24:02.540 by
00:24:02.980 reference
00:24:03.460 to
00:24:03.660 their
00:24:03.820 children
00:24:04.400 you know
00:24:05.300 young
00:24:05.840 people
00:24:06.300 who
00:24:06.600 currently
00:24:07.480 exist
00:24:08.140 and have
00:24:08.620 no
00:24:08.780 control
00:24:09.080 over
00:24:09.360 anything
00:24:09.700 that's
00:24:10.440 easy
00:24:10.600 to
00:24:10.740 see
00:24:10.880 but
00:24:11.040 when
00:24:11.180 you're
00:24:11.320 talking
00:24:11.600 about
00:24:12.080 the
00:24:13.120 hypothetical
00:24:13.840 billions
00:24:14.560 and
00:24:14.940 trillions
00:24:15.420 of
00:24:15.960 a
00:24:16.460 future
00:24:16.760 humanity
00:24:17.340 that
00:24:18.380 may
00:24:19.100 thrive
00:24:19.820 or
00:24:20.320 not
00:24:21.320 even
00:24:21.560 exist
00:24:22.140 depending
00:24:22.700 on
00:24:22.960 how
00:24:23.180 we
00:24:23.360 play
00:24:23.580 our
00:24:23.780 cards
00:24:24.140 at
00:24:24.280 this
00:24:24.440 moment
00:24:24.720 I
00:24:25.440 think
00:24:25.600 people's
00:24:26.020 moral
00:24:26.280 intuitions
00:24:26.800 begin
00:24:27.100 to
00:24:27.400 fall
00:24:28.060 apart
00:24:28.460 and
00:24:28.840 the
00:24:29.080 most
00:24:29.700 extreme
00:24:30.420 case
00:24:31.360 of
00:24:31.640 the
00:24:32.000 problem
00:24:32.380 here
00:24:32.660 is
00:24:32.880 if
00:24:33.620 you
00:24:33.740 try
00:24:34.120 to
00:24:34.600 run
00:24:35.840 any
00:24:36.040 kind
00:24:36.240 of
00:24:36.320 utility
00:24:36.680 function
00:24:37.280 based
00:24:38.020 on
00:24:38.220 a
00:24:38.400 straightforward
00:24:38.960 calculation
00:24:40.200 of
00:24:40.420 probabilities
00:24:41.060 if
00:24:41.360 you
00:24:41.460 say
00:24:41.700 well
00:24:42.060 if
00:24:42.660 I
00:24:42.780 could
00:24:42.920 just
00:24:43.120 reduce
00:24:43.760 the
00:24:44.320 risk
00:24:44.860 that
00:24:45.300 we
00:24:45.440 will
00:24:45.600 be
00:24:45.740 wiped
00:24:46.000 out
00:24:46.240 by
00:24:46.420 some
00:24:46.680 catastrophic
00:24:47.400 event
00:24:48.460 let's
00:24:48.820 say
00:24:49.000 a
00:24:49.260 global
00:24:49.800 pandemic
00:24:50.300 if
00:24:50.680 I
00:24:50.760 could
00:24:50.880 reduce
00:24:51.300 that
00:24:51.600 by
00:24:51.920 even
00:24:52.360 one
00:24:53.300 in
00:24:53.460 a
00:24:53.600 million
00:24:53.940 given
00:24:54.800 the
00:24:55.280 fact
00:24:55.700 that
00:24:56.220 you
00:24:56.620 have
00:24:56.960 trillions
00:24:57.840 upon
00:24:58.180 trillions
00:24:58.720 of
00:24:59.120 potential
00:24:59.700 lives
00:25:00.300 on
00:25:01.000 that
00:25:01.180 side
00:25:01.440 of
00:25:01.540 the
00:25:01.640 balance
00:25:02.080 well
00:25:02.700 then
00:25:02.860 that's
00:25:03.060 more
00:25:03.220 important
00:25:03.560 than
00:25:03.780 anything
00:25:04.260 I
00:25:04.980 could
00:25:05.120 do
00:25:05.300 now
00:25:05.580 including
00:25:05.920 saving
00:25:06.320 the
00:25:06.520 lives
00:25:06.860 of
00:25:07.140 identifiable
00:25:08.140 people
00:25:08.580 even
00:25:08.980 thousands
00:25:09.680 of
00:25:09.960 identifiable
00:25:10.460 people
00:25:10.820 and
00:25:11.440 that's
00:25:12.040 where
00:25:12.140 people
00:25:12.400 feel
00:25:12.640 like
00:25:12.800 something
00:25:13.080 has
00:25:13.260 simply
00:25:13.540 gone
00:25:13.800 wrong
00:25:14.280 here
00:25:14.660 in
00:25:14.860 how
00:25:15.400 we're
00:25:15.780 thinking
00:25:16.060 about
00:25:16.400 human
00:25:17.260 well-being
00:25:17.780 so
00:25:18.080 perhaps
00:25:18.460 you
00:25:18.600 can
00:25:18.720 address
00:25:19.000 that
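
To make the arithmetic behind this worry concrete, here is a minimal sketch of the naive expected-value comparison being described. Every number is an illustrative assumption (a one-in-a-million risk reduction, a deliberately modest one trillion potential future lives, a thousand identifiable lives saved today), not a figure endorsed by either speaker.

```python
# A minimal sketch of the naive expected-value comparison described above.
# Every number here is an illustrative assumption, not a figure from the episode.

risk_reduction = 1e-6             # shave one-in-a-million off extinction risk
potential_future_lives = 1e12     # "trillions upon trillions" -- use just one trillion
identifiable_lives_saved = 1_000  # thousands of identifiable people saved today

expected_future_lives = risk_reduction * potential_future_lives
print(f"Expected future lives from the tiny risk reduction: {expected_future_lives:,.0f}")
print(f"Identifiable lives saved today:                     {identifiable_lives_saved:,}")
# -> 1,000,000 vs. 1,000: a straightforward utility calculation always favors the
#    speculative bet, which is exactly the result many people take as a sign that
#    something in the reasoning has gone wrong.
```
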
00:25:19.260 sure
00:25:19.860 I'm
00:25:20.060 so glad
00:25:20.680 you
00:25:20.880 brought
00:25:21.300 it up
00:25:21.700 because
00:25:22.260 this
00:25:23.300 is a
00:25:23.560 framing
00:25:23.840 that
00:25:24.160 often
00:25:24.600 gets
00:25:24.820 mentioned
00:25:25.200 and
00:25:26.160 I
00:25:27.180 think
00:25:27.320 it's
00:25:27.460 really
00:25:27.660 unfortunate
00:25:28.140 and
00:25:28.720 I
00:25:28.820 want
00:25:28.940 to
00:25:29.040 push
00:25:29.220 against
00:25:29.540 it
00:25:29.780 in
00:25:30.300 the
00:25:30.580 strongest
00:25:30.960 possible
00:25:31.420 terms
00:25:31.880 where
00:25:32.940 if
00:25:34.140 it
00:25:34.900 were
00:25:35.120 the
00:25:35.300 case
00:25:35.660 that
00:25:36.040 the
00:25:36.180 argument
00:25:36.580 for
00:25:36.860 concern
00:25:37.460 and
00:25:37.700 serious
00:25:38.120 concern
00:25:38.680 about
00:25:39.300 how
00:25:39.540 we might
00:25:39.920 impact
00:25:42.400 on
00:25:42.600 these
00:25:42.840 vanishingly
00:25:43.520 small
00:25:43.840 probabilities
00:25:44.460 of
00:25:44.900 enormous
00:25:45.340 amounts
00:25:45.760 of
00:25:45.920 value
00:25:46.260 then
00:25:47.380 I
00:25:47.600 wouldn't
00:25:47.800 be
00:25:47.920 writing
00:25:48.160 this
00:25:48.380 book
00:25:48.640 I
00:25:49.020 would
00:25:49.180 be
00:25:49.300 doing
00:25:49.500 other
00:25:49.720 things
00:25:50.080 and
00:25:50.800 I
00:25:51.040 don't
00:25:51.280 think
00:25:51.500 it
00:25:51.620 does
00:25:51.940 and
00:25:52.720 the
00:25:53.060 reason
00:25:53.300 this
00:25:53.580 framing
00:25:53.880 came
00:25:54.180 up
00:25:54.400 is
00:25:54.560 because
00:25:54.880 in
00:25:55.300 academic
00:25:56.260 articles
00:25:56.920 Nick
00:25:57.740 Bostrom
00:25:58.280 who I
00:25:59.020 believe
00:25:59.200 you've
00:25:59.380 had
00:25:59.520 on
00:25:59.820 he
00:26:01.040 sketched
00:26:01.580 out
00:26:01.740 what
00:26:02.120 are
00:26:02.260 the
00:26:02.380 implications
00:26:02.960 of
00:26:03.840 a
00:26:04.360 very
00:26:04.600 particular
00:26:05.200 moral
00:26:05.800 view
00:26:06.120 expectational
00:26:07.880 total
00:26:08.620 utilitarianism
00:26:09.460 and he
00:26:10.480 said
00:26:10.760 well if
00:26:11.140 you're
00:26:11.300 taking
00:26:12.400 that seriously
00:26:12.520 then
00:26:13.580 that's
00:26:14.500 the
00:26:14.620 conclusion
00:26:14.980 you
00:26:15.200 get
00:26:15.440 what
00:26:16.220 he
00:26:16.380 wasn't
00:26:16.680 saying
00:26:16.960 is
00:26:17.100 that
00:26:17.300 we
00:26:17.540 should
00:26:17.720 endorse
00:26:18.020 that
00:26:18.260 conclusion
00:26:18.700 in
00:26:19.440 fact
00:26:19.740 he
00:26:20.100 and
00:26:21.180 essentially
00:26:22.440 everyone
00:26:22.840 I
00:26:23.020 know
00:26:23.340 at least
00:26:24.180 in
00:26:24.380 practice
00:26:24.820 outside
00:26:25.240 of
00:26:25.420 the
00:26:25.560 ivory
00:26:26.920 tower
00:26:27.380 take
00:26:28.000 that
00:26:28.180 as
00:26:28.300 a
00:26:28.400 reductio
00:26:28.920 rather
00:26:29.480 than
00:26:29.920 an
00:26:30.460 implication
00:26:30.920 so
00:26:31.800 Nick
00:26:32.000 himself
00:26:32.600 is
00:26:32.780 not
00:26:32.940 consequentialist
00:26:33.740 even
00:26:33.980 and
00:26:34.680 wrote
00:26:34.840 other
00:26:35.120 papers
00:26:35.560 on
00:26:36.040 this
00:26:36.680 idea
00:26:36.880 of
00:26:37.100 Pascal's
00:26:37.700 mugging
00:26:38.040 so
00:26:38.260 it's
00:26:38.400 similar
00:26:38.620 like
00:26:38.860 Pascal's
00:26:39.360 wager
00:26:39.620 but
00:26:39.860 just
00:26:40.000 without
00:26:40.240 infinite
00:26:42.400 these
00:26:42.660 tiny
00:26:43.080 probabilities
00:26:43.660 of
00:26:43.920 huge
00:26:44.200 amounts
00:26:44.520 of
00:26:44.640 value
00:26:44.960 look
00:26:45.260 it
00:26:45.420 just
00:26:45.680 it
00:26:46.560 doesn't
00:26:46.940 seem
00:26:47.240 like
00:26:47.560 you
00:26:47.720 just
00:26:47.920 do
00:26:48.120 the
00:26:48.280 math
00:26:48.520 and
00:26:48.680 multiply
00:26:49.060 them
00:26:49.400 can
00:26:49.880 you
00:26:49.960 actually
00:26:50.160 describe
00:26:50.680 the
00:26:50.880 Pascal's
00:26:51.460 mugging
00:26:51.840 example
00:26:52.440 Sure, I'm happy to.
00:26:54.220 And so the idea is just, you know, you've had a nice time at the pub, you walk outside,
00:26:59.780 a man comes up to you and says, give me your wallet. You say no, and he says, okay...
00:27:12.400 tomorrow we'll come back and I will give you any finite amount of well-being.
00:27:18.440 So just whatever amount of well-being you ask for, I will give you it.
00:27:25.060 And you say, no, I just think I'm certain that that's not going to happen.
00:27:30.080 And the man says, come on, you... look at my eyes, I've got this pale complexion, I look a bit, you know...
00:27:37.020 it's possible that I'm this, you know, traveler from the future, or this alien with, like, amazing powers, or I'm some sort of god.
00:27:44.780 It's possible. Maybe it's one in a billion billion billion billion that I will be able to do this, but it's at least possible.
00:27:52.340 And you take that one in a billion billion billion billion, multiply it by a sufficiently large amount of well-being for you,
00:27:58.060 such that, in what decision theorists call expected value,
00:28:03.200 which is where you take the probability of an outcome and the value of that outcome and multiply them together to get the expected value,
00:28:09.960 that expected value is greater than that 100 pounds for you.
00:28:13.880 And then maybe you say, oh yeah, okay, I guess that seems correct, and you hand over the hundred pounds.
00:28:20.720 And now, what Bostrom thinks, and what almost everyone who works in this area thinks, is that...
00:28:30.740 if you've got a decision theory (it's actually not about morality, it's about your theory of rationality, or your decision theory),
00:28:38.300 but if you've got a decision theory that says you should give the wallet to the mugger, then something has gone badly wrong.
00:28:45.420 And one way that I like to illustrate this, and this problem gets called the fanaticism problem,
00:28:53.100 and it's a really interesting area of decision theory, because I actually think it's the strongest argument
00:28:58.720 against a theory that is otherwise very good, which is expected value theory.
00:29:03.460 But seeing one way of this going wrong is if this argument is used for giving your money to the mugger,
00:29:10.020 or for looking for future amounts of value even if it were just a one in a billion chance that you could change it,
00:29:18.220 well, why are you going after only finite amounts of value?
00:29:21.540 There's a long religious tradition that says you can get infinite amounts of value via heaven.
00:29:26.460 And so really the upshot, if you're taking expected value theory seriously,
00:29:32.880 is to become a missionary, to try and ensure that you get into as many heavens as possible,
00:29:38.700 according to as many different forms of religious belief as possible, and try and convince others to do the same,
00:29:49.780 if you're really going with the math and going where it takes you.
00:29:52.060 And I know of zero people who have done that.
00:29:56.020 Everyone says, look, okay, something... we don't know what... I think this is an incredibly hard area within decision theory;
00:30:03.360 we don't know what the right answer is, but it's not that.
00:30:06.780 And the argument for concern about the long-term future
00:30:12.160 is the fact that these risks are more like the risks of dying in a car crash, and much less like the one in a billion billion Pascal's mugger kind of risk.
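To make the arithmetic being described concrete, here is a minimal sketch of the expected-value calculation behind Pascal's mugging; the probability and payoff figures are illustrative assumptions, not numbers from the conversation.

```python
# Illustrative sketch of the expected-value arithmetic behind Pascal's mugging.
# The specific numbers are hypothetical assumptions, not figures from the conversation.

def expected_value(probability: float, value: float) -> float:
    """Expected value: the probability of an outcome times the value of that outcome."""
    return probability * value

# Option 1: keep your 100 pounds for certain.
keep_wallet = expected_value(1.0, 100.0)

# Option 2: hand it over for a one-in-a-billion-billion-billion-billion chance
# of an astronomically large payoff in well-being (denominated here in pound-equivalents).
tiny_probability = 1e-36
huge_payoff = 1e40
pay_the_mugger = expected_value(tiny_probability, huge_payoff)

print(f"Keep the wallet: {keep_wallet:.2f}")
print(f"Pay the mugger:  {pay_the_mugger:.2f}")
# Because the promised payoff can always be named larger than the probability is small,
# naive expected value says to hand over the wallet, which is the conclusion the speakers
# treat as a reductio rather than an implication.
```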
00:30:20.680 Well, admittedly this is a problem I have not thought much about,
00:30:26.060 but I have a couple of intuitions here kindling as I listen to you describe it.
00:30:30.880 One is that, obviously, expected value becomes eminently rational within the domain of normal probabilities.
00:30:38.660 I mean, it works in poker, it works in any kind of decision analysis
00:30:42.960 where you're just trying to figure out what you should do under uncertainty,
00:30:47.400 and you're dealing with probabilities of one third or ten percent.
00:30:51.820 That is how we organize our intuitions about the future in the most systematic way.
00:30:59.340 And I guess there are even corner conditions within the domain of normal probabilities
00:31:05.440 where this is not a repeated game, you've got one shot to take this risk,
00:31:10.320 and it may be that risking your life, whatever you put on the other side of the scale, doesn't make sense.
00:31:17.300 Except we don't live that way. We risk our lives every time we get into a car, or get on a plane, or ride a bike...
00:31:23.360 like, we've all made these trade-offs, whether we've actually done the decision analysis or not,
00:31:27.940 and consciously assigned probabilities or not.
00:31:30.780 We just have this vague sense that certain types of risks are acceptable.
00:31:34.860 And I guess my other intuition is that this is somewhat analogous to the problems disgorged by population ethics,
00:31:44.780 where, when we talk about things like the repugnant conclusion, and you talk about very large numbers
00:31:50.980 of very small things, or very small utility, you seem to get these grotesque outcomes, ethically.
00:31:57.680 Where do you think this is breaking down, or why do you think it is breaking down?
00:32:02.700 So the standard response within decision theory would be to have what's...
00:32:08.900 so it all gets technical quickly, and I apologize if I don't do a good job of explaining the ideas,
00:32:13.920 but the standard response is to have what's called a bounded value function.
00:32:18.420 And it's easiest to see this when it comes to money, let's say.
00:32:24.900 So if the mugger in Pascal's mugging was saying, okay, I will take 100 pounds off you today,
00:32:32.220 and I will come back tomorrow with any finite amount of money you like that you can spend on yourself,
00:32:37.480 let's just assume you're completely self-interested, to keep it simple,
00:32:41.540 I could say, look, my probability of you coming back is so low
00:32:48.480 that there is no amount of money that you could give me
00:32:52.580 such that the probability times that amount of money would add up to more expected well-being for me than 100 pounds now.
00:33:03.860 So let's say I think it's one in a billion billion that the mugger will come back.
00:33:07.580 Okay, sure, the mugger could give me, you know, a thousand billion billion dollars,
00:33:16.060 and that would mean more... I would have greater expected financial return;
00:33:21.460 however, I wouldn't have greater expected well-being, because the more money I have, the less it impacts my well-being.
00:33:28.040 And there's some amount... there's only so much that money can buy for me.
00:33:33.000 At some point it starts making this, like, vanishingly small difference,
00:33:36.900 such that there's a level of well-being such that I can't get, like, a thousand times greater well-being with any amount more money.
00:33:46.680 And so that's very clear when it comes to financial returns, but you might think the same thing happens with well-being.
00:33:52.360 So supposing I have the best possible life, just, like, peaks of bliss and enjoyment and amazing relationships and creative insight, over and over,
00:34:04.820 and that's for some very long-lived life, perhaps you just imagine that.
00:34:10.220 Now you might think, oh, it's just not possible to have a life that's a thousand times better than that life we've just imagined.
00:34:19.260 And in fact, I think that's how people do in fact reason.
00:34:23.200 So even if you say to someone, look, you can have a 50-50 chance: you either die now,
00:34:29.620 or you get to live for as long as you like and choose when... when you pass away,
00:34:35.180 I think most people don't take that bet.
00:34:37.420 And that suggests that we actually think that perhaps the kind of...
00:34:41.880 I mean, there's issues with kind of very long lives, but we can reformulate the thought experiment either way.
00:34:48.560 I think people just do think there's just an upper bound to how good a life could be.
00:34:55.260 And as soon as you've accepted that, then, in the self-interest case,
00:34:59.720 you avoid... you can very readily avoid this kind of Pascal's mugging or Pascal's wager situation.
00:35:07.200 And perhaps you can make the same move when it comes to moral value as well,
00:35:11.460 where you're ultimately saying there's a limit to how good the world, or the universe, can be.
00:35:17.700 And as soon as you've got that, then you avoid Pascal's mugging, you avoid Pascal's wager.
00:35:22.340 However, you do get into other problems, and so that's why it's so difficult.
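A minimal sketch of the bounded value function move just described, for the self-interest case; the saturation cap and scale parameters are illustrative assumptions, not figures from the conversation. Once well-being saturates, no promised payout can make the mugger's offer beat simply keeping the hundred pounds.

```python
import math

# Hypothetical bounded utility function: well-being approaches a ceiling no matter how
# much money is on offer. CAP and SCALE are illustrative assumptions only.
CAP = 1_000.0     # maximum attainable well-being (arbitrary units)
SCALE = 50_000.0  # pounds at which the returns have largely flattened out

def utility(pounds: float) -> float:
    """Diminishing-returns utility: approaches CAP but never exceeds it."""
    return CAP * (1.0 - math.exp(-pounds / SCALE))

def expected_utility(probability: float, pounds: float) -> float:
    return probability * utility(pounds)

# Keep the 100 pounds for certain...
keep = expected_utility(1.0, 100.0)

# ...or accept a one-in-a-billion-billion chance of any payout, however large.
p_mugger = 1e-18
for payout in (1e6, 1e12, 1e24, 1e40):
    print(f"payout {payout:.0e}: expected utility {expected_utility(p_mugger, payout):.3e}")
print(f"keep 100 pounds:  expected utility {keep:.3e}")

# Because utility is bounded by CAP, the mugger's expected utility can never exceed
# p_mugger * CAP = 1e-15, far below the utility of keeping the money. A bounded value
# function therefore blocks the mugging, while raising other problems, as noted above.
```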
00:35:26.580 Yeah, I mean, the thing that I think is especially novel about what you've been doing, and about EA in general,
00:35:33.980 which I always tend to emphasize in this conversation; I think I've brought this up with you many times before,
00:35:40.920 but its novelty has not faded for me. And here's the non-novel part, about which someone could be somewhat cynical:
00:35:47.460 there's just this basic recognition that some charities are much more effective than others,
00:35:52.740 and we should want to know which are which and prioritize our resources accordingly.
00:35:56.880 I mean, that's the effective part of effective altruism. That's great, and that's not what has captivated me about this.
00:36:04.400 I think the thing that is most interesting to me is: by prioritizing effectiveness in an unsentimental way,
00:36:12.720 what you've done, for me, is you've uncoupled the ethics of what we're doing from positive psychology,
00:36:21.280 really just divorcing the question of how I can do the most good
00:36:25.580 from the question of what I find to be most rewarding about doing good in the world.
00:36:31.860 I'm always tempted to figure out how to marry them more and more, I think,
00:36:36.620 because I think that would be the most motivating and, by definition, the most rewarding way to live.
00:36:41.380 But I'm truly agnostic as to how married they might be,
00:36:46.240 and when I survey my own experience, I notice they're really not married at all.
00:36:51.720 For instance, over at Waking Up we launched this campaign, which we've called 100 Days of Giving,
00:36:58.720 and, you know, so for 100 days we're giving $10,000 a day to various charities,
00:37:05.200 and we're having subscribers over at Waking Up decide which charities, among the list that you and friends have helped me vet.
00:37:14.500 So every single day we're giving $10,000 away to some great cause,
00:37:20.480 and we're about 30 days into this campaign, and honestly, I think, for most of those 30 days,
00:37:28.780 I have not thought about it at all, right? I mean, literally, like, it has not even been a 10-second memory on any given day.
00:37:37.880 And, you know, I consider that on some level a feature rather than a bug,
00:37:43.480 but it is striking to me that, in terms of my own personal well-being
00:37:48.620 and my own sense of connection to the well-being of others,
00:37:52.340 I might actually be more rewarded, in fact I think I am almost guaranteed to be more rewarded,
00:37:59.400 by simply holding the door open for someone walking into a coffee shop later this afternoon,
00:38:06.080 and just having a moment of a smile, right?
00:38:09.160 Like, that absolutely trivial moment of good feeling will be more salient to me
00:38:15.320 than the $10,000 we're giving today and tomorrow and the next day and onward for months.
00:38:22.140 So I'm just wondering what you do with that.
00:38:24.260 I mean, it's a bug and a feature, because I really do think this is how you do the most good:
00:38:30.860 you automate these things, you decide rationally what you want to do, and you put it in place,
00:38:41.180 rather than you getting captivated with some other bright shiny object and forgetting about all the good you intended to do.
00:38:46.600 But I do hold out some hope that we can actually draw our feelings of well-being
00:38:56.520 more and more directly from the actual good we're accomplishing in the world,
00:39:01.520 and I just think there's more to think about on the front of how to accomplish that.
00:39:08.260 Yeah, you're exactly right.
00:39:09.500 And it's an old chestnut, but it's true, that our psychology, human psychology,
00:39:17.260 did not evolve to respond to the modern world; it evolved for the ancestral environment.
00:39:22.860 Our community, the people that we could affect, amounted to the dozens of people, perhaps.
00:39:29.880 Certainly the idea that you could impact someone on the other side of the world, or in generations to come,
00:39:36.240 that was not true in the way it's true today.
00:39:37.820 And so...
00:39:51.840 there's this project that effective altruism as a community has tried to embark on, to some extent,
00:39:57.680 which is just trying to align those two things a little bit better.
00:40:00.280 And it's tough, because we're not fundamentally aligned.
00:40:04.980 But the ideal circumstance is where you are part of a community
00:40:09.680 where you can get reward and support and reassurance for doing the thing that actually helps other people the most,
00:40:17.300 rather than the one where you get that immediate warm glow.
00:40:21.320 And that's not to say you shouldn't do those things too; I certainly do,
00:40:24.900 and I think it's part of living a good, well-rounded life.
00:40:28.960 But it's just a fact that there are people in very poor countries who are dying of easily preventable diseases,
00:40:40.400 there are animals suffering greatly on factory farms, and we can take actions to alleviate that suffering;
00:40:45.100 there are risks that are maybe going to affect the entire fate of the planet,
00:40:49.920 as well as killing billions of people in the present generation.
00:40:52.700 Ah, like, that is not something that my monkey brain is equipped to deal with.
00:40:58.420 And so, you know, we're just... we've got to try as best we can.
00:41:02.560 And that's something, at least for me, where having a community of people who share this idea,
00:41:08.160 who I can talk these ideas through with, and who can, if needed, if I find these things hard,
00:41:15.320 provide support and reassurance, that's at least a partial solution to this problem:
00:41:20.000 to get my monkey brain to care more about the things that really matter.
00:41:24.740 Well, so, are there any other problems that have been... that you've noticed,
00:41:30.120 or that you think we should touch on before we jump into the book
00:41:36.460 and this discussion of longtermism and existential risk?
00:41:40.660 Sure. I think the number one thing that is, in some sense, a problem,
00:41:46.980 but I think is just kind of where the debate should be,
00:41:49.920 is just what follows from longtermism.
00:41:52.360 Where the issue... the thing... the thoughts that I want to promote and get out into the world
00:41:58.760 are that future people count, there could be enormous numbers of them,
00:42:03.380 and we really can, and in fact are, impacting how their lives go.
00:42:07.600 And then there's the second question of, well, what do we do about that? What are the top priorities for the world today?
00:42:15.240 Is that the standard environmentalist answer of mitigate climate change, reduce resource depletion, reduce species loss?
00:42:22.900 Is it the classic existential risk answer of reduce the risk of misaligned AI, reduce the risk of a worst-case pandemic?
00:42:31.380 Is it the idea of just promoting better values?
00:42:35.500 Is it something that I think you're sympathetic to, and others, like Tyler Cowen or Patrick Collison, are sympathetic to,
00:42:41.760 which is just generally make the world saner and more well-functioning, because, you know, predicting the future is so hard?
00:42:49.220 Or is it perhaps about just trying to put the next generation into a situation
00:42:54.680 such that they can make a better and more informed decision,
00:42:57.060 and have at least the option to decide how their future will go?
00:43:02.420 These are extremely hard questions, and I actively want to see an enormous amount of debate on what follows from that,
00:43:10.960 perhaps a thousand times as much work as has gone into it, I would love to see, discussing these issues.
00:43:18.060 And there's an enormous amount to say.
00:43:22.320 And so that's where I think the real meat of debate lies.
00:43:28.360 Okay, so let's talk about the book, because you spoke a couple of minutes ago
00:43:32.540 about how we are misaligned, in psychological terms, by virtue of imperfect evolution, to think about problems at scale.
00:43:44.420 And that is certainly true, but we have made impressive progress with respect to scaling our concern
00:43:53.560 about our fellow human beings over physical distance. Distance with respect to space we kind of understand ethically now.
00:44:02.500 Peter Singer has done a lot of work on this topic, urging us to expand the circle of our moral concern,
00:44:10.220 and I think something like 100% of the people listening to us will understand
00:44:16.860 that you can't be a good person and say that you only care about people who live within a thousand miles of your home, right?
00:44:25.520 I mean, that's just not... that's just not compatible with life in the 21st century.
00:44:30.620 Although there's some discounting function, no doubt, and it just... I think normal psychology
00:44:35.880 would cause us to care more about family and friends than about perfect strangers,
00:44:43.240 and certainly perfect strangers in faraway cultures for which we have no affinity, etc.
00:44:49.240 And we can, I think we've discussed this in previous conversations, just what might be normative there,
00:44:54.780 whether we actually would want perfect indifference and dispassion with respect to all those variables,
00:45:01.100 or if there's some hierarchy of concern that's still appropriate, and I think it probably is.
00:45:07.460 But the topic of today's conversation, and of your book, is just how abysmally we have failed
00:45:17.360 to generate strong intuitions about the moral responsibility we have, and the moral opportunity we have, to safeguard the future.
00:45:27.320 So perhaps you can, with that setup, give me the elevator synopsis of the new book.
00:45:33.060 sure
00:45:34.020 so
00:45:34.420 like I
00:45:35.780 said
00:45:35.960 this book
00:45:36.360 is about
00:45:36.780 long-termism
00:45:37.500 about
00:45:38.420 just
00:45:38.780 taking
00:45:39.380 seriously
00:45:39.860 the future
00:45:40.760 that's
00:45:40.980 ahead of
00:45:41.240 us
00:45:41.520 taking
00:45:42.340 seriously
00:45:42.880 the
00:45:43.920 moral value
00:45:44.860 that future
00:45:45.620 generations
00:45:46.120 have
00:45:46.580 and then
00:45:47.320 also
00:45:47.540 taking
00:45:47.800 seriously
00:45:48.160 the fact
00:45:48.640 that
00:45:48.840 what we
00:45:49.580 do
00:45:49.900 as a
00:45:50.760 society
00:45:51.160 really
00:45:51.460 will
00:45:51.780 impact
00:45:52.300 the
00:45:53.020 very
00:45:53.200 long
00:45:53.460 term
00:45:53.700 and
00:45:54.940 I
00:45:55.200 think
00:45:55.360 there
00:45:55.500 are
00:45:55.620 two
00:45:55.980 categories
00:45:56.920 of
00:45:57.200 things
00:45:57.460 that
00:45:57.620 impact
00:45:57.980 the
00:45:58.140 very
00:45:58.300 long
00:45:58.540 term
00:45:58.840 where
00:45:59.960 there's
00:46:01.120 things
00:46:01.500 that
00:46:01.700 could
00:46:01.960 just
00:46:02.820 end
00:46:03.160 civilization
00:46:03.780 altogether
00:46:04.240 and
00:46:05.620 this
00:46:06.140 concern
00:46:06.640 really
00:46:07.080 got on
00:46:07.520 society's
00:46:08.080 radar
00:46:08.320 with the
00:46:09.140 advent
00:46:09.900 of
00:46:10.100 nuclear
00:46:10.420 weapons
00:46:10.820 I think
00:46:12.040 an all-out
00:46:13.220 nuclear war
00:46:13.900 I think
00:46:14.580 unlikely to
00:46:15.840 kill literally
00:46:16.320 everybody
00:46:16.680 but it
00:46:17.120 could result
00:46:17.620 in
00:46:17.920 hundreds of
00:46:20.300 millions
00:46:20.500 maybe even
00:46:20.940 billions
00:46:21.240 dead
00:46:21.520 unimaginable
00:46:22.500 tragedy
00:46:22.900 which I
00:46:23.540 think
00:46:23.720 would make
00:46:24.040 the world
00:46:24.480 worse
00:46:25.500 not
00:46:25.740 just
00:46:26.640 for
00:46:26.780 those
00:46:26.960 people
00:46:27.180 but
00:46:27.440 for
00:46:27.820 the
00:46:27.960 extremely
00:46:28.820 long
00:46:29.080 time
00:46:29.400 and
00:46:30.040 could
00:46:30.200 result
00:46:30.500 in
00:46:30.640 the
00:46:30.740 collapse
00:46:31.020 of
00:46:31.180 civilization
00:46:31.700 too
00:46:32.120 At the more extreme level, there's worries about, in particular,
00:46:37.540 engineered pandemics, and this is a risk that we've been
00:46:41.280 worrying about for many years now, where pandemics occur
00:46:45.480 occasionally and are extremely bad, and so society doesn't
00:46:49.580 prepare for them in the way they should.
00:46:50.920 I think the risk is going to get much worse over time,
00:46:54.080 because not only do we have to contend with natural pandemics,
00:46:57.400 we will also have to contend with pandemics from viruses
00:47:01.960 that have been enhanced to have more destructive properties,
00:47:04.700 or even just created de novo, such as in a bioweapons program.
00:47:08.800 This is a very scary prospect, and current estimates put the risk
00:47:13.280 of all-out catastrophe at something like 1%.
00:47:17.760 Essentially, a second set of events that could impact the very long term
00:47:23.260 are things that don't impact the length of civilization,
00:47:28.740 that is, it wouldn't kill us off early,
00:47:31.740 but does affect the value of that civilization,
00:47:33.700 kind of civilization's quality of life, as it were.
00:47:37.200 And in What We Owe the Future, I focus in particular on values,
00:47:40.520 where we're used to values changing an enormous amount over time,
00:47:45.060 but that's actually something that I think is contingent about our world,
00:47:49.140 and I think there are ways, especially via technological
00:47:53.820 and political developments, that that change could slow,
00:47:57.840 and in the worst case our future could be one
00:48:01.460 of perpetual totalitarian dictatorship.
00:48:03.540 That would be very scary indeed.
00:48:05.240 That would be a loss of almost all value that we could have,
00:48:08.140 an existential risk itself, even if that civilization were very long-lived.
00:48:12.140 And so what can we do? Things vary in terms
00:48:16.400 of how tractable they are, like how much there is actually to do,
00:48:19.180 but there are many things I think we can do.
00:48:21.400 So on the ensuring-we-have-a-future side of things,
00:48:25.380 I actually think that there's very concrete actions we can take
00:48:28.980 to reduce the risk of the next pandemic,
00:48:30.600 such as technology to monitor wastewater for new pathogens,
00:48:36.580 such as this new form of lighting called far-UVC
00:48:40.820 that I'm extremely excited about.
00:48:42.180 If we can get the cost down enough, then we can just be
00:48:45.560 sterilizing rooms in an ongoing way, in a way that's not harmful for human health.
00:48:51.660 Or on the values kind of side of things, well,
00:48:54.540 we can be pushing for people to be more altruistic,
00:48:57.560 more reasonable in the way they think about moral matters,
00:49:02.120 perhaps more impartial, taking seriously the interests
00:49:04.620 of all sentient beings, whether they're human or animal or other,
00:49:10.800 or whether they're now or in the future.
00:49:12.500 And we can be kind of guarding against certain authoritarian tendencies
00:49:19.120 that I think are very common and we have seen in history
00:49:21.940 over and over again, and that I think are quite scary
00:49:24.460 from a long-term perspective.
00:49:25.360 And then lastly, we can just start reasoning more about this.
00:49:28.600 I said earlier that, you know, I don't know, I'm not like super confident
00:49:32.280 about what we should be doing, but certainly we can be
00:49:34.440 doing more investigation, building a movement that takes seriously
00:49:38.260 that, yeah, there's this enormous potential future ahead of us
00:49:42.660 and we can do an enormous amount of good to make it better.
00:49:45.960 Yeah, I want to talk about value lock-in and some of the issues there,
00:49:51.840 but before we get there, I think we should linger
00:49:54.580 on what makes thinking about the future so difficult.
00:50:00.040 I think you say
00:50:01.380 somewhere in the
00:50:02.000 book that but I
00:50:03.120 mean both with
00:50:03.720 respect to
00:50:04.400 distance in space
00:50:05.900 and distance
00:50:06.380 in time
00:50:07.060 I think you
00:50:08.040 say something
00:50:08.360 like we
00:50:08.700 mistake distance
00:50:10.100 for unreality
00:50:11.300 right like the
00:50:12.280 fact that
00:50:12.760 something is
00:50:13.340 far away from
00:50:14.180 us makes
00:50:15.520 part of our
00:50:16.460 brain suspect
00:50:17.780 that it may
00:50:19.100 not quite
00:50:19.820 exist and
00:50:21.280 this is again
00:50:21.920 I think this
00:50:22.620 has changed
00:50:24.060 fairly categorically
00:50:26.360 with respect to
00:50:27.260 space where you
00:50:28.920 just can't really
00:50:30.340 imagine that the
00:50:31.200 people suffering
00:50:32.200 over in Yemen
00:50:33.380 don't exist
00:50:34.540 but it's much
00:50:36.240 harder to
00:50:36.740 assimilate this
00:50:38.020 new intuition
00:50:38.700 with respect to
00:50:39.420 time because
00:50:40.060 in this case
00:50:41.520 we are talking
00:50:42.160 about people
00:50:42.880 who don't
00:50:44.380 actually exist
00:50:45.340 you know
00:50:45.800 there'll be some
00:50:47.520 number of
00:50:48.380 thousands of
00:50:48.960 people probably
00:50:49.740 you know
00:50:50.200 three to
00:50:51.240 five thousand
00:50:51.840 people or so
00:50:52.600 born over the
00:50:54.480 course of our
00:50:55.200 having this
00:50:55.640 conversation right
00:50:56.720 globally speaking
00:50:57.440 yeah I think
00:50:58.280 the math is
00:50:58.880 somewhere around
00:50:59.640 there and so
00:51:00.520 it's easy to
00:51:01.720 argue okay
00:51:03.120 these are people
00:51:03.960 who are coming
00:51:04.560 into this world
00:51:05.440 for which they
00:51:06.420 have no
00:51:07.040 responsibility and
00:51:08.420 we you know
00:51:09.360 this the present
00:51:09.900 generation or
00:51:11.000 generations are
00:51:12.540 entirely responsible
00:51:13.680 for the quality of
00:51:15.560 the world they
00:51:16.620 inherit and the
00:51:17.460 quality of the
00:51:18.460 lives they begin
00:51:19.920 to lead and I
00:51:22.320 would say we
00:51:22.800 don't you know
00:51:24.000 collectively we
00:51:24.660 don't think nearly
00:51:25.740 enough about that
00:51:27.120 responsibility but
00:51:28.780 when you're talking
00:51:29.540 about people who
00:51:30.380 may or may not
00:51:32.100 be born in
00:51:33.660 some distant
00:51:34.780 future we
00:51:36.200 don't think about
00:51:36.700 these people at
00:51:37.280 all, and I
00:51:38.180 suspect
00:51:39.300 that most people
00:51:40.620 wouldn't see any
00:51:42.000 problem with that
00:51:43.080 because it's just
00:51:43.900 when you think
00:51:45.040 about what a
00:51:46.100 crime it would be
00:51:47.060 for us to
00:51:47.640 foreclose the
00:51:48.420 future in some
00:51:49.100 way you know if
00:51:49.700 we all die in
00:51:50.360 our sleep tonight
00:51:51.120 painlessly and
00:51:52.780 there is no human
00:51:53.560 future what's bad
00:51:55.500 about that well
00:51:56.440 you have to tell
00:51:57.300 yourself a story
00:51:57.920 about all the
00:51:58.520 unrealized joy and
00:52:00.300 happiness and
00:52:00.920 creativity that
00:52:02.280 exists in
00:52:03.600 potentia on that
00:52:05.080 side of the
00:52:05.540 ledger but there's
00:52:06.840 no there are no
00:52:07.620 victims deprived of
00:52:09.380 all of that
00:52:09.840 goodness because they
00:52:11.240 don't exist right
00:52:12.140 so if this is the
00:52:13.740 very essence of a
00:52:14.560 victimless crime and
00:52:15.980 I do think you
00:52:16.660 and I have touched
00:52:17.620 that issue in a
00:52:18.240 previous conversation
00:52:19.080 but it's just hard
00:52:20.900 to think about the
00:52:22.500 far future much
00:52:24.240 less prioritize it
00:52:25.420 and I'm just
00:52:26.760 wondering if
00:52:27.140 there's something
00:52:27.440 we can do to
00:52:28.460 shore up our
00:52:29.700 intuitions here
00:52:30.560 I guess the
00:52:31.780 first point I
00:52:32.380 would make is
00:52:32.840 that the only
00:52:33.820 reason why these
00:52:35.840 people are
00:52:36.260 hypothetical is
00:52:38.100 because you the
00:52:39.400 listener are
00:52:40.160 holding out some
00:52:41.440 possibility that
00:52:43.120 they may not
00:52:44.380 exist when you
00:52:45.680 ask yourself what
00:52:46.460 would cause
00:52:47.300 subsequent future
00:52:48.460 generations of
00:52:49.360 people not to
00:52:50.560 exist perhaps
00:52:51.840 there's some
00:52:52.340 other version but
00:52:53.260 you know the
00:52:53.820 only thing I can
00:52:54.420 imagine is some
00:52:55.500 catastrophe
00:52:56.600 of the sort
00:52:57.260 that you have
00:52:58.240 begun sketching
00:52:59.040 out right
00:52:59.460 some pandemic
00:53:00.540 some asteroid
00:53:01.700 impact some
00:53:02.440 nuclear war some
00:53:03.480 combination of
00:53:04.340 those things that
00:53:06.000 destroy everything
00:53:07.620 for everyone which
00:53:08.560 is to say some
00:53:09.300 absolutely horrific
00:53:11.460 cataclysm which we
00:53:13.840 should be highly
00:53:14.400 motivated to avoid
00:53:15.540 right so if we
00:53:16.980 successfully avoid
00:53:18.060 the worst possible
00:53:19.880 outcome for
00:53:21.040 ourselves and our
00:53:21.980 children and our
00:53:22.760 grandchildren the
00:53:23.940 future people we're
00:53:24.760 talking about are
00:53:25.840 guaranteed to exist
00:53:27.040 it's only time
00:53:28.360 shifted it's not in
00:53:30.300 fact in question and
00:53:32.360 the only thing that
00:53:32.860 would put it in
00:53:33.380 question is some
00:53:34.740 horrific outcome that
00:53:35.820 we are responsible for
00:53:38.040 foreseeing and
00:53:39.500 avoiding and really
00:53:40.680 no one there's no one
00:53:41.440 else to do the job
00:53:42.460 absolutely I mean so
00:53:44.880 yeah there's this
00:53:45.900 general issue where
00:53:46.940 why do we care about
00:53:48.900 people on the other
00:53:49.980 side of the planet
00:53:50.780 much more now than we
00:53:52.300 did let's say a hundred
00:53:53.280 years ago and it's
00:53:54.800 clear that a big part
00:53:55.700 of it is that now we
00:53:56.640 can see people on the
00:53:57.620 other side of the
00:53:58.180 world we get
00:53:58.880 information about them
00:53:59.840 in a way that we
00:54:00.840 never used to where
00:54:02.260 literally there can be
00:54:03.360 a live stream video of
00:54:04.840 someone in Nigeria or
00:54:07.480 Pakistan or India or
00:54:08.820 any other country
00:54:09.940 around the world so
00:54:11.520 that you know those
00:54:12.960 people's lives become
00:54:14.020 vivid to us in a way
00:54:15.680 that's just not
00:54:16.240 possible for future
00:54:16.980 generations and that
00:54:17.860 makes this challenge
00:54:18.700 just so deep in fact to
00:54:20.760 get people to care if
00:54:22.340 there were news from
00:54:23.540 the future you know
00:54:24.780 the newspaper of
00:54:25.920 tomorrow I think we
00:54:27.180 wouldn't be in such a
00:54:27.920 bad state you know we
00:54:29.420 could report on what
00:54:30.380 the world is like in
00:54:31.280 the year 2100 or 2200
00:54:34.560 and I think people
00:54:36.900 would care but that's
00:54:38.320 not the news we get
00:54:39.140 it's all abstract and
00:54:40.800 that and that's really
00:54:41.500 tough the thing I'll say
00:54:43.460 is so you know you
00:54:44.700 mentioned oh the idea
00:54:45.940 of some catastrophe
00:54:48.140 just painlessly kills
00:54:49.980 all of us in our
00:54:50.820 sleep and you know
00:54:52.120 what would that be a
00:54:53.100 tragedy and I mean I
00:54:54.820 think yes and I argue
00:54:56.600 in the book at length
00:54:58.060 that it would be a
00:54:58.880 tragedy both because I
00:55:00.300 think that we should
00:55:01.540 expect the future to be
00:55:02.640 on balance good and
00:55:04.060 secondly because I
00:55:04.860 think the loss of
00:55:06.080 lives, at least sufficiently
00:55:07.820 good lives, and
00:55:09.460 certainly the loss of
00:55:10.680 all the moral and
00:55:12.560 artistic and
00:55:13.220 technological progress
00:55:14.500 that we might see in
00:55:15.300 the future that those
00:55:16.760 are great losses but I
00:55:19.000 want to emphasize the
00:55:19.900 case for long-termism
00:55:20.800 does not rely on that
00:55:21.840 you could think that
00:55:23.500 everyone dying
00:55:24.700 painlessly in their
00:55:25.460 sleep would not be a
00:55:26.860 bad thing morally
00:55:27.720 speaking nonetheless a
00:55:30.560 future of perpetual
00:55:31.560 totalitarian
00:55:32.240 dictatorship really would
00:55:33.440 be compared to a
00:55:35.000 wonderful free thriving
00:55:36.720 and liberal future
00:55:38.200 where different people
00:55:39.500 can pursue their
00:55:40.420 visions of the good
00:55:41.360 and we've eradicated
00:55:42.740 issues that plague
00:55:43.920 today like injustice
00:55:45.140 and poverty I just
00:55:47.200 think like you
00:55:48.420 really don't need to
00:55:49.280 engage in much
00:55:49.940 moral philosophy to
00:55:50.880 say that if we have a
00:55:52.340 choice between one or
00:55:53.180 the other then it's
00:55:54.600 better and much better
00:55:55.700 to think about the
00:55:57.080 thriving flourishing
00:55:57.920 future than this
00:55:59.100 dystopian one
00:55:59.840 yeah no that's a
00:56:01.000 great point the forget
00:56:02.640 about the thought
00:56:03.240 experiments it should be
00:56:04.400 pretty easy to
00:56:05.540 acknowledge that
00:56:06.500 there's a difference
00:56:07.700 between having to live
00:56:08.840 under the thousand
00:56:09.660 year Reich and living
00:56:11.760 under some version of
00:56:13.640 the best possible
00:56:15.220 solution to the
00:56:17.100 problem of global
00:56:18.140 civilization is there
00:56:19.620 anything more to say
00:56:21.420 about it just in the
00:56:22.760 abstract about the
00:56:23.980 significance of the
00:56:25.480 future I mean I guess
00:56:26.780 I'm I'm wondering if
00:56:28.160 you've thought at all
00:56:28.860 about how just because
00:56:30.860 again this is it's
00:56:32.240 surprising that I think
00:56:33.560 we think about this
00:56:34.840 almost not at all and
00:56:36.900 yet our assumptions
00:56:38.420 about it do a lot of
00:56:40.180 psychological work for
00:56:41.480 us I mean like I
00:56:42.460 really you know I
00:56:44.040 mean this is this is
00:56:44.780 my job to have
00:56:45.600 conversations like this
00:56:46.640 and I'm surprised at
00:56:48.880 how little I think
00:56:50.600 about the far future
00:56:51.940 right in any kind of
00:56:53.620 significant way and
00:56:55.140 yet if I imagine how
00:56:57.000 I would feel if I
00:56:58.740 knew that it wouldn't
00:57:00.900 exist right I mean if
00:57:02.360 we just figured out
00:57:03.180 that oh you actually
00:57:04.100 the whole story ends
00:57:06.840 in 80 years right so
00:57:09.380 we're unlikely to be
00:57:11.040 around to suffer much
00:57:12.460 or we're going to be
00:57:13.560 out out of the game
00:57:15.100 pretty soon and you
00:57:17.540 know people have
00:57:18.120 haven't had kids don't
00:57:19.800 need to worry about
00:57:20.380 them and so just the
00:57:21.540 basic picture is just
00:57:22.480 the career of the
00:57:23.740 species is radically
00:57:25.880 foreclosed at some
00:57:27.760 point inside of a
00:57:28.580 century what would
00:57:30.100 that do to everything
00:57:32.800 and how would that
00:57:35.100 sap the meaning from
00:57:37.900 what we do and it's
00:57:39.700 just it's an interesting
00:57:40.640 question which I haven't
00:57:41.440 thought much about but
00:57:42.240 do you have any
00:57:42.800 intuitions about that
00:57:44.100 it's a great thought
00:57:45.240 experiment and it's
00:57:46.600 represented in this
00:57:47.580 film Children of Men
00:57:48.660 where the global
00:57:49.920 population becomes
00:57:50.820 sterile and it's
00:57:52.660 discussed at length in
00:57:53.760 the writings of
00:57:55.640 philosopher Samuel
00:57:56.980 Scheffler in a book
00:57:58.000 called Death and the
00:57:58.680 Afterlife, and he argues
00:58:01.040 that if it were the
00:58:03.240 case that just yeah
00:58:04.400 there's no world at all
00:58:05.620 let's say past our
00:58:06.620 generation or even our
00:58:07.820 children's generation
00:58:08.600 it would sap enormous
00:58:10.580 amounts of in fact
00:58:12.860 perhaps most meaning
00:58:13.860 from our lives this
00:58:15.680 is actually a point
00:58:16.380 made by John Stuart
00:58:17.680 Mill in the end of the
00:58:18.660 19th century where he
00:58:21.280 briefly had this phase
00:58:22.160 of being concerned by
00:58:23.080 future generations
00:58:23.940 arguing actually that
00:58:25.320 means they should be
00:58:26.020 keeping coal in the
00:58:26.760 ground which I think
00:58:28.140 wasn't quite right,
00:58:28.960 you know, maybe it
00:58:30.520 made sense given their
00:58:31.460 views at the time,
00:58:32.220 or something, though their
00:58:33.280 views were incorrect, but
00:58:34.940 he has this beautiful
00:58:35.560 speech to parliament
00:58:36.340 and he has this
00:58:38.340 wonderful line why
00:58:39.460 should we care about
00:58:40.160 posterity after all
00:58:41.500 what has posterity ever
00:58:42.480 done for us and he
00:58:44.320 goes on to argue that
00:58:45.160 actually like posterity
00:58:46.120 has done an enormous
00:58:47.120 amount for us because
00:58:48.280 it gives our projects
00:58:49.540 meaning where you know
00:58:51.400 we build we plant
00:58:52.440 trees that future
00:58:54.360 generations will sit
00:58:55.760 under we build
00:58:56.820 cathedrals that are
00:58:58.340 like a testament to our
00:58:59.600 time and to can be
00:59:01.400 recognized by future
00:59:02.340 generations when we
00:59:03.760 engage in activities
00:59:04.940 like moral progress
00:59:06.080 and political progress
00:59:07.040 and scientific and
00:59:07.920 technological progress
00:59:08.980 part of how we have
00:59:11.680 meaning in doing those
00:59:13.140 things is because it's
00:59:14.700 like we're part of this
00:59:15.560 grand relay race where
00:59:17.320 we have taken the
00:59:18.060 baton from previous
00:59:20.040 generations the things
00:59:21.760 they did badly and the
00:59:22.700 things they did well
00:59:23.440 and we can hand that
00:59:24.800 baton on to future
00:59:25.720 generations making the
00:59:26.760 world a little bit
00:59:27.340 better giving them like
00:59:29.040 you know a little bit
00:59:30.200 of a higher platform to
00:59:31.160 stand on and yeah I
00:59:34.100 think this is just a
00:59:34.780 very widely shared
00:59:35.720 view and I think it
00:59:37.260 kind of resonates with
00:59:38.020 me intuitively too
00:59:39.000 where if I found out
00:59:40.600 that everything was
00:59:42.060 going to end in 50
00:59:42.940 years I wouldn't be
00:59:44.540 like oh well I guess
00:59:45.480 I'll just you know
00:59:46.140 focus on issues that
00:59:47.960 just concern the near
00:59:49.160 term then I would you
00:59:50.740 know I would be
00:59:51.340 devastated even if that
00:59:52.800 didn't personally affect
00:59:54.080 anyone I knew or you
00:59:56.640 know perhaps it's not
00:59:57.220 50 years perhaps it's
00:59:58.140 200 years nonetheless I
01:00:00.040 would still I would be
01:00:00.780 absolutely gutted.
01:00:01.520 Yeah it's really
01:00:03.280 interesting because it's
01:00:04.160 not something that I've
01:00:06.700 ever been aware of but
01:00:08.420 it is an integral
01:00:10.020 piece of how I think
01:00:11.460 about what we're doing
01:00:13.080 here which is when you
01:00:14.360 think about human
01:00:15.320 progress you think about
01:00:16.320 all the good things we
01:00:17.580 do and all the bad
01:00:18.520 things we can avoid the
01:00:20.440 idea that that is all
01:00:22.220 going to be brought up
01:00:23.100 short in a few short
01:00:24.660 years right like I mean
01:00:25.960 we just think of like
01:00:26.520 well what's the point of
01:00:27.980 building increasingly
01:00:30.060 wonderful and powerful
01:00:32.320 and benign technology
01:00:33.840 or you know creating
01:00:35.300 beautiful art or
01:00:36.900 literature or just
01:00:38.900 making progress on any
01:00:40.440 front philosophically or
01:00:41.900 I mean beyond just
01:00:43.500 securing your day-to-day
01:00:46.520 comfort and you know
01:00:48.540 basic happiness there's
01:00:50.780 something forward-leaning
01:00:51.960 about all of that it's a
01:00:54.880 check that in some sense
01:00:56.620 never gets cashed or
01:00:58.240 fully cashed or it just
01:00:59.560 gets continually cashed
01:01:00.900 in the future if not by
01:01:03.520 our future selves by
01:01:05.200 others to which we're
01:01:06.280 connected and the idea
01:01:07.640 that we could ever know
01:01:09.360 that there's just a few
01:01:10.400 more years of this and
01:01:11.360 then it's all over I mean
01:01:12.880 we all know that
01:01:13.760 personally with respect to
01:01:15.660 our own mortality but
01:01:17.620 there is something about
01:01:18.880 the open-endedness of the
01:01:21.840 whole project for others
01:01:24.140 you know even if you don't
01:01:25.300 have kids that changes
01:01:27.460 again it is I really
01:01:29.500 don't even have my
01:01:31.200 intuitions around this
01:01:32.340 are so unformed but it
01:01:34.360 really does feel like it
01:01:35.880 changes something about
01:01:38.100 day-to-day life if the rug
01:01:40.260 were truly pulled out from
01:01:41.480 under us in a way that we
01:01:42.680 had to acknowledge you
01:01:44.260 know leaving everything
01:01:45.120 else intact right I mean
01:01:47.040 all the buildings are still
01:01:48.020 standing we've got you
01:01:49.140 know there's still a
01:01:50.100 concert Saturday night in
01:01:51.500 the same auditorium and you
01:01:52.820 have tickets but you knew
01:01:55.280 the world was going to
01:01:56.440 end and you were just
01:01:58.400 going to get a collective
01:01:59.720 dial tone at some point
01:02:02.120 in the near future you
01:02:03.560 know I don't know what to
01:02:04.200 what degree but it would
01:02:05.360 change everything to some
01:02:07.120 degree yeah so in What We
01:02:08.800 Owe the Future I actually
01:02:09.900 start and end the book on
01:02:12.560 the same note which is a
01:02:13.980 note of gratitude where the
01:02:16.460 book it's dedicated to my
01:02:19.500 parents, my mother Robin, and
01:02:21.780 their parents Tom and
01:02:23.300 Nina and Frank and Daphne
01:02:25.340 and so on all the way
01:02:26.760 back all the history that
01:02:28.760 led us here and at the
01:02:31.420 very end of the book in
01:02:32.200 the very last page there's
01:02:33.520 in the print version there's
01:02:35.540 this QR code and if you
01:02:36.580 scan it it takes you to a
01:02:38.700 little short story I wrote
01:02:40.280 I'm not claiming it's any
01:02:41.420 good but I did it because I
01:02:44.140 think it's so hard to
01:02:45.040 visualize the future
01:02:45.820 especially to visualize how
01:02:47.340 good the future could be and
01:02:48.940 in the end of that short
01:02:49.860 story one of the characters
01:02:51.480 reflects on everything that
01:02:53.840 led and it's you know this
01:02:54.980 depiction of this I'm not
01:02:56.760 saying it's the best
01:02:57.300 possible future but at
01:02:58.240 least a good one and one
01:02:59.960 of the characters just
01:03:00.740 reflects on just everything
01:03:01.980 that led to her life today
01:03:04.400 all of the hardship the ways
01:03:06.960 in the past in which people
01:03:09.320 endured sickness and
01:03:11.380 suffering and rides on the
01:03:13.940 subway packed in with other
01:03:15.740 people who are you know
01:03:17.120 miserable on their commute
01:03:18.680 and just being deeply
01:03:20.040 thankful for all of that
01:03:21.480 and that's certainly how I
01:03:23.300 feel where you know the
01:03:25.200 past has given us past
01:03:26.560 generations have given us
01:03:27.560 this mixed bag they've given
01:03:29.540 us many ways in which the
01:03:31.140 world is broken many ways in
01:03:32.740 which we still see enormous
01:03:34.080 suffering but they've also
01:03:35.500 given us wonderful good
01:03:36.900 things too I live in this like
01:03:38.640 liberal democratic country that
01:03:40.180 is exceptionally unusual by
01:03:42.160 historical standards I don't
01:03:44.200 need to suffer pain when I
01:03:46.760 engage in a surgery I don't
01:03:48.700 need to fear for my life from
01:03:50.720 diseases like syphilis or
01:03:52.600 tuberculosis I can see the
01:03:55.480 kind of just on my phone I
01:03:57.140 can just see the most
01:03:57.920 wonderful art produced in the
01:03:59.220 entire world at any time I can
01:04:01.480 see the world I can go
01:04:02.520 traveling in like India Sri Lanka
01:04:04.780 Ethiopia wow I'm just so
01:04:07.660 grateful for all that's been
01:04:08.760 done the very least that I can
01:04:11.180 do is try and take that next
01:04:12.800 package of goods and
01:04:14.440 bads and try and
01:04:16.680 pass it on to the future, trying to
01:04:18.060 ensure that there even are
01:04:19.380 future generations that I can
01:04:20.580 pass it to but also to make
01:04:22.460 it better and so that's kind
01:04:24.180 of yeah you know normally I
01:04:26.220 trade in these like logical
01:04:28.140 rational arguments but at
01:04:30.080 least in terms of what
01:04:30.840 motivates me on a on a gut
01:04:32.480 level that's a significant
01:04:33.880 part of it so we have
01:04:36.060 something like 500 million to
01:04:39.180 1.3 billion good years left on
01:04:43.080 this planet if the behavior of
01:04:45.120 our Sun is any indication so
01:04:47.840 the challenge here is to not
01:04:49.840 screw it up and you know we
01:04:52.060 can leave aside the possibility
01:04:53.420 of of reaching for the stars
01:04:55.340 which we will almost certainly
01:04:56.840 do if we don't screw things up
01:04:58.780 and you know spreading out
01:05:00.860 across the galaxy but even on
01:05:03.040 this lowly planet we have just
01:05:05.880 an extraordinary amount of time
01:05:07.840 to make things better and better
01:05:10.580 and you know we're just at the
01:05:13.100 point where we can get our hands
01:05:15.600 around the kinds of things that
01:05:17.900 could ruin everything not of
01:05:20.520 our own making like you know
01:05:21.780 asteroid impacts and naturally
01:05:24.800 spawned pandemics and there's clear
01:05:27.900 ways in which our time is unusual
01:05:30.300 this way I think at one point in the
01:05:32.320 book you discuss how for virtually all
01:05:35.240 of the past we were culturally isolated
01:05:37.760 from one another and we may one day be
01:05:40.320 again culturally isolated if we
01:05:42.300 expand out across the galaxy which is
01:05:44.120 to say that there will be a time in
01:05:45.840 the future where human communities
01:05:47.820 grow far enough apart that what they
01:05:50.960 do has really no implication for
01:05:53.160 what anyone else does outside of their
01:05:55.400 community, and that was
01:05:56.700 true of most of human history
01:05:59.680 until you know recent centuries I think
01:06:03.380 there's a more fundamental claim in
01:06:05.900 your book which is that there's good
01:06:08.640 reason to believe that we are at
01:06:10.080 something like a hinge moment in
01:06:12.780 history where the things we do and
01:06:15.520 don't do have an outsized effect
01:06:19.680 over the trajectory of human life in
01:06:22.980 the future and you know there are
01:06:24.600 things that have come up briefly so far
01:06:27.780 like value lock-in which are related to
01:06:30.560 that but and which we'll talk about but
01:06:33.100 what do you think about this
01:06:35.300 question of whether or not we're
01:06:37.740 living at a hinge moment and you know
01:06:40.940 how we can know and how we may or may
01:06:43.240 not be mistaken about that I think it's
01:06:46.680 clear that we're living at an extremely
01:06:48.220 unusual time that gives a good argument
01:06:51.240 for thinking that we're living at an
01:06:53.240 unusually impactful time too and I'm not
01:06:56.380 going to claim whether we're living at the
01:06:57.660 most unusual time or impactful time in
01:07:00.340 history perhaps the next century will be
01:07:02.360 even more influential again but it's
01:07:04.660 certainly very influential and the core
01:07:07.380 reason for this is just kind of think
01:07:09.840 about the tree of technology that
01:07:12.580 civilization can kind of go through from
01:07:15.180 at the very first stage there's flint
01:07:17.280 axes and spear throwers and fire and the
01:07:19.500 wheel and then nowadays there's maybe
01:07:22.860 it's fusion technology and advanced
01:07:24.520 machine learning and biotechnology the
01:07:27.160 current developments and at some point
01:07:29.180 there will be future technology that
01:07:32.000 perhaps we've not even imagined yet
01:07:33.960 here's the question just relative to
01:07:37.220 both the past and the future how quickly
01:07:40.100 are we moving through that technology
01:07:41.700 tree and I think there's very good
01:07:43.720 arguments for thinking that it's much
01:07:45.320 faster we are moving much faster through
01:07:47.720 that technology tree than for almost all
01:07:50.560 of human history and almost all of the
01:07:52.400 future if we don't go extinct in the
01:07:54.320 near term and I think that's clear so
01:07:56.900 economists you know measure this generally
01:07:59.440 with economic growth where at least a
01:08:02.840 significant fraction of economic growth
01:08:04.480 comes from technological progress and the
01:08:08.140 economy is currently growing far far
01:08:10.360 faster than it has done for almost all of
01:08:13.240 human history where today the economy
01:08:15.760 doubles every 20 years or so
01:08:18.480 yeah so it's growing around two percent or
01:08:20.680 so and you compound exactly that it gives
01:08:22.760 you a doubling time of something like
01:08:24.420 25 years or exactly there about yeah two
01:08:27.600 three percent when we were hunter
01:08:29.400 gatherers growth was very close to zero as
01:08:32.240 farmers growth was more like 0.1 percent so
01:08:34.980 doubling every few hundred years or so.
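A minimal sketch of the compound-growth arithmetic behind these doubling-time figures, assuming steady annual growth rates in the range just mentioned (roughly 0.1% for the farming era and 2 to 3 percent today); the exact years below are illustrative, not figures from the conversation:

    # Doubling time implied by a constant annual growth rate r,
    # from the compound-growth relation (1 + r)**t == 2.
    import math

    def doubling_time_years(r):
        return math.log(2) / math.log(1 + r)

    for r in (0.001, 0.02, 0.03):  # assumed rates: ~0.1% farming era, ~2-3% today
        print(f"{r:.1%} growth -> doubles in ~{doubling_time_years(r):.0f} years")
    # roughly: 0.1% -> 693 years, 2% -> 35 years, 3% -> 23 years

So a couple of percent a year puts the doubling time on the order of a few decades, while farming-era growth implies doubling only every several centuries.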
01:08:38.980 And so relative to the past we are speeding
01:08:42.040 through technological development much much
01:08:43.960 faster than before you know for most of
01:08:46.620 human history before I mean honestly I say
01:08:49.000 before the scientific industrial revolution
01:08:50.780 really it's before about the 1900s,
01:08:53.120 because there was a slower kind of ramp-up,
01:08:54.340 but then secondly I think we're speeding
01:08:57.080 through that progress much faster than for
01:08:59.040 most of the future too and the argument for
01:09:02.180 this is that we'll just assume we keep
01:09:05.420 growing at about two percent per year for
01:09:08.920 another 10,000 years how much growth would
01:09:11.600 there have to be in that case well it turns
01:09:15.120 out because of the power of compound growth
01:09:17.580 that it would mean that for every atom
01:09:21.660 within reach that is within 10,000 light
01:09:25.240 years there would be I think it's a hundred
01:09:28.520 billion trillion current civilizations' worth
01:09:31.700 of output.
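A rough back-of-the-envelope version of that compounding claim; the 2% rate and the 10,000-year horizon come from the conversation, while the atom count used here is only an assumed order of magnitude for illustration:

    # Total expansion of output if 2% annual growth ran for 10,000 years.
    import math

    growth_factor = 1.02 ** 10_000
    print(f"total growth factor ~ 10^{math.log10(growth_factor):.0f}")   # about 10^86

    # Assumed (illustrative) order of magnitude for atoms within ~10,000 light years.
    atoms_within_reach = 1e67
    per_atom = growth_factor / atoms_within_reach
    print(f"implied output per atom ~ 10^{math.log10(per_atom):.0f} times today's world economy")

Whatever plausible atom count one plugs in, the implied output per atom comes out absurdly large, which is the point of the argument: growth at today's rates cannot continue for more than a few thousand years.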
01:09:34.880 And now, you know, maybe that's possible; I think it's extremely unlikely,
01:09:36.720 and so I think at some point with this level
01:09:40.620 of economic and technological progress
01:09:43.380 we would get very close to basically
01:09:46.860 discovering almost everything that is to
01:09:48.640 discover on the order of a few thousand
01:09:51.040 years however as you said there are
01:09:53.780 hundreds of millions of years to go
01:09:55.680 potentially and so that means that as I
01:09:59.560 said we're moving particularly quickly
01:10:01.460 through the tree of technologies and every
01:10:05.320 technology comes with its distribution of
01:10:09.000 powers and capabilities and benefits and
01:10:12.200 disadvantages so if we take the ability to
01:10:15.320 harness energy from the atom well that gave
01:10:19.260 us clean energy, it gave us nuclear power,
01:10:21.580 that's an enormous benefit to society it
01:10:24.500 also gave us nuclear weapons that could
01:10:26.920 threaten wars more destructive than any we've
01:10:30.160 ever seen in history and I think future
01:10:33.140 technologies such as biotechnology that can
01:10:37.380 you know allow us to create new viruses or
01:10:40.880 advanced artificial intelligence that each
01:10:44.760 one of these gives us this new distribution of
01:10:48.740 powers and some of those powers can have
01:10:52.780 extraordinarily long-lasting impacts such as
01:10:55.540 the power to destroy civilization or such as
01:10:58.300 the power to lock in a particular narrow ideology
01:11:01.360 for all time and that's why the world today is at
01:11:05.740 such a crucial moment yeah the principle of
01:11:09.620 compounding has other relevance here when
01:11:14.100 it comes to this question of just how
01:11:16.760 influential a moment this might be in
01:11:20.020 history it's when you just look at at how
01:11:23.840 effects of any kind compound over time it
01:11:28.180 does cast back upon the present kind of an
01:11:31.340 inordinate responsibility to safeguard the future
01:11:36.580 because I mean we have this enormous potential
01:11:39.780 influence I mean just the slightest deflection
01:11:42.720 of trajectory now potentially has massive implications
01:11:48.180 you know 20 30 40 years from now and it's
01:11:52.000 obviously hard to do the math on these things it's
01:11:54.180 hard to predict the full set of consequences from
01:11:57.000 any change we make but if you just look at what it
01:12:00.560 means to steadily diminish the probability of a
01:12:06.040 full-scale nuclear war year after year even just
01:12:08.920 by you know two percent a year if we could just get
01:12:11.900 that virtuous compounding to happen roll that same
01:12:15.820 logic to every other problem you care about it's
01:12:18.920 very easy to see that getting our act together
01:12:21.580 however incrementally now on all these fronts matters
01:12:26.800 enormously in the not too distant future
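A hedged illustration of the compounding Sam is describing, assuming, purely for the sake of the example, that the annual probability of a full-scale catastrophe is reduced by 2% every year:

    # Each year retains 98% of the previous year's annual risk,
    # so the residual after n years is 0.98**n of today's level.
    residual = 1.0
    for year in range(1, 101):
        residual *= 0.98
        if year in (25, 50, 100):
            print(f"after {year} years: ~{residual:.0%} of today's annual risk")
    # roughly 60% after 25 years, 36% after 50, 13% after 100

Even a small, steadily maintained reduction compounds into a large drop in risk over a century, which is why incremental progress on these fronts matters so much.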
01:12:29.740 absolutely I mean my colleague Toby Ord's
01:12:34.780 wonderful book The Precipice suggests that well
01:12:37.820 what's happening now is that our technology is advancing
01:12:40.560 more rapidly than our wisdom and it's a beautiful
01:12:42.920 idea and what we're trying to do is accelerate wisdom
01:12:48.140 you know we're trying to just make us a little bit wiser a
01:12:51.320 little bit more cognizant of, you know, most people
01:12:55.340 fall into one of two camps there are the people who are
01:12:58.580 really into technology the kind of stereotype of the
01:13:01.620 Silicon Valley people and they think technology is great
01:13:04.480 and we just need to keep advancing it and then there are
01:13:07.360 people who are not really into technology and the people
01:13:10.400 who are very skeptical of Silicon Valley and worry about
01:13:13.200 what technology could be doing to society and I think I
01:13:16.600 just really want people to inhabit both frames of mind at
01:13:19.560 once technology has benefits and costs some
01:13:24.580 technologies are just like anesthetic I'm like oh yeah
01:13:28.380 that's just really good let's have more of these like
01:13:31.840 un you know no downside unashamedly good technologies
01:13:35.360 let's speed them up faster for technologies that are very
01:13:39.860 dual use or just extremely powerful and we don't know how
01:13:43.020 to handle them well let's then be careful and considerate
01:13:46.760 about how we're developing these technologies have good norms
01:13:50.200 and regulations in place so that we don't you know basically
01:13:55.340 just walk backwards into a chasm where we wreak global disaster
01:14:02.600 because we're not even paying attention to the impacts that
01:14:06.580 technologies that are just around the corner will have because
01:14:10.820 of this reason of out of sight and out of mind because we're
01:14:13.880 barely thinking about you know we barely think about like the
01:14:17.020 end of the next year certainly not the end or the end of the
01:14:19.340 next political cycle let alone the decades ahead that could bring
01:14:24.340 about radical changes and how they might impact the centuries ahead
01:14:28.000 well simply by uttering the phrase political cycle you have
01:14:37.620 referenced what I consider a major problem here which is we have a
01:14:37.620 politics that is almost by definition a machine of short-termism whereas
01:14:49.920 taking a long-term view is antithetical to the whole project of
01:14:49.920 becoming president and staying president and then raising money for your next
01:14:55.400 campaign I mean you know you're just not to speak specifically of America at
01:14:59.580 this point I don't know that it's much better in any other country you're not
01:15:02.700 incentivized to think about the far future and to figure out how to help all
01:15:07.260 those people whose names you will never know and and who will never vote for you
01:15:11.540 much less give you money it's just not in the cards right and so one thing that has
01:15:17.920 to be a fairly high priority is to figure out how to engineer a system of
01:15:23.420 political and economic incentives such that we more effortlessly take the future
01:15:29.280 into account absolutely and honestly it's extremely tough so I've worked on this
01:15:35.900 topic I have an article on long-termist political institutions with another
01:15:41.340 philosopher Tyler John and there are some things you can do such as you know some
01:15:48.860 countries have an ombudsperson to represent future generations I'm
01:15:52.640 particularly attracted to the idea of, at least as an
01:15:57.340 advisory body a kind of citizens assembly so a randomly selected group of
01:16:01.480 people from the populace who are explicitly given the task of
01:16:06.480 representing the interests of future generations and then making
01:16:09.680 recommendations about what the government could do differently in order to make
01:16:13.960 the future go better I think we can also build in just more information so more
01:16:20.340 think tanks or quasi-governmental organizations that have the task of really
01:16:26.840 trying to project not just the next few years but the next few decades even the
01:16:31.040 next few centuries these are some changes we can make however the fundamental
01:16:37.700 problem is that future generations are being impacted without representation they
01:16:44.780 do not get a vote and the fundamental problem is we cannot give them a vote
01:16:48.580 there's no way they can represent themselves and what do we do about that
01:16:52.660 honestly I think it's going to be very hard to make major progress unless there's some
01:16:59.100 significant cultural change so if we imagine a world where governments are responding to the
01:17:06.600 views and preferences and ideals of their populace of the electorate and that electorate cares about
01:17:11.780 the long term well that's a situation where we can align political incentives with
01:17:18.360 the long term good and so that's what I think we really need to get to I mean it's just amazing how
01:17:24.520 hard it is even for normal people who have children to prioritize the next decade right I mean even
01:17:33.680 it's a decade where they have every expectation of being alive and suffering the consequences of our
01:17:39.460 present actions and they certainly have every expectation that their children will be alive in those
01:17:44.680 decades when you look at a problem like climate change or anything else that is slowish moving and
01:17:53.020 has various features like you know being in part a tragedy of the commons running counter to any
01:18:01.640 individual's personal economic incentives etc it's just it all falls by the wayside even when you're not
01:18:11.800 talking about hypothetical people who you will never meet right I mean there's just something
01:18:17.540 so hard about focusing on the big picture and so yeah it does I mean when you talk about cultural change
01:18:25.600 I mean it's hard to see the thing that would be so sweeping that it would capture more or less
01:18:32.560 everybody all at once but perhaps that's not even the right goal I mean maybe there are only
01:18:38.740 something like 5,000 people whose minds would have to change to fundamentally redirect human energy I
01:18:46.640 mean this is an elitist claim and argument but I suspect that something like that might also be true
01:18:54.180 yeah I think I'm more optimistic on this in general perhaps than you are where yeah in the course of
01:19:01.880 doing research for What We Owe the Future I you know dug a lot into the history of various social
01:19:09.080 and moral movements such as environmentalism, feminism, the animal welfare movement, and
01:19:15.660 in particular the abolition of slavery and one thing I think it's easy to underestimate
01:19:22.720 is just how weird these ideas were in the past at least in particular for the kind of political elite or at
01:19:31.080 least the people who had power where you don't have to go that long ago let's just focus on the
01:19:37.160 environmentalist movement I think there's many analogies here where the environment can't represent
01:19:42.200 itself it has an advantage that you can see the environment you can relate to it but 150 years ago
01:19:47.920 concern for the environment was really not on the table maybe you had poets like William Wordsworth starting
01:19:53.060 to cultivate love for nature but there just really wasn't anything like a movement whereas now it is just a
01:19:59.700 part of common sense morality at least in many countries around the world that is by historical
01:20:06.460 standards that is amazingly remarkable, like amazingly rapid moral change, and so at least if we're able to
01:20:15.900 say look we're not going to be able to make this change to convince everyone of
01:20:21.180 this long-term outlook to convince everyone of having concern for future generations we're not going to be
01:20:27.920 able to convince them within the next few years or even few decades but when we look like 100 years
01:20:32.860 out maybe we can, maybe actually we really can effect that sort of sweeping moral change where
01:20:40.900 concern for future generations is just on the same level as concern for other people in your
01:20:50.240 community or people of you know different ethnicity than you I'm holding out hope
01:20:56.860 um I'm holding out hope that those sirens aren't people coming to arrest you for your thought crimes
01:21:03.020 I'm sorry this is uh I'm in New York
01:21:06.780 are you are you recording this podcast while robbing a bank
01:21:11.120 look it's for the greater good right the money can go further so that's a you know don't do that
01:21:16.560 just in case my followers take me too literally let's go on record we're against bank robbery
01:21:22.320 yeah at least currently I'm with that I won't say anything about value lock-in on that point but
01:21:27.500 well let's talk about this issue of value lock-in and what you call moments of plasticity and
01:21:34.940 trajectory changes how do you think about I mean you might want to define some of those terms but
01:21:39.860 what is the um the issue and opportunity here sure so at the moment there's a wide diversity of moral
01:21:49.860 views in the world and rapid moral change so you know the gay rights movement built power
01:21:57.840 and momentum in the 60s and 70s and then you get legalization of gay marriage a few decades later
01:22:03.000 clear example of positive moral change and so we're very used to that we think it's kind of just
01:22:08.060 part of reality we wouldn't expect that that might change in the future such that
01:22:14.200 moral change might come to an end the idea of value lock-in is taking seriously that maybe it
01:22:20.040 will, actually maybe moral progress will stall and so value lock-in is the idea that some particular
01:22:25.840 ideology or moral view, or kind of narrow set of ideologies or moral views,
01:22:32.500 could become globally dominant and persist for an extremely long time and I'm worried about that
01:22:40.040 I think that's a way in which the long-term future could go that could just lead to enormous loss of value
01:22:58.180 could make the future a dystopia rather than a positive one and you know it's easiest to see this
01:23:05.400 by starting off with maybe the most extreme case and then kind of thinking about ways that might not
01:23:12.280 be quite that extreme but close and the most extreme case is just imagine a different history where it
01:23:12.280 wasn't you know the US and UK and liberal countries that won World War II and instead it was Stalinist
01:23:20.660 USSR or let's say the Nazis who won now they really aimed ultimately at global domination they wanted to
01:23:28.440 create a thousand-year empire if that happened and they had a very narrow and like morally abominable
01:23:36.360 ideology if they had succeeded would that have been able to persist forever or at least for an
01:23:43.840 extremely long time and I think actually the answer is yes where ideologies in general and social systems can
01:23:52.040 certainly persist for centuries or even thousands of years so the major world religions are thousands of years old
01:23:57.000 but technology I think over time will give us greater and greater power to control the values of the
01:24:04.860 population that lives in the world and the values in the future as well where at the point in time where we
01:24:13.060 develop the ability to like stop aging such that people's lifespans could last for much much longer
01:24:19.520 well then a single dictator's rule, you know, is not confined
01:24:26.420 to the lifespan of a single individual 70 years but instead would be thousands of years or however long that
01:24:34.640 ruler would last even more extreme I think is the point of time at which the ruling beings are digital rather than
01:24:43.060 biological and again this feels like sci-fi but you know we're talking about I actually think many people I
01:24:49.140 respect greatly think that moment is coming in a few decades' time but let's just say it's like you know plausible
01:24:55.980 within this century or centuries well consider what the environment is like for a ruler who is a digital being well they
01:25:07.360 don't die they are in principle immortal software can replicate itself and any piece of machinery would wear out but the AI systems
01:25:18.340 themselves could persist forever so one of the main reasons why we get moral change over time namely that people die and
01:25:25.540 are replaced by people who have slightly different views that would be undermined I think in this kind of scenario other causes of
01:25:32.520 change would be undermined too or taken away so I think the fact that we have competition between different
01:25:37.620 moral ideas because we have this diversity of moral ideas well if the Nazis had won World War II if they had
01:25:45.180 established a world government you know again this is I'm not claiming this is likely or possible but imagine the
01:25:50.180 counterfactual history well then we would lose out on that competition of ideas there wouldn't be that pressure
01:25:56.240 for moral ideas to change and then finally there's just potentially a changing
01:26:01.300 environment like upsets happen you know civilizations fall apart and so on but I think technology in the future
01:26:07.880 could allow for greater and greater stability so again I think AI is particularly worrying here where
01:26:14.400 you know again we're imagining the leaders of the Nazis controlling the world okay well one way in which
01:26:21.340 dictatorships end is insurrection whether that's by the army or by the people if your army is automated
01:26:28.260 if you have kind of robots policing the streets then that dynamic would end similarly with your police force
01:26:34.640 and again all of this seems like sci-fi but I think we should be thinking about it we should be taking it seriously
01:26:40.620 because it really could come sooner than we think and the technologies we have today would have looked like wild sci-fi
01:26:47.320 just a century ago yeah well maybe we'll close out with a conversation about AI
01:26:52.280 because I share your concern here and I share your interest in in elucidating the bad intuitions that make
01:27:01.740 this not a concern for so many other people so let's get there in a second but remind me what is
01:27:09.640 the lock-in paradox that you talk about in the book sure so the lock-in paradox it's like the
01:27:16.240 liberal paradox of tolerance but you know I push in the book that we don't want to lock in any
01:27:21.980 particular values right now because probably all of the values we're used to are abominable you know
01:27:29.060 I think we're like Plato and Aristotle arguing about the good life and we've not realized the fact that
01:27:36.280 we all own slaves is you know maybe morally problematic so I think that there's an enormous amount
01:27:43.420 of moral progress yet to come and we want to ensure that we guarantee that progress and so even if you
01:27:49.720 know I think the motivation of thinking oh well it's you know early 21st century western liberal values
01:27:55.180 that's the best thing and we should just ensure they happen forever I'm like no we need progress
01:27:59.800 the lock-in paradox however is that if we wanted to create a world where we do get a sustained period
01:28:07.520 of reflection and moral progress that does mean we'll need some restraints so we might well need
01:28:14.420 to lock in certain ideas like commitment to free debate commitment to tolerance of opposing
01:28:23.640 moral views restrictions on ways of gaining power that aren't via making good arguments and moving people
01:28:33.280 over to your side kind of in the same way that like you know the U.S. constitution at least
01:28:39.520 aspirationally is trying to bring about this like liberal society where many different world views
01:28:44.200 can coexist and where you can make moral progress but in order to have that and not fall into a
01:28:49.520 dictatorship you needed to have these restraints on any individual's power and so I think we
01:28:56.400 may well need something like that for the whole world where we stop any particular ideology
01:29:02.740 or just set of people from gaining too much power such that they can just you know control the world
01:29:09.220 so that we can just have continued reflection insight empathy and moral progress yeah it's tempting to
01:29:16.420 try to formulate the minimum algorithmic statement of this which is something that allows for the
01:29:25.300 kind of incrementalism you've just sketched out and the error correction
01:29:31.620 exactly that I think is necessary for any progress including ethical progress but which locks in the
01:29:40.820 methodology by which one would safeguard the principle of error correction and exactly you know sensitivity to
01:29:49.460 uh the open-endedness of the exploration in ethical space right I mean it's like you know
01:29:57.060 I think there's probably, whether you define it as consequentialist or not, I think it's probably less
01:30:02.980 important I mean consequentialism as we know has its wrinkles but I think there's an almost
01:30:09.140 axiomatic claim about the primacy of safeguarding the well-being of conscious creatures at the bottom of
01:30:17.220 I think any sane moral enterprise right now it's not in how you think about well-being how you
01:30:24.340 aspire to measure it or quantify it how you imagine aggregating these quantities there are many
01:30:30.500 devils in all those details but the basic claim that you know you have to at bottom care about the
01:30:37.220 happiness and suffering of conscious creatures and marry that to a truly open-ended and
01:30:44.980 you know perpetually refined and refinable conception of what well-being is all things
01:30:51.700 considered you know and that's where all the discussion remains to be done I think those
01:30:57.380 are the bootstraps in some form that we have to pull ourselves up by uh yeah so it is a very notable fact
01:31:05.540 that I think basically all plausible moral views that are on the table see you know well-being as one
01:31:14.660 part of the good perhaps they think other things are good too um art flourishing natural environment and
01:31:20.100 so on but at least you know happiness broadly construed flourishing life avoidance of suffering
01:31:27.780 those are good things and I would just, well, just not to be too pedantic here but in my brain
01:31:35.380 all of that collapses to just a fuller understanding of what well-being is I don't think any you know
01:31:42.500 reasonable definition of well-being would exclude the things you just mentioned and the cash value of
01:31:48.820 the things you just mentioned it has to be at some level how they impact the actual or potential
01:31:55.940 experiences of conscious creatures this is an argument i've made somewhere if you're going to tell
01:32:01.380 me you've got something in a you know locked in a box that is really important it's really valuable it's
01:32:07.860 important that we consider its fate in everything we do but this thing in the box cannot will not
01:32:16.100 affect the experience of any conscious creature now or in the future right well then i i just think
01:32:23.780 that's a a contradiction i mean we what we what we mean by value is something that could be actually or
01:32:29.380 potentially valued by someone somewhere great so i think we differ a little bit on this where
01:32:35.460 I also think my best guess is that value is just a property of conscious experiences,
01:32:44.820 and art and knowledge and the natural environment are all good things insofar as they impact
01:32:50.100 the well-being of conscious creatures. But that's not something I would want to bake in; I'm not so
01:32:55.300 certain of it. There are other views where the satisfaction of carefully considered preferences
01:33:00.420 is of positive value even if that doesn't go via a change in any conscious
01:33:07.860 experience, and other people do have that view. So again, that's a phrase I'm actually not
01:33:13.700 familiar with. When I hear you say satisfaction of anything, including preferences, aren't we talking
01:33:21.220 about conscious experience, actual or potential? So on this view, no. Suppose that Aristotle
01:33:28.820 wanted to be famous, and he wasn't very famous during his time. I'm actually not sure, but let's
01:33:35.060 say that he was certainly much less famous than he is now, right? Does the fact that we are
01:33:39.300 talking about Aristotle now, increasing his fame, increase his well-being? And it's obviously
01:33:46.100 not impacting... I'm prepared to answer that question. I'm sure you're prepared to say no, and honestly
01:33:51.300 my best-guess view is in complete agreement with you: anything we do to talk about Aristotle now,
01:33:57.220 we could say he smells and we hate him, doesn't change his well-being at all. However,
01:34:03.060 other people disagree, and I think other smart people disagree, and so I certainly wouldn't want to
01:34:08.340 say, look, we figured that out. I would want to say, look, now we've got plenty of
01:34:14.100 time to debate this. I think I'm going to win the debate; maybe the other person thinks they're the one who's
01:34:18.340 going to win the debate. But we've got all the time in the world to really try and figure this out,
01:34:23.780 and I would want to be open to the possibility that I'm badly wrong. I guess one of us is kind of badly
01:34:29.940 wrong on this. Well, that's interesting. I would love to debate that, so maybe as a sidebar you and I
01:34:36.420 can figure out who, and when, and where. Absolutely, always happy to talk about it. Yeah, okay, well,
01:34:42.900 so let's talk about AI, because it is in some sense the apotheosis of many of these concerns,
01:34:50.660 and I agree that should anything like a malevolent AGI be built, or be put in the service of malevolent
01:35:00.340 people, the prospect of value lock-in and Orwellian totalitarianism just goes way, way
01:35:08.180 up. Yeah. And that's worth worrying about whatever you think about the importance of future generations.
01:35:13.700 But there are several stumbling blocks on the path to taking these concerns about AI
01:35:20.260 seriously. The first, and this really is the knee-jerk reaction of people in the field who
01:35:27.460 don't take it seriously, is the claim that we're nowhere near doing this: that our language
01:35:34.980 models are still pretty dumb, even though they can produce some interesting text at this point; that there's
01:35:42.820 certainly no concern that there could be anything like an intelligence explosion in any of
01:35:48.900 the machine learning algorithms we're currently running; that AlphaZero, for all of its amazing
01:35:54.900 work, is still not going to get away from us, and it's hard to envision what could allow it to. I
01:36:02.660 know that there are people on our side of this debate who have given a kind of
01:36:07.700 probability distribution over the future, giving it, say, a 10 percent chance it might happen
01:36:13.220 in 20 years and a 50 percent chance it'll happen within 50 years, et cetera. But in my mind there's just no
01:36:20.580 need to even pretend to know when this is going to happen. The only thing you need to acknowledge is that
01:36:28.340 not much needs to be true to make it guaranteed to happen. There
01:36:36.100 are really two assumptions one needs to make. The one assumption is that we will continue to make
01:36:41.540 progress in building intelligent machines, at whatever rate, unless something terrible happens, right?
01:36:48.420 This brings us back to the collapse of civilization. The only thing that would cause us to stop making
01:36:54.260 progress in building intelligent machines, given how valuable intelligence is, is something truly
01:37:02.100 awful that renders some generation of future humans just unable to improve
01:37:09.380 hardware and software. So there's that: barring catastrophe, we're going to continue to
01:37:14.980 make progress. And then the only other assumption you need is that intelligence is substrate independent,
01:37:23.300 right? That it doesn't require the wetware of a biological brain; it can be instantiated in
01:37:30.340 silico. And we already know that's true, given the piecemeal AI we've already built. We already know
01:37:38.180 that you can build a calculator that's better than any human calculator, and you can do that in
01:37:43.860 silicon. And there's just no reason to think that the input-output properties of a complex information
01:37:51.300 processing system are magically transformed the moment you build that system out of something other than meat. And so again,
01:38:00.260 given any rate of progress, and given the assumption of substrate independence, eventually, and again the time
01:38:09.140 horizon can be left totally unspecified, five years or 500 years, eventually we will be in the
01:38:16.100 presence of machines that are far more intelligent and competent and powerful than we are.
01:38:24.340 And the only other point of confusion that I've detected here, again and again, which
01:38:30.260 can be easily left to one side, is the question of consciousness, right? Will these machines be sentient?
01:38:37.220 And that really is a red herring from my point of view. I mean, it's not a red herring ethically;
01:38:42.820 it's incredibly important ethically, because if they are conscious, well, then we
01:38:48.420 can have a conversation about whether they're suddenly more important
01:38:52.980 than we are, or at least our equals, and whether we're creating machines that can suffer,
01:38:57.940 et cetera. If we're creating simulated worlds that are essentially hell realms for sentient
01:39:04.340 algorithms, that would be a terrible thing to do. So yes, it's all very interesting to consider. But
01:39:09.300 assuming that we can't currently know, and we may never know, even in the presence of
01:39:15.300 machines that say they're conscious and that pass the Turing test with flying colors, we may still be
01:39:21.380 uncertain as to whether or not they're conscious. Whether or not they're conscious is a
01:39:25.940 question that is totally orthogonal to the question of what it will be like to be in the presence
01:39:33.220 of machines that are that intelligent, right? I think you can dissociate,
01:39:38.980 or very likely can dissociate, consciousness and intelligence. Yes, it may be the case that
01:39:44.180 the lights of consciousness magically come on once you get a sufficient degree of complexity
01:39:50.260 and intelligence on board; that may just be a fact of our world. And we certainly have reason to
01:39:56.340 expect that consciousness arrives far sooner than that, when you look around at animals whom we deem
01:40:01.140 conscious and who are far less intelligent than we are. So I do think they're separable questions.
01:40:06.820 But the question of intelligence is really straightforward. We're just talking about,
01:40:11.380 conscious or not, machines that can engage in goal-oriented behavior and
01:40:18.420 learning in ways that are increasingly powerful, and that ultimately achieve a power that far exceeds our own.
01:40:27.940 Given that, it seems virtually inevitable, given these two measly assumptions, again, any progress and
01:40:35.860 substrate independence for intelligence; we have to acknowledge that we, or our children, or our grandchildren, or some generation of human beings
01:40:46.820 stand a very good chance of finding themselves in relationship
01:40:50.180 to intelligences that are far more powerful than they are. And that's an amazing
01:40:57.620 situation to contemplate. I'll just remind people of Stuart Russell's analogy here, which,
01:41:04.180 at least in my experience, changes people's intuitions. On his account, it's like we're in the following
01:41:12.020 situation: we just got a message from some distant point in the galaxy which reads, "People of Earth, we will
01:41:20.020 arrive on your planet in 50 or 75 or 100 years. Get ready." Right? You don't know when this is going to happen,
01:41:28.580 but you know it's going to happen. And that statement of looming relationship, delivered in that form, would be quite
01:41:37.700 a bit more arresting than the prospect of continued progress in AI, and I would argue that it
01:41:45.620 shouldn't be; they're essentially the same case. Yeah, I just think that
01:41:51.620 framing is excellent. Another framing is: if you're chimpanzees and you see this other species,
01:41:59.220 Homo sapiens, that is perhaps less powerful than you to begin with but seems to be rapidly
01:42:05.140 increasing in its power, you'd at least be paying attention to that. And, yeah, you remind me a little of
01:42:11.300 the second book of The Three-Body Problem series, where the premise, I hope it's okay to say, is that aliens will come,
01:42:18.660 I think it's in a thousand years' time, and they have far greater power than civilization on Earth
01:42:25.380 does, but the entire world just unites to take that threat seriously and work against it. And so,
01:42:32.900 essentially, I completely agree with you that the precise timelines are just not very important
01:42:39.300 compared to just noting that if, well, not if but basically when it happens that we have AI systems
01:42:46.980 that can do everything that a human being can do, except better, that will be one of the most important
01:42:52.980 moments in all of history, plausibly. And we don't know exactly when it'll come, but uncertainty
01:43:00.100 counts both ways: it could be hundreds of years away, and it could be
01:43:06.820 soon, for all we know. But certainly, relative to the tiny amount of time we're spending thinking about it,
01:43:11.700 we should be spending more. But I'll actually just comment on the timelines as well, where
01:43:17.780 there's a recent survey, it just came out last week in fact, of as many leading machine learning
01:43:25.860 researchers as possible. I'm not sure of the sample size, but if it's like the last survey it'll be in the
01:43:31.540 hundreds. And they asked, at what point in time do you expect a 50/50 chance
01:43:39.620 of developing human-level machine intelligence, that is, AI systems that are as good as or better than
01:43:45.380 human beings at essentially all tasks that humans can do? And they say 37 years. So that's in my lifetime;
01:43:53.300 I will be entering my retirement by the time there's a 50/50 chance of advanced AI. When I talk to people who are
01:44:00.740 really trying to dig into this, they often have shorter timelines than that; they often think that's
01:44:06.820 50/50 within more like 20 years, with substantial probability on shorter timelines. I tend to be at the more
01:44:14.020 skeptical end of that. But basically, within any of this range, concern about AI is not merely, oh, we
01:44:23.060 should be thinking about something that happens in 200 years because maybe that will have an impact in 2,000
01:44:27.140 years. It's, no, we should be thinking about something that has a chance of catastrophe, where,
01:44:34.580 on the same survey, those machine learning researchers, that's people who are
01:44:39.940 building these things, who you'd think would be incentivized to say what they're doing is completely
01:44:43.540 safe, put the probability of an extremely bad outcome, as bad as human extinction or worse,
01:44:49.700 at a typical value of five percent. So put those numbers together
01:44:57.940 and you have a 2.5 percent chance of dying or being disempowered by AI, according to
01:45:05.620 machine learning researchers not cherry-picked for being particularly concerned about this.
01:45:09.540 You have a 2.5 percent chance of dying or being disempowered by artificial intelligence in your lifetime.
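To make the arithmetic behind that figure explicit, here is a minimal sketch in Python, using only the rough numbers quoted above; the variable names, and the assumption that the two figures can simply be multiplied, are illustrative rather than taken from the survey itself.

    # Back-of-the-envelope combination of the two figures quoted above; both inputs
    # are approximations from the conversation, not exact survey statistics.
    p_hlmi_in_lifetime = 0.50   # roughly 50/50 chance of human-level machine intelligence within a lifetime
    p_bad_given_hlmi = 0.05     # typical estimate of an "extremely bad" outcome, conditional on that

    p_bad_in_lifetime = p_hlmi_in_lifetime * p_bad_given_hlmi
    print(f"{p_bad_in_lifetime:.1%}")  # prints "2.5%", the figure cited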
01:45:16.020 That's more than your chance of dying in a car crash, and we think a lot about dying in car crashes
01:45:20.740 and take actions to avoid them; there's a huge industry, a huge regulatory system, so that people do
01:45:26.580 not die in car crashes. What's the amount of attention on risks from AI? Well, it's growing, thankfully, it's
01:45:32.740 growing, and people are taking it really seriously, and I expect that to continue into the future. But it's
01:45:37.940 still very small, which means that smart, morally motivated people can make a really transformative
01:45:45.700 difference in the field by getting involved: working on the technical side of things to ensure that AI systems
01:45:51.860 are safe and honest, and also working on the governance side of things to ensure that we as a society
01:46:00.420 are developing this in a responsible way; perhaps digging into the fact that AI is not a monolith, there are
01:46:06.340 different sorts of AI technology, some of which are riskier than others, and ensuring that the
01:46:11.060 safety-enhancing parts of AI come earlier and the more dangerous parts come later. This really is something
01:46:16.980 that people can be contributing to now. I think it's hard, I'm not claiming that it's
01:46:21.460 clear-cut what to do, and there are questions over how much progress we can make, but we can
01:46:26.180 at least try. Hmm. Yeah, and I think the crucial piece there, again this is courtesy of Stuart Russell, who is
01:46:34.580 my vote for one of the wisest people to listen to on this topic, is just to make the safety
01:46:42.180 conversation integral to the conversation about developing the technology in the first place. He
01:46:49.540 asks us to imagine how perverse it would be if engineers who were designing
01:46:58.740 bridges thought about bridge safety only as an afterthought, and under great duress,
01:47:06.020 right? So there's the project of building bridges, with all of the resources put toward
01:47:10.660 that end, but then only as an afterthought do engineers have conversations about what
01:47:16.820 they call the not-falling-down problem with respect to bridges. Right? No, it's completely insane
01:47:22.500 to conceive of the project of building bridges as being separable from the project of building
01:47:27.860 bridges that don't fall down. Likewise, it should be completely insane to think about the prospect of
01:47:33.220 building AGI without grappling with the problem of alignment and related matters. So, in defense
01:47:42.980 of more near-term concern here, many people have mined the history and found these great
01:47:51.780 anecdotes that reveal how bad we are at predicting technological progress. I forget the details
01:47:59.700 here, but I think it was the New York Times, or some esteemed U.S. paper, that confidently
01:48:05.060 published an editorial, I think it was just something like two months before the Wright brothers
01:48:12.820 conquered flight, confidently predicting that human beings will never fly, or that it will take
01:48:18.580 millions of years: because it took evolution millions of years to engineer the bird's wing,
01:48:24.340 it would likewise take us millions of years to fly. And then two months later we've got
01:48:29.140 people flying. And I think there's one anecdote that you have in your book that I hadn't heard, about the
01:48:35.460 famous physiologist J.B.S. Haldane in 1927. I was going to mention this one. Yeah, go ahead, go ahead.
01:48:43.700 What was that reference? Oh yeah, well, so, one of the great scientists of his time, an evolutionary
01:48:49.620 biologist who had some remarkable essays on the future, predicting future events and how
01:48:56.580 large civilization could get. And he considered the question, will there ever be a return trip to the
01:49:03.060 Moon? And he said, oh yeah, probably in eight million years. And that was... Did you say 1927? Yeah, yeah.
01:49:11.780 So that shows you: people like to make fun of people in the past who made these bold scientific
01:49:20.100 predictions about technology, such as the early computer pioneers at, I think it was, the
01:49:26.340 Dartmouth conference, saying, oh yeah, we'll build human-level artificial intelligence in about six
01:49:30.740 months or so. And people mock that and say, oh, that's so absurd. But what they forget is that
01:49:35.700 there are just as many people on the other side making absurd claims about how long technology will
01:49:41.060 take. In fact, look, things are hard to predict. Technology can come much later than you might
01:49:48.500 expect; it can also come much sooner. And in the face of an uncertain future, we have to be preparing for
01:49:54.340 the near-term scenarios as well as the long-term scenarios. Yeah, and the thing we have to get
01:49:59.380 straight in the meantime is that the incentives are currently not appropriate for wisdom and
01:50:08.500 circumspection here. We have something like an arms-race condition, both with respect to
01:50:15.460 private efforts and, one must imagine, rival governments, right? And we haven't
01:50:21.780 fused our moral horizons with the Chinese and the Israelis and anyone else who might be doing this
01:50:28.020 work, and gotten everyone to agree on just how careful we need to be in the final yards of this adventure,
01:50:36.180 as we approach the end zone. And it does seem patently obvious that an arms race is not optimal for
01:50:45.060 taking safety concerns seriously. It's not what we want, and the key issue is just how fast
01:50:53.860 things go. So suppose the development of artificial general intelligence is a hundred years
01:51:01.540 away, or two hundred years away. Then it seems more likely to me that safety around AI will look like
01:51:08.580 safety around other technologies. So, you mentioned flight: in the early days of flight, a lot of planes crashed,
01:51:16.340 a lot of people died. Over time we learned, primarily by trial and error, what systems make flight safer,
01:51:25.540 and now very few people die in plane crashes compared to other causes of death, especially in more developed
01:51:30.900 countries. But we could learn by trial and error because a single plane crash would be a tragedy
01:51:36.500 for the people involved and for their families, but it wouldn't destroy the whole world.
01:51:41.220 And if AI development goes slowly, then I would expect we will learn by that same
01:51:48.340 kind of process of trial and error and gradual improvement. But there are arguments for thinking
01:51:52.900 that maybe it will go fast. In fact, in leading economic models, if you assume that there is a point at which
01:51:59.780 you get full substitutability between human labor and artificial labor, in particular in
01:52:06.340 the process of developing more advanced AI systems, then you get very fast growth indeed.
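A toy way to see where that result comes from, sketched here in LaTeX under standard textbook assumptions rather than as any specific model cited in the conversation: start from a Cobb-Douglas production function and let accumulable AI capital stand in for labor.

    % Toy sketch, not a specific model from the conversation: output Y with
    % technology A, capital K, and labor L.
    \[
      Y = A K^{\alpha} L^{1-\alpha}, \qquad 0 < \alpha < 1 .
    \]
    % With L fixed by population, returns to capital diminish. If AI capital is
    % fully substitutable for labor, effective labor scales with K (say L = cK), so
    \[
      Y = A K^{\alpha} (cK)^{1-\alpha} = A\, c^{1-\alpha} K ,
    \]
    % an "AK" economy. With a constant savings rate s, capital accumulates as
    \[
      \dot{K} = s Y = s A\, c^{1-\alpha} K ,
    \]
    % so K and Y grow exponentially at a rate set by investment rather than by
    % population, and if A itself rises with output (AI improving AI), growth
    % accelerates further.

The point of the sketch is only that removing the fixed-labor bottleneck changes the growth regime; the models being gestured at here are richer than this.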
01:52:14.660 And if things are moving that quickly, well, this slow, incremental process of trial and error that
01:52:22.340 we as a society normally use to mitigate the risks of new technology, that process is substantially
01:52:30.740 disempowered, and I think that increases the risks greatly. And again,
01:52:36.340 that prospect of very rapid growth, of very rapid
01:52:42.020 progress, should at least be on the table, where, again, in this survey that just came out, if you ask
01:52:47.300 leading machine learning researchers, though I'll caveat that I don't think they're really the people
01:52:51.780 one should think of as experts here, but they're still relevant people to consider, they think it's about
01:52:56.740 even, about a 50/50 chance, that there will be something akin to what is known
01:53:03.140 as the intelligence explosion. That doesn't necessarily mean the classic scenario
01:53:08.260 where, in a couple of days, we move from no advanced AI at all to AI that's more powerful than all the rest of the world
01:53:13.620 combined; perhaps it occurs over the course of years or even a small number of decades. But
01:53:19.060 nonetheless, if things are moving a lot faster than we're used to, I think that increases the risks greatly.
01:53:24.420 Well, it seems like things are already moving much faster than we're used to. It's hard
01:53:29.860 to see where we get off this ride, because the pace of change is noticeably different.
01:53:37.220 I'm not a full Kurzweilian here, but it does seem like his claim about accelerating technological
01:53:45.140 change is fairly palpable now, in most sectors. I guess there are sectors
01:53:54.180 where virtually nothing is happening or improving, that is also true, but when you're talking
01:53:59.540 about information technology and what it does to politics and culture, the recent years
01:54:08.420 have been kind of a blizzard of change, and it's hard to know where the brake is to pull.
01:54:16.340 Yeah, and people should just take a look at the recent developments in machine learning. They're
01:54:23.940 not very famous yet, but they will be soon, and it's truly remarkable what's happening. I mean,
01:54:29.940 obviously it's still far away from human-level artificial intelligence, I'm not claiming otherwise,
01:54:34.340 but when forecasts have been done, again by people working in the area, of what sort of progress we
01:54:41.620 expect to make, things are moving faster even than the forecasts of people who thought it was
01:54:46.900 going to be moving fast. In particular, there are these language models, which just
01:54:54.180 get trained on enormous corpuses of human text, but one got trained to
01:55:01.300 do math proofs at about college level, kind of easy college level. And the innovation behind
01:55:09.220 this model was very, very small indeed: they actually just cleaned up the data so that it really could
01:55:14.100 read the formulae online, whereas that was a problem beforehand, and it just blew everyone's predictions
01:55:20.580 out of the water. And so now you have this model where you give it a mathematical
01:55:25.860 problem, kind of early-college level, prove that so-and-so, or what's the answer to this question,
01:55:31.700 a solution or a proof, and it will do it with about 50 percent accuracy. And that was something that people
01:55:38.340 weren't expecting for years. If you look at other language models as well... okay, I'll give an
01:55:44.340 anecdote. So, one of my responsibilities for the university is marking undergraduate exams, and so I asked
01:55:52.020 GPT-3, not even the most advanced language model nowadays, but one you can
01:55:58.820 get public access to... I was curious: how good is GPT-3, this language model, compared to Oxford undergraduates
01:56:04.820 in philosophy, who have studied philosophy for three years? And so I got it to answer these questions,
01:56:10.180 some of the questions that the students most commonly answered. That's great. And did it pass?
01:56:14.980 Yeah... it was not very good. It was not terribly good; it was in the kind of bottom 10 percent of
01:56:20.340 students, but not the bottom two percent of students. I think if I'd given the essays to
01:56:25.780 a different examiner, I don't think that examiner would have thought, oh, there's something really
01:56:30.180 odd here, this is not a human writing it. They would have just marked it and thought,
01:56:34.980 not a very good student; a student that's good in some ways, has good structure,
01:56:39.460 really knows the essay form, but sometimes gets a little bit derailed by
01:56:44.660 arguments that aren't actually relevant, or something. That's what they would have thought, but they would not
01:56:48.980 have thought, oh, this is clearly nonsense, this is an AI, or something. And that's just
01:56:54.820 pretty striking; we did not have anything close to that just a few years ago.
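For readers curious what that exercise looks like in practice, here is a minimal sketch in Python, assuming the OpenAI client library roughly as it stood in 2022; the exam question, prompt wording, and parameters below are hypothetical stand-ins, not the actual materials described.

    # Minimal sketch of asking GPT-3 for an undergraduate-style essay answer.
    # Assumes the 2022-era openai Python package; the question and parameters are
    # illustrative placeholders, not the real exam content discussed above.
    import openai

    openai.api_key = "YOUR_API_KEY"  # assumed to be supplied by the user

    question = "Can a utilitarian coherently value personal projects? Discuss."
    prompt = (
        "Write a short undergraduate philosophy essay answering the exam question below.\n\n"
        f"Question: {question}\n\nEssay:"
    )

    response = openai.Completion.create(
        model="text-davinci-002",  # a publicly accessible GPT-3 model at the time
        prompt=prompt,
        max_tokens=800,
        temperature=0.7,
    )

    print(response["choices"][0]["text"].strip())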
01:57:00.980 And so, I don't know, I think it's fine to be an enormous AI skeptic and think that AGI is
01:57:06.580 very far away; maybe that view is correct, like I said, there's an enormous amount of uncertainty,
01:57:12.180 but at least take seriously that this is a fast-moving area of technology.
01:57:16.420 Yeah. And obviously there are many interesting and consequential questions that arise
01:57:22.100 well short of AGI. Just on this front, what will be the experience,
01:57:30.820 for all of us collectively, of being in the presence of AI that passes the Turing test? It does not have
01:57:37.460 to be comprehensive AGI, it doesn't have to be empowered to do things out in the world. But just when
01:57:44.420 Siri or Alexa become indistinguishable from people at the level of their conversation, and have
01:57:53.620 access to the totality of human knowledge, when does Siri on your phone become
01:58:00.500 less like a glitchy assistant and more like an oracle? Yeah. That could be
01:58:07.300 fairly near term, even with blind spots and things that would disqualify her as
01:58:14.580 proper AI. I think the Turing test will fall for most intents and purposes. For some
01:58:19.940 people the Turing test has already fallen; we just had this kind of laughable case of
01:58:24.420 Lemoine, yeah, the Google engineer who thought that their current language model was
01:58:30.340 sentient. It was absurd. Yeah, there was no reason to believe that. But I would be very surprised
01:58:35.220 if it took more than a decade for the Turing test to fall for something like a chatbot. Yeah,
01:58:42.660 I'm actually curious about why there haven't been attempts at the Turing test using language models;
01:58:49.220 my thought is that they would actually have a pretty good chance of passing it, so I'm unsure on that. But
01:58:53.940 one thing you mentioned is this idea that maybe, rather than Siri, you have this incredibly
01:58:59.700 advanced personal assistant, kind of like an oracle. I just want to emphasize, this is one of
01:59:04.580 the examples where I talk about accelerating the safer parts of AI and
01:59:13.700 maybe slowing down, or not investing much, in the more dangerous parts. This idea of
01:59:21.620 oracle AI, something that is not an agent trying to pursue goals in the world; instead,
01:59:27.700 it's just this input-output function, you put in text, it outputs text, and it is like the most helpful,
01:59:34.740 extraordinarily smart and knowledgeable person that you've ever met. That's the technology we want to
01:59:40.100 have earlier rather than later, because that can help us with the scariest issues around alignment.
01:59:47.940 That can help us align
01:59:50.180 AI that is more like an agent acting in the world.
01:59:53.620 And some of the leading AI labs are taking exactly that approach, and I think that's the right way to go.
01:59:58.900 Well, Will, it's always fascinating, and it was a joy. Yeah, thank you. So, to be continued. Best of luck with the book again,
02:00:08.740 that is, "What We Owe the Future," and we have by no means covered all of its contents. Until next time, Will,
02:00:16.500 thanks for being here. Thanks so much, Sam.