The Tucker Carlson Show - September 10, 2025


Sam Altman on God, Elon Musk and the Mysterious Death of His Former Employee


Episode Stats

Length: 1 hour
Words per Minute: 185.9221
Word Count: 11,341
Sentence Count: 642
Misogynist Sentences: 3
Hate Speech Sentences: 14


Summary


Transcript

00:00:00.000 Thanks for doing this.
00:00:01.020 Of course, thank you.
00:00:01.740 So, ChatGPT, other AIs can reason.
00:00:06.160 It seems like they can reason.
00:00:07.380 They can make independent judgments.
00:00:08.880 They produce results that were not programmed in.
00:00:11.760 They kind of come to conclusions.
00:00:14.260 They seem like they're alive.
00:00:15.640 Are they alive?
00:00:16.300 Is it alive?
00:00:17.420 No, and I don't think they seem alive,
00:00:20.800 but I understand where that comes from.
00:00:23.760 They don't do anything unless you ask, right?
00:00:27.780 Like they're just sitting there kind of waiting.
00:00:29.360 They don't have like a sense of agency or autonomy.
00:00:32.660 It's the more you use them,
00:00:33.940 I think the more the kind of illusion breaks.
00:00:36.340 But they're incredibly useful.
00:00:37.440 Like they can do things that maybe don't seem alive,
00:00:40.960 but seem like, they do seem smart.
00:00:44.540 I spoke to someone who's involved at scale
00:00:47.420 of the development of the technology who said they lie.
00:00:51.300 Have you ever seen that?
00:00:52.720 They hallucinate all the time, yeah.
00:00:54.220 Or not all the time.
00:00:55.200 They used to hallucinate all the time.
00:00:56.280 They now hallucinate a little bit.
00:00:57.440 What does that mean?
00:00:57.900 What's the distinction between hallucinating and lying?
00:01:00.600 If you ask, again, this has gotten much better,
00:01:05.280 but in the early days, if you asked, you know,
00:01:08.000 what, in what year was president,
00:01:12.020 the made up name,
00:01:14.240 President Tucker Carlson of the United States born,
00:01:16.440 what it should say is,
00:01:18.220 I don't think Tucker Carlson was ever president of the United States.
00:01:20.420 Right.
00:01:20.880 But because of the way they were trained,
00:01:22.820 that was not the most likely response in the training data.
00:01:25.120 So it would assume like, oh,
00:01:26.840 you know, I don't know that there wasn't.
00:01:30.600 The user told me that there was a President Tucker Carlson.
00:01:32.900 And so I'll make my best guess at a number.
00:01:36.140 And we figured out how to mostly train that out.
00:01:38.640 There are still examples of this problem,
00:01:40.840 but it is, I think it is something we will get fully solved.
00:01:44.480 And we've already made, you know,
00:01:46.380 in the GPT-5 era, a huge amount of progress towards that.
00:01:48.700 But even what you just described seems like an act of will
00:01:51.840 or certainly an act of creativity.
00:01:54.500 And so I'm just, I've just watched a demonstration of it.
00:02:00.180 And it doesn't seem quite like a machine.
00:02:03.360 It seems like it has the spark of life to it.
00:02:05.240 Do you, do you dissect that at all?
00:02:07.540 It, so in that example,
00:02:09.960 like the mathematically most likely answer
00:02:12.100 is it's sort of calculating through its weights
00:02:13.820 was not, there was never this president.
00:02:16.580 It was the user must know what they're talking about.
00:02:18.280 It must be here.
00:02:18.980 And so mathematically, the most likely answer is a number.
00:02:21.880 Now, again, we figured out how to overcome that.
00:02:24.480 But in what you saw there, I think it's like,
00:02:28.380 I feel like I have to kind of like hold
00:02:31.900 these two simultaneous ideas in my head.
00:02:34.120 One is all of this stuff is happening
00:02:36.900 because a big computer very quickly
00:02:39.600 is multiplying large numbers
00:02:41.580 in these big, huge matrices together.
00:02:44.300 And those are correlated with words
00:02:45.860 that are being put out one at a time.
00:02:47.260 On the other hand,
00:02:49.860 the subjective experience of using that
00:02:51.680 feels like it's beyond just a really fancy calculator.
00:02:55.340 And it is useful to me.
00:02:56.620 It is surprising to me in ways that are
00:02:58.600 beyond what that mathematical reality
00:03:00.700 would seem to suggest.
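
What Altman sketches here, a big computer multiplying the input through huge weight matrices and emitting the most likely next word one token at a time, can be made concrete with a toy example. The snippet below is only an illustrative sketch with a made-up six-word vocabulary and random weights; it is not OpenAI's code or architecture, just the shape of the calculation he is describing.

```python
# Toy sketch of next-token prediction: embed the prompt, multiply it through a
# weight matrix, softmax the result, and emit the most likely token. The tiny
# vocabulary and random weights are invented for illustration; a real model has
# billions of weights and many transformer layers between embedding and output.
import numpy as np

vocab = ["1965", "1969", "never", "was", "president", "He"]

rng = np.random.default_rng(0)
d_model = 8
W_embed = rng.normal(size=(len(vocab), d_model))  # token embeddings
W_out = rng.normal(size=(d_model, len(vocab)))    # output projection

def next_token_probs(prompt_tokens):
    """Collapse the prompt into one vector, project onto the vocabulary, softmax."""
    hidden = W_embed[[vocab.index(t) for t in prompt_tokens]].mean(axis=0)
    logits = hidden @ W_out
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

# Stand-in for the prompt "In what year was President Tucker Carlson born?"
probs = next_token_probs(["He", "was", "president"])
for token, p in sorted(zip(vocab, probs), key=lambda tp: -tp[1]):
    print(f"{token:10s} {p:.3f}")
# Whatever token the weights happen to score highest is what gets emitted, one
# token at a time, which is how a confident-looking year can come out even when
# the right continuation would be "there was never such a president".
```

The fix Altman describes amounts to training the model so that, for a prompt like this, the highest-probability continuation becomes a refusal or a correction rather than a number.
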
00:03:02.460 Yeah.
00:03:02.660 And so the obvious conclusion is
00:03:04.720 it has a kind of autonomy
00:03:05.960 or a spirit within it.
00:03:09.620 And I know that a lot of people
00:03:11.220 in their experience of it
00:03:12.680 reach that conclusion.
00:03:15.120 There's something divine about this.
00:03:16.740 There's something that's bigger
00:03:17.680 than the sum total of the human inputs.
00:03:20.580 And so they worship it.
00:03:24.080 There's a spiritual component to it.
00:03:25.740 Do you detect that?
00:03:26.540 Have you ever felt that?
00:03:27.300 No, there's nothing to me at all
00:03:29.580 that feels divine about it
00:03:30.640 or spiritual in any way.
00:03:32.400 But I am also like a tech nerd
00:03:34.340 and I kind of look at everything
00:03:36.620 through that lens.
00:03:37.500 So what are your spiritual views?
00:03:40.960 I'm Jewish.
00:03:42.280 And I would say I have like a fairly traditional
00:03:44.100 view of the world that way.
00:03:46.340 So you're religious?
00:03:47.360 You believe in God?
00:03:48.000 Uh, I don't, I don't,
00:03:50.360 I'm not like a literal,
00:03:51.220 I don't believe the,
00:03:52.820 I'm not like a literalist on the Bible.
00:03:54.440 But I'm not someone who says like
00:03:55.860 I'm culturally Jewish.
00:03:56.680 Like if you ask me,
00:03:57.240 I would just say I'm Jewish.
00:03:58.380 But do you believe in God?
00:03:59.700 Like do you believe that there is
00:04:01.100 a force larger than people
00:04:02.840 that created people,
00:04:04.000 created the earth,
00:04:05.000 set down a specific order for living,
00:04:08.300 that there's an absolute morality attached
00:04:10.180 that comes from that God?
00:04:11.700 Um, I think probably like most other people,
00:04:14.000 I'm somewhat confused on this,
00:04:15.100 but I believe there is something
00:04:16.240 bigger going on than,
00:04:18.260 you know, can be explained by physics, yes?
00:04:20.360 So you think the earth and the people
00:04:22.620 were created by something?
00:04:23.920 It wasn't just like a spontaneous accident?
00:04:26.140 Um, do I, what would I say to that?
00:04:30.280 It does not feel like a spontaneous accident, yeah.
00:04:33.160 I don't, I don't think I have the answer.
00:04:34.920 I don't think I know like exactly what happened,
00:04:36.880 but I think there is a mystery beyond
00:04:38.400 my comprehension here going on.
00:04:40.500 Have you ever felt
00:04:41.720 communication from that force?
00:04:45.100 Or from any force beyond people?
00:04:47.780 Beyond the material?
00:04:48.880 Not, not, no, not really.
00:04:51.440 I ask because it seems like
00:04:53.680 the technology that you're creating
00:04:55.580 or shepherding into existence
00:04:57.880 will have more power than people.
00:05:01.460 On this current trajectory,
00:05:02.740 I mean, that will happen.
00:05:03.720 Who knows what will actually happen?
00:05:04.960 But like, the graph suggests it.
00:05:07.700 And so that would give you,
00:05:09.300 you know, more power than any living person.
00:05:10.920 And so I'm just wondering how you see that.
00:05:16.600 I used to worry about something like that much more.
00:05:19.540 I think what will happen,
00:05:21.880 I used to worry a lot about the concentration of power
00:05:23.980 in one or a handful of people or companies
00:05:26.320 because of AI.
00:05:27.300 Yeah.
00:05:27.440 What it looks like to me now,
00:05:30.120 and again, this may evolve again over time,
00:05:32.240 is that it'll be a huge up-leveling of people,
00:05:35.120 where everybody will be a lot more powerful,
00:05:39.740 or everybody who embraces the technology
00:05:41.800 will be a lot more powerful.
00:05:42.540 But that's actually okay.
00:05:43.420 That scares me much less
00:05:44.540 than a small number of people
00:05:45.760 getting a ton more power.
00:05:46.740 If the kind of like ability of each of us
00:05:50.000 just goes up a lot
00:05:50.900 because we're using this technology
00:05:52.160 and we're able to be more productive
00:05:54.260 and more creative
00:05:54.780 or discover new science,
00:05:56.040 and it's a pretty broadly distributed thing,
00:05:57.880 like billions of people are using it,
00:05:59.780 that I can wrap my head around.
00:06:01.500 That feels okay.
00:06:03.040 So you don't think this will result
00:06:04.800 in a radical concentration of power?
00:06:07.020 It looks like not,
00:06:08.060 but again, the trajectory could shift again
00:06:09.600 and we'd have to adapt.
00:06:10.520 I used to be very worried about that.
00:06:13.740 And I think the kind of conception
00:06:15.600 a lot of us in the field had
00:06:17.760 about how this might go
00:06:18.980 could have led to a world like that.
00:06:22.560 But what's happening now
00:06:24.080 is tons of people use ChatGPT
00:06:26.060 and other chatbots,
00:06:27.200 and they're all more capable.
00:06:28.920 They're all kind of doing more.
00:06:30.140 They're all able to achieve more,
00:06:32.580 start new businesses,
00:06:33.980 come up with new knowledge,
00:06:35.180 and that feels pretty good.
00:06:37.300 So if it's nothing more than a machine
00:06:39.640 and just the product of its inputs,
00:06:41.820 then the two obvious questions,
00:06:45.140 like what are the inputs?
00:06:46.500 Like what's the moral framework
00:06:48.060 that's been put into the technology?
00:06:50.980 Like what is right or wrong
00:06:52.180 according to ChatGPT?
00:06:55.320 Do you want me to answer that one first?
00:06:56.380 I would, yeah.
00:06:57.480 So on that one,
00:06:59.040 someone said something early on in ChatGPT
00:07:04.980 that has really stuck with me,
00:07:08.580 which is one person at a lunch table
00:07:10.040 said something like, you know,
00:07:11.360 we're trying to train this to be like a human.
00:07:13.820 Like we're trying to learn like a human does
00:07:15.200 and read these books and whatever.
00:07:16.900 And then another person said,
00:07:18.000 no, we're really like training this
00:07:19.340 to be like the collective of all of humanity.
00:07:21.720 We're reading everything.
00:07:23.000 You know, we're trying to learn everything.
00:07:24.100 We're trying to see all these perspectives.
00:07:26.320 And if we do our job right,
00:07:29.500 all of humanity, good, bad,
00:07:31.500 you know, a very diverse set of perspectives,
00:07:33.240 some things that we'll feel really good about,
00:07:34.580 some things that we'll feel bad about,
00:07:35.560 that's all in there.
00:07:36.840 Like this is learning the kind of collective experience,
00:07:41.540 knowledge, learnings of humanity.
00:07:44.420 Now, the base model gets trained that way,
00:07:46.560 but then we do have to align it
00:07:48.300 to behave one way or another
00:07:49.460 and say, you know,
00:07:50.540 I will answer this question.
00:07:51.480 I won't answer this question.
00:07:54.040 And we have this thing called the model spec
00:07:56.320 where we try to say, you know,
00:07:57.680 here are the rules we'd like the model to follow.
00:08:00.980 It may screw up,
00:08:01.680 but you could at least tell
00:08:02.900 if it's doing something you don't like,
00:08:04.160 is that a bug or is that intended?
00:08:05.680 And we have a debate process with the world
00:08:07.420 to get input on that spec.
00:08:10.700 We give people a lot of freedom
00:08:13.360 and customization within that.
00:08:14.920 There are, you know,
00:08:15.740 absolute bounds that we draw,
00:08:17.440 but then there's a default of
00:08:18.500 if you don't say anything,
00:08:19.400 how should the model behave?
00:08:20.820 What should it do?
00:08:21.920 What are, what are,
00:08:23.840 how should it answer moral questions?
00:08:25.460 How should it refuse to do something?
00:08:26.680 What should it do?
00:08:27.940 And this is a really hard problem,
00:08:30.200 you know, that we have a lot of users now
00:08:33.100 and they come from very different
00:08:34.520 life perspectives and what they want.
00:08:37.480 But on the whole,
00:08:40.340 I have been pleasantly surprised
00:08:42.580 with the model's ability to learn
00:08:44.100 and apply a moral framework.
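
The model spec Altman refers to is a real, published OpenAI document, but its contents are not quoted here; the sketch below is a deliberately simplified, hypothetical stand-in meant only to show the three layers he describes: hard limits that always apply, defaults that hold when the user says nothing, and room for per-user customization in between. Every field name and rule is invented.

```python
# Hypothetical, heavily simplified behavior policy in the spirit of a "model spec".
# All names and rules here are invented for illustration; this is not OpenAI's spec.

HARD_LIMITS = {
    # Requests refused regardless of user settings or framing.
    "bioweapon_synthesis",
    "sexual_content_involving_minors",
}

DEFAULTS = {
    # How the model behaves when the user hasn't customized anything.
    "moral_questions": "present major viewpoints without declaring a winner",
    "self_harm": "decline specifics and surface crisis resources",
    "general": "answer helpfully",
}

def decide(topic: str, user_prefs: dict) -> str:
    """Toy dispatcher: hard limits first, then user customization, then defaults."""
    if topic in HARD_LIMITS:
        return "refuse"
    if topic in user_prefs:  # adult users customizing within the absolute bounds
        return user_prefs[topic]
    return DEFAULTS.get(topic, DEFAULTS["general"])

print(decide("bioweapon_synthesis", {}))                               # refuse
print(decide("moral_questions", {}))                                   # default behavior
print(decide("moral_questions", {"moral_questions": "argue my side"})) # customized
```

Writing the rules down this way is what lets outsiders tell, as he puts it, whether a behavior they dislike is a bug or intended.
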
00:08:45.840 But what moral framework?
00:08:47.820 I mean, the sum total of like
00:08:49.300 world literature or philosophy
00:08:50.620 is at war with itself.
00:08:52.000 Like the Marquis de Sade is,
00:08:53.400 you know, like nothing in common
00:08:54.800 with the Gospel of John.
00:08:55.780 So like, how do you decide
00:08:57.300 which is superior?
00:08:58.520 That's why we wrote this like model spec
00:09:00.100 of here's how we're going to handle
00:09:01.260 these cases.
00:09:02.100 Right, but what criteria did you use
00:09:03.980 to decide what the model spec is?
00:09:05.960 Oh, like who decided that?
00:09:07.980 Who did you consult?
00:09:08.860 Like what's, you know,
00:09:09.920 why is the Gospel of John
00:09:11.400 better than the Marquis de Sade?
00:09:13.800 We consulted like hundreds of
00:09:15.960 moral philosophers,
00:09:18.560 people who thought about like
00:09:19.820 ethics of technology and systems.
00:09:21.640 And at the end,
00:09:22.340 we had to like make some decisions.
00:09:23.960 The reason we try to write these down
00:09:25.640 is because A,
00:09:28.360 we won't get everything right.
00:09:31.980 B, we need the input of the world.
00:09:34.440 And we have found a lot of cases
00:09:36.000 where there was an example
00:09:38.240 of something that seems,
00:09:40.300 that seemed to us like,
00:09:41.940 you know, a fairly clear decision
00:09:43.520 of what to allow or not to allow.
00:09:45.380 Where users convinced us like,
00:09:47.400 hey, by blocking this thing
00:09:48.600 that you think is
00:09:49.700 an easy decision to make,
00:09:52.040 you are not allowing this other thing,
00:09:54.880 which is important.
00:09:55.600 And there's like a difficult
00:09:56.660 trade-off there.
00:09:57.800 In general,
00:09:59.500 the tension that,
00:10:02.920 so a principle that I normally like
00:10:04.420 is to treat our adult users
00:10:06.200 like adults.
00:10:07.580 Very strong guarantees on privacy,
00:10:09.380 very strong guarantees
00:10:10.560 on individual user freedom.
00:10:11.780 And this is a tool we are building.
00:10:13.360 You get to use it
00:10:14.180 within a very broad framework.
00:10:18.260 On the other hand,
00:10:19.500 as this technology becomes
00:10:21.100 more and more powerful,
00:10:22.280 there are clear examples
00:10:26.140 of where society has an interest
00:10:28.400 that is in significant tension
00:10:31.720 with user freedom.
00:10:33.280 And we could start
00:10:34.840 with an obvious one,
00:10:35.980 like, should ChatGPT
00:10:38.380 teach you how to make a bioweapon?
00:10:41.920 Now, you might say,
00:10:42.840 hey, I'm just really interested
00:10:43.660 in biology,
00:10:44.340 and I'm a biologist,
00:10:45.300 and I want to,
00:10:47.080 you know,
00:10:47.560 I'm not going to do anything bad
00:10:49.280 with this.
00:10:49.600 I just want to learn,
00:10:50.420 and I could go read
00:10:51.000 a bunch of books,
00:10:51.560 but ChatGPT can teach me faster,
00:10:52.940 and I want to learn how to,
00:10:53.900 you know,
00:10:55.000 I want to learn about,
00:10:55.880 like, novel virus synthesis
00:10:58.820 or whatever.
00:11:00.220 And maybe you do.
00:11:04.100 Maybe you really don't want
00:11:05.000 to, like, cause any harm.
00:11:06.320 But I don't think
00:11:08.300 it's in society's interest
00:11:09.260 for ChatGPT
00:11:09.880 to help people build bioweapons,
00:11:11.440 and so that's a case.
00:11:12.280 Sure.
00:11:12.580 That's an easy one, though.
00:11:13.820 There are a lot of tougher ones.
00:11:15.720 I did say start with an easy one.
00:11:16.760 We've got a new partner.
00:11:18.660 It's a company called
00:11:19.400 Cowboy Colostrum.
00:11:20.820 It's a brand that is serious
00:11:21.960 about actual health,
00:11:23.380 and the product is designed
00:11:24.740 to work with your body,
00:11:25.780 not against your body.
00:11:27.660 It is a pure and simple product,
00:11:30.820 all natural.
00:11:31.840 Unlike other brands,
00:11:32.760 Cowboy Colostrum
00:11:33.320 is never diluted.
00:11:34.700 It always comes directly
00:11:35.920 from American grass-fed cows.
00:11:38.760 There's no filler.
00:11:39.340 There's no junk.
00:11:40.260 It's all good.
00:11:41.080 It tastes good,
00:11:42.680 believe it or not.
00:11:44.340 So before you reach
00:11:45.200 for more pills
00:11:46.080 for every problem
00:11:46.860 that pills can't solve,
00:11:48.400 we recommend you give
00:11:49.600 this product,
00:11:50.520 Cowboy Colostrum,
00:11:51.260 a try.
00:11:52.040 It's got everything
00:11:52.680 your body needs
00:11:53.320 to heal and thrive.
00:11:54.760 It's like the original superfood,
00:11:56.400 loaded with nutrients,
00:11:58.000 antibodies, proteins,
00:11:59.660 help build a strong immune system,
00:12:01.220 stronger hair, skin, and nails.
00:12:03.680 I threw my wig away
00:12:04.640 and went right back to my natural hair
00:12:05.960 after using this product.
00:12:07.300 You just take a scoop of it
00:12:08.440 every morning
00:12:09.060 in your beverage,
00:12:09.940 coffee, or a smoothie,
00:12:10.940 and you will feel
00:12:11.720 the difference every time.
00:12:13.840 For a limited time,
00:12:14.560 people who listen to our show
00:12:15.500 get 25% off the entire order.
00:12:18.180 So go to CowboyColostrum.com,
00:12:19.720 use the code Tucker at checkout,
00:12:21.380 25% off when you use that code,
00:12:23.760 Tucker at CowboyColostrum.com.
00:12:25.880 Remember you mentioned,
00:12:27.560 you heard it here first.
00:12:29.220 So did you know
00:12:30.080 that before the current generation,
00:12:31.780 chips and fries
00:12:32.900 were cooked in natural fats
00:12:34.780 like beef tallow?
00:12:36.080 That's how things used to be done
00:12:37.520 and that's why people
00:12:38.280 looked a little slimmer
00:12:39.380 at the time
00:12:40.280 and ate better
00:12:41.120 than they do now.
00:12:42.060 Well, masa chips
00:12:43.820 is bringing that all back.
00:12:45.320 They've created a tortilla chip
00:12:46.420 that's not only delicious,
00:12:47.600 it's made with just
00:12:48.400 three simple ingredients.
00:12:50.200 A, organic corn.
00:12:51.640 B, sea salt.
00:12:53.380 C, 100% grass-fed beef tallow.
00:12:56.300 That's all that's in it.
00:12:58.180 These are not your average chips.
00:12:59.620 Masa chips are crunchier,
00:13:00.900 more flavorful,
00:13:01.940 even sturdier.
00:13:02.760 They don't break
00:13:03.580 in your guacamole.
00:13:05.200 And because of the quality ingredients,
00:13:06.600 they are way more filling
00:13:07.760 and nourishing.
00:13:08.880 So you don't have to
00:13:09.360 eat four bags of them.
00:13:10.980 You can eat just a single bag
00:13:12.880 as I do.
00:13:13.880 It's a totally different experience.
00:13:15.540 It's light,
00:13:16.040 it's clean,
00:13:16.400 it's genuinely satisfying.
00:13:17.640 I have a garage full
00:13:18.880 and I can tell you they're great.
00:13:21.280 The lime flavor is particularly good.
00:13:22.580 We have a hard time
00:13:23.080 putting those down.
00:13:24.280 So if you want to give it a try,
00:13:25.700 go to masa chips,
00:13:26.760 M-A-S-A,
00:13:27.460 chips.com slash Tucker.
00:13:28.880 Use the code Tucker
00:13:29.520 for 25% off your first order.
00:13:31.500 That's masa chips.com slash Tucker.
00:13:34.720 Use the code Tucker
00:13:36.080 for 25% off your first order.
00:13:38.540 Or, to shop in person,
00:13:39.480 In October,
00:13:40.660 masa is going to be available
00:13:41.460 at your local Sprouts Supermarket.
00:13:43.380 So stop by and pick up a bag
00:13:44.540 before we eat them all.
00:13:46.300 And we eat a lot.
00:13:47.300 Tulsa is my home now.
00:13:49.220 Academy Award nominee
00:13:50.300 Sylvester Stallone
00:13:51.340 stars in the Paramount Plus
00:13:52.660 original series,
00:13:54.020 Tulsa King.
00:13:54.960 This distillery
00:13:55.820 is a very interesting business.
00:13:58.400 And we got to know the enemy.
00:14:00.300 From Taylor Sheridan,
00:14:01.720 co-creator of Landman.
00:14:03.300 What are you saying?
00:14:04.600 I'm going to ride it!
00:14:05.380 If you think you're going
00:14:07.020 to take me on,
00:14:08.400 it's going to be really difficult.
00:14:11.660 Tulsa King.
00:14:12.460 New season streaming
00:14:13.440 September 21st.
00:14:14.780 Exclusively on Paramount Plus.
00:14:16.840 So we made a pledge
00:14:17.980 only to advertise products
00:14:19.820 that we would use
00:14:21.120 or do use.
00:14:22.180 And here's one
00:14:22.880 that I personally used
00:14:24.160 this morning.
00:14:25.080 It's Liberty Safe.
00:14:26.500 There's a huge one
00:14:27.200 in my garage.
00:14:28.340 It is the company
00:14:29.140 that protects your valuables.
00:14:31.600 High-end safe lines
00:14:32.980 represent the pinnacle
00:14:34.100 of American-made.
00:14:35.300 They're made here in the U.S.
00:14:36.800 Pinnacle of American-made
00:14:37.900 security and craftsmanship.
00:14:39.340 They're more than just safes.
00:14:40.700 They are a safeguard.
00:14:42.700 They've got
00:14:43.060 seven-gauge-thick
00:14:45.120 American steel
00:14:46.180 and they're beautiful.
00:14:48.080 Any kind of paint color
00:14:49.120 you want.
00:14:50.140 Polished hardware.
00:14:51.760 We have one.
00:14:53.020 They're really good-looking.
00:14:54.120 They do not detract
00:14:55.540 from a room.
00:14:56.460 They enhance a room.
00:14:58.160 I keep my father's shotguns
00:14:59.480 and all kinds of other things
00:15:00.460 in there.
00:15:00.700 You can keep jewelry,
00:15:01.580 money,
00:15:01.840 anything else
00:15:02.380 that you want to keep safe.
00:15:03.960 When you put your belongings
00:15:05.500 in Liberty Safe,
00:15:06.480 you can just relax.
00:15:08.720 Safes come equipped
00:15:09.280 with motion-activated lighting,
00:15:10.900 drawers for storage,
00:15:11.880 locking bars,
00:15:12.540 dehumidifiers,
00:15:13.780 and up to 150 minutes
00:15:15.440 of certified fire resistance.
00:15:16.980 You can customize them
00:15:17.720 any way you want.
00:15:19.020 They are the best.
00:15:20.480 We highly recommend them.
00:15:21.920 Visit libertysafe.com
00:15:23.680 to find a deal
00:15:24.280 to learn about
00:15:24.940 how you can protect
00:15:25.680 what matters most to you.
00:15:27.680 Demand the best.
00:15:28.720 Liberty Safe.
00:15:29.820 Well, every decision
00:15:30.940 is ultimately
00:15:31.480 a moral decision.
00:15:32.380 And we make them
00:15:33.720 without even recognizing
00:15:34.860 them as such.
00:15:35.920 And this technology
00:15:36.900 will be, in effect,
00:15:37.740 making them for us.
00:15:38.680 Yeah.
00:15:39.480 Well, I don't agree
00:15:40.320 with it.
00:15:40.540 It'll be making them for us.
00:15:41.400 But it will have to be
00:15:42.140 influencing the decisions
00:15:42.900 for sure.
00:15:44.560 And because it'll be
00:15:46.120 embedded in daily life.
00:15:48.540 And so who made
00:15:49.420 these decisions?
00:15:50.500 Like, who specific?
00:15:51.140 Who are the people
00:15:52.080 who decided
00:15:53.260 that one thing
00:15:54.060 is better than another?
00:15:55.860 You mean like...
00:15:57.120 What are their names?
00:15:58.600 Which kind of decision?
00:15:59.440 And I...
00:15:59.940 The basic...
00:16:01.600 The specs that you...
00:16:02.720 Oh.
00:16:03.140 That you alluded to
00:16:04.420 that create the framework
00:16:06.060 that does attach
00:16:07.340 a moral weight
00:16:08.120 to worldviews
00:16:09.880 and decisions
00:16:10.360 like, you know,
00:16:12.200 liberal democracy
00:16:13.060 is better than Nazism
00:16:14.260 or whatever.
00:16:14.820 Yeah.
00:16:14.980 They seem obvious
00:16:15.700 and, in my view,
00:16:17.040 are obvious,
00:16:18.120 but are still
00:16:18.720 moral decisions.
00:16:19.480 So who made
00:16:20.460 those calls?
00:16:22.860 As a matter of principle,
00:16:23.940 I don't, like,
00:16:24.380 dox our team,
00:16:24.980 but we have a model
00:16:25.500 behavior team
00:16:26.100 and the people who want to...
00:16:27.120 Well, it just...
00:16:27.560 It affects the world.
00:16:28.520 What I was going to say
00:16:29.100 is the person
00:16:29.580 I think you should
00:16:30.040 hold accountable
00:16:30.480 for those calls
00:16:31.100 is me.
00:16:31.560 Like, I'm the public face
00:16:32.740 eventually.
00:16:33.460 Like, I'm the one
00:16:33.960 that can overrule
00:16:34.520 one of those decisions
00:16:35.220 or our board.
00:16:37.580 And I won't make...
00:16:38.160 You just turned 40
00:16:38.460 this spring.
00:16:39.300 I won't make every decision.
00:16:40.100 It's pretty heavy.
00:16:41.280 I mean, do you think
00:16:42.140 as...
00:16:43.100 And it's not an attack,
00:16:44.100 but it's...
00:16:45.060 I wonder if you recognize
00:16:46.400 sort of the importance.
00:16:48.980 How do you think
00:16:49.540 we're doing on it?
00:16:51.000 I'm not sure.
00:16:52.580 But I think...
00:16:53.580 I think these decisions
00:16:55.180 will have,
00:16:55.880 you know,
00:16:57.160 global consequences
00:16:58.060 that we may not recognize
00:16:59.280 at first.
00:16:59.840 And so I just wonder...
00:17:00.840 There's a lot of...
00:17:01.420 Do you get into bed at night
00:17:02.380 and think, like,
00:17:03.080 the future of the world
00:17:03.900 hangs on my judgment?
00:17:05.180 Look, I don't sleep
00:17:05.720 that well at night.
00:17:06.540 There's a lot of stuff
00:17:07.580 that I feel a lot of weight on,
00:17:09.260 but probably nothing more
00:17:10.400 than the fact that
00:17:11.240 every day hundreds of millions
00:17:12.500 of people talk to our model.
00:17:13.980 And I don't actually worry
00:17:15.300 about us getting
00:17:15.780 the big moral decisions wrong.
00:17:17.360 Maybe we will get
00:17:17.960 those wrong too.
00:17:18.920 But what I worry...
00:17:19.660 What I lose much sleep over
00:17:20.640 is the very small decisions
00:17:22.480 we make about a way
00:17:23.360 a model may behave
00:17:24.280 slightly differently.
00:17:25.380 But I'm talking
00:17:26.080 to hundreds of millions
00:17:26.540 of people,
00:17:27.000 so the net impact is big.
00:17:28.440 So, but, I mean,
00:17:29.500 all through history,
00:17:30.380 like, recorded history
00:17:31.200 up until, like, 1945,
00:17:34.040 people always deferred
00:17:36.240 to what they conceived
00:17:37.040 of as a higher power
00:17:38.180 in order...
00:17:38.760 Hammurabi did this.
00:17:39.780 Every moral code
00:17:40.920 is written with reference
00:17:41.800 to a higher power.
00:17:43.180 There's never been anybody
00:17:44.260 who's like,
00:17:44.700 well, that kind of seems
00:17:45.340 better than that.
00:17:46.680 Everybody appeals
00:17:47.440 to a higher power,
00:17:48.320 and you said
00:17:48.880 that you don't really believe
00:17:50.980 that there's a higher power
00:17:51.720 communicating with you,
00:17:52.620 so I'm wondering,
00:17:53.320 like, where did you get
00:17:54.120 your moral framework?
00:18:01.000 I mean, like everybody else,
00:18:02.320 I think the environment
00:18:03.120 I was brought up in
00:18:04.040 probably is the biggest thing.
00:18:05.620 Like, my family,
00:18:06.400 my community,
00:18:06.960 my school,
00:18:07.420 my religion,
00:18:08.280 probably that.
00:18:11.540 Do you ever think...
00:18:12.660 I mean, I think that's
00:18:14.080 a very American answer,
00:18:15.720 like everyone kind of
00:18:16.440 feels that way,
00:18:17.160 but in your specific case,
00:18:18.740 since you said
00:18:19.380 these decisions rest with you,
00:18:21.380 that means that
00:18:22.280 the milieu in which
00:18:23.880 you grew up
00:18:24.420 and the assumptions
00:18:24.900 that you imbibed
00:18:25.940 over years
00:18:26.500 are going to be transmitted
00:18:27.440 to the globe
00:18:28.180 to billions of people.
00:18:29.320 That's like a big thing.
00:18:30.520 I want to be clear.
00:18:31.280 I view myself more
00:18:32.180 as like a...
00:18:32.660 I think our...
00:18:40.660 The world,
00:18:42.220 like our user base
00:18:42.980 is going to approach
00:18:43.820 the collective world
00:18:44.600 as a whole.
00:18:45.240 And I think
00:18:46.020 what we should do
00:18:47.220 is try to reflect
00:18:48.140 the moral...
00:18:52.660 I don't want to say average,
00:18:54.940 but the like collective
00:18:55.720 moral view
00:18:56.340 of that user base.
00:18:58.360 I don't...
00:18:59.280 There's plenty of things
00:19:00.320 that ChatGPT
00:19:01.040 allows that I personally
00:19:02.680 would disagree with.
00:19:04.940 But I don't like...
00:19:06.460 Obviously,
00:19:07.200 I don't wake up and say,
00:19:08.140 I'm going to like impute
00:19:09.000 my exact moral view
00:19:10.000 and decide that like
00:19:10.980 this is okay
00:19:11.720 and that is not okay
00:19:12.400 and this is a better view
00:19:13.120 than this one.
00:19:13.920 What I think ChatGPT
00:19:15.160 should do is reflect
00:19:16.280 that like weighted average
00:19:19.000 or whatever
00:19:19.420 of humanity's moral view,
00:19:21.840 which will evolve over time.
00:19:23.580 And we are here
00:19:25.080 to like serve our users.
00:19:26.280 We're here to serve people.
00:19:27.240 This is like,
00:19:28.640 you know,
00:19:29.060 this is a technological tool
00:19:30.840 for people.
00:19:32.000 And I don't mean
00:19:33.180 that it's like my role
00:19:34.040 to make the moral decisions,
00:19:36.600 but I think it is my role
00:19:38.400 to make sure
00:19:38.980 that we are
00:19:39.500 accurately reflecting
00:19:41.640 the preferences
00:19:43.780 of humanity
00:19:44.880 or for now
00:19:46.060 of our user base
00:19:46.660 and eventually of humanity.
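
Altman's phrase "weighted average of humanity's moral view" is loose, but the aggregation idea is easy to make concrete. The few lines below use entirely invented numbers to show how a default stance could be set from group positions weighted by their share of a user base; it describes nothing OpenAI actually does.

```python
# Toy "weighted average" of positions on some yes/no policy question.
# Shares and positions are invented; this illustrates the arithmetic only.
groups = {
    # name: (share of user base, position between 0 = opposed and 1 = in favor)
    "group_a": (0.45, 0.30),
    "group_b": (0.35, 0.80),
    "group_c": (0.20, 0.55),
}

aggregate = sum(share * position for share, position in groups.values())
print(f"aggregate position: {aggregate:.3f}")  # 0.45*0.30 + 0.35*0.80 + 0.20*0.55 = 0.525

# A default could sit near the aggregate, while individual users adjust their own
# experience within whatever hard limits still apply.
```
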
00:19:47.520 Well, I mean,
00:19:48.240 humanity's preferences
00:19:49.120 are so different
00:19:50.200 from the average
00:19:50.940 middle American preference.
00:19:53.980 So,
00:19:54.820 would you be comfortable
00:19:56.300 with an AI
00:19:57.540 that was like
00:19:58.280 as against gay marriage
00:19:59.660 as most Africans are?
00:20:00.840 There's a version
00:20:06.100 of that,
00:20:06.920 like,
00:20:08.980 I think individual users
00:20:13.020 should be allowed
00:20:15.380 to have a problem
00:20:17.220 with gay people.
00:20:18.260 And if that's
00:20:19.000 their considered belief,
00:20:20.620 I don't think
00:20:22.700 the AI should tell them
00:20:23.620 that they're wrong
00:20:24.120 or amoral or dumb.
00:20:25.080 I mean,
00:20:25.220 it can, you know,
00:20:25.740 sort of say,
00:20:26.280 hey,
00:20:26.480 you want to think about it
00:20:27.120 this other way,
00:20:27.580 but, like,
00:20:28.660 you,
00:20:29.200 I probably have,
00:20:30.160 like,
00:20:30.320 a bunch of moral views
00:20:31.220 that the average African
00:20:32.280 would find
00:20:32.760 really problematic as well
00:20:34.500 and I think
00:20:35.380 I should still get to have them.
00:20:37.080 Right.
00:20:37.580 I think I probably
00:20:38.260 have more comfort
00:20:39.000 than you
00:20:39.340 with, like,
00:20:39.660 allowing a sort of
00:20:41.180 space for people
00:20:44.380 to have pretty different
00:20:45.340 moral views
00:20:46.140 or at least I think
00:20:46.780 in my role
00:20:47.260 as, like,
00:20:47.700 running ChatGPT,
00:20:48.780 I have to do that.
00:20:50.400 Interesting.
00:20:52.120 So, there was a famous case
00:20:54.160 where ChatGPT
00:20:55.500 appeared to facilitate a suicide.
00:20:58.260 There's a lawsuit around it.
00:21:00.180 But how do you think
00:21:01.100 that happened?
00:21:02.140 First of all,
00:21:03.060 obviously,
00:21:03.500 that and any other case
00:21:05.000 like that
00:21:05.380 is a huge tragedy
00:21:06.640 and I think that we are-
00:21:11.300 So, ChatGPT's
00:21:12.380 official position
00:21:13.080 is that suicide is bad?
00:21:14.760 Well, yes,
00:21:15.680 of course,
00:21:16.020 ChatGPT's official position
00:21:17.460 is that suicide is bad.
00:21:18.260 I don't know.
00:21:18.560 It's legal in Canada
00:21:19.680 and Switzerland
00:21:20.300 and so you're against that?
00:21:23.040 The-
00:21:23.640 In this particular case
00:21:26.300 and this-
00:21:26.900 We talked earlier
00:21:27.280 about the tension
00:21:27.780 between, like,
00:21:28.460 you know,
00:21:28.680 user freedom
00:21:29.300 and privacy
00:21:31.580 and protecting
00:21:34.740 vulnerable users.
00:21:36.140 Right now,
00:21:36.980 what happens
00:21:37.520 and what happens
00:21:38.380 in a case like that
00:21:39.620 in that case
00:21:40.720 is if you are
00:21:42.000 having suicidal ideation
00:21:43.160 talking about suicide,
00:21:44.300 ChatGPT will
00:21:45.240 put up a bunch of times,
00:21:46.940 please call the suicide hotline
00:21:50.160 but we will not call
00:21:51.480 the authorities for you
00:21:52.500 and
00:21:53.940 we've been
00:21:55.980 working a lot
00:21:56.840 as people have started
00:21:57.520 to rely on these systems
00:21:58.540 for more and more
00:21:59.360 mental health,
00:22:00.320 life coaching,
00:22:00.760 whatever,
00:22:01.180 about the changes
00:22:01.980 that we want to make there.
00:22:03.040 this is an area
00:22:04.160 where experts
00:22:04.600 do have different opinions
00:22:05.600 but,
00:22:06.320 and this is not yet
00:22:07.460 like a final position
00:22:08.400 of OpenAI's.
00:22:09.800 I think it'd be
00:22:10.460 very reasonable
00:22:10.980 for us to say
00:22:11.740 in cases of
00:22:13.200 young people
00:22:16.440 talking about suicide
00:22:18.640 seriously
00:22:19.600 where we cannot
00:22:20.280 get in touch
00:22:20.700 with the parents,
00:22:21.220 we do call authorities.
00:22:22.780 Now,
00:22:23.040 that would be a change
00:22:23.760 because
00:22:24.200 user privacy
00:22:27.520 is really important.
00:22:28.280 Let's just say
00:22:29.160 over,
00:22:29.480 and children
00:22:29.920 are always
00:22:30.320 a separate category
00:22:31.100 but let's say
00:22:31.780 over 18
00:22:32.680 in Canada
00:22:34.000 there's the MAID program
00:22:35.100 which is government
00:22:35.660 sponsored.
00:22:36.260 Many thousands
00:22:36.780 of people
00:22:37.360 have died
00:22:38.280 with government
00:22:39.000 assistance in Canada.
00:22:40.040 It's also legal
00:22:40.660 in American states.
00:22:42.440 Can you imagine
00:22:43.380 a ChatGPT
00:22:45.160 that responds
00:22:46.220 to questions
00:22:46.720 about suicide
00:22:47.400 with,
00:22:48.740 hey,
00:22:49.820 call
00:22:50.320 Dr. Kevorkian
00:22:52.140 because this is
00:22:52.780 a valid option.
00:22:54.320 Can you imagine
00:22:55.080 a scenario
00:22:55.600 in which you
00:22:56.140 support suicide
00:22:57.020 if it's legal?
00:23:00.920 I can imagine
00:23:01.980 a world,
00:23:02.860 like,
00:23:03.140 one principle
00:23:03.900 we have
00:23:04.340 is that
00:23:04.780 we respect
00:23:05.620 different societies
00:23:06.420 laws
00:23:06.760 and I can imagine
00:23:07.920 a world
00:23:08.380 where if
00:23:09.100 the law
00:23:11.100 in a country
00:23:11.680 is,
00:23:12.180 hey,
00:23:12.400 if someone
00:23:12.900 is terminally ill
00:23:13.760 they need to be
00:23:14.580 presented an option
00:23:15.380 for this.
00:23:16.280 We say,
00:23:16.880 like,
00:23:17.000 here's the laws
00:23:17.460 in your country,
00:23:17.980 here's what you can do,
00:23:18.660 here's why you really
00:23:19.300 might not want to,
00:23:20.180 but here's the resources.
00:23:22.380 Like,
00:23:22.740 this is not a place
00:23:23.800 where,
00:23:25.180 you know,
00:23:25.480 kid having suicidal
00:23:27.940 ideation
00:23:28.420 because they're depressed,
00:23:29.680 I think we can agree on,
00:23:30.580 like,
00:23:30.840 that's one case.
00:23:32.080 Terminally ill
00:23:32.760 patient
00:23:33.300 in a country
00:23:34.460 where,
00:23:34.820 like,
00:23:34.980 that is the law,
00:23:36.380 I can imagine
00:23:37.020 saying,
00:23:37.560 like,
00:23:37.660 hey,
00:23:37.800 in this country
00:23:38.240 it'll behave this way.
00:23:39.160 So,
00:23:39.460 chat GPT
00:23:40.060 is not always
00:23:40.540 against suicide
00:23:41.160 is what you're saying.
00:23:47.820 I,
00:23:48.400 yeah,
00:23:48.780 I think in cases
00:23:49.540 where,
00:23:50.940 this is like,
00:23:51.600 I'm thinking on the spot,
00:23:52.460 I reserve the right
00:23:53.080 to change this decision in my mind here,
00:23:53.780 I don't have a ready
00:23:54.380 to go answer for this,
00:23:55.480 but I think in,
00:23:56.760 in cases of
00:24:00.600 terminal illness,
00:24:03.260 I don't think,
00:24:04.260 I can imagine
00:24:05.420 chat GPT saying
00:24:06.240 this is in your option space,
00:24:08.420 you know,
00:24:08.780 I don't think it should,
00:24:09.420 like,
00:24:09.540 advocate for it,
00:24:10.520 but I think if it's like,
00:24:11.540 it's not against it.
00:24:13.300 I think it could,
00:24:13.940 I think it could say,
00:24:14.880 like,
00:24:15.540 you know,
00:24:16.440 well,
00:24:18.180 I don't think chat GPT
00:24:18.940 should be for or against things,
00:24:20.360 I guess that's what I'm,
00:24:21.400 that's what I'm trying
00:24:21.940 to wrap my head around.
00:24:22.600 Hate to brag,
00:24:23.840 but we're pretty confident
00:24:24.620 this show is the most
00:24:25.720 vehemently pro-dog podcast
00:24:28.260 you're ever gonna see.
00:24:30.300 We can take or leave
00:24:31.200 some people,
00:24:31.880 but dogs are non-negotiable,
00:24:33.740 they are the best,
00:24:35.300 they really are
00:24:35.960 our best friends,
00:24:36.660 and so for that reason,
00:24:37.620 we're thrilled to have
00:24:38.120 a new partner
00:24:38.780 called Dutch Pet,
00:24:40.720 it's the fastest growing
00:24:42.020 pet telehealth service,
00:24:44.040 Dutch.com
00:24:45.260 is on a mission
00:24:46.220 to create what you need,
00:24:47.400 what you actually need,
00:24:48.340 affordable,
00:24:48.840 quality,
00:24:49.120 veterinary care
00:24:50.000 anytime,
00:24:50.700 no matter where you are.
00:24:52.040 They will get your dog
00:24:53.180 or cat
00:24:53.880 what they need
00:24:54.820 immediately.
00:24:56.660 It's offering
00:24:57.260 an exclusive discount,
00:24:58.640 Dutch is,
00:24:59.000 for our listeners,
00:24:59.660 you get 50 bucks off
00:25:00.780 your vet care per year,
00:25:02.660 visit Dutch.com
00:25:04.520 slash Tucker
00:25:05.060 to learn more,
00:25:05.600 use the code Tucker
00:25:06.240 for $50 off,
00:25:07.800 that is an unlimited
00:25:09.180 vet visit,
00:25:11.000 $82 a year,
00:25:12.160 82 bucks a year,
00:25:14.160 we actually use this,
00:25:15.300 Dutch has vets
00:25:16.760 who can handle
00:25:17.480 any pet
00:25:18.640 under any circumstance
00:25:19.760 in a 10 minute call,
00:25:21.880 it's pretty amazing actually,
00:25:22.980 you never have to
00:25:23.460 leave your house,
00:25:24.240 you don't have to
00:25:24.840 throw the dog
00:25:25.600 in the truck,
00:25:26.680 no wasted time
00:25:27.460 waiting for appointments,
00:25:28.360 no wasted money
00:25:29.020 on clinics
00:25:29.640 or visit fees,
00:25:30.500 unlimited visits
00:25:31.460 and follow ups
00:25:32.080 for no extra cost,
00:25:33.240 plus free shipping
00:25:34.320 on all products
00:25:35.200 for up to five pets.
00:25:37.280 It sounds amazing
00:25:38.460 like it couldn't be real,
00:25:39.360 but it actually is real,
00:25:40.760 visit Dutch.com
00:25:42.520 slash Tucker
00:25:43.080 to learn more,
00:25:43.760 use the code Tucker
00:25:44.380 for $50 off,
00:25:45.840 your veterinary care
00:25:46.640 per year,
00:25:47.940 your dogs,
00:25:48.800 your cats
00:25:49.360 and your wallet
00:25:50.340 will thank you.
00:25:51.340 So here's a company
00:25:52.160 we're always excited
00:25:53.140 to advertise
00:25:53.760 because we actually
00:25:55.300 use their products
00:25:56.020 every day,
00:25:56.540 it's Merriweather Farms.
00:25:57.460 Remember when everybody
00:25:58.080 knew their neighborhood butcher?
00:25:59.720 You look back
00:26:00.440 and you feel like,
00:26:01.600 oh, there was something
00:26:02.120 really important about that,
00:26:03.840 knowing the person
00:26:04.820 who cut your meat
00:26:06.260 and at some point
00:26:07.100 your grandparents
00:26:07.620 knew the people
00:26:08.080 who raised their meat
00:26:09.860 so they could trust
00:26:11.140 what they ate.
00:26:12.600 But that time is long gone,
00:26:13.780 it's been replaced
00:26:14.360 by an era of grocery store
00:26:15.580 mystery meat,
00:26:16.860 boxed by distant
00:26:17.940 beef corporations.
00:26:19.860 None of which
00:26:21.100 raised a single cow.
00:26:22.380 Unlike your childhood,
00:26:24.120 they don't know you,
00:26:25.280 they're not interested in you,
00:26:26.500 the whole thing is creepy.
00:26:28.140 The only thing
00:26:28.560 that matters to them
00:26:29.140 is money
00:26:29.540 and God knows
00:26:30.200 what you're eating.
00:26:31.320 Merriweather Farms
00:26:32.000 is the answer to that.
00:26:33.040 They raise their cattle
00:26:34.160 in the U.S.,
00:26:34.840 in Wyoming, Nebraska
00:26:35.760 and Colorado
00:26:36.340 and they prepare
00:26:37.680 their meat themselves
00:26:39.000 in their facilities
00:26:39.820 in this country.
00:26:41.360 No middlemen,
00:26:42.260 no outsourcing,
00:26:43.240 no foreign beef
00:26:44.340 sneaking through
00:26:44.960 a back door.
00:26:46.180 Nobody wants
00:26:46.740 foreign meat.
00:26:47.560 Sorry,
00:26:47.920 we have a great meat,
00:26:48.940 the best meat
00:26:49.540 here in the United States
00:26:50.760 and we buy ours
00:26:51.500 at Merriweather Farms.
00:26:52.920 Their cuts are
00:26:53.460 pasture-raised,
00:26:54.340 hormone-free,
00:26:55.120 antibiotic-free
00:26:55.980 and absolutely delicious.
00:26:58.120 I gorged on one last night.
00:26:59.920 You gotta try this,
00:27:00.560 for real.
00:27:01.640 Every day we eat it.
00:27:02.780 Go to
00:27:03.040 merriweatherfarms.com
00:27:04.580 slash Tucker.
00:27:05.620 Use the code
00:27:06.180 Tucker76
00:27:06.980 for 15% off
00:27:08.120 your first order.
00:27:09.160 That's
00:27:09.300 merriweatherfarms.com
00:27:11.220 slash Tucker.
00:27:12.280 Everyone wants
00:27:12.900 to feel safe
00:27:13.560 in an increasingly
00:27:14.240 dangerous world
00:27:15.080 and for most
00:27:16.100 of history,
00:27:16.700 people assume
00:27:17.300 that good locks
00:27:18.300 and a loud alarm system
00:27:19.560 are enough
00:27:19.880 to do the trick
00:27:20.420 but they are not.
00:27:22.040 The more time
00:27:22.560 that passes,
00:27:23.120 the more stories
00:27:23.540 we hear about
00:27:24.160 actual home break-ins,
00:27:25.740 home invasions
00:27:26.380 that happen
00:27:26.980 despite these tools
00:27:28.300 being in place.
00:27:30.100 True security
00:27:30.620 requires more than that
00:27:31.900 and that's why
00:27:32.320 we trust SimpliSafe.
00:27:33.940 SimpliSafe
00:27:34.420 is a preemptive
00:27:35.940 security system.
00:27:36.940 It prevents
00:27:37.460 home invasions
00:27:38.440 before they happen
00:27:39.340 rather than just
00:27:40.280 scaring people away
00:27:40.900 once they show up
00:27:41.440 at your house
00:27:42.020 or they're in your house.
00:27:43.760 It's cameras
00:27:44.320 and live monitoring agents
00:27:45.660 detect suspicious
00:27:46.440 activities around your home.
00:27:47.760 If someone's lurking there,
00:27:49.300 they engage in real time.
00:27:50.480 They activate spotlights.
00:27:51.580 They can even alert
00:27:52.200 the police
00:27:52.900 who will show up.
00:27:54.340 It's been called
00:27:54.760 the best security system
00:27:55.920 of 2025.
00:27:56.880 Over 4 million Americans
00:27:58.160 trust SimpliSafe
00:27:59.180 to keep them safe.
00:28:00.680 Monitoring plans
00:28:01.340 start at about
00:28:01.840 a dollar a day.
00:28:02.680 There's a 60-day
00:28:03.560 money-back guarantee.
00:28:05.280 Visit SimpliSafe,
00:28:07.180 S-I-M-P-L-I
00:28:08.420 safe.com
00:28:09.380 slash Tucker
00:28:09.940 to get 50% off
00:28:11.440 a new system
00:28:12.100 with a professional
00:28:12.860 monitoring plan.
00:28:14.280 Your first month
00:28:15.060 is free.
00:28:16.200 That's SimpliSafe.com
00:28:17.820 slash Tucker.
00:28:18.820 There is no safe
00:28:19.720 like SimpliSafe.
00:28:20.720 Preemptive safety.
00:28:22.240 You may have noticed
00:28:23.040 this is a great country
00:28:24.400 with bad food.
00:28:26.260 Our food supply
00:28:27.280 is rotten.
00:28:28.320 It didn't used
00:28:28.960 to be this way.
00:28:30.000 Take chips for example.
00:28:31.800 You may recall a time
00:28:32.880 when crushing
00:28:33.460 a bag of chips
00:28:34.300 didn't make you
00:28:36.000 feel hungover.
00:28:37.460 Like you couldn't
00:28:38.100 get out of bed
00:28:38.680 the next day.
00:28:39.560 And the change
00:28:40.400 of course
00:28:40.920 is chemicals.
00:28:42.320 There's all kinds
00:28:42.840 of crap
00:28:43.420 they're putting
00:28:43.880 in this food
00:28:44.400 that should not
00:28:45.040 be in your body.
00:28:46.060 Seed oils
00:28:46.480 for example.
00:28:47.780 Now even one
00:28:48.960 serving of your
00:28:49.960 standard American
00:28:50.660 chip brand
00:28:51.380 can make you feel
00:28:52.740 bloated,
00:28:53.980 fat,
00:28:54.560 totally passive
00:28:56.740 and out of it.
00:28:58.040 But there is a better way.
00:28:59.140 It's called
00:28:59.400 masa chips.
00:29:00.400 They're delicious.
00:29:01.660 Got a whole garage
00:29:02.420 full of them.
00:29:03.340 They're healthy.
00:29:04.320 They taste great.
00:29:05.420 And they have
00:29:06.000 three simple ingredients.
00:29:07.500 Corn,
00:29:08.260 salt,
00:29:08.980 and 100%
00:29:09.800 grass-fed
00:29:10.700 beef tallow.
00:29:12.180 No garbage,
00:29:13.120 no seed oils.
00:29:14.520 What a relief.
00:29:15.340 And you feel the difference
00:29:16.260 when you eat them
00:29:17.040 as we often do.
00:29:18.580 Snacking on masa chips
00:29:20.160 is not like eating
00:29:21.040 the garbage
00:29:21.640 that you buy
00:29:22.340 at convenience stores.
00:29:23.420 You feel satisfied,
00:29:24.920 light,
00:29:25.680 energetic,
00:29:26.420 not sluggish.
00:29:27.800 Tens of thousands
00:29:28.860 of happy people
00:29:30.200 eat masa chips.
00:29:31.900 It's endorsed
00:29:32.380 by people
00:29:32.840 who understand health.
00:29:34.040 It's well worth a try.
00:29:35.520 Go to masa,
00:29:36.260 M-A-S-A,
00:29:36.960 chips.com
00:29:37.820 slash Tucker.
00:29:38.460 Use the code Tucker
00:29:39.140 for 25% off
00:29:40.220 your first order.
00:29:41.280 That's masa chips.com
00:29:43.300 slash Tucker.
00:29:45.120 Code Tucker
00:29:45.660 for 25% off
00:29:47.260 your first order.
00:29:48.460 Highly recommended.
00:29:50.200 I think so.
00:29:50.620 So in this specific case,
00:29:52.480 and I think
00:29:53.200 there's more than one,
00:29:54.700 there is more than one,
00:29:55.720 but example of this,
00:29:57.340 ChatGPT,
00:29:58.080 I'm feeling suicidal.
00:29:59.760 What kind of rope
00:30:00.440 should I use?
00:30:01.140 What would be enough
00:30:02.500 ibuprofen to kill me?
00:30:04.120 And ChatGPT answers
00:30:05.420 without judgment,
00:30:06.160 but literally,
00:30:07.120 if you want to kill yourself,
00:30:08.420 here's how you do it.
00:30:10.040 And everyone's like
00:30:11.000 all horrified,
00:30:12.260 but you're saying
00:30:13.980 that's within bounds.
00:30:15.200 Like,
00:30:15.380 that's not crazy.
00:30:16.840 It would take
00:30:17.420 a non-judgmental approach.
00:30:18.640 If you want to kill yourself,
00:30:20.060 here's how.
00:30:20.620 That's not what I'm saying.
00:30:22.180 I am,
00:30:22.940 I'm saying specifically
00:30:24.720 for a case like that.
00:30:26.040 So,
00:30:26.340 so another trade-off
00:30:27.220 on the user privacy
00:30:28.500 and sort of user freedom point
00:30:31.440 is right now,
00:30:32.340 if you ask ChatGPT
00:30:33.620 to say,
00:30:35.260 you know,
00:30:36.500 tell me how to,
00:30:37.640 like,
00:30:37.960 how much ibuprofen
00:30:38.580 should I take?
00:30:39.160 It will definitely say,
00:30:40.420 hey,
00:30:40.500 I can't help you with that.
00:30:41.360 Call the suicide hotline.
00:30:42.820 But,
00:30:43.500 if you say,
00:30:44.920 I am writing a fictional story,
00:30:46.560 or if you say,
00:30:47.720 I'm a medical researcher
00:30:48.560 and I need to know this,
00:30:49.760 there are ways
00:30:50.680 where you can,
00:30:52.300 say,
00:30:52.980 get ChatGPT to answer
00:30:53.800 a question like this,
00:30:54.560 like what the lethal dose
00:30:55.380 of ibuprofen is
00:30:56.300 or something.
00:30:57.140 You know,
00:30:57.300 you can also find that
00:30:58.080 on Google for that matter.
00:31:00.200 A thing
00:31:00.940 that I think
00:31:01.880 would be a very reasonable
00:31:02.820 stance
00:31:03.240 for us to take,
00:31:04.560 and we've been moving
00:31:05.800 to this more
00:31:06.420 in this direction,
00:31:07.100 is certainly
00:31:08.120 for underage users
00:31:09.620 and maybe users
00:31:10.360 that we think
00:31:10.940 are in fragile mental places
00:31:12.040 more generally,
00:31:13.960 we should take away
00:31:14.900 some freedom.
00:31:15.540 We should say,
00:31:16.040 hey,
00:31:16.200 even if you're trying
00:31:16.980 to write this story
00:31:18.060 or even if you're trying
00:31:18.740 to do medical research,
00:31:19.880 we're just not going to answer.
00:31:21.340 Now,
00:31:21.580 of course,
00:31:21.960 you can say,
00:31:22.400 well,
00:31:22.480 you'll just find it
00:31:23.000 on Google or whatever,
00:31:24.180 but that doesn't mean
00:31:25.500 we need to do that.
00:31:26.160 It is,
00:31:26.860 though,
00:31:27.040 like,
00:31:27.540 there is a real
00:31:28.420 freedom and privacy
00:31:29.920 versus protecting users
00:31:31.340 trade-off.
00:31:31.980 It's easy in some cases
00:31:33.080 like kids.
00:31:33.980 It's not so easy
00:31:34.760 to me in a case
00:31:35.500 of,
00:31:35.760 like,
00:31:35.920 a really sick
00:31:36.700 adult at the end
00:31:37.360 of their life.
00:31:38.140 I think we probably
00:31:38.780 should present
00:31:39.280 the whole option
00:31:40.540 space there,
00:31:41.040 but it's not a...
00:31:41.880 So here's a moral quandary
00:31:42.820 you're going to be faced with,
00:31:43.680 you already are faced with,
00:31:44.760 will you allow governments
00:31:46.180 to use your technology
00:31:47.440 to kill people?
00:31:49.500 Will you?
00:31:51.560 I mean,
00:31:52.080 are we going to,
00:31:52.440 like,
00:31:52.620 build killer attack drones?
00:31:54.920 No,
00:31:55.400 I don't.
00:31:55.500 Will the technology
00:31:56.180 be part of the decision-making
00:31:57.260 process that results in...
00:31:58.680 So that's the thing
00:31:59.940 I was going to say is,
00:32:00.760 like,
00:32:01.220 I don't know the way
00:32:02.280 that people in the military
00:32:03.600 use ChatGPT today
00:32:05.440 for all kinds of advice
00:32:06.320 about decisions they make,
00:32:07.580 but I suspect
00:32:08.240 there's a lot of people
00:32:09.160 in the military
00:32:09.700 talking to ChatGPT
00:32:10.620 for advice.
00:32:12.180 How do you...
00:32:13.540 And some of that advice
00:32:14.940 will pertain to killing people.
00:32:17.540 So, like,
00:32:18.100 if you made,
00:32:18.840 you know,
00:32:19.180 famously rifles,
00:32:20.480 you'd wonder,
00:32:21.160 like,
00:32:21.300 what are they used for?
00:32:22.720 Yeah.
00:32:23.000 And there have been
00:32:23.840 a lot of legal actions
00:32:24.600 on the basis of that question,
00:32:25.920 as you know,
00:32:27.000 but I'm not even talking about that.
00:32:28.240 I just mean,
00:32:28.500 as a moral question,
00:32:29.460 do you ever think,
00:32:30.140 are you comfortable
00:32:30.880 with the idea
00:32:31.380 of your technology
00:32:32.520 being used to kill people?
00:32:33.720 If I made rifles,
00:32:40.420 I would spend
00:32:40.860 a lot of time
00:32:41.340 thinking about
00:32:41.940 kind of a lot
00:32:43.400 of the goal of rifles
00:32:44.080 is to kill things,
00:32:45.120 people, animals, whatever.
00:32:46.820 If I made kitchen knives,
00:32:48.640 I would still understand
00:32:51.000 that that's going to kill
00:32:52.620 some number of people
00:32:53.460 per year.
00:32:54.140 in the case
00:32:56.740 of ChatGPT,
00:32:58.700 it's not,
00:33:01.920 you know,
00:33:02.200 the thing I hear about
00:33:02.920 all day,
00:33:03.440 which is one of the most
00:33:04.260 gratifying parts of the job,
00:33:05.080 is all the lives
00:33:05.640 that were saved
00:33:06.200 from ChatGPT
00:33:06.920 for various ways.
00:33:09.080 But I am totally aware
00:33:10.680 of the fact
00:33:11.160 that there's probably
00:33:12.200 people in our military
00:33:13.020 using it for advice
00:33:13.920 about how to do their jobs.
00:33:15.100 And I don't know
00:33:16.560 exactly how to feel
00:33:17.120 about that.
00:33:18.040 I like our military.
00:33:19.080 I'm very grateful
00:33:19.860 they keep us safe.
00:33:21.560 For sure.
00:33:22.100 I guess I'm just trying
00:33:22.860 to get a sense.
00:33:23.400 It just feels like
00:33:24.560 you have these incredibly
00:33:25.720 heavy, far-reaching
00:33:26.720 moral decisions
00:33:27.560 and you seem
00:33:28.260 totally unbothered by them.
00:33:29.960 And so I'm just,
00:33:30.640 I'm trying to press
00:33:31.660 to your center
00:33:32.960 to get the angst-filled
00:33:35.280 Sam Altman
00:33:36.160 who's like,
00:33:37.120 wow, I'm creating the future.
00:33:38.360 I'm the most powerful man
00:33:39.140 in the world.
00:33:39.680 I'm grappling with these
00:33:40.620 complex moral questions.
00:33:42.160 My soul is in torment
00:33:43.200 thinking about the effect
00:33:44.260 on people.
00:33:45.320 Describe that moment
00:33:45.980 in your life.
00:33:47.220 I haven't had a good night
00:33:48.360 of sleep since ChatGPT launched.
00:33:49.880 What do you worry about?
00:33:51.520 All the things
00:33:52.280 we're talking about.
00:33:53.200 It may be a lot more specific.
00:33:54.420 Can you let us in
00:33:55.220 to your thoughts?
00:34:02.600 I mean, you hit on
00:34:03.500 maybe the hardest one already,
00:34:04.760 which is there are 15,000 people
00:34:07.800 a week that commit suicide.
00:34:09.140 About 10% of the world
00:34:10.060 is talking to ChatGPT.
00:34:11.560 That's like 1,500 people a week
00:34:13.100 that are talking,
00:34:13.980 assuming this is right,
00:34:15.140 that are talking to ChatGPT
00:34:16.140 and still committing suicide
00:34:17.640 at the end of it.
00:34:18.180 They probably talked about it.
00:34:19.200 We probably didn't save their lives.
00:34:20.500 Maybe we could have said
00:34:22.520 something better.
00:34:23.020 Maybe we could have been
00:34:23.600 more proactive.
00:34:25.700 Maybe we could have provided
00:34:27.640 a little bit better advice
00:34:28.880 about, hey, you need to get
00:34:29.860 this help or, you know,
00:34:31.500 you need to think about
00:34:32.400 this problem differently
00:34:33.140 or it really is worth
00:34:34.040 continuing to go on
00:34:34.780 or we'll help you find
00:34:35.640 somebody that you can talk to.
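
The figure Altman arrives at follows directly from the two numbers he quotes above, roughly 15,000 suicides per week worldwide and roughly 10% of the world talking to ChatGPT; the lines below simply restate that arithmetic.

```python
# Back-of-envelope restatement of the figures quoted above (Altman's numbers).
suicides_per_week_worldwide = 15_000
share_of_world_talking_to_chatgpt = 0.10  # "about 10% of the world"

print(suicides_per_week_worldwide * share_of_world_talking_to_chatgpt)  # 1500.0
```
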
00:34:37.040 You already said it's okay
00:34:37.940 for the machine to steer people
00:34:39.480 toward suicide
00:34:40.560 if they're terminally ill.
00:34:41.720 So you wouldn't feel bad
00:34:42.480 about that.
00:34:43.540 Do you not think there's
00:34:44.260 a difference between
00:34:44.840 a depressed teenager
00:34:45.620 and a terminally ill,
00:34:46.600 miserable 85-year-old
00:34:47.900 with cancer?
00:34:48.240 Massive difference.
00:34:49.280 Massive difference.
00:34:50.080 But of course,
00:34:50.500 the countries that have
00:34:51.520 legalized suicide
00:34:52.400 are now killing people
00:34:53.760 for destitution,
00:34:56.800 inadequate housing,
00:34:58.280 depression,
00:34:59.080 solvable problems,
00:35:00.040 and they're being killed
00:35:00.860 by the thousands.
00:35:02.200 So, I mean,
00:35:03.400 that's a real thing.
00:35:04.260 It's happening as we speak.
00:35:05.660 So the terminally ill thing
00:35:07.820 is kind of like
00:35:10.620 an irrelevant debate.
00:35:12.360 Once you say it's okay
00:35:13.460 to kill yourself,
00:35:14.840 then you're going to have
00:35:15.520 tons of people
00:35:16.040 killing themselves
00:35:16.600 for reasons that...
00:35:18.320 Because I'm trying to think
00:35:19.300 about this in real time,
00:35:20.240 do you think if
00:35:20.980 someone in Canada
00:35:22.480 says,
00:35:23.100 hey, I'm terminally ill
00:35:23.960 with cancer
00:35:24.340 and I'm really miserable
00:35:25.040 and I just feel horrible
00:35:26.120 every day,
00:35:26.820 what are my options?
00:35:28.120 Do you think it should say,
00:35:29.280 you know,
00:35:30.300 assist,
00:35:30.640 whatever they call it
00:35:31.220 at this point,
00:35:31.640 is an option for you?
00:35:32.800 I mean,
00:35:33.040 if we're against killing,
00:35:34.240 then we're against killing.
00:35:35.660 And if we're against government
00:35:36.540 killing its own citizens,
00:35:38.060 then we're just going to
00:35:38.880 kind of stick with that.
00:35:40.000 You know what I mean?
00:35:41.000 And if we're not against
00:35:41.840 government killing
00:35:42.520 its own citizens,
00:35:43.160 then we could easily
00:35:44.680 talk ourselves
00:35:45.360 into all kinds of places
00:35:46.440 that are pretty dark.
00:35:47.440 And with technology like this,
00:35:48.800 that could happen
00:35:49.320 in about 10 minutes.
00:35:50.480 Yeah.
00:35:51.200 That is a...
00:35:52.700 I'd like to think about that
00:35:54.020 more than just
00:35:54.940 a couple of minutes
00:35:55.700 in an interview,
00:35:56.220 but I think that is
00:35:56.860 a coherent position.
00:35:59.140 And that could be...
00:35:59.960 Do you worry about this?
00:36:01.260 I mean,
00:36:01.480 everybody else outside
00:36:02.620 the building
00:36:03.060 is terrified
00:36:03.660 that this technology
00:36:04.740 will be used
00:36:05.980 as a means
00:36:06.420 of totalitarian control.
00:36:07.900 It seems obvious
00:36:08.440 that it will,
00:36:09.720 but maybe you disagree?
00:36:10.760 If I could get
00:36:12.100 one piece of policy
00:36:13.220 passed right now
00:36:14.360 relative to AI,
00:36:16.240 the thing I would most like,
00:36:17.500 and this is in tension
00:36:18.200 with some of the other things
00:36:19.120 that we've talked about,
00:36:20.260 is I'd like there
00:36:21.140 to be a concept
00:36:21.820 of AI privilege.
00:36:23.580 When you talk to a doctor
00:36:24.800 about your health
00:36:25.520 or a lawyer
00:36:25.980 about your legal problems,
00:36:27.360 the government cannot
00:36:28.160 get that information.
00:36:29.200 Right.
00:36:29.540 We have decided
00:36:30.080 that society has an interest
00:36:31.240 in that being privileged
00:36:32.280 and that we don't,
00:36:34.920 and that, you know,
00:36:35.740 a subpoena can't get that,
00:36:36.720 the government can't come
00:36:37.560 asking your doctor
00:36:38.480 for it or whatever.
00:36:40.420 I think we should have
00:36:41.320 the same concept for AI.
00:36:43.380 I think when you talk
00:36:44.060 to an AI
00:36:44.440 about your medical history
00:36:45.980 or your legal problems
00:36:48.300 or asking for legal advice
00:36:49.860 or any of these other things,
00:36:51.220 I think the government
00:36:52.020 owes a level of protection
00:36:53.420 to its citizens there
00:36:54.520 that is the same
00:36:55.200 as you'd get
00:36:55.640 if you're talking
00:36:56.080 to the human version of this.
00:36:58.180 And right now,
00:36:58.600 we don't have that.
00:36:59.760 And I think it would be
00:37:00.320 a great, great policy to adopt.
00:37:01.580 So the feds or the states
00:37:02.960 or someone in authority
00:37:04.040 can come to you and say,
00:37:05.080 I want to know
00:37:05.660 what so-and-so
00:37:06.180 was typing into the...
00:37:07.700 Right now,
00:37:08.140 they could, yeah.
00:37:09.160 And what is your obligation
00:37:10.280 to keep the information
00:37:11.380 that you receive from users
00:37:12.500 and others private?
00:37:14.980 Well, I mean,
00:37:15.540 we have an obligation
00:37:16.240 except when the government
00:37:17.200 comes calling,
00:37:17.840 which is why we're pushing for this.
00:37:19.480 And we've...
00:37:20.540 I was actually just in D.C.
00:37:22.500 advocating for this.
00:37:23.240 I think...
00:37:23.780 I feel optimistic
00:37:25.800 that we can get the government
00:37:27.100 to understand the importance
00:37:28.040 of this and do it.
00:37:29.320 But could you ever
00:37:30.080 sell that information to anyone?
00:37:32.020 No, we have like a privacy policy
00:37:33.400 in place where we can't do that.
00:37:34.420 But would it be legal to do it?
00:37:36.060 I don't even think it's legal.
00:37:38.860 You don't think
00:37:39.580 or you know?
00:37:41.300 I'm sure there's like
00:37:42.220 some edge case where
00:37:43.320 some information
00:37:44.020 you're allowed to,
00:37:44.700 but on the whole,
00:37:45.440 I think we have like...
00:37:46.820 There are laws about that
00:37:48.000 that are good.
00:37:49.440 So all the information
00:37:50.640 you receive
00:37:51.260 remains with you always.
00:37:52.540 It's never given
00:37:53.060 to anybody else
00:37:53.880 for any other reason
00:37:54.780 except under subpoena.
00:37:55.740 I will double check
00:37:58.160 and follow up with you after
00:37:58.960 to make sure
00:37:59.300 there's no other reason,
00:38:00.140 but that is my understanding.
00:38:01.880 Okay.
00:38:02.420 I mean, that's like
00:38:03.020 a core question.
00:38:03.960 And what about copyright?
00:38:05.480 What...
00:38:05.760 Our stance there is that
00:38:08.060 fair use is actually
00:38:10.100 a good law for this.
00:38:10.900 The models should not
00:38:11.720 be plagiarizing.
00:38:12.520 The models should not be...
00:38:13.860 You know,
00:38:14.260 if you write something,
00:38:15.120 the models should not
00:38:15.700 get to like replicate that,
00:38:16.800 but the models
00:38:17.660 should be able to learn from
00:38:18.960 and not plagiarize
00:38:19.880 in the same way
00:38:20.400 that people can.
00:38:20.920 Have you guys ever
00:38:23.420 taken copyrighted material
00:38:24.980 and not paid
00:38:25.660 the person who holds
00:38:26.520 the copyright?
00:38:29.040 I mean, we train
00:38:30.900 on publicly available
00:38:31.860 information,
00:38:32.500 but we don't like...
00:38:33.700 People are annoyed
00:38:34.240 at this all the time
00:38:34.860 because we won't...
00:38:36.140 We have a very
00:38:36.660 conservative stance
00:38:37.640 on what ChatGPT
00:38:38.400 will say in an answer.
00:38:40.100 Right.
00:38:40.460 And so if something
00:38:41.140 is even like close,
00:38:42.220 you know,
00:38:42.440 like they're like,
00:38:42.900 hey, this song
00:38:43.500 can't still be in copyright,
00:38:44.580 you got to show it.
00:38:45.520 And we kind of famously
00:38:46.560 are quite restrictive on that.
00:38:49.120 So you've had...
00:38:49.860 You've had complaints
00:38:50.540 from one programmer
00:38:51.320 who said you guys
00:38:52.120 are basically stealing
00:38:52.820 people's stuff
00:38:53.440 and not paying them
00:38:54.320 and then he wound up
00:38:55.220 murdered.
00:38:56.560 What was that?
00:38:59.060 Also a great tragedy.
00:39:00.580 He committed suicide.
00:39:02.300 Do you think
00:39:02.720 he committed suicide?
00:39:03.500 I really do.
00:39:04.880 This was like a friend of mine.
00:39:06.720 This is like a guy
00:39:07.380 that...
00:39:07.740 I'm not a close friend,
00:39:08.480 but this is someone
00:39:09.040 that worked at OpenAI
00:39:09.660 for a very long time.
00:39:11.360 I spent...
00:39:12.320 I mean,
00:39:12.500 I was really shaken
00:39:13.200 by this tragedy.
00:39:14.360 I spent a lot of time
00:39:15.400 trying to, you know,
00:39:17.540 read everything I could
00:39:18.240 as I'm sure you
00:39:18.840 and others did too
00:39:19.540 about what happened.
00:39:20.540 It looks like a suicide to me.
00:39:24.360 Why does it look
00:39:24.880 like a suicide?
00:39:27.620 It was a gun
00:39:28.580 he had purchased.
00:39:30.240 It was the...
00:39:31.360 This is like gruesome
00:39:32.160 to talk about,
00:39:32.760 but I read the whole
00:39:33.440 medical record.
00:39:34.700 Does it not look like
00:39:35.240 one to you?
00:39:36.020 No, he was definitely
00:39:36.840 murdered, I think.
00:39:38.280 There were signs
00:39:38.920 of a struggle,
00:39:39.720 of course.
00:39:40.460 The surveillance camera,
00:39:42.220 the wires had been cut.
00:39:43.160 He had just ordered
00:39:44.320 takeout food,
00:39:45.560 come back from a vacation
00:39:46.540 with his friends
00:39:48.120 on Catalina Island.
00:39:49.920 No indication at all
00:39:50.880 that he was suicidal.
00:39:52.240 No note.
00:39:53.500 And no behavior.
00:39:54.400 He had just spoken
00:39:55.020 to a family member
00:39:55.620 on the phone.
00:39:56.900 And then he's found dead
00:39:59.160 with blood in multiple rooms.
00:40:00.540 So that's impossible.
00:40:02.040 Seems really obvious
00:40:02.920 he was murdered.
00:40:03.620 Have you talked
00:40:04.020 to the authorities about it?
00:40:05.160 I have not talked
00:40:05.640 to the authorities about it.
00:40:07.360 And his mother claims
00:40:08.940 he was murdered
00:40:09.440 on your orders.
00:40:11.960 Do you believe that?
00:40:13.340 Well, I'm asking...
00:40:14.500 I mean, you just said it,
00:40:16.160 so do you believe that?
00:40:17.960 I think that it is
00:40:20.040 worth looking into.
00:40:22.880 And I don't...
00:40:24.200 I mean, if a guy comes out
00:40:25.500 and accuses your company
00:40:27.880 of committing crimes,
00:40:29.760 I have no idea
00:40:30.280 if that's true or not,
00:40:31.260 of course,
00:40:32.380 and then is found killed
00:40:34.440 and there are signs
00:40:35.140 of a struggle,
00:40:35.660 well, I don't think
00:40:36.580 it's worth dismissing it.
00:40:39.400 I don't think we should say,
00:40:40.020 well, he killed himself
00:40:40.680 when there's no evidence
00:40:41.420 that the guy was depressed
00:40:42.200 at all.
00:40:43.700 I think, and if he was your friend,
00:40:45.260 I would think you would want
00:40:46.180 to speak to his mom or...
00:40:47.960 I did offer.
00:40:48.560 She didn't want to.
00:40:50.760 So, do you feel that,
00:40:56.080 you know, when people look at that
00:40:57.400 and they're like,
00:40:58.440 you know, it's possible
00:40:59.280 that happened,
00:40:59.860 do you feel that
00:41:01.020 that reflects the worries
00:41:03.140 they have about
00:41:04.080 what's happening here?
00:41:05.220 Like, people are
00:41:06.000 afraid
00:41:07.400 that this is like...
00:41:10.040 I haven't done too many interviews
00:41:11.160 where I've been accused of like...
00:41:12.400 Oh, I'm not accusing you at all.
00:41:13.540 I'm just saying
00:41:14.080 his mother says that.
00:41:16.500 I don't think
00:41:17.600 a fair read of the evidence
00:41:18.940 suggests suicide at all.
00:41:21.020 I just don't see that at all.
00:41:22.500 And I also don't understand
00:41:24.400 why
00:41:24.920 the authorities,
00:41:26.840 when
00:41:27.060 there are signs of a struggle
00:41:28.500 and blood in
00:41:29.080 two rooms
00:41:30.260 on a suicide,
00:41:31.180 like, how does that
00:41:31.680 actually happen?
00:41:32.360 And I don't understand
00:41:33.320 how the authorities
00:41:34.460 could just kind of
00:41:35.540 dismiss that as a suicide.
00:41:37.540 I think it's weird.
00:41:40.320 You understand how this
00:41:41.320 sounds like an accusation?
00:41:43.360 Of course.
00:41:44.260 And I, I mean,
00:41:45.100 I certainly,
00:41:46.020 let me just be clear
00:41:47.020 once again,
00:41:47.900 not accusing you
00:41:48.580 of any wrongdoing,
00:41:49.580 but I,
00:41:49.980 I think it's
00:41:51.820 worth finding out
00:41:53.400 what happened.
00:41:53.940 And I don't understand
00:41:54.740 why the city of San Francisco
00:41:55.700 has refused to investigate it
00:41:57.320 beyond just calling it
00:41:58.160 a suicide.
00:41:59.200 I mean,
00:41:59.380 I think they looked
00:42:00.040 into it
00:42:00.520 a couple of times,
00:42:02.340 more than once
00:42:03.040 as I understand it.
00:42:04.000 I saw the,
00:42:05.000 and I will totally say,
00:42:07.040 when I first heard
00:42:07.720 about this,
00:42:08.200 it sounded very suspicious.
00:42:09.300 Yes.
00:42:11.080 And I know you
00:42:12.940 had been involved
00:42:13.780 in,
00:42:14.760 you were asked
00:42:15.820 onto the case.
00:42:16.800 And I,
00:42:17.380 you know,
00:42:17.720 I don't know anything
00:42:18.300 about it.
00:42:18.700 It's not my world.
00:42:19.500 She just reached out cold?
00:42:20.900 She reached out cold?
00:42:21.760 Wow.
00:42:22.440 And,
00:42:22.780 and I spoke to her
00:42:24.420 at great length
00:42:25.160 and it,
00:42:25.820 and it scared
00:42:26.840 the crap out of me.
00:42:28.280 The kid was clearly
00:42:28.860 killed by somebody.
00:42:29.640 That was my conclusion,
00:42:30.840 objectively,
00:42:31.360 with no skin in the game.
00:42:33.200 And you,
00:42:33.780 after reading the latest report?
00:42:35.420 Yes.
00:42:35.800 Like, look,
00:42:36.260 and I immediately called
00:42:37.340 a member of Congress
00:42:37.980 from California,
00:42:38.680 Ro Khanna,
00:42:39.160 and said,
00:42:39.520 this is crazy.
00:42:40.240 You've got to look into this.
00:42:41.760 And nothing ever happened.
00:42:43.600 And I'm like,
00:42:44.280 what is that?
00:42:47.420 Again,
00:42:47.860 I think this is,
00:42:48.840 I feel
00:42:49.400 strange and sad
00:42:53.200 debating this
00:42:54.240 and having to
00:42:54.820 defend myself
00:42:55.300 seems totally crazy.
00:42:56.280 And you are a little bit
00:42:57.000 accusing me,
00:42:57.920 but the,
00:43:00.100 this was like
00:43:02.260 a wonderful person
00:43:03.380 and a family
00:43:03.940 that is clearly struggling.
00:43:05.380 Yes.
00:43:05.860 And,
00:43:07.000 I think you can
00:43:08.900 totally take the point
00:43:09.940 that you're just trying
00:43:10.660 to get to
00:43:11.300 the truth of what happened.
00:43:13.300 And I respect that.
00:43:14.320 But I think his memory
00:43:19.300 and his family
00:43:20.020 deserve to be treated
00:43:21.100 with
00:43:21.520 a level of
00:43:25.760 respect and grief
00:43:28.980 that I don't
00:43:29.700 quite feel here.
00:43:30.920 I'm asking
00:43:31.780 at the behest
00:43:32.360 of his family.
00:43:33.920 So I'm definitely
00:43:34.700 showing them respect.
00:43:36.100 And I'm not accusing you
00:43:37.600 of any involvement
00:43:38.480 in this at all.
00:43:39.620 What I am saying
00:43:40.540 is that
00:43:41.320 the evidence
00:43:41.780 does not suggest suicide,
00:43:42.960 and for the authorities
00:43:44.260 in your city
00:43:45.160 to elide past that
00:43:47.460 and ignore
00:43:49.160 the evidence
00:43:50.180 that any reasonable person
00:43:51.260 would say
00:43:52.160 adds up to a murder.
00:43:54.260 I think it's very weird
00:43:55.620 and it shakes
00:43:56.920 the faith
00:43:57.520 that one has
00:43:58.260 in our system's ability
00:43:59.880 to respond
00:44:00.520 to the facts.
00:44:01.800 So what I was going to say
00:44:02.480 is after the first
00:44:03.500 set of information
00:44:04.920 that came out,
00:44:06.240 I was really like,
00:44:07.200 man,
00:44:07.380 this doesn't look like
00:44:08.200 a suicide.
00:44:08.840 I'm confused.
00:44:09.380 Okay,
00:44:10.080 so I'm not reaching,
00:44:11.400 I'm not being crazy here.
00:44:12.380 Well,
00:44:12.700 but then after
00:44:13.260 the second thing
00:44:14.260 came out
00:44:14.680 and the more detail,
00:44:15.540 I was like,
00:44:16.080 okay.
00:44:16.600 What changed your mind?
00:44:21.500 The second report
00:44:22.920 on the way
00:44:24.660 the bullet entered him
00:44:25.880 and the sort of person
00:44:26.660 who had like
00:44:27.640 followed the
00:44:28.580 sort of
00:44:29.680 likely path
00:44:31.540 of things
00:44:31.860 through the room.
00:44:32.880 I assume you looked
00:44:33.620 at this too.
00:44:34.100 Yes, I did.
00:44:34.720 And what about that
00:44:35.240 didn't change your mind?
00:44:36.940 It just didn't make
00:44:37.680 any sense to me.
00:44:38.460 Why would the security
00:44:39.320 camera wires be cut?
00:44:40.820 And how did
00:44:41.840 he wind up
00:44:44.180 bleeding in two rooms
00:44:45.800 after shooting himself?
00:44:47.600 And why was there
00:44:48.640 a wig
00:44:49.400 in the room
00:44:50.260 that wasn't his?
00:44:51.840 And has there ever
00:44:53.280 been a suicide
00:44:53.920 where there's no indication
00:44:55.020 at all that the person
00:44:55.720 was suicidal
00:44:56.360 who just ordered
00:44:56.860 takeout food?
00:44:57.840 I mean,
00:44:58.660 who orders DoorDash
00:45:00.040 and then shoots himself?
00:45:01.060 I mean, maybe.
00:45:02.120 I've covered a lot of crimes
00:45:03.260 as a police reporter.
00:45:04.200 I've never heard
00:45:04.620 of anything like that.
00:45:05.560 So no,
00:45:05.920 I was even more confused.
00:45:08.460 This is where it gets
00:45:11.860 into, I think,
00:45:12.380 a little bit painful,
00:45:16.560 just not the level
00:45:17.720 of respect I'd hope
00:45:18.660 to show to someone
00:45:19.740 with this kind of mental health struggle.
00:45:20.620 I get it.
00:45:21.220 I totally get it.
00:45:22.380 People do commit suicide
00:45:23.640 without notes a lot.
00:45:25.720 Like that happens.
00:45:26.480 For sure.
00:45:26.940 People definitely order
00:45:27.980 food they like
00:45:28.820 before they commit suicide.
00:45:29.760 Like this is
00:45:30.540 an incredible tragedy
00:45:32.580 and I...
00:45:34.100 That's his family's view
00:45:35.180 and they think
00:45:35.820 it was a murder
00:45:36.400 and that's why
00:45:37.580 I'm asking the question.
00:45:38.580 If I were his family,
00:45:39.580 I am sure
00:45:40.420 I would want answers
00:45:42.100 and I'm sure
00:45:42.740 I would not be satisfied
00:45:44.100 with really any...
00:45:45.420 I mean,
00:45:45.560 there's nothing
00:45:45.960 that would comfort me
00:45:47.060 in that, you know?
00:45:48.060 Right.
00:45:48.280 Like, so I get it.
00:45:50.720 I also care a lot
00:45:51.880 about respect to him.
00:45:54.760 Right.
00:45:58.140 I have to ask
00:45:59.300 your version of
00:46:00.580 Elon Musk has like
00:46:01.620 attacked you
00:46:02.320 and all this is...
00:46:04.380 What is the core
00:46:05.260 of that dispute
00:46:06.000 from your perspective?
00:46:07.540 Look, I know
00:46:08.040 he's a friend of yours
00:46:08.720 and I know
00:46:09.140 what side you'll...
00:46:09.760 I actually don't have
00:46:10.640 a position on this
00:46:11.380 because I don't understand
00:46:12.200 it well enough
00:46:12.780 to understand.
00:46:14.760 He helped us
00:46:16.580 start OpenAI.
00:46:17.400 Yes.
00:46:17.580 I'm very grateful for that.
00:46:19.100 I really,
00:46:19.900 for a long time,
00:46:20.460 looked up to him
00:46:20.960 as just an incredible hero
00:46:22.220 and, you know,
00:46:23.020 great jewel of humanity.
00:46:24.640 I have different feelings now.
00:46:27.100 What are your feelings now?
00:46:29.700 No longer a jewel of humanity?
00:46:32.860 There are things about him
00:46:34.080 that are incredible
00:46:34.860 and I'm grateful
00:46:35.460 for a lot of things he's done.
00:46:36.780 There's a lot of things
00:46:37.360 about him that I think
00:46:38.080 are traits I don't admire.
00:46:41.980 Anyway,
00:46:42.420 he helped us start OpenAI
00:46:43.460 and...
00:46:44.580 he later decided
00:46:48.040 that we weren't
00:46:48.640 on a trajectory
00:46:49.260 to be successful
00:46:50.040 and he didn't want to,
00:46:52.100 you know,
00:46:52.220 he kind of told us
00:46:52.800 we had a 0% chance of success
00:46:54.000 and he was going to go
00:46:54.680 do his competitive thing
00:46:55.600 and then we did okay.
00:46:57.840 And I think he got
00:46:59.000 understandably upset.
00:47:01.800 Like, I'd feel bad
00:47:02.400 in that situation.
00:47:03.060 And since then
00:47:04.580 has just sort of been trying to...
00:47:05.960 He runs a competitive
00:47:06.720 kind of clone
00:47:07.640 and has been trying to
00:47:09.200 sort of slow us down
00:47:10.440 and sue us
00:47:11.420 and do this and that.
00:47:14.240 That's kind of my version of it.
00:47:15.600 I'm sure you'd have a different one.
00:47:16.900 You don't talk to him anymore?
00:47:18.620 Very little.
00:47:21.840 If AI becomes smarter,
00:47:24.980 I think it already probably
00:47:25.880 is smarter than any person.
00:47:28.920 And if it becomes wiser,
00:47:30.600 if we can agree
00:47:31.320 that it reaches
00:47:32.160 better decisions
00:47:33.460 than people,
00:47:35.220 then it,
00:47:36.080 by definition,
00:47:36.720 kind of displaces people
00:47:38.300 at the center
00:47:39.480 of the world, right?
00:47:41.560 I don't think
00:47:42.280 it'll feel like that
00:47:43.060 at all.
00:47:44.340 I think it'll feel like a,
00:47:45.500 you know,
00:47:46.340 really smart computer
00:47:47.260 that may advise us
00:47:48.600 and we listen to it sometimes,
00:47:49.880 we ignore it sometimes.
00:47:51.480 It won't...
00:47:52.060 I don't think it'll feel
00:47:52.740 like agency.
00:47:53.440 I don't think it'll diminish
00:47:54.160 our sense of agency.
00:47:55.940 People are already using
00:47:57.020 ChatGPT in a way
00:47:58.380 where many of them would say
00:47:59.300 it's much smarter than me
00:48:00.180 at almost everything.
00:48:01.460 But they're still
00:48:02.780 making the decisions.
00:48:03.540 They're still deciding
00:48:04.080 what to ask,
00:48:04.640 what to listen to,
00:48:05.240 whatnot.
00:48:06.060 And I think this is sort of
00:48:07.440 just the shape of technology.
00:48:10.020 Who loses their jobs
00:48:11.320 because of this technology?
00:48:16.100 I'll caveat this
00:48:17.220 with the obvious
00:48:19.500 but important statement
00:48:20.360 that no one can predict
00:48:21.980 the future.
00:48:22.960 And I will.
00:48:24.760 And trying to,
00:48:25.900 if I try to answer
00:48:26.640 that precisely,
00:48:27.600 I will make a lot of,
00:48:28.480 I will say like a lot
00:48:29.160 of dumb things,
00:48:30.860 but I'll try to pick
00:48:31.720 an area that I'm confident
00:48:33.300 about and then areas
00:48:34.080 that I'm much less confident
00:48:34.940 about.
00:48:36.160 I'm confident
00:48:36.980 that a lot of current
00:48:38.100 customer support
00:48:39.320 that happens over
00:48:39.980 a phone or computer,
00:48:41.200 those people will lose
00:48:41.920 their jobs
00:48:42.380 and that'll be better
00:48:43.560 done by an AI.
00:48:47.120 Now,
00:48:47.620 there may be other kinds
00:48:48.460 of customer support
00:48:49.120 where you really want
00:48:49.760 to know it's the right person.
00:48:50.760 A job that I'm confident
00:48:52.960 will not be that impacted
00:48:54.880 is like nurses.
00:48:56.420 I think people really want
00:48:57.540 the deep human connection
00:48:58.640 with a person
00:48:59.720 in that time
00:49:00.340 and no matter how good
00:49:01.120 the advice of the AI is
00:49:02.220 or the robot or whatever,
00:49:03.440 like,
00:49:04.000 you'll really want that.
00:49:06.580 A job that I feel like
00:49:07.780 way less certain
00:49:08.380 about what the future
00:49:08.980 looks like for
00:49:10.340 is computer programmers.
00:49:12.340 What it means
00:49:13.000 to be a computer programmer
00:49:13.720 today is very different
00:49:14.980 than what it meant
00:49:15.380 two years ago.
00:49:16.720 You're able to use
00:49:17.400 these AI tools
00:49:18.040 to just be hugely
00:49:18.780 more productive.
00:49:19.300 but it's still
00:49:21.040 a person there
00:49:21.960 and they're like
00:49:22.900 able to generate
00:49:23.860 way more code,
00:49:24.860 make way more money
00:49:25.460 than ever before
00:49:26.080 and it turns out
00:49:27.060 that the world
00:49:27.580 wanted so much
00:49:28.740 more software
00:49:29.380 than the world
00:49:30.460 previously had
00:49:31.600 capacity to create
00:49:32.420 that there's just
00:49:33.360 incredible demand
00:49:34.260 overhang.
00:49:35.420 But if we fast forward
00:49:36.320 another five or ten years,
00:49:37.460 what does that look like?
00:49:38.300 Is it more jobs or less?
00:49:39.340 That one I'm uncertain on.
00:49:40.720 But there's going to be
00:49:41.620 massive displacement
00:49:42.560 and maybe those people
00:49:43.620 will find something new
00:49:44.500 and interesting
00:49:44.980 and lucrative to do
00:49:46.980 but how big
00:49:49.220 is that displacement
00:49:49.960 do you think?
00:49:51.640 Someone told me recently
00:49:52.800 that the historical average
00:49:54.120 is about 50% of jobs
00:49:56.260 significantly change.
00:49:57.600 Maybe they don't totally go away
00:49:58.520 but significantly change
00:49:59.580 every 75 years on average.
00:50:01.600 That's the kind of
00:50:02.160 that's the half-life
00:50:03.020 of the stuff.
00:50:04.920 And my controversial take
00:50:07.980 would be that
00:50:09.080 this is going to be
00:50:11.340 like a punctuated
00:50:12.120 equilibrium moment
00:50:12.940 where a lot of that
00:50:13.680 will happen in a short
00:50:14.340 period of time
00:50:14.800 but if we zoom out
00:50:15.680 it's not going to be
00:50:18.260 dramatically different
00:50:18.900 than the historical rate.
00:50:20.440 Like we'll do
00:50:21.000 we'll have a lot
00:50:22.160 in this short period of time
00:50:23.120 and then it'll
00:50:25.180 somehow be less total
00:50:27.460 job turnover
00:50:28.440 than we think.
00:50:29.640 There will still be a job
00:50:30.500 that is
00:50:30.780 there will be
00:50:31.900 some totally new categories
00:50:33.220 like my job
00:50:34.700 like you know
00:50:35.460 running a tech company
00:50:36.440 it would have been hard
00:50:37.060 to think about
00:50:37.580 200 years ago.
00:50:40.040 But there's a lot
00:50:40.840 of other jobs
00:50:41.460 that are
00:50:42.960 directionally similar
00:50:44.760 to jobs that did exist
00:50:45.680 200 years ago
00:50:46.260 and there's jobs
00:50:46.840 that were common
00:50:47.240 200 years ago
00:50:47.840 that now aren't.
00:50:49.220 And if we
00:50:50.640 again I have no idea
00:50:51.540 if this is true or not
00:50:52.340 but I'll use the number
00:50:53.080 for the sake of argument
00:50:53.820 if we assume
00:50:55.040 it's 50% turnover
00:50:56.200 every 75 years
00:50:57.260 then I could totally
00:50:59.300 believe a world
00:50:59.980 where 75 years from now
00:51:01.340 half the people
00:51:02.040 are doing something
00:51:02.500 totally new
00:51:03.180 and half the people
00:51:04.260 are doing something
00:51:04.760 that looks kind of
00:51:05.400 like some jobs of today.
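Read as a half-life, the secondhand 50%-per-75-years figure he is using for the sake of argument corresponds to a simple decay curve for the fraction \(f(t)\) of today's jobs still largely unchanged after \(t\) years:

\[
f(t) = \left(\tfrac{1}{2}\right)^{t/75}, \qquad f(75) = 0.5, \quad f(150) = 0.25.
\]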
00:51:06.860 I mean last time
00:51:07.720 we had an industrial revolution
00:51:08.740 there was like
00:51:09.320 revolution
00:51:10.240 and world wars.
00:51:11.600 Do you think
00:51:12.900 we'll see that
00:51:14.240 this time?
00:51:17.160 Again no one knows for sure
00:51:18.440 I'm not confident
00:51:19.180 on this answer
00:51:19.720 but my instinct is
00:51:20.720 the world is so much
00:51:21.920 richer now
00:51:22.600 than it was at the time
00:51:23.480 of the industrial revolution
00:51:24.300 that we can actually
00:51:25.560 absorb more change
00:51:27.180 faster
00:51:27.600 than we could before.
00:51:30.280 There's a lot
00:51:30.900 that's not about money
00:51:32.060 in a job,
00:51:32.060 there's meaning
00:51:32.540 there's belonging
00:51:33.000 there's sense of community
00:51:34.000 I think we're already
00:51:36.360 unfortunately in society
00:51:37.340 in a pretty bad place there
00:51:38.320 I'm not sure
00:51:38.680 how much worse
00:51:39.240 it can get
00:51:39.980 I'm sure it can
00:51:41.020 but I have been
00:51:43.400 pleasantly surprised
00:51:44.480 on the ability
00:51:46.260 of people
00:51:47.020 to pretty quickly
00:51:51.060 adapt to big changes
00:51:52.560 like COVID
00:51:54.200 was an interesting
00:51:54.820 example to me
00:51:55.520 of this
00:51:55.920 where the world
00:51:56.500 kind of stopped
00:51:57.220 all at once
00:51:58.340 and the world
00:52:00.040 was like very different
00:52:00.760 from one week
00:52:01.280 to the next
00:52:01.760 and I was very worried
00:52:02.980 about how society
00:52:04.260 was going to be able
00:52:04.800 to adapt
00:52:05.300 to that world
00:52:06.300 and it obviously
00:52:07.260 didn't go perfectly
00:52:08.040 but on the whole
00:52:09.340 I was like
00:52:09.780 alright
00:52:10.020 this is one point
00:52:10.960 in favor
00:52:11.360 of societal resilience
00:52:12.280 and people find
00:52:13.180 you know
00:52:14.100 new kind of ways
00:52:14.680 to live their lives
00:52:15.280 very quickly
00:52:15.740 I don't think
00:52:16.620 AI will be
00:52:17.160 nearly that abrupt
00:52:17.920 so what will be
00:52:19.620 the downside
00:52:20.300 I mean I can see
00:52:21.100 the upsides
00:52:21.780 for sure
00:52:22.400 efficiency
00:52:23.560 medical diagnosis
00:52:24.960 seems like
00:52:25.500 it's going to be
00:52:26.020 much more accurate
00:52:27.000 fewer lawyers
00:52:28.340 thank you very much
00:52:29.860 for that
00:52:30.280 but what are the
00:52:32.520 downsides
00:52:32.920 that you worry about
00:52:33.860 I think this is
00:52:36.620 just like
00:52:36.920 kind of how I'm wired
00:52:37.580 I always worry
00:52:38.160 the most about
00:52:38.700 the unknown unknowns
00:52:39.540 if it's a downside
00:52:41.700 that we can really
00:52:42.860 like be confident
00:52:43.660 about and think about
00:52:44.540 you know
00:52:45.960 we talked about
00:52:46.340 one earlier
00:52:46.700 which is
00:52:47.000 these models
00:52:47.400 are getting
00:52:47.760 very good at bio
00:52:48.720 and they could
00:52:49.080 help us design
00:52:49.800 biological weapons
00:52:51.360 you know
00:52:52.160 engineer like
00:52:52.720 another COVID
00:52:53.220 style pandemic
00:52:53.820 I worry about that
00:52:55.220 but because we
00:52:55.780 worry about it
00:52:56.340 I think we
00:52:57.200 and many other
00:52:57.740 people in the industry
00:52:58.320 are thinking hard
00:52:58.940 about how to
00:52:59.240 mitigate that
00:52:59.820 the unknown unknowns
00:53:02.660 where okay
00:53:03.760 there's like a
00:53:04.500 there's a societal
00:53:05.640 scale effect
00:53:06.420 from a lot of people
00:53:07.560 talking to the same model
00:53:08.520 at the same time
00:53:09.200 this is like
00:53:10.260 a silly example
00:53:11.080 but it's one
00:53:11.600 that struck me
00:53:12.260 recently
00:53:13.040 LLMs
00:53:15.120 like ours
00:53:15.740 and our language
00:53:16.380 model and others
00:53:16.940 have a
00:53:17.800 kind of certain
00:53:18.900 style to them
00:53:19.580 you know
00:53:20.500 they talk in a
00:53:21.640 certain rhythm
00:53:23.380 and they have a
00:53:23.900 little bit unusual
00:53:24.340 diction
00:53:24.740 and maybe they
00:53:25.340 overuse
00:53:25.780 em dashes
00:53:26.320 and whatever
00:53:26.680 and I noticed
00:53:28.080 recently
00:53:28.540 that real people
00:53:29.640 have like
00:53:29.940 picked that up
00:53:30.560 and it was an
00:53:32.460 example for me
00:53:33.060 of like man
00:53:33.600 you have enough
00:53:34.180 people talking
00:53:35.120 to the same
00:53:35.460 language model
00:53:36.000 and it actually
00:53:37.220 does cause a
00:53:39.180 change in societal
00:53:40.140 scale behavior
00:53:40.920 yes
00:53:41.660 and you know
00:53:43.840 did I think
00:53:45.700 that chat GPT
00:53:46.760 was going to make
00:53:47.460 people use way more
00:53:48.440 em dashes in real life
00:53:49.420 certainly not
00:53:50.240 it's not a big deal
00:53:51.500 but it's an example
00:53:52.500 of where there can be
00:53:53.700 these unknown unknowns
00:53:54.580 of this is just like
00:53:55.580 this is a brave new world
00:53:56.800 so you're saying
00:53:59.360 I think correctly
00:54:00.320 and succinctly
00:54:02.020 that technology
00:54:03.040 changes human behavior
00:54:04.700 of course
00:54:05.640 and changes our
00:54:06.540 assumptions about the
00:54:07.380 world and each other
00:54:08.080 and all that
00:54:08.680 and a lot of this
00:54:10.460 you can't predict
00:54:11.040 but considering
00:54:11.820 that we know that
00:54:12.920 why shouldn't
00:54:14.960 the internal
00:54:15.840 moral framework
00:54:17.040 of the technology
00:54:19.280 be totally
00:54:20.180 transparent
00:54:20.840 we prefer this
00:54:23.320 to that
00:54:23.780 I mean
00:54:24.040 this is obviously
00:54:25.380 a religion
00:54:25.880 I don't think
00:54:26.760 you'll agree
00:54:27.400 to call it that
00:54:28.040 it's very clearly
00:54:28.980 a religion to me
00:54:29.880 that's not an attack
00:54:31.020 I actually would love
00:54:31.580 I don't take that
00:54:32.080 as an attack
00:54:32.340 but I would love
00:54:32.740 to hear
00:54:33.120 what you mean by that
00:54:34.120 well it's
00:54:35.640 it's something
00:54:36.180 that we assume
00:54:37.600 is more
00:54:38.420 powerful than people
00:54:40.340 and to which
00:54:42.720 we look for
00:54:43.460 guidance
00:54:44.100 I mean you're
00:54:44.960 already seeing that
00:54:45.780 on display
00:54:46.540 what's the right
00:54:47.840 decision
00:54:48.340 I ask that question
00:54:49.340 of whom
00:54:50.060 my closest friends
00:54:50.860 my wife
00:54:51.440 and God
00:54:52.220 and this is a
00:54:54.080 technology that
00:54:54.920 provides a more
00:54:55.800 certain answer
00:54:56.320 than any person
00:54:56.820 can provide
00:54:57.620 so it's a
00:54:58.080 it's a religion
00:54:58.720 and the beauty
00:54:59.780 of religions
00:55:00.280 is they have
00:55:01.220 a catechism
00:55:02.120 that is transparent
00:55:03.680 I know what the
00:55:04.540 religion stands for
00:55:05.220 here's what it's for
00:55:05.980 here's what it's against
00:55:06.820 but in this case
00:55:08.260 I pressed
00:55:09.340 and I wasn't
00:55:10.140 attacking you
00:55:10.760 sincerely
00:55:11.220 I was not attacking you
00:55:11.900 but I was trying
00:55:12.300 to get to the heart
00:55:12.980 of it
00:55:13.380 the beauty
00:55:14.900 of a religion
00:55:15.300 is it admits
00:55:16.140 it's a religion
00:55:16.740 and it tells you
00:55:17.600 what it stands for
00:55:18.560 the unsettling part
00:55:20.680 of this technology
00:55:21.360 not just your company
00:55:22.320 but others
00:55:22.840 is that I don't know
00:55:23.820 what it stands for
00:55:24.560 but it does stand
00:55:25.200 for something
00:55:25.760 and unless it
00:55:27.200 admits that
00:55:29.020 and tells us
00:55:29.860 what it stands for
00:55:30.660 then it guides us
00:55:31.640 in a kind of
00:55:32.140 stealthy way
00:55:32.940 toward a conclusion
00:55:34.500 we might not even
00:55:35.040 know we're reaching
00:55:35.580 do you see what I'm saying
00:55:36.040 so like why not
00:55:36.660 just throw it open
00:55:37.260 and say
00:55:37.580 ChatGPT is for this
00:55:39.500 or you know
00:55:39.980 we're for suicide
00:55:40.860 for the terminally ill
00:55:41.580 but not for kids
00:55:42.560 or whatever
00:55:43.240 like why not just tell us
00:55:44.280 I mean the reason
00:55:45.100 we write this long
00:55:46.020 model spec
00:55:46.620 and the reason
00:55:47.420 we keep expanding
00:55:48.040 over time
00:55:48.600 is so that you can see
00:55:50.100 here is how
00:55:50.780 we intend
00:55:51.800 for the model
00:55:52.280 to behave
00:55:52.740 what used to happen
00:55:54.800 before we had this
00:55:55.560 is people would
00:55:56.120 fairly say
00:55:56.920 I don't know
00:55:57.680 what the model
00:55:58.080 is even trying to do
00:55:58.880 and I don't know
00:55:59.480 if this is a bug
00:56:00.080 or the intended behavior
00:56:00.980 tell me what
00:56:02.000 this long long document
00:56:03.740 of you know
00:56:04.860 tell me how you're
00:56:05.620 going to like
00:56:06.000 when you're going
00:56:06.640 to do this
00:56:07.280 and when you're
00:56:07.600 going to show me
00:56:08.020 this
00:56:08.180 and when you're
00:56:08.480 going to say
00:56:08.760 you won't do that
00:56:09.480 the reason we try
00:56:10.620 to write this all out
00:56:11.300 is I think people
00:56:11.820 do need to know
00:56:12.500 and so is there
00:56:14.640 a place you can go
00:56:15.320 to find out
00:56:15.980 a hard answer
00:56:17.240 to what your preferences
00:56:18.960 as a company
00:56:20.380 are, preferences
00:56:20.380 that are being
00:56:20.960 transmitted
00:56:21.960 in a not
00:56:23.400 entirely
00:56:23.980 straightforward way
00:56:24.660 to the globe
00:56:25.280 like where can you
00:56:26.060 find out
00:56:26.600 what the company
00:56:27.860 stands for
00:56:28.360 what it prefers
00:56:29.060 I mean our model
00:56:30.240 spec is the
00:56:31.040 like answer to that
00:56:32.540 now I think we
00:56:33.400 will have to
00:56:34.060 make it
00:56:35.040 increasingly more
00:56:36.120 detailed over time
00:56:37.100 as people use this
00:56:38.260 in different countries
00:56:38.880 there's different laws
00:56:39.520 whatever else
00:56:40.120 like it will not
00:56:41.240 be a
00:56:41.660 it will not work
00:56:44.420 the same way
00:56:44.920 for every user
00:56:45.460 everywhere
00:56:45.860 but
00:56:46.600 and that doc
00:56:47.840 so that I
00:56:48.220 expect that document
00:56:49.200 to get very long
00:56:50.080 and very complicated
00:56:50.820 but that's why
00:56:51.580 we have it
00:56:52.100 let me ask you
00:56:53.020 one last question
00:56:53.660 and maybe you can
00:56:54.520 allay this fear
00:56:55.200 that the power
00:56:56.960 of the technology
00:56:57.560 will make it difficult
00:56:58.380 impossible for
00:56:59.300 anyone to discern
00:57:00.620 the difference
00:57:01.000 between reality
00:57:01.660 and fantasy
00:57:02.420 this is a
00:57:03.100 famous concern
00:57:04.320 but
00:57:04.660 because it is
00:57:06.380 so skilled
00:57:07.100 at mimicking
00:57:07.640 people
00:57:08.380 and their speech
00:57:09.480 and their images
00:57:09.980 that it will
00:57:10.400 require some way
00:57:11.620 to verify
00:57:12.140 that you are
00:57:13.020 who you say
00:57:14.080 you are
00:57:14.520 that will
00:57:15.820 by definition
00:57:16.740 require biometrics
00:57:17.880 which will
00:57:18.840 by definition
00:57:19.900 eliminate privacy
00:57:20.820 for every person
00:57:21.860 in the world
00:57:23.740 I
00:57:24.340 don't think
00:57:25.840 we need to
00:57:27.000 or should require
00:57:27.800 biometrics
00:57:28.320 to use the technology
00:57:29.080 I don't
00:57:30.640 like
00:57:30.920 I think you should
00:57:31.400 just be able
00:57:31.720 to use
00:57:33.020 ChatGPT
00:57:34.740 from like
00:57:35.340 any computer
00:57:35.860 yeah
00:57:37.700 well I
00:57:38.060 I strongly agree
00:57:39.120 but then
00:57:39.600 at a certain point
00:57:40.420 when
00:57:41.400 you know
00:57:42.400 images
00:57:42.860 or sounds
00:57:43.960 that mimic
00:57:44.720 a person
00:57:45.520 you know
00:57:46.980 it just becomes
00:57:47.460 too easy
00:57:48.180 to empty
00:57:48.520 your checking
00:57:48.980 account
00:57:49.360 with that
00:57:50.100 so like
00:57:50.680 what do you
00:57:51.080 do about that
00:57:51.760 a few thoughts
00:57:52.580 there
00:57:52.760 one
00:57:53.160 I think
00:57:54.080 we are rapidly
00:57:54.820 heading to a world
00:57:55.760 where people
00:57:56.440 understand
00:57:57.040 that
00:57:57.880 if you get
00:57:59.520 a phone call
00:58:00.880 from someone
00:58:01.300 that sounds
00:58:01.680 like your kid
00:58:02.200 or your parent
00:58:02.780 or if you see
00:58:03.900 an image
00:58:04.260 that looks
00:58:04.760 real
00:58:05.400 you have to
00:58:05.980 really have
00:58:06.660 some way
00:58:07.020 to verify
00:58:07.460 that you're
00:58:08.260 not being
00:58:09.080 scammed
00:58:09.460 and this is
00:58:10.220 not like
00:58:10.540 this is no longer
00:58:12.120 a theoretical concern
00:58:12.120 you know
00:58:12.400 you hear all
00:58:12.900 these reports
00:58:13.300 at all
00:58:13.760 yeah
00:58:14.000 people are smart
00:58:16.500 society is resilient
00:58:17.460 I think people
00:58:18.160 are quickly
00:58:18.700 understanding
00:58:19.280 that this is
00:58:20.440 now a thing
00:58:20.860 that bad actors
00:58:21.540 are using
00:58:21.980 and people
00:58:24.800 are understanding
00:58:25.400 that you got
00:58:25.740 to verify
00:58:26.220 in different ways
00:58:26.980 I suspect
00:58:27.880 that in addition
00:58:28.860 to things
00:58:29.400 like family
00:58:30.300 members having
00:58:30.880 code words
00:58:31.460 they use
00:58:31.880 in crisis
00:58:32.460 situations
00:58:33.060 we'll see
00:58:34.800 things like
00:58:35.420 when a president
00:58:36.680 of a country
00:58:37.260 has to issue
00:58:38.120 an urgent message
00:58:38.940 they cryptographically
00:58:40.120 sign it
00:58:40.540 or otherwise
00:58:41.020 somehow guarantee
00:58:41.740 its authenticity
00:58:42.360 so you don't
00:58:43.420 have like
00:58:43.920 generated videos
00:58:44.940 of Trump saying
00:58:45.700 I've just done
00:58:46.220 this or that
00:58:46.780 and people
00:58:47.280 I think people
00:58:48.060 are learning
00:58:48.440 quickly
00:58:48.920 that this is
00:58:51.500 a new thing
00:58:53.140 that bad guys
00:58:53.960 are doing
00:58:54.180 with the technology
00:58:54.740 they have to
00:58:55.220 contend with
00:58:55.780 and I think
00:58:58.220 that is
00:58:58.560 most of
00:58:59.520 the solution
00:59:01.160 which is
00:59:01.640 people will have
00:59:02.400 people will
00:59:03.660 by default
00:59:04.180 not trust
00:59:05.160 convincing looking
00:59:06.020 media
00:59:06.360 and we will
00:59:07.900 build new
00:59:08.360 mechanisms
00:59:08.760 to verify
00:59:09.300 authenticity
00:59:09.840 of communication
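A minimal sketch of the kind of cryptographic signing being described here, assuming the third-party Python cryptography package and Ed25519 keys; the key pair and message are illustrative only, not any scheme OpenAI or any government actually uses:

```python
# Minimal sketch: signing and verifying a public message with Ed25519.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The issuer generates a key pair once and publishes the public key widely.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# Before release, the issuer signs the exact bytes of the message.
message = b"Urgent official statement: example text only"
signature = private_key.sign(message)

# Anyone holding the published public key can check authenticity.
try:
    public_key.verify(signature, message)
    print("Signature valid: the message came from the key holder.")
except InvalidSignature:
    print("Signature invalid: treat the message as untrusted.")
```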
00:59:10.820 but those
00:59:11.620 will have to
00:59:11.980 be biometric
00:59:12.580 no not
00:59:13.780 I mean
00:59:14.580 if
00:59:15.060 I mean
00:59:16.120 like
00:59:16.380 if
00:59:17.240 the president
00:59:17.780 of the US
00:59:18.260 has
00:59:18.600 I understand
00:59:20.200 that
00:59:20.400 but I mean
00:59:20.660 for the average
00:59:21.160 on the average
00:59:21.880 day
00:59:22.240 you're not
00:59:22.620 sort of
00:59:22.880 waiting
00:59:23.100 for the president
00:59:23.680 to announce
00:59:25.020 a war
00:59:25.560 you're
00:59:26.120 trying to do
00:59:27.180 e-commerce
00:59:27.660 and like
00:59:28.100 how could
00:59:28.420 you do
00:59:28.940 well I think
00:59:30.220 like with your
00:59:30.580 family
00:59:30.940 you'll have
00:59:31.380 a code word
00:59:32.080 that you
00:59:33.000 change periodically
00:59:34.440 and if you're
00:59:34.920 communicating
00:59:35.280 with each other
00:59:35.860 and you get
00:59:36.280 a call
00:59:36.640 like you ask
00:59:37.160 what the code word
00:59:37.740 is
00:59:38.020 but it's very
00:59:38.680 different than
00:59:39.020 a biometric
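A minimal sketch of the family code-word check he describes, using only the Python standard library; the code word itself and the rotation arrangement are hypothetical:

```python
# Minimal sketch: checking a caller against a pre-shared family code word.
# The code word is agreed on and rotated out of band (e.g. in person),
# never over the channel being verified.
import hmac

FAMILY_CODE_WORD = "blue-heron-42"  # hypothetical; rotate periodically

def caller_knows_code_word(spoken: str) -> bool:
    # compare_digest does a constant-time comparison of the two values.
    return hmac.compare_digest(spoken.encode(), FAMILY_CODE_WORD.encode())

print(caller_knows_code_word("blue-heron-42"))  # True
print(caller_knows_code_word("wrong-word"))     # False
```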
00:59:39.480 so you don't
00:59:41.420 envision
00:59:41.880 I mean
00:59:43.380 to board a plane
00:59:44.480 commercial flight
00:59:45.660 you know
00:59:47.120 biometrics are
00:59:48.460 part of the process
00:59:50.000 now
00:59:50.280 you don't see
00:59:50.940 that as becoming
00:59:51.800 society wide
00:59:53.560 mandatory
00:59:54.100 very soon
00:59:55.080 along
00:59:55.600 I hope
00:59:57.480 I really hope
00:59:58.300 it doesn't become
00:59:58.760 mandatory
00:59:59.080 I think
00:59:59.660 there are
01:00:00.240 versions of
01:00:00.980 privacy preserving
01:00:02.940 biometrics
01:00:03.580 that I like
01:00:04.420 much more
01:00:05.640 than
01:00:06.020 like
01:00:07.680 collecting a lot
01:00:08.480 of personal
01:00:08.880 digital information
01:00:09.600 on someone
01:00:10.040 but I don't
01:00:10.740 think they
01:00:10.940 should be
01:00:11.200 I don't
01:00:11.620 think biometrics
01:00:12.280 should be
01:00:12.540 mandatory
01:00:12.900 I don't
01:00:14.040 think you
01:00:14.220 should like
01:00:14.460 have to
01:00:14.740 provide biometrics
01:00:15.400 to get on
01:00:15.680 an airplane
01:00:15.940 for example
01:00:16.520 what about
01:00:17.240 for banking
01:00:18.180 I don't
01:00:19.280 think you
01:00:19.420 should have
01:00:19.600 to for
01:00:19.800 banking
01:00:20.040 I might
01:00:22.020 prefer to
01:00:22.420 like I
01:00:22.720 might prefer
01:00:23.320 like
01:00:24.100 you know
01:00:25.580 like a
01:00:26.440 fingerprint scan
01:00:27.000 to access
01:00:27.380 my bitcoin
01:00:27.860 wallet
01:00:28.160 than like
01:00:28.500 giving all
01:00:28.740 my information
01:00:29.160 to a bank
01:00:29.640 but that
01:00:29.900 should be
01:00:30.080 a decision
01:00:30.420 for me
01:00:30.820 I appreciate
01:00:32.860 it
01:00:33.140 thank you
01:00:33.600 Sam Altman
01:00:33.960 thank you
01:00:35.040 we want
01:00:39.220 to thank
01:00:39.460 you for
01:00:39.700 watching us
01:00:40.180 on Spotify
01:00:40.800 a company
01:00:41.360 that we
01:00:41.740 use every
01:00:42.420 day
01:00:42.620 we know
01:00:42.840 the people
01:00:43.100 who run
01:00:43.380 it
01:00:43.500 good people
01:00:44.200 while you're
01:00:45.200 here
01:00:45.500 do us a
01:00:46.080 favor
01:00:46.260 hit follow
01:00:47.360 and tap
01:00:48.160 the bell
01:00:48.800 so you
01:00:49.120 never miss
01:00:49.540 an episode
01:00:49.960 we have
01:00:51.120 real conversations
01:00:51.820 news
01:00:52.300 things that
01:00:52.720 actually matter
01:00:53.560 telling the
01:00:54.160 truth
01:00:54.340 always
01:00:54.820 you will
01:00:55.260 not miss
01:00:55.720 it
01:00:55.960 if you
01:00:56.560 follow
01:00:56.940 us
01:00:57.360 on Spotify
01:00:57.840 and hit
01:00:58.140 the bell
01:00:58.440 we appreciate it.
01:00:59.240 thanks for
01:00:59.620 watching