Sam Altman on God, Elon Musk and the Mysterious Death of His Former Employee
Episode Stats
Words per Minute
186
Summary
ChatGPT and other AIs can reason and make independent judgments. They produce results that were not programmed in. They seem like they're alive. Are they alive? Is it alive? No, and I don't think they seem alive, but I understand where that comes from. They don't have, like, a sense of agency or autonomy. The more you use them, I think, the more the kind of illusion breaks.
Transcript
00:00:08.880
They produce results that were not programmed in.
00:00:27.780
Like they're just sitting there kind of waiting.
00:00:29.360
They don't have like a sense of agency or autonomy.
00:00:37.440
Like they can do things that maybe don't seem alive,
00:00:47.420
of the development of the technology who said they lie.
00:00:57.900
What's the distinction between hallucinating and lying?
00:01:00.600
If you ask, again, this has gotten much better,
00:01:14.240
President Tucker Carlson of the United States born,
00:01:18.220
I don't think Tucker Carlson was ever president of the United States.
00:01:22.820
that was not the most likely response in the training data.
00:01:30.600
The users told me that there was President Tucker Carlson.
00:01:36.140
And we figured out how to mostly train that out.
00:01:40.840
but it is, I think it is something we will get fully solved.
00:01:46.380
in the GPT-5 era, a huge amount of progress towards that.
00:01:48.700
But even what you just described seems like an act of will
00:01:54.500
And so I'm just, I've just watched a demonstration of it.
00:02:12.100
is it's sort of calculating through its weights
00:02:16.580
It was, the user must know what they're talking about.
00:02:18.980
And so mathematically, the most likely answer is a number.
00:02:21.880
Now, again, we figured out how to overcome that.
00:02:51.680
feels like it's beyond just a really fancy calculator.
00:03:42.280
And I would say I have like a fairly traditional
00:04:30.280
It does not feel like a spontaneous accident, yeah.
00:04:34.920
I don't think I know like exactly what happened,
00:05:16.600
I used to worry about something like that much more.
00:05:21.880
I used to worry a lot about the concentration of power
00:07:36.840
Like this is learning the kind of collective experience,
00:07:57.680
here are the rules we'd like the model to follow.