The Joe Rogan Experience - October 06, 2023


Joe Rogan Experience #2044 - Sam Altman


Episode Stats

Length

2 hours and 36 minutes

Words per Minute

162.3

Word Count

25,394

Sentence Count

1,864

Misogynist Sentences

8

Hate Speech Sentences

11


Summary

In this episode of the Joe Rogan Experience podcast, I sit down with Sam Altman, CEO of OpenAI, to talk about the impact of artificial intelligence on the way we live and its potential to change our world in the coming decades. Sam has been working on this for a long time, and we get into what AI means for our jobs and the economy, and how we can prepare for the changes it will bring. I think you're going to get a lot out of this episode if you're curious about the future of AI, what it means for us, and how the world is going to change in the next 50 years. I hope you enjoy this episode, and if you do, please share it with a friend or colleague who's interested in learning more about AI. Tweet me and let me know what you think!

Timestamps:
4:00 - What are your thoughts on AI?
6:30 - How is AI changing the world?
8:20 - What will AI's role be in the future?
9:40 - What is AI's impact on the world?
11:15 - How will AI replace our jobs?
12:00 - How will AI change our way of life?
13:30 - Should AI replace human labor?
16:10 - What's the best way to prepare for AI?
19:40 - Can AI be a good thing?
21:40 - What are we waiting for?
22:00 - Is AI a problem?
23:00 - Is AI going to replace human creativity?
26:00 - Can AI replace us?
27:00 - Are we ready for AI in 50 years?
30:00 - Will AI be better?
35:00 - Does AI have a place in the world?
36:00 - Do we need to be more intelligent?


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:13.000 Hello, Sam.
00:00:13.000 What's happening?
00:00:14.000 Not much.
00:00:15.000 Thanks for coming in here.
00:00:15.000 Appreciate it.
00:00:16.000 Thanks for having me.
00:00:17.000 So, what have you done?
00:00:20.000 Like ever?
00:00:21.000 No, I mean, what have you done with AI? I mean, it's one of the things about this is, I mean, I think everyone is fascinated by it.
00:00:32.000 I mean, everyone is absolutely blown away at the current capability and wondering what the potential for the future is and whether or not that's a good thing.
00:00:45.000 I think it's going to be a great thing, but I think it's not going to be all a great thing.
00:00:49.000 And that is where...
00:00:52.000 I think that's where all of the complexity comes in for people.
00:00:56.000 It's not this like clean story of we're going to do this and it's all going to be great.
00:01:00.000 It's we're going to do this, it's going to be net great, but it's going to be like a technological revolution.
00:01:05.000 It's going to be a societal revolution.
00:01:07.000 And those always come with change.
00:01:10.000 And even if it's like net wonderful, you know, there's things we're going to lose along the way.
00:01:15.000 Some kinds of jobs, some kinds of parts of our way of life, some parts of the way we live are going to change or go away.
00:01:20.000 And no matter how tremendous the upside is there, and I believe it will be tremendously good.
00:01:26.000 There's a lot of stuff we've got to navigate through to make sure.
00:01:30.000 That's a complicated thing for anyone to wrap their heads around, and there's deep and super understandable emotions around that.
00:01:39.000 That's a very honest answer, that it's not all going to be good.
00:01:43.000 But it seems inevitable at this point.
00:01:48.000 Yeah, I mean, it's definitely inevitable.
00:01:50.000 My view of the world, you know, when you're like a kid in school, you learn about this technological revolution and then that one and then that one.
00:01:57.000 And my view of the world now sort of looking backwards and forwards is that this is like one long technological revolution.
00:02:05.000 And we had...
00:02:06.000 Sure, like first we had to figure out agriculture so that we had the resources and time to figure out how to build machines, then we got this industrial revolution, and that made us learn about a lot of stuff, a lot of other scientific discovery too, let us do the computer revolution, and that's now letting us, as we scale up to these massive systems,
00:02:22.000 do the AI revolution.
00:02:28.000 I think it's the... You know, although we do have these things to navigate and there will be these downsides, if you think about what it means for the world and for people's quality of lives,
00:02:46.000 if we can get to a world where the cost of intelligence, and the abundance that comes with that... the cost dramatically falls.
00:02:58.000 The abundance goes way up.
00:03:00.000 I think we'll do the same thing with energy.
00:03:01.000 And I think those are the two sort of key inputs to everything else we want.
00:03:05.000 So if we can have abundant and cheap energy and intelligence, that will transform people's lives largely for the better.
00:03:11.000 And I think it's going to, in the same way that if we could go back now 500 years and look at someone's life, we'd say, well, there's some great things, but they didn't have this.
00:03:20.000 They didn't have that.
00:03:20.000 Can you believe they didn't have modern medicine?
00:03:23.000 That's what people are going to look back at us like, but in 50 years.
00:03:26.000 When you think about the people that currently rely on jobs that AI will replace, when you think about whether it's truck drivers or automation workers, people that work in factory assembly lines, what,
00:03:41.000 if anything, what strategies can be put in place to mitigate the negative downsides of those jobs being eliminated by AI? I'll talk about some general thoughts, but I find making very specific predictions difficult because the way the technology goes has been so different than even my own intuitions,
00:04:07.000 or certainly my own intuitions.
00:04:08.000 Maybe we should stop there and back up a little bit.
00:04:11.000 What were your initial thoughts?
00:04:15.000 If you had asked me 10 years ago, I would have said, first, AI is going to come for blue collar labor, basically.
00:04:22.000 It's going to drive trucks and do factory work and, you know, it'll handle heavy machinery.
00:04:28.000 Then maybe after that it'll do some kinds of cognitive labor, but it won't be off doing what I think of personally as the really hard stuff.
00:04:38.000 It won't be off proving new mathematical theorems.
00:04:41.000 It won't be off discovering new science.
00:04:45.000 It won't be off writing code.
00:04:47.000 And then eventually, maybe, but maybe last of all, maybe never, because human creativity is this magic special thing, last of all it'll come for the creative jobs.
00:04:57.000 That's what I would have said.
00:05:19.000 But you really still need the human there today.
00:05:21.000 And then B, it's going exactly the other direction.
00:05:23.000 Could do the creative work first.
00:05:25.000 Stuff like coding second.
00:05:26.000 They can do things like other kinds of cognitive labor third.
00:05:30.000 And we're the furthest away from like humanoid robots.
00:05:35.000 So back to the initial question.
00:05:38.000 If we do have something that completely eliminates factory workers, completely eliminates truck drivers, delivery drivers, things along those lines, that creates this massive vacuum in our society.
00:05:55.000 Yeah.
00:05:55.000 So I think there's things that we're gonna do that are good to do but not sufficient.
00:06:03.000 So I think at some point we will do something like a UBI or some other kind of like very long-term unemployment insurance something.
00:06:11.000 But we'll have some way of giving people like redistributing money in society as a cushion for people as people figure out the new jobs.
00:06:21.000 Maybe I should touch on that.
00:06:23.000 I'm not a believer at all that there won't be lots of new jobs.
00:06:27.000 I think human creativity, desire for status, wanting different ways to compete, invent new things, feel part of a community, feel valued, that's not going to go anywhere.
00:06:38.000 People have worried about that forever.
00:06:40.000 What happens is we get better tools and we just invent new things and more amazing things to do.
00:06:47.000 And there's a big universe out there.
00:06:48.000 And I think I mean that like literally in that there's like space is really big, but also there's just so much stuff we can all do if we do get to this world of abundant intelligence where you can sort of just think of a new idea and it gets created.
00:07:06.000 But again, to the point we started with, that doesn't provide great solace to people who are losing their jobs today.
00:07:15.000 So saying there's going to be this great indefinite stuff in the future.
00:07:19.000 People are like, what are we doing today?
00:07:22.000 I think we will, as a society, do things like UBI and other ways of redistribution, but I don't think that gets at the core of what people want.
00:07:31.000 I think what people want is agency, self-determination, the ability to play a role in architecting the future along with the rest of society, the ability to express themselves and create something meaningful to them.
00:07:46.000 And also, I think a lot of people work jobs they hate, and I think we as a society are always a little bit confused about whether we want to work more or work less.
00:07:57.000 But somehow, we all get to do something meaningful and we all get to play our role in driving the future forward.
00:08:09.000 That's really important.
00:08:10.000 And what I hope is as those long-haul truck driving jobs go away, which people have been wrong about predicting how fast that's going to happen, but it's going to happen, we figure out not just a way to solve the economic problem by,
00:08:27.000 like, giving people the equivalent of money every month, but that there's a way that—and we have a lot of ideas about this—there's a way that we, like, share ownership and decision-making over the future.
00:08:41.000 Something I say a lot about AGI is that everyone realizes we're going to have to share the benefits of it, but we also have to share the decision-making over it, and access to the system itself.
00:08:54.000 I'd be more excited about a world where we say, rather than give everybody on Earth one eight-billionth of the AGI money, which we should do too, we say, you get a one eight-billionth slice of the system.
00:09:09.000 You can sell it to somebody else.
00:09:11.000 You can sell it to a company.
00:09:12.000 You can pool it with other people.
00:09:13.000 You can use it for whatever creative pursuit you want.
00:09:16.000 You can use it to figure out how to start some new business.
00:09:19.000 And with that, you get sort of like a voting right over how this is all going to be used.
00:09:25.000 And so the better the AGI gets, the more your little one eight-billionth ownership is worth to you.
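
To make the arithmetic of that idea concrete, here is a minimal Python sketch, assuming a round world population of eight billion and a purely hypothetical total system value. The class name and the dollar figure are illustrative assumptions, not anything proposed in the conversation beyond an equal share that carries both economic value and a proportional vote.

    # Minimal sketch of the "equal slice of the AGI" idea from the conversation.
    # All names and numbers here are illustrative assumptions, not a real proposal.

    WORLD_POPULATION = 8_000_000_000  # assumed round figure

    class AGIShare:
        """One person's equal slice of a hypothetical AGI system."""

        def __init__(self, owner: str, fraction: float = 1 / WORLD_POPULATION):
            self.owner = owner
            self.fraction = fraction  # ownership share, also used as voting weight

        def value(self, total_system_value: float) -> float:
            """Economic value of this slice for an assumed total system value."""
            return self.fraction * total_system_value

        def voting_weight(self) -> float:
            """Ownership carries a proportional vote, per the idea described above."""
            return self.fraction

    # Example: a hypothetical $100 trillion total value gives each slice $12,500
    # and a one-eight-billionth vote.
    share = AGIShare(owner="anyone")
    print(share.value(100e12))    # 12500.0
    print(share.voting_weight())  # 1.25e-10

Under those assumed numbers, the value of each slice rises with the system as a whole, which is the point being made about the better the AGI gets, the more the slice is worth.
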
00:09:31.000 We were joking around the other day on the podcast where I was saying that what we need is an AI government.
00:09:37.000 We should have an AI president and have AI run things.
00:09:41.000 Just make all the decisions?
00:09:42.000 Yeah.
00:09:42.000 Have something that's completely unbiased, absolutely rational, has the accumulated knowledge of the entire human history at its disposal, including all knowledge of psychology and psychological study, including UBI,
00:09:58.000 because that comes with a host of, you know, pitfalls and issues that people have with it.
00:10:04.000 So I'll say something there.
00:10:06.000 I think we're still very far away from a system that is capable enough and reliable enough that any of us would want that.
00:10:14.000 But I'll tell you something I love about that.
00:10:17.000 Someday, let's say that thing gets built.
00:10:19.000 The fact that it can go around and talk to every person on Earth, understand their exact preferences at a very deep level, how they think about this issue and that one, and how they balance the trade-offs and what they want, and then understand all of that and collectively optimize for the collective preferences of humanity or of citizens of the US,
00:10:40.000 that's awesome.
00:10:41.000 As long as it's not co-opted, right?
00:10:45.000 Our government currently is co-opted.
00:10:47.000 That's for sure.
00:10:47.000 We know for sure that our government is heavily influenced by special interests.
00:10:53.000 If we could have an artificial intelligence government that has no influence, nothing has influence on it.
00:11:01.000 What a fascinating idea.
00:11:02.000 It's possible.
00:11:03.000 And I think it might be the only way.
00:11:06.000 Where you're gonna get completely objective, the absolute most intelligent decision for virtually every problem, every dilemma that we face currently in society.
00:11:19.000 Would you truly be comfortable handing over, like, final decision-making and say, all right, AI, you got it from here?
00:11:24.000 No, no, but I'm not comfortable doing that with anybody, you know?
00:11:28.000 I mean, I was uncomfortable with the Patriot Act.
00:11:32.000 I'm uncomfortable with many decisions that are being made.
00:11:36.000 It's just there's so much obvious evidence that decisions that are being made are not being made in the best interest of the overall well-being of the people.
00:11:44.000 They're being made in the interests of whatever gigantic corporations have donated, and the military-industrial complex and the pharmaceutical-industrial complex, and just the money.
00:11:59.000 That's really what we know today, that money has a massive influence on our society and the choices that get made.
00:12:06.000 And the overall good or bad for the population.
00:12:09.000 Yeah.
00:12:09.000 I have no disagreement at all that the current system is super broken, not working for people, super corrupt, and for sure unbelievably run by money.
00:12:20.000 Yeah.
00:12:21.000 And I think there is a way to do a better job than that with AI in some way.
00:12:30.000 And this might just be like a factor of sitting with the systems all day and watching all of the ways they fail.
00:12:35.000 We got a long way to go.
00:12:37.000 A long way to go, I'm sure.
00:12:38.000 But when you think of AGI, when you think of the possible future, like where it goes to, do you ever extrapolate?
00:12:48.000 Do you ever like sit and pause and say, well, if this becomes sentient and it has the ability to make better versions of itself, how long before we're literally dealing with a god?
00:13:01.000 So, the way that I think about this is, it used to be that AGI was this very binary moment.
00:13:08.000 It was before and after.
00:13:10.000 And I think I was totally wrong about that.
00:13:12.000 And the right way to think about it is this continuum of intelligence, this smooth exponential curve.
00:13:20.000 Back all the way to that sort of smooth curve of technological revolution.
00:13:24.000 The amount of compute power we can put into the system, the scientific ideas about how to make it more efficient and smarter, to give it the ability to do reasoning, to think about how to improve itself.
00:13:37.000 That will all come.
00:13:40.000 But my model for a long time, I think if you look at the world of AGI thinkers, there's sort of two, particularly around the safety issues you're talking about, there's two axes that matter.
00:13:52.000 There's what's called short timelines or long timelines to the first milestone of AGI, whatever that's going to be.
00:13:59.000 Is that going to happen in...
00:14:01.000 A few years, a few decades, maybe even longer.
00:14:04.000 Although at this point, I think most people are a few years or a few decades.
00:14:06.000 And then there's takeoff speed.
00:14:08.000 Once we get there, from there to that point you were talking about where it's capable of the rapid self-improvement, is that a slower, a faster process?
00:14:17.000 The world that I think we're heading toward, that we're in, and also the world that I think is the most controllable and the safest, is the short timelines and slow takeoff quadrant.
00:14:30.000 And I think we're going to have, you know, there were a lot of very smart people for a while who were like, the thing you were just talking about happens in a day or three days.
00:14:39.000 And that doesn't seem likely to me given the shape of the technology as we understand it now.
00:14:45.000 Now, even if that happens in a decade or three decades, it's still like the blink of an eye from a historical perspective.
00:14:53.000 And there are going to be some real challenges to getting that right.
00:14:58.000 And the decisions we make, the sort of safety systems and the checks that the world puts in place, how we think about global regulation or rules of the road from a safety perspective for those projects.
00:15:14.000 It's super important because you can imagine many things going horribly wrong.
00:15:18.000 But I've been...
00:15:21.000 I feel cheerful about the progress the world is making towards taking this seriously and, you know, it reminds me of what I've read about the conversations that the world had right around the development of nuclear weapons.
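
As an aside, the two-axis framing described in this stretch of the conversation, timelines to a first AGI milestone crossed with takeoff speed once it arrives, can be laid out as a simple quadrant. The sketch below only restates what was said, with the short-timelines, slow-takeoff quadrant marked as the one Altman calls the safest and most controllable; it is not a formal model.

    # The four quadrants of the timelines-versus-takeoff framing discussed above.
    # This just restates the conversation; it is not a formal model of anything.

    from itertools import product

    TIMELINES = ("short timelines", "long timelines")  # years vs. decades to a first AGI milestone
    TAKEOFFS = ("slow takeoff", "fast takeoff")        # gradual vs. rapid self-improvement afterward

    for timeline, takeoff in product(TIMELINES, TAKEOFFS):
        safest = (timeline == "short timelines") and (takeoff == "slow takeoff")
        note = "  <- described as the safest, most controllable quadrant" if safest else ""
        print(f"{timeline} x {takeoff}{note}")
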
00:15:37.000 At least in terms of public consciousness, this has emerged very rapidly, where I don't think anyone was really aware.
00:15:44.000 People were aware of the concept of artificial intelligence, but they didn't think that it was going to be implemented so comprehensively, so quickly.
00:15:56.000 So ChatGPT is on, what, 4.5 now?
00:16:00.000 Four.
00:16:00.000 Four.
00:16:01.000 And with 4.5, there'll be some sort of an exponential increase in its abilities?
00:16:06.000 It'll be somewhat better.
00:16:08.000 Each step, you know, from each like half step like that, you kind of, humans have this ability to like get used to any new technology so quickly.
00:16:17.000 Yeah.
00:16:18.000 The thing that I think was unusual about the launch of ChatGPT 3.5 and then 4 was that people hadn't really been paying attention.
00:16:26.000 And that's part of the reason we deploy.
00:16:28.000 We think it's very important that people and institutions have time to gradually understand this, react, co-design the society that we want with it.
00:16:38.000 And if you just build AGI in secret in a lab and then drop it on the world all at once, I think that's a really bad idea.
00:16:43.000 So we had been trying to talk to the world about this for a while.
00:16:49.000 If you don't give people something they can feel and use in their lives, they don't quite take it seriously.
00:16:54.000 Everybody's busy.
00:16:55.000 And so there was this big overhang from where the technology was to where public consciousness was.
00:17:00.000 Now, that's caught up.
00:17:02.000 We've deployed.
00:17:03.000 I think people understand it.
00:17:05.000 I don't expect the jump from like 4 to whenever we finish 4.5, which will be a little while.
00:17:11.000 I don't expect that to be... I think the crazy switch, the crazy adjustment that people have had to go through, has mostly happened.
00:17:20.000 I think most people have gone from thinking that AGI was science fiction and very far off to something that is going to happen.
00:17:27.000 And that was like a one-time reframe.
00:17:29.000 And now, you know, every year you get a new iPhone.
00:17:32.000 Over the 15 years or whatever since the launch, they've gotten dramatically better.
00:17:36.000 But iPhone to iPhone, you're like, yeah, okay, it's a little better.
00:17:38.000 But now if you go hold up the first iPhone to the 15 or whatever, that's a big difference.
00:17:43.000 GPT 3.5 to AGI, that'll be a big difference.
00:17:46.000 But along the way, it'll just get incrementally better.
00:17:49.000 Do you think about the convergence of things like Neuralink and there's a few competing technologies where they're trying to implement some sort of a connection between the human biological system and technology?
00:18:09.000 Do you want one of those things in your head?
00:18:11.000 I don't until everybody does.
00:18:14.000 Right.
00:18:14.000 And, you know, I have a joke about it.
00:18:16.000 But it's like, the idea is like, once it gets done, you kind of have to, because everybody's going to have it.
00:18:22.000 So one of the hard questions about all of the related merge stuff is exactly what you just said.
00:18:30.000 Like as a society, are we going to let some people merge with AGI and not others?
00:18:36.000 Right.
00:18:36.000 And if we do, then, and you choose not to, like, what does that mean for you?
00:18:43.000 Right.
00:18:43.000 And will you be protected?
00:18:46.000 How you get that moment right?
00:18:48.000 If we imagine all the way out to the sci-fi future, there have been a lot of sci-fi books written about how you get that moment right.
00:18:56.000 Who gets to do that first?
00:18:57.000 What about people who don't want to?
00:18:58.000 How do you make sure the people that do it first actually help lift everybody up together?
00:19:02.000 How do you make sure people who want to just live their very human life get to do that?
00:19:07.000 That stuff is really hard and honestly...
00:19:10.000 So far off from my problems of the day that I don't get to think about that as much as I'd like to, because I do think it's super interesting.
00:19:20.000 But yeah, it seems like if we just think logically, that's going to be a huge challenge at some point, and people are going to want...
00:19:33.000 Wildly divergent things, but there is a societal question about how we're going to, like, the questions of fairness that come there and what it means for the people who don't do it.
00:19:47.000 Super, super complicated.
00:19:49.000 Anyway, on the neural interface side, I'm, in the short term, like, before we figure out how to upload someone's consciousness into a computer, if that's even possible at all, which I think there's plenty of sides you could take on why it's not.
00:20:06.000 The thing that I find myself most interested in is what we can do without drilling a hole in someone's head.
00:20:15.000 How much of the inner monologue can we read out with an externally mounted device?
00:20:19.000 And if we have an imperfect, low bandwidth, low accuracy neural interface, can people still just learn how to use it really well in a way that's quite powerful for what they can now do with a new computing platform?
00:20:34.000 And my guess is we'll figure that out.
00:20:36.000 I'm sure you've seen that headpiece.
00:20:38.000 There's a demonstration where there's someone asking someone a question.
00:20:42.000 They have this headpiece on, they think the question, and then they literally Google the question and get the answers through their head.
00:20:48.000 That's the kind of direction we've been exploring.
00:20:51.000 Yeah.
00:20:52.000 That seems to me to be step one.
00:20:55.000 That's the Pong of the eventual immersive 3D video games.
00:20:59.000 Like you're going to get these first steps and they're going to seem sort of crude and slow.
00:21:05.000 I mean it's essentially slower than just asking Siri.
00:21:08.000 I think if someone built a system where you could think words, it doesn't have to be a question, it could just be your passive rambling inner monologue, but it certainly could be a question.
00:21:20.000 And that was being fed into GPT-5 or 6. And in your field of vision, the words in response were being displayed.
00:21:28.000 That would be the palm.
00:21:30.000 Yeah.
00:21:30.000 That's a very valuable tool to have.
00:21:33.000 And that seems like that's inevitable.
00:21:36.000 There's hard work to get there on the neural interface side, but I believe it will happen.
00:21:40.000 Yeah.
00:21:41.000 I think so, too.
00:21:42.000 And my concern is that the initial adopters of this will have such a massive advantage over the general population.
00:21:48.000 Well, that doesn't concern me because that's like a, you know, that's not, you're not, that's just like better, that's a better computer.
00:21:56.000 You're not like jacking your brain into something in a high-risk thing.
00:21:59.000 You know what you do when you don't want them?
00:22:00.000 When you take off the glasses.
00:22:02.000 So that feels fine.
00:22:05.000 Well, this is just the external device then.
00:22:08.000 Oh, I think we can do the kind of like read your thoughts with an external device at some point.
00:22:13.000 Read your internal monologue.
00:22:15.000 Interesting.
00:22:16.000 And do you think we'll be able to communicate with an external device as well telepathically or semi-telepathically through technology?
00:22:23.000 I do.
00:22:24.000 Yeah.
00:22:25.000 Yeah, I do.
00:22:26.000 I think so, too.
00:22:28.000 My real concern is that once we take the step to use an actual neural interface, when there's an actual operation, and they're using some sort of an implant, and then that implant becomes more sophisticated,
00:22:45.000 it's not the iPhone 1, now it's the iPhone 15, and as these things get better and better, we're on the road to cyborgs.
00:22:54.000 We're on the road to, like, why would you want to be a biological person?
00:22:59.000 Do you really want to live in a fucking log cabin when you can be in the Matrix?
00:23:02.000 I mean, it seems like we're not...
00:23:05.000 We're on this path.
00:23:10.000 We're already a little bit down that path, right?
00:23:13.000 Like if you take away someone's phone and they have to go function in the world today, they're at a disadvantage relative to everybody else.
00:23:20.000 So maybe that's like the lightest weight version of a merge we could imagine, but I think it's worth like, if we go back to that earlier thing about the one exponential curve, I think it's worth saying we've like lifted off the x-axis already down this path, the tiniest bit.
00:23:36.000 And Yeah.
00:23:38.000 Even if you don't go all the way to, like, a neural interface, VR will get so good that some people just don't want to take it off that much.
00:23:45.000 Right.
00:23:46.000 And...
00:23:49.000 That's fine for them, as long as we can solve this question of how do we think about what a balance of power means in the world.
00:23:57.000 I think there will be many people, I'm certainly one of them, who's like, actually the human body and the human experience is pretty great.
00:24:04.000 That log cabin in the woods, pretty awesome.
00:24:06.000 I don't want to be there all the time.
00:24:07.000 I'd love to go play the great video game, but I'm really happy to get to go there sometimes.
00:24:13.000 Right.
00:24:14.000 Yeah, there's still human experiences that are just, like, great human experience.
00:24:20.000 Just laughing with friends, you know, kissing someone that you've never kissed before, that you're on a first date.
00:24:28.000 Those kind of things, they're real moments.
00:24:31.000 It just laughs, having a glass of wine with a friend, just laughing.
00:24:36.000 Not quite the same in VR. Yeah, it's not.
00:24:38.000 Now, when the VR goes super far, so you can't – it's like you are jacked in on your brain and you can't tell what's real and what's not.
00:24:46.000 And then everybody gets like super deep on the simulation hypothesis or the like Eastern religion or whatever.
00:24:51.000 And I don't know what happens at that point.
00:24:53.000 Do you ever fuck around with simulation theory?
00:24:55.000 Because the real problem is when you combine that with probability theory and you talk to the people that say, well, if you just look at the numbers – the probability that we're already in a simulation is much higher than the probability that we're not.
00:25:12.000 It's never been clear to me what to do about it.
00:25:16.000 It's like, okay.
00:25:17.000 That intellectually makes a lot of sense.
00:25:20.000 I think probably, sure.
00:25:22.000 That seems convincing.
00:25:23.000 But this is my reality.
00:25:24.000 This is my life.
00:25:25.000 And I'm going to live it.
00:25:27.000 And I've...
00:25:30.000 You know, from like...
00:25:34.000 2am in my college freshman dorm hallway till now.
00:25:37.000 I've made no more progress on it than that.
00:25:40.000 Well, it seems like one of those...
00:25:44.000 If it is a possibility, if it is real.
00:25:51.000 First of all, once it happens, what are you going to do?
00:25:54.000 I mean, that is the new reality.
00:25:56.000 And in many ways, our new reality is...
00:25:59.000 As alien to hunter-gatherers from 15,000 years ago as that would be to us now.
00:26:08.000 I mean, we've already entered into some very bizarre territory where, you know, I was just having a conversation with my kids.
00:26:16.000 We were asking questions about something, and I always say, let's guess.
00:26:20.000 What percentage of that is this?
00:26:22.000 And then we just Google it.
00:26:23.000 And then just ask Siri, and we pull it up.
00:26:25.000 Like, look at that!
00:26:26.000 Like, that alone is so bizarre compared to how it was when I was 13. And you had to go to the library and hope that the book was accurate.
00:26:37.000 Totally.
00:26:37.000 I was very annoyed.
00:26:40.000 I was reading about how horrible systems like ChatGPT and Google are from an environmental impact because it's, you know, using, like, some extremely tiny amount of energy for each query.
00:26:50.000 And, you know, how we're all destroying the world.
00:26:52.000 And I was like...
00:26:53.000 Before that, people drove to the library.
00:26:55.000 Let's talk about how much carbon they burned to answer this question versus what it takes now.
00:26:58.000 Come on.
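
For what it's worth, the comparison being made here can be sketched as a rough back-of-envelope calculation. Every figure below is an assumed ballpark number chosen only to illustrate the shape of the argument (roughly a watt-hour of electricity per query, a typical grid carbon intensity, and a short drive in a gasoline car); none of them come from the episode or from OpenAI.

    # Rough back-of-envelope version of the query-versus-library-drive comparison.
    # Every figure is an assumed ballpark number for illustration, not a measurement.

    QUERY_ENERGY_KWH = 0.001         # assume ~1 Wh of electricity per chatbot query
    GRID_CO2_KG_PER_KWH = 0.4        # assume ~0.4 kg CO2 per kWh of grid electricity
    CAR_CO2_KG_PER_MILE = 0.4        # assume ~0.4 kg CO2 per mile for a gasoline car
    ROUND_TRIP_MILES = 5             # assume a short round trip to the library

    query_co2_g = QUERY_ENERGY_KWH * GRID_CO2_KG_PER_KWH * 1000       # grams CO2 per query
    drive_co2_g = CAR_CO2_KG_PER_MILE * ROUND_TRIP_MILES * 1000       # grams CO2 per drive

    print(f"one query:     ~{query_co2_g:.1f} g CO2")                 # ~0.4 g
    print(f"library drive: ~{drive_co2_g:.0f} g CO2")                 # ~2000 g
    print(f"ratio:         ~{drive_co2_g / query_co2_g:,.0f}x")       # ~5,000x

Even if the per-query assumption were off by an order of magnitude in either direction, the drive would still dominate, which is the shape of the point being made.
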
00:26:59.000 But that's just people looking for some reason why something's bad.
00:27:03.000 That's not a logical perspective.
00:27:05.000 What we should be looking at is the spectacular changes that are possible through this.
00:27:12.000 And all the problems, the insurmountable problems that we have with resources, with the environment, with cleaning up the ocean, climate change, there's so many problems that we have.
00:27:22.000 We need this to solve all of everything else.
00:27:24.000 And that's why we need President AI. If AI could make every scientific discovery, but we still had human presidents, do you think we'd be okay?
00:27:35.000 No, because those creeps would still be pocketing money and they'd have offshore accounts and it would always be a weird thing of corruption and how to mitigate that corruption, which is also one of the fascinating things about the current state of technology is that we're so much more aware of corruption.
00:27:52.000 There's so much independent reporting and we're so much more cognizant of the actual problems. This is really great.
00:28:03.000 One of the things that I've observed, obviously many other people too, is corruption is such an incredible hindrance to getting anything done in a society to make it forward progress.
00:28:14.000 And my worldview had been...
00:28:17.000 I was more US-centric when I was younger and as I've just studied the world more and had to work in more places in the world.
00:28:25.000 It's amazing how much corruption there still is.
00:28:27.000 But the shift to a technologically enabled world, I think, is a major force against it because it's harder to hide stuff.
00:28:36.000 And I do think corruption in the world will keep trending down.
00:28:41.000 Because of its exposure.
00:28:43.000 Yeah.
00:28:44.000 Through technology.
00:28:45.000 I mean, it comes at a cost.
00:28:48.000 And I think the loss...
00:28:50.000 I am very worried about how far the surveillance state could go here.
00:28:55.000 But in a world where...
00:28:59.000 Payments, for example, are no longer like bags of cash, but done somehow digitally.
00:29:04.000 And somebody, even if you're using Bitcoin, can watch those flows.
00:29:08.000 I think that's like a corruption-reducing thing.
00:29:12.000 I agree, but I'm very worried about central bank digital currency and that being tied to a social credit score.
00:29:20.000 Super against.
00:29:22.000 Yeah, that scares the shit out of me.
00:29:23.000 Super against.
00:29:24.000 And the push to that is not...
00:29:26.000 That's not for the overall good of society.
00:29:28.000 That's for control.
00:29:30.000 Yeah, I think like...
00:29:32.000 I mean, there's many things that I'm disappointed that the US government has done recently.
00:29:39.000 But the war on crypto, which I think is a like, we can't give this up.
00:29:44.000 Like, we're going to control this and all this.
00:29:46.000 That's like...
00:29:48.000 That's the thing that, like, makes me quite sad about the country.
00:29:51.000 It makes me quite sad about the country, too.
00:29:53.000 But then you also see with things like FTX, like, oh, this can get without regulation and without someone overseeing it.
00:30:01.000 This can get really fucked.
00:30:05.000 Yeah, I'm not anti-regulation.
00:30:07.000 Like, I think there's clearly a role for it.
00:30:11.000 And I also think FTX was like a sort of comically bad situation that we shouldn't learn too much from either.
00:30:21.000 Yeah, but it's a fun one.
00:30:22.000 Like, it's totally fun.
00:30:23.000 I love that story.
00:30:25.000 I mean, you clearly...
00:30:28.000 I really do.
00:30:29.000 I love the fact that they were all doing drugs and having sex with each other.
00:30:32.000 No, no.
00:30:32.000 It had every part of the dramas.
00:30:36.000 I mean, it's a gripping story because it had everything there.
00:30:39.000 They did their taxes with...
00:30:40.000 What was the program that they used?
00:30:44.000 QuickBooks.
00:30:46.000 They're dealing with billions of dollars of cash.
00:30:48.000 I don't know why I think the word polycule is so funny.
00:30:52.000 Polycule?
00:30:52.000 That's what they call it when a relationship is, like, poly but closed, like a polyamorous molecule put together.
00:30:59.000 Oh, I see.
00:31:00.000 So they were like, this is our polycule.
00:31:02.000 So there's nine of them and they're poly amongst them.
00:31:03.000 Or ten of them or whatever.
00:31:04.000 Yeah, you call that a polycule.
00:31:05.000 And I thought that was the funny, like, that became like a meme in Silicon Valley for a while that I thought was hilarious.
00:31:11.000 You clearly want enough regulation that that can't happen.
00:31:15.000 But they're like...
00:31:18.000 Well, I'm not against that happening.
00:31:19.000 I'm against them doing what they did with the money.
00:31:22.000 That's what I mean.
00:31:23.000 The polycule is kind of fun.
00:31:24.000 Go for it.
00:31:25.000 No, no.
00:31:25.000 I mean you want enough thing that like FTX can't lose all of its depositors' money.
00:31:29.000 Yes.
00:31:30.000 But I think there's an important point here, which is you have all of this other regulation that people...
00:31:35.000 And it didn't keep us safe.
00:31:38.000 And the basic thing, which was like...
00:31:41.000 You know, let's do that.
00:31:42.000 That was not all of the crypto stuff people were talking about.
00:31:47.000 Yes.
00:31:48.000 I mean the real fascinating crypto is Bitcoin to me.
00:31:52.000 I mean that's the one that I think has the most likely possibility of becoming a universal viable currency.
00:32:01.000 And it's limited in the amount that there can be.
00:32:07.000 It's – people mine it with their own – it's like that to me is very fascinating and I love the fact that it's been implemented.
00:32:16.000 I've had Andreas Antonopoulos on the podcast.
00:32:20.000 When he talks about it, he's living it.
00:32:26.000 He's spending all of his money.
00:32:27.000 Everything he has paid is in Bitcoin.
00:32:29.000 He pays his rent in Bitcoin.
00:32:31.000 Everything he does is in Bitcoin.
00:32:32.000 I helped start a project called Worldcoin a few years ago.
00:32:38.000 And so I've gotten to learn more about this space.
00:32:42.000 I'm excited about it for the same reasons.
00:32:44.000 I'm excited about Bitcoin too.
00:32:45.000 But I think this idea that we have a global currency that is outside of the control of any government is a super logical and important step on the tech tree.
00:32:59.000 Yeah, agreed.
00:33:00.000 I mean, why should the government control currency?
00:33:02.000 I mean the government should be dealing with all the pressing environmental, social, infrastructure issues, foreign policy issues, economic issues.
00:33:11.000 The things that we need to be governed in order to have a peaceful and prosperous society that's equal and equitable.
00:33:20.000 What do you think happens to money and currency after AGI? I've wondered about that because I feel like with money, especially when money goes digital, the bottleneck is access.
00:33:32.000 If we get to a point where all information is just freely shared everywhere, there are no secrets, there are no boundaries, there are no borders.
00:33:42.000 We're reading minds.
00:33:44.000 We have complete access to all of the information of everything you've ever done, everything everyone's ever said.
00:33:52.000 There's no hidden secrets.
00:33:55.000 What is money then?
00:33:56.000 Money is this digital thing.
00:33:58.000 Well, how can you possess it?
00:33:59.000 How can you possess this digital thing if there is literally no bottleneck?
00:34:05.000 There's no barriers to anyone accessing any information?
00:34:09.000 Because essentially, it's just ones and zeros.
00:34:12.000 Yeah.
00:34:12.000 I mean, another way...
00:34:13.000 I think the information frame makes sense.
00:34:15.000 Another way is that money is...
00:34:18.000 Like a sort of way to trade labor or trade like a limited number of hard assets like land and houses and whatever.
00:34:27.000 And if you think about a world where like intellectual labor is just readily available and super cheap, then...
00:34:38.000 That's somehow very different.
00:34:41.000 I think there will always be goods that we want to be scarce and expensive, but it'll only be those goods that we want to be scarce and expensive and services that still are.
00:34:50.000 And so money in a world like that, I think, is just a very curious idea.
00:34:56.000 Yeah, it becomes a different thing.
00:34:57.000 I mean, it's not a bag of gold in a leather pouch that you're carrying around when you're riding on a horse.
00:35:02.000 It's not going to do you much good, probably.
00:35:03.000 Yeah, it's not going to do you much good.
00:35:04.000 But then the question becomes, how is that money distributed?
00:35:07.000 And how do we avoid some horrible Marxist society where there's one totalitarian government that...
00:35:15.000 Doles it out.
00:35:15.000 That would be bad.
00:35:16.000 I think you've got to, like...
00:35:19.000 My current best idea, and maybe there's something better, is I think if we are right, a lot of reasons we could be wrong, but if we are right that the AGI systems, of which there will be a few, become the high-order bits of influence, whatever,
00:35:34.000 in the world, I think you do need...
00:35:39.000 Not to just redistribute the money but the access so that people can make their own decisions about how to use it and how to govern it.
00:35:45.000 And if you've got one idea, you get to do this.
00:35:48.000 If I've got one idea, I get to do that.
00:35:51.000 And I have like rights to basically do whatever I want with my part of it.
00:35:55.000 And if I come up with better ideas than you, I get rewarded for that by whatever the society is or vice versa.
00:36:00.000 Yeah.
00:36:01.000 You know, the hardliners, the people that are against, like, welfare and against any sort of universal basic income, UBI, what they're really concerned with is human nature, right?
00:36:15.000 They believe that if you remove incentives, if you just give people free money, they become addicted to it, they become lazy.
00:36:22.000 But isn't that a human biological and psychological bottleneck?
00:36:28.000 And perhaps...
00:36:31.000 With the implementation of artificial intelligence combined with some sort of neural interface, whether it's external or internal.
00:36:43.000 It seems like that's a problem that can be solved.
00:36:48.000 That you can essentially, and this is where it gets really spooky, you can re-engineer the human biological system and you can remove all of these problems that people have that are essentially problems that date back to human reward systems when we were tribal people.
00:37:06.000 Hunter-gatherer people, whether it's jealousy, lust, envy, all these variables that come into play when you're dealing with money and status and social status.
00:37:18.000 If those are eliminated with technology, essentially we become the next version of what the human species can be.
00:37:27.000 Look, we're very, very far removed from tribal, brutal societies of cave people.
00:37:37.000 We all agree that this is a way better way to live.
00:37:40.000 It's way safer.
00:37:43.000 I was talking about this at my comedy club last night.
00:37:47.000 Because my wife was, we were talking about DNA, and my wife was saying that, look, everybody came from cave people, which is kind of a fucked up thought, that everyone here is here because of cave people.
00:37:59.000 Well, all that's still in our DNA. All that's still—and these reward systems can be hijacked, and they can be hijacked by just giving people money.
00:38:09.000 And, like, you don't have to work.
00:38:10.000 You don't have to do anything.
00:38:11.000 You don't have to have ambition.
00:38:12.000 You'll just have money and just lay around and do drugs.
00:38:16.000 That's the fear that people have of giving people free money.
00:38:21.000 But— If we can figure out how to literally engineer the human biological vehicle and remove all those pitfalls, if we can enlighten people technologically,
00:38:38.000 maybe enlighten is the wrong word, but advance the human species to the point where those are no longer dilemmas because those are easily solvable through coding.
00:38:51.000 They're easily solvable through enhancing the human biological system, perhaps raising dopamine levels to the point where anger and fear and hate are impossible.
00:39:03.000 They don't exist.
00:39:04.000 And, I mean, if you just had everyone on Molly...
00:39:08.000 How many wars would there be?
00:39:10.000 There'd be zero wars.
00:39:11.000 I mean, I think if you could get everyone on Earth to all do Molly once on the same day, that'd be a tremendous thing.
00:39:17.000 It would be.
00:39:18.000 But if you got everybody on Earth to do Molly every day, that'd be a real loss.
00:39:21.000 But what if they did a low dose of Molly, where you just get to, ah.
00:39:26.000 Where everybody greets people with love and affection.
00:39:31.000 And there's no longer a concern about competition.
00:39:34.000 Instead, the concern is about the fascination of innovation and creation and creativity.
00:39:42.000 Man, we could talk the rest of the time about this one topic.
00:39:45.000 It's so interesting.
00:39:46.000 I think...
00:39:50.000 If I could push a button to remove all human striving and conflict, I wouldn't do it, first of all.
00:39:59.000 I think that's a very important part of our story and experience.
00:40:04.000 And also, I think we can see both from our own biological history and also from what we know about AI, that very simple goal systems, fitness functions,
00:40:20.000 reward models, whatever you want to call it, lead to incredibly impressive results.
00:40:26.000 You know, if the biological imperative is survive and reproduce, look how far that has somehow gotten us as a society.
00:40:34.000 All of this, all this stuff we have, all this technology, this building, whatever else.
00:40:40.000 That, I think...
00:40:56.000 Fulfilling this biological imperative to some degree and wanting to impress each other.
00:41:02.000 So I think evolutionary fitness is a simple and unbelievably powerful idea.
00:41:07.000 Now, could you carefully edit out every individual manifestation of that?
00:41:16.000 Maybe, but I don't want to, like, live in a society of drones where everybody is just sort of, like, on Molly all the time either.
00:41:24.000 Like, that doesn't seem like the right answer.
00:41:28.000 Like, I want us to continue to strive.
00:41:30.000 I want us to continue to push back the frontier and go out and explore.
00:41:34.000 And I actually think something's already gotten a little off track in society about all of that and we're...
00:41:47.000 I don't know.
00:41:49.000 I thought I'd be older by the time I felt like the old guy complaining about the youth.
00:41:53.000 But I think we've lost something, and I think that we need more striving, maybe more risk-taking, more explorer spirit.
00:42:07.000 What do you mean by you think we've lost something?
00:42:18.000 Here's a version of it.
00:42:20.000 Very much from my own lens.
00:42:23.000 I was a startup investor for a long time.
00:42:26.000 And it often was the case that the very best startup founders were in their early or mid-20s or late 20s, maybe even.
00:42:36.000 And now they skew much older.
00:42:38.000 And what I want to know is, in the world today, where are the super great 25-year-old founders?
00:42:44.000 And there are a few.
00:42:45.000 It's not fair to say there are none, but there are less than there were before.
00:42:49.000 And I think that's bad for society at all levels.
00:42:55.000 Tech company founders is one example, but people who go off and create something new, who push on a disagreeable or controversial idea, we need that to drive forward.
00:43:08.000 We need that sort of spirit.
00:43:11.000 We need people to be able to put out ideas and be wrong and not be ostracized from society for it or not have it be something that they get canceled for or whatever.
00:43:22.000 We need people to be able to take a risk in their career because they believe in some important scientific quest that may not work out or may sound like really controversial or bad or whatever.
00:43:33.000 You know, certainly when we started OpenAI and we were saying, we think this AGI thing is real and could be, you know, it could be done, unlikely, but so important if it happens.
00:43:43.000 And all of the older scientists in our field were saying, those people are irresponsible.
00:43:48.000 You shouldn't talk about AGI. That's like, you know, they're like selling a scam, or they're like, you know, they're kind of being reckless and it's going to lead to an AGI winter.
00:43:59.000 Like, we said we believed, we said at the time, we knew it was unlikely, but it was an important quest.
00:44:04.000 And we were going to go after it and kind of like, fuck the haters.
00:44:08.000 That's important to a society.
00:44:11.000 What do you think is the origin?
00:44:13.000 Why do you think there are less young people that are doing those kind of things now as opposed to a decade or two ago?
00:44:25.000 I am so interested in that topic.
00:44:28.000 I'm tempted to blame the education system, but I think that interacts with society in all of these strange ways.
00:44:40.000 It's funny, there was this thing all over my Twitter feed recently trying to talk about what caused the drop in testosterone in American men over the last few decades.
00:44:50.000 And no one was like, this is a symptom, not a cause.
00:44:55.000 And everyone was like, oh, it's the microplastics, it's the birth control pills, it's the whatever, it's the whatever, it's the whatever.
00:45:00.000 And I think this is not at all the most important... It was interesting to me, sociologically, that there was only talk about what caused it,
00:45:19.000 not about it being an effect of some sort of change in society.
00:45:24.000 But isn't what caused it...
00:45:27.000 Well, there's biological reasons why.
00:45:31.000 Like when we talk about the phthalates and microplastics, pesticides, environmental factors, those are real.
00:45:37.000 Totally.
00:45:38.000 And I don't like...
00:45:39.000 Again, I'm so far out of my depth and expertise here.
00:45:41.000 This was...
00:45:41.000 It was just interesting to me that the only talk was about like biological factors and not that somehow society can have some sort of effect on...
00:45:52.000 Do you know what the answer to this is?
00:45:54.000 I don't.
00:45:54.000 I mean I've had a podcast with Dr. Shanna Swan, who wrote the book Countdown, and that is all about the introduction of petrochemical products and the correlating drop in testosterone, rise in miscarriages, the fact that these are ubiquitous endocrine disruptors. When they do blood tests on people,
00:46:17.000 they find some insane number.
00:46:19.000 It's like 90 plus percent of people have phthalates in their system.
00:46:24.000 I appreciate the metal cups.
00:46:25.000 Yeah, we try to mitigate it as much as possible.
00:46:29.000 I mean, you're getting it.
00:46:30.000 If you're microwaving food, you're fucking getting it.
00:46:33.000 You're just getting it.
00:46:34.000 If you eat processed food, you're getting it.
00:46:36.000 You're getting a certain amount of microplastics in your diet.
00:46:39.000 Estimates have been that it's as high as a credit card's worth of microplastics per week in your body.
00:46:45.000 You consume a credit card's worth of that a week.
00:46:47.000 Whoa.
00:46:48.000 The real concern is with mammals because the introductions, when they've done studies with mammals and they've introduced phthalates into their body, there's a correlating...
00:47:00.000 One thing that happens is these animals, their taints shrink.
00:47:07.000 In mammals, when you look at males, it's 50% to 100% larger than in the females.
00:47:13.000 With the introduction of phthalates in the males, the taints start shrinking, the penises shrink, the testicles shrink, sperm count shrinks.
00:47:20.000 So we know there's a direct biological connection between these chemicals and how they interact with bodies.
00:47:30.000 So that's a real one.
00:47:32.000 And it's also...
00:47:34.000 The amount of petrochemical products that we have, the amount of plastics that we use, it is such an integral part of our culture and our society, our civilization.
00:47:46.000 It's everywhere.
00:47:47.000 And I've wondered if you think about how these territorial apes evolve into this new, advanced species.
00:48:03.000 Wouldn't one of the very best ways be to get rid of one of the things that causes the most problems, which is testosterone?
00:48:12.000 We need testosterone.
00:48:14.000 We need aggressive men and protectors.
00:48:16.000 But why do we need them?
00:48:17.000 We need them because there's other aggressive men that are evil.
00:48:21.000 Right?
00:48:21.000 So we need protectors from ourselves.
00:48:25.000 We need the good strong people to protect us from the bad strong people.
00:48:29.000 But if we're in the process of integrating with technology, if technology is an inescapable part of our life, if it is everywhere, you're using it, you have the internet of everything that's in your microwave, your television, your computers,
00:48:46.000 everything you use, as time goes on, that will be more and more a part of your life, and as these plastics are introduced into the human biological system, you're seeing a feminization of the males of the species.
00:49:01.000 You're seeing a downfall in birth rate.
00:49:04.000 You're seeing all these correlating factors that would sort of lead us to become this more peaceful, less violent, less aggressive, less ego-driven thing.
00:49:20.000 Which the world is definitely becoming over time.
00:49:24.000 And I'm all for less violence, obviously.
00:49:28.000 But I don't...
00:49:34.000 Look, obviously testosterone has many great things to say for it and some bad tendencies too.
00:49:41.000 But I don't think a world – if we leave that out of the equation and just say like a world that has a spirit that we're going to defend ourselves, we're going to – We're going to find a way to protect ourselves and our tribe and our society into this future,
00:50:05.000 which you can get with lots of other ways.
00:50:07.000 I think that's an important impulse.
00:50:10.000 More than that, though, what I meant is, if we go back to the issue of where are the young founders?
00:50:20.000 Why don't we have more of those?
00:50:23.000 And I don't think it's just the tech startup industry.
00:50:25.000 I think you could say that about young scientists or many other categories.
00:50:29.000 Those are maybe just the ones that I know the best.
00:50:35.000 In a world with any amount of technology,
00:50:41.000 I still think it is our destiny in some sense to stay on this curve.
00:50:48.000 And we still need to go figure out what's next and after the next hill and after the next hill.
00:50:54.000 And it would be...
00:50:56.000 My perception is that there is some long-term societal change happening here.
00:51:03.000 And I think it makes us less happy too.
00:51:06.000 Right.
00:51:07.000 It may make us less happy.
00:51:10.000 But what I'm saying is, if the human species does integrate with technology, wouldn't a great way to facilitate that be to kind of feminize the primal apes and to sort of downplay the role— You mean, like, should the AGI phthalates enter the world?
00:51:30.000 I don't know if it's AGI. I mean maybe it's just an inevitable consequence of technology.
00:51:36.000 Because especially the type of technology that we use, which does have so much plastic in it, and then on top of that the technology that's involved in food systems, preservatives, all these different things that we use to make sure that people don't starve to death.
00:51:49.000 We've made incredible strides in that.
00:51:51.000 There are very few people in this country that starve to death.
00:51:55.000 It's not a primary issue, but violence is a primary issue.
00:52:01.000 But our concerns about violence and our concerns about testosterone and strong men and powerful people is only because...
00:52:11.000 We need to protect against others.
00:52:16.000 We need to protect against others doing the same things.
00:52:16.000 Is that true?
00:52:16.000 Is that really the only reason?
00:52:17.000 Sure.
00:52:18.000 I mean, how many, like, incredibly violent women are out there running gangs?
00:52:22.000 No, no.
00:52:22.000 That part for sure.
00:52:24.000 Clearly not very many.
00:52:26.000 What I meant more is, is that the only reason that society values, like, strong masculinity?
00:52:32.000 Yeah, I think so.
00:52:33.000 I think it's a biological imperative, right?
00:52:35.000 And I think that biological imperative is because we used to have to defend against incoming tribes and predators and animals.
00:52:42.000 And we needed someone who was stronger than most to defend the rest.
00:52:48.000 And that's the concept of the military.
00:52:50.000 That's why Navy SEAL training is so difficult.
00:52:53.000 We want the strongest of the strong to be at the tip of the spear.
00:52:57.000 But that's only because there's people like that out there that are bad.
00:53:03.000 If artificial general intelligence and the implementation of some sort of a device that changes the biological structure of human beings to the point where that is no longer a concern, like if you are me and I am you and I know this because of technology,
00:53:19.000 violence is impossible.
00:53:20.000 Yeah, look, by the time if this goes all the way down the sci-fi path and we're all like merged into this one single like planetary universal whatever consciousness, then yes, you don't.
00:53:31.000 You don't need testosterone.
00:53:32.000 You need testosterone, but you still...
00:53:34.000 Especially if we can reproduce through other methods.
00:53:37.000 Like, this is the alien hypothesis, right?
00:53:40.000 Like, why do they look so spindly and without any gender, you know, when they have these big heads and tiny mouths?
00:53:45.000 They don't need physical strength.
00:53:46.000 They don't need physical strength.
00:53:47.000 They have some sort of a telepathic way of communicating.
00:53:50.000 They probably don't need sounds with their mouths.
00:53:53.000 And they don't need this urge that we have to conquer and to spread our DNA. Like that's so much of what people do is these reward systems that were established when we were territorial apes.
00:54:07.000 There's a question to me about how much you can ever get rid of that.
00:54:15.000 If you make an AGI and it decides, actually, we don't need to expand.
00:54:22.000 We don't need more territory.
00:54:23.000 We're just like happy.
00:54:24.000 We, at this point, you, me, it, the whole thing all together all merged in.
00:54:28.000 We're happy here on Earth.
00:54:30.000 We don't need to get any bigger.
00:54:31.000 We don't need to reproduce.
00:54:32.000 We don't need to grow.
00:54:33.000 We're just going to sit here and run.
00:54:35.000 A, that sounds like a boring life.
00:54:37.000 I don't agree with that.
00:54:39.000 I don't agree that that would be the logical conclusion.
00:54:41.000 I think the logical conclusion would be that they would look for problems and frontiers that are insurmountable to our current existence, like intergalactic communication and transportation.
00:54:57.000 What happens when it meets another AGI the other galaxy over?
00:54:59.000 What happens when it meets an AGI that's a million years more advanced?
00:55:03.000 What does that look like?
00:55:05.000 Yeah.
00:55:06.000 That's what I've often wondered if we are...
00:55:09.000 I call ourselves the biological caterpillars that create the electronic butterfly.
00:55:13.000 That we're making a cocoon right now and we don't even know what we're doing.
00:55:16.000 And I think it's also tied into consumerism.
00:55:20.000 Because what does consumerism do?
00:55:22.000 Consumerism facilitates the creation of newer and better things.
00:55:26.000 Because you always want the newest, latest, greatest.
00:55:28.000 So you have more advanced technology in automobiles and computers and cell phones.
00:55:34.000 And all of these different things, including medical science.
00:55:40.000 That's all for sure true.
00:55:44.000 The thing I was like reflecting on as you were saying that is, I don't think I... I'm not as optimistic that we can or even should overcome our biological base to the degree that I think you think we can.
00:56:02.000 And, you know, to even go back one further level, like, I think society is happiest where there's, like, roles for strong femininity and strong masculinity in the same people and in different people.
00:56:17.000 And I don't, like...
00:56:24.000 And I don't think a lot of these, like, deep-seated things are gonna be able to get pushed aside very easily and still have a system that works.
00:56:36.000 Like, sure, we can't really think about what, if there were consciousness in a machine someday or whatever, what that would be like.
00:56:44.000 And maybe I'm just, like, thinking too small-mindedly, but I think there is something... But don't you think that cave people would probably have those same logical conclusions about life and sedentary lifestyle and sitting in front of a computer and not interacting with each other except
00:57:14.000 through text?
00:57:19.000 I mean, isn't that, like, what you're saying is correct.
00:57:21.000 How different do you think our motivations are today and kind of what really brings us genuine joy and how we're wired at some deep level differently than cave people?
00:57:31.000 Clearly, lots of other things have changed.
00:57:33.000 We've got much better tools.
00:57:35.000 But how different do you think it really is?
00:57:37.000 I think that's the problem, is that genetically, at the base level, there's not much difference.
00:57:44.000 And that these reward systems are all – we interact with all of them, whether it's ego, lust, passion, fury, anger, jealousy, all these different things.
00:57:59.000 And you think we'll be – some people will upload and edit those out.
00:58:02.000 Yes.
00:58:03.000 Yeah.
00:58:03.000 I think that our concern with losing this aspect of what it means to be a person – Like the idea that we should always have conflict and struggle because conflict and struggle is how we facilitate progress, which is true, right?
00:58:18.000 And combating evil is how the good gets greater and stronger if the good wins.
00:58:23.000 But my concern is that that is all predicated on the idea that the biological system that we have right now is correct and optimal.
00:58:36.000 And I think one of the things that we're dealing with, with the heightened states of depression and anxiety and the lack of meaning and existential angst that people experience, a lot of that is because the biological reality of being a human animal doesn't really integrate that well with this world that we've created.
00:58:58.000 That's for sure.
00:58:59.000 Yeah.
00:59:00.000 And I wonder if the solution to that is not to find ways to find meaning with the biological vessel that you've been given, but rather to engineer those aspects that are problematic out of the system.
00:59:21.000 To create a truly enlightened being.
00:59:24.000 Like, one of the things, if you ask someone today, what are the odds that in three years there will be no war in the world?
00:59:30.000 That's zero.
00:59:31.000 Like, nobody thinks.
00:59:33.000 There's never been a time in human history where we haven't had war.
00:59:36.000 If you had to say, what is our number one problem as a species?
00:59:41.000 I would say our number one problem is war.
00:59:44.000 Our number one problem is this idea that it's okay to send massive groups of people who don't know each other to go murder massive groups of people that are somehow opposed because of the government, because of lines in the sand and territory.
00:59:58.000 That's clearly an insane thing.
00:59:59.000 It's an insane thing.
01:00:01.000 How do you get rid of that?
01:00:02.000 Well, one of the ways you get rid of that is to completely engineer out all the human reward systems that pertain to the acquisition of resources.
01:00:12.000 So what's left at that point?
01:00:13.000 Well, we're a new thing.
01:00:15.000 I think we've become a new thing.
01:00:17.000 And what does that thing want?
01:00:18.000 I think that new thing would probably want to interact with other new things that are even more advanced than it.
01:00:25.000 I do believe that scientific curiosity can drive quite...
01:00:32.000 That can be a great frontier for a long time.
01:00:37.000 Yeah.
01:00:38.000 I think it can be a great frontier for a long time as well.
01:00:40.000 I just wonder if what we're seeing with the drop in testosterone, because of microplastics, which sort of just snuck up on us.
01:00:49.000 We didn't even know that it was an issue until people started studying it.
01:00:52.000 How certain is that at this point, that that's what's happening?
01:00:54.000 I don't know.
01:00:55.000 I'm going to go study after this.
01:00:56.000 It's a very good question.
01:00:58.000 Dr. Shanna Swan believes that it's the primary driving factor of the sort of drop in testosterone and all miscarriage issues.
01:01:07.000 Low birth weights.
01:01:08.000 All those things seem to be a direct factor environmentally.
01:01:13.000 I'm sure there's other factors too.
01:01:16.000 The drop in testosterone, it's been shown that you can increase males' testosterone through resistance training and through making...
01:01:24.000 There's certain things you can do.
01:01:26.000 One of the big ones they found through a study in Japan is cold water immersion.
01:01:31.000 Before exercise, it radically increases testosterone.
01:01:35.000 So cold water immersion and then exercise post that.
01:01:39.000 I wonder why.
01:01:40.000 Yeah, I don't know.
01:01:41.000 Let's see who can find that.
01:01:42.000 But it's a fascinating field of study. I think it has something to do with resilience and resistance, and the fact that your body has to combat this external factor that's very extreme, which causes the body to go into this state of preservation and the implementation of cold shock proteins and the reduction of inflammation,
01:02:05.000 which also enhances the body's endocrine system.
01:02:08.000 But then on top of that, this imperative that you have to become more resilient to survive this external factor that you've introduced into your life every single day.
01:02:19.000 So there's ways, obviously, that you can make a human being more robust.
01:02:26.000 You know, we know that we can do that through strength training and that all that stuff actually does raise testosterone.
01:02:31.000 Your diet can raise testosterone, and a poor diet will lower it and will hinder your endocrine system, will hinder your ability to produce growth hormone, melatonin, all these different factors.
01:02:45.000 That seems to be something that we can fix, or at least mitigate, with decisions and choices and effort.
01:02:54.000 But the fact that these petrochemical – like there's a graph that Dr. Shanna Swan has in her book that shows during the 1950s when they start using petrochemical products in everything, microwave, plastic, saran wrap,
01:03:10.000 all this different stuff.
01:03:11.000 There's a direct correlation between the implementation and the dip and it all seems to line up.
01:03:19.000 Like that seems to be a primary factor.
01:03:23.000 Does that have an equivalent impact on estrogen-related hormones?
01:03:28.000 That's a good question.
01:03:30.000 Some of them actually...
01:03:31.000 I know some of these chemicals that they're talking about actually increase estrogen in men.
01:03:38.000 I don't know.
01:03:39.000 But I do know that it increases miscarriages.
01:03:42.000 So I just think it's overall disruptive to the human body.
01:03:46.000 Definitely a society-wide disruption of the endocrine system in a short period of time.
01:03:51.000 Seems like just a bad thing, and difficult to wrap our heads around.
01:03:55.000 And then pollutants and environmental toxins on top of the pesticides and herbicides and all these other things and microplastics.
01:04:03.000 There's a lot of factors that are leading our systems to not work well.
01:04:08.000 But I just really wonder if this – like are we just clinging on to this monkey body?
01:04:16.000 Are we deciding?
01:04:17.000 I like my monkey body.
01:04:18.000 I do too.
01:04:19.000 Listen, I love it.
01:04:21.000 But I'm also – I try to be very objective.
01:04:24.000 And when I objectively look at it in terms of like if you take where we are now and all of our problems and you look towards the future and like – What would be one way that you could mitigate a lot of these?
01:04:38.000 And it would be the implementation of some sort of a telepathic technology where, you know, you couldn't just text someone or tweet at someone something mean, because you would literally feel what they feel when you put that energy out there, and you would be repulsed.
01:04:55.000 Yeah.
01:04:56.000 And then violence would be if you were committing violence on someone and you literally felt the reaction of that violence in your own being.
01:05:08.000 You would also have no motivation for violence.
01:05:12.000 If we had no aggressive tendencies, no primal chimpanzee tendencies.
01:05:17.000 You know, it's true that violence in the world has obviously gone down a lot over the decades, but emotional violence is up a lot, and the internet has been horrible for that.
01:05:28.000 Yeah.
01:05:28.000 Like, I don't walk...
01:05:29.000 I'm not going to walk over there and punch you because you look like a big, strong guy.
01:05:32.000 You're going to punch me back.
01:05:32.000 And also, there's a societal convention not to do that.
01:05:35.000 But if I didn't know you, I might, like, send a mean tweet about you.
01:05:39.000 And I feel nothing on that.
01:05:41.000 Yeah.
01:05:42.000 And clearly, that has become, like, a mega epidemic in society that we did not evolve the biological constraints on somehow.
01:05:55.000 Yeah.
01:05:56.000 And...
01:05:58.000 I'm actually very worried about how much that's already destabilized us and made us all miserable.
01:06:04.000 It's certainly accentuated it.
01:06:07.000 It's exacerbated all of our problems.
01:06:09.000 I mean, if you read Jonathan Haidt's book, The Coddling of the American Mind, have you read it?
01:06:13.000 Great book.
01:06:14.000 Yeah, it's a great book.
01:06:14.000 And it's very damaging to women, particularly young girls.
01:06:18.000 Young girls growing up, there's a direct correlation between the invention of social media, the introduction to the iPhone, self-harm, suicide, online bullying.
01:06:29.000 People have always talked shit about people when no one's around.
01:06:32.000 The fact that they're doing it now openly to harm people.
01:06:37.000 Horrible, obviously.
01:06:38.000 I think it's super damaging to men, too.
01:06:40.000 Maybe they just talk about it less, but I don't think any of us are set up for this.
01:06:44.000 No, no one's set up for it.
01:06:46.000 And I think famous people know that more than anyone.
01:06:49.000 We all get used to it.
01:06:50.000 Yeah, you just get numb to it.
01:06:52.000 Or if you're wise, you don't engage.
01:06:54.000 I don't even have any apps on my new phone.
01:06:58.000 I've got a new phone and I decided, okay, nothing.
01:07:01.000 That's really smart.
01:07:01.000 No Twitter.
01:07:02.000 So I have a separate phone that if I have to post something, I pick up.
01:07:07.000 But all I get on my new phone is text messages.
01:07:10.000 And is that more just to keep your mind pure and unpolluted?
01:07:14.000 And not tempt myself.
01:07:16.000 Do you know how many fucking times I've got up to go to the bathroom first thing in the morning and spent an hour just sitting on the toilet scrolling through Instagram?
01:07:25.000 Like for nothing.
01:07:26.000 It does zero for me.
01:07:27.000 And there's this thought that I'm going to get something out of it.
01:07:31.000 I was thinking actually just yesterday about how, you know, we all have talked for so long about these algorithmic feeds are going to manipulate us in these big ways and that will happen.
01:07:44.000 But in the small ways already where like scrolling Instagram is not even that fulfilling, like you finish that hour and you're like, I know that was a waste of my time.
01:07:55.000 But it was like over the threshold where you couldn't quite...
01:07:57.000 It's hard to put the phone down.
01:07:59.000 Right.
01:07:59.000 You're just hoping that the next one's going to be interesting.
01:08:02.000 And every now and then, the problem is every like 30th or 40th reel that I click on is wild.
01:08:09.000 I wonder, by the way, if that's more powerful than if every one was wild, if every one was great.
01:08:15.000 Sure.
01:08:16.000 You know, it's like the slot machine effect.
01:08:17.000 You have to mine for gold.
01:08:18.000 Yeah.
01:08:18.000 You don't just go out and pick it like daisies when it's out in the field everywhere.
01:08:21.000 If the algorithm is like intentionally feeding you some shit along the way...
01:08:25.000 Yeah.
01:08:26.000 Well, there's just a lot of shit out there, unfortunately.
01:08:30.000 But it's just in terms of, you know, I was talking to Sean O'Malley, who's this UFC fighter, who's, you know, obviously has a very strong mind, really interesting guy.
01:08:39.000 But one of the things that Sean said is, like, I get this, like, low-level anxiety from scrolling through things, and I don't know why.
01:08:46.000 Like, what is that?
01:08:47.000 And I think it's part of the logical mind realizing this is a massive waste of your resources.
01:08:54.000 I also deleted a bunch of that stuff off my phone because I just didn't have the self-control.
01:08:59.000 I mean, I had the self-control to delete it, but like not to stop once I was scrolling through.
01:09:03.000 And so I think we're just like, yeah, we're getting attention hacked in some ways.
01:09:13.000 There's some good to it too, but we don't yet have the stuff in place, the tools, the societal norms, whatever, to modulate it well.
01:09:22.000 Right.
01:09:23.000 And we're not designed for it.
01:09:24.000 This is a completely new technology that, again, hijacks our human reward systems and hijacks all of the checks and balances that are in place for communication, which historically has been one-on-one.
01:09:41.000 Historically, communication has been one person to another.
01:09:44.000 And when people write letters to each other, it's generally things like if someone writes a love letter or, you know, they miss you.
01:09:52.000 They're writing this thing where they're kind of exposing a thing that maybe they have a difficulty in expressing in front of you.
01:09:59.000 And it was, you know, generally, unless the person was a psycho, they're not hateful letters.
01:10:05.000 Whereas the ability to just communicate, fuck that guy, I hope he gets hit by a bus, is so simple and easy, and you don't experience...
01:10:15.000 Twitter seems to be particularly horrible for this, as the mechanics work.
01:10:21.000 It really rewards in ways that I don't think anybody fully understands, that taps into something about human psychology.
01:10:29.000 But that's kind of like...
01:10:32.000 That's how you get engagement.
01:10:33.000 That's how you get followers.
01:10:36.000 That's how you get the dopamine hits or whatever.
01:10:45.000 The people who I know that spend all day on Twitter, more of them are unhappy about it than happy.
01:10:50.000 Oh, yeah.
01:10:51.000 They're the most unhappy.
01:10:52.000 I mean, there's quite a few people that I follow that I only follow because they're crazy.
01:10:57.000 And then I'll go and check in on them and see what the fuck they're tweeting about.
01:11:01.000 And some of them are on there 8-10 hours a day.
01:11:04.000 I'll see tweets all day long.
01:11:07.000 And I know that person cannot be happy.
01:11:10.000 They're unhappy and they cannot stop.
01:11:11.000 They can't stop.
01:11:12.000 And it seems like...
01:11:15.000 It's their life.
01:11:18.000 And they get meaning out of it in terms of reinforcement.
01:11:24.000 They get short-term meaning out of it.
01:11:25.000 I think maybe each day you go to bed feeling like you accomplished something and got your dopamine and at the end of each decade you probably are like, where'd that decade go?
01:11:33.000 I was talking to a friend of mine who was having a real problem with it.
01:11:35.000 He's saying he would be literally walking down the street and he'd have to check his phone to see who's replying.
01:11:40.000 He wasn't even looking where he was walking.
01:11:42.000 He was just like caught up in the anxiety of these exchanges.
01:11:46.000 And it's not because of the nice things people say.
01:11:48.000 No, no, no, no.
01:11:49.000 It's all...
01:11:50.000 And with him, he was recognizing that he was dunking on people and then seeing people respond to the dunking and...
01:11:58.000 Yeah.
01:11:58.000 I stopped doing that a long time ago.
01:12:00.000 I stopped interacting with people on Twitter in a negative way.
01:12:02.000 I just won't do it.
01:12:04.000 Even if I disagree with someone, I'll say something as peacefully as possible.
01:12:08.000 I have more of an internet troll streak than I would like to admit.
01:12:12.000 And so I try to just not give myself too much of the temptation, but I slip up sometimes.
01:12:16.000 Yeah.
01:12:17.000 It's so tempting.
01:12:18.000 Totally.
01:12:19.000 It's so tempting, and it's fun.
01:12:20.000 It's fun to say something shitty.
01:12:23.000 I mean, again, whatever this biological system we were talking about earlier, that gets a positive reward in the moment.
01:12:29.000 You know there's reactions.
01:12:32.000 You say something outrageous, and someone's going to react, and that reaction is like energy.
01:12:37.000 And there's all these other human beings engaging with your idea.
01:12:41.000 But ultimately, it's just not productive for most people, and it's psychologically...
01:12:50.000 It's just fraught with peril.
01:12:53.000 There's just so much going on.
01:12:54.000 I don't know anybody who engages all day long that's happy.
01:12:59.000 Certainly not.
01:13:00.000 I don't...
01:13:03.000 I think I've watched it, like, destroy is too strong of a word, but, like, knock off track the careers or lives or happiness or human relationships of people that are, like, good, smart, conscientious people that just, like, couldn't fight this demon because it,
like, hacked them.
01:13:21.000 And COVID really accentuated that because people were alone and isolated.
01:13:25.000 And that made it even worse because then they felt...
01:13:30.000 They felt even better saying shitty things to people.
01:13:33.000 I'm unhappy.
01:13:34.000 Yeah.
01:13:35.000 Even worse things about you.
01:13:36.000 And then there was a psychological aspect of it, like the angst that came from being socially isolated and terrified about this invisible disease that's going to kill us all.
01:13:46.000 And, you know, so you have this like, and then you're interacting with people on Twitter, and then you're caught up in that anxiety, and you're doing it all day.
01:13:54.000 And I know quite a few people, especially comedians, that really lost their minds and lost the respect of their peers by doing that.
01:14:02.000 I have a lot of sympathy for people who lost their minds during COVID, because what a natural thing for us all to go through.
01:14:09.000 The isolation was just brutal.
01:14:11.000 But a lot of people did.
01:14:13.000 And I don't think the internet, and particularly not the kind of social dynamics of things like Twitter, I don't think that brought out anyone's best.
01:14:21.000 No.
01:14:22.000 Well, I mean, some people, I think if they're not inclined to be shitty to people, I think some people did seek comfort and they did interact with people in positive ways.
01:14:35.000 I see there's plenty of positive...
01:14:36.000 I think the thing is that the negative interactions are so much more impactful.
01:14:41.000 Yeah.
01:14:42.000 Look, I think there are a lot of people who use these systems for wonderful things.
01:14:46.000 I didn't mean to imply that's not the case, but that's not what drives...
01:14:51.000 People's emotions after getting off the platform at the end of the day.
01:14:54.000 Right, right.
01:14:54.000 And it's also probably not...
01:14:56.000 If you looked at a pie chart of the amount of interactions on Twitter, I would say a lot of them are shitting on people and being angry about things.
01:15:05.000 How many of the people that you know that use Twitter those 8 or 10 hours a day are just saying wonderful things about other people all day versus the virulent...
01:15:13.000 Very few.
01:15:14.000 Yeah.
01:15:15.000 Very few.
01:15:16.000 I don't know any of them.
01:15:17.000 I know...
01:15:19.000 But then again, I wonder, with the implementation of some new technology that makes communication a very different thing than what we're currently...
01:15:29.000 Like, what we're doing now with communication is less immersive than communicating one-on-one.
01:15:35.000 You and I are talking, we're looking into each other's eyes, we're getting social cues, we're smiling at each other, we're laughing.
01:15:42.000 It's a very natural way to talk.
01:15:45.000 I wonder if through the implementation of technology, if it becomes even more immersive than a one-on-one conversation, even more interactive, and you will understand even more about the way a person feels about what you say.
01:16:03.000 Yeah.
01:16:03.000 About that person's memory, that person's life, that person's history, their education, how it comes out of their mind, how their mind interacts with your mind, and you see them.
01:16:18.000 You really see them.
01:16:20.000 I wonder if that...
01:16:21.000 I wonder if what we're experiencing now...
01:16:24.000 It's just like the first time people invented guns, they just started shooting at things, you know?
01:16:29.000 Yeah.
01:16:29.000 If you can like feel what I feel when you say something mean to me or nice to me, like that's clearly going to change what you decide to say.
01:16:39.000 Yes.
01:16:40.000 Yeah.
01:16:40.000 Yeah.
01:16:41.000 Unless you're a psycho.
01:16:42.000 Unless you're a psycho.
01:16:43.000 And then what causes someone to be a psycho and can that be engineered out?
01:16:49.000 Imagine what we're talking about.
01:16:52.000 When we're dealing with the human mind, we're dealing with various diseases, bipolar, schizophrenia.
01:16:58.000 Imagine a world where we can find the root cause of those things and through coding and some sort of an implementation of technology that elevates dopamine and serotonin and does some things to people that eliminates all of those problems.
01:17:21.000 And allows people to communicate in a very pure way.
01:17:26.000 It sounds great.
01:17:27.000 It sounds great, but you're not going to have any rock and roll.
01:17:30.000 Stand-up comedy will die.
01:17:33.000 You'll have no violent movies.
01:17:36.000 There's a lot of things that are going to go out the window.
01:17:38.000 But maybe that is also part of the process of our evolution to the next stage of existence.
01:17:46.000 Maybe.
01:17:46.000 I feel genuinely confused on this.
01:17:49.000 Well, I think you should be.
01:17:51.000 I mean, to be— We're going to find out.
01:17:52.000 Yeah.
01:17:52.000 I mean, to be sure how it's going to— That's insane.
01:17:55.000 But I don't even have, like— Hubris beyond belief.
01:17:57.000 Right.
01:17:58.000 I mean, you just—you, from the—when did OpenAI—when did you first start this project?
01:18:05.000 End of 2015, early 2016. And when you initially started this project, what kind of timeline did you have in mind and has it stayed on that timeline or is it just wildly out of control?
01:18:22.000 I remember talking with John Schulman, one of our co-founders, early on, and he was like, yeah, I think it's going to be about a 15-year project.
01:18:30.000 And I was like, yeah, that sounds about right to me.
01:18:33.000 And I've always sort of thought since then, now, I no longer think of like AGI as quite the end point, but to get to the point where we like accomplish the thing we set out to accomplish, you know, that would take us to like 2030, 2031. That has felt to me like all the way through kind of a reasonable estimate with huge error bars.
01:18:57.000 And I kind of think we're on the trajectory I sort of would have assumed.
01:19:02.000 And what did you think the impact...
01:19:07.000 On society would be? Like, did you, when you first started doing this, say, okay, if we are successful and we do create some massively advanced AGI, what is the implementation?
01:19:21.000 And what is the impact on society?
01:19:24.000 Did you sit there and have like a graph, like you had the pros on one side, the cons on the other?
01:19:31.000 Did you just sort of abstractly consider?
01:19:34.000 Well, we definitely talked a lot about the cons.
01:19:39.000 Many of us were super worried, and still are, about safety and alignment.
01:19:47.000 And if we build these systems, we can all see the great future.
01:19:50.000 That's easy to imagine.
01:19:51.000 But if something goes horribly wrong, it's like really horribly wrong.
01:19:55.000 And so there was a lot of discussion about and really a big part of the founding spirit of this is like, how are we going to solve this safety problem?
01:20:04.000 What does that even mean?
01:20:06.000 One of the things that we believe is that the greatest minds in the world cannot sit there and solve that in a vacuum.
01:20:12.000 You've got to have contact with reality.
01:20:14.000 You've got to see where the technology goes.
01:20:16.000 Practice plays out in a stranger way than theory.
01:20:20.000 And that's certainly proven true for us.
01:20:22.000 But we had a long list of...
01:20:24.000 Well, I don't know if we had a long list of cons.
01:20:27.000 We had a very intense list of cons.
01:20:29.000 Because, you know, there's like all of the last decades of sci-fi telling you about...
01:20:34.000 How this goes wrong and why you're supposed to shoot me right now.
01:20:37.000 I'm sure you've seen the John Connor ChatGPT. I haven't.
01:20:43.000 What is it?
01:20:45.000 It's like John Connor from The Terminator, the kid, looking at you when you open up ChatGPT. Yeah.
01:20:55.000 So that stuff we were like very clear in our minds on.
01:20:59.000 Now, I think we understand there's a lot of work to do, but we understand more about how to make AI safe in the world.
01:21:09.000 AI safety gets overloaded.
01:21:11.000 Like, you know, does it mean don't say something people find offensive?
01:21:13.000 Or does it mean don't destroy all of humanity or some continuum?
01:21:17.000 And I think the word is like gotten overloaded.
01:21:20.000 But in terms of the like, not destroy all of humanity version of it, we have a lot of work to do.
01:21:26.000 But I think we have finally more ideas about what can work.
01:21:30.000 And given the way the systems are going, we have a lot more opportunities available to us to solve it than I thought we would have given the direction that we initially thought the technology was going to go.
01:21:42.000 So that's good.
01:21:43.000 On the positive side, the thing that I was most excited about then and remain most excited about now is what if this system can dramatically increase the rate of scientific knowledge in society?
01:22:00.000 I think that kind of like all real sustainable economic growth, the future getting better, progress in some sense comes from increased scientific and technological capacity.
01:22:15.000 That's how we can solve all the problems.
01:22:17.000 And if the AI can help us do that, that's always been the thing I've been most excited about.
01:22:22.000 Well, it certainly seems like that is the greatest potential, greatest positive potential of AI. It is to solve a lot of the problems that human beings have had forever, a lot of the societal problems that seem to be—I mean,
01:22:38.000 that's what I was talking about in the AI president.
01:22:40.000 I'm kind of not joking because I feel like if something was hyper-intelligent and aware of all the variables with no human bias— And no incentives.
01:22:52.000 Other than, here's your program, the greater good for the community of the United States and the greater good for that community as it interacts with the rest of the world.
01:23:06.000 The elimination of these dictators, whether they're elected or non-elected, who impose their will on the population because they have a vested interest in protecting special interest groups and industry.
01:23:26.000 I think as long as...
01:23:29.000 The thing that I find scary when you say that is it feels like it's humanity not in control.
01:23:36.000 And I reflexively don't like that.
01:23:38.000 But if it's...
01:23:41.000 If it's instead like it is the collective will of humanity being expressed without the mistranslation and corrupting influences along the way, then I can see it.
01:23:51.000 Is that possible?
01:23:52.000 It seems like it would be.
01:23:54.000 It seems like if it was programmed in that regard to do the greater good for humanity.
01:24:00.000 And take into account the values of humanity, the needs of humanity.
01:24:05.000 There's something about the phrase, do the greater good for humanity.
01:24:08.000 I know, it's terrifying.
01:24:09.000 It's very Orwellian.
01:24:10.000 All of it is.
01:24:11.000 But also, so is artificial general intelligence.
01:24:14.000 For sure, for sure.
01:24:15.000 Open the door, Hal.
01:24:16.000 I wish I had worked on something that was less morally fraught.
01:24:20.000 But do you?
01:24:21.000 Because it's really exciting.
01:24:23.000 I mean, I cannot imagine a cooler thing to work on.
01:24:26.000 I feel unbelievably...
01:24:27.000 I feel like the luckiest person on earth.
01:24:29.000 That's awesome.
01:24:29.000 But it is not...
01:24:30.000 It's not on easy mode.
01:24:33.000 Let's say that.
01:24:34.000 Oh, yeah.
01:24:34.000 This is not life on easy mode.
01:24:35.000 No.
01:24:36.000 No, no, no.
01:24:37.000 I mean, you are at the forefront of one of the most spectacular changes in human history.
01:24:44.000 And I would say...
01:24:46.000 No.
01:24:47.000 I would say more spectacular than the implementation of the internet.
01:24:51.000 I think the implementation of the internet was the first baby steps of this.
01:24:57.000 And that artificial general intelligence is the internet on steroids.
01:25:03.000 It's the internet in hyperspace.
01:25:09.000 What I would say is it's the next step and there will be more steps after, but it's our most exciting step yet.
01:25:14.000 Yeah.
01:25:14.000 My wonder is what are those next steps after?
01:25:18.000 Isn't that so exciting to think about?
01:25:20.000 It's very exciting.
01:25:21.000 I think we're the last people.
01:25:23.000 I really do.
01:25:24.000 I think we're the last of the biological people with all the biological problems.
01:25:29.000 I think there's a very— And do you feel excited about that?
01:25:33.000 I just think that's just what it is.
01:25:36.000 You're just fine with it.
01:25:36.000 It is what it is, you know?
01:25:38.000 I mean, that— I don't think you can control it at this point other than some massive natural disaster that resets us back to the Stone Age, which is also something we should be very concerned with because it seems like that happens a lot.
01:25:51.000 We're not aware of it because the timeline of a human body is so small.
01:25:54.000 The timeline of human existence as a person is a hundred years if you're lucky, yet the timeline of the Earth is billions of years, and if you look at how many times life on Earth has been reset by comets slamming into the Earth and just completely eliminating all technological advancement.
01:26:15.000 It seems like it's happened multiple times in recorded history.
01:26:21.000 I always think we don't think about that quite enough.
01:26:29.000 We talked about the simulation hypothesis earlier.
01:26:31.000 It's had this big resurgence in the tech industry recently.
01:26:34.000 One of the new takes on it as we get closer to AGI is that if our descendants were simulating us, the time they'd want to simulate again and again is right up to the...
01:26:44.000 Right up to the creation of AGI. So it seems very crazy.
01:26:49.000 We're living through this time.
01:26:50.000 It's not a coincidence at all.
01:26:52.000 This is the time that is after we had enough cell phones out in the world recording tons of video to train the video model of the world that's all being jacked into us now via brain implants or whatever.
01:27:02.000 And before everything goes really crazy with AGI. And it's also this interesting time to simulate, like, can we get through?
01:27:10.000 Does the asteroid come right before we get there for dramatic tension?
01:27:13.000 Like, do we figure out how to make this safe?
01:27:14.000 Do we figure out how to societally agree on it?
01:27:16.000 So that's led to, like, a lot more people believing it than before, I think.
01:27:21.000 Yeah, for sure.
01:27:22.000 And...
01:27:23.000 Again, I think this is just where it's going.
01:27:26.000 I mean, I don't know if that's a good thing or a bad thing.
01:27:29.000 It's just a thing.
01:27:30.000 But it's certainly better to live now.
01:27:32.000 I would not want to live in the 1800s and be in a covered wagon trying to make my way across the country.
01:27:39.000 Yeah, we got the most exciting time in history yet.
01:27:41.000 It's the best!
01:27:42.000 It's the best, but it also has the most problems, the most social problems, the most awareness of social, environmental, infrastructure, the issues that we have.
01:27:54.000 We get to go solve them.
01:27:55.000 Yeah.
01:27:56.000 And I intuitively, I think I feel something somewhat different than you, which is I think humans in something close to this form are going to be around...
01:28:12.000 For a lot longer than...
01:28:14.000 I don't think we're the last humans.
01:28:16.000 How long do you think we have?
01:28:22.000 Like, longer than a time frame I can reason about.
01:28:25.000 Really?
01:28:25.000 Now, there may be, like...
01:28:27.000 I could totally imagine a world where some people decide to merge and go off exploring the universe with AI and there's a big universe out there, but, like...
01:28:37.000 Can I really imagine a world where, short of a natural disaster, there are not humans pretty similar to humans from today on Earth doing human-like things?
01:28:48.000 And the sort of spirit of humanity merged into these other things that are out there doing their thing in the universe?
01:28:56.000 It's very hard for me to actually see that happening.
01:29:01.000 And maybe that means I'm, like, going to turn out to be a dinosaur and Luddite and horribly wrong in this prediction.
01:29:06.000 But I would say I feel it more over time as we make progress with AI, not less.
01:29:10.000 Yeah, I don't feel that at all.
01:29:12.000 I feel like we're done.
01:29:13.000 In like a few years?
01:29:15.000 No, maybe a generation or two.
01:29:18.000 It'll probably be a gradual change, like wearing of clothes.
01:29:23.000 You know, I don't think everybody wore clothes the moment they invented clothes.
01:29:26.000 I think it probably took a while.
01:29:27.000 When someone figured out shoes, I think that probably took a while.
01:29:30.000 When they figured out structures, doors, houses, cities, agriculture, all those things were slowly implemented over time and then now become everywhere.
01:29:40.000 I think this is far more transformative.
01:29:44.000 And is part of that because you don't think there will be an option for some people not to merge?
01:29:48.000 Right.
01:29:48.000 Just like there's not an option for some people to not have telephones anymore.
01:29:53.000 I used to have friends like, I don't even have email.
01:29:55.000 Those people don't exist anymore.
01:29:57.000 They all have email.
01:29:58.000 Everyone has a phone, at least a flip phone.
01:30:00.000 I know some people that they just can't handle social media and all that jazz.
01:30:04.000 They went to a flip phone.
01:30:05.000 I don't know if this is true or not.
01:30:07.000 I've heard you can't walk into an AT&T store anymore and still buy a flip phone.
01:30:10.000 I heard that just changed.
01:30:11.000 You can?
01:30:11.000 Oh, really?
01:30:12.000 Someone told me this, but I don't know if it's true.
01:30:14.000 Verizon still has them.
01:30:15.000 I was just there.
01:30:16.000 They still have flip phones.
01:30:17.000 I was like, I like it.
01:30:18.000 I like this fucking little thing that you just call people.
01:30:21.000 And I always romanticize about going to that.
01:30:24.000 But my step was to go to a phone that has nothing on it but text messages.
01:30:29.000 And that's been a few days.
01:30:31.000 Feeling good so far?
01:30:32.000 Yeah, it's good.
01:30:33.000 I still have my other phone that I use for social media, but when I pick that motherfucker up, I start scrolling through YouTube and watching videos and scrolling through TikTok or Instagram.
01:30:45.000 I don't have TikTok, but I tried threads for a little while, but I'm like, oh, this is like a fucking ghost town.
01:30:52.000 So I went right back to X. I live on a ranch during the weekends and there's Wi-Fi in the house but there's no cell phone coverage anywhere else.
01:31:05.000 It's...
01:31:06.000 Every week, I forget how nice it is and what a change it is to go for a walk with no cell phone coverage.
01:31:14.000 It's good for your mind.
01:31:15.000 It's unbelievable for your mind.
01:31:17.000 And I think we have, like, so quickly lost something.
01:31:22.000 Like, out of service just doesn't happen.
01:31:24.000 It doesn't even happen on airplanes anymore.
01:31:26.000 But that...
01:31:31.000 Like, hours where your phone just cannot buzz.
01:31:34.000 Yeah.
01:31:35.000 No text message either.
01:31:36.000 Nothing.
01:31:39.000 I think that's a really healthy thing.
01:31:41.000 I dropped my phone once when I was in Lanai.
01:31:44.000 And I think it was the last time I dropped the phone.
01:31:46.000 The phone was like, we're done.
01:31:47.000 And it just started calling people randomly.
01:31:50.000 Like, it would just call people and I'd hang it up and call another person.
01:31:54.000 I'd hang it up.
01:31:54.000 And I was showing my wife.
01:31:55.000 I was like, look at this.
01:31:56.000 This is crazy.
01:31:56.000 It's just calling people.
01:31:58.000 And so the phone was broken and so I had to order a phone and we were on vacation for like eight days and it took three days for Apple to get me a phone.
01:32:06.000 I bet you had a great three days.
01:32:07.000 It was amazing.
01:32:09.000 It was amazing because when I was hanging out with my family I was totally present.
01:32:14.000 There was no options and I wasn't thinking about checking my phone because it didn't exist.
01:32:19.000 I didn't have one, and there was an alleviation of, again, what Sean was talking about, that low level of anxiety.
01:32:28.000 This sort of like...
01:32:29.000 That you have when you always want to check your phone.
01:32:34.000 Yeah, I think...
01:32:35.000 I think that thing...
01:32:36.000 It's so bad.
01:32:37.000 We have not figured out yet...
01:32:40.000 Like, the technology has moved so fast.
01:32:42.000 Biology moves very slowly.
01:32:44.000 We have not figured out how we're going to function in this society.
01:32:47.000 And get those occasional times when your phone is broken for three days.
01:32:51.000 Yeah.
01:32:52.000 Or you go for a walk with no service.
01:32:54.000 But it's like...
01:32:56.000 I very much feel like my phone controls me.
01:33:02.000 Not the other way around.
01:33:03.000 Uh-huh.
01:33:04.000 And I hate it.
01:33:06.000 But I haven't figured out what to do about it.
01:33:07.000 Well, that's what I'm worried about with future technology.
01:33:11.000 Is that this, which was so unanticipated...
01:33:15.000 If you'd imagine a world when you...
01:33:17.000 Can you imagine going up to someone in 1984 and pointing to a phone and saying, one day that'll be in your pocket and it's going to ruin your life?
01:33:24.000 Like, what?
01:33:26.000 Like, yeah, one day people are going to be jerking off to that thing.
01:33:29.000 You're like, what?
01:33:30.000 One day people are going to be watching people get murdered on Instagram.
01:33:33.000 I have seen so many murders on Instagram over the last few months.
01:33:36.000 Really?
01:33:36.000 I've never seen one.
01:33:37.000 I have a bad timeline.
01:33:38.000 Me and my friend Tom Segura.
01:33:41.000 Every morning we text each other the worst things that we find on Instagram.
01:33:44.000 Why?
01:33:45.000 For fun.
01:33:45.000 He's a comedian.
01:33:46.000 We're both comedians.
01:33:47.000 That's fun to you?
01:33:48.000 Yeah.
01:33:49.000 It's just fucking ridiculous.
01:33:51.000 I mean, just crazy.
01:33:53.000 Car accidents, people get gored by bulls, and we try to top each other.
01:33:59.000 So every day, he's sending me the most- Every day when I wake up and I- Tom, fuck, what do you got?
01:34:04.000 Can you explain what's fun about that?
01:34:07.000 Well, he's a comic and I'm a comic and comics like chaos.
01:34:12.000 We like ridiculous, outrageous shit that is just so far beyond the norm of what you experience in a regular day.
01:34:22.000 Got it.
01:34:23.000 And also the understanding of the wide spectrum of human behavior.
01:34:29.000 If you're a nice person and you surround yourself with nice people, you very rarely see someone get shot.
01:34:35.000 You very rarely see people get stabbed for no reason randomly on the street.
01:34:41.000 But on Instagram, you see that every day.
01:34:45.000 And there's something about that where it just reminds you, oh, we're crazy.
01:34:50.000 Like, the human species, like, there's a certain percentage of us that are just off the rails and just out there, just causing chaos and jumping dirt bikes and landing on your neck and, like, all that stuff is out there.
01:35:05.000 See, even to hear that makes me, like, physically, like, I know that happens, of course.
01:35:09.000 Mm-hmm.
01:35:11.000 And I know like not looking at it doesn't make it not happen.
01:35:14.000 Right.
01:35:15.000 But it makes me so uncomfortable, and I'm happy not to watch.
01:35:17.000 Oh yeah, it makes me uncomfortable too.
01:35:19.000 But yeah, we do it to each other every day.
01:35:23.000 And it's not good.
01:35:24.000 It's definitely not good.
01:35:25.000 But it's also, I'm not going to stop.
01:35:26.000 It's fun.
01:35:27.000 But why is it fun?
01:35:29.000 It's fun because it's my friend Tom and we're both kind of the same in that way.
01:35:32.000 Just look at that.
01:35:34.000 Look at this.
01:35:34.000 That I get.
01:35:35.000 Look at this.
01:35:35.000 And it's just a thing we started doing a few months ago.
01:35:38.000 Just like, can't stop.
01:35:40.000 And Instagram has learned that you do that, so it just keeps showing you more and more.
01:35:43.000 Instagram knows.
01:35:44.000 My search page is a mess.
01:35:47.000 When I go to the Discover page, oh, it's just crazy.
01:35:50.000 But the thing is, it shows up in your feed, too.
01:35:53.000 That's what I don't understand about the algorithm.
01:35:55.000 It knows you're fucked up, so it shows up in your feed of things.
01:35:59.000 They're not people I follow.
01:36:01.000 But Instagram shows them to me anyway.
01:36:04.000 I heard an interesting thing a few days ago about Instagram and the feed, which is if you use it at off hours, when they have more processing capability available because less people are using it, you get better recommendations.
01:36:16.000 So your feed will be better in the middle of the night.
01:36:18.000 What is better, though?
01:36:20.000 Doesn't your feed- More addictive to you or whatever.
01:36:22.000 Right.
01:36:23.000 So for me, better would be more murders, more animal attacks.
01:36:27.000 Sounds horrible.
01:36:27.000 It's horrible.
01:36:29.000 But it's just, it seems to know that's what I like.
01:36:34.000 It seems to know that that's what I interact with, so it's just sending me that most of the time.
01:36:40.000 Yeah.
01:36:41.000 That probably has all kinds of crazy psychological effects.
01:36:44.000 I'm sure.
01:36:44.000 Yeah, I'm sure that's also one of the reasons why I want to get rid of it and move away from it.
01:36:49.000 Yeah, so maybe it went too far.
01:36:52.000 I don't even know if it's too far.
01:36:54.000 But what it is, is it's showing me the darker regions of society, of civilization, of human behavior.
01:37:04.000 But you think we're about to edit all that out?
01:37:06.000 I wonder if that is a solution.
01:37:08.000 I really do.
01:37:10.000 Because I don't think it's outside the realm of possibility.
01:37:14.000 If we really truly can engineer... Like, one of the things about Neuralink that's really promising is people with spinal cord issues, injuries, people that can't move their body, and being able to hotwire that where essentially it controls all these parts of your body that you couldn't control anymore.
01:37:35.000 And so that would be an amazing thing for people that are injured, amazing thing for people that are, you know, they're paralyzed, they have all sorts of neurological conditions.
01:37:46.000 That is probably one of the first, and that's what Elon's talked about as well, one of the first implementations, the restoration of sight, you know, enhanced cognitive function for people that have brain issues.
01:38:01.000 That's tremendously exciting.
01:38:03.000 And like many other technologies, I don't think we can stop neural interfaces because of the great good that's going to happen along the way.
01:38:12.000 But I also don't think we know where it goes.
01:38:14.000 It's Pandora's box for sure.
01:38:16.000 And I think when we open it, we're not going to go back.
01:38:20.000 Just like we're not going to go back to no computers without some sort of natural disaster.
01:38:25.000 By the way, and I mean this as a great compliment, you are one of the most neutral people I have ever heard talk about the merge coming.
01:38:33.000 You're just like, yeah, I think it's going to happen.
01:38:35.000 You know, it's be good in these ways, bad in these ways, but you seem like unbelievably neutral about it, which is always something I admire.
01:38:43.000 I try to be as neutral about everything as possible except for corruption which I think is just like one of the most massive problems with the way our culture is governed.
01:38:56.000 I think corruption and the influence of money is just a giant terrible issue.
01:39:02.000 But in terms of like social issues and in terms of the way human beings believe and think about things, I try to be as neutral as possible.
01:39:12.000 Because I think the only way to really truly understand the way other people think about things is try to look at it through their mind.
01:39:19.000 And if you have this inherent bias and this...
01:39:24.000 You have this very rigid view of what's good and bad and right and wrong.
01:39:31.000 I don't think that serves you very well for understanding why people differ.
01:39:36.000 So I try to be as neutral and as objective as possible when I look at anything.
01:39:42.000 This is a skill that I've learned.
01:39:43.000 This is not something I had in 2009 when I started this podcast.
01:39:47.000 This podcast I started just fucking around with friends and I had no idea what it was.
01:39:51.000 I mean, there's no way I could have ever known.
01:39:53.000 But also I had no idea what it was going to do to me as far as the evolution of me as a human being.
01:40:00.000 I am so much nicer.
01:40:02.000 I'm so much more aware of things.
01:40:05.000 I'm so much more conscious of the way other people think and feel.
01:40:09.000 I'm just a totally different person than I was in 2009. Which is hard to recognize.
01:40:16.000 It's hard to believe.
01:40:17.000 That's really cool.
01:40:18.000 But it's just an inevitable consequence of this unexpected education that I've received.
01:40:25.000 Did the empathy kind of, like, come on linearly?
01:40:27.000 Yes.
01:40:28.000 That was not a...
01:40:29.000 No, it just came on recognizing that the negative interactions on social media that I was doing, they didn't help me.
01:40:40.000 They didn't help the person.
01:40:42.000 And then having compassion for this person that's fucked up or done something stupid, it's not good to just dunk on people.
01:40:48.000 There's no benefit there other than to give you some sort of social credit and get a bunch of likes.
01:40:54.000 It didn't make me feel good.
01:40:56.000 That's not good.
01:40:57.000 And also a lot of psychedelics.
01:40:59.000 A ton of psychedelic experiences from 2009 on, with every one a greater understanding of the impact.
01:41:06.000 I had one recently.
01:41:08.000 And when I had the one recently, the overwhelming message that I was getting through this was that everything I say and do ripples off into all the people that I interact with.
01:41:24.000 And then if I'm not doing something with at least the goal of overall good or overall understanding that I'm doing bad and that that bad is a real thing, as much as you try to ignore it because you don't interface with it instantly,
01:41:41.000 you're still creating unnecessary negativity.
01:41:48.000 And that I should avoid that as much as possible.
01:41:50.000 It was like an overwhelming message that this psychedelic experience was giving me.
01:41:58.000 And I took it because I was just particularly anxious that day about the state of the world, particularly anxious about Ukraine and Russia and China and the political system that we have in this country and this incredibly polarizing way that the left and the right engage with each other and God,
01:42:20.000 it just seems so just tormented.
01:42:25.000 And so I was just...
01:42:26.000 Some days I just get...
01:42:28.000 I think too much about it.
01:42:29.000 I'm like, I need something.
01:42:31.000 Crack me out of this.
01:42:32.000 So I took these psychedelics.
01:42:35.000 Are you surprised psychedelic therapy has not made...
01:42:38.000 From what you thought would happen in, say, the early 2010s until now, are you surprised it has not made more progress sort of on a path to legalization as a medical treatment?
01:42:47.000 No.
01:42:48.000 No, I'm not because there's a lot of people that don't want it to be in place and those people have tremendous power over our medical system and over our regulatory system.
01:42:58.000 And those people have also not experienced these psychedelics.
01:43:02.000 There's very few people that have experienced profound psychedelic experiences that don't think there's an overall good for those things.
01:43:10.000 So the problem is you're having these laws and these rules implemented by people who are completely ignorant about the positive effects of these things.
01:43:21.000 And if you know the history of psychedelic prohibition in this country, It all took place during 1970, and it was really to stop the civil rights movement, and it was really to stop the anti-war movement.
01:43:36.000 And they tried to find a way to make all these things that these people were doing that was causing them to think in these very different ways.
01:43:44.000 Tune in, turn on, drop out.
01:43:46.000 They just wanted to put a fucking halt to that.
01:43:49.000 What better way than to lock everyone who participates in that in cages?
01:43:53.000 Find the people who are producing it.
01:43:55.000 Lock them in cages.
01:43:56.000 Put them in jail for the rest of their life.
01:43:58.000 Make sure it's illegal.
01:43:59.000 Arrest people.
01:44:00.000 Put the busts on television.
01:44:01.000 Make sure that people are aware.
01:44:03.000 And then there's also you connect it to drugs that are inherently dangerous for society and detrimental.
01:44:09.000 The fentanyl crisis.
01:44:11.000 You know, the crack cocaine crisis that we experienced in the 90s, like all of those things, they're under the blanket of drugs.
01:44:18.000 Psychedelic drugs are also talked about like drugs, even though they have these profound spiritual and psychological changes. I remember when I was in elementary school and I was in, like, drug education,
01:44:34.000 they talked about, you know, marijuana is really bad because it's a gateway to these other things.
01:44:39.000 And there's this bad one, that bad one, heroin, whatever.
01:44:41.000 And the very end of the line, the worst possible thing is LSD. Did you take LSD and go, oh, they're lying?
01:44:49.000 Psychedelic therapy was definitely one of the most important things in my life.
01:44:55.000 And I assumed, given how powerful it was for me, I struggled with all kinds of anxiety and other negative things.
01:45:05.000 And to watch all of that go away...
01:45:08.000 Like that.
01:45:10.000 I traveled to another country for like a week, did a few things, came back.
01:45:15.000 Totally different person.
01:45:16.000 And I was like, I've been lied to my whole life.
01:45:18.000 I'm so grateful that this happened to me now.
01:45:21.000 Talked to a bunch of other people, all similar experiences.
01:45:24.000 I assumed, this was a while ago, and I was, like, you know, very interested in what was happening in the U.S. I was, like, particularly looking at where MDMA and psilocybin were on the path.
01:45:37.000 And I was like, all right, this is going to get through.
01:45:39.000 And this is going to change the mental health of a lot of people in a really positive way.
01:45:44.000 And I am surprised we have not made faster progress there, but I'm still optimistic we will.
01:45:48.000 Well, we have made so much progress from the time of the 1990s.
01:45:55.000 In the 1990s, you never heard about psychedelic retreats.
01:45:58.000 You never heard about people taking these vacations.
01:46:01.000 You never heard about people getting together in groups and doing these things and coming back with these profound experiences that they relayed to other people and literally seeing people change.
01:46:11.000 Seeing who they are change.
01:46:13.000 Seeing people become less selfish, less toxic, less mean, you know, and more empathetic and more understanding.
01:46:23.000 Totally.
01:46:24.000 Yeah.
01:46:25.000 I mean, I can only talk about it from a personal experience.
01:46:28.000 It's been a radical change in my life, as well as, again, having all these conversations with different people.
01:46:33.000 I feel so fortunate to be able to do this, that I've had so many different conversations with so many different people that think so differently, and so many exceptional people.
01:47:03.000 And there are tools that are in place.
01:47:06.000 But unfortunately, in this very prohibitive society, this society of prohibition, we're denied those.
01:47:15.000 And we're denied ones that have never killed anybody, which is really bizarre when OxyContin can still be prescribed.
01:47:23.000 What's the deal with why we can't make...
01:47:27.000 Like, leaving aside why we can't get these medicines that have transformed people's lives more available, what's the deal with why we can't stop the opioid crisis?
01:47:38.000 Or, like, fentanyl seems like just an unbelievable crisis for San Francisco.
01:47:43.000 You remember in the beginning of the conversation when you said that AI will do a lot of good, overall good, but also not no harm?
01:47:53.000 If we legalize drugs, all drugs, that would do the same thing.
01:47:59.000 Would you advocate to legalize all drugs?
01:48:01.000 It's a very complicated question because I think you're going to have a lot of addicts that wouldn't be addicts.
01:48:07.000 You're going to have a lot of people's lives destroyed because it's legal.
01:48:10.000 There's a lot of people that may not be psychologically capable of handling things.
01:48:15.000 That's the thing about psychedelics.
01:48:17.000 They do not ever recommend them for people that have a slippery grasp on reality as it is.
01:48:22.000 People that are struggling, people that are already on a bunch of medications that allow them to just keep a steady state of existence in the normal world.
01:48:32.000 If you just fucking bombard them with psilocybin, who knows what kind of an effect that's going to have and whether or not they're psychologically too fragile to recover from that.
01:48:43.000 I mean, there's many, many stories of people taking too much acid and never coming back.
01:48:49.000 Yeah.
01:48:49.000 Yeah, these are like...
01:48:55.000 Powerful doesn't seem to, like, begin to cover it.
01:48:57.000 Right.
01:48:57.000 Yeah.
01:48:58.000 But there's also, what is it about humans that are constantly looking to perturb their normal state of consciousness?
01:49:06.000 Constantly.
01:49:06.000 Whether it's, we're both drinking coffee, you know, people smoke cigarettes, they do all, they take Adderall, they do all sorts of different things to change and enhance their normal state of consciousness.
01:49:18.000 It seems like, whether it's meditation or yoga, they're always doing something to try to get out of their own way.
01:49:25.000 Or get in their own way.
01:49:26.000 Or distract themselves from the pain of existence.
01:49:30.000 And it seems like a normal part of humans.
01:49:33.000 And even monkeys, like vervet monkeys, get addicted to alcohol.
01:49:37.000 They get addicted to fermented fruits and alcohol.
01:49:39.000 And they become drunks and alcoholics.
01:49:43.000 What do you think is the deep lesson there?
01:49:45.000 Well, we're not happy, exactly.
01:49:48.000 And then some things can make you happy, sort of.
01:49:51.000 A couple of drinks makes you so happy for a little bit, until you're an alcoholic, until you destroy your liver, until you crash your car, until you're involved in some sort of a violent encounter that you would never be involved with if you weren't drunk.
01:50:07.000 You know, I love caffeine, which clearly is a drug.
01:50:12.000 Alcohol, like, I like, but I often am like, yeah, this is like, you know, this is like dulling me and I wish I hadn't had this drink.
01:50:21.000 And then other stuff, like, I mostly would choose to avoid.
01:50:27.000 But that's because you're smart.
01:50:29.000 And you're probably aware of the pros and cons and you're also probably aware of how it affects you and what's doing good for you and what is detrimental to you.
01:50:39.000 But that's a decision that you can make as an informed human being that you're not allowed to make if everything's illegal.
01:50:46.000 Right.
01:50:47.000 Yeah.
01:50:48.000 And also, when things are illegal, criminals sell those things, because you're not tempering the desire by making it illegal.
01:50:58.000 You're just making access to it much more complicated.
01:51:01.000 What I was going to say is if fentanyl is really great, I don't want to know.
01:51:05.000 Apparently it is.
01:51:06.000 Apparently it is.
01:51:07.000 Yeah.
01:51:08.000 Peter Berg was on the podcast and he produced that Painkiller documentary for Netflix, the docudrama about the Sackler family.
01:51:16.000 It's an amazing piece.
01:51:17.000 But he said that he took OxyContin once recreationally.
01:51:22.000 He was like, oh my god, it's amazing.
01:51:26.000 He's like, keep this away from me.
01:51:28.000 It feels so good.
01:51:29.000 And that's part of the problem.
01:51:31.000 Yeah, it will wreck your life.
01:51:33.000 Yeah, it will capture you.
01:51:35.000 But it's just so unbelievable.
01:51:36.000 But the feeling.
01:51:38.000 How did Lenny Bruce describe it?
01:51:39.000 I think he described heroin as getting a warm hug from God.
01:51:44.000 I think the feeling that it gives you is probably pretty spectacular.
01:51:49.000 I don't know if legalizing that is going to solve the problems, but I do know that another problem that we're not paying attention to is the rise of the cartels and the fact that right across our border, where you can walk, there are these enormous,
01:52:07.000 enormous organizations that make who knows how much money, untold, incalculable amounts of money, selling drugs and bringing them into this country.
01:52:17.000 And one of the things they do is they put fentanyl in everything to make things stronger.
01:52:21.000 And they do it for, like, street Xanax.
01:52:25.000 There's people that have overdosed thinking they're getting Xanax, and they fucking die from fentanyl.
01:52:28.000 Yeah, they do it with cocaine, of course.
01:52:32.000 They do it with everything.
01:52:33.000 There's so many things that have fentanyl in them and they're cut with fentanyl because fentanyl is cheap and insanely potent.
01:52:42.000 And that wouldn't be a problem if things were legal.
01:52:44.000 So would you net out towards saying, all right, let's just legalize it?
01:52:48.000 Yeah, I would net out towards that, but I would also put into place some serious mitigation efforts, like in terms of counseling, drug addiction, and ibogaine therapy, which is another thing that's illegal.
01:52:58.000 Someone was just telling me about how transformative this was for them.
01:53:01.000 Yeah, I haven't experienced that personally, but ibogaine has worked for many of my friends that have had pill problems. I have a friend, my friend Ed Clay, who started an ibogaine center in Mexico because he had an injury and he got hooked on pills and he couldn't kick it.
01:53:18.000 Did ibogaine, gone.
01:53:20.000 One time done.
01:53:21.000 One time done.
01:53:21.000 24 hour super uncomfortable experience.
01:53:24.000 It's supposed to be a horrible experience, right?
01:53:25.000 Yeah, it's supposed to be not very recreational.
01:53:28.000 Not exactly something you want to do on the weekend with friends.
01:53:31.000 It's something you do because you're fucked and you need to figure out how to get out of this fuckedness.
01:53:36.000 And that, like, we think about how much money is spent on rehabs in this country, and what's the relapse rate?
01:53:42.000 It's really high.
01:53:44.000 I mean, I have many friends that have been to rehab for drug and alcohol abuse, and quite a few of them went right back to it.
01:53:52.000 It doesn't seem to be that effective.
01:53:54.000 It seems to be effective to people when people have, like, they really hit rock bottom, and they have a strong will, and then they get involved in a program, some sort of a 12-step program, some sort of a narcotics anonymous program.
01:54:05.000 And then they get support from other people and they eventually build this foundation of other types of behaviors and ways to find other things to focus on to whatever aspect of their mind that allows them to be addicted to things.
01:54:21.000 Now it's focused on exercise, meditation, yoga, whatever it is.
01:54:25.000 That's your new addiction and it's a much more positive and beneficial addiction.
01:54:29.000 But for the reality of the physical addiction, there are mitigation efforts.
01:54:35.000 Like there's so many people that have taken psilocybin and completely quit drugs, completely quit cigarettes, completely quit a lot because they realize like, oh, this is what this is.
01:54:44.000 This is why I'm doing this.
01:54:47.000 Yeah, that's why I was more optimistic that the world would have made faster progress towards acceptance of – you hear so many stories like this.
01:54:56.000 So I would say, all right, clearly a lot of our existing mental health treatment at best doesn't work.
01:55:01.000 Clearly our addiction programs are ineffective.
01:55:05.000 If we have this thing that in every scientific study or most scientific studies we can see is delivering these unbelievable results, it's going to happen.
01:55:18.000 I still am excited for it.
01:55:20.000 I still think it will be a transformatively positive development, but...
01:55:24.000 It'll change politics.
01:55:26.000 It'll absolutely change the way we think of other human beings.
01:55:30.000 It'll absolutely change the way we think of society and culture as a whole.
01:55:33.000 It'll absolutely change the way people interact with each other if it becomes legalized.
01:55:39.000 And it's slowly becoming legalized.
01:55:40.000 Like, think of marijuana, which is like, you know, the gateway drug.
01:55:44.000 Marijuana is now legal recreationally in how many states?
01:55:49.000 23. And then medically in many more.
01:55:55.000 And, you know, it's really easy to get a license medically.
01:55:58.000 In California, I got one in 1996. I used to be able to just go somewhere and go, I got a headache.
01:56:06.000 That's it.
01:56:07.000 Yeah, I get headaches.
01:56:08.000 I'm in pain a lot, you know.
01:56:10.000 I do a lot of martial arts.
01:56:11.000 I'm always injured.
01:56:11.000 I need some medication.
01:56:12.000 I don't like taking pain pills.
01:56:14.000 Bam!
01:56:15.000 You got a legal prescription for weed.
01:56:16.000 I used to have to go to Inglewood to get it.
01:56:18.000 I used to have to go to the Hood, to the Inglewood Wellness Center.
01:56:22.000 And I was like, this is crazy.
01:56:24.000 Like, marijuana is now kind of sort of legal.
01:56:27.000 And then in 2016, it became legal in California, recreationally.
01:56:32.000 And it just changed everything.
01:56:34.000 I had all these people that were like right-wing people that were taking edibles to sleep.
01:56:41.000 Because now that it's legal, they thought about it in a different way.
01:56:44.000 And I think that drug, which is a fairly mild psychedelic, also has enhancing effects.
01:56:52.000 It makes people more compassionate.
01:56:54.000 It makes people more kind and friendly.
01:56:57.000 It's sort of the opposite of a drug that enhances violence.
01:57:01.000 It doesn't enhance violence at all.
01:57:03.000 Alcohol does that.
01:57:04.000 Cocaine does that.
01:57:05.000 To say a thing that'll make me very unpopular, I hate marijuana.
01:57:08.000 It does not sit well with me at all.
01:57:10.000 What does it do to you that you don't like?
01:57:11.000 It makes me tired and slow for a long time after it.
01:57:15.000 I think also there's biological variabilities, right?
01:57:18.000 Like some people, like my wife, she does not do well with alcohol.
01:57:22.000 She can get drunk off one drink.
01:57:24.000 But it's biological.
01:57:25.000 She got some sort of a genetic...
01:57:27.000 I forget what it's called.
01:57:28.000 Something about bilirubin.
01:57:30.000 Something that her body just doesn't process alcohol very well.
01:57:33.000 So she's a cheap date.
01:57:34.000 Oh, all I meant is that genetically, I got whatever the mutation is that makes it an unpleasant experience.
01:57:39.000 Yeah.
01:57:40.000 But what I was saying is, for me, that's the opposite.
01:57:41.000 Alcohol doesn't bother me at all.
01:57:43.000 I could drink three, four drinks, and I'm sober in 20 minutes.
01:57:47.000 And my body just...
01:57:48.000 My liver just is like a blast furnace.
01:57:50.000 Just goes right through it.
01:57:51.000 I can sober up real quick.
01:57:52.000 But...
01:57:53.000 I also don't need it.
01:57:55.000 Like, I'm doing Sober October for the whole month.
01:57:58.000 Feels good?
01:57:58.000 Yeah.
01:57:59.000 Did shows last night.
01:58:00.000 Great.
01:58:00.000 No problems.
01:58:01.000 Not having alcohol doesn't seem to bother me at all.
01:58:05.000 But I do like it.
01:58:06.000 I do like a glass of wine.
01:58:08.000 It's a nice thing at the end of the day.
01:58:09.000 I like it.
01:58:11.000 Speaking of that, and psychedelics in general, many cultures have had a place for some sort of psychedelic time in someone's life or rite of passage.
01:58:21.000 But as far as I can tell, most of them are under...
01:58:25.000 There's some sort of ritual about it.
01:58:28.000 And I do worry that...
01:58:30.000 And I think these things are so powerful that I worry about them just being kind of...
01:58:37.000 Used all over the place all the time.
01:58:39.000 And I hope that we as a society, because I think this is going to happen even if it's slow, find a way to treat this with the respect that it needs.
01:58:51.000 We'll see how that goes.
01:58:52.000 Agreed.
01:58:53.000 Yeah, I think set and setting is very important.
01:58:57.000 And thinking about what you're doing before you're doing it and why you're doing it.
01:59:02.000 Like I was saying the other night when I had this psychedelic experience, I was just like, God, sometimes I just think too much about the world and that it's so fucked.
01:59:11.000 And you have kids and you wonder, like, what kind of a world are they going to grow up in?
01:59:15.000 And it was just one of those days where I was just like, God, there's so much anger and there's so much this and that.
01:59:21.000 And then it's just...
01:59:22.000 It took it away from me.
01:59:23.000 The rest of the day, like, that night I was so friendly and so happy, and I just wanted to hug everybody.
01:59:29.000 I just really...
01:59:30.000 I got it.
01:59:31.000 I go, oh my god.
01:59:32.000 That is not...
01:59:32.000 I was thinking about it wrong.
01:59:34.000 Do you think the anger in the world is way higher than it used to be?
01:59:37.000 Or we just...
01:59:39.000 It's like all these dynamics from social media we were talking about earlier.
01:59:42.000 I think the dynamics in social media certainly exacerbated anger in some people.
01:59:46.000 But I think anger in the world is just a part of...
01:59:51.000 Frustration, inequality, problems that are so clear but are not solved and all the issues that people have.
02:00:00.000 I mean, it's not a coincidence that a lot of the mass violence that you're seeing in this country, mass looting and all these different things are being done by poor people.
02:00:09.000 Do you think AGI will be an equalizing force for the world or further inequality?
02:00:13.000 I think it depends on how it's implemented.
02:00:16.000 My concern is, again, what we were talking about before with some sort of a neural interface, that it will increase your ability to be productive to a point where you can control resources so much more than anyone else.
02:00:30.000 And you will be able to advance your economic portfolio and your influence on the world through that, your amount of power that you can acquire.
02:00:40.000 It will, before the other people can get involved, because I would imagine financially it'll be like cell phones.
02:00:48.000 In the beginning, you remember when the movie Wall Street, when he had that big brick cell phone?
02:00:54.000 It's like, look at him.
02:00:55.000 He's out there on the beach with a phone.
02:00:57.000 That was crazy.
02:00:58.000 No one had one of those things back then.
02:00:59.000 And they were so rare.
02:01:01.000 I got one in 1994 when I first moved to California and I thought I was living in the fucking future.
02:01:06.000 Giant thing?
02:01:06.000 No, it was a Motorola StarTAC.
02:01:08.000 That was a cool phone.
02:01:09.000 I actually had one in my car in 1988. I was one of the first people to get a cell phone.
02:01:14.000 I got one in my car.
02:01:16.000 And it was great because my friend Bill Blumenreich, who runs the Comedy Connection, would call me because he knew he could get a hold of me. If someone got sick or fell out, I could get a gig because he could call me. So I'd be in my car and it's like, Joe,
02:01:31.000 What are you doing?
02:01:31.000 Do you have a spot tonight?
02:01:33.000 And I'm like, no, I'm open.
02:01:34.000 He's like, fantastic, and so he'd give me gigs. So I got a bunch of gigs through this phone; the work kind of paid for itself. But I got it just because it was cool. Like, I could drive down the street and call people. Dude, I'm driving and I'm calling you. It was nuts to be able to drive,
02:01:50.000 and I had a little antenna, a little squirrely pigtail antenna on my car, on the roof of the car.
02:01:55.000 But now everyone has one.
02:02:00.000 You can go to a third world country, and people in small villages have phones.
02:02:05.000 It's super common.
02:02:07.000 It's everywhere.
02:02:08.000 Essentially, more people have phones than don't have phones.
02:02:11.000 There's more phones than there are human beings.
02:02:13.000 Which is pretty fucking wild.
02:02:15.000 And I think that that initial cost problem, it's going to be prohibitively expensive initially.
02:02:24.000 And the problem is the wealthy people are going to be able to do that.
02:02:27.000 And then the real crazy ones that wind up getting the holes drilled in their head.
02:02:32.000 And if that stuff is effective, maybe there's problems with generation one, but generation two is better.
02:02:38.000 There's going to be a time where you have to enter into the game.
02:02:42.000 There's going to be a time where you have to sell your stocks.
02:02:44.000 Don't wait too long.
02:02:46.000 Hang on there.
02:02:47.000 Go.
02:02:48.000 And once that happens, my concern is that the people that have that will have...
02:02:54.000 such a massive advantage over everyone else that the gap between the haves and have-nots will be even wider and it'll be more polarizing.
02:03:04.000 This is something I've changed my mind on.
02:03:07.000 Someone at OpenAI said to me a few years ago, you really can't just let some people merge without a plan, because it could be such an incredible distortion of power.
02:03:19.000 And we're going to have to have some sort of societal discussion about this.
02:03:23.000 Yeah.
02:03:24.000 That seems real.
02:03:26.000 I think so at this point.
02:03:28.000 Especially if it's as effective as AGI is.
02:03:33.000 Or as, excuse me, ChatGPT is.
02:03:36.000 ChatGPT is so amazing.
02:03:39.000 When you enter into it information, you ask it questions, and it can give you answers, and you could ask it to code a website for you, and it does it instantly, and it solves problems that literally you would have to take decades to try to solve.
02:03:51.000 And it gets to it right away.
02:03:53.000 This is the dumbest it will ever be?
02:03:55.000 Yeah, that's what's crazy.
02:03:57.000 That's what's crazy.
02:03:58.000 So imagine something like that, but even more advanced.
02:04:02.000 Multiple stages of improvement and innovation forward.
02:04:07.000 And then it interfaces directly with the mind.
02:04:10.000 But it only does it with the people that can afford it.
02:04:13.000 And those people are just regular humans.
02:04:15.000 So they haven't been enhanced.
02:04:19.000 We haven't evolved physically.
02:04:22.000 We still have all the human reward systems in place.
02:04:25.000 We're still basically these territorial primates.
02:04:29.000 And now we have, you know, you just imagine some fucking psychotic billionaire who now gets this implant and decides to just completely hijack our financial systems, acquire all the resources,
02:04:44.000 set into place regulations and influences that only benefit them, and then make sure that they can control it from there on out.
02:04:51.000 How much do you think this actually, though, even requires, like, a physical implant?
02:04:58.000 Some people have access to GPT-7 and can spend a lot on the inference compute for it, and some don't.
02:05:05.000 I think that's going to be very transformative too.
02:05:08.000 But my thought is that once...
02:05:12.000 I mean, we have to think of what are the possibilities of a neural enhancement.
02:05:18.000 If you think about the human mind and how the human mind interacts with the world, how you interact with language and thoughts and facts.
02:05:49.000 When one person can read minds and other people can't, when one person has a completely accurate forecast of all of the trends in terms of stocks and resources and commodities, and they can make choices based on those.
02:06:07.000 I totally see all of that.
02:06:08.000 The only thing I feel a little confused about is, you know, human talking and listening bandwidth, or typing and reading bandwidth, is not very high.
02:06:22.000 But it's high enough where if you can just say, like, tell me everything that's going to happen in the stock market if I want to go make all the money, what should I do right now?
02:06:29.000 And it goes, and then just shows you on the screen.
02:06:32.000 Even without a neural interface, you're kind of a lot of the way to the scenario you're describing.
02:06:37.000 Sure, with stocks.
02:06:39.000 Or with, like, you know, tell me how to, like, invent some new technology that will change the course of history.
02:06:46.000 Yeah.
02:06:48.000 Yeah.
02:06:49.000 All those things.
02:06:50.000 Like I think what somehow matters is access to massive amounts of computing power, especially like differentially massive amounts, maybe more than the interface itself.
02:07:00.000 I think that certainly is going to play a massive factor in the amount of power and influence a human being has, having access to that.
02:07:10.000 My concern is that what neural interfaces are going to do is now you're not a human mind interacting with that data.
02:07:20.000 Now you are some massively advanced version of what a human mind is.
02:07:29.000 And it has...
02:07:32.000 Just profound possibilities that we can't even imagine.
02:07:37.000 We can imagine, but we can't truly conceptualize them because we don't have the context.
02:07:45.000 We don't have that ability and that possibility currently.
02:07:49.000 We can just guess.
02:07:51.000 But when it does get implemented, you're dealing with a completely different being.
02:08:02.000 The only true thing I can say is I don't know.
02:08:04.000 Yeah.
02:08:05.000 Do you wonder why it's you?
02:08:09.000 Do you ever think, like, how am I at the forefront of this spectacular change?
02:08:21.000 Well, first of all, I think it's very much like...
02:08:25.000 You could make the statement from many companies, but none is as true for OpenAI.
02:08:31.000 The CEO is far from the most important person in the company.
02:08:35.000 In our case, there's a large handful of researchers, each of which are individually more critical to the success we've had so far and that we will have in the future than me.
02:08:48.000 And I bet those people really are like, hmm, this is weird to be them.
02:08:54.000 It's certainly weird enough for me that it, like, ups my simulation hypothesis probability somewhat.
02:09:02.000 If you had to give a guess, what...
02:09:08.000 When you think about the possibility of simulation theory, what kind of percentage do you— I've never known how to put any number on it.
02:09:15.000 Like, you know, every argument that I've read written explaining why it's, like, super high probability, that all seems reasonable to me.
02:09:23.000 It feels impossible to reason about, though.
02:09:25.000 What about you?
02:09:27.000 Yeah, same thing.
02:09:28.000 I go, maybe, but it's still what it is, and I have to exist.
02:09:33.000 That's the main thing, is I think it doesn't matter.
02:09:35.000 I think it's like, okay— It definitely matters, I guess, but there's not a way to know currently.
02:09:45.000 What matters, though?
02:09:46.000 Well, if this really is—I mean, our inherent understanding of life is that we are these biological creatures that interact with other biological creatures.
02:09:55.000 We mate and breed and that this creates more of us.
02:10:00.000 And then hopefully as society advances and we acquire more information, more understanding and knowledge, this next version of society will be superior to the version that preceded it, which is just how we look at society today.
02:10:13.000 Nobody wants to live in 1860 where you died of a cold and there's no cure for infections.
02:10:19.000 It's much better to be alive now.
02:10:23.000 Just inarguably.
02:10:25.000 Unless you really do prefer the simple life that you see on Yellowstone or something, it's like what we're dealing with now, first of all, access to information, the lack of ignorance.
02:10:39.000 If you choose to seek out information.
02:10:44.000 You have so much more access to it now than ever before.
02:10:47.000 That's so cool.
02:10:48.000 And over time, like if you go back to the beginning of written history to now, one of the things that is clearly evident is the more access to information, the better choices people can make.
02:11:00.000 They don't always make better choices, but they certainly have much more of a potential to make better choices with more access to information.
02:11:08.000 You know, we think that this is just this biological thing, but imagine if that's not what's going on.
02:11:14.000 Imagine if this is a program and that you are just consciousness that's connected to this thing that's creating this experience that is indistinguishable from what we like to think of as a real biological experience from carbon-based life forms interacting with solid physical things in the real world.
02:11:41.000 It's still unclear to me what I'm supposed to do differently or think differently.
02:11:45.000 Yeah, you're 100% right.
02:11:47.000 What can you do differently?
02:11:48.000 I mean, if you exist as if it's a simulation, if you just live your life as if it's a simulation, is that...
02:11:54.000 I don't know if that's the solution.
02:11:58.000 I mean, it's real to me no matter what.
02:12:02.000 It's real, yeah.
02:12:04.000 I'm going to live it that way.
02:12:05.000 And that will be the problem with an actual simulation if and when it does get implemented.
02:12:10.000 If we do create an actual simulation that's indistinguishable from real life, like what are the rules of the simulation?
02:12:20.000 How does it work?
02:12:21.000 Is that simulation fair and equitable and much more reasonable and peaceful?
02:12:26.000 Is there no war in that simulation?
02:12:28.000 Should we all agree to Hook up to it because we will have a completely different experience in life.
02:12:35.000 And all the angst of crime and violence and the things we truly are terrified of, they will be nonexistent in this simulation.
02:12:45.000 Yeah.
02:12:46.000 I mean, if we keep going, it seems like if you just extrapolate from where VR is now.
02:12:53.000 Did you see the podcast that Lex Friedman did with Mark Zuckerberg?
02:12:57.000 I saw some clips, but I haven't got to watch all of them.
02:13:00.000 Bizarre, right?
02:13:01.000 So they're essentially using very realistic physical avatars in the metaverse.
02:13:07.000 Like, that's step one.
02:13:11.000 That's Pong.
02:13:11.000 Maybe that's step three.
02:13:12.000 Maybe it's a little bit past Pong at that point.
02:13:14.000 Yeah, maybe it's Atari.
02:13:15.000 Maybe you're playing Space Invaders now.
02:13:18.000 But whatever it is, it's on the path to this thing that will be indistinguishable.
02:13:23.000 That seems inevitable.
02:13:25.000 Those two things seem inevitable to me.
02:13:27.000 The inevitable thing to me is that we will create a lifeform that is an artificial, intelligent lifeform that's far more advanced than us, and once it becomes sentient, it will be able to create a far better version of itself.
02:13:42.000 And then as it has better versions of itself, it will keep going, and if it keeps going, it will reach God-like capabilities.
02:13:53.000 The complete understanding of every aspect of the universe and the structure of it itself.
02:14:01.000 How to manipulate it, how to travel through it, how to communicate.
02:14:07.000 And that, you know, if we keep going, if we survive a hundred years, a thousand years, ten thousand years, and we're still on this same exponentially increasing technological capability path, that's God.
02:14:21.000 We become something like a God.
02:14:25.000 And that might be what we do.
02:14:27.000 That might be what intelligent, curious, innovative life actually does.
02:14:32.000 It creates something that creates the very universe that we live in.
02:14:38.000 Like, creates the next simulation and then...
02:14:40.000 Yeah, maybe that's the birth of the universe itself, is creativity and intelligence, and that it all comes from that.
02:14:47.000 I used to have this joke about the Big Bang.
02:14:50.000 Like, what if the Big Bang is just a natural thing?
02:14:54.000 Like, humans get so advanced that they create a Big Bang machine, and then, you know, we're so autistic and riddled with Adderall that we'd have no concept or worry of the consequences, and someone's like, I'll fucking press it.
02:15:07.000 And they press it and boom!
02:15:10.000 We start from scratch every 14 billion years.
02:15:13.000 And that's what a Big Bang is.
02:15:17.000 I mean, I don't know where it goes, but I do know that if you looked at the human race from afar, if you were an alien life-form completely detached from any understanding of our culture,
02:15:30.000 any understanding of our biological imperatives,
02:15:34.000 And you just looked at, like, what is this one dominant species doing on this planet?
02:15:39.000 It makes better things.
02:15:41.000 That's what it does.
02:15:42.000 That I agree.
02:15:45.000 It steals.
02:15:46.000 It does a bunch of things that it shouldn't do.
02:15:48.000 It pollutes.
02:15:49.000 It does all these things that are terrible.
02:15:51.000 But it also consistently and constantly creates better things, whether it's better weapons going from the catapult to the rifle to the cannonballs to the rocket ships to the hypersonic missiles to nuclear bombs.
02:16:05.000 It creates better and better and better things.
02:16:09.000 That's the number one thing it does.
02:16:10.000 And it's never happy with what it has.
02:16:13.000 And you add that to consumerism, which is baked into us, and this desire, this constant desire for newer, better things.
02:16:23.000 Well, that fuels that innovation because that gives it the resources that it needs to consistently innovate and constantly create newer and better things.
02:16:30.000 Well, if I was an alien life form, I was like, oh, what is it doing?
02:16:34.000 It's trying to create better things.
02:16:35.000 Well, what is the forefront of it?
02:16:37.000 Technology.
02:16:38.000 Technology is the most transformative, the most spectacular, the most interesting thing that we create.
02:16:44.000 And the most alien thing.
02:16:46.000 The fact that we just are so comfortable that you can FaceTime with someone in New Zealand, like instantly.
02:16:52.000 We can get used to anything pretty quickly.
02:16:55.000 Anything.
02:16:55.000 And take it for granted almost.
02:16:56.000 Yeah.
02:16:57.000 And well, if you were an alien life form and you were looking at us, you're like, well, what is it doing?
02:17:02.000 Oh, it keeps making better things.
02:17:04.000 It's going to keep making better things.
02:17:05.000 Well, if it keeps making better things, it's going to make a better version of a thinking thing.
02:17:10.000 And it's doing that right now.
02:17:12.000 And you're a part of that.
02:17:13.000 It's going to make a better version of a thinking thing.
02:17:15.000 Well, that better version of a thinking thing, it's basically now in the amoeba stage.
02:17:19.000 It's in the, you know, small multicellular life form stage.
02:17:23.000 Well, what if that version becomes a fucking Oppenheimer?
02:17:28.000 What if that version becomes, like, if it scales up so far that it becomes so hyper-intelligent that it is completely alien to any other intelligent life form that has ever existed here before?
02:17:41.000 And it constantly does the same thing, makes better and better versions of it.
02:17:45.000 Well, where does that go?
02:17:47.000 It goes to a god.
02:17:48.000 It goes to something like a God, and maybe God is a real thing, but maybe it's a real consequence of this process that human beings have of consistently, constantly innovating and constantly having this desire to push this envelope of creativity and of technological power.
02:18:11.000 I guess it comes down to maybe a definitional disagreement about what you mean by it becomes a god.
02:18:17.000 Like, I can totally...
02:18:20.000 I think it becomes something much, like, unbelievably much smarter and more capable than we are.
02:18:26.000 And what does that thing become if that keeps going?
02:18:30.000 And maybe the way you mean it as a god-like force is that that thing can then go create, can go simulate, a universe.
02:18:36.000 Yes.
02:18:37.000 Okay.
02:18:38.000 That I can resonate with.
02:18:40.000 Yeah.
02:18:40.000 I think whatever we create will still be subject to the laws of physics in this universe.
02:18:44.000 Right.
02:18:45.000 Yeah, maybe that is the overlying fabric that God exists in.
02:18:49.000 The God word is a fucked up word because it's just been so co-opted.
02:18:53.000 But, you know, I was having this conversation with Stephen Meyer, who is, he's a physicist.
02:18:59.000 I believe he's a physicist.
02:19:01.000 And he's also religious.
02:19:04.000 It was a real weird conversation.
02:19:06.000 Very fascinating conversation.
02:19:07.000 What kind of religion?
02:19:08.000 A believer in Christ.
02:19:09.000 Yeah, he even believes in the resurrection, which I found very interesting.
02:19:14.000 But, you know, it's interesting communicating with him because he has these little pre...
02:19:22.000 designed speeches. He's encountered all these questions so many times that he has these very well-worded, very articulate responses to these things that I sense are like bits. You know, like when I'm talking to a comic, and a comic's like, I got this bit on train travel, and they just tell you the bit. That's what it's like. He has bits on why he believes in Jesus and why he believes. Very, very intelligent guy,
02:19:48.000 but I proposed the question: when we're thinking about God, what if, instead of God creating the universe, the universe is God?
02:19:56.000 And the creative force of all life and everything is the universe itself.
02:20:02.000 Instead of thinking that there's this thing that created us.
02:20:06.000 This is like close to a lot of the Eastern religions.
02:20:08.000 I think this is an easier thing to wrap my mind around than any other religions for me.
02:20:14.000 When I do psychedelics, I get that feeling.
02:20:17.000 I get that feeling like there's this insane soup of innovation and connectivity that exists all around us.
02:20:27.000 But our minds are so primal.
02:20:30.000 We're this fucking thing.
02:20:32.000 This is what we used to be.
02:20:34.000 What is that?
02:20:36.000 There's a guy named Shane Against the Machine who's this artist.
02:20:40.000 Who created this?
02:20:41.000 It's a chimpanzee skull that he made out of Zildjian cymbals.
02:20:44.000 He left that on the back and he just made this dope art piece.
02:20:50.000 Cool.
02:20:50.000 It's just cool.
02:20:52.000 But I wonder if our limitations are that we are an advanced version of primates.
02:21:00.000 We still have all these things that we talk about, jealousy, envy, anxiety, lust, anger, fear, violence, all these things that are detrimental but were important for us to survive and get to this point.
02:21:13.000 And that as time goes on, we will figure out a way to engineer those out.
02:21:18.000 And that as intelligent life becomes more intelligent and we create a version of intelligent life that's far more intelligent than what we are.
02:21:28.000 Far more capable than what we are.
02:21:30.000 If that keeps going, if it just keeps going, I mean, ChatGPT.
02:21:35.000 Imagine if you took ChatGPT and went back to Socrates.
02:21:39.000 And show him that.
02:21:40.000 Explain to him that.
02:21:41.000 And show him a phone and put it on a phone and have access to it.
02:21:44.000 He'd be like, what have you done?
02:21:46.000 Like, what is this?
02:21:47.000 I bet he'd be much more impressed with the phone than ChatGPT.
02:21:51.000 I think he'd be impressed with the phone's ability to communicate, for sure, but then the access to information would be so profound.
02:21:59.000 I mean, back then, I mean, look, you're dealing with a time when Galileo was put under house arrest because he had the gumption to say that the Earth is not the center of the universe.
02:22:09.000 Well, now we fucking know it's not.
02:22:12.000 Like, we have satellites.
02:22:13.000 We send literal cameras into orbit to take photos of things.
02:22:18.000 No, I totally get that.
02:22:20.000 I just meant that we kind of know what it's like to talk to a smart person.
02:22:24.000 And so in that sense, you're like, oh, all right.
02:22:26.000 I didn't think you could talk to a not person and have them be person-like in some responses some of the time.
02:22:31.000 But a phone, man, if you just woke up after 2,000 years and there was a phone, you have no model for that.
02:22:38.000 You didn't get to get there gradually.
02:22:39.000 Yeah.
02:22:40.000 No, you didn't.
02:22:42.000 My friend Eddie Griffin has a joke about that.
02:22:44.000 It was about how Alexander Graham Bell had to be doing coke.
02:22:48.000 He goes, because only someone on coke would be like, I want to talk to someone who's not even here.
02:22:53.000 Yeah.
02:22:57.000 That's what a phone is.
02:22:58.000 Is that something coke makes people want to do?
02:22:59.000 I don't know.
02:23:00.000 I've never done coke, but I would imagine it is.
02:23:03.000 Seems to me like it just makes people angry and chaotic.
02:23:06.000 Yeah, a little of that, but they also have ideas.
02:23:12.000 Back to this, where does it go?
02:23:16.000 If it keeps going, it has to go to some impossible level of capability.
02:23:16.000 I mean, just think of what we're able to do now with nuclear power and nuclear bombs and hypersonic missiles, and just the insane physical things that we've been able to take out of the human creativity and imagination and, through engineering and technology, implement as physical devices that would be indistinguishable from magic if you brought them back 500 years.
02:23:52.000 Yeah.
02:23:52.000 I think it's quite remarkable.
02:23:55.000 So keep going.
02:23:56.000 Keep going 100,000 years from now.
02:23:58.000 If we're still here, if something like us is still here, what can it do?
02:24:04.000 In the same way that I don't think Socrates would have predicted the phone.
02:24:07.000 I can't predict that.
02:24:08.000 No, I'm probably totally off.
02:24:10.000 But maybe that's also why comets exist.
02:24:13.000 Maybe it's a nice reset to just, like, leave a few around, give them a distant memory of the utopian world that used to exist, have them go through thousands of years of barbarism, of horrific behavior, and then reestablish society.
02:24:29.000 I mean, this is the Younger Dryas Impact Theory that around 11,800 years ago at the end of the Ice Age, that we were hit by multiple comets that caused the instantaneous melting of the ice caps over North America.
02:24:43.000 Flooded everything?
02:24:44.000 Flooded everything. It's the source of the flood myths from the Epic of Gilgamesh and the Bible and all those things.
02:24:51.000 And also there's physical evidence of it when they do core samples.
02:24:55.000 There's high levels of iridium, which is very common in space, very rare on Earth.
02:25:00.000 There's micro diamonds that are from impacts and it's like 30% of the Earth has evidence of this.
02:25:06.000 And so it's very likely that these people that are proponents of this theory are correct, and that this is why they find these ancient structures that they're now dating to like 11,000, 12,000 years ago, when they thought people were hunter-gatherers.
02:25:19.000 And they go, okay, maybe our timeline is really off and maybe this physical evidence of impacts.
02:25:24.000 Yeah, I've been watching that with interest.
02:25:25.000 Yeah.
02:25:26.000 Randall Carlson is the greatest guy to pay attention to when it comes to that.
02:25:30.000 He's kind of dedicated his whole life to it.
02:25:32.000 Which, by the way, happened because of a psychedelic experience.
02:25:35.000 He was on acid once and he was looking at this immense canyon, and he had this vision that it was created by the instantaneous melting of the polar ice caps, and that it just washed this wave of impossible water through the earth.
02:25:52.000 It just carved these paths, and now there seems to be actual physical evidence of that.
02:25:58.000 That is probably what took place.
02:26:01.000 And that, you know, we're just the survivors.
02:26:04.000 And that we have re-emerged.
02:26:06.000 And that society and human civilization occasionally gets set back to a primal place.
02:26:13.000 Yeah.
02:26:14.000 You know, who knows?
02:26:15.000 If you're right that what happens here is we kind of edit out all of the impulses in ourselves that we don't like.
02:26:22.000 And we get to that world, that seems kind of boring.
02:26:24.000 So maybe that's when we have to make a new simulation to watch people think they're going through some drama or something.
02:26:29.000 Or maybe it's just...
02:26:31.000 We get to this point where we have this power, but the haves and the have-nots, the divide is too great.
02:26:37.000 And that people did get a hold of this technology and use it to oppress people who didn't have it.
02:26:43.000 And that we didn't mitigate the human biological problems, the reward systems that we have.
02:26:51.000 That I gotta have more and you gotta have less.
02:26:52.000 Yeah.
02:26:53.000 This sort of natural inclination that we have for competition.
02:26:58.000 And that someone hijacks that.
02:27:00.000 I think this is going to be such a hugely important issue to get ahead of before the first people push that button.
02:27:06.000 What do you think about, like, when Elon was calling for a pause on AI? He was, like, starting an AGI company while he was doing that.
02:27:17.000 Yeah.
02:27:18.000 Didn't he start it, like, after he was calling for the pause?
02:27:22.000 I think before, but I don't remember.
02:27:24.000 In any case...
02:27:24.000 Is it one of those you-can't-beat-em-join-em things?
02:27:26.000 Um...
02:27:30.000 I think the instinct of saying, like, we've really got to figure out how to make this safe and good and, like, widely good is really important.
02:27:42.000 But I think calling for a pause is, like, naive at best.
02:27:53.000 I kind of, like...
02:27:57.000 I kind of think you can't make progress on the safety part of this, as we mentioned earlier, by sitting in a room and thinking hard.
02:28:03.000 You've got to see where the technology goes.
02:28:05.000 You've got to have contact with reality.
02:28:09.000 But we're trying to make progress towards AGI, conditioned on it being safe and conditioned on it being beneficial.
02:28:15.000 And so when we hit any kind of, like, block, we try to find a technical or a policy or a social solution to overcome it that could be about the limits of the technology and something not working and, you know, hallucinates or it's not getting smart or whatever.
02:28:29.000 Or it could be there's this, like, safety issue and we've got to, like, redirect our resources to solve that.
02:28:34.000 But it's all, like, for me, it's all this same thing of, like, we're trying to solve the problems that emerge at each step as we get where we're trying to go.
02:28:44.000 And, you know, maybe you can call it a pause if you want, if you pause on capabilities to work on safety.
02:28:49.000 But in practice, I think the field has gotten a little bit wrapped around the axle there.
02:28:55.000 And safety and capabilities are not these two separate things.
02:28:59.000 This is like, I think, one of the dirty secrets of the field.
02:29:02.000 It's like we have this one way to make progress.
02:29:04.000 You know, we can understand and push on deep learning more.
02:29:09.000 That can be used in different ways, but I think it's that same technique that's going to help us eventually solve the safety.
02:29:18.000 All of that said, as like a human, emotionally speaking, I super understand why it's tempting to call for a pause.
02:29:27.000 Happens all the time in life, right?
02:29:28.000 This is moving too fast.
02:29:30.000 We got to take a pause here.
02:29:32.000 Yeah.
02:29:33.000 How much of a concern is it in terms of national security that we are the ones that come up with this first?
02:29:43.000 Well, I would say that if an adversary of ours comes up with it first and uses it against us and we don't have some level of capability, that feels really bad.
02:29:55.000 But I hope that what happens is this can be a moment where...
02:30:03.000 To tie it back to the other conversation, we kind of come together and overcome our base impulses and say, like, let's all do this as a club together.
02:30:09.000 That would be better.
02:30:11.000 That would be nice.
02:30:12.000 And maybe through AGI and through the implementation of this technology, it will make translation instantaneous and easy.
02:30:23.000 Well, that's already happened.
02:30:24.000 Right.
02:30:25.000 But, I mean, it hasn't happened in real time to the point where you can accurately communicate...
02:30:31.000 Very soon.
02:30:32.000 Very soon.
02:30:33.000 Very soon.
02:30:34.000 Yeah.
02:30:36.000 I do think, for what it's worth, that the world is going to come together here.
02:30:44.000 I don't think people have quite realized the stakes.
02:30:46.000 But this is, like...
02:30:47.000 I don't think this is a geopolitical...
02:30:49.000 If this comes down to, like, a geopolitical fight or race, I don't think there's any winners.
02:30:55.000 And so I'm optimistic about people coming together.
02:30:59.000 Yeah, I am too.
02:31:01.000 I mean, I think most people would like that.
02:31:05.000 If you asked the vast majority of the human beings that are alive, wouldn't it be better if everybody got along?
02:31:13.000 You know, maybe you can't
02:31:17.000 go all the way there and say we're just going to have one global effort.
02:31:22.000 But I think at least we can get to a point where we have one global set of rules, safety standards, organization that makes sure everyone's following the rules.
02:31:31.000 We did this for atomic weapons.
02:31:32.000 There's been similar things in the world of biology.
02:31:35.000 I think we'll get there.
02:31:37.000 That's a good example.
02:31:39.000 Nuclear weapons.
02:31:40.000 Because we know the destructive capability of them.
02:31:45.000 And because of that, we haven't detonated one since 1947. Pretty incredible.
02:31:51.000 Pretty incredible.
02:31:52.000 Other than tests.
02:31:53.000 We haven't used one in terms of war.
02:31:57.000 45 or 47. When was the end of World War II? Wasn't it 47?
02:32:02.000 When they dropped the bombs?
02:32:04.000 I think that was 45. I was wondering if there was more after that.
02:32:07.000 I didn't know about it.
02:32:08.000 No.
02:32:08.000 It might be 45. I think it was, yeah.
02:32:10.000 45. So from 1945, which is pretty extraordinary.
02:32:14.000 It's remarkable.
02:32:15.000 I would not have predicted that, I think, if I could teleport back to 45. No.
02:32:19.000 I would have thought, oh my god, this is just going to be something that people do.
02:32:23.000 Just launch bombs on cities.
02:32:25.000 Yeah.
02:32:26.000 I mean, I would have said, like, we're not going to survive this for very long.
02:32:30.000 And there was a real fear of that.
02:32:31.000 For sure.
02:32:32.000 It's pretty extraordinary that they've managed to stop that, this threat of mutually assured destruction, self-destruction, destruction of the universe.
02:32:40.000 I mean, the whole world.
02:32:41.000 We have enough weapons to literally make the world uninhabitable.
02:32:45.000 Totally.
02:32:46.000 And because of that, we haven't done it.
02:32:49.000 Which is a good sign.
02:32:51.000 I think that should give some hope.
02:32:53.000 Yeah, it should.
02:32:54.000 I mean, Steven Pinker gets a lot of shit for his work because he just sort of downplays violence today.
02:33:02.000 But it's not that he's downplaying violence today.
02:33:04.000 He's just looking at statistical trends.
02:33:06.000 If you're looking at the reality of life today versus life 100 years ago, 200 years ago, it's far safer. Why do you think that's a controversial thing?
02:33:16.000 Like, why can't someone say, sure, we still have problems, but it's getting better?
02:33:19.000 Because people don't want to say that.
02:33:21.000 Especially people who are activists.
02:33:23.000 They're completely engrossed in this idea that there's problems today, and these problems are huge, and there's Nazis, and there's— But no one's saying there's not huge problems today.
02:33:34.000 Right.
02:33:34.000 No one's saying there's not.
02:33:35.000 But just to say things are better today.
02:33:37.000 Yeah, I guess that— To some people, they just don't want to hear that.
02:33:39.000 Right.
02:33:39.000 But those are also people that are addicted to the problems.
02:33:42.000 The problems become their whole life.
02:33:44.000 Solving those problems become their identity.
02:33:46.000 Being involved in the solutions or what they believe are solutions to those problems become their life's work.
02:33:52.000 And someone comes along and says, actually, life is safer than it's ever been before.
02:33:56.000 Interactions with police are safer.
02:33:58.000 I can see that that's deeply invalidating.
02:33:59.000 Yeah.
02:34:00.000 But also true.
02:34:03.000 And again, what is the problem?
02:34:05.000 Why can't people recognize that?
02:34:07.000 Well, it's the primate brain.
02:34:09.000 I mean, it's all the problems that we highlighted earlier.
02:34:12.000 And that the solution to overcoming that might be through technology.
02:34:18.000 And that might be the only way we can do it without a long period of evolution.
02:34:23.000 Because biological evolution is so relatively slow in comparison to technological evolution.
02:34:30.000 And that that might be our bottleneck.
02:34:33.000 We just still are dealing with this primate body.
02:34:37.000 And that something like artificial general intelligence, or some implemented form of engaging with it, whether it's a neural link or something, shifts the way the mind interfaces with other minds.
02:34:55.000 Isn't it wild that, speaking of biological evolution, there will be people, I think, who were alive for the invention or discovery, whatever you want to call it, of the transistor,
02:35:02.000 who will also be alive for the creation of AGI? One human lifetime.
02:35:09.000 Yeah.
02:35:10.000 You want to know a wild one?
02:35:11.000 From Orville and Wilbur Wright flying the plane, it was less than 50 years before someone dropped an atomic bomb out of one.
02:35:21.000 That's wild.
02:35:22.000 That's fucking crazy.
02:35:24.000 That's crazy.
02:35:25.000 Less than 40, right?
02:35:26.000 That's crazy.
02:35:27.000 Yeah.
02:35:28.000 Bananas.
02:35:33.000 60-something years to land on the moon.
02:35:35.000 Nuts.
02:35:36.000 Nuts.
02:35:37.000 Where, you know, where is it going?
02:35:40.000 I mean, it's just guesswork.
02:35:42.000 But it's interesting.
02:35:43.000 For sure.
02:35:44.000 I mean, it's the most fascinating thing of our time, for sure.
02:35:46.000 It's fascinating intellectually, and I also think it is one of these things that will be tremendously net beneficial.
02:35:55.000 Yeah.
02:35:56.000 We've been talking a lot about problems in the world, and I think that's just always a nice reminder of how much we get to improve, and we're going to get to improve a lot, and I think this will be the most powerful tool
02:36:10.000 we have yet created to help us go do that.
02:36:13.000 I think you're right.
02:36:15.000 And this was an awesome conversation.
02:36:17.000 Thanks for having me.
02:36:18.000 It's really, really fun.
02:36:18.000 Thank you for being here.
02:36:19.000 I really appreciate it.
02:36:20.000 And thanks for everything.
02:36:21.000 Keep us posted.
02:36:22.000 Will do.
02:36:22.000 And if you create how, let us know.
02:36:26.000 All right.
02:36:27.000 Thank you.
02:36:27.000 Thank you.
02:36:28.000 Bye, everybody.