In this episode of the Joe Rogan Experience podcast, I sit down with AI pioneer Sam Altman, cofounder and CEO of OpenAI, to talk about the impact of artificial intelligence on the way we live and its potential to change our world in the coming decades. If you're curious about the future of AI, what it means for us, and how the world is going to change over the next 50 years, I think you're going to get a lot out of this conversation with someone who has been working in this field for a long time. I hope you enjoy this episode, and if you do, please share it with a friend or colleague who's interested in learning more about AI. Tweet me and let me know what you think!
Timestamps:
4:00 - What are your thoughts on AI?
6:30 - How is AI changing the world?
8:20 - What will AI's role be in the future?
9:40 - What is AI's impact on the world?
10: What are the downsides of AI?
11:15 - How will AI replace our jobs?
12:00 - How will AI change our way of life?
13:30 - Should AI replace human labor?
16:10 - What's the best way to prepare for AI?
17: What is the role of AI in our lives?
19:40 - Can AI be a good thing?
21:40 - What are we waiting for?
22:00 - Is AI a problem?
23:00 - Is AI going to replace human creativity?
26:00 - Can AI replace us?
27:00 - Are we ready for AI in 50 years?
30:00 - Will AI be better?
35:00 - Does AI have a place in the world?
36:00 - Do we need to be more intelligent?
00:00:21.000No, I mean, what have you done with AI? I mean, it's one of the things about this is, I mean, I think everyone is fascinated by it.
00:00:32.000I mean, everyone is absolutely blown away at the current capability and wondering what the potential for the future is and whether or not that's a good thing.
00:00:45.000I think it's going to be a great thing, but I think it's not going to be all a great thing.
00:01:10.000And even if it's like net wonderful, you know, there's things we're going to lose along the way.
00:01:15.000Some kinds of jobs, some parts of our way of life, some parts of the way we live are going to change or go away.
00:01:20.000And no matter how tremendous the upside there is, and I believe it will be tremendously good,
00:01:26.000There's a lot of stuff we've got to navigate through to make sure.
00:01:30.000That's a complicated thing for anyone to wrap their heads around, and there's deep and super understandable emotions around that.
00:01:39.000That's a very honest answer, that it's not all going to be good.
00:01:43.000But it seems inevitable at this point.
00:01:48.000Yeah, I mean, it's definitely inevitable.
00:01:50.000My view of the world, you know, when you're like a kid in school, you learn about this technological revolution and then that one and then that one.
00:01:57.000And my view of the world now sort of looking backwards and forwards is that this is like one long technological revolution.
00:02:06.000Sure, like first we had to figure out agriculture so that we had the resources and time to figure out how to build machines. Then we got this industrial revolution, and that made us learn about a lot of stuff, a lot of other scientific discovery too, and that let us do the computer revolution, and that's now letting us, as we scale up to these massive systems,
00:02:28.000You know, although we do have these things to navigate and there will be these downsides, if you think about what it means for the world and for people's quality of life,
00:02:46.000if we can get to a world where the cost of intelligence, and the abundance that comes with that, dramatically falls.
00:03:00.000I think we'll do the same thing with energy.
00:03:01.000And I think those are the two sort of key inputs to everything else we want.
00:03:05.000So if we can have abundant and cheap energy and intelligence, that will transform people's lives largely for the better.
00:03:11.000And I think it's going to, in the same way that if we could go back now 500 years and look at someone's life, we'd say, well, there's some great things, but they didn't have this.
00:03:20.000Can you believe they didn't have modern medicine?
00:03:23.000That's how people are going to look back at us in 50 years.
00:03:26.000When you think about the people that currently rely on jobs that AI will replace, when you think about whether it's truck drivers or automation workers, people that work in factory assembly lines, what,
00:03:41.000if anything, what strategies can be put in place to mitigate the downsides of those jobs being eliminated by AI? I'll talk about some general thoughts, but I find making very specific predictions difficult because the way the technology goes has been so different than even my own intuitions,
00:04:15.000If you had asked me 10 years ago, I would have said, first, AI is going to come for blue collar labor, basically.
00:04:22.000It's going to drive trucks and do factory work and, you know, it'll handle heavy machinery.
00:04:28.000Then maybe after that it'll do some kinds of cognitive labor, but it won't be off doing what I think of personally as the really hard stuff.
00:04:38.000It won't be off proving new mathematical theorems.
00:04:41.000It won't be off discovering new science.
00:04:47.000And then eventually, maybe, but maybe last of all, maybe never, because human creativity is this magic special thing, last of all it'll come for the creative jobs.
00:05:38.000If we do have something that completely eliminates factory workers, completely eliminates truck drivers, delivery drivers, things along those lines, that creates this massive vacuum in our society.
00:06:23.000I'm not a believer at all that there won't be lots of new jobs.
00:06:27.000I think human creativity, desire for status, wanting different ways to compete, invent new things, feel part of a community, feel valued, that's not going to go anywhere.
00:06:38.000People have worried about that forever.
00:06:40.000What happens is we get better tools and we just invent new things and more amazing things to do.
00:06:48.000And I think I mean that like literally in that there's like space is really big, but also there's just so much stuff we can all do if we do get to this world of abundant intelligence where you can sort of just think of a new idea and it gets created.
00:07:06.000But again, to the point we started with, that doesn't provide great solace to people who are losing their jobs today.
00:07:15.000So saying there's going to be this great stuff in the indefinite future.
00:07:19.000People are like, what are we doing today?
00:07:22.000I think we will, as a society, do things like UBI and other ways of redistribution, but I don't think that gets at the core of what people want.
00:07:31.000I think what people want is agency, self-determination, the ability to play a role in architecting the future along with the rest of society, the ability to express themselves and create something meaningful to them.
00:07:46.000And also, I think a lot of people work jobs they hate, and I think we as a society are always a little bit confused about whether we want to work more or work less.
00:07:57.000But somehow, we all get to do something meaningful and we all get to play our role in driving the future forward.
00:08:10.000And what I hope is as those long-haul truck driving jobs go away, which people have been wrong about predicting how fast that's going to happen, but it's going to happen, we figure out not just a way to solve the economic problem by,
00:08:27.000like, giving people the equivalent of money every month, but that there's a way that—and we have a lot of ideas about this—there's a way that we, like, share ownership and decision-making over the future.
00:08:41.000Something I say a lot about AGI is that everyone realizes we're going to have to share the benefits of that, but we also have to share the decision-making over it and access to the system itself.
00:08:54.000I'd be more excited about a world where we say, rather than give everybody on Earth one eight-billionth of the AGI money, which we should do that too, we say, you get a one eight-billionth slice of the system.
00:09:42.000Have something that's completely unbiased, absolutely rational, has the accumulated knowledge of the entire human history at its disposal, including all knowledge of psychology and psychological study, including UBI,
00:09:58.000because that comes with a host of, you know, pitfalls and issues that people have with it.
00:10:06.000I think we're still very far away from a system that is capable enough and reliable enough that any of us would want that.
00:10:14.000But I'll tell you something I love about that.
00:10:17.000Someday, let's say that thing gets built.
00:10:19.000The fact that it can go around and talk to every person on Earth, understand their exact preferences at a very deep level, how they think about this issue and that one, and how they balance the trade-offs and what they want, and then understand all of that and collectively optimize for the collective preferences of humanity or of citizens of the US,
00:11:06.000Where you're gonna get completely objective, the absolute most intelligent decision for virtually every problem, every dilemma that we face currently in society.
00:11:19.000Would you truly be comfortable handing over, like, final decision-making and say, all right, AI, you got it from here?
00:11:24.000No, no, but I'm not comfortable doing that with anybody, you know?
00:11:28.000I mean, I was uncomfortable with the Patriot Act.
00:11:32.000I'm uncomfortable with many decisions that are being made.
00:11:36.000It's just there's so much obvious evidence that decisions that are being made are not being made in the best interest of the overall well-being of the people.
00:11:44.000They're being made in the interests of whatever gigantic corporations have donated, and the military-industrial complex and the pharmaceutical-industrial complex, and just the money.
00:11:59.000That's really what we know today, that money has a massive influence on our society and the choices that get made.
00:12:06.000And the overall good or bad for the population.
00:12:09.000I have no disagreement at all that the current system is super broken, not working for people, super corrupt, and for sure unbelievably run by money.
00:12:38.000But when you think of AGI, when you think of the possible future, like where it goes to, do you ever extrapolate?
00:12:48.000Do you ever like sit and pause and say, well, if this becomes sentient and it has the ability to make better versions of itself, how long before we're literally dealing with a god?
00:13:01.000So, the way that I think about this is, it used to be that AGI was this very binary moment.
00:13:10.000And I think I was totally wrong about that.
00:13:12.000And the right way to think about it is this continuum of intelligence, this smooth exponential curve.
00:13:20.000Back all the way to that sort of smooth curve of technological revolution.
00:13:24.000The amount of compute power we can put into the system, the scientific ideas about how to make it more efficient and smarter, to give it the ability to do reasoning, to think about how to improve itself.
00:13:40.000But my model for a long time, and I think if you look at the world of AGI thinkers, particularly around the safety issues you're talking about, is that there are two axes that matter.
00:13:52.000There's what's called short timelines or long timelines to the first milestone of AGI, whatever that's going to be.
00:14:08.000Once we get there, from there to that point you were talking about where it's capable of the rapid self-improvement, is that a slower, a faster process?
00:14:17.000The world that I think we're heading, that we're in, and also the world that I think is the most controllable and the safest, is the short timelines and slow takeoff quadrant.
00:14:30.000And I think we're going to have, you know, there were a lot of very smart people for a while who were like, the thing you were just talking about happens in a day or three days.
00:14:39.000And that doesn't seem likely to me given the shape of the technology as we understand it now.
00:14:45.000Now, even if that happens in a decade or three decades, it's still like the blink of an eye from a historical perspective.
00:14:53.000And there are going to be some real challenges to getting that right.
00:14:58.000And the decisions we make, the sort of safety systems and the checks that the world puts in place, how we think about global regulation or rules of the road from a safety perspective for those projects.
00:15:14.000It's super important because you can imagine many things going horribly wrong.
00:15:21.000I feel cheerful about the progress the world is making towards taking this seriously and, you know, it reminds me of what I've read about the conversations that the world had right around the development of nuclear weapons.
00:15:37.000At least in terms of public consciousness, this has emerged very rapidly, where I don't think anyone was really aware.
00:15:44.000People were aware of the concept of artificial intelligence, but they didn't think that it was going to be implemented so comprehensively, so quickly.
00:16:08.000Each step, each half step like that, you kind of... humans have this ability to get used to any new technology so quickly.
00:16:18.000The thing that I think was unusual about the launch of ChatGPT with 3.5 and then 4 was that people hadn't really been paying attention.
00:16:26.000And that's part of the reason we deploy.
00:16:28.000We think it's very important that people and institutions have time to gradually understand this, react, co-design the society that we want with it.
00:16:38.000And if you just build AGI in secret in a lab and then drop it on the world all at once, I think that's a really bad idea.
00:16:43.000So we had been trying to talk to the world about this for a while.
00:16:49.000If you don't give people something they can feel and use in their lives, they don't quite take it seriously.
00:17:29.000And now, you know, every year you get a new iPhone.
00:17:32.000Over the 15 years or whatever since the launch, they've gotten dramatically better.
00:17:36.000But iPhone to iPhone, you're like, yeah, okay, it's a little better.
00:17:38.000But now if you go hold up the first iPhone to the 15 or whatever, that's a big difference.
00:17:43.000GPT-3.5 to AGI, that'll be a big difference.
00:17:46.000But along the way, it'll just get incrementally better.
00:17:49.000Do you think about the convergence of things like Neuralink and there's a few competing technologies where they're trying to implement some sort of a connection between the human biological system and technology?
00:18:09.000Do you want one of those things in your head?
00:18:58.000How do you make sure the people that do it first actually help lift everybody up together?
00:19:02.000How do you make sure people who want to just live their very human life get to do that?
00:19:07.000That stuff is really hard and honestly...
00:19:10.000So far off from my problems of the day that I don't get to think about that as much as I'd like to, because I do think it's super interesting.
00:19:20.000But yeah, it seems like if we just think logically, that's going to be a huge challenge at some point, and people are going to want...
00:19:33.000Wildly divergent things, but there is a societal question about how we're going to handle, like, the questions of fairness that come up there and what it means for the people who don't do it.
00:19:49.000Anyway, on the neural interface side, I'm, in the short term, like, before we figure out how to upload someone's consciousness into a computer, if that's even possible at all, which I think there's plenty of sides you could take on why it's not.
00:20:06.000The thing that I find myself most interested in is what we can do without drilling a hole in someone's head.
00:20:15.000How much of the inner monologue can we read out with an externally mounted device?
00:20:19.000And if we have an imperfect, low bandwidth, low accuracy neural interface, can people still just learn how to use it really well in a way that's quite powerful for what they can now do with a new computing platform?
00:20:34.000And my guess is we'll figure that out.
00:20:55.000That's the Pong of the eventual immersive 3D video games.
00:20:59.000Like you're going to get these first steps and they're going to seem sort of crude and slow.
00:21:05.000I mean it's essentially slower than just asking Siri.
00:21:08.000I think if someone built a system where you could think words, it doesn't have to be a question, it could just be your passive rambling inner monologue, but it certainly could be a question.
00:21:20.000And that was being fed into GPT-5 or 6. And in your field of vision, the words in response were being displayed.
00:22:28.000My real concern is that once we take the step to use an actual neural interface, when there's an actual operation, and they're using some sort of an implant, and then that implant becomes more sophisticated,
00:22:45.000it's not the iPhone 1, now it's the iPhone 15, and as these things get better and better, we're on the road to cyborgs.
00:22:54.000We're on the road to, like, why would you want to be a biological person?
00:22:59.000Do you really want to live in a fucking log cabin when you can be in the Matrix?
00:23:10.000We're already a little bit down that path, right?
00:23:13.000Like if you take away someone's phone and they have to go function in the world today, they're at a disadvantage relative to everybody else.
00:23:20.000So maybe that's like the lightest weight version of a merge we could imagine, but I think it's worth like, if we go back to that earlier thing about the one exponential curve, I think it's worth saying we've like lifted off the x-axis already down this path, the tiniest bit.
00:23:38.000Even if you don't go all the way to, like, a neural interface, VR will get so good that some people just don't want to take it off that much.
00:23:49.000That's fine for them, as long as we can solve this question of how do we think about what a balance of power means in the world.
00:23:57.000I think there will be many people, I'm certainly one of them, who's like, actually the human body and the human experience is pretty great.
00:24:04.000That log cabin in the woods, pretty awesome.
00:24:06.000I don't want to be there all the time.
00:24:07.000I'd love to go play the great video game, but I'm really happy to get to go there sometimes.
00:24:14.000Yeah, there's still human experiences that are just, like, great human experiences.
00:24:20.000Just laughing with friends, you know, kissing someone that you've never kissed before, that you're on a first date.
00:24:28.000Those kind of things, they're real moments.
00:24:31.000It just laughs, having a glass of wine with a friend, just laughing.
00:24:36.000Not quite the same in VR. Yeah, it's not.
00:24:38.000Now, when the VR goes super far, so you can't – it's like you are jacked in on your brain and you can't tell what's real and what's not.
00:24:46.000And then everybody gets like super deep on the simulation hypothesis or the like Eastern religion or whatever.
00:24:51.000And I don't know what happens at that point.
00:24:53.000Do you ever fuck around with simulation theory?
00:24:55.000Because the real problem is when you combine that with probability theory and you talk to the people that say, well, if you just look at the numbers, the probability that we're already in a simulation is much higher than the probability that we're not.
00:25:12.000It's never been clear to me what to do about it.
00:26:40.000I was reading about how horrible systems like ChatGPT and Google are from an environmental standpoint because, you know, they're using, like, some extremely tiny amount of energy for each query.
00:26:50.000And, you know, how we're all destroying the world.
00:27:05.000What we should be looking at is the spectacular changes that are possible through this.
00:27:12.000And all the problems, the insurmountable problems that we have with resources, with the environment, with cleaning up the ocean, climate change, there's so many problems that we have.
00:27:22.000We need this to solve all of everything else.
00:27:24.000And that's why we need President AI. If AI could make every scientific discovery, but we still had human presidents, do you think we'd be okay?
00:27:35.000No, because those creeps would still be pocketing money and they'd have offshore accounts and it would always be a weird thing of corruption and how to mitigate that corruption, which is also one of the fascinating things about the current state of technology is that we're so much more aware of corruption.
00:27:52.000There's so much independent reporting and we're so much more cognizant of the actual problems. This is really great.
00:28:03.000One of the things that I've observed, and obviously many other people have too, is that corruption is such an incredible hindrance to getting anything done in a society, to making forward progress.
00:32:45.000But I think this idea that we have a global currency that is outside of the control of any government is a super logical and important step on the tech tree.
00:33:00.000I mean, why should the government control currency?
00:33:02.000I mean the government should be dealing with all the pressing environmental, social, infrastructure issues, foreign policy issues, economic issues.
00:33:11.000The things that we need to be governed in order to have a peaceful and prosperous society that's equal and equitable.
00:33:20.000What do you think happens to money and currency after AGI? I've wondered about that because I feel like with money, especially when money goes digital, the bottleneck is access.
00:33:32.000If we get to a point where all information is just freely shared everywhere, there are no secrets, there are no boundaries, there are no borders.
00:34:41.000I think there will always be goods that we want to be scarce and expensive, but it'll only be those goods that we want to be scarce and expensive and services that still are.
00:34:50.000And so money in a world like that, I think, is just a very curious idea.
00:35:19.000My current best idea, and maybe there's something better, is I think if we are right, a lot of reasons we could be wrong, but if we are right that the AGI systems, of which there will be a few, become the high-order bits of influence, whatever,
00:36:01.000You know, the hardliners, the people that are against, like, welfare and against any sort of universal basic income, UBI, what they're really concerned with is human nature, right?
00:36:15.000They believe that if you remove incentives, if you just give people free money, they become addicted to it, they become lazy.
00:36:22.000But isn't that a human biological and psychological bottleneck?
00:36:31.000With the implementation of artificial intelligence combined with some sort of neural interface, whether it's external or internal.
00:36:43.000It seems like that's a problem that can be solved.
00:36:48.000That you can essentially, and this is where it gets really spooky, you can re-engineer the human biological system and you can remove all of these problems that people have that are essentially problems that date back to human reward systems when we were tribal people.
00:37:06.000Hunter-gatherer people, whether it's jealousy, lust, envy, all these variables that come into play when you're dealing with money and status and social status.
00:37:18.000If those are eliminated with technology, essentially we become the next version of what the human species can be.
00:37:27.000Look, we're very, very far removed from tribal, brutal societies of cave people.
00:37:37.000We all agree that this is a way better way to live.
00:37:43.000I was talking about this at my comedy club last night.
00:37:47.000Because my wife was, we were talking about DNA, and my wife was saying that, look, everybody came from cave people, which is kind of a fucked up thought, that everyone here is here because of cave people.
00:37:59.000Well, all that's still in our DNA. All that's still—and these reward systems can be hijacked, and they can be hijacked by just giving people money.
00:38:12.000You'll just have money and just lay around and do drugs.
00:38:16.000That's the fear that people have of giving people free money.
00:38:21.000But if we can figure out how to literally engineer the human biological vehicle and remove all those pitfalls, if we can enlighten people technologically,
00:38:38.000maybe enlighten is the wrong word, but advance the human species to the point where those are no longer dilemmas because those are easily solvable through coding.
00:38:51.000They're easily solvable through enhancing the human biological system, perhaps raising dopamine levels to the point where anger and fear and hate are impossible.
00:39:50.000If I could push a button to remove all human striving and conflict, I wouldn't do it, first of all.
00:39:59.000I think that's a very important part of our story and experience.
00:40:04.000And also, I think we can see both from our own biological history and also from what we know about AI, that very simple goal systems, fitness functions,
00:40:20.000reward models, whatever you want to call it, lead to incredibly impressive results.
00:40:26.000You know, if the biological imperative is survive and reproduce, look how far that has somehow gotten us as a society.
00:40:34.000All of this, all this stuff we have, all this technology, this building, whatever else.
00:42:45.000It's not fair to say there are none, but there are fewer than there were before.
00:42:49.000And I think that's bad for society at all levels.
00:42:55.000Tech company founders is one example, but people who go off and create something new, who push on a disagreeable or controversial idea, we need that to drive forward.
00:43:11.000We need people to be able to put out ideas and be wrong and not be ostracized from society for it or not have it be something that they get canceled for or whatever.
00:43:22.000We need people to be able to take a risk in their career because they believe in some important scientific quest that may not work out or may sound like really controversial or bad or whatever.
00:43:33.000You know, certainly when we started OpenAI and we were saying, we think this AGI thing is real and could be done, unlikely, but so important if it happens.
00:43:43.000And all of the older scientists in our field were saying, those people are irresponsible.
00:43:48.000You shouldn't talk about AGI. That's like, you know, they're like selling a scam, or they're like, you know, they're kind of being reckless and it's going to lead to an AGI winter.
00:43:59.000Like, we said we believed, we said at the time, we knew it was unlikely, but it was an important quest.
00:44:04.000And we were going to go after it and kind of like, fuck the haters.
00:44:28.000I'm tempted to blame the education system, but I think that interacts with society in all of these strange ways.
00:44:40.000It's funny, there was this thing all over my Twitter feed recently trying to talk about what caused the drop in testosterone in American men over the last few decades.
00:44:50.000And no one was like, this is a symptom, not a cause.
00:44:55.000And everyone was like, oh, it's the microplastics, it's the birth control pills, it's the whatever, it's the whatever, it's the whatever.
00:45:00.000And I think this is not at all the most important thing. It was interesting to me, sociologically, that there was only talk about what caused it,
00:45:19.000not about it being an effect of some sort of change in society.
00:45:41.000It was just interesting to me that the only talk was about like biological factors and not that somehow society can have some sort of effect on...
00:45:52.000Do you know what the answer to this is?
00:45:54.000I mean I've had a podcast with Dr. Shanna Swan, who wrote the book Count Down, and that is all about the introduction of petrochemical products and the correlating drop in testosterone, rise in miscarriages, the fact that these are ubiquitous endocrine disruptors that, when they do blood tests on people,
00:46:48.000The real concern is with mammals, because when they've done studies with mammals and they've introduced phthalates into their bodies, there's a correlating...
00:47:00.000One thing that happens is these animals, their taints shrink.
00:47:07.000In mammals, when you look at males, it's 50% to 100% larger than in females.
00:47:13.000With the introduction of phthalates in the males, the taints start shrinking, the penises shrink, the testicles shrink, sperm count shrinks.
00:47:20.000So we know there's a direct biological connection between these chemicals and how they interact with bodies.
00:47:34.000The amount of petrochemical products that we have, the amount of plastics that we use, it is such an integral part of our culture and our society, our civilization.
00:48:25.000We need the good strong people to protect us from the bad strong people.
00:48:29.000But if we're in the process of integrating with technology, if technology is an inescapable part of our life, if it is everywhere, you're using it, you have the internet of everything that's in your microwave, your television, your computers,
00:48:46.000everything you use, as time goes on, that will be more and more a part of your life, and as these plastics are introduced into the human biological system, you're seeing a feminization of the males of the species.
00:49:01.000You're seeing a decline in birth rate.
00:49:04.000You're seeing all these correlating factors that would sort of lead us to become this more peaceful, less violent, less aggressive, less ego-driven thing.
00:49:20.000Which the world is definitely becoming over time.
00:49:24.000And I'm all for less violence, obviously.
00:49:34.000Look, obviously testosterone has many great things to be said for it and some bad tendencies too.
00:49:41.000But I don't think a world, if we leave that out of the equation and just say a world that has a spirit of we're going to defend ourselves, we're going to find a way to protect ourselves and our tribe and our society into this future,
00:50:05.000which you can get with lots of other ways.
00:51:10.000But what I'm saying is if the human species does integrate with technology, wouldn't a great way to facilitate that be to kind of feminize the primal apes and to sort of downplay the role— You mean like should the AGI phthalates enter the world?
00:51:30.000I don't know if it's AGI. I mean maybe it's just an inevitable consequence of technology.
00:51:36.000Because especially the type of technology that we use, which does have so much plastic in it, and then on top of that the technology that's involved in food systems, preservatives, all these different things that we use to make sure that people don't starve to death.
00:51:49.000We've made incredible strides in that.
00:51:51.000There are very few people in this country that starve to death.
00:51:55.000It's not a primary issue, but violence is a primary issue.
00:52:01.000But our concerns about violence and our concerns about testosterone and strong men and powerful people is only because...
00:52:33.000I think it's a biological imperative, right?
00:52:35.000And I think that biological imperative is because we used to have to defend against incoming tribes and predators and animals.
00:52:42.000And we needed someone who was stronger than most to defend the rest.
00:52:48.000And that's the concept of the military.
00:52:50.000That's why Navy SEAL training is so difficult.
00:52:53.000We want the strongest of the strong to be at the tip of the spear.
00:52:57.000But that's only because there's people like that out there that are bad.
00:53:03.000If artificial general intelligence and the implementation of some sort of a device that changes the biological structure of human beings to the point where that is no longer a concern, like if you are me and I am you and I know this because of technology,
00:53:20.000Yeah, look, by the time if this goes all the way down the sci-fi path and we're all like merged into this one single like planetary universal whatever consciousness, then yes, you don't.
00:53:47.000They have some sort of a telepathic way of communicating.
00:53:50.000They probably don't need sounds with their mouths.
00:53:53.000And they don't need this urge that we have to conquer and to spread our DNA. Like that's so much of what people do is these reward systems that were established when we were territorial apes.
00:54:07.000There's a question to me about how much you can ever get rid of that.
00:54:15.000If you make an AGI and it decides, actually, we don't need to expand.
00:54:39.000I don't agree that that would be the logical conclusion.
00:54:41.000I think the logical conclusion would be that they would look for problems and frontiers that are insurmountable to our current existence, like intergalactic communication and transportation.
00:54:57.000What happens when it meets another AGI the other galaxy over?
00:54:59.000What happens when it meets an AGI that's a million years more advanced?
00:55:44.000The thing I was like reflecting on as you were saying that is, I don't think I... I'm not as optimistic that we can or even should overcome our biological base to the degree that I think you think we can.
00:56:02.000And, you know, to even go back one further level, like, I think society is happiest where there's, like, roles for strong femininity and strong masculinity in the same people and in different people.
00:56:24.000And I don't think a lot of these, like, deep-seated things are gonna be able to get pushed aside very easily and still have a system that works.
00:56:36.000Like, sure, we can't really think about what, if there were consciousness in a machine someday or whatever, what that would be like.
00:56:44.000And maybe I'm just, like, thinking too small-mindedly, but I think there is something— But don't you think that cave people would probably have those same logical conclusions about life and sedentary lifestyle and sitting in front of a computer and not interacting with each other except
00:57:19.000I mean, isn't that, like, what you're saying is correct.
00:57:21.000How different do you think our motivations are today and kind of what really brings us genuine joy and how we're wired at some deep level differently than cave people?
00:57:31.000Clearly, lots of other things have changed.
00:57:35.000But how different do you think it really is?
00:57:37.000I think that's the problem, is that genetically, at the base level, there's not much difference.
00:57:44.000And that these reward systems are all – we interact with all of them, whether it's ego, lust, passion, fury, anger, jealousy, all these different things.
00:57:59.000And you think what will happen is – some people will upload and edit those out.
00:58:03.000I think that our concern with losing this aspect of what it means to be a person – Like the idea that we should always have conflict and struggle because conflict and struggle is how we facilitate progress, which is true, right?
00:58:18.000And combating evil is how the good gets greater and stronger if the good wins.
00:58:23.000But my concern is that that is all predicated on the idea that the biological system that we have right now is correct and optimal.
00:58:36.000And I think one of the things that we're dealing with, with the heightened states of depression and anxiety and the lack of meaning and existential angst that people experience, a lot of that is because the biological reality of being a human animal doesn't really integrate that well with this world that we've created.
00:59:00.000And I wonder if the solution to that is not to find ways to find meaning with the biological vessel that you've been given, but rather to engineer those aspects that are problematic out of the system.
00:59:33.000There's never been a time in human history where we haven't had war.
00:59:36.000If you had to say, what is our number one problem as a species?
00:59:41.000I would say our number one problem is war.
00:59:44.000Our number one problem is this idea that it's okay to send massive groups of people who don't know each other to go murder massive groups of people that are somehow opposed because of the government, because of lines in the sand and territory.
01:00:02.000Well, one of the ways you get rid of that is to completely engineer out all the human reward systems that pertain to the acquisition of resources.
01:01:42.000But it's a fascinating field of study, but I think it has something to do with resilience and resistance and the fact that your body has to combat this external factor that's very extreme that causes the body to go into this state of preservation and the implementation of cold shock proteins and the reduction of inflammation,
01:02:05.000which also enhances the body's endocrine system.
01:02:08.000But then on top of that, this imperative that you have to become more resilient to survive this external factor that you've introduced into your life every single day.
01:02:19.000So there's ways, obviously, that you can make a human being more robust.
01:02:26.000You know, we know that we can do that through strength training and that all that stuff actually does raise testosterone.
01:02:31.000Your diet can raise testosterone, and a poor diet will lower it and will hinder your endocrine system, will hinder your ability to produce growth hormone, melatonin, all these different factors.
01:02:45.000That seems to be something that we can fix in terms or at least mitigate with decisions and choices and effort.
01:02:54.000But the fact that these petrochemical – like there's a graph that Dr. Shanna Swan has in her book that shows during the 1950s when they start using petrochemical products in everything, microwave, plastic, saran wrap,
01:04:21.000But I'm also – I try to be very objective.
01:04:24.000And when I objectively look at it in terms of like if you take where we are now and all of our problems and you look towards the future and like – What would be one way that you could mitigate a lot of these?
01:04:38.000And it would be the implementation of some sort of a telepathic technology where, you know, you couldn't just text someone or tweet at someone something mean because you would literally feel what they feel when you put that energy out there and you would be repulsed.
01:04:56.000And then violence would be if you were committing violence on someone and you literally felt the reaction of that violence in your own being.
01:05:08.000You would also have no motivation for violence.
01:05:12.000If we had no aggressive tendencies, no primal chimpanzee tendencies.
01:05:17.000You know, it's true that violence in the world has obviously gone down a lot over the decades, but emotional violence is up a lot, and the internet has been horrible for that.
01:06:14.000And it's very damaging to women, particularly young girls.
01:06:18.000Young girls growing up, there's a direct correlation between the invention of social media, the introduction of the iPhone, and self-harm, suicide, online bullying.
01:06:29.000People have always talked shit about people when no one's around.
01:06:32.000The fact that they're doing it now openly to harm people.
01:07:16.000Do you know how many fucking times I've got up to go to the bathroom first thing in the morning and spent an hour just sitting on the toilet scrolling through Instagram?
01:07:27.000And there's this thought that I'm going to get something out of it.
01:07:31.000I was thinking actually just yesterday about how, you know, we all have talked for so long about these algorithmic feeds are going to manipulate us in these big ways and that will happen.
01:07:44.000But in the small ways already where like scrolling Instagram is not even that fulfilling, like you finish that hour and you're like, I know that was a waste of my time.
01:07:55.000But it was like over the threshold where you couldn't quite...
01:08:26.000Well, there's just a lot of shit out there, unfortunately.
01:08:30.000But it's just in terms of, you know, I was talking to Sean O'Malley, who's this UFC fighter, who's, you know, obviously has a very strong mind, really interesting guy.
01:08:39.000But one of the things that Sean said is, like, I get this, like, low-level anxiety from scrolling through things, and I don't know why.
01:09:24.000This is a completely new technology that, again, hijacks our human reward systems and hijacks all of the checks and balances that are in place for communication, which historically has been one-on-one.
01:09:41.000Historically, communication has been one person to another.
01:09:44.000And when people write letters to each other, it's generally things like if someone writes a love letter or, you know, they miss you.
01:09:52.000They're writing this thing where they're kind of exposing a thing that maybe they have a difficulty in expressing in front of you.
01:09:59.000And it was, you know, generally, unless the person was a psycho, they're not hateful letters.
01:10:05.000Whereas the ability to just communicate, fuck that guy, I hope he gets hit by a bus, is so simple and easy, and you don't experience...
01:10:15.000Twitter seems to be particularly horrible for this, the way the mechanics work.
01:10:21.000It really rewards, in ways that I don't think anybody fully understands, something that taps into human psychology.
01:11:18.000And they get meaning out of it in terms of reinforcement.
01:11:24.000They get short-term meaning out of it.
01:11:25.000I think maybe each day you go to bed feeling like you accomplished something and got your dopamine and at the end of each decade you probably are like, where'd that decade go?
01:11:33.000I was talking to a friend of mine who was having a real problem with it.
01:11:35.000He's saying he would be literally walking down the street and he'd have to check his phone to see who's replying.
01:11:40.000He wasn't even looking where he was walking.
01:11:42.000He was just like caught up in the anxiety of these exchanges.
01:11:46.000And it's not because of the nice things people say.
01:13:03.000I think I've watched it, like, destroy is too strong of a word, but, like, knock off track the careers or lives or happiness or human relationships of people that are, like, good, smart, conscientious people that just, like, couldn't fight this demon because it,
01:13:36.000And then there was a psychological aspect of it, like the angst that came from being socially isolated and terrified about this invisible disease that's going to kill us all.
01:13:46.000And, you know, so you have this like, and then you're interacting with people on Twitter, and then you're caught up in that anxiety, and you're doing it all day.
01:13:54.000And I know quite a few people, especially comedians, that really lost their minds and lost the respect of their peers by doing that.
01:14:02.000I have a lot of sympathy for people who lost their minds during COVID, because what a natural thing for us all to go through.
01:14:13.000And I don't think the internet, and particularly not the kind of social dynamics of things like Twitter, I don't think that brought out anyone's best.
01:14:22.000Well, I mean, some people, I think if they're not inclined to be shitty to people, I think some people did seek comfort and they did interact with people in positive ways.
01:14:56.000If you looked at a pie chart of the amount of interactions on Twitter, I would say a lot of them are shitting on people and being angry about things.
01:15:05.000How many of the people that you know that use Twitter those 8 or 10 hours a day are just saying wonderful things about other people all day versus the virulent...
01:15:19.000But then again, I wonder, with the implementation of some new technology that makes communication a very different thing than what we're currently...
01:15:29.000Like, what we're doing now with communication is less immersive than communicating one-on-one.
01:15:35.000You and I are talking, we're looking into each other's eyes, we're getting social cues, we're smiling at each other, we're laughing.
01:15:45.000I wonder if through the implementation of technology, if it becomes even more immersive than a one-on-one conversation, even more interactive, and you will understand even more about the way a person feels about what you say.
01:16:03.000About that person's memory, that person's life, that person's history, their education, how it comes out of their mind, how their mind interacts with your mind, and you see them.
01:16:29.000If you can like feel what I feel when you say something mean to me or nice to me, like that's clearly going to change what you decide to say.
01:16:52.000When we're dealing with the human mind, we're dealing with various diseases, bipolar, schizophrenia.
01:16:58.000Imagine a world where we can find the root cause of those things and through coding and some sort of an implementation of technology that elevates dopamine and serotonin and does some things to people that eliminates all of those problems.
01:17:21.000And allows people to communicate in a very pure way.
01:17:58.000I mean, when did OpenAI—when did you first start this project?
01:18:05.000End of 2015, early 2016. And when you initially started this project, what kind of timeline did you have in mind and has it stayed on that timeline or is it just wildly out of control?
01:18:22.000I remember talking with John Schulman, one of our co-founders, early on, and he was like, yeah, I think it's going to be about a 15-year project.
01:18:30.000And I was like, yeah, that sounds about right to me.
01:18:33.000And I've always sort of thought since then, now, I no longer think of like AGI as quite the end point, but to get to the point where we like accomplish the thing we set out to accomplish, you know, that would take us to like 2030, 2031. That has felt to me like all the way through kind of a reasonable estimate with huge error bars.
01:18:57.000And I kind of think we're on the trajectory I sort of would have assumed.
01:19:07.000On society would be like: did you, when you first started doing this, say, okay, if we are successful and we do create some massively advanced AGI, what is the implementation?
01:19:51.000But if something goes horribly wrong, it's like really horribly wrong.
01:19:55.000And so there was a lot of discussion about and really a big part of the founding spirit of this is like, how are we going to solve this safety problem?
01:21:11.000Like, you know, does it mean don't say something people find offensive?
01:21:13.000Or does it mean don't destroy all of humanity or some continuum?
01:21:17.000And I think the word has, like, gotten overloaded.
01:21:20.000But in terms of the like, not destroy all of humanity version of it, we have a lot of work to do.
01:21:26.000But I think we have finally more ideas about what can work.
01:21:30.000And given the way the systems are going, we have a lot more opportunities available to us to solve it than I thought we would have, given the direction that we initially thought the technology was going to go.
01:21:43.000On the positive side, the thing that I was most excited about then and remain most excited about now is: what if this system can dramatically increase the rate of scientific knowledge in society?
01:22:00.000I think that kind of like all real sustainable economic growth, the future getting better, progress in some sense comes from increased scientific and technological capacity.
01:22:15.000That's how we can solve all the problems.
01:22:17.000And if the AI can help us do that, that's always been the thing I've been most excited about.
01:22:22.000Well, it certainly seems like that is the greatest potential, the greatest positive potential of AI. It is to solve a lot of the problems that human beings have had forever, a lot of the societal problems that seem to be—I mean,
01:22:38.000that's what I was talking about in the AI president.
01:22:40.000I'm kind of not joking because I feel like if something was hyper-intelligent and aware of all the variables with no human bias— And no incentives.
01:22:52.000Other than, here's your program, the greater good for the community of the United States and the greater good for that community as it interacts with the rest of the world.
01:23:06.000The elimination of these dictators, whether they're elected or non-elected, who impose their will on the population because they have a vested interest in protecting special interest groups and industry.
01:23:41.000If it's instead like it is the collective will of humanity being expressed without the mistranslation and corrupting influences along the way, then I can see it.
01:25:38.000I mean, that— I don't think you can control it at this point other than some massive natural disaster that resets us back to the Stone Age, which is also something we should be very concerned with because it seems like that happens a lot.
01:25:51.000We're not aware of it because the timeline of a human body is so small.
01:25:54.000The timeline of the human existence as a person is a hundred years if you're lucky, but yet the timeline of the Earth is billions of years, and if you look at how many times life on Earth has been reset by comets slamming into the Earth and just completely eliminating all technological advancement.
01:26:15.000It seems like it's happened multiple times in recorded history.
01:26:21.000I always think we don't think about that quite enough.
01:26:29.000We talked about the simulation hypothesis earlier.
01:26:31.000It's had this big resurgence in the tech industry recently.
01:26:34.000One of the new takes on it as we get closer to AGI is that if our descendants were simulating us, the time they'd want to simulate again and again is right up to the...
01:26:44.000Right up to the creation of AGI. So it seems very crazy.
01:26:52.000This is the time that is after we had enough cell phones out in the world recording tons of video to train the video model of the world that's all being jacked into us now via brain implants or whatever.
01:27:02.000And before everything goes really crazy with AGI. And it's also this interesting time to simulate, like, can we get through?
01:27:10.000Does the asteroid come right before we get there for dramatic tension?
01:27:13.000Like, do we figure out how to make this safe?
01:27:14.000Do we figure out how to societally agree on it?
01:27:16.000So that's led to, like, a lot more people believing it than before, I think.
01:27:42.000It's the best, but it also has the most problems, the most social problems, the most awareness of social, environmental, infrastructure, the issues that we have.
01:27:56.000And I intuitively, I think I feel something somewhat different than you, which is I think humans in something close to this form are going to be around.
01:28:27.000I could totally imagine a world where some people decide to merge and go off exploring the universe with AI and there's a big universe out there, but, like...
01:28:37.000Can I really imagine a world where, short of a natural disaster, there are not humans pretty similar to humans from today on Earth doing human-like things?
01:28:48.000And the sort of spirit of humanity merged into these other things that are out there doing their thing in the universe?
01:28:56.000It's very hard for me to actually see that happening.
01:29:01.000And maybe that means I'm, like, going to turn out to be a dinosaur and a Luddite and horribly wrong in this prediction.
01:29:06.000But I would say I feel it more over time as we make progress with AI, not less.
01:29:27.000When someone figured out shoes, I think that probably took a while.
01:29:30.000When they figured out structures, doors, houses, cities, agriculture, all those things were slowly implemented over time and then now become everywhere.
01:29:40.000I think this is far more transformative.
01:29:44.000And is part of that because you don't think there will be an option for some people not to merge?
01:30:33.000I still have my other phone that I use for social media, but when I pick that motherfucker up, I start scrolling through YouTube and watching videos and scrolling through TikTok or Instagram.
01:30:45.000I don't have TikTok, but I tried threads for a little while, but I'm like, oh, this is like a fucking ghost town.
01:30:52.000So I went right back to X. I live on a ranch during the weekends and there's Wi-Fi in the house but there's no cell phone coverage anywhere else.
01:31:58.000And so the phone was broken and so I had to order a phone and we were on vacation for like eight days and it took three days for Apple to get me a phone.
01:33:17.000Can you imagine going up to someone in 1984 and pointing to a phone and saying, one day that'll be in your pocket and it's going to ruin your life?
01:34:23.000And also the understanding of the wide spectrum of human behavior.
01:34:29.000If you're a nice person and you surround yourself with nice people, you very rarely see someone get shot.
01:34:35.000You very rarely see people get stabbed for no reason randomly on the street.
01:34:41.000But on Instagram, you see that every day.
01:34:45.000And there's something about that where it just reminds you, oh, we're crazy.
01:34:50.000Like, the human species, like, there's a certain percentage of us that are just off the rails and just out there, just causing chaos and jumping dirt bikes and landing on your neck and, like, all that stuff is out there.
01:35:05.000See, even to hear that makes me, like, physically, like, I know that happens, of course.
01:36:01.000But Instagram shows them to me anyway.
01:36:04.000I heard an interesting thing a few days ago about Instagram and the feed, which is if you use it at off hours, when they have more processing capability available because less people are using it, you get better recommendations.
01:36:16.000So your feed will be better in the middle of the night.
01:37:10.000Because I don't think it's outside the realm of possibility.
01:37:14.000If we really truly can engineer, like one of the things about Neuralink that's really promising is people with spinal cord issues, injuries, people that can't move their body, and being able to hotwire that where essentially it controls all these parts of your body that you couldn't control anymore.
01:37:35.000And so that would be an amazing thing for people that are injured, amazing thing for people that are, you know, they're paralyzed, they have all sorts of neurological conditions.
01:37:46.000That is probably one of the first, and that's what Elon's talked about as well, one of the first implementations, the restoration of sight, you know, cognitive function enhanced from people that have brain issues.
01:38:03.000And like many other technologies, I don't think we can stop neural interfaces because of the great good that's going to happen along the way.
01:38:12.000But I also don't think we know where it goes.
01:38:16.000And I think when we open it, we're not going to go back.
01:38:20.000Just like we're not going to go back to no computers without some sort of natural disaster.
01:38:25.000By the way, and I mean this as a great compliment, you are one of the most neutral people I have ever heard talk about the merge coming.
01:38:33.000You're just like, yeah, I think it's going to happen.
01:38:35.000You know, it's going to be good in these ways, bad in these ways, but you seem, like, unbelievably neutral about it, which is always something I admire.
01:38:43.000I try to be as neutral about everything as possible, except for corruption, which I think is just, like, one of the most massive problems with the way our culture is governed.
01:38:56.000I think corruption and the influence of money is just a giant terrible issue.
01:39:02.000But in terms of like social issues and in terms of the way human beings believe and think about things, I try to be as neutral as possible.
01:39:12.000Because I think the only way to really truly understand the way other people think about things is try to look at it through their mind.
01:39:19.000And if you have this inherent bias and this...
01:39:24.000You have this very rigid view of what's good and bad and right and wrong.
01:39:31.000I don't think that serves you very well for understanding why people differ.
01:39:36.000So I try to be as neutral and as objective as possible when I look at anything.
01:41:08.000And when I had the one recently, the overwhelming message that I was getting through this was that everything I say and do ripples off into all the people that I interact with.
01:41:24.000And that if I'm not doing something with at least the goal of overall good or overall understanding, then I'm doing bad, and that bad is a real thing, as much as you try to ignore it because you don't interface with it instantly,
01:41:41.000you're still creating unnecessary negativity.
01:41:48.000And that I should avoid that as much as possible.
01:41:50.000It was like an overwhelming message that this psychedelic experience was giving me.
01:41:58.000And I took it because I was just particularly anxious that day about the state of the world, particularly anxious about Ukraine and Russia and China and the political system that we have in this country and this incredibly polarizing way that the left and the right engage with each other and God,
01:42:35.000Are you surprised psychedelic therapy has not made...
01:42:38.000From what you thought would happen in, say, the early 2010s until now, are you surprised it has not made more progress sort of on a path to legalization as a medical treatment?
01:42:48.000No, I'm not because there's a lot of people that don't want it to be in place and those people have tremendous power over our medical system and over our regulatory system.
01:42:58.000And those people have also not experienced these psychedelics.
01:43:02.000There's very few people that have experienced profound psychedelic experiences that don't think there's an overall good for those things.
01:43:10.000So the problem is you're having these laws and these rules implemented by people who are completely ignorant about the positive effects of these things.
01:43:21.000And if you know the history of psychedelic prohibition in this country, It all took place during 1970, and it was really to stop the civil rights movement, and it was really to stop the anti-war movement.
01:43:36.000And they tried to find a way to make all these things that these people were doing that was causing them to think in these very different ways.
01:44:11.000You know, the crack cocaine crisis that we experienced in the 90s, like all of those things, they're under the blanket of drugs.
01:44:18.000Psychedelic drugs are also talked about like drugs, even though they have these profound spiritual and psychological changes. I remember when I was in elementary school and I was in, like, drug education,
01:44:34.000they talked about, you know, marijuana is really bad because it's a gateway to these other things.
01:44:39.000And there's this bad one, that bad one, heroin, whatever.
01:44:41.000And the very end of the line, the worst possible thing is LSD. Did you take LSD and go, oh, they're lying?
01:44:49.000Psychedelic therapy was definitely one of the most important things in my life.
01:44:55.000And I assumed, given how powerful it was for me, I struggled with all kinds of anxiety and other negative things.
01:45:16.000And I was like, I've been lied to my whole life.
01:45:18.000I'm so grateful that this happened to me now.
01:45:21.000Talked to a bunch of other people, all similar experiences.
01:45:24.000I assumed, and this was a while ago, that it would happen. I was, you know, very interested in what was happening in the U.S., particularly looking at where MDMA and psilocybin were on the path.
01:45:37.000And I was like, all right, this is going to get through.
01:45:39.000And this is going to change the mental health of a lot of people in a really positive way.
01:45:44.000And I am surprised we have not made faster progress there, but I'm still optimistic we will.
01:45:48.000Well, we have made so much progress from the time of the 1990s.
01:45:55.000In the 1990s, you never heard about psychedelic retreats.
01:45:58.000You never heard about people taking these vacations.
01:46:01.000You never heard about people getting together in groups and doing these things and coming back with these profound experiences that they relayed to other people and literally seeing people change.
01:46:25.000I mean, I can only talk about it from a personal experience.
01:46:28.000It's been a radical change in my life, as well as, again, having all these conversations with different people.
01:46:33.000I feel so fortunate to be able to do this, that I've had so many different conversations with so many different people that think so differently, and so many exceptional people.
01:47:03.000And there are tools that are in place.
01:47:06.000But unfortunately, in this very prohibitive society, this society of prohibition, we're denied those.
01:47:15.000And we're denied ones that have never killed anybody, which is really bizarre when OxyContin can still be prescribed.
01:47:23.000What's the deal with why we can't make...
01:47:27.000If we leave aside, like, why we can't get these medicines that have transformed people's lives more available, what's the deal with why we can't stop the opioid crisis?
01:47:38.000Or, like, fentanyl seems like just an unbelievable crisis for San Francisco.
01:47:43.000You remember in the beginning of the conversation when you said that AI will do a lot of good, overall good, but also not no harm?
01:47:53.000If we legalize drugs, all drugs, that would do the same thing.
01:47:59.000Would you advocate to legalize all drugs?
01:48:01.000It's a very complicated question because I think you're going to have a lot of addicts that wouldn't be addicts.
01:48:07.000You're going to have a lot of people's lives destroyed because it's legal.
01:48:10.000There's a lot of people that may not be psychologically capable of handling things.
01:48:17.000They do not ever recommend them for people that have a slippery grasp on reality as it is.
01:48:22.000People that are struggling, people that are already on a bunch of medications that allow them to just keep a steady state of existence in the normal world.
01:48:32.000If you just fucking bombard them with psilocybin, who knows what kind of an effect that's going to have and whether or not they're psychologically too fragile to recover from that.
01:48:43.000I mean, there's many, many stories of people taking too much acid and never coming back.
01:49:06.000Whether it's, we're both drinking coffee, you know, people smoke cigarettes, they do all, they take Adderall, they do all sorts of different things to change and enhance their normal state of consciousness.
01:49:18.000It seems like, whether it's meditation or yoga, they're always doing something to try to get out of their own way.
01:49:48.000And then some things can make you happy, sort of.
01:49:51.000A couple of drinks makes you so happy for a little bit, until you're an alcoholic, until you destroy your liver, until you crash your car, until you're involved in some sort of a violent encounter that you would never be involved with if you weren't drunk.
01:50:07.000You know, I love caffeine, which clearly is a drug.
01:50:12.000Alcohol, like, I like, but I often am like, yeah, this is like, you know, this is like dulling me and I wish I hadn't had this drink.
01:50:21.000And then other stuff, like, I mostly would choose to avoid.
01:50:29.000And you're probably aware of the pros and cons and you're also probably aware of how it affects you and what's doing good for you and what is detrimental to you.
01:50:39.000But that's a decision that you can make as an informed human being that you're not allowed to make if everything's illegal.
01:51:39.000I think he described heroin as getting a warm hug from God.
01:51:44.000I think the feeling that it gives you is probably pretty spectacular.
01:51:49.000I don't know if legalizing that is going to solve the problems, but I do know that another problem we're not paying attention to is the rise of the cartels, and the fact that right across our border, where you can walk, there are these enormous,
01:52:07.000enormous organizations that make who knows how much money, untold, incalculable amounts of money, selling drugs and bringing them into this country.
01:52:17.000And one of the things they do is they put fentanyl in everything to make things stronger.
01:52:21.000And they do it for, like, street Xanax.
01:52:25.000There's people that have overdosed thinking they're getting Xanax, and they fucking die from fentanyl.
01:52:28.000Yeah, they do it with cocaine, of course.
01:52:33.000There's so many things that have fentanyl in them and they're cut with fentanyl because fentanyl is cheap and insanely potent.
01:52:42.000And that wouldn't be a problem if things were legal.
01:52:44.000So would you net out towards saying, all right, let's just legalize it?
01:52:48.000Yeah, I would net out towards that, but I would also put into place some serious mitigation efforts, like in terms of counseling, drug addiction, and ibogaine therapy, which is another thing that's illegal.
01:52:58.000Someone was just telling me about how transformative this was for them.
01:53:01.000Yeah, I haven't experienced that personally, but ibogaine, for many of my friends that have had pill problems... and I have a friend, my friend Ed Clay, who started an ibogaine center in Mexico because he had an injury and he got hooked on pills and he couldn't kick it.
01:53:54.000It seems to be effective when people have really hit rock bottom, and they have a strong will, and then they get involved in a program, some sort of a 12-step program, some sort of a Narcotics Anonymous program.
01:54:05.000And then they get support from other people, and they eventually build this foundation of other types of behaviors and ways to find other things to focus on, for whatever aspect of their mind allows them to be addicted to things.
01:54:21.000Now it's focused on exercise, meditation, yoga, whatever it is.
01:54:25.000That's your new addiction and it's a much more positive and beneficial addiction.
01:54:29.000But with the reality of the physical addiction, there are mitigation efforts.
01:54:35.000Like there's so many people that have taken psilocybin and completely quit drugs, completely quit cigarettes, completely quit a lot because they realize like, oh, this is what this is.
01:54:47.000Yeah, that's why I was more optimistic that the world would have made faster progress towards acceptance of – you hear so many stories like this.
01:54:56.000So I would say, all right, clearly a lot of our existing mental health treatment at best doesn't work.
01:55:01.000Clearly our addiction programs are ineffective.
01:55:05.000If we have this thing that in every scientific study or most scientific studies we can see is delivering these unbelievable results, it's going to happen.
01:58:11.000Speaking of that, and psychedelics in general, many cultures have had a place for some sort of psychedelic time in someone's life or rite of passage.
01:58:21.000But as far as I can tell, most of them are under...
01:58:39.000And I hope that we as a society, because I think this is going to happen even if it's slow, find a way to treat this with the respect that it needs.
01:58:53.000Yeah, I think set and setting is very important.
01:58:57.000And thinking about what you're doing before you're doing it and why you're doing it.
01:59:02.000Like I was saying the other night when I had this psychedelic experience, I was just like, God, sometimes I just think too much about the world and that it's so fucked.
01:59:11.000And you have kids and you wonder, like, what kind of a world are they going to grow up in?
01:59:15.000And it was just one of those days where I was just like, God, there's so much anger and there's so much this and that.
01:59:39.000It's like all these dynamics from social media we were talking about earlier.
01:59:42.000I think the dynamics in social media certainly exacerbated anger in some people.
01:59:46.000But I think anger in the world is just a part of
01:59:51.000frustration, inequality, problems that are so clear but are not solved, and all the issues that people have.
02:00:00.000I mean, it's not a coincidence that a lot of the mass violence that you're seeing in this country, mass looting and all these different things are being done by poor people.
02:00:09.000Do you think AGI will be an equalizing force for the world or further inequality?
02:00:13.000I think it depends on how it's implemented.
02:00:16.000My concern is, again, what we were talking about before with some sort of a neural interface, that it will increase your ability to be productive to a point where you can control resources so much more than anyone else.
02:00:30.000And you will be able to advance your economic portfolio and your influence on the world through that, your amount of power that you can acquire.
02:00:40.000It will happen before the other people can get involved, because I would imagine, financially, it'll be like cell phones.
02:00:48.000In the beginning, you remember in the movie Wall Street, when he had that big brick cell phone?
02:01:16.000And it was great, because my friend Bill Blumenreich, who runs the Comedy Connection, he would call me because he knew he could get a hold of me. If someone got sick or fell out, I could get a gig because he could call me. So I was in my car, like, "Joe?"
02:01:34.000He's like, "Fantastic!" And so he'd give me gigs. So I got a bunch of gigs through this phone; it kind of paid for itself. But I got it just because it was cool. Like, I could drive down the street and call people: "Dude, I'm driving and I'm calling you!" It was nuts to be able to drive,
02:01:50.000and I had a little antenna, a little squirrely pigtail antenna on my car, on the roof of the car.
02:02:48.000And once that happens, my concern is that the people that have that will have...
02:02:54.000such a massive advantage over everyone else that the gap between the haves and the have-nots will be even wider, and it'll be more polarizing.
02:03:04.000This is something I've changed my mind on.
02:03:07.000Someone at OpenAI said to me a few years ago, you really can't just let some people merge without a plan, because it could be such an incredible distortion of power.
02:03:19.000And we're going to have to have some sort of societal discussion about this.
02:03:39.000When you enter information into it and ask it questions, it can give you answers, and you could ask it to code a website for you, and it does it instantly, and it solves problems that would literally take you decades to try to solve.
02:04:22.000We still have all the human reward systems in place.
02:04:25.000We're still basically these territorial primates.
02:04:29.000And now we have, you know, you just imagine some fucking psychotic billionaire who now gets this implant and decides to just completely hijack our financial systems, acquire all the resources,
02:04:44.000set into place regulations and influences that only benefit them, and then make sure that they can control it from there on out.
02:04:51.000How much do you think this actually, though, even requires, like, a physical implant?
02:04:58.000Some people have access to GPT-7 and can spend a lot on the inference compute for it, and some don't.
02:05:05.000I think that's going to be very transformative too.
02:05:49.000When one person can read minds and other people can't, when one person has a completely accurate forecast of all of the trends in terms of stocks and resources and commodities, and they can make choices based on those.
02:06:08.000The only thing I feel a little confused about is, you know, human talking and listening bandwidth, or typing and reading bandwidth, is not very high.
02:06:22.000But it's high enough where if you can just say, like, tell me everything that's going to happen in the stock market if I want to go make all the money, what should I do right now?
02:06:29.000And it goes, and then just shows you on the screen.
02:06:32.000Even without a neural interface, you're kind of a lot of the way to the scenario you're describing.
02:06:50.000Like I think what somehow matters is access to massive amounts of computing power, especially like differentially massive amounts, maybe more than the interface itself.
02:07:00.000I think that certainly is going to play a massive factor in the amount of power and influence a human being has, having access to that.
02:07:10.000My concern is that what neural interfaces are going to do is now you're not a human mind interacting with that data.
02:07:20.000Now you are some massively advanced version of what a human mind is.
02:08:09.000Do you ever think, like, how am I at the forefront of this spectacular change?
02:08:21.000Well, first of all, I think it's very much like...
02:08:25.000You could make this statement about many companies, but for none is it as true as for OpenAI.
02:08:31.000The CEO is far from the most important person in the company.
02:08:35.000In our case, there's a large handful of researchers, each of whom is individually more critical to the success we've had so far, and that we will have in the future, than me.
02:08:48.000And I bet those people really are like, hmm, this is weird to be them.
02:08:54.000It's certainly weird enough for me that it, like, ups my simulation hypothesis probability somewhat.
02:09:46.000Well, if this really is—I mean, our inherent understanding of life is that we are these biological creatures that interact with other biological creatures.
02:09:55.000We mate and breed and that this creates more of us.
02:10:00.000And then hopefully as society advances and we acquire more information, more understanding and knowledge, this next version of society will be superior to the version that preceded it, which is just how we look at society today.
02:10:13.000Nobody wants to live in 1860 where you died of a cold and there's no cure for infections.
02:10:25.000Unless you really do prefer the simple life that you see on Yellowstone or something. What we're dealing with now, first of all, is access to information, the lack of ignorance.
02:10:39.000If you choose to seek out information,
02:10:44.000you have so much more access to it now than ever before.
02:10:48.000And over time, like if you go back to the beginning of written history to now, one of the things that is clearly evident is the more access to information, the better choices people can make.
02:11:00.000They don't always make better choices, but they certainly have much more of a potential to make better choices with more access to information.
02:11:08.000You know, we think that this is just this biological thing, but imagine if that's not what's going on.
02:11:14.000Imagine if this is a program and that you are just consciousness that's connected to this thing that's creating this experience that is indistinguishable from what we like to think of as a real biological experience from carbon-based life forms interacting with solid physical things in the real world.
02:11:41.000It's still unclear to me what I'm supposed to do differently or think differently.
02:13:25.000Those two things seem inevitable to me.
02:13:27.000The inevitable thing to me is that we will create a lifeform that is an artificial, intelligent lifeform that's far more advanced than us, and once it becomes sentient, it will be able to create a far better version of itself.
02:13:42.000And then as it has better versions of itself, it will keep going, and if it keeps going, it will reach God-like capabilities.
02:13:53.000The complete understanding of every aspect of the universe and the structure of the universe itself.
02:14:01.000How to manipulate it, how to travel through it, how to communicate.
02:14:07.000And that, you know, if we keep going, if we survive a hundred years, a thousand years, ten thousand years, and we're still on this same exponential path of increasing technological capability, that's God.
02:14:27.000That might be what intelligent, curious, innovative life actually does.
02:14:32.000It creates something that creates the very universe that we live in.
02:14:38.000Like, creates the next simulation and then...
02:14:40.000Yeah, maybe that's the birth of the universe itself, is creativity and intelligence, and that it all comes from that.
02:14:47.000I used to have this joke about the Big Bang.
02:14:50.000Like, what if the Big Bang is just a natural thing?
02:14:54.000Like, humans get so advanced that they create a Big Bang machine, and then, you know, we're so autistic and riddled with Adderall that we'd have no concept or worry of the consequences, and someone's like, I'll fucking press it.
02:15:17.000I mean, I don't know where it goes, but I do know that if you looked at the human race from afar, if you were an alien life-form completely detached from any understanding of our culture,
02:15:30.000any understanding of our biological imperatives,
02:15:34.000and you just looked at, like, what is this one dominant species doing on this planet?
02:15:49.000It does all these things that are terrible.
02:15:51.000But it also consistently and constantly creates better things, whether it's better weapons going from the catapult to the rifle to the cannonballs to the rocket ships to the hypersonic missiles to nuclear bombs.
02:16:05.000It creates better and better and better things.
02:16:10.000And it's never happy with what it has.
02:16:13.000And you add that to consumerism, which is baked into us, and this desire, this constant desire for newer, better things.
02:16:23.000Well, that fuels that innovation because that gives it the resources that it needs to consistently innovate and constantly create newer and better things.
02:16:30.000Well, if I was an alien life form, I was like, oh, what is it doing?
02:17:13.000It's going to make a better version of a thinking thing.
02:17:15.000Well, that better version of a thinking thing, it's basically now in the amoeba stage.
02:17:19.000It's in the, you know, small multicellular life form stage.
02:17:23.000Well, what if that version becomes a fucking Oppenheimer?
02:17:28.000What if that version becomes, like, if it scales up so far that it becomes so hyper-intelligent that it is completely alien to any other intelligent life form that has ever existed here before?
02:17:41.000And it constantly does the same thing, makes better and better versions of it.
02:17:48.000It goes to something like a God, and maybe God is a real thing, but maybe it's a real consequence of this process that human beings have of consistently, constantly innovating and constantly having this desire to push this envelope of creativity and of technological power.
02:18:11.000I guess it comes down to maybe a definitional disagreement about what you mean by it becomes a god.
02:19:09.000Yeah, he even believes in the resurrection, which I found very interesting.
02:19:14.000But, you know, it's interesting communicating with him, because he has these little pre-designed
02:19:22.000speeches. He's encountered all these questions so many times that he has these very well-worded, very articulate responses, which I sense are like bits. You know, like when I'm talking to a comic, and the comic's like, "I got this bit on train travel," and they just tell you the bit. That's what it's like: he has bits on why he believes in Jesus and why he believes what he believes. Very, very intelligent guy,
02:19:48.000but I proposed the question: when we're thinking about God, what if, instead of God creating the universe, the universe is God?
02:19:56.000And the creative force of all life and everything is the universe itself.
02:20:02.000Instead of thinking that there's this thing that created us.
02:20:06.000This is like close to a lot of the Eastern religions.
02:20:08.000I think this is an easier thing to wrap my mind around than any other religions for me.
02:20:14.000When I do psychedelics, I get that feeling.
02:20:17.000I get that feeling like there's this insane soup of innovation and connectivity that exists all around us.
02:20:52.000But I wonder if our limitations are that we are an advanced version of primates.
02:21:00.000We still have all these things that we talk about, jealousy, envy, anxiety, lust, anger, fear, violence, all these things that are detrimental but were important for us to survive and get to this point.
02:21:13.000And that as time goes on, we will figure out a way to engineer those out.
02:21:18.000And that as intelligent life becomes more intelligent, and we create a version of intelligent life that's far more intelligent than we are,
02:21:28.000far more capable than we are.
02:21:30.000If that keeps going, if it just keeps going, I mean, ChatGPT.
02:21:35.000Imagine if you took ChatGPT and went back to Socrates.
02:21:47.000I bet he'd be much more impressed with the phone than ChatGPT.
02:21:51.000I think he'd be impressed with the phone's ability to communicate, for sure, but then the access to information would be so profound.
02:21:59.000I mean, back then, I mean, look, you're dealing with a time when Galileo was put under house arrest because he had the gumption to say that the Earth is not the center of the universe.
02:23:16.000If it keeps going, it has to go to some impossible level of capability.
02:23:23.000I mean, just think of what we're able to do now with nuclear power and nuclear bombs and hypersonic missiles, just the insane physical things that we've been able to take out of the human creativity and imagination and, through engineering and technology, implement as physical devices that would be indistinguishable from magic if you brought them back 500 years.
02:24:10.000But maybe that's also why comets exist.
02:24:13.000Maybe it's a nice reset to just, like, leave a few around, give them a distant memory of the utopian world that used to exist, have them go through thousands of years of barbarism, of horrific behavior, and then reestablish society.
02:24:29.000I mean, this is the Younger Dryas Impact Theory that around 11,800 years ago at the end of the Ice Age, that we were hit by multiple comets that caused the instantaneous melting of the ice caps over North America.
02:24:44.000Flooded everything. It's the source of the flood myths from the Epic of Gilgamesh and the Bible and all those things.
02:24:51.000And also there's physical evidence of it when they do core samples.
02:24:55.000There's high levels of iridium, which is very common in space, very rare on Earth.
02:25:00.000There's micro diamonds that are from impacts and it's like 30% of the Earth has evidence of this.
02:25:06.000And so it's very likely that the proponents of this theory are correct, and that this is why they find these ancient structures that they're now dating to, like, 11,000, 12,000 years ago, when they thought people were hunter-gatherers.
02:25:19.000And they go, okay, maybe our timeline is really off, and maybe this is physical evidence of impacts.
02:25:24.000Yeah, I've been watching that with interest.
02:25:26.000Randall Carlson is the greatest guy to pay attention to when it comes to that.
02:25:30.000He's kind of dedicated his whole life to it.
02:25:32.000Which, by the way, happened because of a psychedelic experience.
02:25:35.000He was on acid once, and he was looking at this immense canyon, and he had this vision that it was created by the instantaneous erosion from the melting of the polar caps, and that it just washed this wave of impossible water through the earth.
02:25:52.000It just carved these paths, and now there seems to be actual physical evidence of that.
02:27:30.000I think the instinct of saying, like, we've really got to figure out how to make this safe and good and, like, widely good is really important.
02:27:42.000But I think calling for a pause is, like, naive at best.
02:27:57.000I kind of think you can't make progress on the safety part of this, as we mentioned earlier, by sitting in a room and thinking hard.
02:28:03.000You've got to see where the technology goes.
02:28:05.000You've got to have contact with reality.
02:28:09.000But we're trying to make progress towards AGI, conditioned on it being safe and conditioned on it being beneficial.
02:28:15.000And so when we hit any kind of, like, block, we try to find a technical or a policy or a social solution to overcome it. That could be about the limits of the technology, something not working, you know, it hallucinates or it's not getting smarter or whatever.
02:28:29.000Or it could be there's this, like, safety issue and we've got to, like, redirect our resources to solve that.
02:28:34.000But it's all, like, for me, it's all this same thing of, like, we're trying to solve the problems that emerge at each step as we get where we're trying to go.
02:28:44.000And, you know, maybe you can call it a pause if you want, if you pause on capabilities to work on safety.
02:28:49.000But in practice, I think the field has gotten a little bit wrapped around the axle there.
02:28:55.000And safety and capabilities are not these two separate things.
02:28:59.000This is like, I think, one of the dirty secrets of the field.
02:29:02.000It's like we have this one way to make progress.
02:29:04.000You know, we can understand and push on deep learning more.
02:29:09.000That can be used in different ways, but I think it's that same technique that's going to help us eventually solve the safety problem.
02:29:18.000All of that said, as like a human, emotionally speaking, I super understand why it's tempting to call for a pause.
02:29:33.000How much of a concern is it in terms of national security that we are the ones that come up with this first?
02:29:43.000Well, I would say that if an adversary of ours comes up with it first and uses it against us and we don't have some level of capability, that feels really bad.
02:29:55.000But I hope that what happens is this can be a moment where...
02:30:03.000To tie it back to the other conversation, we kind of come together and overcome our base impulses and say, like, let's all do this as a club together.
02:31:17.000Go all the way there and say we're just going to have one global effort.
02:31:22.000But I think at least we can get to a point where we have one global set of rules, safety standards, organization that makes sure everyone's following the rules.
02:32:32.000It's pretty extraordinary that they've managed to stop that, this threat of mutually assured destruction, self-destruction, the destruction of the world.
02:32:54.000I mean, Steven Pinker gets a lot of shit for his work because he just sort of downplays violence today.
02:33:02.000But it's not that he's downplaying violence today.
02:33:04.000He's just looking at statistical trends.
02:33:06.000If you're looking at the reality of life today versus life 100 years ago, 200 years ago, it's far, far safer. Why do you think that's a controversial thing?
02:33:16.000Like, why can't someone say, sure, we still have problems, but it's getting better?
02:33:19.000Because people don't want to say that.
02:33:23.000They're completely engrossed in this idea that there's problems today, and these problems are huge, and there's Nazis, and there's— But no one's saying there's not huge problems today.
02:34:09.000I mean, it's all the problems that we highlighted earlier.
02:34:12.000And the solution to overcoming that might be through technology.
02:34:18.000And that might be the only way we can do it without a long period of evolution.
02:34:23.000Because biological evolution is so relatively slow in comparison to technological evolution.
02:34:30.000And that might be our bottleneck.
02:34:33.000We just still are dealing with this primate body.
02:34:37.000And that something like artificial general intelligence, or some implemented form of engaging with it, whether it's Neuralink or something, shifts the way the mind interfaces with other minds.
02:34:55.000Isn't it wild that, speaking of biological evolution, there will be people, I think, who were alive for the
02:35:02.000invention, or discovery, whatever you want to call it, of the transistor who will also be alive for the creation of AGI? One human lifetime.
02:35:11.000From Orville and Wilbur Wright flying the first plane, it was less than 50 years before someone dropped an atomic bomb out of one.
02:35:56.000We've been talking a lot about problems in the world, and I think that's just always a nice reminder of how much we get to improve, and we're going to get to improve a lot, and I think this will be the most powerful tool
02:36:10.000we have yet created to help us go do that.