The Joe Rogan Experience - May 09, 2019


Joe Rogan Experience #1294 - Jamie Metzl


Episode Stats

Length

2 hours and 28 minutes

Words per Minute

176.1

Word Count

26,234

Sentence Count

1,963

Misogynist Sentences

13

Hate Speech Sentences

50


Summary

In this episode, Joe talks to Jamie Metzl, a self-declared cacao shaman, a plant-medicine enthusiast, and an author and futurist whose specialty is the genetics revolution. Joe and Jamie discuss what it means to live in an entirely created world, how each generation resets its baseline for what counts as "natural," and how to balance excitement and fear about technologies for manipulating life. They cover genetically modified foods, gene therapies, embryo selection, the US-China technology rivalry, and whether regulation can keep pace with the science.


Transcript

00:00:03.000 Boom.
00:00:03.000 And we're live.
00:00:04.000 Hello, sir.
00:00:05.000 How are you?
00:00:05.000 I'm great, Joe.
00:00:06.000 Nice to see you.
00:00:07.000 Thanks.
00:00:07.000 You were eating chocolate when you got here, and you told me that you are a cacao shaman.
00:00:12.000 And I said those are strong words.
00:00:13.000 They are.
00:00:14.000 What does that mean?
00:00:15.000 So I was in Berlin last year giving a talk at a tech conference, and somebody invited me to a sacred cacao ceremony.
00:00:21.000 Never heard of it.
00:00:22.000 I thought, wow, that sounds awesome.
00:00:23.000 I love chocolate.
00:00:24.000 I went.
00:00:25.000 It was so wonderful.
00:00:27.000 And at the end, they were talking about these people, these great cacao shamans.
00:00:31.000 And I thought, what is that?
00:00:32.000 I got to be one of those.
00:00:33.000 And so I came back.
00:00:34.000 I looked for certification.
00:00:35.000 There wasn't certification.
00:00:37.000 I self-declared.
00:00:38.000 And then I started doing cacao ceremonies in New York.
00:00:41.000 And I have hundreds of people who come.
00:00:43.000 It's really wonderful and it's exciting.
00:00:45.000 Like if you want to be a doctor, you got to go to medical school, right?
00:00:48.000 You want to be a comedian.
00:00:49.000 You got to become a professional.
00:00:50.000 You got to put in your time.
00:00:51.000 Cacao shaman, just show up.
00:00:53.000 It's like putting up a shingle.
00:00:54.000 Hey, I'm a cacao shaman.
00:00:55.000 If anybody shows up and they have a good time, then you're real.
00:00:58.000 Now, how much do you need to know about cacao?
00:01:00.000 Like the nutritional properties of it?
00:01:01.000 Cacao is amazing.
00:01:03.000 It's great stuff.
00:01:04.000 It's incredible.
00:01:05.000 So, definitely cacao.
00:01:07.000 People have been using it ceremonially for about 5,000 years.
00:01:10.000 So, it's incredible.
00:01:12.000 Chocolate makes people happy.
00:01:14.000 It helps your brain function, your circulation.
00:01:17.000 There's all these kinds of incredible things.
00:01:18.000 But in the ceremonies that I do, I have two key messages.
00:01:22.000 One is, you are the drug.
00:01:24.000 I mean, we all take...
00:01:25.000 People take drugs.
00:01:26.000 People take...
00:01:32.000 [inaudible]
00:01:43.000 I believe that there are no – I say this in my ceremonies – there's no such thing as sacred cacao or sacred plants or sacred mountains or sacred people if we don't treat life with sacredness.
00:01:52.000 But if we recognize that everything is sacred, then we infuse life with sacredness and meaning.
00:01:58.000 Anyway, that's why I do it.
00:01:59.000 It's a lot of fun.
00:02:00.000 That's very interesting from a guy who specializes essentially in manipulating life.
00:02:05.000 Well, you know, we have manipulated life as humans for a very, very long time.
00:02:11.000 But it's interesting, you know, the idea of things being sacred, but your specialty is...
00:02:17.000 Manipulating genetics, right?
00:02:18.000 Yeah, well, so that is this strange moment that we're in, because for about 3.8 billion years, our species has evolved by this set of principles we call Darwinian evolution, random mutation and natural selection.
00:02:32.000 It's brought us here.
00:02:33.000 We used to be single-cell organisms, and now look at us.
00:02:36.000 And there's been a lot of magic in that process, and there still is, but we humans are deciphering some of that magic.
00:02:43.000 We are looking under the hood of what it means to be human, and we are increasingly going to have the ability to manipulate all of life, including our own.
00:02:51.000 Yeah, that is very unnerving to a lot of people.
00:02:55.000 It's uncomfortable and scary.
00:02:58.000 Yeah, it is.
00:02:59.000 They like things the way they are.
00:03:01.000 Jamie, I'd like to stay the way I am.
00:03:03.000 We always think that.
00:03:04.000 Why do we always think that?
00:03:05.000 Because there's a built-in conservatism in our brains.
00:03:10.000 And yet, we live these lives that are entirely dependent on these radical changes that our ancestors have created.
00:03:18.000 I mean, we didn't find this building or agriculture or medicine in nature.
00:03:23.000 We built all those things.
00:03:25.000 Then everybody gets a new baseline when you're born.
00:03:27.000 And you think, well, I want organic corn.
00:03:32.000 I want whatever.
00:03:34.000 But all these things are creation.
00:03:36.000 We live in an entirely created world, and our ability to manipulate and change that world is always growing.
00:03:43.000 And I think we need to recognize that, but being afraid is okay, and being excited is okay, and we need to find the right balance between those two emotions.
00:03:51.000 I think for a lot of people, they feel like so many changes happened, particularly when you're talking about genetically modified foods.
00:03:56.000 So many things happened before they realized they had happened.
00:04:00.000 So when they're like, hey man, I don't want to eat any GMO fruit.
00:04:03.000 Well, then you probably shouldn't eat any fruit.
00:04:06.000 Because everything that you buy has been changed.
00:04:10.000 Every orange that you buy, that's not what an orange used to be like.
00:04:13.000 You could buy an apple.
00:04:14.000 Apples didn't used to be like that.
00:04:16.000 Tomatoes didn't used to be like that.
00:04:17.000 No, I know.
00:04:18.000 We reset our baseline just from when we were kids.
00:04:21.000 So if you went back 12,000 years ago to the end of the last Ice Age and you said, all right, find me all these things that we buy at Whole Foods.
00:04:29.000 Most of them didn't exist.
00:04:30.000 We've created them.
00:04:31.000 Sure.
00:04:31.000 And then in the 1970s, we had the ability to do what's recombinant DNA, what people call genetic modification.
00:04:38.000 And people are afraid because it's, well, that feels unnatural.
00:04:41.000 We're applying science to food.
00:04:45.000 And, you know, that's the issue.
00:04:47.000 And now we're entering the era of genetically modified humans, and there's that same level of uncomfortableness.
00:04:53.000 But what happened, the reason why I've written this book, Hacking Darwin, is that if we approach genetically modified humans in the same way we approach genetically modified foods, which is the scientists say, hey, we've got this, we're going to manage them responsibly, and it just kind of happens to people.
00:05:10.000 People are going to go nuts.
00:05:12.000 I mean, imagine how agitated people are about GMO foods.
00:05:15.000 If they don't have a say in how the human experience of genetic modification plays out, people are going to go berserk.
00:05:23.000 So we have this window of time where we can start bringing everybody into an inclusive conversation about where we're going because where we are going is just radically different from where we've been.
00:05:34.000 Yeah, I think it's an awareness issue and I also think it's a perception issue.
00:05:39.000 I think that everything people do is natural.
00:05:42.000 Including cities.
00:05:43.000 I think cities are natural.
00:05:44.000 That's why they're all over the world.
00:05:45.000 I think they're as natural as beehives.
00:05:47.000 And I think as much as we like to think that technology is not natural, it's clearly something people naturally make.
00:05:53.000 Of course.
00:05:54.000 They make it in every culture, if they can.
00:05:56.000 It's the history of our species.
00:05:57.000 And we kind of misuse this word naturally.
00:05:59.000 Natural.
00:06:00.000 Because what is natural?
00:06:02.000 I mean, maybe natural was when we used to live and we were just part of nature.
00:06:07.000 I always say, it's like, people say, oh, I love nature.
00:06:10.000 I love going out and hiking in the woods.
00:06:12.000 The reason you love hiking in the woods is that we've murdered all the predators.
00:06:15.000 It's like in the old days, you stay in your cave, you're not going out and hiking in the woods.
00:06:20.000 There's stuff that's going to kill you out there.
00:06:22.000 I know.
00:06:22.000 That was a massive luxury to go wander through the forest with no weapons.
00:06:26.000 Yeah.
00:06:26.000 Nobody did that.
00:06:27.000 Exactly.
00:06:28.000 No, exactly.
00:06:28.000 But people say, oh, I want nature.
00:06:29.000 I want my natural corn.
00:06:31.000 I want my natural chihuahua, even though 25,000 years ago, there's no chihuahuas.
00:06:38.000 There's wolves.
00:06:39.000 And look what we've done to them.
00:06:40.000 I know.
00:06:40.000 Well, look what we have done to them.
00:06:42.000 Made them pugs.
00:06:43.000 If you had natural wheat, or natural corn in particular, natural corn used to be a tiny little thing.
00:06:49.000 Yeah, it's just a few weeds.
00:06:50.000 Yeah, weird, gross little...
00:06:52.000 Yeah.
00:06:53.000 Grain.
00:06:53.000 Yep.
00:06:54.000 And then now we made it this big, juicy, sweet, delicious thing that you put butter on.
00:06:58.000 Which is great.
00:07:00.000 That's great.
00:07:01.000 As long as it doesn't have glyphosate on it.
00:07:03.000 We can't fetishize that there's some kind of imaginary world where kind of everybody was wearing Birkenstocks and eating in Whole Foods.
00:07:13.000 That imaginary world sucked for us.
00:07:15.000 That's why we left it.
00:07:16.000 It's true.
00:07:17.000 But there's some sort of a balance, right?
00:07:18.000 Yeah.
00:07:19.000 We do appreciate the nature aspect of our world and eagles and salmon and all these wild things and to be able to see them is very cool.
00:07:28.000 Yeah.
00:07:28.000 But yeah, you don't want to get eaten by those things.
00:07:30.000 You don't want them everywhere.
00:07:32.000 You want to be able to go out and get your newspaper without being worried about getting attacked by a jaguar.
00:07:36.000 It helps.
00:07:37.000 Yeah.
00:07:40.000 When you think about the future, at least me, let me tell you my concern.
00:07:45.000 Yeah.
00:07:46.000 I'm worried that rich people are going to get a hold of this technology quick and they're going to have massive unfair advantages in terms of intellect, in terms of physical athletic ability.
00:07:57.000 I mean, we really can have a grossly imbalanced world radically quickly if this happens fast where we don't understand exactly what the consequences of these actions are until it's too late and then we try to play catch up with rules and regulations and laws.
00:08:14.000 Yeah, that's a very, very real danger, and that's why I've written this book.
00:08:18.000 That's why I'm out speaking every day about this topic, because we need to recognize that if we approach these revolutionary technologies using the same values that we experience today, where we're here and very comfortable, but just down the road,
00:08:33.000 there are people who are living without many opportunities.
00:08:36.000 There are people in parts of the world, like Central African Republic, where there's just a war zone, kids are born malnourished.
00:08:42.000 If those are our values today, we can expect that when these future technologies arrive, we'll use those same values.
00:08:51.000 So it's real, and right now we have an opportunity to say, all right, these technologies are coming.
00:08:57.000 Whatever we do, these technologies are coming.
00:08:59.000 There's a better possible future and a worse possible future.
00:09:02.000 And how can we infuse our best values into the process to optimize the good stuff and minimize the bad stuff?
00:09:08.000 And certainly what you're saying is a real risk.
00:09:10.000 Think of what happened when European countries had slightly better weapons and slightly better ships than everybody else.
00:09:17.000 They took over the world and dominated everybody.
00:09:19.000 And so, yeah, it's very real.
00:09:23.000 Governments need to play a role in ensuring broad access and regulating these technologies to make sure we don't get to that kind of dystopian scenario that you've laid out.
00:09:33.000 Well, it's also in terms of governments regulating things.
00:09:36.000 Like, why are they qualified?
00:09:38.000 Who are they?
00:09:39.000 Who are the governments?
00:09:40.000 They're just people, right?
00:09:41.000 They're people that are either elected or some sort of a monarchy.
00:09:46.000 You're dealing with either kings and queens and sheiks, or you're dealing with presidents.
00:09:52.000 And we've seen in this country that sometimes our presidents don't know what the fuck they're talking about, right?
00:09:57.000 So who are they to disrupt science, to disrupt this natural flow of technology?
00:10:03.000 Well, we need somebody to do it.
00:10:06.000 We need some representation of our collective will just to avoid some of the things like you just mentioned.
00:10:13.000 That's the reason why humans banded together and created governments.
00:10:19.000 And the reason for democracy, especially if you have more functioning democracies, is that your government in some ways reflects the will of the people.
00:10:28.000 And the government does things...
00:10:30.000 That individuals can't do.
00:10:31.000 And I know there are libertarian arguments where everyone should just, like, if you want a little road in front of your house, either go build the road or pay somebody.
00:10:38.000 But there are a lot of things, even in that model, that won't get done.
00:10:42.000 There are a lot of kind of big national, even global concerns that you need some kind of [inaudible]
00:11:03.000 Every person needs to really understand these revolutionary technologies like genetics, like AI, and all of our responsibilities and opportunities to say, hey, this is really important.
00:11:14.000 Here are the values that I think that I cherish.
00:11:18.000 And just like you said, I don't want it to be that the wealthiest people are the ones who have kids with higher IQs and live longer and healthier than everybody else.
00:11:27.000 So we have to raise our voice.
00:11:29.000 And there needs to be a bottom-up process and a top-down process, and it's really hard.
00:11:36.000 Many people have a concern that someone else is going to do what we're not willing to do first.
00:11:42.000 Yes.
00:11:42.000 Right?
00:11:42.000 They're worried.
00:11:43.000 China and Russia.
00:11:44.000 Those are the big ones.
00:11:44.000 China and Russia.
00:11:45.000 Especially China.
00:11:46.000 Yeah.
00:11:46.000 They're very technologically advanced.
00:11:48.000 And their innovation's off the chain.
00:11:50.000 When it comes to...
00:11:51.000 Like Huawei just recently surpassed Apple as the number two cell phone manufacturer in the world.
00:11:58.000 Five years ago, they had a single-digit share of the marketplace.
00:12:03.000 Yeah.
00:12:03.000 Now they're number two on the planet Earth.
00:12:05.000 I mean, they hustle.
00:12:07.000 And if they just decide to make super people, and they do it before we do, that's what people are worried about, right?
00:12:14.000 They're worried about – there's trivial things, seemingly trivial, like athletics, and then there's things that are really – like, what's to stop people from just becoming the Hulk?
00:12:25.000 What's to stop people from becoming immortal?
00:12:28.000 What's to stop – what is – Well, two questions.
00:12:32.000 First is China.
00:12:34.000 I think it's a really big issue.
00:12:35.000 The story of the 21st century, one of the biggest stories of the 21st century will be how the US-China rivalry plays out and the playing field will be with these revolutionary technologies.
00:12:47.000 And China has a national plan to lead the world in these technologies by 2050. They're putting huge resources.
00:12:53.000 They have really smart people, and they are really focused.
00:12:57.000 And it's a big deal.
00:12:58.000 In genetic technologies, last year my book Hacking Darwin was already in production in November when it was announced that these first genetically engineered babies had been born in China.
00:13:09.000 And so I called the publisher and said, we need to pull this back out of production because I need to reference this.
00:13:13.000 But it didn't require much of a change.
00:13:15.000 I'd already written...
00:13:16.000 This is happening.
00:13:17.000 We're going to see the world's first gene-edited humans.
00:13:19.000 It's going to happen first in China.
00:13:21.000 And here's why.
00:13:22.000 So I had to add a few sentences saying, and it just happened in October of 2018. So China is on that path.
00:13:31.000 And we need to recognize that on one hand, the United States needs to be competitive.
00:13:37.000 On the other hand, we don't want a runaway arms race of the human race, and that's why we need to find this balance between national ambition and some kind of global rules.
00:13:48.000 That's really hard to do.
00:13:50.000 Yeah, and the other thing is that we're competing with them, and so if they decide to do it first, we're almost compelled to do it second, or compelled to try to keep up.
00:14:00.000 How far away do you think we are from physically manipulating living human beings, versus fetuses, versus something in the womb?
00:14:10.000 So physically manipulating living human beings, we're there.
00:14:13.000 We're there.
00:14:25.000 When you're younger, your body is better able to fight cancers.
00:14:29.000 What you can do with someone with a cancer, you take their cells, you manipulate their cells to give them cancer-fighting superpowers, and you put them back into the person's body.
00:14:39.000 And now the person's body behaves like you're a younger person.
00:14:42.000 You have the ability to fight back.
00:14:43.000 So gene therapies are already happening.
00:14:45.000 A relatively small number of them have already been approved, but there is a list of thousands of them with regulators and applications to regulators around the world.
00:14:55.000 So the era of making genetic changes to living humans, that's already here.
00:15:01.000 What can they do with it so far?
00:15:03.000 So far, most of it is focused on treating diseases.
00:15:08.000 But a lot more is coming.
00:15:11.000 Because when people think about the human genome, our genome isn't a disease genome.
00:15:16.000 It's not a healthcare genome.
00:15:17.000 It's a human genome.
00:15:18.000 And so we are going to be able to do things that feel like crazy things, like changing people's eye color, changing people's skin color to funky things.
00:15:27.000 I mean, there's a lot of stuff that we're not doing now that we will be able to do.
00:15:31.000 How far away do you think we are from something like that?
00:15:33.000 10 years?
00:15:34.000 So in 10 years, we're going to have green people.
00:15:37.000 If someone so chooses.
00:15:39.000 Yeah, if someone so chooses.
00:15:40.000 What if it sucks?
00:15:40.000 Will they be able to go back to normal color?
00:15:42.000 Well, if it's...
00:15:43.000 That's a good question.
00:15:44.000 If it's with this kind of gene therapy, and it's a small number of genes...
00:15:50.000 Probably, but we are messing with very complex systems that we don't fully understand.
00:15:54.000 So that's why there's a lot of unknowns.
00:15:55.000 And coming back to your point on regulation, that's why I don't think we want a total free-for-all where people say, hey, I'm going to edit my own genes.
00:16:02.000 Yeah, and you don't want some backyard hustler.
00:16:06.000 Yeah, it's true.
00:16:07.000 Lab.
00:16:07.000 It's true.
00:16:08.000 That comes back to what you were saying about the Hulk.
00:16:09.000 I mean, I just think that there are all kinds of...
00:16:11.000 We're humans.
00:16:12.000 We're diverse.
00:16:13.000 Any kind of thing that you can think of, there is a range.
00:16:16.000 And there's crazy on the left and crazy on the right and crazy on the top.
00:16:20.000 So people are going to want to do things.
00:16:21.000 And the question is, for any society, what do we think is okay and what do we think is not okay?
00:16:28.000 And maybe there should be some...
00:16:30.000 I believe there should be some limit to how far people can go with experimenting with...
00:16:37.000 Possibly, likely on themselves, but certainly on their future children.
00:16:40.000 Certainly on their future children, yeah.
00:16:42.000 But once you're 18, I think, do whatever the fuck you want.
00:16:45.000 Well, maybe 25. 25. We're going to have a lot of 25-year-olds with gills.
00:16:51.000 We probably will.
00:16:53.000 It's like the tattoo.
00:16:54.000 Seemed like a good idea.
00:16:55.000 Yeah, well we probably will.
00:16:58.000 So you think we're probably like 50 years away from that being a reality?
00:17:02.000 So I think that the genetic revolution has already begun.
00:17:07.000 And it's going to fundamentally change our lives in three big areas.
00:17:11.000 The first is our healthcare.
00:17:12.000 So we're moving from a system of generalized healthcare based on population averages.
00:17:16.000 So when you go to your doctor, you're treated.
00:17:18.000 Because you're a human, just based on average.
00:17:21.000 And we're moving to a world of personalized medicine, and the foundation of your personalized healthcare will be your sequence genome and your electronic health records.
00:17:28.000 That's how they know you are you.
00:17:30.000 And that's how they can say, this is a drug, this is an intervention that'll work for you.
00:17:34.000 When we do that, then we're going to have to sequence everybody.
00:17:38.000 So we're going to have about 2 billion people have had their whole genome sequenced within a decade.
00:17:44.000 And then we're going to be able to compare what the genes say to how those genes are expressed.
00:17:49.000 And then humans become a big data set.
00:17:52.000 And that's going to move us from precision to predictive healthcare where you're going to be just born and you're going to have all this information.
00:17:58.000 Your parents have all this information about how certain really important aspects of your life are going to play out.
00:18:02.000 And some of that is going to be disease-related.
00:18:05.000 But some of that's just going to be life-related, like you have a better-than-average chance of being really great at math or having a high IQ or low IQ or being a great sprinter.
00:18:14.000 And how do we think about that?
00:18:15.000 And then, again, a revolution that's already happening.
00:18:18.000 We're just going to change the way we make babies.
00:18:20.000 We're going to get away from sex as the primary mechanism for conceiving our kids.
00:18:25.000 We'll still have sex for all the great reasons we do.
00:18:28.000 And that's going to open up a whole new world.
00:18:37.000 I think we're going to move to an era where people who make babies through sex [inaudible]
00:19:11.000 Using in vitro fertilization and embryo screening, that 3% can be brought down significantly.
00:19:16.000 And what happens if you see somebody 20 years from now who has a kid with one of those preventable diseases?
00:19:22.000 Do you think that's fate?
00:19:23.000 Or do you think, well, wait a second, those parents, they made an ideological decision about how they wanted to conceive their kids.
00:19:31.000 So I think we're moving towards some really deep and fundamental changes.
00:19:34.000 Hmm.
00:19:36.000 Well, yeah.
00:19:37.000 That's an interesting conversation of whether or not—I wonder if we're ever going to get to a point where people don't allow people, sort of like people don't allow people to not get vaccinated.
00:19:50.000 Right.
00:19:51.000 There's a lot of that going on today.
00:19:54.000 Right, right.
00:19:55.000 Which is great, right?
00:19:56.000 You don't want diseases floating around.
00:19:58.000 But what if that gets to the place where we do that with people?
00:20:03.000 With people creating new life forms?
00:20:05.000 What if you say, hey, you are being irresponsible, you're just having sex and having a kid.
00:20:09.000 I know that's how your grandma did it.
00:20:11.000 We don't do it that way in 2099. Yeah.
00:20:13.000 I think it's going to be hard to do that in a society like the United States, but in a country like North Korea?
00:20:19.000 They'll be able to do that.
00:20:20.000 Or if they said, look, you can make babies however you want, but if you make babies the old-fashioned way, and if your kid has some kind of genetic disorder that was preventable, we're just not going to cover it with insurance.
00:20:35.000 So you're going to have a $10 million lifetime bill.
00:20:40.000 You don't need to require something.
00:20:43.000 You can create an environment where people's behaviors will change.
00:20:46.000 And then there will be...
00:20:49.000 I mean, right now, somebody sees some little kid riding around their bicycle without a helmet.
00:20:54.000 They're kind of looking at the parents like, hey, what are you doing?
00:20:56.000 How come you don't have a helmet on your kids?
00:20:58.000 And I just think that we're moving toward this kind of societal change where people will, I believe, see conceiving their kids in the lab as a safer alternative.
00:21:10.000 And it's not just safety because once you do that, then that opens you up To the possibility of all other kinds of applications of technology, not just to eliminate risks or prevent disease, but you have a lot more information.
00:21:27.000 So already it's possible to roughly rank order 15 pre-implantation embryos, tallest to shortest, and, within a decade, from highest genetic component of IQ to lowest genetic component of IQ. This stuff is very real and it's very personal.
00:21:42.000 What do you think will be the first thing that people start manipulating?
00:21:44.000 I think certainly health.
00:21:47.000 Health will be the primary driver because that's every parent's biggest fear.
00:21:52.000 And that is what is going to be kind of the entry application.
00:21:55.000 People wanting to make sure that their kids don't suffer from terrible genetic diseases.
00:22:01.000 And then I think the second will probably be longevity.
00:22:05.000 I mean, right now there's a lot of work going on sequencing people, the super-agers, people who live to their late 90s, people to 100, to identify what are the genetic patterns that these people have.
00:22:17.000 So it's like to live to 90, you have to do all the things that you advocate, healthy living and whatever.
00:22:21.000 But to live to 100, you really need the genetics to make that possible.
00:22:25.000 So we're going to identify what are some of the genetic patterns that allow you to live those kinds of long lives.
00:22:31.000 But then after that, then it's wide open.
00:22:34.000 I mean, it's higher genetic component of IQ, outgoing personality, faster sprinter.
00:22:40.000 I mean, we are humans.
00:22:41.000 We are primarily genetic beings, and we are going to be able to look under the hood of what it means to be human, and we'll have these incredible choices.
00:22:51.000 And it's a huge responsibility.
00:22:53.000 How long do you think before you have a person with four arms?
00:22:56.000 I think it's going to take a long time.
00:22:58.000 A couple hundred years?
00:22:59.000 Well, the thing is, here's how I see it.
00:23:01.000 So the real driver, there's two primary drivers.
00:23:04.000 One will be embryo selection.
00:23:07.000 So right now, average woman going through IVF has about 15 eggs extracted.
00:23:13.000 And then in IVF, in vitro fertilization, those eggs are fertilized using the male sperm.
00:23:20.000 And in average male ejaculation, there's about a billion sperm cells.
00:23:24.000 So men are just giving it away.
00:23:25.000 Women, human, female mammals are a little bit stingy.
00:23:30.000 But then the next killer application is using a process called induced pluripotent stem cells.
00:23:36.000 And so Shinya Yamanaka, this great Japanese scientist, won the 2012 Nobel Prize for developing a way to turn any adult cell into a stem cell.
00:23:44.000 So a stem cell is a kind of cell, it can be anything.
00:23:47.000 And so you take, let's say, a skin graft.
00:23:50.000 That has millions of cells.
00:23:52.000 You induce those adult skin cells into stem cells.
00:23:57.000 So you use these four things called Yamanaka factors.
00:23:59.000 And so now you have, let's call it 100,000 stem cells.
00:24:04.000 And then you can induce those cells into egg precursor cells, and then eggs.
00:24:09.000 So all of a sudden, humans are creating eggs like salmon on this huge scale.
00:24:13.000 So you have 100,000 eggs, fertilize them with the male sperm, and in a machine, an automated process, you grow them for about five days.
00:24:23.000 And then you sequence cells extracted from each one of those.
00:24:27.000 And the cost of genome sequencing in 2003 was a billion dollars.
00:24:31.000 Now it's $800.
00:24:32.000 It's going to be next to nothing within a decade.
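The cost figures quoted here imply a remarkably steady halving rate. A quick sketch of the arithmetic, using only the $1 billion (2003) and $800 figures from the conversation and assuming "now" means 2019, the air date:

```python
import math

# Figures quoted in the conversation; treating "now" as 2019 is an assumption.
cost_2003 = 1e9        # ~$1 billion per genome in 2003
cost_2019 = 800.0      # ~$800 per genome "now"
years = 2019 - 2003

halvings = math.log2(cost_2003 / cost_2019)   # total cost halvings over the span
months_per_halving = years * 12 / halvings

print(f"{halvings:.1f} halvings, one every {months_per_halving:.1f} months")
# prints: 20.3 halvings, one every 9.5 months
```

A halving roughly every nine to ten months is faster than Moore's law, which is why "next to nothing within a decade" is plausible on these numbers.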
00:24:36.000 And then you have real options, because then you get this whole...
00:24:39.000 A spreadsheet, an algorithm.
00:24:40.000 And then you go to the parents and say, well, what are your priorities?
00:24:43.000 And maybe they'll say, well, I want health, I want longevity, I want high IQ. When you're choosing from big numbers like that, you have some real options.
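The selection pipeline described here, create many candidate embryos, sequence each one, then rank them against parental priorities, can be sketched as a toy model. Everything below (trait names, score distributions, weights) is invented purely for illustration; real polygenic scoring is vastly noisier and more complicated than an additive sum:

```python
import random

# Hypothetical trait scores; these names and distributions are illustrative only.
TRAITS = ["disease_risk", "longevity", "iq_component"]

def simulate_embryos(n, seed=0):
    """Generate n candidate embryos with a standard-normal score per trait."""
    rng = random.Random(seed)
    return [{t: rng.gauss(0, 1) for t in TRAITS} for _ in range(n)]

def rank_embryos(embryos, priorities):
    """Rank embryos by a weighted sum of trait scores, best first.

    Lower disease risk is better, so that trait is negated before weighting.
    """
    def combined(e):
        return sum(priorities[t] * (-e[t] if t == "disease_risk" else e[t])
                   for t in TRAITS)
    return sorted(embryos, key=combined, reverse=True)

embryos = simulate_embryos(15)  # the ~15 eggs of a typical IVF cycle
priorities = {"disease_risk": 2.0, "longevity": 1.0, "iq_component": 1.0}
ranked = rank_embryos(embryos, priorities)
```

The point of the sketch is the scale argument in the conversation: the same ranking step applied to 100,000 candidates instead of 15 gives the parents far more extreme "real options" even with an unchanged, crude scoring model.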
00:24:51.000 And then on top of that, then there is this precision gene editing, the stuff that happened in China last year.
00:24:57.000 I think it will be...
00:24:58.000 And coming back to your question about four arms, I think it's going to be very...
00:25:02.000 People have this idea that tools like CRISPR are going to be used.
00:25:06.000 Someone's going to sit at a computer and say, like, four arms and three heads.
00:25:29.000 Yeah, and then you would get the best of that and then work on those.
00:25:33.000 Exactly.
00:25:34.000 That's exactly the model.
00:25:35.000 You get that, and then you say, alright.
00:25:37.000 What if you had like 20 sons that were awesome, and they didn't tell you about 18 of them, and you kept two of them, and then 18 of them shipped off to some military industrial complex, turned them into assassins?
00:26:13.000 China is showing how the internet can actually be really actively used to suppress people.
00:26:18.000 Facebook is taking people's information and Google in a way that's frightening a lot of people.
00:26:23.000 It shouldn't be that these companies can do whatever they want.
00:26:26.000 We have to have some way of establishing limits because not every individual is able to entirely protect themselves.
00:26:33.000 They don't have the power.
00:26:34.000 They don't have all the information.
00:26:35.000 We need some representatives helping us with that.
00:26:38.000 The real concern is the competition, right?
00:26:40.000 The real concern is whether or not we do something in regards to regulation that somehow or another stifles competition on our end and doesn't allow us to compete with Russia and China, particularly China.
00:26:50.000 Yeah, that's exactly right.
00:26:51.000 And so what we need to do is to find that balance.
00:26:55.000 And one of the big issues for this is privacy.
00:26:58.000 So if you look around the world, among the big countries and groupings of countries, there are three models of privacy.
00:27:05.000 There's Europe, which has the strongest privacy protections for all kinds of data, including genetic data.
00:27:11.000 There's China, that has the weakest.
00:27:14.000 And there's the United States that has the middle.
00:27:16.000 And the paradox is from an individual perspective, we all are thinking, well, we kind of want to be like Europe because I don't want somebody accessing my personal information, especially my genetic information.
00:27:27.000 This is like my most intimate information.
00:27:30.000 But...
00:27:31.000 Genetics is the ultimate big data problem.
00:27:34.000 And so you need these big data pools, and you need access to these big data pools, in order to unlock the secrets of genetics.
00:27:42.000 So these three different groupings, everyone's making a huge bet on the future, and we're going to find out who wins.
00:27:48.000 Like right now in the IT world, we have Amazon and Apple and Google and those big companies.
00:27:54.000 But whoever gets this bet right, they will be the ones who will be leading the way and making a huge amount of money on these technologies.
00:28:03.000 What we're talking about is a trillion, multi-trillion dollar industry.
00:28:07.000 How do you think this is going to affect things like competitive athletics?
00:28:10.000 Hugely.
00:28:11.000 So right now, we have this problem.
00:28:15.000 Someone like Lance Armstrong, who is manipulating his body.
00:28:19.000 And what he's basically doing is adding more red blood cells so that he can carry more oxygen.
00:28:25.000 And people feel that that's cheating.
00:28:27.000 It's a different topic, but probably everybody in the Tour de France was doing exactly that when he won.
00:28:33.000 But what if, and this will be the case, we're able to sequence people? Let's say nobody's doing drugs, and we sequence all these athletes.
00:28:42.000 Some of them will just have a natural genetic advantage.
00:28:46.000 Their bodies will naturally be doing what Lance Armstrong had manipulated his body to do.
00:28:51.000 You know that's happening with a sprinter right now?
00:28:53.000 Yeah.
00:28:54.000 That female sprinter that has high levels of testosterone?
00:28:57.000 Yeah.
00:28:57.000 And I feel really sorry for her, but we have categories.
00:29:02.000 I mean, with your world in mixed martial arts, I mean, I think I remember in the past there was some person who was kind of a borderline on a...
00:29:12.000 between genders and was just kicking the shit out of all of these women in cage fighting.
00:29:17.000 It's like we have these categories of man and woman.
00:29:19.000 We know that the gender identities are fluid, but how do we think about it when these genetic differences confer advantages?
00:29:28.000 So if your body is primed to do something.
00:29:33.000 Maybe you could have like a Plato's Republic world where everybody fulfills a function that you are genetically optimized to do.
00:29:41.000 And then you could imagine that being a very competitive kind of environment.
00:29:45.000 But what do you do for now in something like the Olympics?
00:29:48.000 If somebody has this huge genetic advantage, should we let somebody else manipulate their bodies?
00:29:53.000 There's this thing called gene doping.
00:29:55.000 In order to change the expression of genes, so that your body acts as if you were as naturally genetically enhanced as somebody else. It's complicated.
00:30:03.000 Are they capable of doing certain physical enhancements through gene doping right now?
00:30:09.000 Yeah.
00:30:10.000 Like, what can they do right now?
00:30:11.000 No, no, so the way it works is your genes instruct your cells to make proteins.
00:30:17.000 That's how the whole system works.
00:30:20.000 So you can change genes or you can trigger the expression of proteins.
00:30:26.000 So you can get people's bodies to behave as if they had these genetic optimizations.
00:30:33.000 Yeah.
00:30:34.000 And so that's why now the World Anti-Doping Agency, I mean, they are now starting to look at gene doping.
00:30:41.000 And this is the first time that that's even being considered as a category.
00:30:45.000 And then there are… Are there people that have done that successfully?
00:30:49.000 You know, I don't know the answer to that.
00:30:51.000 I know that WADA is looking for it, which makes me assume that it must have been done, but I haven't seen it.
00:30:58.000 I've looked for it.
00:30:58.000 I haven't seen any reports.
00:30:59.000 If China starts winning everything.
00:31:01.000 Well, China is.
00:31:01.000 So I wrote one of my sci-fi novels, Genesis Code, about this.
00:31:06.000 So China, as you know, has their system of their Olympic sports schools.
00:31:10.000 And the way it works is they test kids all around the country.
00:31:14.000 So let's just say it's diving.
00:31:17.000 And they identify what are the core skills of a diver?
00:31:20.000 What do you need?
00:31:21.000 And then they go around the country and they test kids, and then they bring a bunch of them to their Olympic sports schools.
00:31:26.000 And then they get them all involved, and some kids are the best of those kids, and then the best of those, and you end up with these champions.
00:31:33.000 That's why China advanced so rapidly.
00:31:36.000 But what happens if they're doing that, but it's at the genetic level?
00:31:40.000 And there are countries like Kazakhstan that are already announcing that they are going to be screening all of their athletes.
00:31:46.000 The science isn't there yet, so it's impossible right now to say, well, I'm going to do a genome sequence of somebody, and I know this person has the potential to be an Olympic sprinter.
00:31:57.000 But 10 years from now, that's not going to be the case.
00:31:59.000 Wow.
00:32:01.000 Yeah, it's sort of going to throw a monkey wrench in the whole idea of what is fair when it comes to athletics.
00:32:06.000 Yeah, what is fair?
00:32:07.000 What is human?
00:32:08.000 Right.
00:32:08.000 What is human?
00:32:09.000 Yeah.
00:32:10.000 I mean, look, it's not like people don't already alter their bodies by training, by diet, exercise, all sorts of different recovery modalities, cryotherapy, sauna, high elevation training,
00:32:26.000 all these different things that they do that manipulates the human body.
00:32:31.000 But it would be kind of crazy if you had sports but you couldn't practice and you couldn't work out.
00:32:37.000 Like, we want to find out what a person's really like.
00:32:39.000 No practice, no working out.
00:32:41.000 And that's the thing, is like, we are moving, it comes back to what we were saying before about nature.
00:32:46.000 It's like we have this feeling, nature somehow feels comfortable to us.
00:32:51.000 That's what we're used to.
00:32:52.000 All this stuff that you're talking about, nobody was doing that 10,000 years ago.
00:32:56.000 It's like, hey, I'm running after a buffalo.
00:32:58.000 Yeah.
00:32:59.000 And so as these boundaries change, as the realm of possibility changes, then we're going to be faced with all of these questions.
00:33:09.000 Even now, look at a sport like competitive weightlifting.
00:33:12.000 They have the real, open competitive bodybuilding.
00:33:17.000 And you see these guys, and they're monsters.
00:33:20.000 And then they have these drug-free guys, and everybody looks like a yogi.
00:33:25.000 They still look pretty big.
00:33:26.000 They look pretty big, but not compared to these other guys.
00:33:29.000 The only way to get those freak levels is through steroids.
00:33:33.000 Yeah.
00:33:33.000 And so, like, how are we going to police this?
00:33:37.000 And I think it's going to be very difficult.
00:33:38.000 And so, maybe we can have some kind of natural area of life.
00:33:43.000 But I think that our model of what's normal...
00:33:46.000 is just going to change.
00:33:48.000 Because like I was saying in the beginning, we set our baseline based on how we grew up.
00:33:52.000 And that, it seems about right.
00:33:53.000 Like it seems about right to us that everybody gets immunizations.
00:33:58.000 But immunizations are a form of superpower.
00:34:00.000 Imagine our ancestors; they couldn't even imagine immunizations.
00:34:03.000 What an unfair advantage when you have 100 million people dying of Spanish flu.
00:34:09.000 So all this stuff is scary, and it's going to normalize, but how it normalizes, that's what's at play.
00:34:16.000 Well, the world has changed so much just in the last 20 years, but it feels like this is just scratching the surface in comparison to what's coming.
00:34:24.000 People misunderstand, and they underestimate the rate of change.
00:34:29.000 And the reason that they do that is since the beginning of the digital revolution, we have experienced a thing called exponential change.
00:34:37.000 You've heard of Moore's Law, which basically says computing power roughly doubles every two years.
00:34:42.000 And we've internalized Moore's Law, and that means that every new iPhone we expect to be better and stronger and faster and all these kinds of things.
00:34:50.000 But now we're entering a world where we're going to have exponential change across technology platforms.
00:34:57.000 And so we think about, well, what does exponential change mean in the context of biology?
00:35:02.000 Well, at the very, very beginning, genome sequencing is going to be basically free.
00:35:09.000 But we're going to be able to change life.
00:35:12.000 And because we're on this J-curve, when you think of a 10-year unit of change looking in the rearview mirror, that amount of change is only going to take five years going forward, and then two years, and then one year.
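The arithmetic behind the doubling claim is simple to check. A small sketch (the two-year doubling period is the usual Moore's-law folklore; the sequencing figures are the ones quoted earlier in the conversation):

```python
import math

def doublings(years: float, period_years: float = 2.0) -> float:
    """Capability multiplier after `years` of exponential growth
    with the given doubling period (Moore's-law-style)."""
    return 2 ** (years / period_years)

# Twenty years at a two-year doubling period is 2**10 = 1024x.
print(doublings(20))  # 1024.0

# Genome sequencing fell from ~$1e9 (2003) to ~$800: roughly 20
# halvings in about 16 years -- faster than Moore's Law itself.
print(math.log2(1_000_000_000 / 800))  # ~20.25 halvings
```

That cost curve outpacing Moore's Law is exactly why "basically free sequencing" is a near-term extrapolation rather than science fiction.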
00:35:24.000 And so that's the reason why I've written this book is we have to get that this stuff is coming fast.
00:35:29.000 And if we want to be part of it, we have to understand it and we have to make our voices heard.
00:35:36.000 What makes you nervous about this?
00:35:39.000 All right, three big areas.
00:35:40.000 First, humans and all of us are incredibly complex.
00:35:45.000 I mean, we talk about genetic code, which is mind-bogglingly complex.
00:35:50.000 But our genetics exists within the incredible, incredibly complex systems biology.
00:35:56.000 We have all these things like our microbiome, our virome, our proteome, our metabolome.
00:36:01.000 And then that exists within the context of our environment and everything's always changing and interacting.
00:36:07.000 And so we are messing, and we have the tools to mess, and we will mess, because we're this hubristic species, with these really complex ecosystems, including ourselves, that we don't fully understand.
00:36:18.000 That's number one.
00:36:18.000 Two, and you mentioned it before, this issue of equity.
00:36:21.000 What happens if we have – every technology has to have first adopters.
00:36:25.000 If you don't have it, you never get the technology.
00:36:27.000 But what happens if a group of people move much more quickly than other people?
00:36:31.000 Whether it's real or not, even if they believe it's real, you could imagine big, dangerous societal changes.
00:36:37.000 And the third big area is diversity.
00:36:38.000 When we think about diversity, we think, well, it's great to have diverse workplaces and schools, and we're better people for it and we're more competitive.
00:36:47.000 But diversity is something much, much, much deeper.
00:36:50.000 In Darwinian terms, diversity is random mutation.
00:36:54.000 Like, that's our core survival strategy as a species.
00:36:57.000 If we didn't have that, you could say we'd still be single-cell organisms, but we wouldn't.
00:37:01.000 We would have died because the environment would have changed and we wouldn't have had the built-in resilience to adapt.
00:37:07.000 Yeah, that is really important when you think about diversity, right?
00:37:10.000 That we need a non-uniformity when it comes to our own biology.
00:37:14.000 We need a bunch of different kinds of people.
00:37:16.000 We have to have it.
00:37:17.000 Even if we optimize for this world, the world will change.
00:37:21.000 There's no good and bad in evolution.
00:37:23.000 There's just well-suited for a particular environment.
00:37:25.000 If that environment changes, the best suited person for your old environment may be the least suited person for the new environment.
00:37:33.000 So even if we have things that seem like really great ideas now, like optimizing health.
00:37:38.000 So if you have sickle cell disease, you're probably going to die and you're going to die young and it's going to be excruciatingly painful.
00:37:46.000 And so you would say, well, let's just get rid of sickle cell disease, which we can do.
00:37:51.000 But if you are a recessive carrier of the sickle cell disease gene, you don't have it, you're just carrying it, and you have a pretty significant risk of passing it on to your kids, but you also have an additional resistance to malaria.
00:38:07.000 And so we are almost certainly carrying around all kinds of recessive traits, maybe even ones that we don't like that are harming us now, but that could be some protection against some future danger that we don't yet understand or haven't faced.
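The carrier math behind this is classic Mendelian inheritance: if both parents carry one recessive sickle-cell allele, each child has a 1-in-4 chance of the disease, a 1-in-2 chance of being a carrier, and a 1-in-4 chance of neither. A sketch enumerating the cross (textbook genetics, not from the episode):

```python
from itertools import product
from collections import Counter

# 'A' = normal allele, 's' = recessive sickle-cell allele.
# Each carrier parent (genotype 'As') passes one allele at random.
parent1, parent2 = "As", "As"

counts = Counter(
    "".join(sorted(pair)) for pair in product(parent1, parent2)
)
total = sum(counts.values())
for genotype, n in sorted(counts.items()):
    print(f"{genotype}: {n}/{total}")
# AA: 1/4 (unaffected non-carrier)
# As: 2/4 (carrier -- no disease, but some malaria resistance)
# ss: 1/4 (affected)
```

This is why "just get rid of the gene" is not a free lunch: editing out every carrier also edits out the heterozygote malaria protection that kept the allele in the population.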
00:38:22.000 And so the challenge is that diversity has just happened to us for 4 billion years.
00:38:26.000 Now we're going to have to choose it, and that's a big challenge for us.
00:38:31.000 So essentially, we're going to have, without doubt, some unintended consequences, some unintended domino effect, things that are going to take place that we really can't predict.
00:38:45.000 We just have to kind of go along with this technology and see where it leads us as it improves.
00:38:50.000 If you go back and look at surgeries from the 1950s, comparison to surgery of 2019...
00:38:57.000 I mean, I would never advise someone to get their knee operated by a 1950s physician.
00:39:03.000 But that's kind of, someone's going to have to be an early adopter when it comes to these genetic therapies.
00:39:09.000 Yeah, no, so I agree with you, but where I would slightly, where I would add to what you're saying is, these technologies, they're going to happen, they're going to play out.
00:39:18.000 Mm-hmm.
00:39:18.000 What's at play now is not whether these technologies are going to advance.
00:39:22.000 They will advance in a way that is going to just blow people's minds.
00:39:27.000 What's at play is what are the values that we are going to weave into the decision-making process so that we can get a better outcome than we otherwise would have had.
00:39:36.000 And that's what, in my view, that's the real important issue now.
00:39:40.000 Yeah.
00:39:41.000 Unintended consequences are something that I've been taking very seriously lately when I'm paying attention to technology as it's used in social media.
00:39:52.000 Particularly, one of the things that's disturbed me quite a bit over the last few weeks is that there's a model that they use, not intentionally, to get people upset about things: it shows you things in your feed that you'll argue against, because that makes you click on them more and engage with them more.
00:40:14.000 And because of the fact that we have this advertiser-based model where people are trying to get clicks...
00:40:37.000 Right.
00:40:38.000 And we've gotten to this point where this is just an accepted part of our lives.
00:40:42.000 Did you go to check your Google feed or your Facebook feed and go, what the fuck are they doing?
00:40:47.000 Is this real?
00:40:48.000 They're going to pass this?
00:40:49.000 God damn it.
00:40:50.000 And then you get mad and then you engage with people online and then it results in more revenue.
00:40:56.000 Right.
00:40:57.000 Getting them to stop that.
00:40:58.000 If you had to go to Facebook and say, hey, Mark Zuckerberg, I know you have fucking $100 billion or whatever you got, but you can't make any more money this way.
00:41:05.000 Because what you're doing is fucking up society.
00:41:07.000 Because you're encouraging dissent.
00:41:10.000 You're encouraging people to be upset, and encouraging arguments.
00:41:12.000 And you're doing it at great financial reward but great societal cost.
00:41:17.000 So stop.
00:41:19.000 He's not going to do it, right?
00:41:20.000 Well, he may not do it.
00:41:21.000 That comes back to the point about regulation.
00:41:22.000 The question is, how big is your stick?
00:41:24.000 Well, one of the guys who was the founder of Facebook is now coming out and saying that Facebook needs to be broken up.
00:41:30.000 And then he was one of the original founders.
00:41:32.000 And he's like, it has gotten so far out of hand.
00:41:35.000 It's so far away from where it is.
00:41:36.000 It's literally affecting global politics.
00:41:39.000 Yeah.
00:41:40.000 Well, it is.
00:41:40.000 And so one option is to break it up.
00:41:42.000 It seems to have worked pretty well with AT&T. Another option is to regulate it, which in my mind would be a better approach.
00:41:51.000 And that is to say, here's what's okay and here's what's not okay.
00:41:55.000 And this stuff is really intricate.
00:41:57.000 You have to really get down beneath these algorithms, which are unbelievably complex, but you're exactly right.
00:42:03.000 I mean, what we're seeing now...
00:42:11.000 Right.
00:42:17.000 Right.
00:42:20.000 Yeah.
00:42:39.000 Societies look very, very different.
00:42:41.000 Everyone's life experiences, we kind of take for granted.
00:42:44.000 You can go out the door, walk to Starbucks and not get shot, or you can have your house, something happens, your house gets robbed, you call the police, and the police aren't the ones who've robbed your house.
00:42:55.000 I mean, there's all these kinds of crazy things.
00:42:58.000 If we break down the foundations that underpin our lives, that's really dangerous.
00:43:04.000 Right.
00:43:04.000 What I was kind of getting at was that this process, how the algorithm selects the things it shows you in your feed, how people get upset by it, and how it generates massive amounts of revenue, once it's already happened, it's very difficult to stop.
00:43:20.000 And my concern would be that this would be a similar thing when it comes to genetic engineering.
00:43:26.000 We're saying we need to be able to put regulations on this, we need to be able to establish...
00:43:30.000 But once it gets out of the bag, once it gets rolling, and I have...
00:43:35.000 Do you remember when Mark Zuckerberg sat in front of all those politicians?
00:43:38.000 They had no fucking idea what they were talking about.
00:43:41.000 You make money, yeah.
00:43:42.000 Piss-poor prepared.
00:43:43.000 It shows you, like, these are the people that are looking out for us, good fucking luck.
00:43:47.000 These are Luddites.
00:43:48.000 They're dumbasses.
00:43:49.000 They're fools, right?
00:43:51.000 And they don't know anything about- Some are, there's better and worse, but yeah.
00:43:55.000 But almost everyone was underwhelming and under-impressive.
00:44:00.000 In that hearing.
00:44:00.000 In that hearing.
00:44:01.000 The fact that they're dealing with one of the most important moments of our time, but they didn't bring on some sort of a legitimate technology expert who could explain the pitfalls of this and do so in a way that the rest of the world's going to know.
00:44:14.000 So they're not going to protect us from genetic engineering either, right?
00:44:17.000 I totally agree.
00:44:17.000 Because they're generalists in terms of their education for the most part, and they're not concerned.
00:44:22.000 They're concerned with raising money for their campaign.
00:44:25.000 They're concerned with getting re-elected.
00:44:26.000 That's what they're concerned with.
00:44:27.000 Yeah, I totally agree with you that if we wait to focus on this issue until it becomes a crisis, it's going to be too late because all the big decisions will have been made.
00:44:38.000 The reason why I wrote this book, the reason why I'm on my almost week three of this book tour doing events like this every day is what I am saying in every form that I can is this is really important.
00:44:51.000 We were watching the news yesterday, they had this royal baby in the UK. I don't give a shit.
00:44:57.000 It doesn't affect my life.
00:44:58.000 But what is at play now is the future of our entire species and our democracy and our lives.
00:45:04.000 And we have to be focusing on those things because we have a moment now where we can, to a certain extent, influence how these revolutions play out.
00:45:16.000 And if we just wait around, if we're distracted and we're focusing on all this stuff that's sucking up our attention, whether it's Trump or Brexit or Mueller and all these things, I mean, how much of our time are we spending focused on that?
00:45:27.000 It's fine, let's pay a little bit of attention, but there's really big stuff 50 years from now, 100 years from now, no one's going to look back and say, oh, that was the age of Trump.
00:45:35.000 They're going to say that was the age when after almost 4 billion years of evolution, humans took control of their own evolutionary process, and it's huge, and it's going to change all of life.
00:45:46.000 And what I'm trying to do is to say everybody has to have a seat at the table, whether you're a conservative Christian, whether you're a biohacking transhumanist, everybody needs to be at the table, because what we're talking about is the future of our species.
00:46:02.000 We're talking about the future of our species, but are we even capable of understanding the consequences of these actions, the stuff that we're discussing?
00:46:10.000 Right now, I'm not.
00:46:12.000 I'm talking about it, but if someone said, hey, you've got to go speak in front of people about the consequences in a very clear one-hour presentation, I'd be like, no, I'm not.
00:46:22.000 I don't know what I'm talking about.
00:46:23.000 One, we can go together, so you're good.
00:46:25.000 Thank you.
00:46:26.000 But two, the reason why I've written this book, Hacking Darwin, is I wanted it to be the one book you could read. And it's written for everybody, in a very clear way, with a lot of jokes that I think are funny (my mother laughed at them as well), so that you get it.
00:46:39.000 And then once you know just the basics, any human being has an equal right to be part of this conversation, the same as the top scientists or the leaders of any country.
00:46:52.000 I would agree with you there, but I don't think that other people are going to see it that way.
00:46:56.000 I think the people that are in control, they're not going to say, Hey, we need to be fair with everyone, all the citizens of the world.
00:47:02.000 How do you feel we should proceed?
00:47:03.000 But that's why we need this bottom-up groundswell, but we can't have a bottom-up groundswell if just general people aren't even aware of what the issues are.
00:47:14.000 And that's the challenge, and that's why forums like yours are just so important.
00:47:18.000 I mean, you have all of these people.
00:47:20.000 And then, you know, maybe everyone doesn't listen to this podcast and say, all right, I get it.
00:47:24.000 I can go give that hour-long speech.
00:47:26.000 But you can read a couple books, and then you can give an hour speech.
00:47:30.000 Because the issues, like, yes, there are scientific issues.
00:47:33.000 But this isn't a conversation about science.
00:47:35.000 This is about values and ethics and our future.
00:47:38.000 And it has to be a conversation for everybody.
00:47:40.000 Yeah, it's not just a scientific conversation.
00:47:42.000 It's a conversation about the future of this species and what the species will become.
00:47:47.000 And that's something we're wholly unqualified for.
00:47:50.000 No, but here's a little vote for optimism.
00:47:53.000 Okay.
00:47:54.000 We have never been this literate as a species.
00:47:57.000 True.
00:47:57.000 We've never been this educated.
00:47:58.000 I don't think we've ever been this nice either.
00:48:01.000 I hope so.
00:48:01.000 I really do.
00:48:02.000 I really think that.
00:48:02.000 When you look at all the wars and all the murder that used to happen, it's actually, this is the best time ever to be alive.
00:48:08.000 Still sucks for people that are in bad situations.
00:48:10.000 For sure, no, but it's, yes, on average it's better.
00:48:13.000 And we've never been this connected.
00:48:15.000 So we have, so in the book I call for a species-wide dialogue on the future of human genetic engineering.
00:48:20.000 You think, oh, that's nuts.
00:48:22.000 Seven billion people on earth, how are they gonna do that?
00:48:25.000 But we have the opportunity and we have to try because you don't want, like with the beginning of the genetically modified crops era, the scientists were actually really responsible, but regular people weren't consulted and they felt these guys just did it to me.
00:48:39.000 So if you had all those marchers against genetically modified organisms, now we are entering the era of genetically modified humans, and that's gonna scare the shit out of people.
00:48:49.000 And so we need to start preparing and we need to make people feel that they're respected and included.
00:48:54.000 And our government leaders aren't going to do it for us.
00:48:56.000 So we have to find ways of engaging ourselves.
00:48:59.000 And that's why with me, with the book, I set up a website where people can share their views, debate with other people.
00:49:07.000 I really want everybody to be part of this conversation.
00:49:10.000 How do you think it's going to play out in terms of how people, various religions perceive this?
00:49:15.000 Yeah.
00:49:15.000 So there's a real variation.
00:49:17.000 So there are people on one end of the spectrum who believe that this is quote-unquote playing God.
00:49:24.000 And if you believe that the world was created exactly as it is by some kind of divine force, and that it's wrong for humans to change it, to quote-unquote play God, it's hard to explain how you could justify everything that we've already done.
00:49:41.000 I mean, we've changed the face of life on this planet Earth.
00:49:45.000 But I really respect people who say, look, I think that there's a line, that I believe that life begins at conception, and that any kind of manipulation after conception is interfering, that's going too far, and I respect that.
00:50:00.000 And those people need to have a seat at the table.
00:50:03.000 And there's certainly very strong religious views.
00:50:06.000 In Judaism, there's an idea called tikkun olam, which means that the world is created cracked and broken, and it's the responsibility of each person to try to fix it.
00:50:15.000 And that's a justification for using science and doing things to try to make the world a better place.
00:50:20.000 And then there are now these new kind of, I mean, transhumanism.
00:50:23.000 It's almost like a religion.
00:50:25.000 It's this religion of science.
00:50:27.000 And so we're going to have...
00:50:29.000 We're humans.
00:50:29.000 We're so diverse.
00:50:30.000 We are going to have this level of diversity.
00:50:34.000 And the challenge is, how do we have a process that brings everybody in?
00:50:39.000 But it's tough.
00:50:40.000 So when we're talking about genetic...
00:50:48.000 Yeah.
00:50:49.000 Yeah.
00:51:03.000 Yeah.
00:51:04.000 Yeah.
00:51:17.000 Yeah, well, people are already doing it in Sweden.
00:51:19.000 Sure.
00:51:20.000 What are they doing in Sweden?
00:51:21.000 Yeah, they're putting just little chips in their hands and under their skin, and they're using it to open doors and access things.
00:51:28.000 So it's just starting.
00:51:29.000 So I definitely believe, you know, right now, we look at photographs of our parents, and It's like, God, look at your hair, your clothes.
00:51:36.000 That's crazy.
00:51:37.000 Definitely, I think that 20 years from now, 30 years from now, people are going to look at pictures of us and say, what's that little rectangular thing?
00:51:45.000 And you're going to say, that was a phone.
00:51:47.000 And they'll say, what?
00:51:48.000 It's like, yeah, we used to carry it around in our pocket.
00:51:51.000 Well, like Michael Douglas, when you watch him in that movie Wall Street, he's got that giant brick phone on the beach.
00:51:55.000 Exactly.
00:51:56.000 So, we are all Michael Douglas because our technology, you're absolutely right, is not going to be something that we carry around.
00:52:05.000 Technology is coming inside of our bodies.
00:52:08.000 That is the future of where it's going.
00:52:10.000 And, you know, people say, well, what does human genetic engineering have to do when we know that AI is going to get more and more powerful?
00:52:19.000 The future of technology, the future of all of this, it's not human or AI. It's human plus AI. And that is what's going to drive our – we are co-evolving with our technology, and that's what's going to drive us forward.
00:52:32.000 But you're exactly right to be afraid and to be concerned.
00:52:35.000 And again, everything comes to, well, how are we going to regulate it?
00:52:38.000 Are we going to have guardrails of how far is too far?
00:52:41.000 Are we going to let companies just do whatever they want, or are we going to put restrictions on what they can do?
00:52:46.000 I think letting the whole world decide, though, you're going to run into those religious roadblocks.
00:52:51.000 For sure.
00:52:52.000 And that's the challenge is that the science is advancing exponentially, whatever we do.
00:52:57.000 And so our understanding of the science needs to at least try to keep pace.
00:53:05.000 Regulations need to keep up.
00:53:07.000 I'm part of the World Health Organization International Advisory Committee on Human Genome Editing.
00:53:11.000 So we're meeting six times this year in Geneva.
00:53:13.000 And the question that we're asking is, how do we think about global regulation, at least to try to put limits on the far ends of what's possible?
00:53:23.000 And it's really, really difficult.
00:53:26.000 But that's why we need to have this kind of process.
00:53:29.000 And it seems impossibly ambitious, but every crazy idea has to begin somewhere.
00:53:34.000 So you're doing every couple months?
00:53:36.000 Yeah, yes.
00:53:38.000 Wow.
00:53:39.000 Yeah.
00:53:39.000 Because they want to be on top of it as things change.
00:53:42.000 Well, that's the goal.
00:53:44.000 It's just, it's so hard because...
00:53:46.000 Almost impossible, I imagine.
00:53:47.000 It's impossible.
00:53:47.000 It's impossible.
00:53:48.000 And that's why...
00:54:04.000 Because there's not a crisis, people are focusing on other things.
00:54:08.000 Open any news site.
00:54:09.000 What do you see?
00:54:10.000 It's not like the really important stuff.
00:54:13.000 It's Trump did this or Kardashians did that.
00:54:17.000 We're in this culture where there are a lot of draws on our attention, but sometimes there's really important stuff and people are afraid of it.
00:54:23.000 People are afraid of science.
00:54:25.000 People feel like, I remember science from high school.
00:54:26.000 I didn't like it.
00:54:28.000 I was uncomfortable.
00:54:30.000 You know, this is for technical people.
00:54:32.000 And I just feel like we can't – science is so deeply transforming the world, not just around us, but within us.
00:54:39.000 And so we have to understand it.
00:54:41.000 And people who are explaining science like me, the onus is on us.
00:54:46.000 Like if somebody reads my book and says, well, that was really dense.
00:54:49.000 That was too hard.
00:54:50.000 Like, that's my failure.
00:54:53.000 Like, I was giving a talk in New York a couple of weeks ago, and so I gave my talk, and I tried to make this really accessible for people.
00:55:01.000 People were all jazzed up, they got it.
00:55:03.000 And then there was this wonderful guy, this brilliant senior scientist at this major stem cell research center.
00:55:10.000 And so the host said, all right, Jamie just talked.
00:55:14.000 Can you give us a little background on the science?
00:55:17.000 This guy knows so much.
00:55:18.000 And he started going, and it was very technical.
00:55:21.000 And I could just see the faces of the people in the audience.
00:55:24.000 It was like, oh, God, what's happening here?
00:55:29.000 And their level of excitement, it just shrank.
00:55:29.000 Because they couldn't really put it all in a box.
00:55:32.000 And scientists aren't trained, by and large, to communicate anything.
00:55:38.000 And to see into the future: a little more than a month ago, I was in Kyoto, Japan, and I went to the laboratory of the world's leading scientist who's doing the process I mentioned earlier, of turning adult cells into stem cells into eggs.
00:55:53.000 And so this will revolutionize the way humans reproduce.
00:55:57.000 And so I was in a meeting with his top postdoc students.
00:56:01.000 So these are like really the cutting edge of these technologies.
00:56:04.000 And I went around to each of them and I said, here's my question.
00:56:06.000 I have two questions for each of you.
00:56:07.000 One, tell me what you're doing now.
00:56:10.000 And two, tell me what are the implications of what you're doing now for 50 years from now.
00:56:15.000 And the first question was, oh, I'm doing this and we're doing this with mouse models and people were so animated.
00:56:21.000 And then 50 years from now, people just froze and it was so uncomfortable.
00:56:25.000 They were like squeezing the table, just because that's not what scientists do.
00:56:30.000 They are trained to say, well, this is the thing just in front of me.
00:56:33.000 So I thought I was writing this book for the general public, but I'm being invited to speak to thousands of doctors and scientists because what they're saying is we get that we're doing this little piece of this, and whether it's lab research or fertility doctors or all sorts of things,
00:56:48.000 but it's really hard to put together the whole story of the genetics revolution and what it means for us and for society.
00:56:55.000 Yeah, man.
00:56:57.000 That is interesting about scientists, right?
00:56:59.000 They're just concentrating on the task at hand.
00:57:01.000 I mean, wasn't that, that was like one of the big concerns about the Manhattan Project, right?
00:57:06.000 This is the task.
00:57:07.000 The task is how do you figure out how to do it.
00:57:08.000 So they figure out how to do it, not the eventual consequences.
00:57:13.000 So, when Robert Oppenheimer, who was the lead of the Manhattan Project, when that first bomb went off, I mean, he has his famous quote.
00:57:22.000 Yeah, exactly.
00:57:24.000 I mean, the English common translation was, holy shit, what have we done?
00:57:29.000 And this science is real.
00:57:33.000 But it's not one person doing it.
00:57:37.000 Science has been diffused, at least with nuclear power.
00:57:41.000 It was a relatively small number of people.
00:57:44.000 And it was one or two states that could do it.
00:57:47.000 Now, with precision gene editing, you will get the Nobel Prize for figuring out how to do CRISPR gene edits.
00:58:00.000 But to apply it, once the formula already exists, all you need is like an A- in your high school biology class.
00:58:06.000 So this technology is out there, it's cheap, it's accessible.
00:58:11.000 Did you go to that 2045 conference in Manhattan a couple years back?
00:58:16.000 No.
00:58:17.000 Do you know about all that 2045?
00:58:19.000 That's part of the thing with these transhumanist folks.
00:58:22.000 They believe that with their own calculations of the exponential increase of technology, that somewhere around 2045, there's a singularity.
00:58:30.000 Yeah.
00:58:31.000 At the very least, we're going to reach this point where you're going to be able to either download consciousness or have some sort of an artificially intelligent sentient being that's hanging out with you.
00:58:41.000 Yeah, so I'm involved.
00:58:43.000 I'm on faculty for one of the programs of Singularity University called Exponential Medicine.
00:58:48.000 And so we're thinking a lot about that.
00:58:50.000 I actually had an editorial in the New York Times a few weeks ago imagining a visit to a fertility clinic in the year 2045. And again, because we're on this exponential change, it's really hard for people to internalize, to kind of feel how fast these changes are coming.
00:59:08.000 I do think, though, Ray Kurzweil, who's a really incredible genius, he thinks that we are soon going to get to a point where our artificial intelligence is self-learning.
00:59:19.000 But AI, if it gets to the point where it can read something, read and comprehend, like in seconds, it will read every book ever written in human history.
00:59:29.000 And then when you have all these doublings and all this more knowledge, you can imagine how that would happen pretty quickly.
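The "doublings" mentioned here can be made concrete with a quick sketch. The baseline and the number of doublings below are illustrative assumptions, not figures from the conversation; the point is just how fast repeated doubling compounds.

```python
# A quick sketch of how repeated doublings compound. The starting baseline
# and the number of doublings are illustrative assumptions, not figures
# from the conversation.

def capability_after(doublings: int, baseline: float = 1.0) -> float:
    """Capability after a given number of doublings from a baseline."""
    return baseline * 2 ** doublings

# Ten doublings already multiply the baseline by 1,024;
# twenty doublings by more than a million.
print(capability_after(10))  # 1024.0
print(capability_after(20))  # 1048576.0
```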
00:59:36.000 There's a counterargument against that, and I think that it will happen. But our human brains, on one hand, they're incredibly complex, and they're also kind of irrational.
00:59:47.000 I mean, we have all these different layers.
00:59:48.000 We have our lizard brain, and every decision that we make, there's the rational decision.
00:59:52.000 But then there's all the other stuff that our brains, that doesn't even rise to the level of our awareness, that our brains are processing.
01:00:00.000 And right now, we only really have one really effective artificial intelligence algorithm, which is for pattern recognition.
01:00:07.000 But if you think of pattern recognition as a core skill of what our brains do, our brains probably have a thousand, two thousand different skills.
01:00:14.000 But the core thing is whether we reach this singularity moment or not.
01:00:20.000 These technologies are going to become incredibly more powerful.
01:00:23.000 They're going to become increasingly integrated into our lives and into our beings and part of our evolutionary process.
01:00:30.000 There's no longer, oh, we just have our biological evolution and our technological evolution, and those are separate things.
01:00:36.000 They're connected.
01:00:36.000 It's going to be that weird question of whether an artificial intelligence is going to be able to absorb all the writing that human beings have ever done and really understand us.
01:00:46.000 Yeah.
01:00:46.000 Will they really still be able to understand us just because they get all the writing?
01:00:49.000 So right now, you would say no.
01:00:52.000 I'd say no, yeah.
01:00:53.000 But 20 years from now?
01:00:55.000 50 years from now?
01:00:55.000 100 years from now?
01:00:56.000 They could come up with a reasonable facsimile.
01:00:59.000 I mean, they could figure out a way to get it close enough to...
01:01:02.000 Yeah.
01:01:03.000 You know, where it's like her, like that.
01:01:05.000 Yeah, yeah.
01:01:06.000 That's an essential point.
01:01:07.000 Because I think when people imagine this AI future, they're imagining like some intimate relationship with some artificial intelligence that feels just like a human.
01:01:17.000 I don't think that's going to happen.
01:01:19.000 You don't?
01:01:20.000 Well, no, but just because AI, it will be its own form of intelligence.
01:01:24.000 And frankly, we may not want AIs with brains like we have, with all these different impulses, imagining all this crazy stuff.
01:01:32.000 We may want them to be more rational than we are.
01:01:37.000 Chimpanzees are our close relatives.
01:01:39.000 They don't think just like us.
01:01:40.000 We're not expecting them to think like us.
01:01:43.000 They're their own thing.
01:01:44.000 And I think AIs will be their own things.
01:01:46.000 Will we be interacting with them?
01:01:48.000 Will we be having sex with them?
01:01:49.000 Yes.
01:01:50.000 But it's not going to be that they're just like us.
01:01:54.000 They're going to be these things that live within us, live with us, and together we're going to evolve.
01:02:01.000 Well, they're certainly already better at doing certain things like playing chess.
01:02:05.000 I mean, it took a long time for an artificial intelligence to be able to compete against a real chess master, but now they swamp them.
01:02:12.000 And they learn quickly, like incredibly quickly.
01:02:15.000 They teach themselves.
01:02:16.000 Yeah, so first we had chess, and people said, oh, that's what it means to be a human.
01:02:22.000 The computers will never beat humans at chess.
01:02:24.000 Now, it's like everyone says, well, no human could ever compete.
01:02:27.000 And then they said, well, there's this Chinese game of Go, which, when people here look at it, kind of looks like checkers, but it's actually way more sophisticated, way more complicated than chess.
01:02:36.000 I heard that there are more moves in Go, more potential moves than there are stars in the universe.
01:02:42.000 Yes, yes.
01:02:43.000 So then they had AlphaGo.
01:02:46.000 That this company DeepMind, which was later acquired by Google, they built this algorithm that in 2016 defeated the world champions of Go.
01:02:56.000 People thought that we were decades away.
01:02:58.000 And then DeepMind created this new program called AlphaZero.
01:03:03.000 With AlphaGo, they had given it access to all of the digitized games of Go.
01:03:09.000 So it very quickly was able to learn from how everybody else had played Go.
01:03:14.000 AlphaZero, they just said, here are the basic rules of Go.
01:03:18.000 And they let AlphaZero just play against itself, with no other experience other than the rules.
01:03:25.000 And in four days, AlphaZero destroyed AlphaGo.
01:03:30.000 And then...
01:03:33.000 AlphaZero destroyed the world champions of chess and destroyed every other computer program that had ever played chess.
01:03:42.000 And again, those computer programs had internalized all the chess games of Grandmasters.
01:03:46.000 AlphaZero had not internalized any.
01:03:49.000 It just played against itself for a few days.
01:03:51.000 And then Shogi, which is a Japanese traditional game, kind of like chess, it destroyed the Grandmasters of that.
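The training recipe described here, giving a program only the rules and letting it improve by playing itself, can be sketched on a toy game. Below is a minimal tabular self-play learner for a simplified Nim; it illustrates only the principle. The real AlphaZero combines self-play with deep neural networks and Monte Carlo tree search, all of which this sketch omits, and every name and parameter here is an illustrative choice.

```python
import random

# Toy Nim: a pile of stones, each player removes 1 or 2, and whoever
# takes the last stone wins. A single shared value table is trained
# purely by self-play -- the bare principle behind AlphaZero-style
# training, with none of the neural networks or tree search.

PILE = 10
values = {}  # values[(pile, move)] -> estimated win rate for the mover

def choose(pile, explore=0.1):
    """Pick a move: mostly greedy on learned values, sometimes exploratory."""
    moves = [m for m in (1, 2) if m <= pile]
    if random.random() < explore:
        return random.choice(moves)
    return max(moves, key=lambda m: values.get((pile, m), 0.5))

def self_play_game():
    """Play one game against itself and update values from the outcome."""
    pile, history, player = PILE, [], 0
    while pile > 0:
        m = choose(pile)
        history.append((player, pile, m))
        pile -= m
        player ^= 1
    winner = history[-1][0]  # whoever took the last stone
    for p, pile_seen, m in history:
        old = values.get((pile_seen, m), 0.5)
        reward = 1.0 if p == winner else 0.0
        values[(pile_seen, m)] = old + 0.1 * (reward - old)  # incremental update

random.seed(0)
for _ in range(20000):
    self_play_game()

# Optimal play in this Nim variant is to leave a multiple of 3 behind,
# so with enough self-play, taking 1 from a pile of 10 (leaving 9)
# should come to be valued above taking 2.
print(values[(10, 1)], values[(10, 2)])
```

Because both "players" share one value table, the opponent improves as the policy improves, which is the self-reinforcing loop being described in the conversation.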
01:03:58.000 That's what I'm saying.
01:03:59.000 The world is changing.
01:04:02.000 It's changing so much faster than we anticipate.
01:04:05.000 And we have to be as ready for that as we can.
01:04:08.000 I think we need to come to grips with the fact that we're way stupider than we think we are.
01:04:12.000 We think we're really intelligent, and we are.
01:04:15.000 In comparison to everything else on this planet.
01:04:17.000 But in comparison to what is possible, we are really fucking dumb.
01:04:21.000 In comparison to what this computer can do and what the future of that computer is going to be.
01:04:26.000 Maybe that computer is going to redesign another computer.
01:04:28.000 This is good, but I've got some hiccups here.
01:04:31.000 Yeah.
01:04:32.000 No, it's true.
01:04:32.000 And yet the technology is us.
01:04:34.000 Right.
01:04:34.000 Sort of.
01:04:35.000 It's not like this technology is some alien force.
01:04:37.000 It's like we create art.
01:04:40.000 Collectively.
01:04:41.000 Yeah.
01:04:41.000 You mentioned cities.
01:04:43.000 We create these cities, which are these incredible places where dreams can happen.
01:04:48.000 Cities like here in Los Angeles or New York, where I'm from.
01:04:51.000 So this technology is us.
01:04:53.000 And the challenge is, how can we make sure that this technology serves our needs rather than undermines our needs?
01:05:00.000 Right.
01:05:01.000 Yeah, and whether or not our needs supersede the needs of the human race or supersedes the needs of the planet.
01:05:08.000 Yeah.
01:05:11.000 We're almost too much chimp, right, to contemplate these critical decisions in terms of how it's going to unfold from here on out.
01:05:20.000 We really might, not we, but the people that are actually at the tip of the spear of this stuff, they really might be affecting the way the planet exists.
01:05:30.000 Absolutely.
01:05:31.000 And we're doing that now.
01:05:32.000 I mean, there was an article that came out the other day.
01:05:35.000 There's a million species that are on the verge of extinction.
01:05:38.000 We are driving all these other species to extinction.
01:05:40.000 We're warming the planet.
01:05:42.000 So humans are the determining factor in many ways for how this planet plays out.
01:05:48.000 And that's why, in my mind, everything comes back to values.
01:05:51.000 You're right.
01:05:52.000 We have this...
01:05:54.000 Lizard nature, this monkey nature.
01:05:57.000 It's who we are.
01:05:58.000 You wouldn't want to take that away because that's the core of what we are.
01:06:02.000 And yet we're also a species that has created philosophy.
01:06:05.000 We've created beautiful religions and traditions and art.
01:06:09.000 And the question is, which version of us is going to lead us into the future?
01:06:16.000 If it's this tribal primate with these urges, that's really frightening.
01:06:22.000 If we can say...
01:06:23.000 We've done better and worse in history.
01:06:25.000 And we had this terrible Second World War.
01:06:27.000 And yet at the end of the Second World War with American leadership, the world came together.
01:06:32.000 We established the United Nations.
01:06:34.000 We established these concepts of human rights.
01:06:36.000 Like you can't just kill everybody in your own country and say, hey, it's just my business.
01:06:41.000 So we have this capability, but it's always a struggle.
01:06:45.000 I mean, these forces are always at war with each other in many ways.
01:06:48.000 Yeah.
01:06:49.000 It's just too much to think about.
01:06:51.000 Yeah, but we have to.
01:06:53.000 I know, we do have to.
01:06:54.000 One of the things that's always been amusing to me is that we seem to have this insatiable desire to improve things.
01:07:04.000 And I've always wondered why.
01:07:06.000 But is that maybe because this is what human beings are here for?
01:07:10.000 It's what we do.
01:07:11.000 It's who we are.
01:07:13.000 Right.
01:07:13.000 But is this a product?
01:07:14.000 It's just a...
01:07:16.000 Us being intelligent, trying to survive against nature, and predators, and weather, and all the different challenges that we evolved dealing with.
01:07:26.000 And then now, we just want things to be better.
01:07:29.000 We just want things to be more convenient, faster, with more data.
01:07:34.000 You're aware of Elon Musk's Neuralink technology.
01:07:38.000 Sure.
01:07:38.000 How much do you know about it?
01:07:39.000 I know a decent amount.
01:07:40.000 And my friend Bryan Johnson, he has a company, Kernel.
01:07:42.000 There's a few different companies that are trying to think about these brain-machine interfaces.
01:07:46.000 And what are they trying to do?
01:07:47.000 Basically, what they're trying to do is to find a way to connect our brains to our machines.
01:07:53.000 And there's a little bit of progress.
01:07:55.000 Our brains are incredibly complicated and they're messy.
01:07:59.000 I mean, there's a lot that's happening.
01:08:02.000 But we are increasingly figuring out how to connect our brains to our technology.
01:08:08.000 And so people are imagining a time when we can do things like download memories, download ideas, or upload memories and upload ideas.
01:08:18.000 And there's some very early science that is suggesting that this will be possible, but these are the very early days.
01:08:26.000 But Elon was giving the impression that sometime this year they're going to release something.
01:08:33.000 You know, they may release something, but it's not going to be something that's going to change the world.
01:08:37.000 Because that technology is way more nascent than even the genetics technology that I've been talking about.
01:08:44.000 It's not like it's at all remotely possible that this year you're going to be able to upload a full memory or download a full memory.
01:08:53.000 But there are little things that are happening, but every journey begins with a step.
01:08:58.000 But the technology is fairly transparent in terms of where the state of the art is right now?
01:09:04.000 It is in that it's extremely early.
01:09:07.000 This stuff is...
01:09:08.000 So when you think about systems that we understand, I mentioned that we understand just a little bit about genomics.
01:09:15.000 We know less about the brain.
01:09:17.000 The brain is kind of the great unknown of this universe.
01:09:20.000 We know more about the oceans than we know about our brain.
01:09:23.000 I mean, we know very, very little.
01:09:25.000 We understand that if you kind of...
01:09:46.000 I think he's off based on your use of the word your.
01:09:53.000 So I mentioned that a month ago I was in Kyoto and I was at this stem cell lab, but I also went to another lab of a guy named Hiroshi Ishiguro, who's the world's leading humanoid roboticist.
01:10:05.000 And so he's the guy who was on the cover of Wired and he's created these robot avatars.
01:10:10.000 And like I had a conversation with this robot woman, And it was really interesting because I could see that if I would smile, she'd smile and lean forward.
01:10:20.000 And if I had an over-exaggerated sad face, she'd change her expression.
01:10:25.000 And she can have basic conversations.
01:10:30.000 But we're still a long way from having full robotic-human interactions. But I had this debate with Ishiguro, and he was saying that he thought that the future of humanity was non-biological,
01:10:46.000 that we were going to kind of upload ourselves to these non-biological entities, and that is how we would gain our immortality.
01:10:53.000 And I argued something very different.
01:10:56.000 I feel like we are biological beings.
01:10:58.000 I think we'll fully integrate with our technology, but if we ever become entirely non-biological, then that's not us.
01:11:05.000 Either we will have committed suicide as a species, or these robots will have killed us.
01:11:12.000 Because even if, let's just say, that I could download my entire consciousness to some kind of robot, and let's just say that was possible, that robot would be me for that first moment...
01:11:40.000 Right.
01:11:51.000 If that's your consciousness, your consciousness is in these ones and zeros?
01:11:55.000 Yeah.
01:11:56.000 I mean, that's terrifying.
01:11:59.000 Yeah.
01:11:59.000 What's terrifying is if someone didn't like you and they said, I'm going to make one version of you suffer for all eternity.
01:12:05.000 Yeah, yeah, yeah.
01:12:06.000 And I'm going to just download you while you sleep.
01:12:08.000 It's true.
01:12:08.000 But I have something worse than that.
01:12:11.000 Okay.
01:12:11.000 Death.
01:12:12.000 And so I think that nobody is going to say, well, I'm going to be Joe living a life or I'm going to like not be Joe and I'll just have my...
01:12:34.000 I think some people will want that, not everybody.
01:12:36.000 I think some people will, but they don't know what they're getting, right?
01:12:38.000 In terms of you don't know what that experience is going to be like, nor do you know if there is some sort of a chemical gateway that happens in the mind when you do expire and allows you to pass through to the other dimension that your consciousness and your soul longs to travel to.
01:12:55.000 I hope you're right about that.
01:12:57.000 I'm definitely not right.
01:12:59.000 I've written about this in my novel.
01:13:00.000 It's like, yeah, but I think kind of when you're dead, you're just dead.
01:13:03.000 Why do you think that, though?
01:13:04.000 Well, just because I think that this kind of immortality comes because time stops.
01:13:09.000 Time is this relative concept.
01:13:10.000 And so at the moment that you die, that's immortality for you because time stops flowing for you.
01:13:18.000 That's what Einstein taught us.
01:13:19.000 Time is this relative concept.
01:13:21.000 Other people, very legitimately, and there's no way to prove it, feel that we have this soul and this soul can travel to other dimensions.
01:13:28.000 I happen to believe that we are biological beings and our experience of the soul, whatever, is connected to our biology.
01:13:35.000 When our biology stops functioning, those experiences, whatever they are, stop being accessible, at least to us.
01:13:42.000 Have you had any psychedelic experiences?
01:13:44.000 I haven't.
01:13:45.000 And I was so tempted.
01:13:47.000 We started the interview talking about my chocolate shamanism.
01:13:51.000 You haven't had anything?
01:13:52.000 I haven't.
01:13:53.000 Do you want to?
01:13:55.000 I don't think so, and I'll tell you why.
01:13:57.000 So I listened to the Michael Pollan interviews, and he had his great conversation with Sam Harris, and I really think that this psilocybin stuff is real.
01:14:08.000 It just got decriminalized in Denver.
01:14:09.000 I know, in Denver.
01:14:10.000 I was just there the other day.
01:14:12.000 But, as I said before, I think that the ultimate drug is us.
01:14:18.000 And so for me, I would rather, and I definitely think that our awareness, it doesn't encompass everything that is knowable, everything that we could know, but we hem ourselves in.
01:14:29.000 And if we want to get out of those limitations, certainly drugs are ways that people have used for many thousands of years.
01:14:41.000 What does that mean?
01:14:56.000 That the drug is us.
01:14:57.000 That if we want to expand our consciousness, there are all kinds of ways, whether it's meditation or awareness or just simple appreciation.
01:15:05.000 That's when I do these cacao ceremonies.
01:15:07.000 What I say is you have this cacao in front of you, but it's not just this.
01:15:12.000 Think of...
01:15:13.000 The person in Honduras who planted the seed, the person who watered that seed, the person who took the plant, the person who paved the road to bring the plant.
01:15:22.000 And I just think that we can expand our consciousness through our own means, and then we always have access.
01:15:29.000 I hear what you're saying, but you're saying this from the perspective of a person that's never had psychedelic experiences. It's really preposterous.
01:15:34.000 If you did experience what psilocybin can do to you, you definitely wouldn't be saying it this way.
01:15:40.000 You also wouldn't be thinking that you take it and then you're not on it anymore because it's profoundly influential for your perspective in regards to the whole rest of your existence.
01:15:50.000 There's many people that have had psychedelic experiences that think about it as a rebirth, that they've gone through this and changed.
01:15:59.000 So why would you have this rigid thought process about drugs and not drugs, but yet you don't have it about cacao, which is a mild drug?
01:16:09.000 Yeah, and so you're right that it may not be entirely consistent.
01:16:12.000 Some of the people you've described are good friends of mine who've really done it, and I've talked to them about it, and I'm endlessly curious.
01:16:20.000 So why don't you do it?
01:16:22.000 The reason is, so far, I have been on this journey to see what's possible within myself.
01:16:29.000 And I'm still on that journey.
01:16:30.000 I don't want to close off any possibility for anything.
01:16:34.000 How would you assume that it would close things off?
01:16:35.000 That's what's confusing.
01:16:36.000 Yeah.
01:16:37.000 It's just opening you up to a new experience that other people have found to be profoundly influential.
01:16:43.000 Yeah.
01:16:45.000 So far, you're very resistant to this.
01:16:47.000 Like, even when I'm talking, you're like, yeah, yeah, yeah.
01:16:49.000 Like, you can't wait to come back with your own rational perspective.
01:16:53.000 No, you're right.
01:16:53.000 You know what's so funny?
01:16:53.000 I will come back to this.
01:16:54.000 From the perspective of a person who hasn't experienced anything.
01:16:57.000 You're so right.
01:16:58.000 And just, I think it's such a great point.
01:17:00.000 Because I'm very close with, actually, with the Tibetans.
01:17:02.000 One of my closest friends is the prime minister of the Tibetan exile government.
01:17:06.000 So, I've been many times to Dharamsala in India.
01:17:09.000 I've met with His Holiness the Dalai Lama many times.
01:17:12.000 And the most incredible thing about meeting with these guys, and they are all people who've found these incredible states of heightened consciousness, so much so that their brains are changed when they go into the fMRI machines.
01:17:28.000 But When you have a conversation with them, it's not like what we do.
01:17:32.000 Like, exactly.
01:17:32.000 And thank you for calling me out.
01:17:33.000 Like, you say something.
01:17:34.000 I've already, in my head, countered what you're saying before you're finished saying it.
01:17:38.000 That's why you kept saying, right.
01:17:39.000 Exactly.
01:17:40.000 Which probably means that you made me a little uncomfortable, which is good.
01:17:43.000 That's what we want.
01:17:44.000 And these guys, it's like you'd talk to them, and they would just be so tuned in to what you were saying.
01:17:49.000 And they would just kind of think about it.
01:17:51.000 Then you'd finish.
01:17:52.000 And then they'd kind of look up.
01:17:54.000 And then, because we're Americans, you know, when somebody stops speaking,
01:17:59.000 you feel like you have to speak right away.
01:18:02.000 And then there was, like, a minute.
01:18:04.000 And then it's like, you're kind of looking around.
01:18:06.000 Was it me?
01:18:07.000 Did I say something?
01:18:07.000 And then they would come back with this incredibly thoughtful thing.
01:18:11.000 So, you're right.
01:18:12.000 You know, I don't want to close off any possibility.
01:18:14.000 There are many different things that we could do.
01:18:18.000 The path that I have been on, and certainly with this cacao, and you're right, cacao is like a...
01:18:23.000 I mean, not a huge one, but like a mild drug, and life is a mild drug.
01:18:43.000 To a different identity, a different consciousness.
01:18:46.000 And now he said, I don't do psilocybin, but I do daily meditation, but I can see where I'd like to go, what's possible.
01:18:53.000 So I get that.
01:18:54.000 Yeah, there's a bunch of different substances out there that have very similar profound effects.
01:19:00.000 Yeah, yeah.
01:19:02.000 There's a real thought, and this is something that Terence McKenna described way back in the late 90s, early 2000s.
01:19:11.000 He believed that you're going to be able to recreate a lot of psychedelic states through virtual reality so that people that don't want to actually do a drug will be able to experience what it's like to be on that drug.
01:19:25.000 And that, I mean, it's very theoretical and hypothetical.
01:19:30.000 Yeah.
01:19:31.000 You know, who knows whether or not that's possible.
01:19:33.000 But that could be one other way that human beings interface with technology.
01:19:38.000 So humans, in my view, are far more hackable than we think.
01:19:44.000 There's so much that we just imagine about our biology as being fixed, but our biology is really variable.
01:19:51.000 Like I have a friend who's an anesthesiologist at Stanford.
01:20:15.000 Where the pixelation of virtual reality will be equal to life.
01:20:20.000 And so you're going to be in this VR space.
01:20:23.000 And it will look.
01:20:24.000 It may smell.
01:20:26.000 It could be with haptic suits.
01:20:27.000 It may feel just like life.
01:20:30.000 And our brains, I don't know if you've done these things with these VR glasses where you go and you can see people.
01:20:39.000 You're just like in a hall.
01:20:41.000 And you put on the glasses.
01:20:42.000 And now you're in an elevator going to the top on the outside, like a window cleaner's elevator to the top of this high rise.
01:20:51.000 And there's a little rickety board.
01:20:53.000 And then there's this cat at the end of the board.
01:20:56.000 And they're saying, yeah, you have to go.
01:20:57.000 Go save the cat.
01:20:59.000 And you've already seen that you're just in a hall.
01:21:02.000 You know it in your brain.
01:21:04.000 There's this cat.
01:21:05.000 Everybody's looking at you, and you've seen all these other people panic.
01:21:08.000 And you think, well, when I'm there, I'm going to be so...
01:21:11.000 I'm just going to go grab that cat.
01:21:12.000 And you're terrified.
01:21:13.000 Like, you're trying to override your lizard brain.
01:21:16.000 And your lizard brain is saying, like, no, don't step...
01:21:18.000 They have that for our HTC Vive.
01:21:21.000 Yeah.
01:21:21.000 We need to set that up, Jamie.
01:21:23.000 We need a two-by-four that we put on the ground for that.
01:21:25.000 Oh, it's incredible.
01:21:26.000 And so I think that this whole concept of reality is that our technology is going to be changing our sense of reality.
01:21:35.000 And then what's real?
01:21:36.000 Like, if you feel...
01:21:48.000 I mean, I think that there's real big issues here.
01:21:52.000 Yeah, I mean, that is the matrix, right?
01:21:55.000 And if it feels better and it's more enjoyable than real life, what is going to stop people from doing the Ray Kurzweil deal and downloading yourself into this dimension?
01:22:07.000 Not much.
01:22:09.000 Whether it's possible to do a full download or not, I mean, I think that's an open question.
01:22:14.000 But whether people are going to be more comfortable living in these alternative worlds, and whether we're going to be able to say, oh, no, that is the fake world.
01:22:23.000 Like, if you're in this virtual world, but you have friends in that world, you're interacting in that world, you have experiences that feel every bit as real in that world as in our world, and people say, oh no, that's not real.
01:22:38.000 Those aren't your friends.
01:22:39.000 Like, even now.
01:22:40.000 Like, you know, we all, people with global lives, you kind of have these friends.
01:22:43.000 Like, I have a good friend in Mongolia.
01:22:45.000 We talk all the time.
01:22:47.000 Do you ever see them in person?
01:22:48.000 Once in a while.
01:22:49.000 Like, once every year or two.
01:22:50.000 It's great to see them.
01:22:51.000 Well, that's a real person, though.
01:22:52.000 No, it's a real person.
01:22:53.000 But that's someone you actually know.
01:22:53.000 No, no, this isn't like a pen.
01:22:54.000 No, but I don't mean that.
01:22:55.000 I mean, you actually do know them.
01:22:57.000 No, absolutely.
01:22:57.000 But if it was someone that you only talked to online and they lived in Mongolia, that's where things get weird.
01:23:02.000 It's true, but let's just say, following that hypothetical, you have that person.
01:23:06.000 They're part of your whole life.
01:23:08.000 And they're with you.
01:23:10.000 They're with you through your life experiences.
01:23:12.000 You call them up when you're sad.
01:23:15.000 Is it so essential that you've met that person physically?
01:23:19.000 Is that the core idea?
01:23:23.000 It's not essential, but it means a lot.
01:23:25.000 It does.
01:23:27.000 Because we are these physical beings and we are these virtual beings, but figuring out what's the balance is going to be really tricky.
01:23:38.000 Yeah, what is the balance?
01:23:40.000 I'm worried about augmented reality, too.
01:23:43.000 When you see people that use Snapchat filters and they give themselves doggy ears and stuff like that, how long before that is just something that people choose to turn on or turn off about life itself?
01:23:56.000 You'll be able to see the world through different lenses.
01:23:59.000 The sky could be a different color.
01:24:00.000 The plants could be a different color.
01:24:02.000 Yeah.
01:24:02.000 I write about this in one of my novels, Eternal Sonata, where I think we're just going to have these contact lenses, and it'll be different kinds of information based on what people want.
01:24:13.000 I mean, like, I'll meet with you, and it'll say, all right, this is Joe, here's a little bit of background, whatever, and we'll have useful information.
01:24:21.000 Or you're walking around a city, and you'll get little alerts of things you might do, or history.
01:24:28.000 That's what they were thinking about with Google Glasses, right?
01:24:30.000 Yeah, I know, but it just was so annoying that people wanted to kill people.
01:24:33.000 It was just too weird.
01:24:34.000 Yeah, yeah, yeah.
01:24:35.000 Yeah, it was just too, everybody felt like they were getting filmed, too.
01:24:37.000 They were.
01:24:38.000 Yeah, I mean, when you're walking around with Google glasses on, you assume that people were recording everything.
01:24:43.000 It was very strange.
01:24:44.000 They were, but I think that's another thing that we're just, all of our lives are going to be recorded.
01:24:48.000 Of course.
01:24:48.000 Yeah.
01:24:48.000 Now, do you think that that's going to come in the form of a contact lens, or do you think it's going to come in the form of ski goggles that you're going to put on?
01:24:56.000 Nobody wants to look like an idiot.
01:24:59.000 That's not true.
01:25:01.000 No, but in the beginning, you talked about Michael Douglas, or my favorite one is Kurt Russell in Escape from New York.
01:25:08.000 What do you have there?
01:25:09.000 It's like this really cool tech.
01:25:11.000 He finally gets out of this Manhattan hell, and he's got this phone, and it's like this big.
01:25:17.000 So there's Mr. President's cell phone?
01:25:19.000 It's like very, very early days.
01:25:21.000 And so now we have these kind of glasses, and there's like a little bit of cachet.
01:25:26.000 Palm has a cell phone that's that big.
01:25:28.000 Have you seen that?
01:25:29.000 No.
01:25:30.000 I was at the Verizon store.
01:25:31.000 Yeah.
01:25:32.000 I think it's like an attachment or an accessory to a phone.
01:25:35.000 Yeah, it's like you can bring it with you and not bring your other phone.
01:25:40.000 That's the idea behind it.
01:25:42.000 So you could decide, well, I'm going out, let me just bring my tiny phone for essentials.
01:25:46.000 But all this tiny... I mean, quote-unquote phones are just going to get so small.
01:25:50.000 That's why I say they're going to come inside of us.
01:25:52.000 You'll have like a little contact lens, maybe a little thing in your ear, maybe like a little permanent implant behind your tooth.
01:25:58.000 Or you'll replace one of your teeth with a computer.
01:26:02.000 Pipe it right into your nerves.
01:26:04.000 Any kind of crazy stuff you can think about is probably going to happen.
01:26:06.000 Some of it will take, some of it won't.
01:26:09.000 Yeah, it seems like that's what we're going to have to see, like how it plays out.
01:26:13.000 And that's one of the things when you were talking about scientists that are working on these things, they're working on what's right in front of them.
01:26:18.000 They're not looking at the greater landscape itself in terms of what the future holds.
01:26:22.000 It's not their job, and that's why we need other people.
01:26:25.000 I certainly see myself— Who are those people?
01:26:26.000 You and who else?
01:26:27.000 Who would you elect?
01:26:28.000 If Trump came to you and said, Jamie, we've got problems.
01:26:32.000 We need to figure out the future.
01:26:33.000 What should we do?
01:26:35.000 So, certainly, we need to have a mix of different kinds of people.
01:26:39.000 And so, I—certainly, people like me who are kind of big-picture futurists, we need that.
01:27:03.000 Mm-hmm.
01:27:06.000 We need a developing world.
01:27:08.000 We need all kinds of people.
01:27:09.000 But in terms of the people who are kind of articulating the big picture of the world and what are the challenges that we're facing, I certainly put myself in that category.
01:27:18.000 People like Yuval Noah Harari who are just kind of big, also kind of big thinkers, people like Sid Mukherjee.
01:27:25.000 And I just think we have to articulate the big picture and we have to do it in a way so that people can see themselves in this story and then enter into the conversation.
01:27:38.000 Are you writing this book just to sort of educate people and let them understand exactly what is going on and that it is a really volatile and chaotic and amazing time?
01:27:52.000 And that all these things are...
01:27:53.000 Are you doing this book to...
01:27:55.000 What essentially was Hacking Darwin?
01:27:58.000 What was the motivation behind it?
01:28:00.000 Was it for...
01:28:01.000 A person like me?
01:28:02.000 Or was it for everyone?
01:28:04.000 It was for everyone.
01:28:06.000 And so what I really wanted to do...
01:28:07.000 So the background...
01:28:08.000 I can give you just a little bit of background.
01:28:10.000 So more than 20 years ago, I was working on the National Security Council.
01:28:14.000 And my then boss, Richard Clarke, was then this obscure White House official jumping up and down saying, we need to be focusing on terrorism and Al-Qaeda and Bin Laden.
01:28:26.000 And he was trying to tell everybody...
01:28:28.000 And nobody was paying attention to it.
01:28:29.000 It was totally marginalized.
01:28:31.000 And when 9-11 happened, Dick's memo was on George Bush's desk saying exactly that.
01:28:37.000 We need to focus on Al-Qaeda.
01:28:38.000 Here's what's going to happen.
01:28:39.000 And Dick, even before then, would always tell me that if everyone in Washington was focusing on one thing, you could be sure there was something much more important that was being missed.
01:28:47.000 And so more than 20 years ago, I was looking around.
01:28:50.000 I saw these little pieces of disparate information, and I came to the conclusion that the genetics revolution was going to change everything.
01:28:55.000 So I educated myself.
01:28:57.000 I started writing articles.
01:28:59.000 I was invited to testify before Congress.
01:29:02.000 And then to try to get that story out, I wrote my two most recent near-term sci-fi novels, Genesis Code and Eternal Sonata.
01:29:10.000 And when I was on book tours for those, and I explained the science to people, the way I kind of a self-educated citizen scientist...
01:29:18.000 And a novelist would explain the science.
01:29:20.000 All of a sudden, people got it.
01:29:22.000 And that was when I realized I needed to write a book about the genetics revolution that people could absorb, that wouldn't scare people.
01:29:28.000 But my mission for the book is that this stuff, as we've talked about, it's so important.
01:29:35.000 Everybody needs to be part of the conversation.
01:29:37.000 We have this brief window.
01:29:39.000 And what I'm calling for is this species-wide dialogue on the future of human genetic engineering.
01:29:44.000 And I have a whole game plan in the book about what people can do, how they can get involved, people individually, on a national level.
01:29:53.000 We have to put a lot of pressure on our elected leaders and say, stop focusing on the crap.
01:29:57.000 There is really important stuff that needs to be addressed, and we need leadership.
01:30:02.000 I'm speaking in Congress a week and a half from now, talking about these issues.
01:30:10.000 So we need to have...
01:30:11.000 And on an international level, we have to have some kind of international system.
01:30:16.000 We're so far away from being able to do that.
01:30:18.000 We don't even know what the standards are, but we have to be pushing.
01:30:21.000 So you think of it in terms of the same way we have with nuclear weapons?
01:30:26.000 Yeah.
01:30:26.000 In many ways, yeah.
01:30:27.000 But the thing is, with nuclear weapons, a lot of that happened at the state level, at the country level.
01:30:32.000 This needs to happen at a popular level and at a government level.
01:30:37.000 So the only way that that's going to happen effectively is we need real comprehensive education on this subject.
01:30:42.000 It's not something that people can just guess, right?
01:30:45.000 Absolutely.
01:30:45.000 They need to know what's the consequences, where we're at right now, right?
01:30:49.000 Yeah.
01:30:50.000 Yeah, and that's like Sanjay Gupta had a wonderful quote that's actually on the cover of my book, which is, if you can read one book on the future of our species, this is it.
01:30:57.000 So what I've tried to do is to say, like, if you just want to go to one place to understand what's happening, what's at stake, what it means for you, and what you can do now if you want to get involved, I've tried to do that.
01:31:09.000 Ironically, I'm now, as I mentioned, being asked to speak to thousands of doctors and scientists because they're all reading this book and they're saying, this is positioning my work in a much bigger context.
01:31:20.000 Sanjay Gupta is a very interesting cat because he was very anti-marijuana and then started doing research on it and then totally flipped 180 degrees, which to me is a great sign of both humility and intelligence.
01:31:32.000 He recognized that the data was different than his presuppositions.
01:31:36.000 Yeah.
01:31:37.000 He had these prejudices that were very common.
01:31:39.000 Yeah.
01:31:40.000 Now, when you speak to Congress, do they brief you in terms of what they would like specifically for you to address?
01:31:48.000 No.
01:31:49.000 So this one, I've been asked to go and speak, and a lot of members of Congress are going to be invited.
01:31:55.000 And what I'm going to tell them is, look, this...
01:32:04.000 What are you going to say to try to really get it into their head?
01:32:07.000 What I'm going to say is that the genetics revolution is here.
01:32:11.000 If we don't have a system, if we don't have a rational system to manage it, if we don't have a system, you talked about public education, the challenge that we face in the United States is we traditionally have had a representative democracy.
01:32:24.000 And now we're transitioning from a representative democracy to a popular democracy.
01:32:29.000 So Switzerland has a popular democracy, but they have really well-educated people who have enough information to make smart decisions.
01:32:36.000 We haven't educated our public, and yet the public is making big decisions, and a lot of it is happening just on a gut feeling.
01:32:43.000 That's what's happening with trade agreements, where people just have a feeling it's bad, without the ability to really get into the details.
01:32:52.000 And so we are having that transition, which means there's a lot of responsibility on us to educate our public.
01:32:57.000 And it's a tragedy.
01:32:58.000 We treat people in this country like you can just throw people away.
01:33:02.000 Like if you're in some crappy school system and your chances of success are so minimized, not because of anything that you've done, just because of your circumstances.
01:33:13.000 And it's unacceptable.
01:33:15.000 It is unacceptable.
01:33:15.000 I mean, equal opportunity is what we really should all strive for.
01:33:19.000 And I think some people conflate that with equal success.
01:33:25.000 And you're not going to get the same equality of outcome.
01:33:28.000 You're going to get different amounts of effort and different people are qualified or more talented at different things.
01:33:35.000 But what I'm worried about is what I said initially, that some people are going to get a hold of this stuff quickly, and it's going to give them a massive advantage.
01:33:46.000 The first person that has the ability to go forward in time five minutes is going to be able to manipulate the stock market in an unprecedented way.
01:33:57.000 I don't think that that's really possible in our lifetime, but that's the kind of thing I'm talking about.
01:34:01.000 You could get so far ahead that if we're talking about competition, there will be no catching up.
01:34:06.000 But you don't have to travel in time to do that.
01:34:09.000 Right.
01:34:09.000 But I'm saying if that was a technology.
01:34:13.000 But there's real technologies that are likely to happen, which are going to confer billions, tens, hundreds of billions of dollars of benefit.
01:34:22.000 Of advantage.
01:34:22.000 Yeah.
01:34:22.000 And that stuff is happening now.
01:34:25.000 And this is the concern with getting behind countries like China.
01:34:28.000 Because if they get ahead of us in something like that, they're already moving in this direction in terms of technology.
01:34:34.000 Yeah, so I am a proud American.
01:34:36.000 My father and grandparents came here as refugees.
01:34:39.000 I believe in what this country at our best stands for.
01:34:42.000 I want us to continue to be the country that's setting an example for the rest of the world, that is articulating our ideals of responsibility, good governance, and accountability, and all these things that we've championed.
01:34:57.000 And because of that, I want us to get our act together politically and I want us to be the leading technological country in the world.
01:35:06.000 And so I think that's what's at stake and we're losing so much time because there was a time in the period after the Second World War where we recognized that technological leadership was the foundation for everything else.
01:35:18.000 We had recreated the world out of the ashes of the war.
01:35:22.000 But we realized that we needed to have the economic growth.
01:35:24.000 We needed to have the competition.
01:35:25.000 We needed to have these technologies.
01:35:27.000 And it was a miracle what we've done.
01:35:30.000 And now we've lost our focus and we have to regain it.
01:35:34.000 The way I look at humans and the way I look at the human race today in 2019 is like we're driving very fast through fog, and it's very difficult to see what's in front of us.
01:35:44.000 When I look back at – I don't know if you've ever read any H.G. Wells, but some of his predictions about the future are really interesting, because he was pretty close on quite a few things.
01:35:55.000 But that vision, to be able to sit there and use your imagination, close your eyes and think, what is this going to be like?
01:36:03.000 What we're dealing with now, as opposed to what?
01:36:07.000 2119. Right.
01:36:08.000 Which is similar to H.G. Wells versus us, right?
01:36:12.000 What the fuck is that going to be like?
01:36:14.000 Yeah.
01:36:15.000 Is there going to be a time where there are no diseases, there is no death, and that we just have to regulate population control in some sort of other manner?
01:36:26.000 And the only way people die, they're going to die from accidents and things along those lines.
01:36:30.000 But mortality in terms of old age?
01:36:34.000 According to David Sinclair, this is a fixable issue.
01:36:38.000 It's a matter of when they fix it.
01:36:40.000 Do they fix it in 20 years or 30 years or 50 years?
01:36:43.000 Maybe.
01:36:43.000 David is a friend and I have a whole chapter.
01:36:46.000 I don't want to misquote him either.
01:36:47.000 I have a whole chapter in the book on the science of human life extension.
01:36:52.000 So I think definitely it's real that we're going to live healthier longer.
01:36:56.000 We're going to harness our technology for that.
01:36:58.000 I don't think that immortality, that biological immortality is in the cards for us.
01:37:03.000 Maybe not immortality because we'll still be biologically vulnerable.
01:37:06.000 We'll have hearts and brains and all that stuff.
01:37:08.000 Yeah, I think we will age slower and we will live healthier longer and I think it's going to be great.
01:37:14.000 But back to your core point, I mean that's the reason why I also write science fiction is that the world of science is changing so fast that we really need to apply a lot of imagination to imagine where it's going.
01:37:27.000 Because if you're just looking at what's happening now, it's like this train is going to speed by you.
01:37:33.000 We have to kind of imagine – it's like Wayne Gretzky.
01:37:34.000 We have to imagine where the puck is going to be, not where it is now.
01:37:38.000 And I mentioned George Church, who's like – he's at Harvard.
01:37:41.000 He's like the living Charles Darwin.
01:37:43.000 And I do a lot of speaking alongside George.
01:37:46.000 And it's become our little thing that he says that he reads science fiction.
01:37:51.000 Like mine, and then says, well, that's pretty cool.
01:37:53.000 How can we do that?
01:37:54.000 And what I do is I look at the research coming out of labs like George's, and I say, all right, well, that's where we are now.
01:38:01.000 What's that going to mean in 20, 50, 100 years?
01:38:05.000 Science fiction plays a more important role than it ever has in kind of imagining where we're going, and it's that imagining that allows us to try to say, well, what if that's one of the options of where we're going?
01:38:21.000 Yes.
01:38:24.000 Yes.
01:38:26.000 Yes.
01:38:28.000 Yes.
01:38:37.000 It's going to change us in a way that is something that we're not really prepared to understand or deal with.
01:38:45.000 What do you think that's going to be?
01:38:45.000 I think it's going to be predictive genetics.
01:38:48.000 Right now, it's like you go to your doctor when you're sick.
01:38:54.000 This could have been some genetic disorder that you had from the moment you were conceived.
01:38:59.000 And it was ticking.
01:39:00.000 And it was ticking.
01:39:01.000 And you showed up 50 years later when that's been manifest.
01:39:05.000 So it's going to be very different.
01:39:07.000 You're taking your kid home from the hospital, your newborn.
01:39:32.000 And the doctor says, When we have all of that information.
01:39:35.000 And there are things now that we call fate.
01:39:37.000 And it's just a different model.
01:39:39.000 And so I think that, and once we have that, that's going to change a lot of things.
01:39:43.000 It's going to fundamentally transform our healthcare.
01:39:45.000 What we call healthcare now is really sick care.
01:39:47.000 You show up with a symptom.
01:39:49.000 And this is going to be predictive.
01:39:50.000 And it's going to change the way we make babies because people are going to have real choices about which embryos to implant.
01:39:58.000 And we're going to have a lot of information about a lot of really intimate stuff.
01:40:02.000 So you feel like genetic manipulation and genetic engineering, genetic understanding, genetic knowledge, and then applied genetic medicine.
01:40:10.000 Yes.
01:40:10.000 Those are going to be the big changes in the next 20 years, even more so than technology?
01:40:14.000 Well, it's interconnected because there's really, it's like a super convergence of these technologies.
01:40:19.000 So the genetics revolution is the artificial intelligence revolution in the sense that the complexity of genetics is so great.
01:40:27.000 It's way beyond what our brains on their own could process.
01:40:31.000 And so really all these technologies are touching each other.
01:40:35.000 And so the biological models are now influencing the AI. So for example, we are coming to the limits of silicon storage.
01:40:45.000 But DNA has unlimited storage capacity.
01:40:49.000 So it's, as I've said before, the boundaries between biology and AI, or genetics and AI, is going to be very blurry.
01:40:58.000 Yeah, that is an interesting concept, right?
01:41:01.000 The idea of storing information in DNA. And that has been discussed.
01:41:05.000 Yeah, DNA is the greatest information storage mechanism ever imagined.
01:41:11.000 But the question is, what happens when you do store things in there, and how does that information interact with all the rest of the stuff that's in your body already?
01:41:17.000 Well, I mean, if you can do it in your body, it doesn't have to be in your body.
01:41:21.000 Right, it doesn't have to be external.
01:41:21.000 But just think, like, your DNA has four billion years of history, and it's done a great job of recording it.
01:41:28.000 It's incredible.
01:41:28.000 Yeah.
01:41:28.000 I have old 8-track tapes.
01:41:31.000 They haven't lasted.
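[The DNA-storage idea discussed above comes down to treating the four bases as digits. A toy sketch of that base-4 encoding, purely illustrative — real DNA data-storage schemes, like those demonstrated in George Church's lab, add error correction and avoid long runs of the same base:]

```python
# Toy illustration of DNA data storage: each byte splits into four 2-bit
# pairs, and each pair maps to one of the four bases A, C, G, T.
# So one byte of data becomes four bases of synthetic DNA.

BASES = "ACGT"

def encode(data: bytes) -> str:
    """Encode bytes as a DNA string, four bases per byte."""
    out = []
    for b in data:
        for shift in (6, 4, 2, 0):          # walk the byte two bits at a time
            out.append(BASES[(b >> shift) & 0b11])
    return "".join(out)

def decode(dna: str) -> bytes:
    """Invert encode(): read four bases back into one byte."""
    out = bytearray()
    for i in range(0, len(dna), 4):
        b = 0
        for base in dna[i:i + 4]:
            b = (b << 2) | BASES.index(base)
        out.append(b)
    return bytes(out)
```

[At two bits per base, a gram of DNA can in principle hold on the order of hundreds of petabytes, which is the density argument behind the "limits of silicon" point in the conversation.]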
01:41:32.000 That is a squirrely concept that you have all that data inside your head.
01:41:36.000 I mean, that's also when people make, when they try to understand instincts that people have.
01:41:43.000 These are some sort of genetically encoded memories or some understanding of things that are dangerous.
01:41:48.000 Right.
01:41:49.000 And that these, they're in there because this is how we've learned over the years without actually having to experience these things personally.
01:41:56.000 Yeah.
01:41:56.000 Yeah, no, so that's, it's baked in.
01:41:59.000 Our genetics are baked into us.
01:42:01.000 And so, you know, I don't know if you've been to Indonesia before.
01:42:04.000 I was in Indonesia and I went to this place called Komodo Island.
01:42:08.000 Oh, wow.
01:42:08.000 Where the dragons are?
01:42:09.000 It's the Komodo dragons are.
01:42:10.000 And it was fascinating because it's like you can tell they don't have plaintiffs' attorneys.
01:42:13.000 So you're just walking around.
01:42:14.000 There are all these Komodo dragons.
01:42:15.000 Yeah, these are like the most deadly creatures on earth.
01:42:18.000 And there's like some little guy with a little stick.
01:42:19.000 And it's like, how effective is that stick?
01:42:22.000 So you're just walking around?
01:42:24.000 You're just walking around.
01:42:25.000 Because the Komodo dragons, when they're not killing people or killing animals, they're just sitting there.
01:42:29.000 Oh, Jesus Christ.
01:42:30.000 So it's pretty scary.
01:42:31.000 Do they ever get jacked?
01:42:32.000 Do people ever go there and get bitten?
01:42:34.000 Yes.
01:42:34.000 And they say, oh, it's only a few times a year.
01:42:36.000 It's like, wow, a few times a year.
01:42:37.000 That seems like a lot.
01:42:38.000 A few times a year is a lot.
01:42:39.000 Anyway, but the way it works for a Komodo dragon, a mother lays the egg and then buries the egg and then forgets where the egg is.
01:42:48.000 And then let's just say that this egg hatches and this little Komodo dragon comes out and the mother sees her own baby Komodo dragon.
01:42:57.000 She'll eat it in a second.
01:42:59.000 Oh, Jesus.
01:43:00.000 And so if you're a Komodo dragon, you better have your entire survival strategy baked into your DNA because nobody's teaching you anything.
01:43:09.000 And so for us, we have this sense that it's like parenting is really important.
01:43:14.000 It is.
01:43:15.000 Environment is really important.
01:43:16.000 It is.
01:43:16.000 But so much of who and what we are is baked into our genetics.
01:43:20.000 And I think that's going to be this challenge.
01:43:23.000 We're going to see ourselves as increasingly genetic beings.
01:43:26.000 We can't become genetic determinists, think that we're just genetics, but we're going to know a lot more.
01:43:31.000 We're going to demystify a lot of what it means to be a human.
01:43:34.000 Poof.
01:43:35.000 Yeah.
01:43:36.000 Poof is right.
01:43:37.000 But are we going to lose the romance and just the randomness of life because that's what people are concerned with, right?
01:43:44.000 Yeah.
01:43:49.000 Especially in particular with things like intelligence and athletic performance.
01:43:53.000 We're not going to appreciate freaks as much.
01:43:55.000 Or maybe we'll all want to be freaks because the freaks are the ones who push us.
01:44:00.000 You're not going to want to be a moron.
01:44:02.000 Well, your question, it's the essential question.
01:44:06.000 It's like, what makes a human?
01:44:08.000 A human isn't just someone with a higher IQ. That doesn't make you a better human.
01:44:11.000 That makes you someone with a higher IQ. But how are we going to think about constructing societies when it's up to us?
01:44:17.000 Like, if we are going to say we value certain people, certain ideas, I think we're going to need artists.
01:44:24.000 Like right now, people like artists are sometimes in the mainstream.
01:44:28.000 Sometimes they're on the fringe.
01:44:29.000 But artists are going to be maybe the most important people in this new world.
01:44:32.000 And like right now in hospitals, we have kind of a hierarchy.
01:44:36.000 And like the most technical people are the people who are valued the most.
01:44:39.000 And the least technical people, like some of the nurses or nurses' aides, are the people who are often valued and paid the least.
01:44:46.000 But when technology can do these technological feats, what's going to be left is how can we be great humans?
01:44:55.000 How can we emote?
01:44:56.000 How can we connect?
01:44:57.000 How can we create art?
01:44:58.000 And if we get swept away by this tide of science, and you know how excited I am about the science, we could really undermine our humanity.
01:45:08.000 Right.
01:45:09.000 And as for humans, what humans value is many aspects of that humanity.
01:45:13.000 The art, the creations.
01:45:15.000 Yeah.
01:45:16.000 Yeah.
01:45:19.000 Yeah.
01:45:40.000 It's still miraculous.
01:45:42.000 And we need to celebrate that.
01:45:43.000 And we can't allow ourselves to say that we are just our genetics or even just our biology.
01:45:48.000 But we also can't just say biology has nothing to do with it.
01:45:52.000 And especially because we're going to know more about our biology and about our differences.
01:45:57.000 And that's normal.
01:45:58.000 I mean, it used to be in the old days that everyone thought, well, God is weather.
01:46:02.000 And now we understand weather pretty much, and nobody's saying, oh, that lightning, that's God delivering a message.
01:46:07.000 It could be.
01:46:08.000 But we still have that mystery.
01:46:11.000 And I think that in some ways it's about our orientation.
01:46:14.000 Like, how do we make sure that we keep this view of life, that we have artists and humanists who are just at the core of this conversation about where we're going?
01:46:25.000 What if that mystery ultimately turns out to just be ignorance, and that as you develop more and more understanding, there's less and less mystery?
01:46:31.000 Would we like to be less smart?
01:46:33.000 Would we like to be more overwhelmed by possibility?
01:46:38.000 It could be.
01:46:39.000 Could that be part of what romance is?
01:46:41.000 It could be.
01:46:42.000 And certainly, like the unknown, we wake up every morning.
01:46:46.000 Sure.
01:46:46.000 And we just don't know the answer.
01:46:47.000 And there are some people, going back to the issues of life extension, there are some people who say, well, that death is essential for appreciating life.
01:46:57.000 I talk about this stuff all around.
01:46:59.000 And then there are people who say, you're talking about eliminating these terrible diseases, but I know somebody who had that terrible disease.
01:47:08.000 And their suffering was a gift to everybody else because we all had more humanity in response to their suffering.
01:47:14.000 I was like, well, that's kind of screwed up.
01:47:15.000 I'd prefer them to not have that suffering.
01:47:17.000 Those people are thinking wacky.
01:47:19.000 It's wacky.
01:47:20.000 But we need to – I totally agree with you that if we allow ourselves to get swept away with this kind of scientific determinism, if we don't say we really value our humanistic traditions, our artists, our cultures, we could get lost.
01:47:37.000 We could become obsolete.
01:47:38.000 We could become obsolete, but we could also just become less human, and there's something wonderful, and there's magic in humans.
01:47:43.000 Do you think that monkeys used to think, man, we can't become a human, we become less monkey?
01:47:48.000 Yeah.
01:47:48.000 Do you know what I'm saying?
01:47:49.000 But no being looks in the mirror and recognizes that they are evolving.
01:47:55.000 We've only been homo sapiens for about 300,000 years.
01:47:59.000 So we just, it's hard.
01:48:01.000 We know where we've come from because you see all those little charts from high school biology.
01:48:05.000 But it's really hard for people to imagine being something else in the future.
01:48:10.000 It's outside of our consciousness.
01:48:12.000 It's H.G. Wells squared.
01:48:14.000 Yeah, and we are monkeys.
01:48:15.000 It's just that we've redefined our monkey thing.
01:48:17.000 We do it in a little different way.
01:48:20.000 That's what I was kind of getting at.
01:48:21.000 Are you concerned at all with artificial life?
01:48:24.000 Are you concerned about the propagation of artificial intelligence?
01:48:28.000 Well, there are different kinds of artificial life.
01:48:30.000 So, one is artificial intelligence, and I know people like Elon Musk and late Stephen Hawking are afraid.
01:48:38.000 Terrified.
01:48:38.000 Yeah.
01:48:39.000 And I think that we need, whether it's right or not, I think it's great for us to focus on those risks.
01:48:45.000 Because if we just say, oh, that's crazy, and we don't focus on it, it increases the likelihood of these bad things happening.
01:48:52.000 So, kudos to Elon Musk.
01:48:54.000 But I also think that we're a long way away from that threat.
01:48:59.000 And we will be enormous beneficiaries of these technologies.
01:49:04.000 And that's why, I don't want to sound like a broken record, but that's why I keep saying it's all about values.
01:49:09.000 I think we should take those threats very seriously.
01:49:11.000 Values are so abstract and we don't agree on them.
01:49:14.000 It's true, but like Elon Musk, I mean, they've set up this institute to say, well, what are the dangers?
01:49:20.000 And then what are the things that we can do now?
01:49:22.000 What are standards that we can integrate, for example, into our computer programming?
01:49:26.000 And so I mentioned my World Health Organization committee.
01:49:29.000 The question is, well, what are the standards that we can integrate into scientific culture that's not going to cure everything, but may increase the likelihood we'll have a better rather than worse outcome?
01:49:39.000 Right.
01:49:39.000 But isn't there an inherent danger in other companies or other countries rather not complying with any standards that we set because it would be anti-competitive?
01:49:48.000 Yes.
01:49:48.000 Like that would somehow or another diminish competition or diminish their competitive edge.
01:49:54.000 It's true.
01:49:55.000 And that's the balance that we're going to need to hold.
01:49:59.000 And it's really hard.
01:50:00.000 But we have a window of opportunity now.
01:50:04.000 Now, to try to get ahead of that.
01:50:06.000 And like I said, we have chemical weapons, biological weapons, nuclear weapons, where we've had international standards that have roughly held.
01:50:13.000 I mean, there was a time when slavery was the norm, and there was a movement to say, this is wrong, and it was largely successful.
01:50:20.000 So we have history of being more successful rather than less.
01:50:25.000 And I think that's the goal.
01:50:27.000 But you're right.
01:50:28.000 I mean, this is a race between the technology and the best values.
01:50:32.000 My real concern about artificial intelligence is that this paradigm shifting moment will happen before we recognize it's happening, and then it'll be too late.
01:50:40.000 Yes, that's exactly right.
01:50:42.000 And that's, like I was saying, that's why I've written the book, that's why I'm out on the road so much talking to people, why it's such an honor and pleasure for me to be here with you talking about it, because we have to reach out to people, and people can't be afraid of entering this conversation because it feels too technical or it feels like it's somebody else's business.
01:51:01.000 This is all of our business because this is all of our lives and it's all of our futures.
01:51:06.000 So if in the future you think 20 years the thing that's going to really change the most is predictive genetics and to be able to predict accurately a person's health, what do you think- Health and life.
01:51:19.000 The biggest detriment for all this stuff and the thing that we have to avoid the most.
01:51:24.000 Yeah.
01:51:24.000 So one is, as I mentioned, this determinism.
01:51:27.000 Just because if we just kind of take our sense of wonder about what it means to be a human away, like that's really going to harm us.
01:51:37.000 We talked about equity and access to these technologies.
01:51:41.000 And the technologies don't even need to be real.
01:51:44.000 In order to have a negative impact.
01:51:48.000 So in India, there are no significant genetic differences between people in different castes, but the caste system has been maintained for thousands of years because people just have accepted these differences.
01:52:00.000 So it's a whole new way of understanding what is a human.
01:52:05.000 And it's really going to be complicated.
01:52:07.000 And we aren't ready for it.
01:52:08.000 We aren't ready for it culturally.
01:52:09.000 We aren't ready for it educationally.
01:52:12.000 Certainly our political leaders aren't paying much of any attention to all of this.
01:52:16.000 We have a huge job.
01:52:17.000 Oof.
01:52:18.000 Oof.
01:52:19.000 So when you sit down and you give this speech to Congress— Yeah.
01:52:24.000 What are you anticipating from them in terms of like, do you think that there's anything that they can do now to take certain steps?
01:52:32.000 Yes.
01:52:33.000 So a few things.
01:52:34.000 One is we need to have a national education campaign.
01:52:38.000 I mean, this is so important.
01:52:39.000 I would say it's on the future of the genetics revolution and of AI, because I think it's crazy.
01:52:49.000 I learned French in grade school and high school, and I'm happy to speak French.
01:52:54.000 But I would rather have people say, this is really important stuff.
01:52:59.000 So that's number one.
01:53:01.000 Number two is we need to make sure that we have a functioning regulatory system.
01:53:07.000 In this country, in every country, and I do a lot of comparative work.
01:53:11.000 And like the United Kingdom, they're really well organized.
01:53:13.000 They have a national healthcare system, which allows them at a national level to kind of think about long-term care and trade-offs.
01:53:22.000 In this country, the average person changes health plans every 18 months.
01:53:26.000 And I was talking with somebody the other night, and they were working on a predictive health company.
01:53:32.000 And they said their first idea was they were going to sell this information to health insurers because, like, wouldn't this be great if you're a health insurer and you had somebody who was your client and you could say, hey, here's some information.
01:53:45.000 You can live healthier and you're not going to have this disease 20 years from now.
01:53:49.000 And what he found out is the health insurers couldn't have cared less, because people were only going to be part of it for a year and a half.
01:53:55.000 So we really need to think differently about how do we invest in people over the course of their lives.
01:54:02.000 Certainly, education is one, but thinking long-term about health and well-being is another.
01:54:06.000 What do you think is going to be the first technological innovation in terms of what's already on the pipeline right now that's going to radically alter human beings?
01:54:19.000 So radically, I think it's going to be the end of procreative sex.
01:54:24.000 And so when we stop conceiving our babies through sex, and we're selecting our embryos, that's going to open up this massive realm of possibility.
01:54:35.000 And certainly, when we expand the number of fertilized eggs that we're choosing from, that is really, I think that's kind of the killer application of genetics to the future of human life.
01:54:47.000 Do you see that being attainable to the general population anytime in the near future?
01:54:52.000 Once the technology gets established, it seems like it's going to be wealthier people that are going to have access to it first, right?
01:55:00.000 Well, it depends.
01:55:02.000 Probably, yes.
01:55:03.000 But when you think about right now, we have all these people who are born with these terrible, in many cases, deadly genetic diseases and disorders.
01:55:12.000 And what is the societal expenditure for lifetime care for all those people?
01:55:18.000 I mean, this is huge, huge amounts of money.
01:55:20.000 So if we were to prevent many of those diseases and disorders (not eliminate the people, but prevent the diseases from taking place in the first place), we could use that money to provide IVF and embryo screening to everybody, using just the economic models now.
01:55:38.000 But then there's another issue that we have to talk about.
01:55:41.000 It's really sensitive.
01:55:42.000 So I talk a lot about this.
01:55:45.000 And I talk in the book about people with Down syndrome.
01:55:49.000 I have a lot of friends who have kids with Down syndrome.
01:55:53.000 These are wonderful kids and they deserve every opportunity to thrive the same as everybody else.
01:55:59.000 And I'm really sensitive because people say, well, hey, if you're...
01:56:01.000 And I say, like, Down syndrome is largely not going...
01:56:04.000 There aren't going to be newborns with Down syndrome 10 or 20 years from now.
01:56:07.000 So people say, well, what are you saying about my kids?
01:56:10.000 Like, are you saying that if this is going to be eliminated, that my kid has less of a right to be as somebody else?
01:56:17.000 And I always say...
01:56:19.000 Absolutely not.
01:56:19.000 And we need to be extremely sensitive that we're not dehumanizing people.
01:56:24.000 But if you have 10 or 15 fertilized eggs in a lab and you have to pick which one gets implanted in the mother, and one of them has a disease like Tay-Sachs or sickle cell disease where they're going to die before they're 10 years old,
01:56:40.000 would you choose, affirmatively choose, to implant that embryo versus the 9 or 14 or whatever the number is of other ones?
01:56:48.000 And so these are really sensitive things, and we can't be blasé about them.
01:56:54.000 But we will have these choices, and we're going to have to figure out how do we make them.
01:56:59.000 Well, I think there's also a real possibility of them being able to fix that.
01:57:03.000 Yeah, and some things will be fixable, and some things won't.
01:57:07.000 And so that's why, though, for these single gene mutation disorders, there's a debate.
01:57:13.000 I was speaking in Berkeley the other day, and so I was talking about these two options.
01:57:18.000 One is embryo selection, and one is gene editing.
01:57:21.000 And so...
01:57:22.000 There were different people who got up and said, oh no, we can do embryo selection.
01:57:25.000 That's the way that we should prevent these diseases.
01:57:29.000 But gene editing, that's going too far.
01:57:32.000 That's playing God.
01:57:33.000 And so for different things, there'll be different options.
01:57:36.000 When you hear that, that that's going too far, who's usually saying that?
01:57:42.000 There's two groups.
01:57:43.000 I mean, one is certainly in the religious community.
01:57:45.000 We're saying, well, this is playing God.
01:57:48.000 But there's another kind of – it's like a progressive community who are the kinds of people who are uncomfortable with genetically modified crops, people who are saying that – There's this slippery slope that once we start making what are called germline genetic modifications,
01:58:06.000 so germline means the sperm, eggs, and embryos.
01:58:08.000 If you make a change to an adult human, it doesn't pass to their kids.
01:58:11.000 If you make a change to a sperm, an egg, or an embryo, it will pass on.
01:58:15.000 And so there are a lot of people who are saying, well, we don't understand genetics well enough to make these changes that will last forever.
01:58:22.000 I'm not in that view.
01:58:23.000 I just think that we need to be cautious and we need to weigh the risks and the benefits of everything that we do.
01:58:29.000 Do you think we do know enough about those changes?
01:58:32.000 It depends because if we're – it depends on what we're selecting against.
01:58:36.000 Like if the thing we're selecting against is some kind of terrible genetic disease that's going to kill somebody when they're a little kid, we have a lot of latitude because the alternative is death.
01:58:47.000 And that's why I was so critical of this Chinese biophysicist who genetically engineered these two little girls born in China last year, because the gene edits probably weren't even successful.
01:59:00.000 It wasn't to eliminate some disease or disorder.
01:59:04.000 He was trying to confer the benefit of increased resistance to HIV. And so I think that we need to be very mindful and we need to be doing kind of a cost-benefit analysis of the different interventions.
01:59:17.000 And there was an unintended side effect of this, they believe, a perceived potential unintended side effect, and that's increased intelligence.
01:59:25.000 Well, it's a possibility.
01:59:27.000 Possibility.
01:59:27.000 How does that work?
01:59:28.000 So there's this gene called CCR5, and when it was disrupted in some mouse studies, those mice became a little bit better at navigating mazes.
01:59:42.000 And so that was what led people to believe that this disruption of CCR5 could potentially lead to that kind of change in humans.
01:59:50.000 Nobody really knows.
01:59:51.000 There's lots of things that happen in mice that don't have analogs in humans.
01:59:55.000 And that was why it was so irresponsible, is that this scientist in secret made these gene edits.
02:00:03.000 He didn't get a proper consent from the parents.
02:00:06.000 Oh, really?
02:00:07.000 Yeah, yes.
02:00:08.000 Because it's China.
02:00:09.000 Because it's China.
02:00:09.000 He just let it ride.
02:00:10.000 Yeah.
02:00:11.000 I mean, the parents were all manipulated.
02:00:13.000 Really?
02:00:14.000 Yeah.
02:00:14.000 And so that's the thing.
02:00:16.000 So you're exactly right.
02:00:17.000 Like, we are humans.
02:00:18.000 We're nuts as a species.
02:00:20.000 And so we need to try to establish some kind of guide rails, guard rails, about what's okay, what we're comfortable with, what we're not.
02:00:29.000 Now, this guy is not an isolated incident.
02:00:33.000 There's got to be a shit ton of that going on right now as we're talking in China.
02:00:37.000 What do you think is happening over there?
02:00:39.000 You know, I think China has a lot of money, they have brilliant people, and they have a government that is hell-bent on leading the world in advanced technology.
02:00:50.000 And the scientific culture in China is just very different than it is here.
02:00:56.000 And so we know what we know, but we don't know what we don't know.
02:01:01.000 And it's a really, really big deal, because China is in many ways a Wild West.
02:01:07.000 And the technology exists to do some really big stuff.
02:01:12.000 And that's why we have to at least try to establish standards.
02:01:18.000 Will we succeed fully?
02:01:19.000 No.
02:01:19.000 But maybe we can do better than worse.
02:01:21.000 Are you anticipating seeing a lot of freaky things come out of China?
02:01:24.000 Yes.
02:01:25.000 Whoa, you said that very quick.
02:01:27.000 Yeah, it's true.
02:01:28.000 I spent a lot of time in China.
02:01:32.000 It's a different thing with China.
02:01:34.000 China has this great ancient civilization.
02:01:37.000 But they destroyed their own civilization in the Great Leap Forward and the Cultural Revolution.
02:01:43.000 They burned their books.
02:01:44.000 They smashed their own historic relics.
02:01:47.000 And so it's really, it's a society in many ways that's starting from scratch.
02:01:50.000 And so all of these norms that people get, inherit through their traditions, China in many ways doesn't have.
02:01:59.000 And so it's very different.
02:02:01.000 And China is growing.
02:02:03.000 I mean, they are increasingly powerful.
02:02:05.000 And China is going to be a major force defining the world of the 21st century.
02:02:11.000 That's why America has to get its act together.
02:02:13.000 That's a hard concept for us to grasp when we think about the fact that they had the Great Wall, they have so much ancient art and architecture.
02:02:19.000 We just assume they're a really old culture.
02:02:22.000 They are, but they wiped it out.
02:02:24.000 That's so crazy.
02:02:26.000 That's a unique perspective.
02:02:28.000 That's why if you want to see great Chinese art, you have to go to Taiwan.
02:02:31.000 Because when the Chinese nationalists left in 1949, as they were losing the Civil War, they took the treasures and they put them in the National Museum of Taiwan.
02:02:41.000 In the Cultural Revolution, the Great Leap Forward, China, the Red Guards were just smashing all of their own stuff, their own ancient history.
02:02:49.000 And now the Chinese Communist Party is saying, oh no, we're going back and we have this great 5,000-year-old culture.
02:02:55.000 In some ways it's true, but in some ways it's like an adolescent culture without these kinds of restrictions that other societies have.
02:03:03.000 That's such a unique perspective that I haven't heard before.
02:03:06.000 It makes so much sense in terms of like how frantic they are at restructuring their world.
02:03:11.000 Yeah.
02:03:12.000 And they feel that they got screwed over because there is this vague sense of Chinese greatness when you hear the word Middle Kingdom.
02:03:19.000 It's like China's the center of the world and everybody else is some kind of tributary.
02:03:23.000 And so they're monumentally pissed off.
02:03:27.000 That these colonial powers came and overpowered them and they had to make all these concessions.
02:03:32.000 They had to give land away.
02:03:33.000 And hell-bent on regaining it.
02:03:36.000 They're playing the long game.
02:03:37.000 They are playing the long game.
02:03:39.000 And we're not.
02:03:39.000 And we are not.
02:03:40.000 And we have to be mindful of it.
02:03:41.000 That's also something you can do if you have complete control of your population.
02:03:44.000 You don't have to worry about people's opinions or you can just go in the direction that you feel is going to benefit the Chinese power.
02:03:52.000 Yeah.
02:03:52.000 The power that be.
02:03:53.000 Yeah.
02:03:53.000 This is a country run by engineers.
02:03:55.000 We're a country run largely by lawyers and reality TV people, I guess.
02:04:00.000 But in China, it's run by engineers.
02:04:01.000 So there are all these problems, and the answer is always engineering.
02:04:04.000 So if you have a population problem, the answer is the one-child policy.
02:04:08.000 Environmental problem, you have Three Gorges Dam.
02:04:10.000 You don't have water in the north of China.
02:04:12.000 You build this massive, biggest water project in the world from south to north.
02:04:16.000 You want to win in the Olympics?
02:04:17.000 You engineer your population, you take kids away from their families and put them in the Olympic sports schools.
02:04:23.000 So I write about this in Genesis Code.
02:04:26.000 If you're China and you kind of have this Plato's Republic model of the world and we're going to kind of identify the genetic or maybe manipulate these genetic superstars to be our greatest scientists and mathematicians and business leaders and political leaders, like there's a model that you can imagine for how to do it.
02:04:44.000 Wow.
02:04:44.000 It makes you really nervous.
02:04:46.000 It should.
02:04:48.000 Yes.
02:04:49.000 That's the thing.
02:04:51.000 That's why, like, I just feel like with this country, we don't have time to have all these distractions.
02:04:57.000 We're focusing on junk.
02:05:00.000 Like what?
02:05:01.000 Just like all of this, you know, I'm on CNN all the time when I'm home in New York.
02:05:06.000 And I always say, like, you guys, and I'm talking about kind of geopolitical issues, China and North Korea.
02:05:11.000 What I always say is, like, you guys recognize this is porn.
02:05:14.000 Like, CNN and MSNBC, that's like one kind of porn.
02:05:18.000 And Fox and whoever else, Infowars, that's another kind of porn.
02:05:22.000 But it's all porn.
02:05:23.000 And we're drawing people's attention to these few stories.
02:05:28.000 But there's these big stories that we have to focus on.
02:05:31.000 And certainly, the rise of China is such an essential story for the 21st century because China is competing in all of these technologies.
02:05:41.000 And China, it's like, go, go, go.
02:05:43.000 I mean, people in China who were involved in the tech world...
02:05:46.000 When they go and visit Silicon Valley, uniformly they say, we cannot believe these people are so lazy.
02:05:51.000 Like, why are they not working 24 hours a day?
02:05:54.000 Why are they not issuing new products every week?
02:05:59.000 And so this is, I mean, they are racing.
02:06:03.000 And it's going to have huge implications for the world.
02:06:06.000 And so if we believe in our values, as I believe we should, we have to fight for them.
02:06:11.000 And the place that we have to fight for them first is here.
02:06:19.000 This drama, this reality TV drama of our government is another day where we're not focusing on the big things.
02:06:25.000 How are we going to get our act together?
02:06:26.000 How are we going to lead the world in technology?
02:06:28.000 Another example is immigration.
02:06:31.000 We have this whole fight of how do we keep people out.
02:06:33.000 What I'd like to do is to go to the State Department and say, all right, every embassy in the world, you have a new job.
02:06:40.000 We're going to give you whatever number, 500 slots per year.
02:06:43.000 You have to, in your country, find the 500 most brilliant, talented, creative, entrepreneurial people and say, we're giving you a green card.
02:06:52.000 We're going to give you a little starter money.
02:06:54.000 We want you to move to the United States and just start a life and have kids.
02:06:58.000 And we should be skimming the cream
02:07:03.000 of the rest of the world.
02:07:05.000 Like, we could take over, we could revitalize this country, but we're having this fight of how do we keep a small number of refugees out?
02:07:12.000 And it's just, we're not focusing on the right things.
02:07:15.000 That's, again, another very, very interesting perspective.
02:07:19.000 We learned about Huawei in this country, really.
02:07:24.000 Well, I learned about it because they put out some pretty innovative phones and some interesting technology.
02:07:29.000 But we learned it because the State Department was telling people to stop using their phones.
02:07:34.000 Do you think that that is trying to stifle the competition?
02:07:38.000 Because the market share that they have, if they do really have the number two selling cell phones in the world now, that's not from America.
02:07:47.000 America's largely out of that conversation.
02:07:50.000 And if they were in America, they would probably dominate in America as well.
02:07:54.000 Because they're cheaper.
02:07:55.000 Yeah, and they're really good.
02:07:57.000 I mean, their phones are insane.
02:07:59.000 The cameras in their phones are off the charts.
02:08:02.000 They put some video of the zoom capability of their newest phone, and people were calling bullshit.
02:08:10.000 There's no way.
02:08:10.000 That's not even possible.
02:08:11.000 But it turned out it was true.
02:08:13.000 It really can zoom like a super expensive telephoto lens.
02:08:17.000 Sure.
02:08:18.000 Yeah, yeah.
02:08:19.000 So, Huawei, it's a complicated story.
02:08:22.000 For sure, the founder of Huawei is a former Chinese military officer.
02:08:27.000 For sure, in the early stages of their company, they stole, straight out stole, lots of source code from companies like Cisco.
02:08:37.000 For sure, we should be really worried if Huawei is the sole supplier of the infrastructure that supports 5G all around the world because the Chinese government would have access to everything.
02:08:52.000 And so that leads us to the question is, one...
02:08:54.000 Is there a problem with Huawei itself?
02:08:57.000 But then two is, let's just say, and I think the answer to that first question is probably yes.
02:09:04.000 But then the question two is, let's just say Huawei is a legit company and they're not totally intimately connected to the Chinese government.
02:09:13.000 Can we trust their relationship with the Chinese government?
02:09:16.000 And the Chinese government has a rule that every one of these big Chinese national champion companies has a Communist Party cell inside of it.
02:09:28.000 I think that we can't think of big Chinese companies just like we think of companies here.
02:09:34.000 We have to think of them as quasi-state actors.
02:09:37.000 That's why this fight that's happening right now is so important.
02:09:41.000 That's why when China is out investing in different parts of the world, including Africa, their companies are kind of acting like arms of the government.
02:09:50.000 They're making all kinds of investments that don't really make sense if you just see, well, this is a company doing something.
02:09:57.000 If you say that this is a company with backing by the state that's fulfilling a function that supports the state, it's a very different model.
02:10:05.000 So I am actually quite concerned about Huawei, and I'm not a fan of everything that this administration is doing, but I think on China, it's important that we need to stand up, and I think pushing back on Huawei is the right thing to do.
02:10:19.000 I'm uncomfortable about this for two reasons.
02:10:21.000 One, I'm uncomfortable about that, about the Chinese government being inexorably connected to this global superpower in technology.
02:10:28.000 But I'm also uncomfortable that it sets a precedent for other nations to follow.
02:10:33.000 Because they're like, look, this is the only way to compete.
02:10:35.000 Because what you were talking about, the investments that Huawei or that the Chinese government makes in these other countries and that don't seem to make sense if you're just dealing with a company.
02:10:44.000 Right, right.
02:10:44.000 But if you're dealing with someone who's trying to take over the world, it makes a lot of sense.
02:10:50.000 Yeah, and so when we have our companies, you're out in some place in Africa and you're competing with a Chinese company to do something, build a port or whatever.
02:10:59.000 And you're competing because you are an American company.
02:11:02.000 And so you have your calculation: this is the port, what's the income stream going to be from it?
02:11:06.000 And you have a certain amount that you can bid, because otherwise it becomes a bad investment.
02:11:11.000 But if the Chinese company, their calculus is not, is this a good or bad investment?
02:11:16.000 It's what is the state interest in controlling or quasi-controlling this asset?
02:11:21.000 And so that's why we can't project ourselves onto the Chinese.
02:11:26.000 We can't say they're just like us, just different.
02:11:29.000 We have different models and our models are competing.
02:11:31.000 Do you think that we should avoid Huawei products like consumers should?
02:11:35.000 Well, I think the government should very tightly regulate products like Huawei products.
02:11:43.000 Because with some of their other network products, like routers, they've shown that they're using them to extract information.
02:11:51.000 And so we have a long history of European, Japanese, South Korean companies that have invested very well.
02:11:58.000 They've out-competed us, and we've allowed the Japanese companies to out-compete our auto manufacturers, and that was fine.
02:12:06.000 In the 1970s, our cars had become shit because we had this monopoly.
02:12:12.000 And so I'm all for open competition.
02:12:14.000 I'm all for free trade.
02:12:16.000 It has to be fair.
02:12:18.000 But I think that what China is doing, China recognized as a state that they could use the tools of capitalism to achieve state ends.
02:12:27.000 And I think we need to be very cautious about that.
02:12:29.000 That's interesting that you compare it to the automotive market because the consequences are so much different, right?
02:12:34.000 So much different.
02:12:35.000 But we do have a model to go on.
02:12:37.000 We could see what happened.
02:12:38.000 We made shitty cars.
02:12:40.000 The Japanese took over.
02:12:40.000 Yeah.
02:12:41.000 And then we made better cars.
02:12:42.000 I have a rental car here in Los Angeles, and I went to the rental car place at LAX, and they had all of the different cars.
02:12:52.000 And there was like a Nissan and a Toyota, and there was a Cadillac.
02:12:58.000 And I thought, you know, I said, I'm going to go with the Caddy.
02:13:01.000 So it's a great car.
02:13:02.000 Oh, they're amazing.
02:13:03.000 They're incredible.
02:13:04.000 Yeah, American cars are very good now.
02:13:06.000 They're great!
02:13:07.000 And so I'm all for competition, but I just feel like what some Chinese companies are doing, it's not competition.
02:13:14.000 They have become, not all of them, but quasi-state actors.
02:13:18.000 And if that's what they're doing, I think we need to respond to them in that way.
02:13:23.000 Interesting.
02:13:24.000 What else should we be concerned with?
02:13:25.000 Should we be concerned with anything that North Korea is doing?
02:13:28.000 Oh, absolutely.
02:13:28.000 So I have spent a lot of time in North Korea.
02:13:32.000 Yeah, that's why.
02:13:33.000 Yeah, so I've advised the North Korean government on the establishment of special economic zones, which I certainly believe if North Korea could have economic growth and integrate into the rest of the world, that would be great.
02:13:47.000 When was this that you went over there?
02:13:48.000 This was in 2015, but I've been there twice, crossed the border from China and zigzagged the country by land, visited 10 or 12 different sites, so spent almost two weeks by land.
02:14:00.000 What was that like?
02:14:01.000 Incredible.
02:14:02.000 I mean, North Korea, one, it's the most organized place I've ever seen.
02:14:08.000 I mean, there's not anywhere.
02:14:10.000 There's like on the side of the road, the stones are all raked.
02:14:14.000 There's not a stick.
02:14:15.000 Every little line is drawn.
02:14:17.000 It's like total control.
02:14:19.000 In the agricultural areas, there were very few machines and very few farm animals.
02:14:24.000 So I saw people pulling plows.
02:14:26.000 Like, you know, you usually have the animal in front of the plow and the person behind here.
02:14:30.000 There were like two people in front of the plow and one person behind.
02:14:33.000 The people were the animals.
02:14:37.000 They didn't have many farm animals because a lot of the animals got eaten when they had their famine.
02:14:43.000 And so we visited these different sites for these special economic zones.
02:14:47.000 And they would say like what they had done and what they were thinking about doing.
02:14:51.000 And I would say like, how do you – do you know anything about the market?
02:14:55.000 Like what are you going to sell here?
02:14:57.000 And they said, well, we know about clearing land and building a fence.
02:15:01.000 And then we went to Pyongyang and I spoke to about 400 economic planners and I said, look, I know you have these plans to do these special economic zones.
02:15:10.000 It's totally going to fail.
02:15:11.000 The way it's going to work, you have to connect to the market economy.
02:15:14.000 You have to empower your workers.
02:15:15.000 You need information flow.
02:15:17.000 How else are you going to learn and adapt?
02:15:20.000 So North Korea, it's a really dangerous place.
02:15:23.000 And now it's even more dangerous because President Trump made this kind of nonsensical Hail Mary with these meetings with Kim Jong-un.
02:15:33.000 There was never any indication that the North Koreans were planning on giving up their nuclear weapons.
02:15:38.000 They never said they would.
02:15:40.000 It's the last thing they would do because their goal is survival.
02:15:43.000 And so there was this kind of head fake, which was like a PR stunt to be able to say, all right, we're having these meetings.
02:15:51.000 And of course, the North Koreans weren't ever going to give up their nuclear weapons.
02:15:55.000 They're still not.
02:15:56.000 So now things are ramping up.
02:15:58.000 So North Korea in the last couple of days has started firing missiles again.
02:16:02.000 The United States today, the US military seized a North Korean ship.
02:16:07.000 So we're going back to this very dangerous place. And so I think we really need to do a much better job.
02:16:16.000 We need much more.
02:16:17.000 North Korea, it's really hard.
02:16:19.000 And these guys are really smart.
02:16:22.000 People say, well, these guys are poor.
02:16:25.000 They must not be smart.
02:16:27.000 We're playing cards with them.
02:16:29.000 We've got the whole deck.
02:16:31.000 They don't have one card.
02:16:33.000 And yet, they're...
02:16:35.000 They're in the game.
02:16:36.000 They're in the game.
02:16:37.000 They're holding us to a stalemate and it's really worrying.
02:16:39.000 And why did you go over there?
02:16:41.000 What were you thinking?
02:16:43.000 So I thought a lot about it because I have a background in human rights.
02:16:48.000 I was a human rights officer for the United Nations in Cambodia.
02:16:51.000 I'm the child of a refugee.
02:16:53.000 I have this very strong belief in human rights and in supporting people.
02:16:59.000 In North Korea, they have about 120,000 people in the most brutal, horrific prison camps in the world. And so when I was asked to be part of this six-person delegation advising them on the establishment of special economic zones,
02:17:16.000 one instinct was, screw them, I don't want to be part of this at all.
02:17:20.000 But I also felt that if North Korea could have some kind of integrated economic development, that would at least connect them to the world, create some kind of leverage, and help people.
02:17:35.000 So I decided to go.
02:17:38.000 And I'm glad that I did, but these are really hard issues.
02:17:43.000 And it's very unfortunate that in President Trump's negotiations with the North Koreans, human rights was never once mentioned.
02:17:51.000 And I think that that's coming back to values.
02:17:53.000 We have to be clear about who we are and what we stand for and be consistent in fighting for it.
02:17:58.000 Do you think that Trump didn't bring that up because he wanted to be able to effectively communicate with them and not put them on their heels?
02:18:08.000 Maybe, but I feel like – I mean, I think he thought that there was a real chance of progress.
02:18:16.000 But the hard thing was he didn't know much about the North Koreans.
02:18:20.000 He has people.
02:18:21.000 We have brilliant people working in the United States government.
02:18:23.000 And all of those people, all of the U.S. intelligence agencies were telling President Trump that the North Koreans have absolutely no intention of giving up their nuclear weapons.
02:18:32.000 And so maybe he did think that he would charm Kim Jong-un or he would say, hey, we're going to give you economic development or whatever.
02:18:40.000 I think most people who were observers of North Korea, who had watched it for a while, thought that was not going to happen. So we gave away a lot.
02:18:48.000 So we didn't mention human rights.
02:18:49.000 We suspended our military exercises.
02:18:52.000 We gave them the legitimacy of a presidential meeting, which they've been wanting for 30 years, and we didn't get anything back.
02:18:59.000 So had we gotten something back, then you could say, well, that was a risk we're taking maybe.
02:19:04.000 Yeah, I haven't heard described that way, but I'm agreeing with what you're saying.
02:19:08.000 What do you think he could have done differently, though?
02:19:10.000 I don't think the meeting should have happened with no conditions.
02:19:10.000 So if he had said, I'm open to meet with the North Koreans – which is something the North Koreans have always wanted.
02:19:23.000 We could have met with the North Koreans anytime immediately for the last 30 years.
02:19:27.000 But in order to do it, they need to do this, this, and this.
02:19:31.000 And if they do it, we'll meet.
02:19:34.000 Like that would have been a legitimate thing.
02:19:35.000 But what happened is somebody – the North Korean, I mean, sorry, the South Korean national security advisor – peeked into his office and he goes, hey, they want to meet.
02:19:44.000 And it was like, sure.
02:19:45.000 That seems like an interesting thing to do.
02:19:47.000 And I think that with this diplomacy, you kind of have to get something.
02:19:51.000 And so we gave away so much up front, and the North Koreans didn't have an incentive to do anything in return.
02:19:57.000 Was his perspective that it would be better to be in communication and to be friends with this guy?
02:20:01.000 Was that what he was thinking?
02:20:04.000 But we have real interests in the sense that we have large military forces in Seoul.
02:20:10.000 We have a lot at stake.
02:20:12.000 We have our closest ally, Japan, who's had citizens abducted.
02:20:16.000 And so I think that was what he thought is like, let's be friendly.
02:20:20.000 And then with the force of personal chemistry, everything will unlock.
02:20:26.000 But I think that was always extremely unlikely.
02:20:30.000 What do you think is going to happen to that country?
02:20:32.000 I think eventually, and I've written this, I think eventually this regime will collapse under its own weight, but it's really held out a long time because you think of the collapse of the Soviet Union.
02:20:43.000 The Soviet Union had enough bullets to survive.
02:20:46.000 If they had said, you know, we're just going to shoot everybody at the Berlin Wall and every dissenter, they would still be around. North Korea has essentially murdered millions of people.
02:20:55.000 With famine and execution and prison camps. So I think they're going to stay for a while, but eventually there will be leaders in North Korea who will come to the conclusion that it's safer to oppose the Kim family than to wait for the Kim family to come and get you.
02:21:13.000 And that tends to happen in these kind of totalitarian systems where there's so little trust, there's so little loyalty.
02:21:20.000 Jesus.
02:21:21.000 Yeah.
02:21:21.000 What are their conditions like technologically?
02:21:25.000 What is their infrastructure like?
02:21:27.000 So the general infrastructure is absolutely terrible.
02:21:30.000 I mean, they have roads in the big cities that are actually quite nice roads because there's no cars.
02:21:36.000 But their infrastructure is terrible.
02:21:39.000 I mean, all of their power supply, they have brownouts, blackouts all the time.
02:21:45.000 Their manufacturing is all being decimated.
02:21:48.000 So it's terrible.
02:21:49.000 But they have really focused their energy on building these nuclear weapons because they think that these nuclear weapons give them leverage to do things and to extract concessions. But it's terrible infrastructure.
02:22:04.000 So they don't have an internet, right?
02:22:08.000 But they have something similar, but it only allows them access to a few state-run websites?
02:22:13.000 Well, the average person doesn't have access to the internet.
02:22:16.000 So the way it works is it's all about loyalty.
02:22:19.000 So you need three or so generations of loyalty to the Kim family to even set foot in Pyongyang, the capital.
02:22:26.000 Really?
02:22:27.000 Yes.
02:22:27.000 So it's not like you can kind of move around or whatever.
02:22:30.000 It's like just to be in the capital, like you have to have your loyalty proven.
02:22:36.000 And so the average person out in the country, they don't have access to much of anything.
02:22:41.000 They have a little bit more now than they did in the past.
02:22:44.000 And then there's this relatively small number of elites who are largely in Pyongyang and in the other cities, where there's a ring of defense around these cities, and just to enter, you have to pass all of these checks.
02:22:56.000 Some of them have access to limited internet, but it's tightly controlled, and it's not like you're going on Google and going wherever you want.
02:23:06.000 Right.
02:23:07.000 And they probably would get in trouble if they Googled the wrong thing.
02:23:10.000 Yes.
02:23:10.000 And trouble – it's not just you in trouble.
02:23:13.000 Like, if my brother or my uncle does something that gets me in trouble with the regime, the whole extended family is out.
02:23:22.000 And that means either you go to prison camps or you're kicked out of Pyongyang.
02:23:26.000 I mean, it's all about collective punishment.
02:23:29.000 People are terrified.
02:23:30.000 And by that ruthless punishment structure that they've set up, that's how they've kept control of the country.
02:23:35.000 Yeah, yes.
02:23:35.000 And everybody's forced to rat on each other, right?
02:23:37.000 Yeah, absolutely.
02:23:39.000 They're actually compelled to tell on each other for anything that you did.
02:23:45.000 If you don't, then what?
02:23:47.000 If you don't, then you are complicit.
02:23:50.000 And that's these horrible stories.
02:23:52.000 I've met a lot of these people who were in the prison camps.
02:23:56.000 I have a friend of mine.
02:23:58.000 She was this 13-year-old girl.
02:24:00.000 And her father was a low-level North Korean official, and then he was accused of something, and so this family that was privileged all of a sudden was out, and just these horrible things, and prison, and rape,
02:24:16.000 and this little...
02:24:17.000 I mean, now she's in the United States and incredibly positive.
02:24:20.000 I mean, it's amazing how resilient she is.
02:24:23.000 But this is like a real hell.
02:24:24.000 It's an issue, and I think that for us as Americans, as humans, we're less human when there are people who are suffering like this.
02:24:32.000 Yeah, I agree.
02:24:33.000 Now, you were traveling all over North Korea.
02:24:37.000 What were they having you do while you were out there?
02:24:39.000 So what we would do is we would go from one of these special economic zones to the other.
02:24:44.000 And in each one, it was kind of the same story.
02:24:46.000 You'd get there.
02:24:46.000 There'd be like a big field.
02:24:48.000 The farmers had been kicked off.
02:24:50.000 There was a fence around it.
02:24:52.000 And then the group of local officials would come, and they'd have like a big chart, and they'd have a plan, like here's where we're going to build this building.
02:25:00.000 And I would always ask the same question, like, what are you going to do here?
02:25:04.000 Why do you think you're going to be competitive?
02:25:07.000 How do you know what the market prices are?
02:25:10.000 How are your workers going to be empowered so they can change things?
02:25:14.000 I mean, in the old days, it used to be you just kind of have these automaton workers.
02:25:17.000 Now workers are actually making big decisions and fixing things.
02:25:21.000 And they didn't have an answer to any of those questions.
02:25:23.000 And that's what happens when you have these totalitarian, top-down systems, is that being creative is actually really dangerous.
02:25:32.000 So if somebody says, do X, you just do X. Wow.
02:25:36.000 Yeah.
02:25:37.000 It's really incredible.
02:25:39.000 And it's so sad because I spend a lot of time in South Korea.
02:25:43.000 And this is the most dynamic place.
02:25:46.000 Often I go to Seoul just to see what technology is going to show up here a few years in the future.
02:25:52.000 I mean, Seoul is like the future.
02:25:54.000 And then just 35 miles from Seoul is the demilitarized zone.
02:25:59.000 And the other side, it's incredible.
02:26:02.000 And the real problem would be once they finally did get free of that...
02:26:10.000 I mean, you can call him whatever you want, a dictator, and his family.
02:26:15.000 What tools would they have?
02:26:18.000 Like, how prepared would they be to be autonomous?
02:26:21.000 Well, it's really the good thing, the benefit that they have, if there is...
02:26:25.000 So here's my thought of what a scenario might look like.
02:26:28.000 I think eventually...
02:26:30.000 Probably there'll be some kind of coup attempt against the Kim family.
02:26:34.000 Let's just say it succeeds.
02:26:35.000 But that would probably result in another military dictatorship with another group.
02:26:39.000 Well, we don't know.
02:26:40.000 Because then, I think, immediately, I think the Chinese would invade.
02:26:44.000 Really?
02:26:45.000 Yeah, because when people think of the Korean War from the early 1950s, they think, oh, it's the Korean War.
02:26:49.000 It must have been...
02:26:50.000 The Americans fighting against the Koreans.
02:26:53.000 But the Korean War, the two sides was America and the South Koreans fighting against the North Koreans and the Chinese.
02:27:00.000 The Chinese did most of the fighting.
02:27:05.000 And so China, North Korea is the only country in the world that has a treaty alliance with China, kind of like we have with Japan and with South Korea.
02:27:13.000 And so China's biggest fear is having a reunified Korean peninsula. I think if there was a coup, the Chinese would immediately move in militarily.
02:27:35.000 I think it would be agreed that the Chinese would stay and they just would put on blue helmets, like as a UN force.
02:27:41.000 And then we'd have to negotiate what happens next.
02:27:43.000 And I think what the Chinese would do would be say, well, we'll leave when the Americans leave.
02:27:47.000 I think that would be what will likely happen.
02:27:51.000 I think we're going to see a Korean reunification.
02:27:54.000 And the good news with these reunified countries, like East and West Germany, is there's a whole system of law that's just there. North Korea will be swallowed into South Korea.
02:28:04.000 And then you have law.
02:28:06.000 You have an infrastructure.
02:28:07.000 And it will take one or two generations.
02:28:10.000 But I think that will eventually happen.
02:28:11.000 And I'm hoping it can happen without nuclear war or terrible bloodshed.
02:28:17.000 But it's going to be a big challenge.
02:28:19.000 Goddamn.
02:28:20.000 Yeah.
02:28:20.000 That sounds insurmountable.
02:28:22.000 Just hearing you talk about that, about North Korea getting absorbed by South Korea, I'm like, oh my god, good luck.
02:28:28.000 Just imagine.
02:28:30.000 Just imagine the whole thing.
02:28:32.000 Well, listen, man, it's been a fascinating conversation.
02:28:35.000 I really appreciate it.
02:28:36.000 I really appreciate you coming here, and you've certainly sparked a lot of interesting ideas in my head and I'm sure a lot of other people's heads as well.
02:28:44.000 And I would like to see down the line where this all goes.
02:28:47.000 And I hope we don't get swallowed up by machines.
02:28:49.000 We won't, but it's up to us to fight for what we believe in.
02:28:53.000 Well, thank you very much, man.
02:28:53.000 My great pleasure.
02:28:54.000 I really appreciate it.
02:28:55.000 It was a lot of fun.
02:28:55.000 Anytime.
02:28:55.000 Bye, everybody.