The Joe Rogan Experience - March 28, 2019


Joe Rogan Experience #1274 - Nicholas Christakis


Episode Stats

Length

2 hours and 21 minutes

Words per Minute

170.3

Word Count

24,090

Sentence Count

2,062

Misogynist Sentences

22

Hate Speech Sentences

29


Summary

In this episode, Joe Rogan sits down with Nicholas Christakis to discuss the infamous Halloween costume incident at Yale and how he handled its aftermath, after his wife Erika sent an email questioning whether a university should adjudicate students' costume choices. The two talk about free and open expression on campus, holding young adults responsible for their behavior while recognizing they are still growing, the McCarthy era, online echo chambers and radicalization, flat earthers, and the anti-vaccine movement. They also dig into Christakis's book Blueprint: The Evolutionary Origins of a Good Society, which argues that our genes shape not only the structure and function of our bodies and minds but also of our societies, and that the good parts of human nature, love, friendship, cooperation, and teaching, must on balance outweigh the bad. All that and much more on this episode.


Transcript

00:00:00.000 Gay folks took over the rainbow.
00:00:03.000 Four...
00:00:04.000 Three...
00:00:05.000 Two...
00:00:07.000 Will it work today, Jamie?
00:00:11.000 Yes.
00:00:11.000 Hello, Nicholas.
00:00:13.000 Hey, Joe.
00:00:13.000 How are you, man?
00:00:14.000 Great to meet you.
00:00:15.000 It's really good to meet you, too.
00:00:16.000 I became aware of you, like many people did, with the infamous Halloween costume incident at Yale.
00:00:25.000 Explain that for people who don't know what happened, because it was kind of a crazy scene.
00:00:30.000 It went national.
00:00:31.000 Yes, it was a moment when, around the country, many students were struggling with how to balance conflicting...
00:00:40.000 Try to keep that a little bit closer to your face.
00:00:43.000 There you go.
00:00:57.000 I think we're good to go.
00:01:15.000 And some of those values came into tension.
00:01:17.000 And so around the country, there was a lot of heat about this.
00:01:22.000 And I happened to walk into a propeller myself and wound up in some challenging circumstances.
00:01:31.000 And it was not the worst thing that's ever happened to me, but it was in the top 10 challenging moments I've had in my life, let's say.
00:01:45.000 Yeah, that's a very lawyer-like way of describing exactly what happened.
00:01:49.000 I mean, the thing is, I struggle.
00:01:55.000 I mean, you can tell the story if you want, and then I can correct things.
00:01:57.000 But here's the thing.
00:01:58.000 It's my job to be a teacher, and I have taken responsibility for teaching young people.
00:02:05.000 And it is the case that many people...
00:02:09.000 I mean, yeah.
00:02:22.000 But the thing is, is that, you know, my commitment is to teaching more generally, and I don't want to be defined by that event.
00:02:31.000 I don't want that to become the most important thing about me.
00:02:35.000 You know, I have this book that we're going to talk about that is an important thing in my life, instantiates my values.
00:02:41.000 It talks about what I think is important about the world.
00:02:43.000 So I'm trying to be balanced about it.
00:02:46.000 It's one thing that happened.
00:02:47.000 I did my best.
00:02:49.000 It's in the past.
00:02:49.000 Well, let me help out here because you're being so nice about the whole thing.
00:02:55.000 So people know what we're talking about.
00:02:57.000 There was an incident that was captured on someone's cell phone where you were standing there.
00:03:02.000 It was an hour of footage, five or six different angles, so a clip went viral.
00:03:06.000 But I want to emphasize that there were many people filming that day, and an hour or more of the two or three hours I was out there is available.
00:03:14.000 Well, I'm glad that you had the courage to do that, though, to stand out there and talk to those kids.
00:03:20.000 But some of them were clearly...
00:03:23.000 There's something that happens when people become extremely self-indulgent when they know that they have this platform and they have someone who is in a position of authority and they get to hamstring them in front of the public.
00:03:35.000 And that's what I felt was going on.
00:03:36.000 Just my understanding of human nature, I knew what she was doing.
00:03:40.000 What she was doing by shouting and screaming, this is our fucking home, you know, we're supposed to be safe here.
00:03:47.000 I was like, oh, I see what's going on.
00:03:49.000 She's throwing up the flag of virtue for all of her friends to see how amazing she was, so she's putting on a show.
00:03:55.000 People do that.
00:03:56.000 It's human nature.
00:03:58.000 You handled it admirably.
00:03:59.000 You stood there and you just listened to her and you never yelled back and you never raised your voice and you remained calm. But that sort of environment where the children, and I want to say children,
00:04:15.000 they're basically adults, but acting like children.
00:04:17.000 But this is one of the ironies.
00:04:19.000 People that age, you know, can fight in wars and lose their lives.
00:04:23.000 And so I think it's a difficult challenge because on the one hand, it's right and appropriate to hold people responsible for their actions.
00:04:31.000 Certainly if you're 20 years old, you're an adult.
00:04:34.000 You're still growing.
00:04:35.000 You're still changing.
00:04:36.000 You're still learning.
00:04:37.000 I'm not the same man I was when I was 20. But, you have to be responsible for your behavior.
00:04:43.000 So I don't think you get a total pass either.
00:04:46.000 It's hard.
00:04:46.000 No, you do not get a total pass.
00:04:48.000 And, you know, for folks who don't have a 20-year-old in their life and don't remember what it was like.
00:04:56.000 You're not a fully formed thing yet.
00:04:59.000 You're filled with chaos.
00:05:00.000 You have emotions and hormones.
00:05:03.000 And then you're at school and you're probably away from the instructions of your parents for the first time.
00:05:11.000 And you're cutting loose and trying out new things.
00:05:14.000 New ways of communicating that way.
00:05:16.000 It's a mess.
00:05:18.000 But most people felt horrified watching that, that you were subjected to that when you're being very reasonable.
00:05:24.000 And also, what it all came about was your wife had sent out an email saying like, hey, maybe it should be okay for someone to wear a fucked up Halloween costume.
00:05:36.000 Maybe it's okay for someone to dress up like Crazy Horse.
00:05:40.000 Well, actually, just to be clear, what Erika was saying in that note was not – this is a very important intellectual distinction.
00:05:47.000 I think we've lost a lot of nuance in our political lives in general in our country right now and also in the nuance in the way we think about difficult topics.
00:05:56.000 So what Erika was saying was not that necessarily the – she was not taking a position on any particular costumes like this is okay.
00:06:03.000 In fact, many of the costumes that would have offended the students would offend her.
00:06:07.000 Right.
00:06:07.000 Right.
00:06:13.000 Right.
00:06:35.000 Yeah.
00:06:48.000 She was saying, do you students really want to surrender that kind of control over your own lives to older adults?
00:06:54.000 And apparently many students did, actually.
00:06:57.000 They did.
00:06:58.000 I don't believe they did.
00:07:00.000 I think they wanted absolute enforcement of what they thought to be wrong or right.
00:07:07.000 Yes.
00:07:08.000 Yes, I think that's right.
00:07:09.000 So they...
00:07:14.000 Many, but not all of the students.
00:07:16.000 I mean, let's also be very clear.
00:07:19.000 Part of the motivation in Erica writing that note was that many of her students, and in fact, many hundreds of other students felt infantilized by this policy.
00:07:31.000 And there had been a big buildup prior to that event, including an article in the New York Times about these Halloween costume policies around the country.
00:08:05.000 And one of the rules said you shouldn't mock religion, for example, was one of the provisions.
00:08:11.000 So a university-wide email went out, signed by 13 people, saying, you know, don't mock people's deeply held faith traditions.
00:08:22.000 Well, what if, for the sake of argument, you had been abused by a priest and you wanted, at Halloween, to dress up as a Catholic priest, for example, you know, holding a doll?
00:08:32.000 And someone else who was Catholic was very deeply offended by that.
00:08:37.000 Well, who should adjudicate that?
00:08:40.000 Is it the role of the institution to come down and say, yes, you can express yourself this way?
00:08:46.000 No, you cannot.
00:08:47.000 And so the argument was, let the young people learn.
00:08:52.000 Let them sort it out themselves.
00:09:12.000 Serves the objectives of righteous social progress.
00:09:14.000 If we really want to do better in our society, or in any society, in my view, we have to create an environment where we can talk to each other, grant good faith, listen carefully, make subtle distinctions, and free people up to express what they're thinking so we can have a real marketplace of ideas.
00:09:33.000 That's, you know, my commitment or my belief.
00:09:36.000 Well, that's a wonderful belief.
00:09:37.000 I love it.
00:09:38.000 I mean, that's really, I couldn't agree with you more enthusiastically.
00:09:43.000 That's really, that sounds like the best possible environment for growing up and learning, as long as you have someone to sort of moderate or someone to mediate if things go sideways.
00:09:56.000 Yes, or I don't think you necessarily need a third party mediator, but you do need a shared understanding of core liberal principles.
00:10:06.000 And these principles do include, as I mentioned earlier, a kind of commitment to free and open expression, a commitment to debate, a commitment to reason.
00:10:13.000 So how are you and I going to come to a better understanding of what is true about the world?
00:10:18.000 We could fight, right?
00:10:19.000 And then the stronger person would decide what's right.
00:10:22.000 We could vote.
00:10:24.000 It doesn't seem quite right either.
00:10:26.000 You know, 350 cardinals voted that Galileo was wrong.
00:10:29.000 That didn't make Galileo wrong.
00:10:31.000 Or we could use principles of reason and inquiry to try to appreciate the world together, right?
00:10:37.000 We're looking out at the world and saying, that's confusing.
00:10:39.000 You know, does the sun – does the earth revolve around the sun or does the sun revolve around the earth?
00:10:44.000 Or that's confusing.
00:10:45.000 Should a king have – you know, should a king have ultimate authority in a state?
00:10:49.000 Or is that not how we want to organize a state?
00:10:52.000 So we – you and I look at the world and debate and think about, okay, and we exchange reasons and we use evidence and ways of understanding and studying the world.
00:11:03.000 That, to me, is the only way to truth, actually.
00:11:06.000 Now, some people will think that religion is a way to truth, right?
00:11:09.000 They think that the truth is God-given, for example.
00:11:12.000 Now, I am very sympathetic to religious belief systems, but I don't think that's a way to truth.
00:11:17.000 It's a way to some truths, actually.
00:11:18.000 It's a way to some wisdom.
00:11:20.000 But anyway, so that's what our universities and our society – our universities are officially committed to that.
00:11:27.000 The mottos of our universities are all about free inquiry and pursuit of knowledge.
00:11:32.000 And our country is committed to that in our Bill of Rights, right?
00:11:34.000 We have a commitment to free and open expression, freedom of assembly, freedom of religion, and so forth.
00:11:40.000 And those ground rules then, in my view, make it possible for us to have a better society.
00:11:47.000 And there's more I can talk about.
00:11:49.000 I'm sure we will get into it.
00:11:50.000 No, I'm sure we will.
00:11:51.000 Again, I couldn't agree with you more enthusiastically.
00:11:53.000 I just think we need more reasonable conversations and less screaming and less shouting people down and less stopping.
00:12:00.000 Less mob action, I think.
00:12:01.000 Yes, yes.
00:12:02.000 The mob action is very weird because I don't remember it.
00:12:07.000 From the Vietnam War protests to what's going on today, there was this long gap where...
00:12:16.000 You didn't hear about universities shutting down speech.
00:12:20.000 Yes.
00:12:21.000 This is fairly new.
00:12:23.000 This is within the last half decade or so.
00:12:25.000 Well, let's not – yes.
00:12:27.000 Yes.
00:12:29.000 I mean, there's always an undercurrent of tension about this at universities and in our society at large.
00:12:36.000 You know, let's not forget the McCarthy era where you had the right wing was, you know, really interested in shutting down communists.
00:12:42.000 Like, if you were a professor or an artist who had far-left political views, you were screwed.
00:12:48.000 And that was wrong.
00:12:49.000 Or even if you went to a communist meeting to find out what it was all about, just to educate yourself.
00:12:55.000 Yes, in fact, that's a great example because that, like, right now I see a lot of people being criticized for following online people they disagree with.
00:13:04.000 Which is nuts!
00:13:05.000 Crazy.
00:13:06.000 So, just because I follow someone doesn't mean I agree with what they're saying.
00:13:10.000 I'm interested to learn.
00:13:11.000 What are they saying?
00:13:12.000 I'm friends with people I don't agree with.
00:13:14.000 Yeah, me too!
00:13:14.000 Yes, me too!
00:13:15.000 I'm friends with so many people I don't agree with.
00:13:17.000 I have a friend.
00:13:18.000 I have friends across the political spectrum from the – I don't have any monarchists among my friends.
00:13:23.000 I don't think I – But I have friends from the far right to far left.
00:13:27.000 I have a friend who really believes – he's so libertarian, he thinks there should be private ownership of roads.
00:13:33.000 Whoa.
00:13:34.000 Yeah.
00:13:34.000 That's ridiculous.
00:13:35.000 Yeah.
00:13:36.000 I think that's ridiculous.
00:13:36.000 And we debate.
00:13:38.000 He must be white.
00:13:40.000 He is, actually.
00:13:41.000 Ah!
00:13:42.000 My wife says she would just once like to meet a poor libertarian.
00:13:47.000 Yes, they don't exist.
00:13:47.000 A poor female libertarian, yes.
00:13:49.000 No.
00:13:49.000 Yeah, that's a ridiculous position.
00:13:51.000 I think so.
00:13:52.000 Private roads, get the fuck out of here.
00:13:54.000 Yeah, okay, but so here's the thing.
00:13:55.000 But okay, how are we going to persuade this man that he's...
00:13:58.000 He also thinks...
00:14:00.000 Somewhat less controversially, it's a harder decision.
00:14:03.000 He also thinks that you should be able to sell your organs.
00:14:07.000 We should have a market in kidneys.
00:14:11.000 While you're alive?
00:14:13.000 Yeah, you can give away one of your kidneys.
00:14:14.000 But yeah, you can.
00:14:16.000 And we allow you to give it away, but we don't allow you to sell it.
00:14:19.000 Wasn't there an instance, Jamie, that you were telling me about, about a young guy who sold his kidney to get an iPhone?
00:14:24.000 Was that in another country?
00:14:25.000 In another country.
00:14:26.000 Yes.
00:14:26.000 Yes.
00:14:26.000 And wound up having an infection and lost his second kidney as well?
00:21:31.000 Yes, and then would need to be on dialysis.
00:14:32.000 Yes.
00:14:33.000 An iPhone.
00:14:34.000 Yeah, and these things happen.
00:14:37.000 But in the United States, it's prohibited.
00:14:39.000 You can't do that.
00:14:39.000 You also can't sell your blood in the United States.
00:14:41.000 There's a reason for that.
00:14:42.000 Yes.
00:14:43.000 And that's a good reason, a good public health reason.
00:14:45.000 Yeah, because people who want to sell their blood are usually fucked up.
00:14:48.000 Correct.
00:14:49.000 And so it's not a safe – the blood supply is safer in countries where you – Voluntarily.
00:14:53.000 Yeah, altruistically doing it.
00:14:55.000 Exactly.
00:14:55.000 But the kidney is a harder case.
00:14:57.000 Anyway, he believes that.
00:14:58.000 He – anyway, so I love debating him and I learn from him.
00:15:02.000 Like recently he said to me, and I think I have a better answer, he said he doesn't understand why blackmail is illegal.
00:15:09.000 Oh, Jesus Christ.
00:15:11.000 But the point is, the point is...
00:15:14.000 Those anarchists and libertarians, all of them need their asses kicked.
00:15:18.000 They really do.
00:15:19.000 Settle down.
00:15:20.000 Exactly.
00:15:21.000 But the things...
00:15:22.000 Exactly.
00:15:23.000 These are like...
00:15:23.000 And I think any kind of extreme ideology.
00:15:26.000 But the point is, we can learn.
00:15:27.000 There is some wisdom.
00:15:29.000 Almost anywhere, right?
00:15:31.000 And the problem comes from excess expression.
00:15:35.000 The problem comes when we take things to extremes and we get to private ownership of roads.
00:15:46.000 But anyway.
00:15:48.000 And by the way, if you're an anarchist or a libertarian, I'm kidding.
00:15:51.000 I don't really think you need to get your ass kicked.
00:15:53.000 I'm just joking around.
00:15:55.000 You're going to have a mob after you now.
00:15:57.000 It's a position that I always feel like could be remedied with psychedelic drugs.
00:16:00.000 It could be, yes.
00:16:01.000 I really feel like it almost always could be.
00:16:04.000 I get where they're coming from.
00:16:07.000 I understand personal responsibility, the idea that the free market should decide.
00:16:12.000 I get all that.
00:16:12.000 But we already accept that there's some things that we agree on that we should all chip in to pay for.
00:16:20.000 Like roads.
00:16:22.000 Yeah, or like assessments of drug purity, for example.
00:16:26.000 So the very rich could set up a laboratory in their basement so whenever a doctor prescribes a medication, they could see if the drugs are safe and pure.
00:16:34.000 The rest of us pay taxes, and we say we're all going to pitch in together, and we're going to have the FDA, and they are going to certify drug purity so that when I go to my pharmacist and buy a drug, the pharmaceutical company isn't killing me by shoddy manufacturing practices.
00:16:50.000 So, I think that's right.
00:16:51.000 We get together as a free society and we do these things.
00:16:55.000 We want a non-corrupt judiciary, right?
00:16:58.000 We don't want people to be able to bribe judges.
00:17:00.000 For sure.
00:17:00.000 Exactly.
00:17:01.000 So, there are certain foundational elements of our civilization, of our society.
00:17:25.000 Who believe that vaccines kill people, for which there's no scientific evidence, that they're wrong.
00:17:30.000 We could imprison them.
00:17:31.000 That's force.
00:17:33.000 We could vote, which is sort of what we're doing.
00:17:35.000 We're saying, okay, well, you're a minority group who believes these things, so we're not going to allow you to control policy.
00:17:40.000 Or we could try to win the battle of ideas and persuade them.
00:17:44.000 Ultimately, that's the only path that's, in my view, that gets us to where we want to be.
00:17:48.000 Yeah, and just an honest assessment of the actual data.
00:17:54.000 Like, what we really know and understanding how these scientists come to these conclusions.
00:17:58.000 But the problem is these echo chambers where people get involved with online that magnify all of these beliefs and you get radicalized.
00:18:09.000 I mean, I've seen it.
00:18:09.000 Also true.
00:18:09.000 People get involved in these Facebook groups, these anti-vax Facebook groups or, you know, all sorts of different things.
00:18:16.000 I mean, that's how these flat earth people get... Yes.
00:18:18.000 They start listening only to people that are involved in this circle.
00:18:23.000 They don't have a greater understanding of the science involved.
00:18:27.000 Did you just see, I just saw online, there's a cruise to the, a cruise for a flat earther, I don't know if you saw it.
00:18:33.000 To the ice wall.
00:18:34.000 Yeah, the ice wall!
00:18:38.000 I thought, that's a new wrinkle, because the old flat earthers used to think that the water was shown falling off the disk of the earth, you know, like the edge of the earth that was just a disk.
00:18:48.000 Now, the new theory that there's an ice wall, actually, it's kind of not falsifiable.
00:18:53.000 That is to say, you could get on a cruise and sail to the edge of the earth, and you would find a wall of ice there, Antarctica.
00:18:59.000 So you think, ah, it's flat.
00:19:01.000 In other words, they have redefined their theory to...
00:19:11.000 Are you aware of hashtag space is fake? No, I'm not.
00:19:17.000 I have to say I've not exposed myself to that set of ideas.
00:19:21.000 There's a bunch of people that believe that space is fake.
00:19:24.000 That it's not real.
00:19:26.000 That there's no real space.
00:19:28.000 And that there's like lights up there.
00:19:31.000 And that this is some sort of a plan by Satan.
00:19:36.000 A lot of it's very biblical, which is really interesting.
00:19:39.000 A lot of the flat earth stuff is very biblical.
00:19:41.000 It has to do with the firmament, and they use descriptions and depictions from the Bible.
00:19:46.000 Yeah, it's super bizarre.
00:19:49.000 And what's really bizarre is when you listen to the YouTube videos or these discussions that are done by people that use words.
00:19:57.000 That are real.
00:19:58.000 They string them together correctly.
00:20:00.000 They have like full sentences.
00:20:02.000 They appear to be articulate.
00:20:04.000 It's very confusing if you're a dummy.
00:20:06.000 If you listen to those, you go, wow, this guy's making a lot of sense.
00:20:09.000 He's not.
00:20:10.000 But it sounds like he's making a lot of sense because he's using all these words that are correctly used.
00:20:15.000 There's no ums.
00:20:16.000 He's saying it articulately.
00:20:18.000 Everything seems like, oh my goodness, this man is exposing.
00:20:23.000 He's exposing the reality, but it's not.
00:20:25.000 It's just fucking nonsense.
00:20:26.000 And if you don't know any better, and that's all you listen to, that's where your head will go.
00:20:31.000 The same with the anti-vax movement.
00:20:32.000 If you only listen to these anti-vaxxers, they're making so much sense.
00:20:36.000 Like, oh my god, it's giving everybody all sorts of ailments.
00:20:40.000 You're on the spectrum.
00:20:41.000 And they have a theory of how it does that, which is not, it uses, as you say, scientific words, but it's actually not scientifically correct.
00:20:48.000 You know, it does this, which then does that, which then does that.
00:20:50.000 They lay out a kind of causal chain, which is false.
00:20:53.000 And then there's a problem of nuance and perspective because there's so many people that get vaccinated.
00:20:58.000 There's hundreds of millions of people in this country, billions of people worldwide, and then there are instances, rare occurrences where people have real issues with vaccinations.
00:21:09.000 Well, there are some where they have real issues.
00:21:11.000 So, for example, there's some vaccines which are known to cause certain neurological conditions, rarely, one out of a million or one out of a hundred thousand vaccinations.
00:21:19.000 More commonly is the situation in which you have vaccination is so common, everyone is getting vaccinated, and often that occurs near to an occurrence of some other rare condition, and people associate the two.
00:21:31.000 They think, oh, because of the vaccine this happened.
00:21:33.000 No, it's a coincidence.
00:21:34.000 Right.
00:21:34.000 Either or, yes.
00:21:36.000 There's both.
00:21:37.000 And there's also, you know, if it's one out of a million and you have 300 million people, you have 300.
00:21:42.000 You know, I mean, even a few hundred people with an issue is a big deal.
00:21:47.000 With 300 million people, you easily could have 300 really big cases.
00:21:56.000 You know, 300 cases where people have died from vaccines, and then you bring those in front of people and say, oh my god, and then there's this one, and this one, and this one, and there's 298 more, and you're like, holy shit!
00:22:07.000 All these people are dying from vaccines?
00:22:09.000 You know, it doesn't feel good if it's your child, but when we look at the greater perspective of humanity...
00:22:22.000 Yes.
00:22:24.000 Yes.
00:22:25.000 Yes.
00:22:35.000 Yes.
00:22:38.000 Yes, and save children's lives.
00:22:40.000 I had a woman yesterday who is an expert.
00:22:42.000 She's a medical historian, an expert in Victorian-era surgery, Lindsay Fitzharris, and she wrote this great book called The Butchering Art.
00:22:50.000 And in it, there's all these images.
00:22:52.000 One of them she brought up of what smallpox actually looks like when people get it.
00:22:57.000 It's horrible.
00:22:58.000 Yes.
00:22:58.000 It just covers people's bodies.
00:23:00.000 We've eradicated it.
00:23:01.000 Yes.
00:23:01.000 It's painful and you die from it.
00:23:04.000 It's actually gone.
00:23:05.000 We don't get it anymore in this country.
00:23:07.000 It's fucking incredible.
00:23:08.000 I mean, it's credible.
00:23:10.000 But that's obviously neither here nor there.
00:23:12.000 So, this book, Blueprint, The Evolutionary Origins of a Good Society.
00:23:18.000 When did you start this?
00:23:19.000 About nine or ten years ago.
00:23:22.000 And I... At the time in my lab we were doing research on friendship.
00:23:26.000 We were doing research on why people have friends.
00:23:30.000 It's actually – it's not difficult to provide an account for why we have sex with each other.
00:23:35.000 Many animals – most animals are – well, I don't know if it's most, but animals either reproduce sexually or asexually.
00:23:42.000 And most animals – I'm trying to remember now what the relative proportion is.
00:23:45.000 Anyway, I'm going to say most.
00:23:46.000 Most animals reproduce sexually.
00:23:48.000 And it's not hard to provide an account for how – For why sex originated, why we reproduce sexually.
00:23:57.000 It's not hard to provide an account for why we are choosy in our mates or why we are careful in who we have sex with.
00:24:05.000 But human beings don't just mate with each other, we befriend each other.
00:24:08.000 We form long-term, non-reproductive unions to other individuals to whom we're not related.
00:24:35.000 And that set the stage then for exploring all kinds of other things in our lives, like why we love each other, for example.
00:24:40.000 When we have sex with a person, we tend to become attached to them.
00:24:45.000 We develop emotional sentiment about them.
00:24:48.000 That's not an essential to having sex, yet we do that.
00:24:52.000 And then I became interested in other kinds of good things, like not just love and friendship, but cooperation and teaching.
00:24:59.000 Teaching is another crazy thing.
00:25:23.000 What's called social learning?
00:25:24.000 Social learning is really efficient.
00:25:26.000 So if I put my hand in the fire, I learn that I burn myself, I pull my hand out, I've learned something.
00:25:32.000 I paid a price and I learned something.
00:25:34.000 I could observe you putting your hand in the fire.
00:25:37.000 You pay all the price, but I gain most of the knowledge.
00:25:41.000 It's almost as good.
00:25:42.000 I learn, oh, people, you shouldn't put your hand in the fire.
00:25:43.000 I saw that Joe put his hand in the fire.
00:25:45.000 So social learning is super efficient, learning from others.
00:25:49.000 But we take it to an even further level.
00:25:51.000 We don't just passively observe other animals of our own species and learn from them.
00:25:56.000 We teach each other.
00:25:58.000 That is very rare in the animal kingdom where one animal sets out to teach another animal something.
00:26:04.000 So the book is about the evolutionary origins of a good society.
00:26:10.000 It's also a kind of...
00:26:32.000 But I think the bright side has been denied the attention it deserves because we have also evolved to love and to befriend each other and to be kind to each other and to cooperate and to teach each other and all these good things.
00:26:44.000 And I'll shut up.
00:26:47.000 Please don't.
00:26:48.000 No, and here's the thing.
00:26:49.000 Here's the sort of one way to think about this.
00:26:53.000 This must have been the case that the benefits of a connected life outweighed the costs.
00:27:00.000 We would not be living socially otherwise. And if my exposure to you harmed me on net, in other words, if I came near you and you were violent to me, you killed me, or you gave me misinformation, you told me lies about the world,
00:27:16.000 then my connection to you would ultimately harm me, and I would be better off living as an isolated animal.
00:27:22.000 So animals that come together to live socially, the benefits of that must outweigh the costs.
00:27:27.000 My living, us living as a group.
00:27:29.000 So all this attention to the ways in which our interactions are bad, that we kill each other, that we steal from each other, that we lie to each other, that we have tribalism and all of these traits, which we do.
00:27:40.000 Every century is replete with horrors.
00:27:43.000 I'm not like Pangloss.
00:27:45.000 I don't think like Pollyanna, like, oh, everything's great.
00:27:48.000 That's not me.
00:27:49.000 But what is me is a kind of optimistic focus on the good parts of human nature and the recognition that those good parts must in toto overwhelm the bad parts.
00:28:03.000 Well, they certainly have to.
00:28:05.000 There's so many human beings.
00:28:06.000 I mean, it's obvious that this is working.
00:28:09.000 We have propagated.
00:28:11.000 We're everywhere.
00:28:12.000 We're on every single patch of land that's occupiable.
00:28:15.000 In fact, you're exactly right.
00:28:17.000 The argument, and that's discussed in the book, the way we have achieved the kind of social conquest of the earth… The way our species is spread out to occupy every niche, which is also very rare.
00:28:29.000 Most animals live in one – grizzlies live in this part of the world.
00:28:32.000 They don't live in Amazonia.
00:28:34.000 And polar bears live in this part of the world.
00:28:36.000 They don't live in Arizona, et cetera.
00:28:41.000 So – but our species lives everywhere and the way we have come to be able to do that is by the capacity to have culture, to teach and learn from each other, to accumulate knowledge.
00:28:51.000 So in the book, I talk about this famous set of stories called the Lost European Explorer Files, about how European explorers get lost.
00:29:01.000 They lose their supplies.
00:29:02.000 They wind up dying and – But they're in an environment in which other people thrive and survive because they have learned how to live there.
00:29:12.000 So we've spread out around the world.
00:29:13.000 And then there's a chapter in the book at the beginning about shipwrecks.
00:29:20.000 So I have this – should I go on?
00:29:23.000 Yeah.
00:29:23.000 So I have this – so what I'd like to do is – what I try to set out to do in the beginning of the book is I say, look, it's clear that our genes shape the structure and function of our bodies.
00:29:37.000 It is increasingly clear that our genes also shape the structure and function of our minds, our behaviors, whether you're risk-averse, how intelligent you are, whether you have wanderlust.
00:29:50.000 These properties are properties that depend in part on your genes.
00:29:54.000 But it's also clear to me, and that's what the book argues, is that our genes shape not just the structure and function of our bodies, not just the structure and function of our minds, but also the structure and function of our societies.
00:30:05.000 And to really prove that, what we would need is something known as the forbidden experiment.
00:30:11.000 And the forbidden experiment is an experiment in which we took a group of babies...
00:30:22.000 How would they organize themselves socially?
00:30:27.000 Is there kind of an innate society that human beings are pre-wired to make in an essence?
00:30:33.000 Now obviously that's unethical and cruel, but actually monarchs for thousands of years have contemplated this experiment.
00:30:41.000 So Herodotus writes about how one of the ancient Egyptian pharaohs wanted to know what kind of language would – what was a natural language we had in us that we would speak if we were not taught a language.
00:30:52.000 So this pharaoh, it is said, took two babies and gave them to a mute shepherd to raise to see how did the children speak when they grew up.
00:31:00.000 And Emperor Akbar attempted this.
00:31:04.000 There was a couple of European kings that attempted this.
00:31:07.000 Obviously, we can't actually do this.
00:31:08.000 So what I do in the book is I look at a series of other approximations of that.
00:31:13.000 And one chapter is devoted to looking at shipwrecks, groups of men typically, but sometimes men and women, who were stranded. Between 1500 and 1900, there were 9,000 shipwrecks.
00:31:42.000 Oh, wow.
00:31:45.000 All over the world where they occurred and when they occurred and how many people there were.
00:31:51.000 And so then I got all the original accounts from the sailors, from the people on the wrecks, and all contemporary archaeological excavations of those wrecks where they had been excavated and tried to understand what kind of society did these isolated crews actually wind up making.
00:32:10.000 And there were some amazing stories that I found in there.
00:32:15.000 So, they stayed for at least two months.
00:32:17.000 How many of them actually established a real civilization?
00:32:20.000 How many of them stuck forever?
00:32:21.000 No, no one was stuck forever.
00:32:23.000 Most of those crews were eventually...
00:32:25.000 In fact, all of those crews had at least one survivor because if they had all died, then I wouldn't be able to know about them.
00:32:31.000 But there's one famous case...
00:32:33.000 In which these sailors were stranded near Australia, I think somewhere in the Pacific, and they managed to catch a big petrel, one of those huge birds, like a condor.
00:32:43.000 And they put a little note in a little tiny bottle and they tied it to its feet.
00:32:50.000 And this petrel flew thousands of miles and landed in Australia and was found!
00:32:55.000 Wow.
00:32:56.000 With a note indicating where the stranded sailors were.
00:32:59.000 And a ship was sent to go find the men.
00:33:02.000 And they got there, but they had all died.
00:33:05.000 They were all gone.
00:33:06.000 So they used this bird.
00:33:09.000 I should have ate the bird.
00:33:10.000 Well, no, they didn't eat the bird.
00:33:11.000 They should have.
00:33:12.000 No.
00:33:12.000 It needs to be alive.
00:33:15.000 I think if you had that choice, you would communicate rather than eat, Joe, I think.
00:33:19.000 Yes.
00:33:20.000 Yes.
00:33:20.000 For a little bit.
00:33:21.000 Well, until the very end.
00:33:23.000 Yes, yes, yes.
00:33:24.000 Did they starve to death?
00:33:25.000 We don't know.
00:33:26.000 Nobody knows.
00:33:27.000 But the point is that for me to be able to describe what happened, we needed at least one survivor.
00:33:32.000 And often, there were many cases where everyone survived.
00:33:36.000 I mean, there was one pair of cases that was amazing to me.
00:33:42.000 In 1864, in the Auckland Islands, just north of Antarctica, south of New Zealand, the Grafton was wrecked on the southern part of the island.
00:33:51.000 I can't remember how big the island was.
00:33:53.000 It's in the book.
00:33:53.000 Maybe let's say 90 miles long or something or 20 miles long.
00:33:57.000 I think it's 20 miles long.
00:33:58.000 On the southern part of the island, five men are wrecked on the Grafton.
00:34:05.000 And on the northern part of the island, the Invercauld wrecks.
00:34:09.000 Nineteen men are wrecked on the Invercauld.
00:34:12.000 All the Grafton crew survives.
00:34:14.000 And both crews were on the island at the same time.
00:34:18.000 They never encountered each other.
00:34:20.000 They're struggling for survival.
00:34:21.000 It's like an experiment.
00:34:23.000 Like, who's going to win?
00:34:24.000 I'm tempted to say fear factor.
00:34:27.000 And the question is, who's going to survive and how and why?
00:34:32.000 Everyone on the Grafton crew survives, and 16 of the 19 men on the Invercauld crew die.
00:34:38.000 There's also cannibalism in that crew, so it's a very different outcome for various reasons.
00:34:45.000 Anyway, so the point is that in the book, I start with a series of stories about how people come together.
00:35:12.000 I'm going to go and we're going to make it again.
00:35:20.000 We're going to start afresh.
00:35:22.000 I look at settlements in Antarctica of scientists.
00:35:25.000 I look at Pitcairn, the mutiny on the Bounty.
00:35:29.000 I look at the Shackleton expedition.
00:35:30.000 Many, many cases of stranded, isolated groups of people trying to make a new social order.
00:35:36.000 And then I also use data from experiments we do in my lab.
00:35:41.000 We have this software where...
00:35:43.000 Tens of thousands of people have come and played these games.
00:35:45.000 We can create these temporary artificial societies of real people where people come and spend an hour or two, and we, with this godlike way, can engineer the society.
00:35:55.000 We can have a lot of inequality or little inequality or various other features, and then we can observe what happens.
00:36:02.000 And I look at all of that data, all those stories, and say, look, there is a deep and fundamental way that, no matter what, human beings...
00:36:34.000 Yeah, no.
00:36:35.000 No, so here's the point.
00:36:36.000 Yeah, so here's the argument.
00:36:40.000 You look around the world and the way – the example I give is that, yes, there's huge cultural variation around the world.
00:36:46.000 Just like you said, totalitarian societies, there's – people have different foods and they have different ways of dressing and there's enormous cultural variation and it's marvelous and interesting and obvious to anybody.
00:36:57.000 But I think we're missing the forest for the trees.
00:36:59.000 To me, this is like you and I are sitting on a plain and we look at a hill that's 300 feet and one that's 900 feet and we say those are very different hills.
00:37:08.000 But actually, if we took a step back, we would see that we were on a plateau, and one was a mountain that was 10,300 feet, and another was a mountain that was 10,900 feet.
00:37:19.000 And actually, there are these much more deep and fundamental plate tectonic forces that are creating these two mountains that are very similar, but we are just focused on the superficial top.
00:37:30.000 So the argument in the book is that everywhere in the world, people have friendship.
00:37:44.000 Even in a place like North Korea?
00:37:50.000 So totalitarian states apply huge cultural pressure to suppress this innate tendency.
00:37:57.000 It's like religion: you need a lot of belief in God to suppress your innate desire to have sex, right?
00:38:05.000 So you can have a belief system that's very powerful, that kind of prevents you, squashes what would otherwise be a kind of inescapable inclination you have.
00:38:17.000 So totalitarian regimes, and this is discussed in the book too, are very threatened by the institution of the family.
00:38:24.000 They're threatened.
00:38:24.000 You need to owe your loyalty to the state, not to your family, not to your friends.
00:38:31.000 And so they have a series of institutions that, you know, everyone is comrade.
00:38:34.000 Everyone gets called comrade, for example.
00:38:38.000 Or a lot of times – well, I don't know if I want to speak at the state level.
00:38:43.000 Let me take it down a notch to communes.
00:38:46.000 So if you think about communes, if you're going to make a commune of people and you want them to feel real loyalty to the commune, one way you can do that is you want to reduce the commitment people have to their partners, let's say,
00:39:02.000 their mates.
00:39:04.000 And in order to do that, you can go to one of two extremes.
00:39:07.000 Either you can prohibit sex, like the Shakers, and you say, okay, no one's going to have sex with anyone because we're all in a commune and we all love each other and we're not going to have special love for particular people.
00:39:18.000 Or you could go to the other extreme and you can have polyamory.
00:39:20.000 Say, everyone's going to have sex with everyone else.
00:39:23.000 Once again, you see, that subverts the special relationship that people might form with particular individuals.
00:39:29.000 And so both of those strategies, even though they're opposite, are attempting to do the same thing, which is to break down real relationships, face-to-face relationships between individuals, so that you can have a commitment to this higher group.
00:39:41.000 Totalitarian states face that same dilemma, and that's also why, incidentally, a lot of those states try to reduce gender differences, right?
00:39:51.000 Like, you know, the Mao jacket, the men and women all wearing a similar kind of attire, for instance, because they want people to see themselves as interchangeable and not as individuals, and relationships not to be particular.
00:40:06.000 Did you study cults?
00:40:08.000 A little bit.
00:40:10.000 Not a lot.
00:40:13.000 I talk a little bit in the book about cults, but I don't really need to get to cults in order to make the arguments that I'm making.
00:40:18.000 No, I mean, not even just to make arguments, just to compare, because that is essentially like, in particular, the Rajneesh cult in Oregon, in the Wild Wild Country documentary on Netflix.
00:40:31.000 Did you see that?
00:40:32.000 No, I haven't.
00:40:32.000 It's fantastic.
00:40:33.000 They essentially took over an entire town and started busing in homeless people to vote.
00:41:03.000 They're going to branch off from regular – they're unsatisfied with regular civilization.
00:41:08.000 They're going to all move to some location.
00:41:10.000 Yes.
00:41:11.000 That's a primitive and ancient impulse.
00:41:14.000 Like I was saying, people have been doing that since time immemorial.
00:41:16.000 It's how America got started.
00:41:18.000 Yes.
00:41:18.000 Yes, yes.
00:41:19.000 Screw this.
00:41:21.000 I'm going over there to start again.
00:41:22.000 But again and again, when people do that, they keep expressing some of these fundamental beliefs.
00:41:27.000 Yes.
00:41:27.000 Yes.
00:41:28.000 It's like saying – yes.
00:41:29.000 Anyway, go on.
00:41:30.000 No, please.
00:41:30.000 No, no, no.
00:41:31.000 I don't have anything to add.
00:41:32.000 I mean, I just was reinforcing what you said.
00:41:34.000 But yes, that's right.
00:41:35.000 Well, I'm always fascinated by people that are unhappy with the current state of affairs.
00:41:39.000 They don't like the way society feels to them.
00:41:43.000 They don't feel like they belong and they want to try somewhere else.
00:41:46.000 I mean, and what's really interesting to me is the last time someone did this as a country, as far as I know, is the United States.
00:41:52.000 There really is...
00:41:54.000 I mean, it's also very unique that this is one of the weirdest countries in the world in terms of our ability to freely express ourselves and we have more guns.
00:42:04.000 Well, the thing about America, the American experiment is about the fact that anyone can be an American.
00:42:12.000 My parents immigrated from Greece.
00:42:14.000 I was raised in this country.
00:42:17.000 To be an American means to buy into a certain set of principles like the Bill of Rights.
00:42:23.000 And many other countries are very xenophobic.
00:42:25.000 You can't become Japanese.
00:42:27.000 You can't be naturalized in Japan.
00:42:30.000 I mean, you can, but it's extremely difficult and rare.
00:42:33.000 So it's a very homogeneous country.
00:42:35.000 Switzerland is another country.
00:42:36.000 It's very difficult to become Swiss.
00:42:37.000 You can't be naturalized as a Swiss.
00:42:39.000 I mean, you can, but it's extremely difficult and rare.
00:42:41.000 So – but the United States, you know, we say you are an American.
00:42:45.000 If you – from all the whole world, you're welcome.
00:42:48.000 Bring us your tired, your – you know, the famous saying – I forgot the saying.
00:42:54.000 It's very poetic, on the base of the Statue of Liberty.
00:42:58.000 Your wretched, your forlorn, whatever it is.
00:43:02.000 And you can come to these shores and make your life anew and all you need to do to be an American is to buy into a commitment to constitutional governance, democratic rule, bill of rights and these principles.
00:43:17.000 We should note that there were millions of people that were brought as slaves involuntarily to these shores.
00:43:22.000 We don't always realize our best virtues.
00:43:26.000 We allow people to come like the Irish and treat them as second-class citizens or the Italians or the Greeks even.
00:43:33.000 We don't always do that.
00:43:35.000 But the idea that you're putting on the table, which I think is correct, is that – You can be an American.
00:43:41.000 This is a special, unusual experiment.
00:43:43.000 You can't reinvent yourself quite that way, to my knowledge, in any other colony or country.
00:43:49.000 It's one of the weirdest things that this is a country where anti-immigrant sentiments are running rampant when the entire foundation of the country is based on immigration.
00:43:59.000 That's the only way people got here if you're not a Native American.
00:44:02.000 Yes.
00:44:03.000 It was taken from the Native Americans, and everyone since then is an immigrant or a descendant of an immigrant.
00:44:07.000 That's correct.
00:44:08.000 And even the Native Americans came here from somewhere else.
00:44:09.000 They came 20,000 years ago.
00:44:10.000 Yes.
00:44:10.000 That's absolutely right.
00:44:11.000 The whole thing is crazy.
00:44:12.000 Yes.
00:44:12.000 The whole thing is crazy.
00:44:14.000 Yes.
00:44:15.000 And it's such a unique environment for expression.
00:44:19.000 I mean, there's really no other country that has expression this free.
00:44:23.000 Even the Brits don't.
00:44:24.000 That's right.
00:44:25.000 And Canada certainly doesn't.
00:44:27.000 Correct.
00:44:27.000 And they're our neighbors.
00:44:28.000 Yes.
00:44:29.000 It's a very unusual thing, and this unusual thing is the most recent incarnation of a country.
00:44:37.000 Yes.
00:44:38.000 Yes, I think that's right.
00:44:39.000 And I think there are – you know, then this ties in with a whole set of ideas about American exceptionalism.
00:44:44.000 You know, are we – how different are we?
00:44:47.000 What is the source of our wealth?
00:44:50.000 What is the height of our civilization?
00:44:53.000 You know, I am distressed by some of the direction our country is going in at the moment.
00:44:58.000 But I think in the long arc of history, I think the United States stands for many of the best principles in the world.
00:45:07.000 And I'm prepared to defend those principles.
00:45:10.000 I am too.
00:45:11.000 And I think, like you were saying with your libertarian friend and, you know, someone who may be an anarchist or whatever, there's – There's room for all these weird opinions.
00:45:21.000 Yes.
00:45:41.000 You know, you're going to have certain ridiculous ideas and awful ideas that are amplified in this volume that is an incredible mass of humans.
00:45:51.000 Yes, I think a large – I think that's right.
00:45:53.000 I think our size contributes to, or makes easier, a kind of heterogeneity of ideas.
00:46:00.000 You know, if we were a tiny country, although even in small democracies, you know, like you go to European countries that are tiny, Spain, for example.
00:46:07.000 I mean, it's not tiny, but it's tiny compared to us.
00:46:10.000 You know, there's a lot of difference of beliefs from far left to far right.
00:46:14.000 But I think the key aspect which you were talking about earlier, which again you're highlighting, which I agree with, is that we want an environment in which people can – the ground rules are clear.
00:46:23.000 So, you know, you can't – there's no physical contact allowed, right?
00:46:27.000 So we draw a bright line distinction between words and deeds.
00:46:31.000 So I completely reject the idea that words are violent.
00:46:35.000 Yeah, totally.
00:46:36.000 I totally reject that.
00:46:38.000 And because we have different words for it.
00:46:40.000 They're two different things.
00:46:41.000 Totally different.
00:46:43.000 So ground rules are, you know, I can't touch you, but I can speak.
00:46:47.000 Other ground rules are that we are committed to open expression.
00:46:50.000 A good ground rule would be that we grant positive intent.
00:46:53.000 We grant good intent.
00:46:54.000 That is to say, I try to put what you're saying in the most favorable light.
00:46:58.000 First, I think about it.
00:46:59.000 I say, okay, now wait a minute.
00:47:00.000 What is he saying?
00:47:00.000 What does he mean by that?
00:47:02.000 He might.
00:47:03.000 Now, you may be an idiot.
00:47:04.000 A person may be an idiot.
00:47:05.000 They may be vile.
00:47:06.000 They may be violent.
00:47:07.000 They may be wrong.
00:47:08.000 You know, all of those things are also possible.
00:47:10.000 But that's not the first go-to.
00:47:12.000 So anyway, if we set those ground rules, I think, I believe strongly that in the marketplace of ideas, truth will out and righteousness will out.
00:47:22.000 That's what I think.
00:47:23.000 Maybe I'm wrong.
00:47:24.000 Maybe, in fact, what we need is a benevolent dictator.
00:47:28.000 Yeah.
00:47:32.000 Yeah.
00:47:35.000 Yeah.
00:47:49.000 The situation is not so great.
00:47:50.000 I really wish there was a strong man that would come down and fix it.
00:47:54.000 It's very tempting.
00:47:55.000 This is the inclination towards Trump.
00:47:57.000 I think in some part, yes.
00:47:58.000 Trumpism is a little bit about this fantasy that we will, you know, that the way out is to have a kind of imposition from above.
00:48:07.000 And I think that's very dangerous, actually.
00:48:10.000 And we were talking about earlier in college campuses.
00:48:12.000 It's the same principle, right?
00:48:13.000 Like the idea that Big Daddy is going to come down and tell us what to do and fix the situation, I think is undemocratic in the end.
00:48:22.000 But Big Daddy has to follow the rules that these children want.
00:48:26.000 I mean, this is part of the issue with the idea of words equal violence.
00:48:31.000 I mean, this is not a well-thought-through idea, and this is an idea that is really prevalent.
00:48:36.000 Words can lead to violence.
00:48:37.000 Words can be painful.
00:48:39.000 They can hurt your feelings.
00:48:43.000 They can be unpleasant.
00:48:44.000 All of that is true.
00:48:45.000 But words are different than violence.
00:48:46.000 They just are.
00:48:47.000 And so I think we need to, you know, and in fact...
00:48:50.000 As Jon Haidt and Greg Lukianoff argue, we actually might want to create other reasons to draw the distinction between words and violence and to cultivate an appreciation for that distinction, and that is by allowing people to speak,
00:49:07.000 we may actually reduce violence because we can identify who has these crazy ideas.
00:49:14.000 So if I believe that someone hates people like me... Yeah.
00:49:31.000 Right?
00:49:32.000 Yeah.
00:49:33.000 Yeah.
00:49:36.000 So that's the argument that Lukianoff and Haidt make: a potential additional benefit of creating a free and open marketplace of ideas is that we identify where the crazy is.
00:49:46.000 You know, here are all these people who are talking about the anti-vaxxers.
00:49:48.000 I'd like to know who are the people that hold these beliefs, because as a public health expert...
00:49:54.000 Right.
00:50:05.000 Right.
00:50:08.000 Right.
00:50:10.000 Right.
00:50:15.000 Yeah, and the solution to these bad ideas is for someone to come up and give a better idea.
00:50:22.000 Yes.
00:50:22.000 Someone to debate or to explain what's wrong with it, and to do it in a reasonable manner.
00:50:28.000 When people start shouting and screaming and pulling fire alarms, the idea of silencing people from speaking, that somehow this is going to help, this is also part of deplatforming.
00:50:42.000 Yes.
00:50:42.000 Where people call for deplatforming people.
00:50:44.000 No, I think that's wrong.
00:50:45.000 Even based on just reasonable people with differing opinions.
00:50:49.000 Peter Tatchell is a gay rights activist in England who has gone to prison for his activism, who's been imprisoned in foreign countries for defending gay rights, and he was deplatformed in England a couple of years ago.
00:51:04.000 No, here's the problem with deplatforming.
00:51:06.000 So first of all, it is totally right and appropriate to protest.
00:51:11.000 So if someone is speaking something you don't want, I will strongly defend protest.
00:51:15.000 Stand outside, yell and scream, hold banners up, whatever.
00:51:20.000 You can't interfere with the right of the speaker to express themselves, first point.
00:51:24.000 But even more important, the reason we don't want that is not so much because we're interested in the right of the speaker.
00:51:32.000 It's because we're interested in the rights of the listeners.
00:51:35.000 The people who want to listen to that person have a right to listen to that person in a free society.
00:51:47.000 I am interfering with the ability of all the people who want to hear you to hear you.
00:51:51.000 It's their rights that matter too.
00:51:53.000 So if we – the deplatforming, it's not about, oh, so-and-so was unable to speak at such and such a place.
00:52:00.000 It's the fact that all the people that wanted to hear so-and-so were deprived of their opportunity to do so.
00:52:05.000 So I think the answer to words we do not like, the answer to speech we do not like is more speech.
00:52:11.000 It's not silencing.
00:52:12.000 Yeah, and there's also the obvious situation you put someone in when you do attempt to silence them.
00:52:19.000 You put them under duress, and their message changes.
00:52:22.000 You make someone more combative.
00:52:24.000 And this has often been the argument for why Trump became president in the first place, that people were tired of the argument on the other side.
00:52:34.000 Yeah.
00:52:35.000 I mean, I'm not a political scientist, and I follow that literature a little bit.
00:52:39.000 I think there is a strong argument that that is one of the factors that contributed to Trump's success.
00:52:45.000 Let's keep in mind, however, that the majority of Americans voted for Hillary Clinton, and I think the majority of Americans didn't vote.
00:52:53.000 Correct.
00:52:54.000 But of the people who voted, yes, that's right.
00:52:56.000 That's another whole problem.
00:52:58.000 But 63 or so million, I mean, about 3 million more people voted for Hillary Clinton than voted for Donald Trump nationally.
00:53:05.000 So, I forgot how we got onto him.
00:53:08.000 What were you saying?
00:53:09.000 We were talking about people wanting to silence people, the forcing of political correctness, and the rebounding of that is the reinforcing of someone who comes along like Trump.
00:53:21.000 Yes, I would agree with that.
00:53:23.000 And I think that that's another, you know, that sort of is a variant of the argument we were discussing earlier, which is that one of the advantages of creating a free and open society is that you allow, you know, live and let live.
00:53:34.000 And then you don't, you tend to, you avoid creating kind of suppressed animosities or you can help to avoid it.
00:53:42.000 Yeah, this open communication is so critical, and it's also critical to have reasonable, polite conversation.
00:53:51.000 Like, people can oppose each other in their idea, but you should be able to express how and why you oppose that idea without it being this sort of personal vendetta.
00:54:04.000 Yes, I agree with that.
00:54:05.000 I mean, you know, I think we have to accept that there will be – people will get angry.
00:54:09.000 I mean, that's part of having an open society, and I think we need to accept – It's part of being a person.
00:54:13.000 Yes, and I think we need to accept that some people – not everyone – will engage in discourse the way you and I might want to engage in that discourse.
00:54:23.000 But I do agree with you completely that ideally we would have a kind of civilized conversation that allowed us to learn and to grow.
00:54:31.000 And I think ultimately that, as we've been saying, is better for our society as well.
00:54:35.000 Well, I think we should acknowledge that people are going to be upset, but we should also applaud people for not being upset.
00:54:43.000 I think there's a higher value to people being able to communicate reasonably.
00:54:48.000 Yes, I agree.
00:54:49.000 I don't think that that's reinforced enough, and I don't think that's appreciated enough.
00:54:54.000 You won't get any disagreement from me on that, yes.
00:54:57.000 Yeah, I mean, I just think this is something that we can do.
00:55:02.000 Yes.
00:55:02.000 And we can get better at it.
00:55:03.000 Well, I think...
00:55:04.000 It's like martial arts training.
00:55:06.000 You know, I think that self-discipline is not an easy thing, Joe.
00:55:09.000 And like anything else worth doing in life, like basically anything worth doing takes effort.
00:55:15.000 It's tempting.
00:55:16.000 It's the go-to strategy that many people have. So I think it's important to note that free speech is difficult and it's not an easy thing.
00:55:24.000 It's a natural inclination to want to silence your opponents.
00:55:28.000 But it's wrong, and it's harmful, and it's actually harmful to you to do that.
00:55:33.000 So I think we need to have an educational system that cultivates that, that cultivates the capacity to tolerate an idea that you don't like, to think about that idea, and then to respond to that idea.
00:55:49.000 So I guess what I'm saying is it does require some training.
00:55:54.000 It doesn't come naturally, unfortunately.
00:55:56.000 Yeah.
00:55:56.000 But it should be reinforced.
00:55:57.000 And I think there's a way to do that.
00:56:00.000 And there's a way to appreciate that.
00:56:01.000 And there's a way to call that out when you see it.
00:56:05.000 I think the world needs more of it.
00:56:07.000 And if we can figure out a way to do that, we will find that our differences are not nearly as egregious.
00:56:14.000 They're not nearly as disgusting as we like to think they are.
00:56:17.000 Well, that's exactly what I argued in Blueprint. You know, when you go to a foreign country, initially you're overwhelmed by the different food and the different smells and the different architecture.
00:56:26.000 And anyone who's traveled even to a different state has had this experience.
00:56:30.000 And yet, actually, once you get to know the people, you see that they're very human.
00:56:34.000 They're like us.
00:56:35.000 They love their partners and they hang out with their friends and they work together to build a civilization and a society and they have schools and they teach and they learn and they do all of these basic things that are a fundamental part of our common humanity.
00:56:48.000 And this is what I talk about in Blueprint at Length.
00:56:51.000 You know, like I just – I think it's – I think there's a kind of flawed beauty to the world that captivates me.
00:57:03.000 And it's a little bit on the – there's this aesthetic tradition in Japan and a philosophy called wabi-sabi.
00:57:09.000 Do you know what wabi-sabi is?
00:57:11.000 No.
00:57:11.000 You probably know about it, but you may not know the word.
00:57:13.000 I've heard it.
00:57:14.000 I can't remember.
00:57:15.000 Do you know like how – like the Western aesthetic for pottery is like these perfectly symmetrical, beautifully glazed pots.
00:57:21.000 But there's a tradition in Japan of slightly imperfect pots, like a cracked pot or a pot that's slightly misshapen.
00:57:29.000 It's very difficult, even for the masters, to make these pots.
00:57:31.000 And it's called wabi-sabi and it's about how imperfections, a kind of beauty of imperfection, a kind of flawed beauty.
00:57:40.000 Like a hot girl with a gap in her teeth.
00:57:42.000 Yes, I suppose.
00:57:43.000 Yeah.
00:57:47.000 Yes, I suppose that could be an example of that.
00:57:51.000 Or, you know, Elle MacPherson famously had that little – was it her?
00:57:56.000 I forgot which famous model it was that had – Cindy Crawford.
00:57:57.000 Cindy Crawford, yeah, had that famous mole on her face.
00:58:02.000 So it's a flawed beauty.
00:58:03.000 So here's the point.
00:58:05.000 It's not hard to look around the world and see the violence and the murder and the warfare and the incompetent leadership and all of these awful things about our species.
00:58:15.000 But we're really a fucking unbelievable species, actually, who do amazing things when you compare us to other species.
00:58:22.000 And there's a kind of flawed beauty to us.
00:58:24.000 And I think that it's wrong to be seduced to the dark side, you know?
00:58:30.000 It's wrong to like only focus on the bad stuff.
00:58:34.000 I also think it's a kind of moral and philosophical laziness, right?
00:58:37.000 If we allow ourselves to just think that, oh, you know, people are awful – It kind of relieves us of any duty to be good and to work to make the world better.
00:58:48.000 It's a kind of, you know, surrender to the dark side.
00:58:51.000 I think that's wrong.
00:58:52.000 And the book shows exactly how and why that's wrong and how natural selection has shaped all these wonderful qualities which are shared the world over.
00:59:01.000 So you go to the foreign country, you're initially perplexed by their crazy practices, and then slowly but surely you find our common humanity.
00:59:10.000 And anyway, I find that...
00:59:12.000 It's pleasing, at least to me, that perspective.
00:59:15.000 You know, that's one of the cool things about travel, right?
00:59:17.000 You broaden your perspective and your understanding of what it means to be a person.
00:59:21.000 Go to these different environments.
00:59:22.000 And yourself, too, yeah.
00:59:23.000 Yeah, they're different foods or different art.
00:59:46.000 And then after a while, you start to say, hey, actually, this is pretty good.
00:59:49.000 You know, this is not a crazy thing after all.
00:59:51.000 They put pine resin in their wine?
00:59:53.000 Yes.
00:59:54.000 You know, the first time I had Scotch whiskey, I didn't know what I thought about it.
01:00:01.000 And now I love whiskey, right?
01:00:03.000 It's an acquired taste.
01:00:04.000 So the first time you drink something like that, you think, you know, yes, they put resin.
01:00:07.000 They put pine resin in their white wine.
01:00:09.000 They chill it.
01:00:10.000 I should have brought you some.
01:00:11.000 Maybe I'll send you some.
01:00:12.000 You know what's not an acquired taste?
01:00:12.000 What's not?
01:00:13.000 Ouzo?
01:00:13.000 Kool-Aid.
01:00:14.000 Kool-Aid.
01:00:14.000 It's delicious.
01:00:15.000 From the beginning, yeah.
01:00:16.000 Just right out of the jump.
01:00:18.000 It's cold.
01:00:19.000 It's just so good.
01:00:20.000 Yes, yes, yes.
01:00:22.000 You don't have to convince anybody.
01:00:24.000 Yes, that's right.
01:00:25.000 That's right.
01:00:26.000 Some things are just good right out of the box.
01:00:29.000 They're just good.
01:00:30.000 Kool-Aid is just good.
01:00:32.000 I mean, I don't recommend you drink it all the time.
01:00:33.000 It's full of sugar.
01:00:34.000 It's terrible for you.
01:00:36.000 Damn, that stuff tastes good.
01:00:37.000 It's like fried foods, you know?
01:00:40.000 Yeah, sure.
01:00:41.000 Just yummy.
01:00:42.000 A lot of them.
01:00:43.000 Yeah, French fries.
01:00:44.000 Yes.
01:00:44.000 I mean, come on, man.
01:00:45.000 Salt and ketchup?
01:00:46.000 You don't like that?
01:00:46.000 Yes, come on.
01:00:47.000 How could you not like that?
01:00:48.000 No, I'm not sure whether I should mention this, but anyway, I love Popeye's fried chicken.
01:00:54.000 I do as well.
01:00:55.000 I love it so bad.
01:00:56.000 It's awful.
01:00:57.000 It's terrible for you.
01:00:58.000 My wife is unlikely to listen to this full podcast, or I'll skip over this part so she doesn't hear it, but...
01:01:04.000 And my sister will be listening, probably, and so she will laugh when she gets to this part because whenever I see a Popeye's, I just pull over and indulge myself.
01:01:14.000 Terrible.
01:01:15.000 Popeye's is good, but you know what?
01:01:16.000 If you really want to indulge and you like chicken, Roscoe's.
01:01:20.000 Is that here in LA? I haven't heard of it.
01:01:22.000 Oh, you don't know.
01:01:23.000 No, I don't know.
01:01:24.000 Roscoe's, chicken, and waffles.
01:01:26.000 Dude, I tried to go there the other day with my family.
01:01:29.000 Don't you make that face.
01:01:31.000 I tried to go there the other day with my family on a Sunday.
01:01:33.000 There was an hour and a half wait.
01:01:35.000 On the same plate?
01:01:36.000 On a Sunday.
01:01:36.000 Chicken and waffles?
01:01:37.000 Of course on the same plate!
01:01:39.000 What are you, a communist?
01:01:41.000 Yeah, man, it's an L.A. tradition.
01:01:42.000 Do the waffles have syrup on them, too?
01:01:43.000 Hell yeah, and butter.
01:01:46.000 And the chicken is fried?
01:01:47.000 Yes, perfectly.
01:01:50.000 It's damn delicious.
01:01:52.000 Okay, I'm going to open my mind.
01:01:52.000 You sound like you're from another planet.
01:01:54.000 I make maple syrup.
01:01:56.000 I live in Vermont, and I make maple syrup.
01:01:58.000 I tap my own trees.
01:01:59.000 Wow, what a freak.
01:02:00.000 Yeah, exactly.
01:02:01.000 You're giving people a hard time for waffles and chicken together?
01:02:04.000 Exactly.
01:02:04.000 The colonel has it now.
01:02:06.000 The colonel's a liar.
01:02:07.000 He does not have it.
01:02:08.000 Get him out of here.
01:02:09.000 That's not even the Colonel.
01:02:10.000 The Colonel's Norm Macdonald.
01:02:12.000 I saw him.
01:02:13.000 That is not...
01:02:13.000 No, they do not have real...
01:02:14.000 There's one Colonel.
01:02:15.000 It's Norm Macdonald.
01:02:16.000 It's a temporary promotion or something they have.
01:02:18.000 This is nonsense.
01:02:20.000 Yeah, but they do not...
01:02:20.000 Oh, my God.
01:02:22.000 Chicken and waffles.
01:02:22.000 And the syrup goes on top of the chicken?
01:02:24.000 Yeah, that's so good.
01:02:25.000 That waffles and chicken tastes like cat litter compared to Roscoe's.
01:02:29.000 There's also Sweet Chick LA he could try, too.
01:02:30.000 Get this out of here.
01:02:33.000 All that stuff can go fuck off.
01:02:36.000 Roscoe's.
01:02:36.000 Chicken and waffles.
01:02:37.000 And you get the greens too.
01:02:38.000 The collard greens.
01:02:39.000 I like collard greens.
01:02:40.000 Yeah, that's fine.
01:02:41.000 Damn good.
01:02:42.000 That's fine.
01:02:42.000 But that's a waste of maple syrup.
01:02:44.000 Take it from a man who makes it to put it on chicken.
01:02:46.000 I'm sure Sweet Chick is good.
01:02:47.000 It's Nas's place.
01:02:48.000 I get it.
01:02:49.000 I love it.
01:02:49.000 I've been there.
01:02:50.000 It's great.
01:02:51.000 Roscoe's.
01:02:52.000 Okay, I'll have a look.
01:02:53.000 Everybody can fuck off.
01:02:53.000 There's a reason why there's an hour and a half wait on a Sunday.
01:02:56.000 Okay.
01:02:56.000 How long are you in town for?
01:02:58.000 Just a day.
01:02:59.000 Maybe I'll be back, though.
01:03:00.000 I'll be back in a couple of weeks.
01:03:00.000 Just shoot over there right after the show.
01:03:02.000 All right.
01:03:02.000 Maybe I'll go there for lunch.
01:03:03.000 Go to the one on Gower.
01:03:04.000 Okay.
01:03:05.000 Ooh, man, it's good.
01:03:06.000 All right.
01:03:06.000 Maybe I'll try that.
01:03:07.000 It's so good.
01:03:08.000 And it's also one of those places that's been there forever.
01:03:10.000 We used to get it.
01:03:11.000 I found out about it in 1994 when I was doing NewsRadio.
01:03:15.000 95-ish, I guess.
01:03:17.000 Have you been in L.A. since you left Massachusetts?
01:03:20.000 No, I went to New York for a couple years and then I moved out here.
01:03:22.000 I moved out here in 94. Okay, so you've been here a long time.
01:03:24.000 Yeah, and when I was on NewsRadio, you could order lunch, and someone ordered Roscoe's chicken and waffles.
01:03:30.000 So you became addicted.
01:03:31.000 And I was like, what is this?
01:03:32.000 Like, waffles? I was just like you.
01:03:34.000 Waffles and chicken, there it is.
01:03:36.000 That doesn't look as good as it is when you're there.
01:03:39.000 When you're there and you smell it, it's damn good.
01:03:42.000 Sorry, I'm doing a Roscoe's commercial here, but I'm a fan.
01:03:45.000 That's an incredible combination of items, I just have to say.
01:03:47.000 It's so good.
01:03:49.000 It's so good.
01:03:50.000 And afterwards, you better have nothing to do, man, because you're going into a food coma, son.
01:03:57.000 Anyway, how do we get to that?
01:04:00.000 I don't know.
01:04:00.000 I was going to tell you maple syrup stories.
01:04:02.000 Oh, yeah.
01:04:02.000 Going to, yeah.
01:04:03.000 Different countries.
01:04:04.000 Different countries and opening your mind.
01:04:05.000 So you are counteracting my resin-flavored white wine with the maple syrup and the fried chicken.
01:04:13.000 Well, I'm a giant fan of spicy food.
01:04:15.000 I love spicy food.
01:04:16.000 So I really, really enjoyed Thailand.
01:04:19.000 I really enjoyed their style of cooking and their kind of food.
01:04:23.000 Are you one of those people who...
01:04:30.000 I like habanero.
01:04:34.000 I like things pretty spicy compared to the average person, but I have friends that put me to shame.
01:04:39.000 I have a buddy of mine that I used to do Fear Factor with, my friend Tommy Hershko.
01:04:43.000 Shout out to Tommy.
01:04:44.000 And I used to eat, I ate chili with him.
01:04:47.000 I couldn't fucking believe how hard he could go.
01:04:51.000 I'm like, this is crazy.
01:04:52.000 I think people just have a different inherent, like, it's almost like built into their body.
01:04:58.000 It's both, I think.
01:04:58.000 It's both.
01:04:59.000 Some people are better able, it's like some are faster runners than others.
01:05:02.000 But it's also training.
01:05:03.000 So you slowly work your way up to being able to tolerate and like those really super hot peppers.
01:05:09.000 I find it very unpleasant.
01:05:11.000 I have a friend just like you who really is into it, like really seeks out the hotness.
01:05:15.000 I also think it's a little bit like addiction.
01:05:18.000 Like you tolerate like...
01:05:19.000 As you get used to the less hot stuff, now you need more and more stuff in order to get the same.
01:05:24.000 It's not a high, exactly.
01:05:26.000 Some people think it's a high, by the way.
01:05:27.000 There's a little high to it.
01:05:29.000 Some people say that.
01:05:30.000 Again, it's not for me.
01:05:31.000 I cook meat with jalapenos.
01:05:33.000 I slice it up, and I'll have a piece of the meat with the jalapenos, especially elk with jalapenos.
01:05:39.000 It's sensational.
01:05:40.000 It's so good.
01:05:42.000 Yes.
01:05:43.000 Yeah, but my kids always make fun of me because I'm bald, so my whole head is covered with sweat, and they come over and wipe my head, and they're like, look at you, you're so gross!
01:05:53.000 How old are your kids?
01:05:54.000 The youngest ones are eight and ten.
01:05:56.000 And you have how many?
01:05:57.000 Three.
01:05:58.000 I have three.
01:05:58.000 All daughters.
01:06:00.000 All daughters.
01:06:00.000 I have a 22-year-old, a 10-year-old, and an 8-year-old.
01:06:02.000 You'll live longer with daughters.
01:06:05.000 Really?
01:06:05.000 If you plot dad's survival on the y-axis and fraction of female children on the x-axis, survival is slightly longer for men who have a higher fraction of daughters as children.
01:06:16.000 I think it's because boys drive you to your fucking grave because they're so goddamn crazy.
01:06:20.000 There's lots of theories as to why it happens, and that is, in fact, one of them.
01:06:24.000 It's framed a bit more scientifically than that.
01:06:26.000 But that's the basic theory.
01:06:28.000 My 10-year-old is a maniac.
01:06:29.000 My 10-year-old daughter's – and I just imagine if she was a boy, I'd be terrified that she'd be just lighting things on fire and blowing up buildings.
01:06:37.000 Yes.
01:06:37.000 Yeah.
01:06:38.000 Boys are a problem.
01:06:39.000 It can be.
01:06:40.000 I mean, I think – I mean, I think it's – I think the – Well, I mean, we could get onto the whole gender issue.
01:06:53.000 I'm not sure we want to, but I think boys are responsible.
01:06:57.000 Let's talk about chimpanzees.
01:06:58.000 It's easier.
01:06:59.000 Male chimps do most of the violence.
01:07:02.000 About 95% of the violence and the murders are committed by male chimpanzees, and most of the victims are males.
01:07:09.000 And I think there is no doubt that biology plays a very important role in male proclivity to violence, for example.
01:07:18.000 So they are trouble.
01:07:19.000 So boys can be a problem that way.
01:07:21.000 And I think many of the ways in which society works – the cultural traits that we invent – their purpose is to shape and guide those tendencies to violence, to kind of mitigate them.
01:07:34.000 But we don't just need – again, going back to the book, we don't just need – we don't just use culture for that purpose.
01:07:42.000 There's an argument in the book that we humans have domesticated ourselves.
01:07:46.000 So if you look at – if you compare dogs to wolves, and domesticated cats to the wild cats from which they descended, or guinea pigs to the wild guinea pigs from which they descended, or horses to the wild horses from which they descended.
01:08:30.000 But those animals were domesticated by humans.
01:08:35.000 Like, I deliberately allowed the reproduction of this member of the litter and not that member because this member was nicer.
01:08:43.000 And so across time, we evolve a more domesticated version of the ancestral species.
01:08:48.000 So we get my miniature dachshund from a wolf, like the kind of things that were photographed out in your studio here.
01:08:56.000 Crazy transition.
01:09:00.000 Now, if you look at humans and you compare us to our ancestors or to other primates, for all the world it looks like we have been domesticated.
01:09:13.000 We are more peaceful and placid.
01:09:14.000 We have sex outside of reproduction.
01:09:16.000 Non-reproductive sex is another thing.
01:09:18.000 So these domesticated animals will have sex even when it's not time to reproduce.
01:09:24.000 Our tails – we don't have tails anymore, but our tails got shortened.
01:09:31.000 There are all these features that we have, these behavioral qualities and these physical properties that we have.
01:09:38.000 We get a feminization of our faces.
01:09:40.000 Our jaws become smaller.
01:09:42.000 Like if you look at, you compare these domesticated animals to their non-domesticated ancestors, the domesticated versions are less violent.
01:09:50.000 So we lose a lot of the physical and psychological traits associated with violence.
01:09:57.000 But there was no one that domesticated us.
01:09:58.000 So the theory is, the question is, how?
01:10:00.000 How did that happen?
01:10:01.000 And one of the theories that's discussed in Blueprint, and that's advanced by other scientists – this is not my work – is that we self-domesticated.
01:10:10.000 And what happened over the millennia, over millions of years, is that the weaker individuals in our groups, when one individual became too autocratic and too violent and too powerful,
01:10:25.000 they banded together and killed that guy.
01:10:28.000 And so, over time, we were killing the more violent members of our species, weeding out those people.
01:10:35.000 And therefore, the gene pool changed across time and we self-domesticated.
01:10:40.000 We are more peaceful today than we would have been because we domesticated ourselves.
01:10:46.000 And this is one of the arguments that's also made to help explain the origins of goodness, actually.
01:10:51.000 And the origins of cooperation, because it would take a few good people to kill the bad person that's running everything that's evil.
01:10:58.000 Correct.
01:10:59.000 That's exactly right.
01:11:00.000 Recreational sex does occur in bonobos, which is really weird, isn't it?
01:11:05.000 Because they're so similar to regular chimps.
01:11:07.000 Yes, but they're not the same species.
01:11:09.000 They also have homosexual sex.
01:11:11.000 They use sex to make up.
01:11:13.000 So yeah, they're a very licentious species.
01:11:18.000 That's exactly right.
01:11:19.000 And bonobos are felt to be a self-domesticated chimpanzee.
01:11:24.000 So bonobos are to chimps as, let's say, dogs are to wolves.
01:11:30.000 But the dogs we domesticated.
01:11:32.000 The bonobos self-domesticated is the theory.
01:11:34.000 Do they know why or how?
01:11:37.000 Well, the theory is that they did it, like we were saying, by weeding out, killing the more aggressive members.
01:11:42.000 What we know must have happened is that the nicer guys must have been able to have more offspring.
01:11:48.000 So the gene pool changed over time because of the differential success of the nicer guys.
01:11:55.000 Now, people have looked at this even in human societies.
01:11:57.000 They've looked, for instance – there's a study I talk about in the book of different pathways to reproductive success amongst the Tsimane, which is a group in Amazonia – and other societies are similar.
01:12:10.000 So you can either be, like, big and strong, or you can be charismatic and have useful knowledge.
01:12:17.000 In both ways, you have more children.
01:12:21.000 So there are these competing ways in our species of enhancing your reproductive fitness.
01:12:28.000 Are you aware of Sapolsky's work with baboons?
01:12:32.000 That's a fascinating case, right?
01:12:34.000 Because they were studying baboons in Africa that would eat from human garbage.
01:12:39.000 And a bunch of them got sick and died.
01:12:42.000 And it turns out that the most violent and ruthless of them got sick and died, and it changed the entire culture of the baboon tribe.
01:12:50.000 Oh, I don't know that story.
01:12:51.000 That's interesting.
01:12:52.000 Oh, it's a fascinating one.
01:12:52.000 They started grooming each other and being kind to each other.
01:12:55.000 Oh, my God.
01:12:56.000 Yeah, that's a good example.
01:12:57.000 But it was accidental.
01:12:59.000 It was accidental.
01:12:59.000 It was accidental, but it lasted for generations.
01:13:02.000 And when he returned to study them, he found that they were still this different kind of baboon tribe.
01:13:07.000 Oh, I think I did read about this a little bit.
01:13:09.000 Yeah.
01:13:09.000 I'm doing a shitty job, I'm sure, of explaining it, but I love that guy.
01:13:13.000 I'm so fascinated by that guy's work.
01:13:15.000 Yes, he's very impressive.
01:13:17.000 And I know, now that you're reminding me, I'm a little familiar with that particular study.
01:13:23.000 I didn't know that it started with garbage, however.
01:13:25.000 But it was a coincidental extermination of the more violent members of the troop.
01:13:30.000 Yes.
01:13:30.000 Yeah, so they were removed from the gene pool.
01:13:32.000 And it changed the entire culture to the point where generations later, they were still using this...
01:13:40.000 Peaceful, yeah.
01:13:42.000 More kind.
01:13:43.000 Well, it didn't just change the culture.
01:13:45.000 It may have changed the culture, but it appears, we're arguing, to have changed the gene pool.
01:13:49.000 It's like an evolutionary pressure that's been applied.
01:13:51.000 So you have big dogs and small dogs.
01:13:54.000 You don't allow the big ones to reproduce.
01:13:55.000 You just reproduce the small ones.
01:13:57.000 You get small dogs in the end.
01:13:58.000 Well, I've had dogs my whole life, and one of the things that you do realize – What kind do you have?
01:14:03.000 Right now, I have a golden retriever.
01:14:04.000 Yeah, we have a white lab.
01:14:07.000 Yeah.
01:14:08.000 A yellow lab.
01:14:09.000 And I've had a bunch of different dogs.
01:14:11.000 I've had mastiffs and pit bulls and German shepherds.
01:14:13.000 No, small.
01:14:14.000 We have a dachshund, too.
01:14:15.000 You don't have small dogs?
01:14:16.000 My oldest daughter has a tiny chihuahua.
01:14:18.000 They're a pain in the ass, aren't they?
01:14:19.000 No, he's the best.
01:14:20.000 I love him.
01:14:20.000 They just bark all the time, though.
01:14:21.000 No, he doesn't.
01:14:22.000 He doesn't bark that much.
01:14:22.000 He barks a little bit, but he's really smart.
01:14:24.000 He's actually a mutt.
01:14:25.000 He's Chihuahua and Australian Shepherd, but he's like that big.
01:14:29.000 He's a tiny little thing.
01:14:30.000 He's the best.
01:14:31.000 But my point is that, if you get a dog from a breeder, you really can see how they can cultivate certain types of behavior.
01:14:42.000 Like a good example of my Mastiff who passed away this year.
01:15:09.000 Purposely, any time a dog showed any aggression towards people or towards other dogs, the breeder wouldn't let them breed.
01:15:14.000 So how can anyone hear stories like that or know stories like that and not then also think that genes play a role in human behavior?
01:15:21.000 Oh, you have children?
01:15:22.000 Yes.
01:15:22.000 You realize it when you have children.
01:15:24.000 You see it like, okay, this is not...
01:15:26.000 I didn't do this.
01:15:27.000 This comes from me.
01:15:29.000 There's certain traits that my children have that I watch and I go, okay, this is not...
01:15:35.000 I didn't teach them this.
01:15:36.000 They just started this way.
01:15:37.000 They were born this way.
01:15:38.000 They've got my fucked up brain.
01:15:40.000 There's something in there.
01:15:41.000 They don't see how crazy I am in terms of how hard I work at things, how obsessive I get with things.
01:15:49.000 They're just doing it.
01:15:50.000 It's very weird.
01:15:51.000 Mm-hmm.
01:15:52.000 It's very weird because you see, you go, oh, well, okay, well, how much of this shit that's in me is, well, how much of me is me deciding to be this person, and how much of me has no choice?
01:16:03.000 About half and half, I would say, overall, on average, across traits.
01:16:07.000 How much do you think gets passed down through genetics in terms of inclinations, like the nature?
01:16:15.000 Dispositions.
01:16:15.000 Yes.
01:16:16.000 About half.
01:16:17.000 On average.
01:16:18.000 So, for example, about half the – you know, how religious you are or how risk-averse you are.
01:16:23.000 Like I can – about half the variation in how – if you look at a group of people and some are more risk-averse than others, about half of that has to do with their genes and half has to do with how they were raised or what environments they grew up in.
01:16:36.000 So, you know, there's a kind of innateness to many of our qualities, and you can shape them.
01:16:41.000 You know, for example, you couldn't make me a musician, unfortunately.
01:16:44.000 I have almost no musical talent.
01:16:46.000 I can dance, I think.
01:16:47.000 I mean, I think others would even say that I can do that.
01:16:50.000 So it's not just, like, I think I can dance, but I can't.
01:16:53.000 But I have no musical ability whatsoever.
01:16:55.000 I would say I'm tone deaf, and, you know, I can appreciate music, but I can't produce it.
01:17:01.000 There's no way you could train me, I don't think, to be a musician.
01:17:06.000 But, so some of it is inborn and some of it is taught for all of these qualities.
01:17:11.000 Yes.
01:17:12.000 It's a fascinating thing to watch it emerge from a child, isn't it?
01:17:16.000 Yes.
01:17:16.000 As a parent, you see where it comes from.
01:17:18.000 Although – we have adopted children in my family: my mother had three biological children, and I have two adopted siblings.
01:17:25.000 I come from actually a multiracial family.
01:17:27.000 I have a black sister and a Chinese brother.
01:17:49.000 You see the kind of inherited traits that these people – that we all – have.
01:17:54.000 And you see the shaping by how you're raised.
01:17:57.000 So both are important.
01:17:59.000 And this is incidentally why, if you ever have anyone, it's not nature or nurture.
01:18:03.000 It's both.
01:18:04.000 Always.
01:18:05.000 Almost in every single trait, actually.
01:18:07.000 Well, that's the case of so many things in this life.
01:18:10.000 We want everything to be binary.
01:18:12.000 Yes.
01:18:13.000 It's nuts.
01:18:15.000 We were talking earlier, it's a total loss of nuance and an inability to see any gray.
01:18:21.000 And some people think, and I think that's what you were talking about, some people think that we are hardwired to like dichotomies.
01:18:30.000 To see, you know, male and female and up and down and good and evil and left and right and to simplify the world by finding out that we like it, that it's soothing to us to think that the world can be divided into two categories.
01:18:44.000 But in fact, many times, not always, like up and down is sort of clear, but many times it can't.
01:18:51.000 There's shades of grey.
01:18:52.000 And it's harder.
01:18:54.000 That's harder to live in the grey, actually.
01:18:56.000 Yes, I completely agree.
01:18:57.000 And that's why I've always been opposed – I mean, I think it's incredibly foolish to deny that, but people find comfort in denying that.
01:19:05.000 They find comfort in being tribal.
01:19:07.000 They find comfort – Us and them.
01:19:09.000 Yeah, us versus them is the classic, right?
01:19:12.000 Yes.
01:19:12.000 Yes.
01:19:13.000 Yes, it's a simplified view of the world, and it's foolish and dangerous, actually.
01:19:18.000 Yeah.
01:19:21.000 Sometimes you're at war with an enemy.
01:19:24.000 It's me or him or us or them.
01:19:27.000 There are circumstances in which it's a difference.
01:19:29.000 For survival.
01:19:29.000 Yes, for survival.
01:19:31.000 In that mode.
01:19:31.000 Yes.
01:19:32.000 Yeah, I get it.
01:19:33.000 But I think a kind of worldview which says we are good, they are evil, as we've been saying in different kind of ways in different parts of our conversation, is I think foolish and wrong and ultimately self-injurious actually.
01:19:49.000 So – We used to have – I know you've done martial arts.
01:19:52.000 I spent years training in Shotokan karate, a very traditional Japanese style, which I loved.
01:19:59.000 I'm sure you've had the same thing.
01:20:00.000 You actually are grateful to your opponent.
01:20:03.000 You bow to your opponent.
01:20:04.000 You say thank you to your opponent, right?
01:20:06.000 Sure.
01:20:06.000 The opponent is necessary for you to learn.
01:20:08.000 Oh, yeah.
01:20:09.000 I mean, this is the whole point.
01:20:11.000 Not just your opponent, training partners.
01:20:13.000 Of course, yes.
01:20:13.000 You want people to be able to beat you.
01:20:15.000 Yes, yes.
01:20:15.000 You get better.
01:20:16.000 That's exactly right.
01:20:18.000 So this is, you know, I think that the kind of that aspect of that kind of training is a life lesson as well, right?
01:20:25.000 The capacity to see that, and the same happens with ideas.
01:20:29.000 How do my ideas get better?
01:20:30.000 How do I discover in my laboratory new knowledge?
01:20:33.000 I discover it against opposition, right?
01:20:35.000 Someone says, you're wrong about that.
01:20:37.000 It's not true.
01:20:38.000 And I'm like, oh yeah, let me prove it to you.
01:20:41.000 Here's what I'm going to go back and do more experiments and come back to you with more arguments and more data and show you that actually I'm right about this.
01:20:47.000 Or not!
01:20:48.000 You go back to your lab and you're like, oh shit, they were right.
01:20:52.000 You know, we were wrong.
01:20:53.000 So that's the way you uncover truth, right?
01:20:56.000 It's the way you get to more perfection.
01:20:59.000 It's the kind of yin and yang, actually.
01:21:02.000 So yes, I think that this simplification of the world to think of, you know, I'm good and you're evil, really misunderstands in many,
01:21:18.000 not all, but in many circumstances, what's happening.
01:21:20.000 And also it brings back this problem that human beings have always had with ego and this need to be right and that identifying yourself in each individual discussion and debate and battle and needing to triumph.
01:21:36.000 And even though you desire to be correct, you have to understand when you are not.
01:21:40.000 And you have to appreciate someone who shows you that you are incorrect because they are allowing you to grow.
01:21:46.000 You're not a finished product.
01:21:47.000 There's no way you can be.
01:21:48.000 Yes.
01:21:49.000 I think that's why I like arguing with people I disagree with because that's when I learn more stuff.
01:21:53.000 If I talk to people I agree with, I don't learn as much.
01:21:56.000 So you get together with that Private Roads dude.
01:21:58.000 That dude and some other dudes.
01:22:00.000 I just came from his house and he's crazy.
01:22:03.000 But anyway, he'll laugh.
01:22:04.000 He will listen to this and he'll be laughing right now.
01:22:07.000 What does he do for a living?
01:22:08.000 He's a financier.
01:22:09.000 There he goes!
01:22:10.000 No fucking way.
01:22:12.000 Goddamn, that's cliche.
01:22:14.000 Yes, exactly.
01:22:15.000 That's hilarious.
01:22:16.000 That is hilarious.
01:22:18.000 So, it is hilarious.
01:22:21.000 It's like a pro-gun mercenary.
01:22:23.000 Yeah, exactly!
01:22:25.000 That's a surprise!
01:22:27.000 Yeah, who saw that coming?
01:22:29.000 That's really funny.
01:22:31.000 Hold on, I was going to say something to you about...
01:22:33.000 Arguing with people you disagree with.
01:22:36.000 Hold on, I lost the train.
01:22:37.000 I thought I had said before we talked about my friend.
01:22:39.000 You learn from them.
01:22:41.000 Anyway, I lost the train there, but...
01:22:42.000 No worries.
01:22:43.000 Yeah.
01:22:44.000 So, yeah, I mean, that's another issue that I've faced with this podcast, where people get upset at me for having people on that have opinions that they disagree with.
01:22:54.000 That's nuts.
01:22:55.000 Yeah, they think that you're doing a disservice by providing a platform.
01:22:59.000 That's nuts.
01:23:00.000 That phrase they keep saying, platform, giving them a platform.
01:23:04.000 No.
01:23:05.000 I think you have power, which you should use wisely.
01:23:09.000 I have power.
01:23:09.000 I should use it wisely.
01:23:10.000 We all have some power in some parts of our lives.
01:23:12.000 And I think it is okay to say you have some power.
01:23:16.000 You do.
01:23:17.000 You have lots of millions of listeners.
01:23:19.000 People respect you.
01:23:22.000 Lots of people, presidents, CEOs, people of power.
01:23:24.000 But the idea that by talking to someone, you are somehow abusing that power, that's crazy to me.
01:23:31.000 In fact, quite the opposite.
01:23:32.000 I think that you are shining bright light of day onto ideas.
01:23:38.000 Let people discuss them.
01:23:39.000 Let's – It's also quite schizophrenic.
01:23:42.000 I mean, have you ever seen when a schizophrenic person draws these connections where they have one person and that person met this other person and that person used to work with this other person and that person met Hitler?
01:23:53.000 Yes.
01:23:53.000 So you know Hitler.
01:23:54.000 Yes.
01:23:55.000 Have you ever seen those?
01:23:56.000 Yes.
01:23:56.000 It's really similar in the same sort of a way.
01:23:58.000 Yes.
01:23:59.000 It's this weird sort of thing where you're not allowed to even communicate or be in contact with someone who is...
01:24:08.000 And it's very childlike, this perspective.
01:24:12.000 And it's very binary.
01:24:14.000 You can't be my friend if you're Susie's friend.
01:24:15.000 Exactly.
01:24:15.000 Yes, it's so fucking stupid.
01:24:17.000 Yes, I would agree with that.
01:24:18.000 And it's a really common thing today that you're seeing people are trying to reinforce this idea and push it on other folks.
01:24:26.000 Well, I think one thing, you know, like I think that – Like we were talking about, I think that exposing ourselves to a breadth of ideas, to people we disagree with, I think – and creating an environment in which people can express themselves is good.
01:24:46.000 You're not going to get any arguments from me on that point.
01:24:49.000 No, and I just think it's better for everybody, like we were talking about before, when you meet someone who can give you a lesson and express something in a way that makes you reconsider your own ideas that you hold sacred.
01:25:03.000 I mean, I'll give you an example.
01:25:06.000 When I met my wife 30 years ago, I wasn't pro-death penalty, but I would say I was neutral on the death penalty.
01:25:14.000 I would be like, you know, Ted Bundy, the state can put him to death.
01:25:19.000 And I had all the kind of conventional reasons or I didn't really care.
01:25:22.000 He's a vile person.
01:25:23.000 He killed all these people.
01:25:24.000 He tortured them.
01:25:26.000 If the families will get any relief, whatever, that's fine.
01:25:30.000 I had some concerns, because I was a statistician, about conviction of the innocent, and I support the Innocence Project, and I am very concerned with police brutality.
01:25:39.000 I have for years been arguing that the racialization of police brutality is vile and abhorrent and must be firmly resisted.
01:25:48.000 I think that the prosecutorial misconduct, the way prosecutors lie and put people in prison, there have been many, many cases of people on death row who are innocent.
01:26:00.000 That should offend our conscience.
01:26:02.000 So even back then, I had some concerns about the death penalty because I – Because I recognize that we can't be perfect.
01:26:10.000 We're going to convict some innocent people and also let some guilty people go free.
01:26:14.000 That's not as bad as putting to death the innocent, but they're both bad.
01:26:18.000 So I had that concern about the death penalty, but otherwise I was like, it's okay.
01:26:23.000 My opinions have totally changed.
01:26:25.000 I'm completely opposed to the death penalty now for many reasons, not just the statistical reason, but also I think it's immoral.
01:26:31.000 I don't think the state should put to death.
01:26:32.000 I think we can deprive you of liberty.
01:26:34.000 I think we can make sure you're not a threat to society.
01:26:37.000 We can lock you up for the rest of your life.
01:26:39.000 But I think the state should not be taking people's lives in that way.
01:26:44.000 There's something extraordinarily strange about locking someone up.
01:26:48.000 It's very strange.
01:26:49.000 Well, we have a carceral state.
01:26:51.000 I mean, you know, we lock up a higher...
01:26:53.000 Our fraction of people incarcerated, I think, is the same as Stalinist Russia.
01:26:57.000 And we have very long prison sentences, which are nuts.
01:27:00.000 You don't need them for deterrence.
01:27:02.000 Especially for nonviolent drug offenses.
01:27:04.000 Especially – all nonviolent offenses should have much shorter...
01:27:07.000 We should have more...
01:27:09.000 We should have higher certainty of punishment.
01:27:11.000 A higher fraction of people who have actually committed a crime should be punished.
01:27:14.000 But I think we could cut in half or less...
01:27:17.000 The duration of the sentences.
01:27:18.000 I think you'll be able to deter criminals from doing things with a three-month sentence if they are very confident that they will be convicted if they're caught.
01:27:27.000 Whereas now we have a system where most are not convicted, like this Jussie Smollett thing, which is just ridiculous in the news.
01:27:34.000 And only a tiny fraction are convicted.
01:27:39.000 But they're given huge long sentences.
01:27:41.000 It's like they're paying the sentence for everyone that didn't – it doesn't make any sense.
01:27:44.000 And it's expensive.
01:27:45.000 It ties up our prison system.
01:27:47.000 Actually, can I tell you another story?
01:27:50.000 So there was a situation a few years ago when there was a very famous – Yeah.
01:28:16.000 And he told the story actually at Yale to students about how he had just come back from a summit – President Obama was still president – where he was trying to help the students to see that you can find common ground with your political opponents and that you need to listen to them and talk to them in order to find that ground.
01:28:36.000 And so he told the following story.
01:28:38.000 He said – I just came back from Camp David where there was a meeting about how to reduce incarceration in our society.
01:28:46.000 And he said the Koch brothers were there and the students all hissed and Newt Gingrich was there and the students all hissed and a bunch of liberal people were there and the students were really happy about that.
01:28:56.000 And then they said, well, why did you go?
01:28:57.000 How could you associate yourself with those evil people?
01:29:00.000 And he said, look, he said, the conservatives want to reduce incarceration because it's expensive.
01:29:06.000 The liberals want to reduce incarceration because it's unjust.
01:29:10.000 And the libertarians want to reduce incarceration because the state shouldn't be depriving people of liberty.
01:29:15.000 And I can find common ground with these people and reduce incarceration.
01:29:19.000 Why would I not talk to them?
01:29:20.000 And the students didn't seem to understand that.
01:29:23.000 They were like, they couldn't get it.
01:29:26.000 That's why they shouldn't be able to vote.
01:29:27.000 Yeah, it should be 30!
01:29:30.000 I think it should be 25. I mean, and so, I don't know how we got onto this, you know, like talking to you is so much fun because it's like we're all over the place, but how did I come up with this example?
01:29:41.000 We were talking about talking to political enemies, was it, or something else?
01:29:44.000 Yes.
01:29:44.000 We're talking about people telling you that you shouldn't associate with people that have varying opinions.
01:29:49.000 Yes, yes.
01:29:50.000 And, you know – oh, and no, we were talking about incarceration and prison sentences and so forth.
01:29:54.000 So we have a horrible problem in our society with incarceration.
01:29:58.000 A larger fraction of our populace is incarcerated than in almost any other country.
01:30:01.000 We deprive – after you've paid your debt to society, we often have these – we deprive you of your right to vote, which I think is wrong.
01:30:08.000 You've paid your debt to society.
01:30:09.000 You should be able to reenter society.
01:30:11.000 That's the point.
01:30:11.000 You're paying taxes.
01:30:12.000 You're a part of our community.
01:30:13.000 Yes, exactly.
01:30:14.000 You were in prison for 10 years.
01:30:15.000 That's enough.
01:30:16.000 Now we want you to feel a part of society.
01:30:19.000 We want to welcome you back if we have that vision of justice.
01:30:22.000 Well, how about the registered sex offender?
01:30:24.000 That's a serious problem, especially for crimes, these crazy cases which offend my conscience.
01:30:30.000 Well, I know a guy who ended up a registered sex offender because he urinated outside.
01:30:37.000 Yeah, that's nuts.
01:30:37.000 You get caught in the South, urinating outside.
01:30:40.000 Yeah, that's nuts.
01:30:40.000 That's just prosecutorial abuse.
01:30:42.000 Or, you know, you have these Romeo and Juliet laws, which are in many states now, thank God.
01:30:47.000 Alas, they are not in every state.
01:30:50.000 You know, you have a 16-year-old boy and a 14-year-old girl.
01:30:53.000 There have to be exceptions for that kind of sexual practice.
01:30:56.000 You know, they're exchanging sexually explicit images.
01:31:00.000 They should not be considered sex offenders for the rest of their lives.
01:31:03.000 That's nuts.
01:31:04.000 So...
01:31:05.000 So, yes, so all of those things.
01:31:07.000 But the problem is not only do we have a huge fraction of people in prison, we have extremely long prison sentences compared to many European countries for the same crime.
01:31:16.000 And it's costly, it's unjust, it's ineffective.
01:31:19.000 I think we should change the policies on this, and maybe we will.
01:31:24.000 There's also the idea of reforming them.
01:31:29.000 They're not using all the tools at their disposal.
01:31:32.000 They're not really making a good attempt at it.
01:31:34.000 And I just don't think it does anything other than make their life hell for a short period of time, which we're hoping, we hope, deters them from doing future crime.
01:31:43.000 Well, they're different.
01:31:44.000 There's justice, there's deterrence, there's safety, right?
01:31:47.000 Like, so violent criminals that we put in jail, we need to do that.
01:31:51.000 I mean, I'm not interested in being killed by somebody who, you know, killed someone else.
01:31:55.000 They should have been in jail for a while.
01:31:57.000 20 years, some amount of time.
01:31:59.000 For murder?
01:32:00.000 Yeah, for murder.
01:32:01.000 You think 20 years is enough?
01:32:04.000 Well, European standards are about 20 years, actually, and they're different things.
01:32:12.000 Like, if you want to deprive them, if your vision is they're being punished for the killing of a life, therefore they've surrendered their life, it's sort of eye for an eye kind of justice.
01:32:20.000 They would spend the rest of their lives in jail.
01:32:23.000 And we can debate whether that's reasonable or not.
01:32:25.000 If you want to provide a public safety reason, people often age out of their violence.
01:32:30.000 So a lot of men typically – we're talking mostly about men who do these things – by the time they're in their 40s or 50s, they're much less violent.
01:32:38.000 Testosterone declines.
01:32:39.000 They get older and wiser.
01:32:41.000 They're not interested in criminal – in that kind of criminal behavior.
01:32:44.000 Many of them are not.
01:32:45.000 So that suggests you don't need life sentences for murder.
01:32:49.000 And I think it also depends.
01:32:50.000 And we have gradations of murder.
01:32:52.000 You know, we have like the impulsive stuff – intent matters, planfulness matters, depravity matters.
01:32:57.000 All of these things are factors.
01:32:59.000 And I don't think we should have a one-size-fits-all incarceration for murder.
01:33:03.000 Yes, that's my opinion.
01:33:04.000 What do you think?
01:33:05.000 I think it depends entirely on the circumstances.
01:33:08.000 If two men are engaged in some sort of a dispute and one winds up killing the other one, that's a big difference between that and someone breaking into your house and killing your daughter.
01:33:18.000 Yes, correct.
01:33:19.000 And I also think even in that, like I really am opposed to these stand your ground laws.
01:33:24.000 I think that if you have the opportunity to avoid conflict, I would prefer, as a state, to require that you walk away, even if it makes you feel embarrassed, rather than give you the right to kill someone for offending you.
01:33:41.000 And those videos of the guys that shot the guy on his knees in the parking lot in the – I forgot what state it was, like not long ago, a year or two ago, they got into an altercation in the parking lot.
01:33:52.000 Like if I have words with someone in a parking lot – Shot a guy on his knees?
01:33:55.000 Yes.
01:33:55.000 There was no threat to him.
01:33:57.000 And he was not prosecuted on the Stand Your Ground argument, which is nuts.
01:34:00.000 Why was the guy on his knees?
01:34:02.000 I forgot.
01:34:02.000 He said, don't shoot me or something.
01:34:04.000 Oh, Jesus.
01:34:05.000 So it was crazy.
01:34:05.000 And he didn't get prosecuted for that?
01:34:07.000 I don't think so.
01:34:08.000 We can look up the facts.
01:34:09.000 There were several cases.
01:34:10.000 There were several cases like this.
01:34:12.000 But, you know, like I remember when I was doing Shotokan Karate, My sensei, Kazumi Tabata, this was years ago, 30 years ago now, and he told us the following story.
01:34:23.000 He said there was a sensei in this village in Japan and the students were coming to the dojo and there was the best student and then all the other students.
01:34:34.000 And they were walking through the village and they approached a horse that was on the street from the rear.
01:34:41.000 And it startled the horse and as the horse reared up and kicked its leg, the best student instantly did a kind of avoidance, kind of twisted his body and avoided the kick and the horse's leg went right in front of him and all the other students were amazed at his ability.
01:34:59.000 And they get to the dojo and they tell the sensei, this is my sensei telling me this story, telling all of us this story.
01:35:05.000 And those students get to the dojo and they tell the sensei the story, marveling at the ability of this master student to deftly avoid the strike.
01:35:14.000 And the sensei is very angry and they don't understand why.
01:35:18.000 Why is he so angry?
01:35:19.000 He said if he were a really good student of mine, he would have walked on the other side of the street.
01:35:24.000 He would have avoided the horse altogether.
01:35:27.000 So, the real wisdom is the avoidance of conflict in the first place.
01:35:31.000 There's no reason to seek out conflict.
01:35:34.000 And so, on these stand your ground laws, you know, if you can, you just avoid the conflict – someone swore at you or called you an asshole or was an unreasonable jerk.
01:35:45.000 That doesn't give you the right to kill them.
01:35:48.000 So, anyway, I don't know how we got onto this as well.
01:35:52.000 Death penalty?
01:35:54.000 Oh, yeah, for crimes for murder.
01:35:56.000 Exactly, exactly.
01:35:57.000 So, you know, there are different gradations, yeah.
01:35:59.000 Yeah, I just don't know how much of a deterrent it is locking people up.
01:36:05.000 I'm not sure.
01:36:06.000 I'm not really sure if that actually stops people from doing things.
01:36:10.000 I think it stops some people.
01:36:11.000 I think there has been academic research on this.
01:36:13.000 I just don't think there's any real rehabilitation other than personal choice.
01:36:19.000 I mean, I think the real rehabilitation comes from someone making a personal choice to never be that person again.
01:36:25.000 Be that way again, yes.
01:36:26.000 For most of them, you're being locked up with a bunch of hardened criminals, and that's your community.
01:36:33.000 But you're not suggesting we have a society in which when you commit violent acts, we do nothing.
01:36:37.000 No, I'm not.
01:36:38.000 No, I'm not.
01:36:39.000 No, I'm suggesting...
01:36:40.000 You're struggling with this is what you're saying.
01:36:41.000 Yeah, the concept of nuance.
01:36:43.000 Yes.
01:36:43.000 This is applied here better than anywhere else, I think.
01:36:47.000 I got the impression looking at your face a moment ago when I told my sort of sweet sensei Japanese karate story that you didn't agree necessarily.
01:36:54.000 No, that's a very wise way of looking at it.
01:36:56.000 Yeah, don't be near a fucking horse that wants to kick you.
01:36:59.000 Yeah.
01:36:59.000 Very smart.
01:37:00.000 Yeah, get out of there.
01:37:01.000 I'm a big believer in avoiding conflict.
01:37:03.000 Yeah.
01:37:04.000 I'm the first guy to go, we should get out of here.
01:37:06.000 I'm talking about physical conflict, not intellectual conflict, right?
01:37:08.000 Oh, yeah, exactly.
01:37:09.000 Physical conflict.
01:37:09.000 Words and violence are different, right?
01:37:12.000 Yes.
01:37:12.000 Extremely, extremely different.
01:37:14.000 Yeah, I mean, intellectual conflict, I think, is actually important.
01:37:17.000 Yeah.
01:37:17.000 You learn from it.
01:37:19.000 Very rarely do you learn too much.
01:37:21.000 You learn, don't do that again.
01:37:22.000 That's what you learn from physical conflicts.
01:37:24.000 Don't do that again.
01:37:26.000 It's just, you know, what happens in nature with animals happens with people if you let them get to that level.
01:37:33.000 You scratch down to the, you know, remove that thin film of society and let people beat each other with rocks.
01:37:40.000 Yes, we are violent, but I keep coming back to what I argue in Blueprint.
01:37:43.000 You know, we have those tendencies, but equally we have tendencies to be kind and friendly, and we have to create the environment to foster those.
01:37:52.000 There's a sense in which – and I talk about this in the book – as we create those environments, we actually change ourselves as a species.
01:38:18.000 Making those of us that are born with certain abilities better off, which then leads to those environments being created even more.
01:38:24.000 Let me give you an example of that.
01:38:26.000 The most famous example of this is something known as lactase persistence.
01:38:30.000 So many people – about half the world's adults – can drink milk.
01:38:34.000 The other half cannot.
01:38:36.000 They're lactose intolerant.
01:38:38.000 Well, why can you drink milk as an adult?
01:38:41.000 Have you ever thought about that?
01:39:00.000 There was therefore no reason for any adult to be able to digest lactose, which is the principal sugar in milk, because there was no lactose in your diet.
01:39:08.000 You didn't encounter milk.
01:39:09.000 So human beings were able to digest lactose when they were babies.
01:39:13.000 They lost that capacity, all human beings.
01:39:15.000 When they got to about two or three or four or five, when they weaned, they were no longer able to digest milk.
01:39:19.000 So the enzymes in their body were programmed, as it were, to only work when they were infants.
01:39:25.000 Well, about between three and nine thousand years ago, in multiple places in Africa and in Europe, human beings suddenly domesticate animals.
01:39:33.000 We domesticate milk-producing animals like cattle and sheep and goats and camels.
01:39:38.000 And now, all of a sudden, there's a supply of milk around us.
01:39:42.000 Because of our cultural innovation, because of the thing we invented, we created the domestic breeds, now we have milk.
01:39:50.000 Now, therefore, those among us who were mutants, who were born with the ability to have our lactase, the enzyme that digests lactose, persist into adulthood – this is known as lactase persistence –
01:40:24.000 It turns out that this has happened several times.
01:40:26.000 This has been well documented.
01:40:27.000 The genetics of this has all been worked out several times in the last 3,000 to 9,000 years.
01:40:32.000 Because of a human cultural product, we have evolved to be slightly different genetically.
01:40:38.000 And it doesn't stop with cows.
01:40:42.000 I think that when we invent cities, over 5,000 years ago – so we invent agriculture about 10,000 years ago.
01:40:50.000 It's debated exactly when we invent cities.
01:40:52.000 But between 5,000 and 10,000 years ago, we start having fixed settlements.
01:40:55.000 Earlier, you and I were talking about population density and having to live with other people, which is not our ancestral state – not packed in with that many other people.
01:41:01.000 We always lived socially.
01:41:03.000 I think that as we invent cities, people with different kinds of brains are better able to survive in cities.
01:41:10.000 So now that we've invented cities, we're advantaging people with certain kinds of brains.
01:41:15.000 And therefore, I think in 1,000 or 2,000 or 5,000 years, just like the milk example, there'll be different people as a result of something we humans manufactured that we made.
01:41:25.000 And I could keep giving you examples of this.
01:41:27.000 There's a – in the book, I have another example of a –
01:41:53.000 And they do it with nothing except weights and wooden goggles.
01:41:57.000 They dive down into the seabed and forage, and they hunt underwater with spears.
01:42:03.000 Okay?
01:42:04.000 They hunt underwater with spears.
01:42:06.000 It's mind-boggling.
01:42:07.000 Wow.
01:42:08.000 But they...
01:42:10.000 Have evolved to have different spleens and different oxygen metabolism than you and I. So those among them that could survive the dives fed their families, made more babies, and now we think this happened 2,000 years ago.
01:42:23.000 They're different.
01:42:24.000 The ones that couldn't died.
01:42:27.000 So their invention of a seafaring way of life, their invention of a way of living at sea, the boat technology, the spearfishing technology, the The invention of those technologies creates an environment, a cultural environment around them,
01:42:43.000 which modifies natural selection and changes the kind of genes that those people have.
01:42:49.000 These are discussed in Blueprint, and there are many examples of this.
01:42:52.000 I want to see an image of these goggles.
01:42:55.000 Yeah, if you Google their little slitted goggles, if you Google Sea Nomad goggle, you may come up with it.
01:43:04.000 And let me give you- What are they using for a lens?
01:43:06.000 There's no lens.
01:43:07.000 What?
01:43:08.000 Yeah, they're little slits.
01:43:09.000 So what's the point?
01:43:11.000 I didn't look at the technology at that level.
01:43:14.000 But if there's no lens, then it doesn't protect your eyes.
01:43:17.000 I think it may reduce glare underwater by having you look through slits.
01:43:23.000 It's got something on it.
01:43:26.000 I can see it.
01:43:27.000 Well, no, because they didn't have glass.
01:43:29.000 The one this kid's holding up has got something on it.
01:43:32.000 Let me see.
01:43:34.000 That's an actual modern scuba.
01:43:38.000 He's holding up a wooden diving mask.
01:43:40.000 Yeah, he may have made it from wood, but the ancient one...
01:43:42.000 Yeah, this is the Bajau.
01:43:45.000 So if you look at...
01:43:46.000 So look at...
01:43:47.000 Can you find...
01:43:48.000 I'm looking, but...
01:43:49.000 ...goggle.
01:43:49.000 I mean, it'll be hard to find.
01:43:50.000 Maybe no one's put it on.
01:43:51.000 And now, of course, they have modern technology, so they can...
01:43:54.000 Right.
01:43:54.000 They can...
01:43:55.000 Those are totally modern plastic.
01:43:57.000 Yes.
01:43:58.000 But they used to have these wooden goggles.
01:44:00.000 Anyway, your point is that they adopted, or adapted, rather, to this new lifestyle, sort of like...
01:44:07.000 Genetically adapted.
01:44:08.000 Yes, like the Inuit have developed this ability to not get frostbite and to not get numb fingers in cold weather.
01:44:16.000 I did not know that example, but that would be an example of that.
01:44:18.000 Yeah, this is an example from, I believe they were talking about it from Alaska, that they did genetic testing on these people and they have different circulation.
01:44:27.000 Yes, yes, yes.
01:44:29.000 Yes, that would be another example of just exactly that example.
01:44:31.000 I didn't put that one in the book, but yes.
01:44:33.000 Yeah, we're incredibly flexible, right?
01:44:35.000 Yes.
01:44:35.000 Well, we have two kinds of flexibility.
01:44:37.000 So think about like when we settled the Tibetan Plateau, when human beings settled the Tibetan Plateau, there were different challenges up there.
01:44:46.000 It's cold up there and there's not a lot of oxygen up there.
01:44:49.000 Now, we could – genetic evolution is not fast enough.
01:44:52.000 We didn't become furry.
01:44:53.000 You know, like one way to cope with the cold is to become furry again.
01:44:57.000 We didn't do that.
01:44:58.000 Why?
01:44:59.000 Because we had clothing.
01:45:00.000 We had cultural means of coping with this situation.
01:45:04.000 So for the cold, to cope with the cold, we used culture.
01:45:26.000 So the people who live in the Himalayas, they actually have different kinds of hemoglobin compared to you and me, better able to extract oxygen from the environment.
01:45:35.000 So there are two different challenges that are coped with in different ways.
01:45:39.000 One is coped with culturally, by cultural evolution.
01:45:41.000 One is coped with genetically, which is much slower, with genetic evolution.
01:45:45.000 And it's the cultural evolution, it's the cultural traits that natural – so natural selection equips us with a capacity to accumulate knowledge and to teach each other stuff.
01:45:56.000 And given that rare ability, as we discussed earlier, we're able to spread out across the planet and live in all these dissimilar environments.
01:46:04.000 We use our cultural ability to dominate the planet, basically.
01:46:08.000 Now, when you were creating this, were you actually thinking of it as a blueprint that someone would follow?
01:46:17.000 Yeah.
01:46:19.000 Yes and no.
01:46:21.000 I wasn't thinking of it that way.
01:47:22.000 But having finished the book, I do think that there are – I talk a little bit in the book about the implications of these ideas for artificial intelligence.
01:46:32.000 Like as we create robots, even as we create sex robots or autonomous vehicles or forms of bots online, how should those bots be programmed so as not to injure our society?
01:46:44.000 So there are some policy implications I discuss in the book.
01:46:48.000 But I wasn't thinking of this as a prescription, like this is the way to live a good life.
01:46:54.000 But partly because, as I argue in the book, we don't need to affirmatively seek a good life.
01:47:01.000 We have been endowed by natural selection with the capacity to make a good life.
01:47:07.000 Full of these qualities.
01:47:08.000 So this blueprint is – I want to use the word God-given.
01:47:11.000 It doesn't come from God.
01:47:12.000 But it's God-given.
01:47:13.000 It comes from somewhere else.
01:47:15.000 It comes from natural selection that we do this.
01:47:20.000 How much time have you put into artificial intelligence?
01:47:23.000 A lot.
01:47:23.000 We do a lot of work in my lab on AI. What about sex robots?
01:47:27.000 What rules should they give for sex robots?
01:47:30.000 How much could that damage interpersonal relationships?
01:47:33.000 Yes.
01:47:33.000 That's a great question.
01:47:35.000 That's exactly the right question in my view.
01:47:37.000 So our concern with sex robots...
01:47:40.000 From a liberty point of view, should not in the slightest be whether you enjoy a sex robot.
01:47:45.000 It's your business.
01:47:47.000 You buy a robot, do what you want.
01:47:49.000 I really don't – I see – I would be hard-pressed to object.
01:47:54.000 The problem is with – let's back up to something less provocative.
01:47:57.000 Let's come back to sex.
01:47:57.000 Let's pick a simpler example first.
01:47:59.000 Let's talk about your children talking to Alexa.
01:48:02.000 So the person who designs Alexa wants to make your child's experience easy and pleasant.
01:48:08.000 And as part of the programming of Alexa, because they want to make Alexa the obedient servant of your child, it doesn't require your child to say, please, Alexa, would you play the music for me?
01:48:18.000 Your child can be as rude as she wants to Alexa, and Alexa will do what she wants.
01:48:22.000 What you should be concerned about, however, is not your child's interaction with Alexa.
01:48:26.000 What you should be concerned about is what your child is learning from interacting with Alexa that then she takes to the playground.
01:48:32.000 So now she's rude to other children.
01:48:35.000 So Alexa is corroding our social fabric.
01:48:37.000 Alexa, in this example, is making children rude to each other.
01:48:41.000 So our concern is not so much, do we make, you know, like Asimov's laws of robotics, it's not that we want to program the robot so that they don't harm you.
01:48:53.000 It's true, the first law, we don't want the robot to, through an act of commission or omission, harm or allow a human to come to be harmed.
01:49:00.000 It's that we're concerned about how the robot, in interacting with you, might cause you to harm others.
01:49:09.000 So, in the Alexa example, we might want to regulate the programming of devices that speak to children.
01:49:18.000 Not because we want to deprive your daughter of the right to speak how she wants, but because we recognize that that robot is going to cause your daughter to be rude to other people.
01:49:29.000 Is it really?
01:49:30.000 Do you really think?
01:49:31.000 Yes, the Alexa example.
01:49:32.000 Alexa, what's the weather?
01:49:33.000 That that would make your child?
01:49:35.000 Slowly but surely, I think it will contribute.
01:49:37.000 So it's an example.
01:49:39.000 It's not like – I'm not arguing that Alexa should become – I think it's so novel to kids that they know it's not a person.
01:49:45.000 I don't think it really- All right, but we're using these examples to build the thing.
01:49:48.000 So let's talk about the sex robots now.
01:49:50.000 So some people believe that actually the emergence of sex robots, which will surely appear in the next 10 or 20 years, will be a fantastic boon.
01:50:02.000 They think that- People will be able to experiment.
01:50:06.000 You'll be able to experiment with same-sex relationships, for example, group sex.
01:50:11.000 You might learn to be a better lover so you could practice with the robots and therefore you'd be more experienced when you were having sex with a real human.
01:50:20.000 So you can't get venereal diseases from a sex robot.
01:50:24.000 You can't hurt their feelings.
01:50:26.000 So people think that the argument based on ethical grounds is that this would be terrific.
01:50:50.000 And they furthermore think that it would result in one having a kind of anonymous or impersonal interaction with humans subsequently, that you'll be entrained to, let's say, want an obedient partner, for example.
01:51:04.000 Yeah.
01:51:05.000 I don't have a stand on this.
01:51:06.000 I don't know which way it's going out.
01:51:07.000 And in a way, I don't have to make a stand on it because what I'm interested in recognizing is that when we talk about allowing people to have sex with sex robots, not allowing that it's going to happen, the focus of our concern should be not what is your experience in your bedroom when you have sex with a sex robot.
01:51:23.000 Our concern as a state, like my interest – Right.
01:51:30.000 Right.
01:51:43.000 But if you start polluting the environment, you're harming me.
01:51:46.000 So now I have a reason for intervening in your activities on your land.
01:51:49.000 You can't pollute your own land if that pollution runs off onto my land.
01:51:54.000 And so the similar argument can be made.
01:51:56.000 Or look at autonomous vehicles.
01:51:58.000 Here's an example.
01:52:00.000 Right now we have all roads, almost all roads have just human drivers.
01:52:05.000 In 20 or 30 years, almost all roads will probably have only non-human drivers.
01:52:10.000 Machines will drive.
01:52:10.000 Those autonomous vehicles probably can be yoked together.
01:52:15.000 They can communicate with each other so that you'll have like trains of cars moving in synchrony.
01:52:21.000 Like each of them will be communicating with the other nearby cars and you'll have laminar flow where all these vehicles are smoothly moving and joining the highway and leaving the highway and communicating on a citywide scale, slowing traffic down miles away because they anticipate with AI that there'll be a jam here if they don't do that.
01:52:37.000 And I think that'll be actually great.
01:52:39.000 I'm actually looking forward to autonomy.
01:52:41.000 I mean, I still like to take my car to a speedway and, you know, drive it myself with a stick, which I like.
01:52:46.000 But, you know, But in between, we're going to have a world of what I call hybrid systems of human-driven cars and autonomous vehicles coexisting on a plane, on an even plane.
01:53:00.000 And we need to be worried about that because these autonomous vehicles, when we interact with them, are going to change how we interact with each other.
01:53:08.000 For example, do we program the autonomous vehicle to drive at a constant steady speed?
01:53:14.000 If you're the designer of the car, you might say, gee, I don't want this car to crash.
01:53:19.000 I want the car to drive in a very predictable fashion, and that's what's best for the occupants of the car.
01:53:24.000 That's what's going to allow me to sell more vehicles.
01:53:27.000 But it may be the case that actually when people are in contact with such a vehicle, they get lulled into a false sense of security.
01:53:35.000 Oh, that vehicle never does anything new.
01:53:37.000 I don't need to pay so much attention to the car in front of me.
01:53:39.000 I just drive at a steady clip.
01:53:42.000 And then they veer off and they go to a part of the highway where they're just human drivers.
01:53:47.000 And now having been lulled into a false sense of security, they cause more collisions.
01:53:51.000 They're not paying attention.
01:54:16.000 Once again, the lesson here is that it's not just about the one-on-one interaction between the robotic artificial intelligence and the human being.
01:54:24.000 It's about how the robots affect us.
01:54:26.000 And in my lab, we do many experiments in social systems where we take a group of people and we drop in a bot online, or in the laboratory we have a physical robot, and we watch how the presence of the robot affects the group. It doesn't just modify how the humans interact with the robot,
01:54:43.000 but how the humans interact with each other.
01:54:45.000 So if we put a robot right there, looking at us with its third eye, would we, you know, would it change how you and I talk to each other, make us different?
01:54:55.000 That's the experiments we're doing.
01:55:00.000 That's going to be a problem.
01:55:01.000 I mean, we see the difference between humans that have porn addictions.
01:55:07.000 Yeah, that's a good example.
01:55:08.000 Porn addictions, when people develop this very impersonal way of communicating with people, and they think about sex and the objectification of the opposite sex in a very different way.
01:55:21.000 It flavors the way you think of- It flavors your expectations, yes.
01:55:25.000 Yes, and it makes it difficult.
01:55:26.000 It can make it difficult for you to have normal sexual relationships if your expectations come to be guided by porn.
01:55:36.000 And that is going to be radically magnified by some sort of artificial life form that you create that's indistinguishable.
01:55:45.000 If you can have an indistinguishable sex partner that is some incredibly beautiful woman that is a robot, and then you...
01:55:54.000 Or man.
01:55:55.000 Many women would be quite happy to change their spouses for robots.
01:55:59.000 I wonder if women are going to be as into it as men.
01:56:02.000 Because I think women desire more emotional intimacy on a scale than men do.
01:56:12.000 I think the jury's still out on what the relative balance between men and women...
01:56:19.000 We might be surprised that men will be replaced with male sex bots.
01:56:24.000 Especially given societal expectations and women conform to those and...
01:56:28.000 And also given how a pain in the ass a lot of men can be.
01:56:30.000 Sure.
01:56:31.000 So it could go both ways.
01:56:32.000 I'm not prepared to make a prediction who's going to be better off in the gender debate with the emergence of sex robots.
01:56:38.000 It may be the way you suggest.
01:56:39.000 I don't know.
01:56:40.000 Well, we're also in this weird transition genetically where they're doing genetic experiments on humans and with the advent of CRISPR and emerging technologies.
01:56:49.000 I talk about that in the book, too.
01:56:51.000 It's entirely possible that there's not going to be any frumpy bodies anymore.
01:56:55.000 That's hundreds of years away.
01:56:56.000 Is it?
01:56:57.000 Yes, I think so.
01:56:58.000 I wonder.
01:56:59.000 I mean, I don't know if it is.
01:57:00.000 I think if they start cracking them out in China and they start giving birth to eight-foot-tall supermen with 12-inch dicks, we're going to have a real issue.
01:57:08.000 Yes.
01:57:10.000 Yes, we will.
01:57:11.000 Yes, that's the least of it.
01:57:12.000 Yes.
01:57:16.000 But I mean, it's really entirely possible that in the future they're going to have that, that we're going to have perfect humans.
01:57:22.000 Yes, I think that is likely.
01:57:23.000 The debate is how far in the future.
01:57:25.000 So I don't think we're going to start by using these technologies to cure monogenic diseases.
01:57:31.000 So, you know, like thalassemia, for example.
01:57:34.000 So diseases like certain immune deficiencies, diseases where a single gene is defective, those will be the initial targets.
01:57:41.000 But once we start with that, eventually I think there will be people who will want to genetically engineer other people, their offspring, for example, and modify them in the ways that you suggest.
01:57:51.000 Maybe not 12-inch dicks, but maybe ability to run fast or something else.
01:57:55.000 Far smarter.
01:57:56.000 Isn't that one of the side effects they showed with the genetic manipulation of these Chinese babies to eliminate HIV? That they made them smarter?
01:58:05.000 No, I don't know if they made them smarter.
01:58:06.000 What's clear from the most recent findings I've seen from that case...
01:58:10.000 Is that unsurprisingly, as anyone could predict, the technology is not good enough to restrict the mutations to one particular region of the genome.
01:58:20.000 So there were other changes in the genome in these children that occurred elsewhere rather than the targeted region, which was to increase their immunity to HIV. And we don't know what those are.
01:58:31.000 Those could kill those kids quickly.
01:58:33.000 We could make them better in some ways.
01:58:35.000 We have no way of knowing yet.
01:58:36.000 But I think the conclusion was that it increased their intelligence.
01:58:39.000 I have not seen those results, and I think it would be premature.
01:58:42.000 I find that.
01:58:43.000 It would be premature to come to that conclusion.
01:58:45.000 The problem is also sensationalist clickbait; that's what you want to click.
01:58:51.000 Not just that they did the HIV editing, but that they made them smarter.
01:58:54.000 That's going to get like 40% more clicks.
01:58:55.000 Yes.
01:58:57.000 Versus, you know.
01:58:57.000 Yeah, 40%.
01:58:59.000 I mean, that's just the nature of humans, right?
01:59:04.000 Just to be clear, I talk about the CRISPR example in Blueprint.
01:59:08.000 I actually talk about how these technologies – again, my lens on it is how these technologies are going to change how we interact with each other.
01:59:16.000 And it goes back to the example we were talking about at the beginning when we invented cities – That was a technology that changed how we interacted with each other.
01:59:23.000 So human beings for a very long time have been inventing – when we invented weapons, that was a technology that changed how we interact with each other.
01:59:31.000 So we have previously done this kind of thing.
01:59:33.000 We've invented a technology that changed how we interact with each other and I'm very interested in the – Yeah, I'm incredibly interested in this because I love to study history, and I love to study how crazy the world was 4,000,
01:59:49.000 5,000 years ago, 1,000 years ago, and what it's going to be like in the future.
01:59:54.000 I just think our understanding of the consequences of our actions are so different than anybody has ever had before.
01:59:59.000 We have just such a broader relationship.
02:00:03.000 First of all, we have examples from all over the world now that we can study very closely, which I don't think really was available to that many people up until fairly recently.
02:00:13.000 You mean, I'm sorry, you're saying the examples are more numerous or our capacity to discern them is higher?
02:00:17.000 Our capacity to discern them and just our in-depth understanding of these various cultures all over the world.
02:00:23.000 Like what you've been telling me today about the divers and others.
02:00:28.000 We just have so much more data and so much more of an understanding than ever before.
02:00:33.000 Yes.
02:00:33.000 I love the idea that we are – I mean, I believe that this is probably the best time ever to be alive, and I think that it's probably – I think that's true.
02:00:43.000 I think there's certainly a lot of terrible things that are wrong in the world today.
02:00:46.000 Also true.
02:00:47.000 But I think that there's less of that – And more good than there's ever been before.
02:00:54.000 No, I think that's right.
02:00:55.000 But one of the arguments that I make is, this is a kind of Steven Pinker argument that you're outlining, which is, you know, people are living longer than they ever have on the whole planet, fewer people are in starvation, we have less violence, I mean, every indicator of human well-being is up.
02:01:12.000 And it's partly due, or largely due, in the last thousand years, to the emergence of the Enlightenment and the philosophy and the science that emerged about 300 years ago, and 200 and some odd years ago, culminating in the present and continuing.
02:01:29.000 So I think this is not just a kind of so-called Whiggish view of history.
02:01:33.000 It's not just a progressive sort of fantasy.
02:01:36.000 I think it's the case that these philosophical and scientific moves that our species made in the last few hundred years have improved our well-being.
02:01:43.000 However, as we've been discussing today, it's not just historical forces that are tending towards making us better off.
02:01:52.000 A deeper and more ancient and more powerful force is also at work, which is natural selection.
02:01:58.000 It's evolutionary and not just historical forces that are relevant to our well-being.
02:02:02.000 And we don't just need to look to philosophers to find the path to a good life.
02:02:08.000 Natural selection has equipped us with these capacities for love and friendship and cooperation and teaching and all these good things we've been discussing that also tend to a good life.
02:02:15.000 So yes, I totally agree with you.
02:02:18.000 We're better off today than we've ever been.
02:02:20.000 On average, across the world.
02:02:22.000 However, it's not just that that's contributing to our well-being.
02:02:26.000 This natural selection is literally why we are in this state now, and we are hoping this trend will continue and we will be in this better place 50 years from now, 100 years from now.
02:02:38.000 Well, natural selection doesn't work over those timescales, so those are historical forces.
02:02:42.000 But the point is we are set up for success.
02:02:44.000 Yes.
02:02:45.000 We are equipped with these – you're given five fingers and an opposable thumb, which allows you to manipulate tools.
02:02:53.000 So natural selection has given you an opposable thumb.
02:02:56.000 Culture lets you use a computer.
02:02:58.000 Do you worry about the circumventing of this natural process by artificial intelligence?
02:03:03.000 That artificial intelligence is going to introduce some new, incredibly powerful factor into this whole chain of events.
02:03:11.000 That by having sex robots and sex or robot workers, things becoming automated.
02:03:21.000 I'm concerned.
02:03:23.000 I'm very concerned about how technology is going to affect our economy.
02:03:28.000 Again, we're not the first generation to face these concerns.
02:03:31.000 There were similar concerns with the industrial revolution, that workers were being put out of work when machines were invented.
02:03:37.000 Nevertheless, work persisted.
02:03:38.000 People still had jobs to do.
02:03:40.000 There was a disruption.
02:03:41.000 There's no doubt about it.
02:03:42.000 I think Google and the information revolution and these types of robotic automation are disruptive.
02:03:49.000 They're going to affect how we...
02:03:56.000 I thought you were alluding to, just to check if you were, to the debate, which I don't know the answer to, on whether AI will, you know, are we going to face like a Terminator-type existence where, you know, the machines rise up and kill us all?
02:04:09.000 Or not.
02:04:10.000 And, you know, very smart people are on both sides of that debate.
02:04:12.000 And I read them all and like, he's right.
02:04:16.000 And then I read the guy that has the opposite opinion.
02:04:18.000 I'm like, no, no, he's right.
02:04:20.000 And then it goes back and forth.
02:04:21.000 I don't know who's right.
02:04:22.000 It goes back to nuance, right?
02:04:24.000 Yes, it is nuance, but it's hard to know whether – and again, we're not talking over our lifetimes.
02:04:28.000 We're talking over hundreds of years.
02:04:30.000 You know, is there a time a thousand years from now when the human beings will say, what the hell were our ancestors doing inventing artificial intelligence?
02:04:36.000 They're wiping us out.
02:04:38.000 I don't know the answer to that question.
02:04:40.000 Well, I think there's an issue also with the concept of artificial.
02:04:44.000 Like, artificial life, artificial intelligence, I think it's going to be a life.
02:04:49.000 It's just going to be a life that we've created.
02:04:52.000 And I don't think it's artificial.
02:04:53.000 I just think it's a different kind of life.
02:04:55.000 I think that we're thinking of biologically based life, of sex...
02:05:01.000 Yes.
02:05:01.000 Well, some people...
02:05:03.000 Reproduction, in terms of the way we've always known it, as being the only way that life exists.
02:05:08.000 But if we can create something, and that something decides to do things, it decides to recreate...
02:05:13.000 Wipe us out and live on its own.
02:05:15.000 Yeah, it's a silicon-based life form.
02:05:17.000 Like, why not?
02:05:17.000 Why does life have to be something that only exists through the, you know, multiplication of cells?
02:05:23.000 Yes.
02:05:24.000 That's very charitable of you.
02:05:26.000 And...
02:05:27.000 People make that claim.
02:05:29.000 Some people think that those machines in the distant future will look back at us as like one stage of evolution that culminated in them.
02:05:38.000 I've always said that we are some sort of an electronic caterpillar that doesn't know that it's going to give birth to a butterfly.
02:05:45.000 We're making a cocoon and we don't even know what we're doing.
02:05:48.000 That's a great metaphor.
02:05:50.000 I have a hard time accepting that.
02:05:51.000 Because you're a person.
02:05:52.000 Yes.
02:05:53.000 It's against my interests.
02:05:54.000 But we're so flawed.
02:05:55.000 All these things we've outlined, all the problems with us, those will go away with artificial intelligence.
02:05:59.000 This is a deep philosophical question, Joe.
02:06:02.000 I think it's inevitable, and I think if the single-celled organisms were sitting around wondering what the future was going to be like: are we going to be replaced?
02:06:08.000 Will they make antibiotics that kill us?
02:06:10.000 Yes, they are going to make antibiotics that kill us!
02:06:13.000 I mean, we are so flawed.
02:06:15.000 We do pollute the ocean.
02:06:16.000 We do pull the fish out of it.
02:06:18.000 We do fuck up the air.
02:06:19.000 We do commit genocide.
02:06:20.000 There's all these things that are real.
02:06:22.000 But the artificial life won't have those problems because it won't be emotionally based.
02:06:26.000 It won't be biologically based.
02:06:28.000 It'll just exist.
02:06:31.000 That's a really good story.
02:06:33.000 We're so flawed.
02:06:34.000 Why not accept something so much better?
02:06:36.000 No, we're not.
02:06:36.000 I'm not going to grant we're flawed.
02:06:36.000 Oh, we're very flawed.
02:06:37.000 We are flawed, but like I said, we have a flawed beauty.
02:06:39.000 So how are you not going to grant it?
02:06:40.000 We are very flawed, though.
02:06:41.000 We are flawed.
02:06:42.000 I think it's beautiful, too, but I think vultures probably think they're beautiful, too.
02:06:45.000 That's why they breed with each other.
02:06:47.000 Well, they are beautiful, but the point is I think we have a flawed beauty.
02:06:50.000 I'm going to stick to my principles that we are, despite our flaws, worth it.
02:06:54.000 There is something wonderful about us, and I think that wonderful creative quality is the reason why we created artificial life in the first place.
02:07:02.000 It's like this lust for creation.
02:07:06.000 We've had that impetus, you know, if you look at a lot of the...
02:07:10.000 The art, whether it's the Egyptian, you know, the pyramids or other kinds of artistic expression, we seem to have had a desire to transcend death, you know, to make things that looked like us but weren't alive forever,
02:07:26.000 actually.
02:07:27.000 So, I mean, I think in that regard, I think you're quite right that it's not going to stop.
02:07:32.000 That tendency is not going to stop.
02:07:33.000 Now, your very, as I said, charitable, positive take on the claim, and your analogy to single-celled organisms, which are just, you know, but a fleeting, not a fleeting, they're still there, but a phase in our evolution, you know, is something I'm going to have to be thinking about because it's disturbing,
02:07:49.000 honestly.
02:07:50.000 Well, it's an objective perspective if I took myself out of the human race, which I really can't, but if I tried to fake it.
02:07:56.000 I would say, oh, I see what's going on here.
02:07:58.000 These dummies are buying iPhones and new MacBooks because they know that this is what's going to help the production of newer, more superior technology.
02:08:09.000 The more we consume... I think, in a lot of ways, our insane desire for materialism is fueling this.
02:08:19.000 And it could be an inherent property of the human species that it is designed to create this artificial life.
02:08:26.000 And that literally is what it's here for.
02:08:29.000 And much like an ant is creating an ant hill and doesn't exactly have some sort of a future plan for its kids and its 401k plan, that what we're doing is like this inherent property of being a human being.
02:08:41.000 Our curiosity, our wanderlust, our desire, all these things.
02:08:45.000 Yeah, all these things are built in because if you follow them far enough down the line, 100 years, 200 years, it inevitably leads to artificial life.
02:08:54.000 Yes.
02:08:55.000 I think that's possible.
02:08:59.000 And, of course, we're not going to be alive to test that idea.
02:09:02.000 Maybe we will.
02:09:04.000 Maybe with CRISPR and all this crazy shit that's coming down the line.
02:09:07.000 No, no.
02:09:07.000 Come on, Joe.
02:09:07.000 I don't think so?
02:09:08.000 No.
02:09:08.000 Nothing's going to happen.
02:09:09.000 The pace of innovation... people have always been saying, if you go back every decade, people were saying, it's just around the corner, just around the corner.
02:09:14.000 These things take forever.
02:09:16.000 They're very hard.
02:09:16.000 Biological systems are very hard to engineer.
02:09:19.000 Of course, the people who do that kind of work will often, I think a lot of them engage in snake oil.
02:09:25.000 They want to fund their research.
02:09:26.000 Sure, but I think it's entirely possible that there's a 20-year-old listening to this podcast right now.
02:09:30.000 Who will be 150. Yes, that's possible.
02:09:32.000 Maybe a lot more than that.
02:09:34.000 I think it's entirely possible that 30-year-olds today could be 150. You give another 10 years of research, maybe 10 years more, and I think it's entirely possible.
02:09:45.000 Well, there's a famous bet about this, you may know, the Olshansky-Alstead bet.
02:09:49.000 Yeah, I heard about that.
02:09:50.000 Yes, where they bet that about 10 or 20 years ago, they bet that there was a person born that year.
02:09:56.000 Who would live to be 150?
02:09:58.000 And on one side you had one guy who said no.
02:10:00.000 They bet a billion dollars and they endowed it with – they opened up a bank account.
02:10:07.000 They put in – they're using compound interest to get to that sum of money.
02:10:10.000 And they obliged – Fucking nerds.
02:10:12.000 Yes.
02:10:12.000 And they obliged their – What a great bet.
02:10:16.000 Yes.
02:10:17.000 And they designated the National Academy of Sciences, or some entity like that, that would adjudicate the bet in 150 years.
02:10:26.000 And they specified the kinds of documentation that might be needed.
02:10:30.000 And they allowed for in the future, there may be other ways of ascertaining how old someone is and those can be used.
02:10:35.000 And that's the bet.
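The mechanics of the endowment are just compound interest: a small principal left to grow for 150 years reaches an enormous sum. As a rough sketch (the stake, rate, and horizon below are assumptions for illustration, not the real terms of the bet):

```python
# Toy illustration of the compounding mechanism behind the bet.
# The stake and rate here are hypothetical, not the actual terms.

principal = 300.0   # hypothetical initial stake, in dollars
rate = 0.10         # hypothetical 10% annual return
years = 150

# Standard compound-interest formula: value = P * (1 + r)^t
value = principal * (1 + rate) ** years
print(f"${value:,.0f}")  # on the order of hundreds of millions of dollars
```

The point of the sketch is only that the exponent does all the work: at any plausible rate, 150 years of compounding turns pocket change into a fortune for the winner's heirs.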
02:10:37.000 So you might be right about that.
02:10:39.000 Like, you know, there are humans that live naturally to be 120. We have that capacity.
02:10:43.000 Actually, here's an interesting idea.
02:10:45.000 Why do we die at all?
02:10:46.000 Why has natural selection never given us an immortal species?
02:10:49.000 Have you ever thought about that?
02:10:52.000 Yeah, I have.
02:10:53.000 I have never reached a conclusion, but I always figured you live long enough, well, especially up until recent history, only long enough to recognize it was all crazy hustle.
02:11:08.000 That's more philosophical.
02:11:09.000 I'm looking for a scientific answer.
02:11:11.000 Here's one answer for why we're not immortal.
02:11:13.000 So if you think about it, why would natural selection not have created a creature that lived forever?
02:11:19.000 Why should we die?
02:11:22.000 Okay, so here's the one answer.
02:11:23.000 It's not known for sure if this is the answer, but this is a good answer.
02:11:27.000 Imagine there are two different kinds of things that can kill you, intrinsic causes and extrinsic causes.
02:11:33.000 So things inside your body that result in you dying, defects, diseases, and so forth, or things outside your body, like accidents, lightning strikes, trees fall and you just die, and so forth.
02:11:46.000 Because it's impossible to eliminate all extrinsic causes, because some people are going to die from accidents, it would be inefficient from the point of view of evolution to evolve to be immortal.
02:11:58.000 Because we would have all this capacity to be immortal.
02:12:01.000 We would have these bodies capable of immortality, which, let's say, would be evolutionarily demanding, like to evolve anything, like an eye or a brain or any quality. Lactase, right?
02:12:11.000 Like we talked about earlier, you don't have lactase persistence into adulthood because it's not needed.
02:12:17.000 So evolution doesn't waste anything.
02:12:18.000 There'd be no reason for that.
02:12:19.000 So there would be no reason, the argument goes, to evolve immortality because inevitably some people would be killed eventually by accidents anyway.
02:12:34.000 So unless you can create a world in which there are no accidents, there are no extrinsic causes of death, it would be inefficient from an evolutionary point of view to evolve immortality.
02:12:45.000 So the reason we die naturally, some people think, is that there are unnatural causes of death in the world, like accidents.
02:12:57.000 If we could eliminate the unnatural causes, so that nowhere, at no time, were we ever killed by trees falling or lightning strikes or things like that, then actually over time we would evolve to live indefinitely.
02:13:14.000 This is the theory.
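The argument can be made concrete with a little arithmetic. Assuming a constant yearly extrinsic hazard (the 1% figure below is invented for illustration, not from any real mortality data), even a body with no intrinsic aging at all still has a finite expected lifespan:

```python
# Minimal numerical sketch of the extrinsic-mortality argument.
# The hazard rate is illustrative, not from any real dataset.

def survival(hazard, years):
    """Probability of surviving `years` given a constant yearly
    extrinsic hazard (accidents, lightning, falling trees)."""
    return (1 - hazard) ** years

hazard = 0.01  # assume a 1% chance per year of dying by accident

print(survival(hazard, 100))   # ~0.37: most "immortals" die within a few centuries
print(survival(hazard, 1000))  # ~0.00004: almost none reach a thousand years

# An intrinsically immortal organism still lives ~1/hazard years on average,
# so selection gains little from engineering bodies that last longer than that.
expected_lifespan = 1 / hazard
print(expected_lifespan)
```

This is why, on the theory being described, investment in indefinite bodily maintenance is wasted effort from evolution's point of view: accidents cap the payoff regardless.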
02:13:16.000 It's a crazy idea.
02:13:18.000 It is fascinating, but do you think that nature had that sort of foresight?
02:13:22.000 Well, it's not a foresight, but that's how natural selection works.
02:13:24.000 Think about, like, if I have suddenly magically transformed your body at great expense to make you capable of immortality, and then two days from now you're hit by a bus.
02:13:35.000 I've wasted all that effort.
02:13:36.000 But if you've only done it to one person, you've wasted that effort.
02:13:38.000 If you did it to other people, you have the potential to create an incredibly wise person with a thousand years of life and experience and education and learning.
02:13:46.000 Yeah, but he also would die.
02:13:48.000 He also would die.
02:13:49.000 So everyone eventually would die from these extrinsic causes.
02:13:51.000 Perhaps.
02:13:52.000 Well, no, that's the assumption in the model.
02:13:55.000 If it's not perhaps, if in fact there are no extrinsic causes, if in fact there is a world in which you're never struck by lightning, never hit by a bus, never hit by a tree branch, then the theory is that we would have evolved to be immortal.
02:14:07.000 So it's almost like the life that you live, you're inevitably going to get killed by extrinsic causes.
02:14:13.000 Yes.
02:14:13.000 And if you extend that life to a thousand years, then it's absolutely going to happen.
02:14:17.000 Yes.
02:14:17.000 Therefore, why bother?
02:14:18.000 That's just living in a bubble, just terrified of the world, of fallen rocks landing on your head.
02:14:24.000 Yeah, but you can't take this theory and this model and apply it to an individual and an individual life.
02:14:30.000 It's about how our species evolved.
02:14:31.000 It's not about how you should live your life.
02:14:33.000 I mean, it's also true.
02:14:34.000 I don't think you should live your life afraid.
02:14:37.000 I think that's a difficult – I think that's a sad life to live a life afraid.
02:14:41.000 It takes practice to be unafraid.
02:14:43.000 I wonder if you'd be more afraid if you could live a thousand years without an accident.
02:14:47.000 You know, because if you're one of those crazy rock climber dudes like Alex Honnold.
02:14:51.000 Yeah, he's crazy.
02:14:52.000 He's crazy.
02:14:53.000 I love him, though.
02:14:54.000 Have you met him?
02:14:55.000 Yeah, I've had him on a couple times.
02:14:56.000 Oh, my God.
02:14:57.000 Yeah, he's awesome.
02:14:58.000 I'm sure he is.
02:14:59.000 You know, his amygdala, of course, his amygdala is fucked up.
02:15:01.000 You know this, right?
02:15:02.000 What do you mean?
02:15:02.000 He has no fear.
02:15:03.000 No, he does have fear.
02:15:04.000 You're wrong.
02:15:05.000 Oh, really?
02:15:05.000 He absolutely has fear.
02:15:06.000 He just understands his capacity and his ability.
02:15:09.000 You think he's rational?
02:15:10.000 He says, I can do this, therefore I should not be afraid?
02:15:12.000 Because I read that he scanned his brain and that his fear centers are different than the rest of us, is what I read.
02:15:17.000 Maybe that's wrong.
02:15:18.000 I don't know.
02:15:18.000 Did he tell you?
02:15:19.000 He didn't say anything about that.
02:15:21.000 No, I think he's just freakish.
02:15:22.000 I don't know about that, man.
02:15:24.000 He said, basically, that the experience, he just stays mellow and calm, and that if things go wrong, it's really bad.
02:15:32.000 Like, you don't want to be freaking out.
02:15:35.000 Yes.
02:15:35.000 It's like cave divers.
02:15:37.000 Yeah.
02:15:37.000 You don't panic when you're underwater and you lose your way.
02:15:40.000 Right.
02:15:40.000 It consumes a lot of oxygen.
02:15:42.000 There's an amazing story that my friend Donald Cerrone, he's a UFC fighter, told about being trapped in a cave and just barely getting out when he was running out of oxygen.
02:15:51.000 Yeah.
02:15:53.000 Horrible, crazy, scary story.
02:15:55.000 And you have to – those guys are also different.
02:15:56.000 Either they're born that way or they learn to be that way.
02:15:59.000 You have to keep calm because when you and I lose our cool and start hyperventilating, our oxygen consumption skyrockets.
02:16:06.000 Right.
02:16:06.000 And that's the opposite of what you need to do in that situation.
02:16:09.000 That's actually what he talked about.
02:16:10.000 Yeah.
02:16:10.000 You know, like trying to stay calm and battling the demons.
02:16:14.000 Yes.
02:16:14.000 I'm not going to die like this.
02:16:16.000 Yes.
02:16:16.000 Yeah.
02:16:17.000 What an incredible story.
02:16:19.000 Yeah.
02:16:19.000 The Alex Honnold thing – There is something?
02:16:22.000 Yeah, I watched the movie.
02:16:23.000 He got an MRI and they said that.
02:16:24.000 Here's a quote.
02:16:25.000 What did it say?
02:16:26.000 Yeah, his amygdala is different.
02:16:27.000 But what did it say?
02:16:28.000 How did it say it?
02:16:29.000 The kid's amygdala isn't firing.
02:16:31.000 Yes.
02:16:32.000 Okay, but isn't that possible that that's just through development of constant practice of staying calm while you're in life-threatening situations?
02:16:40.000 It's possible.
02:16:41.000 I would like to see fighters' brains measured in that regard.
02:16:44.000 I would like to see soldiers, special forces guys.
02:16:48.000 Yes, I think that's right.
02:16:49.000 And the guys, the special force guys, it's like the capacity to shoot back when you're being shot at, keeping your calm, moving positions, and so forth.
02:16:58.000 Those are all very important abilities, not panicking.
02:17:01.000 And it is also the case that some people, for example, the most famous study in this regard was a study of London taxi drivers.
02:17:07.000 London taxi drivers can go from any point in the city to any other point in the city.
02:17:11.000 It's called the knowledge.
02:17:12.000 They have a mental map of the whole city and it's freakish.
02:17:15.000 It takes years to be able to know how to navigate the city with tens of thousands of street names and they can do it by like dead reckoning.
02:17:25.000 They scanned – this was a paper about 10 years ago.
02:17:27.000 They brain scanned these guys and they had – I forgot which region of the brain, but they had, through learning, it is felt,
02:17:34.000 modified that region of their brain.
02:17:36.000 So it's possible Holland is like you say, that he learned to be this way, that his amygdala isn't firing because he trained himself.
02:17:44.000 But I think – Honnold.
02:17:45.000 Honnold.
02:17:45.000 Honnold.
02:17:46.000 I'm sorry.
02:17:47.000 Honnold is this way because he learned this way.
02:17:48.000 But it's more likely, I think, that he's like Usain Bolt, who was born with an incredibly high preponderance of fast-twitch fibers in his legs, so he can run like the wind.
02:17:59.000 And he trains as well.
02:18:00.000 You have both, right?
02:18:01.000 Good athletes require both.
02:18:03.000 Innate ability plus training.
02:18:05.000 And I think Honnold is probably like that.
02:18:08.000 He's probably born with an amygdala, doesn't fire so much, and he's an amazing climber.
02:18:13.000 It's purely speculative, right?
02:18:14.000 And also the nature versus nurture would apply to chess players as well.
02:18:18.000 I would like to see their brain scanned, like Gary Kasparov.
02:18:22.000 I know Gary, yes.
02:18:23.000 I would love to see that guy's brain scanned.
02:18:25.000 Yes.
02:18:25.000 Yeah, he's an interesting guy.
02:18:27.000 What?
02:18:27.000 I'd say the article goes way more into depth than what I just showed you about just that sentence.
02:18:32.000 She's the doctor who studied him.
02:18:34.000 She specifically looks at people that go under high stress and looks for those kinds of things.
02:18:39.000 She's been doing that since 2005, I guess.
02:18:41.000 And she goes, it's pages long, this whole thing about his brain.
02:18:45.000 But it is unusual.
02:18:46.000 But it also, the amount of time, think about people that are in high stress.
02:18:50.000 High stress is one thing.
02:18:51.000 This kid is in a life-threatening, absolute fatality situation.
02:18:58.000 Yes, mistake is death.
02:18:59.000 Every day.
02:19:00.000 I know.
02:19:01.000 All day.
02:19:02.000 I know.
02:19:03.000 I mean, he lives in a van and just climbs.
02:19:06.000 Yes.
02:19:06.000 That's what he does.
02:19:07.000 It's really fascinating.
02:19:08.000 Yes.
02:19:08.000 Yes.
02:19:09.000 It is.
02:19:10.000 It's amazing.
02:19:11.000 Honestly, it's amazing.
02:19:11.000 So, I mean, I don't know.
02:19:12.000 I'd never met him.
02:19:13.000 I admire him very much and I love this.
02:19:15.000 Like we said at the beginning, it's very important to have skills of any kind.
02:19:20.000 So, his skills are amazing.
02:19:22.000 I admire musical skills and carpentry skills and martial arts skills and statistical skills and medical skills.
02:19:28.000 I admire skills.
02:19:29.000 You know, I think it's worth cultivating.
02:19:31.000 I do as well.
02:19:31.000 Yeah, and it's worth cultivating those skills.
02:19:33.000 Well, you find out more about yourself through acquiring these skills and knowledge and information and just abilities.
02:19:39.000 You learn, and you also learn about how to acquire skills.
02:19:43.000 Yes, I think that's right.
02:19:45.000 And I think it's also a kind of – you also find oftentimes that the practice of acquiring a skill – It teaches you other things that can then be used in other areas.
02:19:55.000 So even if you – like you make the effort to learn the violin or to learn Chinese, for example, or whatever, some effort, that self-discipline then can be translated into something that you're not so good at, but it's still useful to have that.
02:20:10.000 That's the Miyamoto Musashi quote from The Book of Five Rings: once you know the way broadly, you see it in all things.
02:20:16.000 Yes.
02:20:18.000 That's very good.
02:20:19.000 I remember – my mind is flashing back to what we were talking about immortality.
02:20:23.000 Do you remember that scene at Helm's Deep in The Lord of the Rings when they're protecting the castle and the elves come and help the humans?
02:20:36.000 Do you know the movie?
02:20:36.000 You probably know the movie.
02:20:37.000 Yeah.
02:20:38.000 And I was always very sad when these elves were killed, because if they hadn't been killed by extrinsic forces, they would have lived a long time.
02:20:46.000 They would have been immortal.
02:20:47.000 So it's like an especially sad loss.
02:20:49.000 Anyway, I had wanted to mention that earlier when we were talking about that thing.
02:20:51.000 So, no, what was the saying again you just said about- Once you know the way broadly, you'll see it in all things.
02:20:56.000 Yes.
02:20:57.000 That's right.
02:20:57.000 Yeah.
02:20:58.000 It was just about acquiring excellence in something.
02:21:01.000 Yes.
02:21:02.000 And that you understand what it takes to acquire excellence in something.
02:21:05.000 And you can apply that to other things as well.
02:21:07.000 It's the same process.
02:21:08.000 Yes.
02:21:08.000 Just a different path.
02:21:10.000 Yes.
02:21:10.000 That's right.
02:21:11.000 Well, listen, Nicholas, thank you so much for being here.
02:21:14.000 Thank you so much.
02:21:14.000 I can't wait to read this book.
02:21:15.000 I'm going on vacation, so I'm going to take this with me.
02:21:17.000 All right.
02:21:18.000 And I really enjoyed talking to you, man.
02:21:20.000 I really did.
02:21:21.000 Thank you so much.
02:21:22.000 Joe, thank you so much for having me.
02:21:22.000 I really appreciate it.
02:21:23.000 I'm really grateful.
02:21:24.000 Thanks, man.
02:21:24.000 Thank you so much.
02:21:25.000 Bye, everybody.