The Jordan B. Peterson Podcast - July 14, 2019


12 Rules: London, Ontario: The inevitability, utility and danger of hierarchies


Episode Stats

Length

1 hour and 57 minutes

Words per Minute

165.65

Word Count

19,463

Sentence Count

1,211

Misogynist Sentences

7

Hate Speech Sentences

29


Summary

This episode is a throwback to one of Dad's 12 Rules for Life lectures, recorded on July 21st, 2018 at Centennial Hall in London, Ontario. Dr. Jordan B. Peterson discusses the inevitability, utility, and danger of hierarchies in the world, and why it's important to have a problem in mind when you're talking about them, because otherwise, why bother talking or writing at all? And, as always, thank you so much for all of the support and love you've shown over the past few months. The outpouring of support has been unbelievable, and I can't thank you enough for all the words of encouragement. If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better. Go to Daily Wire Plus now and start watching Dr. Peterson's new series on depression and anxiety, and let that be the first step towards the brighter future you deserve.


Transcript

00:00:00.960 Hey everyone, real quick before you skip, I want to talk to you about something serious and important.
00:00:06.480 Dr. Jordan Peterson has created a new series that could be a lifeline for those battling depression and anxiety.
00:00:12.740 We know how isolating and overwhelming these conditions can be, and we wanted to take a moment to reach out to those listening who may be struggling.
00:00:20.100 With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way in his new series.
00:00:27.420 He provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward.
00:00:35.360 If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better.
00:00:41.800 Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety.
00:00:47.460 Let this be the first step towards the brighter future you deserve.
00:00:57.420 Welcome to Season 2, Episode 17 of the Jordan B. Peterson Podcast.
00:01:03.380 I'm Michaela Peterson, Dad's daughter and collaborator.
00:01:06.420 This week's episode, titled 12 Rules London, Ontario, The Inevitability, Utility, and Danger of Hierarchies,
00:01:13.820 is a throwback to one of Dad's 12 Rules for Life lectures from London, Ontario, recorded on July 21st, 2018 at Centennial Hall.
00:01:21.900 Quick update on the family situation. It's still dire. Mom's surgical complication isn't better yet.
00:01:28.060 Life just keeps on throwing curveballs one after the other.
00:01:31.060 We're thinking of going to the States for care, so hopefully that'll be what happens.
00:01:34.860 It's really difficult to make decisions when you're stressed to the gills.
00:01:38.020 Weird evolutionary flaw, eh?
00:01:40.220 When you need your brain the most in stressful situations, sometimes you're so stressed you can't think.
00:01:45.380 That doesn't seem useful at all.
00:01:47.360 The outpouring of support has been unbelievable, so thank you again.
00:01:50.560 It's almost like having a peripheral support system. It's really cool.
00:01:56.560 When we return, 12 Rules London, Ontario, The Inevitability, Utility, and Danger of Hierarchies.
00:02:10.700 Please welcome my father, Dr. Jordan B. Peterson.
00:02:20.560 Thank you.
00:02:25.380 Thank you very much.
00:02:29.020 I was a little worried about London.
00:02:32.200 I thought maybe we saturated the southern Ontario market,
00:02:36.560 because two days ago only 60% of the tickets had sold,
00:02:41.260 but they told me just before I came on tonight that we're sold out.
00:02:45.080 So, that's really nice.
00:02:47.920 Yeah.
00:02:51.160 And Dave made it, despite the torrential downpour and the trip from L.A.,
00:02:57.220 so it looks like the omens are good, as far as I can tell.
00:03:00.880 So, I thought I would talk to you tonight.
00:03:06.040 I always like to have, there's two things I like to do in these lectures.
00:03:09.760 I like to have a problem in mind, you know, because, like, when you're writing an essay, for example,
00:03:16.200 if you're writing something, you need to have a problem that you're trying to solve when you're writing,
00:03:20.980 because otherwise, why bother writing?
00:03:22.600 And it should be a problem that you actually care about,
00:03:27.080 because otherwise, what you're writing is just nonsense,
00:03:29.440 and you're going to bore yourself to death,
00:03:31.400 and you're going to write something trivial,
00:03:33.160 and whoever reads it isn't going to care.
00:03:35.160 So, that isn't really much of a way to spend your time.
00:03:38.140 And when you're talking, it's the same thing,
00:03:40.040 is that you should have a problem, right?
00:03:41.580 Because why else are you talking, unless you're trying to solve a problem?
00:03:45.580 And maybe you could be talking because you think you have the solution,
00:03:50.300 but you probably don't.
00:03:51.420 And, well, you know, not if it's a complicated problem,
00:03:55.000 but you can try to formulate the problem more clearly,
00:03:57.340 and you can try to generate a solution,
00:03:59.500 and that's a good reason for a talk.
00:04:03.440 And that's what I like to do with these lectures,
00:04:07.220 is every night I come out, I want to have a problem in mind
00:04:10.040 that the whole lecture centers around.
00:04:12.380 And then I also use the lecture as an opportunity to further my thought.
00:04:17.740 So, I'm going to, of course, talk about, to some degree,
00:04:21.420 at least, to talk about the rules in 12 Rules for Life.
00:04:25.000 But I don't want to just go over what I've already,
00:04:29.860 what would you say, concluded.
00:04:31.960 I want to push what I'm thinking past where it's already got to.
00:04:36.980 And so, this is, and then I think that that's useful,
00:04:41.400 not only practically, so that I can think more clearly,
00:04:45.300 but I actually think it's what you want to do when you have a lecture,
00:04:47.940 because partly what people want to hear when they hear someone talk
00:04:52.500 is whatever conclusions they've come to.
00:04:54.740 But more particularly, and I really noticed this with the talks
00:04:58.560 that I've been doing with Sam Harris,
00:05:00.100 people really like to see the act of thoughts coming to be.
00:05:06.020 Like, they like to see the thinking itself.
00:05:08.960 And, well, because it's an adventure if you're actually thinking,
00:05:13.040 because you don't know where you're going to go.
00:05:15.160 You know, I saw this, this is what artists do, by the way,
00:05:20.640 real artists, not propagandist types.
00:05:24.600 Well, there's a difference.
00:05:25.860 A propagandist is someone who already knows where they're going when they start.
00:05:29.920 And then, whatever they do is just to justify where they're going.
00:05:33.220 But a real artist doesn't know where he or she is going to begin with.
00:05:37.040 I saw this great film once.
00:05:38.540 You can find it on YouTube.
00:05:39.620 It's a black-and-white film of Picasso painting back in 1957.
00:05:44.840 I think if you typed into YouTube,
00:05:46.480 black-and-white Picasso painting video, you'd probably find it.
00:05:50.180 And they had Picasso painting on a window and were filming from the other side
00:05:56.760 so you could see what he was doing.
00:05:58.720 You know, so it was really interesting because you don't often get to see,
00:06:01.500 or you don't ever get to see a great artist actually doing what it is that they do.
00:06:06.240 Picasso, here's something cool about Picasso.
00:06:09.000 There's many things cool about Picasso,
00:06:10.820 but this just goes to, this is a good illustration
00:06:13.900 of how vastly different people are in their abilities.
00:06:17.600 So, there's a site also online called the Online Picasso Project, logically enough.
00:06:24.560 And somebody, the person who generated that site,
00:06:27.320 was trying to document all of Picasso's works,
00:06:30.100 which is way harder than you think.
00:06:32.940 You think, so, just guess in your imagination for a minute,
00:06:37.280 just estimate how many paintings and pieces of art Picasso produced.
00:06:41.380 So, imagine he produced, had a productive life of about 70 years, something like that.
00:06:46.960 So, you know, you figure 70 years,
00:06:49.420 what's he knocking out, a painting a week or maybe a month?
00:06:52.160 So, that's 12 a year, something like 900 paintings in his lifetime.
00:06:56.100 It's a lot of paintings, eh?
00:06:57.200 Because I suspect that most of you have produced zero paintings.
00:07:01.020 So, 900 paintings or artworks is a lot.
00:07:04.500 But that isn't even close.
00:07:07.120 You're not even in the ballpark.
00:07:08.760 He produced 65,000 pieces of art.
00:07:12.280 Right, it's absolutely amazing.
00:07:14.340 And Johann Sebastian Bach, he wrote so much music that if you hired someone just,
00:07:20.100 and he wrote it by hand, of course, because he wrote hundreds of years ago.
00:07:24.340 If you hired a modern copyist just to copy what J.S. Bach wrote,
00:07:28.640 it would take him 20 years of eight-hour days just to copy what J.S. Bach composed.
00:07:37.700 So, anyways, so that's absolutely beyond comprehension, right?
00:07:41.500 I mean, people can be so good at things, you just can't possibly imagine it.
00:07:45.780 And, anyways, an artist, when an artist is working,
00:07:48.560 an artist is trying to puzzle something out.
00:07:51.960 And artists, like visual artists, do this.
00:07:54.460 You know, you think the purpose of a visual artist is to produce something beautiful.
00:07:57.420 And that's not the purpose of a visual artist.
00:08:00.320 The purpose of a visual artist is to solve a complex problem of perception.
00:08:05.580 And, although you may not know it,
00:08:07.280 the way you look at the world is deeply shaped by the way great artists have learned to look at the world.
00:08:12.380 Because they produce their artworks, and they present the world in a certain way,
00:08:15.960 and then that affects everything.
00:08:17.440 It affects the way movies portray the world,
00:08:20.520 and it affects the way television portrays the world, and advertisers.
00:08:23.520 And so, you come to see the world through the eyes of great artists.
00:08:27.580 And what they're doing is solving complex perceptual problems.
00:08:31.080 And so that you can see differently.
00:08:32.960 And they don't know exactly how they're doing it.
00:08:34.980 So, when they filmed Picasso painting on this glass,
00:08:38.820 it's so fun to watch him, because he'd paint, and then he'd erase,
00:08:41.540 and then he'd paint, and then he'd erase, and then he'd paint some more,
00:08:43.840 and he'd cover that over, and then he'd wipe the whole thing clean,
00:08:47.120 and then he'd start again.
00:08:48.040 And it was just constant playing with the forms.
00:08:51.280 You know, and at some point, he was done.
00:08:53.400 And then he'd be on to the next one.
00:08:55.420 But it was the same thing.
00:08:57.500 Constant playing with perceptual forms.
00:08:59.400 And the art is actually just a record of his exploration.
00:09:04.100 And I went and visited an artist yesterday,
00:09:07.140 Thaddeus Bruneau, who lives in Hamilton, in a little town, Ancaster.
00:09:15.560 That's it.
00:09:16.500 And I did a video for him at the opening of his gallery presentation about two years ago.
00:09:23.180 He makes these interesting paintings that they look,
00:09:26.400 they almost look like a rusty metal when you're up close,
00:09:29.440 or an old wall, something like that.
00:09:32.540 But if you get back 10 feet, then you can see a ghostly face
00:09:35.920 that sort of emerges from the abstraction.
00:09:38.240 And they're really beautiful faces.
00:09:40.740 They kind of look like the face that Michelangelo might have produced.
00:09:44.480 He's very influenced by that high Renaissance art.
00:09:47.800 And it's this ghostly emerging of form from a chaotic background.
00:09:52.400 And I was looking at what he was doing yesterday.
00:09:54.620 And he's introduced collages.
00:09:57.300 And the same thing, faces coming out of the background.
00:10:00.740 But he's overlaid them now with collages cut from magazines that look like maps.
00:10:06.420 And so he's playing with this idea of form emerging from the void.
00:10:10.780 But he's playing.
00:10:11.800 I saw him do the same thing.
00:10:13.900 He sands down the collage.
00:10:15.480 And he puts new things on it.
00:10:16.800 And he's constantly playing.
00:10:18.640 And so a piece of art is a record of that creative play.
00:10:23.820 And that's what I hope to do in lectures,
00:10:25.720 to play with the ideas and see where the hell they might go.
00:10:29.580 And, you know, I just did four talks with Sam Harris,
00:10:34.040 two in Vancouver on June 24th and 26th.
00:10:37.460 And then we went to Dublin and to London just a week ago.
00:10:41.640 And that was really something.
00:10:42.680 We met Douglas Murray there.
00:10:44.440 And so he was part of the...
00:14:45.920 Bret Weinstein was with us in Vancouver.
00:10:48.840 And he was that professor who got chased out of Evergreen College,
00:10:51.800 the biologist, about a year ago.
00:10:53.720 And so...
00:10:54.140 And then Douglas Murray joined us in Dublin and London.
00:10:58.820 And he's a columnist for The Spectator and an author and a very intelligent guy.
00:11:02.720 And it was really interesting.
00:11:05.200 Sam and I were talking about the relationship between facts and values,
00:11:08.600 which is a fairly abstract philosophical set of concepts.
00:11:12.040 Or maybe the relationship between religion and science,
00:11:14.680 which is another way of viewing the same thing.
00:11:16.740 We're trying to figure out how it is that people can take the complex world
00:11:20.640 of objective facts that manifests itself in front of us
00:11:24.160 and derive from it principles for action and for perception.
00:11:29.060 How do we get from what is to what should be?
00:11:32.200 It was a crucial, crucial question.
00:11:34.260 It's actually the question that your entire brain has evolved to answer.
00:11:38.500 It's a really hard question.
00:11:40.620 Because it's the question of how should you conduct yourself in the world.
00:11:44.040 And there isn't a harder question than that.
00:11:46.260 And so it was so interesting to do these talks for a whole bunch of reasons.
00:11:50.140 First of all, Sam is a very articulate exponent of the materialist atheist perspective,
00:11:56.160 which is a very powerful perspective.
00:11:57.680 And so it was good to talk to him just to face that.
00:12:01.460 Because one of the negative consequences, I think, of materialist atheism
00:12:05.740 is that it tilts people towards nihilism.
00:12:08.160 You might say, well, that's an inevitable consequence of a particular kind of harsh truth.
00:12:12.740 And that might be the case.
00:12:14.460 But I think that the fact that that viewpoint tilts people towards nihilism
00:12:19.360 is actually an indication that there's something wrong with the viewpoint.
00:12:22.980 And we talked about that for 10 hours, basically, right?
00:12:26.740 Two and a half hours both times in Vancouver and then again in Dublin and in London.
00:12:31.840 And there were 3,000 people at each of the Vancouver shows.
00:12:35.660 And so that was pretty unbelievable to begin with, that people think about that, eh?
00:12:41.480 People are a lot smarter than we think.
00:12:44.160 You know, and I've really been thinking about this a lot
00:12:47.280 because it's unbelievable that so many people would come out for a discussion like that
00:12:50.860 because it was as high a level discussion as Sam and I could manage.
00:12:54.660 And so you could imagine that there could be a higher level discussion.
00:12:57.600 But still, it was about the same level as a pretty decently conducted PhD defense
00:13:04.840 at a credible university.
00:13:06.720 And yet the audience was just lasered right in, you know?
00:13:10.620 And we talked for an hour, and then we were going to shift to Q&A.
00:13:13.840 But we didn't because we asked the audience if we should just continue the discussion.
00:13:17.620 And everyone roared, you know, we asked them to vote by clapping.
00:13:21.660 And it was overwhelmingly the majority of the audience wanted the discussions to continue.
00:13:25.940 So 3,000 people sat there for two and a half hours at each of the venues
00:13:31.100 as we walked through this abstract, complex, philosophical discussion.
00:13:37.040 And so, and I mean, I think people were there to see whether Sam's viewpoint would prevail
00:13:42.940 or whether my viewpoint would prevail.
00:13:44.820 Sort of, that's why they came.
00:13:46.220 But that isn't really what held people there.
00:13:48.640 What held people was their willingness to participate in an active discussion,
00:13:53.880 to see where it was going to go.
00:13:55.480 Because it wasn't like Sam or I knew where this was going to go.
00:13:59.080 You know, I'd never met him before.
00:14:00.800 We did two podcasts together, but I'd never met him in person.
00:14:03.840 We had no bloody idea if we were going to be able to talk sensibly with each other for,
00:14:09.440 well, for 10 hours in total.
00:14:11.420 You know, so it was unscripted.
00:14:12.860 And we met beforehand.
00:14:14.080 We had dinner, and we talked about what we were hoping to accomplish.
00:14:17.740 And, you know, we kind of outlined the topics we might cover, but it was completely off the cuff.
00:14:24.460 You know, and so a risk.
00:14:26.800 And that's another thing that you want to see when you go see someone speak.
00:14:30.040 You want to bloody well see them take a risk.
00:14:32.600 You know, because there's no tension.
00:14:34.700 There's no dynamism in the discussion, in a lecture, unless you know the person might fail.
00:14:39.600 And so if you just read your notes, you just stick to the script, well, you're not going to fail completely.
00:14:45.740 You're not going to fall on your face, but you're certainly not going to succeed because you don't produce any tension.
00:14:51.320 And so what people were participating in, essentially, was the process by which the ideas were coming to be.
00:14:59.020 And that's a really exciting thing to participate in.
00:15:01.560 And then I've been reflecting on this a lot as well because I've done 55 or 56 cities.
00:15:08.220 My wife and I have been traveling around a lot in the last four months.
00:15:11.800 And, you know, I've been speaking to audiences of this size or up to twice this size, I guess.
00:15:16.960 This is a smaller venue.
00:15:18.720 And people come out.
00:15:21.160 And people are also watching these long-form interviews on YouTube.
00:15:26.560 You guys know about them.
00:15:27.580 And the sorts of things that Joe Rogan is doing, three hours long, or Rubin, with a slightly shorter format.
00:15:34.860 People are right into those discussions, you know.
00:15:37.600 And all sorts of people, you know.
00:15:39.880 There's lots of working-class guys that come up to me after my talks and say,
00:15:43.440 look, I've been listening to the podcast when I'm doing long-haul trucking or I'm running my forklift or whatever it happens to be.
00:15:50.660 And it's all of a sudden, it seems to be the case that there's an immense public hunger for long-form, high-level discourse.
00:16:00.520 And a lot of that's been revealed by these new technologies.
00:16:03.760 So here, I'll tell you one more story before I start talking about what I actually want to talk about tonight.
00:16:07.940 But when I was in London, three or four days ago, I went on this television show called Hard Talk.
00:16:20.420 And I went to the BBC studios.
00:16:23.280 And there was a round glass table in this little room, little sound studio, with a couple of video monitors behind it.
00:16:30.320 And it kind of looked polished, you know, like a television show often does.
00:16:33.880 And the interviewer came out.
00:16:37.840 We chatted a bit.
00:16:38.860 He was a decent guy.
00:16:40.180 The kind of guy that looks like he would belong on television, you know,
00:16:42.800 because people who do that sort of thing are sort of disproportionately good-looking.
00:16:47.140 It's like a prerequisite for it.
00:16:48.740 Not always, but generally speaking.
00:16:51.460 And the camera started to roll.
00:16:54.800 I had 25 minutes, which is a long time for network TV.
00:16:59.160 The camera started to roll.
00:17:00.820 And he stood up and he read the intro on the teleprompter.
00:17:04.780 And we talked a little bit before that.
00:17:06.740 And he seemed all right to me.
00:17:08.160 You know, we had a decent conversation.
00:17:10.420 And he apologized for making me wait, even though he hadn't.
00:17:13.860 And so he was being civil and all of that.
00:17:18.100 And then he read the intro off the teleprompter.
00:17:21.000 And then he sat down to interview me.
00:17:22.780 And it was so weird.
00:17:24.100 I felt for a minute like I'd been, it was like it was in a little time warp.
00:17:27.720 And I'd been propelled back to something like 1970.
00:17:30.500 I all of a sudden felt that this was done, this format.
00:17:35.000 It was over.
00:17:36.120 And the reason I felt that was because I was trying to have a conversation with him.
00:17:40.820 Because we'd sort of started a conversation before the cameras got rolling.
00:17:44.600 And there was a human being there that I was talking to.
00:17:47.320 The same thing happened when I talked to Kathy Newman to begin with in the Channel 4 interview that some of you may have seen.
00:17:53.340 To begin with, I was talking to a human being.
00:17:57.760 But when the cameras rolled, I was no longer talking to a human being.
00:18:01.120 And it was less the case with the guy from Hard Talk.
00:18:05.380 He wasn't as committed to whatever it was that she was committed to when she was interviewing me.
00:18:12.480 But it was kind of off-putting because he had some questions and they were all listed on a list of questions.
00:18:22.620 You know, and I guess that's okay.
00:18:24.540 But it isn't.
00:18:25.340 You know, when you go talk to somebody, you sit down and have a serious conversation.
00:18:28.540 It's not like you bring a list of questions.
00:18:30.260 You kind of assume that if you're having a discussion, that the discussion will proceed organically, without pre-scripting.
00:18:39.120 And that your response to the person you're talking to will be dependent on their response.
00:18:45.360 Right?
00:18:45.780 And that makes it kind of a dance.
00:18:47.600 You even see this, by the way, with mothers and infants.
00:18:50.160 Little bitty infants, you know, like newborns.
00:18:52.600 Or if you take a mother who's interacting with her infant properly, and you videotape her interacting with the infant, and you speed up the videotape, you can see them dancing.
00:19:03.880 Like, there's a response from the infant, and there's a response from the mother, and they automatically engage in a dance.
00:19:09.760 And there was no dancing in the Hard Talk studio.
00:19:12.540 There was none.
00:19:13.580 And it really, I had a little epiphany.
00:19:16.000 I thought, oh, I see what's happening.
00:19:17.640 I'm actually not talking to a person here.
00:19:20.440 I'm talking to a puppet.
00:19:21.760 But, like Pinocchio, I'm talking to someone whose strings are being pulled from behind the scenes.
00:19:27.600 And this isn't an insult to the interviewer, by the way.
00:19:30.880 I suppose it is, just to a tiny degree.
00:19:33.480 Well, I'll tell you why it is and why it isn't.
00:19:37.400 It isn't, because then I thought, well, because I kept trying to, he'd ask me a tough question.
00:19:43.160 Because, like, it was hard talk, right?
00:19:44.580 So, I'm going to get a tough question.
00:19:46.420 But all the questions are already pre-made.
00:19:48.920 And so, they're not that tough.
00:19:50.780 They're, because someone, a bunch of people, a committee sat down and wrote them out.
00:19:55.180 And so, they're not really dependent on me being there, right?
00:19:58.520 And they're certainly not dependent on him being there.
00:20:01.200 Because he's just the mouthpiece of the committee.
00:20:04.200 And then I thought, well, and I kept trying to get under that and talk to him.
00:20:09.140 You know, because I was talking about serious things.
00:20:11.040 And I hadn't crafted my responses.
00:20:13.760 I hadn't decided what the outcome of the discussion was going to be.
00:20:18.080 I wasn't using it to sell books or any of those things.
00:20:20.800 If it sells books, that's just fine.
00:20:22.480 And I've got no problem and a certain moral obligation to, you know, to publicize my book.
00:20:28.700 Because lots of people are depending on its success.
00:20:31.360 But I didn't go into hard talk thinking, this is how I want this to go.
00:20:35.200 And this is how I'm going to answer the questions.
00:20:36.840 Because of that, I went on there thinking, I'm going to talk to this guy and see what happens.
00:20:41.600 Which is a much more entertaining, in some sense, way of progressing.
00:20:45.280 But there was none of that with him.
00:20:47.060 And I kept trying to get under his questions and talk to the person, you know.
00:20:52.620 But that wasn't happening.
00:20:54.920 And then I thought, well, why isn't it happening?
00:20:57.520 Then I thought, well, of course it's not happening.
00:21:00.180 And it isn't him.
00:21:01.180 And that's why this isn't an insult about him.
00:21:05.180 Well, first of all, television, studio television, is unbelievably expensive.
00:21:10.140 Right?
00:21:10.820 Especially if it's a big network.
00:21:12.120 It's hundreds of thousands of dollars per minute.
00:21:14.040 You don't muck about with that.
00:21:16.560 And there's advertisers who are advertising and paying for it.
00:21:19.980 And they don't want to see a mistake, that's for sure.
00:21:22.360 And so, of course, it's going to be scripted right down to the last detail.
00:21:27.880 And so, there's nothing dynamic about the conversation.
00:21:31.100 And therefore, there's nothing real.
00:21:32.920 And so, it ends up being something more like canned entertainment or scripted.
00:21:37.360 It's like propaganda.
00:21:38.660 Back to the artist issue.
00:21:40.240 It's like propaganda instead of art.
00:21:42.400 And the medium, to use Marshall McLuhan's term, the medium can't afford the risk of genuine interaction.
00:21:54.300 So, there isn't any.
00:21:56.120 And then, so that's really interesting.
00:21:57.700 That's really worth thinking about.
00:21:59.180 Because that's something that television, and to a lesser extent, radio, and to an even lesser extent, classic print media has done.
00:22:06.440 It's made everything scripted, and part of the reason for that is because it's too expensive to not script it.
00:22:12.280 Okay, but YouTube has blown away the bandwidth requirements of TV, and so have podcasts, right?
00:22:18.620 Length is free.
00:22:20.940 And so, you can take risks.
00:22:22.980 And, and, not only can you take risks, but the length is free.
00:22:30.640 So, Sam Harris was talking to Dave Rubin a couple of weeks ago on Rubin's podcast.
00:22:38.660 And he was talking about the difference between the new media and the old media.
00:22:44.820 Just for your information, you might find this interesting.
00:22:48.180 The London Times published a statistic two days ago saying that more people in the UK are now watching television, so to speak, on YouTube and online than through the networks.
00:23:04.140 It's tilted.
00:23:05.320 So the majority are now online.
00:23:08.180 So, that's a big deal, man.
00:23:09.500 That's a walloping big deal.
00:23:10.960 And, as far as I can tell, the networks are just, they're in a death spin, and they're dying so fast that it's beyond belief.
00:23:16.880 And, of course, so are the newspapers.
00:23:18.780 And that's why they've gone cap in hand to the federal government, for example, which is definitely a sign of their demise, having to do that.
00:23:25.820 So, and I'm not saying that with any great joy.
00:23:28.660 It's just a technological fact.
00:23:31.560 And the reason YouTube and online TV is killing network TV is because YouTube can do absolutely everything that network TV can do, and a bunch more, with way less expense, in a much wider format.
00:23:45.180 And so, of course, it's going to kill it.
00:23:47.520 And so, when it, here's another consequence, and this is relevant to what happened with Harris in Vancouver, and in Dublin, and London, and here, with all, with all of you people coming out tonight.
00:23:59.920 TV makes people look stupid.
00:24:02.480 Narrow bandwidth, to be more precise, makes people look stupid.
00:24:06.960 So, Harris said to Rubin, so let's say he goes on, John Anderson on CNN, and John Anderson's really interested in what Harris has to say, and gives him six minutes, which is staggeringly generous by TV criteria.
00:24:25.420 Right? You're lucky if you get 30 seconds, and even if you get 30 seconds, it isn't usually you that gets the 30 seconds.
00:24:31.720 It's your face shows up, and someone says for 15 seconds what it took you 30 seconds to say.
00:24:36.640 So, it's really mediated.
00:24:38.500 And so, everything is compressed into this tiny little channel.
00:24:42.440 And so, well, and then you might think, well, what happens?
00:24:45.300 So, everything, everybody looks stupid, because you can't take something complex and compress it into a tiny little channel like that without oversimplifying it like mad.
00:24:54.480 A soundbite, right?
00:24:55.860 And then, of course, politicians, and everybody who's trying to act in public, have to craft their message to fit the soundbite, or they don't get any time at all.
00:25:05.460 And then, so that's not good.
00:25:07.600 It's like, I'm not going to think much of you if I have to look at you through an opening that's only this big all the time.
00:25:13.480 I'm not going to think there's much of you at all, or much to you.
00:25:16.820 And then I think what happens with TV is two other things.
00:25:20.320 First of all, the journalists that operate on TV have to be those who will accept being scripted, because they'll just leave if they can't accept it.
00:25:30.660 So, the guy that I was talking to on Hard Talk, part of the reason I couldn't get underneath him to talk to him was because he hadn't been there for like 20 years.
00:25:39.540 He'd been scripted for so long, that's what he did.
00:25:42.860 And I explained why, so fair enough.
00:25:45.580 And he adapted to the medium, or the medium chased out all the people who weren't like him, and that's all that was left.
00:25:52.120 But now, all of a sudden, there's no bandwidth requirement.
00:25:57.620 There's no bandwidth restriction.
00:25:59.960 And so what's happened?
00:26:01.200 Well, how many of you watch, how many of you binge on like Netflix series?
00:26:06.760 Yeah, okay.
00:26:09.820 So, that's really cool.
00:26:10.740 So, this is another thing that's really interesting.
00:26:12.720 So, you know, the plot complexity of TV shows has shot up massively since the 1970s, eh?
00:26:18.520 So, if you're trying to figure out how intelligent the audience was, the intelligence of the audience in the 1970s is nothing compared to the intelligence of the audience now.
00:26:29.120 You might think, well, great, we're so much smarter.
00:26:31.180 It's like, well, no, yes, perhaps somewhat smarter.
00:26:34.340 But mostly, the bandwidth restriction is gone.
00:26:38.460 And so it turns out that people don't want half-hour sitcoms, or even one-and-a-half-hour made-for-TV movies.
00:26:44.740 Because that was pushing the envelope on TV, man, 90 minutes, you know.
00:26:48.520 It's no, you want 48 hours of dense drama with multi-layer characters.
00:26:53.040 You actually want, what you want actually looks a lot like literature.
00:26:57.680 You know, because the closest analogues to stories like Breaking Bad, say, or The Sopranos are great literature.
00:27:05.700 Like the Russian literature, with its multitude of characters and its layered plotting and its complex themes.
00:27:11.260 Breaking Bad being a very good example of that, because that's a, what would you call it?
00:27:15.220 That's a variant on the theme of Beyond Good and Evil, or a variant on the theme of Crime and Punishment.
00:27:20.580 And it approaches that complexity and, it approaches the complexity and depth of great literature.
00:27:26.520 And so it turns out that, hey, look at that, we're all smart enough to actually appreciate great literature.
00:27:31.920 Maybe not in its written form, but who cares, in some sense.
00:27:36.900 It's the density of the ideas that matters.
00:27:40.180 And so, you remove the bandwidth requirements, and we look way smarter than we did.
00:27:45.020 And then the same things happen with YouTube and the podcast.
00:27:48.860 It's like, well, now you can listen to a three-hour discussion, despite your fragmented attention span.
00:27:54.000 That was the theory.
00:27:54.680 These young people have no attention span whatsoever.
00:27:57.680 It's like, that's wrong, clearly.
00:28:00.760 And so, if you have the opportunity to listen to an in-depth three-hour discussion that's real-time and that's spontaneous, there's a huge market for it.
00:28:10.240 And so, that's absolutely cool.
00:28:12.880 And it's just in time, too, because we have a lot of complex problems to solve and a lot more coming up because of the rate of technological transformation.
00:28:20.600 It turns out we're capable of having discussions that are much more profound than anybody realized.
00:28:25.920 So, that's very cool.
00:28:28.920 We have a new medium, or a set of new media.
00:28:31.480 The other thing you see is with podcasts, because they're also revolutionary.
00:28:35.820 So, video online is revolutionary because it allows for this long-term, in-depth discussion on your terms, on your time.
00:28:46.420 With no production barrier, and delivered in a format that people can engage in discussion with.
00:28:53.820 Because you can put up your own damn videos and cut things up and comment on them.
00:28:57.020 And so, right now, people, with my videos online, there's about 300 videos that I put up, people are cutting 20,000 clips a week out of them and commenting on them.
00:29:08.980 So, there's this huge, so the technology enables this discussion that wasn't possible with television.
00:29:14.860 Two-way discussion.
00:29:15.920 So, that's very cool.
00:29:17.020 And then, if that's transformed into podcasts with Rogan, Rogan's a great example.
00:29:21.360 So, I think Joe Rogan is the most powerful interviewer who's ever lived, if you look at just sheer numbers.
00:29:26.700 So, he gets 1.5 billion downloads of his podcast a year.
00:29:31.440 150 million a month, right?
00:29:33.180 Which is just absolutely beyond comprehension.
00:29:36.040 I asked him at one point, I said, Joe, you know, I think you're probably the most powerful interviewer that ever lived.
00:29:40.880 What do you think about that?
00:29:42.380 And he said, I just try not to think about it.
00:29:45.700 So, but I'm sure Joe thinks about it because he's a lot smarter than, even if you think he's smart, he's actually smarter than you think.
00:29:56.700 So, he's quite an interesting person because he's like 95th percentile for tough because he was a fighter and 95th percentile for being in good physical shape and 95th percentile for being funny because he's ridiculously funny and 95th percentile for being smart.
00:30:15.360 So, he's quite the person to contend with.
00:30:17.800 You don't meet someone who lines up at the high end of the distribution on that many dimensions very often.
00:30:23.900 So, it's not by accident that he's where he is, even though he is a beneficiary of this technological revolution.
00:30:30.120 So, well, so, that's all very interesting as far as I'm concerned and it's helped me also account for why everyone is showing up to these talks.
00:30:42.220 We are on the cusp of a technological transformation in communication.
00:30:47.700 And it looks like it's one that's deepening our capacity to discuss things in an intelligent and profound manner.
00:30:55.840 And even more importantly, it looks like we're up to the task.
00:30:58.980 So, that's exceptionally cool.
00:31:01.000 So, and it's really a remarkable thing to be able to participate in the dawning of that, right?
00:31:07.320 With the podcasts, not only do you enable those long-form discussions, but you enable people to capitalize on found time, which is a big deal.
00:31:17.080 Because if you're going to read a book, you have to sit and read it.
00:31:19.080 And lots of people really haven't made friends with books.
00:31:21.820 They're not as literate as they might be.
00:31:23.740 And they're a little leery of books.
00:31:25.580 They don't buy them.
00:31:27.080 Very few people buy hardcover books.
00:31:29.060 I think it's one percent.
00:31:30.360 You know, only eight percent of people go to a movie in a theater once a month or more.
00:31:34.600 So, even movies are relatively, what would you call it?
00:31:41.420 It's a rarefied taste.
00:31:43.000 And books are even more so.
00:31:44.340 But God only knows how many people can listen.
00:31:47.320 Maybe it's ten times as many people can listen as can read.
00:31:50.440 You know, read thoroughly.
00:31:52.100 And then you can listen while you're gardening or driving or exercising and doing whatever else you might be doing.
00:31:57.820 And you can't read or watch videos when you're doing other things.
00:32:01.060 And so, you know, the audio book market is also exploding like mad.
00:32:06.320 And so, anyways, all this bodes extraordinarily well for our ability to communicate in effective and profound manner.
00:32:14.240 And hooray for that.
00:32:15.720 Maybe we'll all get a lot smarter than we were very, very rapidly.
00:32:19.620 And that would be wonderful.
00:32:21.160 And I do believe that it's possible.
00:32:22.740 Okay, so, that sort of brings you up to date, and me as well, on the sorts of things that I've been thinking about.
00:32:33.640 And now I want to talk to you a little bit about, mostly, I'm going to talk about the first rule in Twelve Rules for Life.
00:32:39.440 Because there's a problem that I want to address tonight.
00:32:41.740 And the problem is, it's the problem of the left.
00:32:48.160 But I'm not approaching it precisely as a critic of the left any more than I would be a critic of the right.
00:32:55.180 I want to approach the problem of the left as if it's a technical problem.
00:32:58.680 Because I do think it's a technical problem.
00:33:00.800 And I want to lay out why it's a technical problem.
00:33:02.980 So, a little bit of background for this.
00:33:05.880 And I'm going to do that in the context provided by the first rule, which is stand up straight with your shoulders back.
00:33:12.800 Which is a meditation, at least in part, on hierarchies and their permanence.
00:33:19.360 And the strategy that you need to implement if you're going to be successful in the permanent hierarchy.
00:33:26.780 Now, in Twelve Rules for Life, in the first chapter, I talk about hierarchies.
00:33:32.980 And there's a reason for that.
00:33:34.520 And the reason is, at least in part, to address, what would you call it, a profound, but also very attractive, criticism of the West.
00:33:53.080 It's capitalist structure, it's private property structure, it's individualism.
00:33:56.620 Leveled by the radical leftists, most particularly the Marxists.
00:34:00.820 Now, the Marxist types, with a little help from the post-modern types, tend to conceptualize the West as a patriarchy and as an oppressive patriarchy.
00:34:14.080 And that's a hierarchical structure, the patriarchy, with, in principle, a few people dominating the top of it.
00:34:21.100 And they lay the fact of that hierarchy and its unequal distributions at the feet of Western civilization and capitalism.
00:34:32.660 And so, it's partly what I'm trying to address in the first rule.
00:34:35.860 It's like, okay, let's take a look at that.
00:34:37.400 And what I was attempting to put forward was a proposition, which is, the problem is way deeper than that.
00:34:45.420 Like, we could give the devil his due and say that those who criticize the structure of hierarchies for their tendency to tilt towards domination by power,
00:34:56.440 and their proclivity to dispossess people, to make people stack up at the bottom, that's all accurate.
00:35:05.080 But it's even more accurate than the Marxist types presume, because they make the presumption that that's a consequence of the political system,
00:35:15.400 and the economic system, and the social system, and so forth, that exists, particularly in the West.
00:35:22.020 But that's not the case, because hierarchical structures that dispossess, that are rigid and dispossess, have been around for 350 million years.
00:35:34.980 And so, I traced them back to crustaceans, lobsters, somewhat famously now.
00:35:42.620 Now, I'm being pursued by lobsters everywhere I go.
00:35:45.140 People give me lobster oven mitts, and lobster salt and pepper shakers, and cheeses.
00:35:51.080 I've got more lobsters than you can shake a stick at, which isn't something I ever really planned, you know, but serves me right.
00:35:57.760 But the point that I was trying to make was, well, first of all, that hierarchies are the most common method for solving the problem of cooperation and competition
00:36:09.620 in relationship to scarce resources by living creatures, period, not capitalists, not westerners, not human beings, not even mammals, right?
00:36:23.640 Not even reptiles. Glorified insects have hierarchies.
00:36:29.700 And hierarchies have been around for so long that the most fundamental neurological structures of our nervous system,
00:36:38.740 the ones that run on serotonin, and serotonin is the neurochemical that actually sets up your nervous system
00:36:43.980 and kind of organizes its functions like a conductor organizes an orchestra.
00:36:49.220 Hierarchies have been around for so long that the primary phenomenon to which your nervous system has adapted
00:36:57.300 is, in fact, the hierarchy.
00:37:00.200 And that's the 350 million year problem.
00:37:03.740 So if we're going to talk about hierarchies and their problems, and we should, we'll get to that in a minute,
00:37:08.900 we're not going to lay the damn problems at the feet of the West or capitalism or even human beings.
00:37:15.000 We're going to look way deeper than that.
00:37:17.780 Okay, so...
00:37:18.900 Going online without ExpressVPN is like not paying attention to the safety demonstration on a flight.
00:37:26.100 Most of the time, you'll probably be fine, but what if one day that weird yellow mask drops down from overhead
00:37:31.560 and you have no idea what to do?
00:37:33.860 In our hyper-connected world, your digital privacy isn't just a luxury.
00:37:37.240 It's a fundamental right.
00:37:39.020 Every time you connect to an unsecured network in a cafe, hotel, or airport,
00:37:43.280 you're essentially broadcasting your personal information to anyone with the technical know-how to intercept it.
00:37:48.320 And let's be clear, it doesn't take a genius hacker to do this.
00:37:51.520 With some off-the-shelf hardware, even a tech-savvy teenager could potentially access your passwords,
00:37:56.640 bank logins, and credit card details.
00:37:58.880 Now, you might think, what's the big deal?
00:38:01.000 Who'd want my data anyway?
00:38:02.340 Well, on the dark web, your personal information could fetch up to $1,000.
00:38:06.420 That's right, there's a whole underground economy built on stolen identities.
00:38:11.240 Enter ExpressVPN.
00:38:12.980 It's like a digital fortress, creating an encrypted tunnel between your device and the internet.
00:38:17.680 Their encryption is so robust that it would take a hacker with a supercomputer over a billion years to crack it.
00:38:23.320 But don't let its power fool you.
00:38:25.040 ExpressVPN is incredibly user-friendly.
00:38:27.480 With just one click, you're protected across all your devices.
00:38:30.500 Phones, laptops, tablets, you name it.
00:38:32.580 That's why I use ExpressVPN whenever I'm traveling or working from a coffee shop.
00:38:36.820 It gives me peace of mind knowing that my research, communications, and personal data are shielded from prying eyes.
00:38:42.800 Secure your online data today by visiting expressvpn.com slash Jordan.
00:38:47.540 That's E-X-P-R-E-S-S-V-P-N dot com slash Jordan, and you can get an extra three months free.
00:38:54.240 ExpressVPN.com slash Jordan.
00:38:55.800 Right, and so let's say that you tilt towards the Marxist end of things, and you actually care about the dispossessed.
00:39:06.400 Not that you actually necessarily genuinely care about the dispossessed if you're a Marxist.
00:39:12.680 But you might be using that as a nice cover story, and a little bit of it might be true.
00:39:17.260 But let's say that, let's give the devil his due and say that, well, some people and some parts of each person actually do care about the dispossessed, the poor, the struggling, and all of that in a genuine manner.
00:39:29.700 And so then the first thing I would say is, well, if you actually do care about the dispossessed, then you should bloody well take the problem a lot more seriously than the Marxists do.
00:39:39.080 And that's partly what I was trying to do in chapter one.
00:39:41.980 It's a way worse problem than you think.
00:39:45.000 Okay, let's run through a couple of propositions.
00:39:49.940 Proposition number one.
00:39:52.080 You actually have to do something in the world.
00:39:55.560 You have to act.
00:39:56.900 And the reason for that is that you'll die if you don't.
00:40:00.780 You'll suffer, and then you'll disappear.
00:40:04.580 You can't just sit there and live.
00:40:07.680 For a little while you can, because there you are sitting there, and you're alive and all of that.
00:40:11.200 But it's not something you can do for a very long period of time.
00:40:14.120 You can't do it physiologically.
00:40:16.000 And you can't even do it psychologically.
00:40:17.900 Like, most people are very, very miserable if they're not actively engaged in something on a pretty regular basis.
00:40:24.300 So it's like the demand for action is built into you.
00:40:28.120 And it's been built into you ever since single cells started to move actively in the world.
00:40:33.280 That's a very, very long time.
00:40:36.100 So approach and avoidance, so that's movement towards and movement away from things, that's the most fundamental, apart from the ability to reproduce, it's like the most fundamental element of active life.
00:40:49.460 It's a very old issue that you have to act in the world.
00:40:52.180 All right, in order to act in the world, which you have to do in order to survive, then you have to, you have to value things.
00:41:00.540 So, what things?
00:41:04.120 Well, a small number of things.
00:41:06.060 Because part of what you're doing when you're acting in the world, or even looking at the world, which is actually a form of action, by the way.
00:41:12.860 Because when you look at the world, it isn't like you're a passive.
00:41:17.120 It isn't like your eyes are just taking in what's there.
00:41:19.960 Your eyes are moving around like mad, constantly.
00:41:22.780 They have little tiny movements called saccades.
00:41:26.600 And if they stop, then you go blind right away.
00:41:29.540 Your eye is moving constantly.
00:41:31.460 And then there's larger movements, because your eyes dart around all the time.
00:41:36.000 And so, I'm telling you that because it shows that perception is dependent on action.
00:41:40.820 And then not only that, you have to focus your attention on something rather than everything.
00:41:46.400 So, like when I'm talking to the audience, I don't just sort of glance blindly at everything.
00:41:51.880 I'm focusing very intently on a single person.
00:41:54.960 Not even on the person, but on their face.
00:41:57.400 And not even on their face, but on their eyes.
00:42:01.100 Like our focus is unbelievably intense and narrow.
00:42:06.580 And so, you have to act even to perceive.
00:42:11.180 And to perceive and act, you have to select.
00:42:14.160 And the way you select is by ignoring almost everything and privileging something.
00:42:19.580 And to ignore everything and to privilege something is to value, right?
00:42:23.840 Because what you're doing just by looking at something is acting out the proposition that one thing is more important than everything else.
00:42:30.820 So, there's no perception or action without a hierarchy.
00:42:35.440 Period.
00:42:36.700 You can't even see the world without a value hierarchy.
00:42:41.200 So, that's a really cool thing to know.
00:42:44.260 Sorry about that.
00:42:47.260 And then, so, a hierarchy.
00:42:52.220 A hierarchy is inevitable because you can't even see without one.
00:42:56.200 Let alone act.
00:42:57.160 You know, and if your hierarchy isn't quite pointed, so that you can figure out what the point of things is, then you're just confused.
00:43:08.600 It's like, it isn't even enough to reduce the world to two or three things that you value.
00:43:13.480 Because that doesn't work out.
00:43:14.940 If you wake up in the morning and you don't know which of two or three things to do, what you end up doing is nothing, right?
00:43:21.320 Because you're tortured by the conflict between the two or three things.
00:43:24.720 Even though there's maybe like 50,000 things you can do, even way more than that, reducing them to two or three doesn't do the trick.
00:43:33.120 You have to reduce what you're going to do to one thing, which you privilege above everything else.
00:43:39.740 Perception and action are dependent on a value hierarchy.
00:43:43.140 Well, there's no getting rid of the damn hierarchy.
00:43:44.940 Okay, next thing.
00:43:45.900 Let's say that you do decide that you're going to do something, and you start to act, but then it turns out that you're going to act in the world, in the social world.
00:43:55.180 Because you're not alone, ever, not really.
00:43:59.660 Maybe for brief periods of time.
00:44:01.320 Maybe if you live way the hell out in the bush, but you don't, and very few people do.
00:44:05.040 And even the people who do often have something that's not so right about them, and they tend to go off the rails very rapidly.
00:44:11.640 Because we don't do well as individuals, purely isolated.
00:44:15.800 You outsource your sanity.
00:44:18.100 You know, you surround yourself by people to whom you make yourself vaguely acceptable.
00:44:24.660 And then they slap you a little bit all the time to keep you in line.
00:44:28.220 And that's how you stay sane.
00:44:29.880 It's not like you got all that organized by yourself.
00:44:32.520 You certainly don't.
00:44:33.480 And so, you know that, because you have your family around you, and although they probably drive you crazy,
00:44:38.980 they also keep you on the straight and narrow to quite a remarkable degree.
00:44:43.540 And your friends do the same thing, and your employers, and so forth.
00:44:47.120 So, anyways, when you're out in the world, and you've decided to act,
00:44:52.440 you have to act in a context that's composed of all these other people.
00:44:56.020 And so how do you do that?
00:44:57.600 Well, generally what you do is you make a hierarchy.
00:45:00.880 That's what you do at work.
00:45:01.940 Look, there's a hierarchical structure.
00:45:04.260 You know, that's what you do in the government.
00:45:06.240 That's what you do in, well, that's even what you do in friendships, to some degree.
00:45:10.600 You can do that in bars.
00:45:11.860 You know, there's a hierarchy in a bar.
00:45:13.640 Everyone knows that.
00:45:15.260 So, especially if it's a bar where people are striving to meet each other.
00:45:19.000 There are people who are very successful at that, and people who aren't.
00:45:21.640 You organize your cooperation and competition in relationship to a valued goal into a hierarchy.
00:45:30.360 And then, and you have to, because otherwise you cannot organize what you do socially.
00:45:35.380 So, you can't see the world, and you can't act without a hierarchy.
00:45:38.220 And then you can't act in the world without producing a social hierarchy.
00:45:41.120 And that's why even lobsters have hierarchies.
00:45:42.920 And chickens, I wrote about them a little bit in the first chapter of Twelve Rules for Life, too.
00:45:48.680 So, chickens have a pecking order.
00:45:50.920 So, when the farmer comes out to feed the chickens, it's the celebrity chickens that get to eat first.
00:45:55.540 And the hanger's on, they're at the back, and they get the leftover grain.
00:45:59.800 And the price they pay for moving ahead in the hierarchy is that they get pecked a lot, maybe to death.
00:46:07.020 And so, the less successful chickens accept waiting for food as a viable alternative to being pecked to death.
00:46:14.500 And that's exactly what we do with our hierarchies,
00:46:16.880 is that instead of having a fight about who gets access to what all the time,
00:46:21.860 and risking the fight, and the fights are very, very unpleasant,
00:46:25.860 we formulate a hierarchy, and then everyone is positioned,
00:46:30.120 and then we all know whose turn it is when.
00:46:33.320 And then, and then if we're smart, we don't base the hierarchy on power.
00:46:38.620 This is another problem with the radical leftist view of the world.
00:46:41.640 Because we don't care that much about power, except when things deteriorate.
00:46:46.840 And power doesn't solve problems.
00:46:48.920 So, for example, there is a hierarchy of plumbers.
00:46:52.940 And, you know, if you go into any town, some of the plumbers are doing very well.
00:46:56.680 And those are usually the plumbers whose pipes don't leak,
00:47:01.260 and who get a reputation for being able to do the job properly.
00:47:04.660 And maybe they're also good business people, and maybe they have a marketing flair,
00:47:08.380 and maybe they're good at self-promotion.
00:47:10.020 Who the hell knows?
00:47:10.820 But at the basis of it, at least, especially with something like plumbing,
00:47:14.240 is they know how to put pipes together that don't leak.
00:47:18.180 And it turns out that we actually want pipes that don't leak,
00:47:21.780 because nobody wants a house full of sewage.
00:47:24.020 And so you actually try to find the competent plumber,
00:47:26.860 and you peel from the top of the dominance hierarchy.
00:47:29.160 And so, if we're smart, we make our hierarchies based on competence.
00:47:34.940 And competence would be the ability to pursue something of value,
00:47:39.940 but even more importantly, to pursue something of value that everyone values.
00:47:43.880 Not just you.
00:47:45.240 It has to be something that other people want, too.
00:47:48.260 And so it's okay that there's a hierarchy,
00:47:50.680 because what you want, if you're trying to solve a problem that everybody wants to solve,
00:47:55.440 is to produce a structure where the people who are the best at solving it
00:47:58.560 get to solve it.
00:48:00.780 Otherwise, you don't solve the problem.
00:48:02.700 So you need the damn hierarchy.
00:48:04.660 So let's stop complaining about them.
00:48:06.740 Let's stop assuming they're secondary consequences of the Western capitalist patriarchy.
00:48:11.840 Because they're not.
00:48:13.700 Now, that's a right-wing position.
00:48:17.060 Fundamentally.
00:48:17.700 Because one of the things that characterizes the right-wing is
00:48:20.440 comfort with and respect for hierarchies.
00:48:24.320 And so there's a place for the right, because we need the damn hierarchies.
00:48:29.000 And we should have some respect for them and some gratitude for them,
00:48:32.240 because we can't organize our perception without them,
00:48:35.060 and we can't act towards valued ends in the world without them.
00:48:38.800 But, and this is why the left is necessary.
00:48:46.480 Just because hierarchies are necessary,
00:48:49.720 and just because they've always existed, essentially,
00:48:53.820 doesn't mean they're without their problems.
00:48:57.340 And they have intrinsic problems.
00:48:59.280 And this is part of the postmodern criticism of hierarchies.
00:49:04.880 When you privilege something above everything else,
00:49:07.320 so that you can see it or act towards it,
00:49:09.400 you dispossess everything else.
00:49:12.200 You make it a lower status.
00:49:14.980 Now, that's okay when you're looking at the world,
00:49:16.760 because you can't look at everything at once.
00:49:19.980 It's an impossibility not to ignore and dispossess most of everything.
00:49:26.800 But if it's an economic situation, it gets kind of brutal.
00:49:31.720 You know, so we set up hierarchies,
00:49:33.160 even if they're hierarchies of competence.
00:49:35.560 First of all, a hierarchy of competence can degenerate, which it does.
00:49:41.040 You know, like the mafia is not precisely a hierarchy of competence.
00:49:46.620 It's a hierarchy of power.
00:49:49.340 And so as a hierarchy of competence deteriorates,
00:49:52.420 it can transform into a hierarchy of power.
00:49:54.320 And that means that people who aren't necessarily competent
00:49:57.820 come to dominate it and to extract resources from it.
00:50:01.920 And that's how hierarchies degenerate.
00:50:04.120 And they do degenerate.
00:50:05.740 And part of what the left says is,
00:50:08.000 yeah, you claim that your hierarchy is one of competence,
00:50:10.680 but I can see it becoming corrupted by nothing but power.
00:50:14.240 It's like, yes, that's a valid criticism.
00:50:16.600 We've known this for a long time. Even
00:50:18.980 the ancient Egyptians had a god of hierarchy.
00:50:21.940 That was Osiris.
00:50:23.380 I wrote about that a little bit in 12 Rules for Life,
00:50:25.920 but more in my book, Maps of Meaning.
00:50:27.580 And they also had a god of the degeneration of hierarchies.
00:50:31.220 And that god was named Set.
00:50:33.460 And Set, as a name,
00:50:35.920 becomes Satan
00:50:37.880 as the Christian revolution emerged out of its Egyptian and Greek background.
00:50:44.600 And so, we've known forever, at least in dramatic form,
00:50:50.200 that hierarchies of competence can degenerate into hierarchies of power,
00:50:54.240 and that that's bad news.
00:50:55.640 And that part of what we need to do is to be awake and be careful
00:50:58.940 and to adjust our hierarchies to make sure that they stay functioning
00:51:03.280 as the useful tools they're properly designed to be.
00:51:07.640 And then the other thing that the left says is,
00:51:12.580 well, what if you can't function in the hierarchy,
00:51:15.120 even if it's a hierarchy of competence?
00:51:17.900 Now, this is a very complicated problem,
00:51:20.020 and this is why I wanted to address this as a technical issue.
00:51:23.520 I mean, the first issue is,
00:51:25.600 there's going to be some time in your life
00:51:27.460 when you're not very good at maneuvering in the hierarchy,
00:51:30.700 even if you're competent, right?
00:51:31.880 Because you're going to be sick,
00:51:34.020 or someone close to you is going to be sick,
00:51:36.360 and that's going to take you out too.
00:51:38.360 You know, if you have a really close family member,
00:51:40.120 like a child, and they're ill,
00:51:42.020 it's like you're compromised in a serious way,
00:51:44.560 directly proportionate to their illness,
00:51:46.320 or maybe you have to take care of a sick parent.
00:51:50.180 And so, there's going to be some time in your life
00:51:52.680 where you're part of the dispossessed.
00:51:55.400 And that's going to be true for everyone,
00:51:58.960 and so we probably need to do something about that,
00:52:00.880 because it's a situation we're all going to find ourselves in
00:52:04.220 at one point or another.
00:52:07.460 And then, also,
00:52:09.100 there's the problem of the people
00:52:11.620 who are more permanently dispossessed.
00:52:13.460 So, here's a really vicious problem
00:52:16.280 that no one will talk about,
00:52:18.300 that everyone needs to talk about.
00:52:20.860 So, imagine you have this hierarchy,
00:52:23.900 and you have this hierarchy,
00:52:25.200 and you have this hierarchy.
00:52:26.280 So, you have a whole set of hierarchies,
00:52:27.560 because our society isn't a hierarchy,
00:52:29.720 it's a whole set of hierarchies.
00:52:31.340 And they're sort of loosely aggregated into a meta-hierarchy,
00:52:35.580 because there's recognized power structures
00:52:37.900 across all those sets of hierarchies.
00:52:40.400 Wealth might be one,
00:52:41.500 and political power might be another.
00:52:44.420 Then you think, well, are there...
00:52:47.020 It's good that we have all those hierarchies,
00:52:48.700 because if you're not good at one thing,
00:52:49.980 maybe you can be good at another.
00:52:51.520 And so, you're dispossessed,
00:52:53.800 because you don't know how to sing,
00:52:55.520 but, you know, maybe you're a pretty good engineer,
00:52:57.580 and so you don't have to sing.
00:52:59.580 And hopefully, in a pluralistic society,
00:53:02.980 we can set up a lot of different hierarchies,
00:53:05.480 and everyone can find one
00:53:07.080 in which they're a contender.
00:53:09.380 Even if you're not at the top,
00:53:10.780 that might not even matter.
00:53:12.060 What you really want is some hope
00:53:13.700 that you can move towards the top.
00:53:16.300 Right?
00:53:16.500 That's even better than being at the top,
00:53:18.180 I think, most of the time,
00:53:19.200 because people live more on hope
00:53:20.600 than actuality anyways.
00:53:22.000 And you need a vision for the future
00:53:23.560 and something to work toward.
00:53:24.820 And so, if you found a hierarchy
00:53:26.180 where your particular competence
00:53:27.620 might express itself properly,
00:53:29.640 then you have something to live for.
00:53:31.100 And hopefully, we could provide that for everyone.
00:53:33.480 And in a diverse society,
00:53:35.000 a properly diverse society,
00:53:36.240 to use a word I really despise
00:53:37.920 because of the way it's being used,
00:53:39.440 you might set up enough hierarchies
00:53:41.120 so that everybody would have a chance
00:53:42.480 to move toward the top.
00:53:43.860 And I think, in the West,
00:53:44.880 we've done a remarkable job of that.
00:53:47.000 You know, because most people
00:53:48.280 can find their place.
00:53:49.780 But here's a problem.
00:53:52.840 So, intelligence is a problem.
00:53:54.880 Because one of the things
00:53:56.200 that predicts your ability
00:53:57.460 to be successful in any hierarchy,
00:54:00.280 virtually any hierarchy,
00:54:01.660 is how intelligent you are.
00:54:03.480 And you can assess that with IQ tests
00:54:05.240 quite accurately, by the way,
00:54:06.760 despite the fact that people
00:54:08.040 don't like that idea.
00:54:09.280 And no wonder they don't like it.
00:54:11.440 And so, what that means is that
00:54:13.740 if you happen to be born
00:54:15.340 at the lower end
00:54:16.760 of the cognitive distribution,
00:54:18.280 through no fault of your own,
00:54:20.060 the probability that you're going
00:54:21.580 to end up among the permanently
00:54:23.100 dispossessed is quite high.
00:54:25.540 And so, I can give you
00:54:26.240 an example of that.
00:54:27.300 So,
00:54:29.380 the American Armed Forces
00:54:30.820 have been using IQ tests
00:54:32.580 to screen for officers
00:54:33.820 and enlisted men for a long time,
00:54:35.980 more than 100 years.
00:54:37.020 They're really good at it,
00:54:38.100 and they know what they're doing.
00:54:39.140 They've published a lot of the
00:54:40.180 seminal research on IQ.
00:54:42.600 And partly, the reason they did that
00:54:44.860 was because during wartime,
00:54:46.080 you want to screen a lot of people
00:54:47.120 really fast,
00:54:48.080 and you want to find people
00:54:48.880 who are generally competent,
00:54:50.240 and you want to make them
00:54:50.920 into officers,
00:54:51.740 and you want to do that
00:54:52.480 at an incredible rate,
00:54:53.840 because otherwise you lose.
00:54:55.580 So, like, you're driven
00:54:56.560 by absolute necessity,
00:54:57.860 and that's why the military
00:54:59.720 uses IQ tests,
00:55:00.780 because there isn't a better way
00:55:01.840 of screening people rapidly
00:55:02.980 for competence than IQ tests.
00:55:05.180 And so, one of the upshots of that
00:55:07.340 was that
00:55:09.720 the Armed Forces have produced
00:55:11.940 a lot of data on the relationship
00:55:13.440 between IQ and your ability
00:55:15.360 to function in the Armed Forces.
00:55:18.080 And 20 years ago,
00:55:19.460 something like that,
00:55:21.140 they produced a policy
00:55:23.320 that's now instantiated
00:55:24.780 into American law,
00:55:26.120 that if you have an IQ of less than 83,
00:55:27.820 you cannot be drafted
00:55:29.260 into the Armed Forces.
00:55:31.580 Why?
00:55:32.060 Now, think about this,
00:55:34.880 because you've got to think
00:55:35.540 about the conditions
00:55:36.240 under which a scientific proposition
00:55:38.420 might be valid.
00:55:39.900 So, if you discover a set of facts
00:55:41.620 that you don't like,
00:55:44.100 but you've discovered them,
00:55:45.600 and you're so convinced
00:55:46.560 by their truth
00:55:47.180 that you're willing to state
00:55:48.560 the set of facts
00:55:49.440 even though you don't like them,
00:55:50.940 then that's one piece of evidence
00:55:52.240 that they might be true.
00:55:53.520 Right?
00:55:53.720 They run contrary to your bias
00:55:55.740 rather than affirming it.
00:55:57.320 Now, the Armed Forces
00:55:58.260 wants every single person
00:55:59.640 they can possibly get their hands on
00:56:01.060 to be in the Armed Forces.
00:56:02.900 Right?
00:56:03.180 Because they have a chronic shortage
00:56:05.040 of people in wartime
00:56:06.260 for obvious reasons,
00:56:07.640 and they have a chronic shortage
00:56:08.740 of people in peacetime.
00:56:10.760 So, it's a matter of American policy,
00:56:13.160 necessity first,
00:56:15.020 to pull people into the Armed Forces,
00:56:16.660 and then it's actually also a measure,
00:56:18.460 especially in peacetime,
00:56:19.860 of compassion and care for people
00:56:22.980 to pull them into the Armed Forces.
00:56:24.440 Because one of the things
00:56:26.100 the Armed Forces have been used for
00:56:27.940 in the United States
00:56:28.660 is to pull people out of the underclass
00:56:30.660 and to provide them with
00:56:32.040 the means of social mobility
00:56:34.280 that might put them up
00:56:35.260 into the functional working class
00:56:36.720 or above.
00:56:37.720 And so, that's an explicit part of policy.
00:56:39.520 And so, you actually want to pull
00:56:41.120 as many people as you possibly can
00:56:42.600 into that training ground.
00:56:44.540 Nonetheless, they concluded
00:56:45.560 that if you had an IQ of 83
00:56:47.040 or less,
00:56:48.160 you can't be in the Armed Forces.
00:56:49.680 And the reason for that was
00:56:50.760 despite years of effort,
00:56:54.300 the people who make the selection decisions
00:56:59.680 have determined that
00:57:00.820 if you have an IQ of less than 83,
00:57:02.520 you cannot be trained
00:57:03.660 under any circumstances whatsoever
00:57:05.340 to do anything in the Armed Forces
00:57:06.860 that isn't positively counterproductive.
00:57:11.060 That's 10% of the population.
00:57:12.880 10% of the population.
00:57:18.360 And as our society
00:57:19.560 becomes more technologically complex,
00:57:21.200 which it is doing
00:57:22.040 at a very, very rapid rate,
00:57:24.480 the necessity
00:57:25.420 for higher-level cognitive function
00:57:27.140 is increasing, not decreasing.
00:57:29.500 And so, despite the fact
00:57:30.920 that we have hierarchies
00:57:32.420 and despite the fact
00:57:33.220 that we need them
00:57:33.920 and despite the fact
00:57:34.780 that we have a very large number
00:57:36.320 of hierarchies,
00:57:37.620 we also have a situation
00:57:38.860 where it's increasingly
00:57:40.640 going to be the case
00:57:42.360 that a narrow band
00:57:44.080 of the population,
00:57:45.140 10%, it's not that narrow,
00:57:47.260 is going to be permanently
00:57:48.600 set outside those hierarchies.
00:57:51.940 And that actually constitutes
00:57:53.100 a real problem.
00:57:54.600 Now, there's all sorts
00:57:55.280 of other reasons
00:57:55.860 you might get dispossessed,
00:57:57.200 but that's a big one, man.
00:57:58.700 And it's one that really
00:58:00.080 does seem arbitrary, too,
00:58:01.460 because, you know,
00:58:02.180 you might be a very smart person
00:58:03.480 and you might be happy about that
00:58:04.860 and you might be proud of it even,
00:58:06.580 but it's actually not your fault.
00:58:08.760 Like, you might have not done
00:58:10.120 anything so idiotic
00:58:11.600 that you made yourself
00:58:12.700 permanently stupid.
00:58:14.660 And congratulations for that.
00:58:17.500 But, you know,
00:58:18.860 IQ, to a very large degree,
00:58:20.680 is biologically determined.
00:58:22.540 And so, if you happen to be
00:58:23.600 in the upper fifth percentile,
00:58:25.160 let's say,
00:58:26.000 a tremendous amount of that
00:58:27.040 is cards you were dealt,
00:58:29.560 cards you were dealt at birth.
00:58:31.720 And, you know,
00:58:32.540 you might not like that idea.
00:58:34.480 Environment can make you stupider,
00:58:36.880 but it's very, very hard
00:58:38.880 for it to make you smarter.
00:58:40.980 And that's
00:58:42.360 part of the reason
00:58:43.060 people don't like the IQ literature,
00:58:44.580 and no wonder.
00:58:45.420 It's a harsh literature.
00:58:46.980 But then, you know,
00:58:47.600 so there's the problem
00:58:48.840 of people who aren't
00:58:49.800 at the upper end
00:58:52.240 or even in the middle
00:58:52.920 of the cognitive distribution.
00:58:54.000 There's a problem of people
00:58:55.000 who are impaired in some manner.
00:58:57.740 Their health is impaired,
00:58:58.660 physically or mentally.
00:58:59.740 Plenty of people like that.
00:59:02.000 There's all sorts of things
00:59:03.020 that might put you
00:59:03.720 outside the hierarchy,
00:59:05.060 even though the hierarchy
00:59:05.720 is necessary.
00:59:06.340 And the reason
00:59:07.760 that we need a left wing
00:59:08.820 is to keep
00:59:11.240 the criticism
00:59:12.900 of the hierarchy
00:59:13.760 out in the open
00:59:15.460 to make sure it doesn't
00:59:16.940 degenerate into a power game,
00:59:18.940 for example,
00:59:19.520 and to speak on behalf
00:59:21.600 of the genuinely dispossessed.
00:59:24.480 And then the question is,
00:59:25.800 well, how steep
00:59:28.720 and purely functional
00:59:30.260 should the hierarchy be?
00:59:32.760 That's something you might promote
00:59:33.900 if you're on the conservative
00:59:34.760 end of things.
00:59:35.700 And how flat and inclusive
00:59:37.880 should it be
00:59:38.600 to take care of the dispossessed?
00:59:40.020 And that's something
00:59:40.580 that you would push
00:59:41.240 if you're on the left wing
00:59:42.080 part of the spectrum.
00:59:43.480 And the answer is,
00:59:44.820 we don't know,
00:59:46.200 and it always changes,
00:59:47.580 so we need to talk about it
00:59:49.040 all the time.
00:59:50.180 And so that's why
00:59:50.800 we have political dialogue, right?
00:59:53.060 Yeah, well,
00:59:55.340 that's why free speech
00:59:56.820 is so necessary.
00:59:57.640 It's not just another right.
00:59:59.260 You have to be a fool
01:00:00.260 if you think that.
01:00:01.540 If you're a political figure
01:00:03.160 and you think that free speech
01:00:04.240 is just another right,
01:00:05.420 you have disqualified yourself
01:00:07.400 from holding office
01:00:09.600 because you don't understand
01:00:11.100 how the system works at all.
01:00:13.160 See, free speech
01:00:17.160 is the mechanism
01:00:17.840 by which the left
01:00:18.620 and the right
01:00:19.040 keep society in balance.
01:00:21.020 You don't mess
01:00:21.800 with that mechanism
01:00:22.640 unless you want everything
01:00:23.880 to become unbalanced.
01:00:25.660 Okay, so now we've made
01:00:27.160 a case for the left, right?
01:00:28.580 So the left is,
01:00:30.320 in its proper place,
01:00:31.320 it speaks for the dispossessed
01:00:32.580 and it's a critic
01:00:33.540 of the proclivity
01:00:34.740 for hierarchies
01:00:35.460 to become blind and rigid.
01:00:37.960 Okay, and,
01:00:38.640 and what motivates the left?
01:00:41.320 Well, we know
01:00:42.080 some temperamental features
01:00:43.200 that motivate people
01:00:44.120 on the left.
01:00:44.680 So people on the left
01:00:45.420 tend to be more creative.
01:00:47.320 They're more open.
01:00:48.380 Technically,
01:00:48.860 that's a big five trait
01:00:49.840 and it's associated
01:00:50.880 with the ability
01:00:51.540 to think laterally.
01:00:52.940 So if you're high in openness,
01:00:54.400 if someone throws you an idea,
01:00:56.060 you'll have a bunch
01:00:56.580 of other ideas.
01:00:57.780 And that's good
01:00:58.540 because that's creative,
01:00:59.560 but it's bad too
01:01:00.240 because most of your ideas
01:01:01.160 will be stupid
01:01:01.700 and dangerous.
01:01:02.720 Well, that's the price
01:01:03.840 you pay for creativity,
01:01:05.880 right?
01:01:06.160 And you can even see that
01:01:07.180 with creative people
01:01:08.020 because creative people
01:01:08.900 have a hard time
01:01:09.640 catalyzing an identity
01:01:10.960 because they can be
01:01:12.100 so many things,
01:01:13.080 it's hard for them
01:01:13.940 to be anything at all.
01:01:15.600 And so you see
01:01:16.360 creative people
01:01:17.160 who are often lost,
01:01:18.080 especially if they're also
01:01:19.040 high in negative emotion
01:01:20.140 because they can't
01:01:21.480 organize themselves
01:01:22.420 towards one thing
01:01:23.700 and they get scattered.
01:01:25.560 And so,
01:01:26.080 creativity per se
01:01:27.940 is necessary
01:01:29.440 because there's hard problems
01:01:31.260 and we haven't solved them
01:01:32.180 and we need new solutions,
01:01:33.300 but it's also
01:01:34.320 deeply problematic
01:01:35.160 because almost all new solutions
01:01:37.100 are stupid
01:01:37.800 and pathological.
01:01:38.640 So creativity,
01:01:42.180 like all other gifts,
01:01:43.300 comes at a tremendous price.
01:01:44.900 Anyways,
01:01:45.320 the left-wingers
01:01:45.960 tend to be more creative.
01:01:47.240 They also tend to be
01:01:47.980 less conscientious,
01:01:49.900 especially orderliness,
01:01:51.420 orderly,
01:01:52.240 which is part of conscientiousness.
01:01:54.120 Conservatives tend to be
01:01:55.060 more conscientious
01:01:55.860 and so they make good managers
01:01:57.160 and administrators.
01:01:58.960 They're really good at,
01:02:00.580 once someone's figured out
01:02:02.160 how to do something,
01:02:02.980 conservatives are really good
01:02:03.920 at doing it,
01:02:04.800 but they're not so good
01:02:05.600 at figuring out
01:02:06.120 how to do a new thing.
01:02:07.040 And so,
01:02:08.040 the way things look
01:02:08.780 economically
01:02:09.400 is that the liberal types
01:02:11.160 generate new business ideas,
01:02:12.840 most of which fail
01:02:13.740 catastrophically,
01:02:14.680 by the way,
01:02:15.500 and some of which
01:02:16.620 succeed immensely.
01:02:18.120 And then,
01:02:18.680 conservatives leap in
01:02:19.640 and run the new things
01:02:21.700 and implement them
01:02:23.380 insofar as they can be
01:02:24.880 implemented algorithmically.
01:02:26.480 And then they run them
01:02:27.480 until their conclusion,
01:02:28.720 the environment changes,
01:02:30.020 the company dies
01:02:30.820 because that solution
01:02:31.660 no longer works,
01:02:32.880 the company dies
01:02:33.560 and a creative person
01:02:34.780 comes up with a new idea.
01:02:36.840 So,
01:02:37.340 another reason why
01:02:38.140 the left and the right
01:02:38.900 are necessary
01:02:39.460 from a temperamental
01:02:40.360 perspective is
01:02:41.140 the liberal left types
01:02:42.600 create new things
01:02:43.680 but the right conservative
01:02:45.020 types run them.
01:02:46.220 And so,
01:02:46.640 we need each other
01:02:47.480 seriously.
01:02:49.120 And so,
01:02:49.400 we have to have
01:02:49.820 some appreciation
01:02:50.640 across the temperamental
01:02:52.660 and political divide.
01:02:54.100 Okay,
01:02:54.460 now,
01:02:54.760 the other thing
01:02:55.280 that seems to motivate
01:02:56.100 the left,
01:02:56.940 at least to some degree,
01:02:58.340 especially the more radical
01:02:59.540 end of the left,
01:03:00.680 is compassion.
01:03:01.480 We've done some research
01:03:02.960 that shows that
01:03:03.720 agreeableness,
01:03:04.560 which is the compassion
01:03:05.940 dimension in the big five,
01:03:07.800 predicts politically
01:03:08.500 correct belief.
01:03:09.800 And,
01:03:10.020 you can kind of understand
01:03:11.620 compassion as a motivator.
01:03:13.500 You know,
01:03:13.740 like if you're walking
01:03:14.400 down the street
01:03:15.080 and you run across
01:03:17.440 someone who's homeless,
01:03:18.400 some young person,
01:03:19.280 for example,
01:03:19.780 a runaway,
01:03:20.500 or even someone
01:03:21.140 who's sort of
01:03:22.120 in the throes
01:03:22.900 of alcohol addiction
01:03:24.640 and is lying on the street,
01:03:26.280 it's not a pleasant experience.
01:03:27.920 It's not like you
01:03:28.600 take a moment
01:03:29.580 to celebrate
01:03:30.300 the oppressive
01:03:31.100 hierarchy
01:03:31.940 when you see somebody
01:03:33.720 who's so obviously
01:03:35.820 fallen out
01:03:37.480 of the structure
01:03:38.100 of the world.
01:03:38.880 No one likes that.
01:03:40.460 And,
01:03:41.000 if you're a conscientious
01:03:43.780 conservative type,
01:03:44.520 you might be somewhat
01:03:45.160 judgmental
01:03:45.920 in a situation like that
01:03:47.140 because conscientious people
01:03:48.500 are less likely,
01:03:49.300 for example,
01:03:49.680 to give money
01:03:50.140 to homeless people,
01:03:51.360 partly because
01:03:52.220 they judge them
01:03:54.300 as well as
01:03:54.920 feeling compassion
01:03:56.160 towards them,
01:03:56.660 whereas the
01:03:57.120 compassionate,
01:03:58.720 low-conscientious types
01:03:59.840 are instantly
01:04:00.860 awash in
01:04:02.220 mercy and pity.
01:04:04.520 And,
01:04:04.800 there's something
01:04:05.600 admirable about that
01:04:07.460 in the same way
01:04:08.560 that watching a mother
01:04:09.860 take care of an infant
01:04:10.700 is admirable,
01:04:11.840 right?
01:04:12.240 Especially an infant
01:04:13.080 who's under six months old
01:04:14.300 because the right response
01:04:15.360 to an infant
01:04:15.920 is compassion.
01:04:17.980 So,
01:04:18.260 whatever an infant does
01:04:19.220 before it's six months old
01:04:20.420 is correct.
01:04:21.880 Whenever it manifests
01:04:23.060 any distress,
01:04:24.560 your job is to drop
01:04:25.620 whatever you're doing
01:04:26.220 and fix it.
01:04:27.220 Pure compassion.
01:04:28.060 And that's so deeply
01:04:29.520 wired into us
01:04:30.600 and appropriately so
01:04:32.180 that it's hard for us
01:04:33.800 to even understand
01:04:34.720 that a compassionate
01:04:35.760 response isn't always
01:04:36.900 the proper response
01:04:37.880 in every circumstance.
01:04:40.220 You know,
01:04:40.480 and as your children grow,
01:04:42.540 you start to temper
01:04:43.460 compassion with judgment,
01:04:45.900 right?
01:04:46.160 And you increase
01:04:46.880 the amount of judgment
01:04:47.680 as they mature
01:04:48.460 because you hold them
01:04:49.240 accountable for their actions.
01:04:50.740 And so,
01:04:51.020 compassion has to shade
01:04:52.200 into something like judgment
01:04:53.640 in order for development
01:04:54.860 to proceed.
01:04:55.880 You also see that
01:04:56.560 on the social level.
01:04:58.440 Compassion might be
01:04:59.260 a perfectly good
01:05:00.160 primary instinctive ethos
01:05:02.820 for dealing with people
01:05:04.360 who are truly,
01:05:05.620 utterly dependent.
01:05:07.320 But for setting up
01:05:08.860 complex organizations
01:05:10.360 like a business,
01:05:11.780 well,
01:05:11.920 there's no evidence
01:05:12.520 whatsoever
01:05:13.020 that agreeableness
01:05:13.740 is a predictor of success.
01:05:15.620 If you're an agreeable person,
01:05:16.760 you get paid less
01:05:17.640 and you don't do
01:05:18.500 a better job.
01:05:19.760 Exception there might be
01:05:20.880 if you're caring
01:05:21.620 directly for people.
01:05:22.940 So,
01:05:23.240 in healthcare,
01:05:24.600 customer service
01:05:25.320 and that sort of thing,
01:05:26.300 being agreeable
01:05:26.880 might give you an edge.
01:05:28.000 But mostly,
01:05:28.660 it's conscientiousness,
01:05:29.640 which is a cold virtue.
01:05:31.220 It's the ability
01:05:31.800 to forego immediate gratification,
01:05:33.600 for example,
01:05:34.100 and to work dutifully
01:05:35.460 and orderly
01:05:36.480 towards a kind of a cold end.
01:05:38.780 So,
01:05:39.040 compassion doesn't seem
01:05:39.920 to scale.
01:05:40.880 But nonetheless,
01:05:42.700 it's necessary
01:05:43.620 for people
01:05:45.340 who are truly
01:05:45.880 in trouble.
01:05:47.040 And
01:05:47.260 it has this
01:05:48.980 feeling
01:05:49.920 of moral virtue
01:05:51.480 that's automatically
01:05:52.580 associated with it.
01:05:53.620 And I think
01:05:54.340 that's part
01:05:54.780 of the reason
01:05:55.200 why...
01:05:56.080 So,
01:05:56.300 here's the other issue.
01:05:57.180 It's like,
01:05:57.660 the right wing
01:05:58.340 can go too far.
01:05:59.660 The right seems
01:06:00.360 to go too far
01:06:01.060 when it
01:06:01.580 holds up
01:06:03.600 a hierarchy
01:06:04.840 as absolute
01:06:05.660 or when it starts
01:06:06.980 to make claims
01:06:07.640 of something like
01:06:08.240 ethnic or racial superiority.
01:06:10.900 You box the right wingers in,
01:06:12.120 you say,
01:06:12.420 no,
01:06:12.600 you've gone too far.
01:06:14.200 When does the left
01:06:15.120 go too far?
01:06:16.420 The answer is,
01:06:17.680 well,
01:06:17.840 we're not exactly sure,
01:06:19.280 even though we know
01:06:19.980 that the left
01:06:20.760 can go too far,
01:06:21.840 if we have any sense,
01:06:22.900 because we read
01:06:23.840 our 20th century history
01:06:25.180 and we understand
01:06:26.260 that when the left
01:06:27.800 went too far
01:06:28.780 in the 20th century,
01:06:29.800 it destroyed
01:06:30.680 100 million people.
01:06:32.460 So,
01:06:32.960 the evidence
01:06:33.400 is absolutely
01:06:34.620 overwhelming
01:06:36.340 that the left
01:06:37.720 can go too far.
01:06:39.460 But it's not so easy
01:06:40.540 to figure out when.
01:06:41.860 And that's actually
01:06:42.460 a problem
01:06:42.860 that's causing us
01:06:45.700 an immense amount
01:06:46.280 of trouble
01:06:46.660 in our society
01:06:47.360 at the moment.
01:06:47.940 It's part of what's
01:06:48.560 driving polarization
01:06:49.420 because we don't know
01:06:51.620 how to box it in.
01:06:52.620 And then the question,
01:06:53.740 the question I'm trying
01:06:54.420 to address,
01:06:54.940 at least in part,
01:06:55.560 is,
01:06:55.700 well,
01:06:55.840 why not?
01:06:57.720 Compassion is part of it.
01:06:59.600 It's like compassion
01:07:00.240 has the aspect
01:07:01.460 of virtue about it.
01:07:03.580 And it's especially true
01:07:05.040 when people are
01:07:05.780 genuinely dispossessed.
01:07:07.680 Okay,
01:07:07.920 and we don't know
01:07:08.220 how to solve that problem.
01:07:09.400 How much compassion
01:07:10.360 is enough?
01:07:13.080 Try making the rule,
01:07:14.180 man.
01:07:14.400 It's really difficult.
01:07:15.360 Like,
01:07:15.720 those of you
01:07:16.180 who have children
01:07:16.840 and are debating
01:07:18.200 between you,
01:07:19.180 you know,
01:07:19.440 mother and father,
01:07:20.200 generally speaking,
01:07:20.860 about how to
01:07:21.720 discipline your children.
01:07:23.040 Well,
01:07:23.200 that's always
01:07:23.580 the discussion
01:07:24.120 you're having.
01:07:24.700 It's like,
01:07:25.040 well,
01:07:26.080 compassion or judgment
01:07:27.280 here?
01:07:28.420 And,
01:07:28.720 you know,
01:07:28.900 if you're the
01:07:30.060 devouring mother type,
01:07:31.540 technically speaking,
01:07:32.300 it's all compassion.
01:07:33.780 And if you're the
01:07:34.280 harsh,
01:07:34.820 tyrannical father,
01:07:35.820 then it's all judgment.
01:07:37.100 And both of those
01:07:37.820 are wrong.
01:07:39.080 Every decision
01:07:39.880 is some balance
01:07:41.180 between mercy
01:07:42.380 and justice.
01:07:43.600 And when you have
01:07:44.900 children,
01:07:45.880 you're doing that
01:07:46.600 in real time.
01:07:47.280 You're trying to figure
01:07:47.800 out how to get
01:07:48.220 that balance right,
01:07:49.080 which is the same thing
01:07:50.000 you have to do
01:07:50.460 in a functioning
01:07:50.920 political system,
01:07:51.900 right,
01:07:52.380 to get that balance
01:07:53.180 right.
01:07:54.660 So,
01:07:55.720 okay,
01:07:56.000 the compassion problem
01:07:56.840 is a big problem.
01:07:57.860 Here's another problem.
01:07:59.820 The left tends
01:08:00.680 to produce utopian vision,
01:08:03.040 right?
01:08:03.280 So,
01:08:03.620 for the communists,
01:08:04.240 it was the eternal
01:08:05.380 brotherhood of man.
01:08:06.880 And everyone.
01:08:07.920 It was an inclusive
01:08:08.800 utopian vision,
01:08:10.080 unlike the Nazi vision,
01:08:11.340 which was for us
01:08:12.480 and not for you.
01:08:14.100 And so,
01:08:14.380 that kind of made it
01:08:15.000 suspect right from
01:08:15.900 the beginning,
01:08:16.480 especially if you
01:08:17.740 weren't included
01:08:18.480 in the us,
01:08:19.400 right?
01:08:20.180 So,
01:08:20.680 but the communists,
01:08:21.760 they came out with
01:08:22.240 this universal vision
01:08:23.340 of brotherhood
01:08:24.000 and the promise
01:08:25.400 of a better future.
01:08:26.780 And the thing is,
01:08:28.160 that's attractive.
01:08:30.020 And it's really,
01:08:30.680 it's attractive
01:08:31.160 even psychologically
01:08:32.040 because one of the
01:08:33.340 things that you are
01:08:34.200 working towards
01:08:34.960 is a better future,
01:08:36.000 right?
01:08:36.220 I mean,
01:08:36.780 you go out there
01:08:37.520 and you do something
01:08:38.320 and the reason you do it
01:08:39.180 is because you think
01:08:39.840 things will be better
01:08:40.640 if you do it.
01:08:41.980 And so,
01:08:42.360 you're always moved ahead
01:08:44.040 by the promise
01:08:45.040 of the dawning utopia
01:08:47.840 in a sense.
01:08:48.460 You're always trying
01:08:48.900 to improve your life.
01:08:49.880 And you say,
01:08:50.220 well,
01:08:50.700 of course you're trying
01:08:51.940 to improve your life.
01:08:52.600 It's sort of like
01:08:53.060 built into the definition
01:08:54.260 of improve.
01:08:55.840 And so,
01:08:56.160 it's really easy
01:08:56.660 to hook you
01:08:57.400 with a utopian vision
01:08:59.080 because it fits right
01:08:59.940 into your psychology.
01:09:01.060 And it's also not easy
01:09:02.080 to say what's wrong
01:09:02.960 with it.
01:09:03.260 It's like,
01:09:03.940 well,
01:09:04.080 what's wrong with
01:09:04.720 the utopian vision
01:09:05.680 of eternal brotherhood?
01:09:07.200 Isn't that what we want?
01:09:10.100 It's like,
01:09:10.560 the answer is,
01:09:11.320 well,
01:09:11.860 yeah,
01:09:12.300 kind of.
01:09:13.820 But another answer is,
01:09:15.180 the devil's in the details.
01:09:17.500 Okay,
01:09:17.840 so that's another reason
01:09:19.080 why it's hard
01:09:19.600 to box in the left
01:09:21.880 when it goes too far
01:09:22.700 because they offer
01:09:23.380 this universalist,
01:09:24.920 utopian vision
01:09:25.660 that hooks into
01:09:27.260 our psychology.
01:09:27.980 I was talking
01:09:30.360 to some very smart
01:09:32.260 people in London
01:09:33.740 two or three days ago
01:09:35.400 and we were talking
01:09:36.280 about the utopian vision
01:09:37.420 and one of the people
01:09:38.560 there,
01:09:38.860 who was a professor
01:09:39.480 at Cambridge,
01:09:40.140 if I remember correctly,
01:09:41.360 said something really
01:09:42.300 interesting about the utopia.
01:09:44.020 He said,
01:09:44.520 well,
01:09:44.640 what's the danger
01:09:45.320 of a utopian vision?
01:09:46.420 We've already outlined
01:09:47.180 what might be good about it.
01:09:49.340 Well,
01:09:50.380 you'll make sacrifices
01:09:51.620 to bring about
01:09:52.340 something better.
01:09:54.600 Right?
01:09:55.140 Everyone does that
01:09:55.900 in your life.
01:09:56.420 You sacrifice
01:09:56.960 the pleasures of today
01:09:58.500 for the accomplishments
01:09:59.420 of tomorrow.
01:10:00.280 It's sort of the definition
01:10:01.480 of being disciplined.
01:10:03.200 Right?
01:10:03.420 And you might say,
01:10:04.320 well,
01:10:04.480 the better the outcome,
01:10:05.560 the more justified
01:10:06.460 the sacrifice.
01:10:08.840 So,
01:10:09.060 well,
01:10:09.140 that's the problem
01:10:09.640 with the utopian vision
01:10:10.600 of perfection.
01:10:12.280 He was talking
01:10:13.080 about this UK historian,
01:10:16.400 I think his name
01:10:16.860 was Hobsbawm,
01:10:19.920 famous left-wing historian
01:10:22.060 who was asked
01:10:23.220 a few years ago
01:10:24.340 after the collapse
01:10:26.280 of the Soviet Union
01:10:27.220 and the full revelation
01:10:28.580 of the absolute catastrophes
01:10:30.180 of the radical left
01:10:31.480 in the 20th century.
01:10:33.740 He was asked
01:10:34.620 about his commitment
01:10:35.360 to the socialist utopian ideal,
01:10:37.860 the communist utopian ideal,
01:10:39.420 and he was asked
01:10:40.280 something like,
01:10:41.420 well,
01:10:43.540 if we could bring about
01:10:44.360 that utopia,
01:10:45.080 would all those sacrifices
01:10:46.180 have been worthwhile?
01:10:47.020 And his answer was yes.
01:10:48.180 So,
01:10:50.060 well,
01:10:50.200 if the goal
01:10:50.960 is sufficiently elevated,
01:10:52.960 then any sacrifice
01:10:54.400 is worthwhile.
01:10:56.460 And,
01:10:56.720 you know,
01:10:56.820 you can kind of
01:10:57.380 understand that
01:10:58.080 because you're willing
01:10:58.900 to make sacrifices
01:10:59.720 of extreme sort
01:11:00.560 for the health
01:11:01.140 and happiness,
01:11:01.860 continued happiness
01:11:02.580 of your family.
01:11:04.200 So,
01:11:04.700 where does that go wrong?
01:11:08.380 Well,
01:11:09.540 maybe it goes wrong
01:11:10.420 when you put the cart
01:11:11.100 before the horse.
01:11:13.460 See,
01:11:14.500 I thought about Hitler a lot
01:11:15.840 and the way Hitler
01:11:16.660 ended his life,
01:11:17.500 right,
01:11:17.680 it was,
01:11:18.480 Berlin was in flames,
01:11:20.120 Europe was in ruins,
01:11:21.560 the world was torn apart,
01:11:23.440 tens of millions of people
01:11:24.400 had died,
01:11:25.420 and he put a bullet
01:11:26.360 through his head,
01:11:28.040 angry and despondent
01:11:29.480 about the betrayal
01:11:30.460 of his Germanic people
01:11:32.260 because that's how Hitler
01:11:33.340 viewed the end of the war.
01:11:34.460 He didn't think he'd failed.
01:11:35.440 He thought the Germans
01:11:36.320 didn't live up
01:11:37.140 to his damn vision.
01:11:39.300 You know,
01:11:39.560 and when we look
01:11:40.060 at someone like Hitler
01:11:40.860 and his hopes
01:11:41.700 for world domination,
01:11:42.660 we naively assume
01:11:44.260 that he was hoping
01:11:45.920 for world domination.
01:11:48.100 But,
01:11:48.520 you know,
01:11:49.040 he was a bad guy.
01:11:51.620 And so,
01:11:52.240 it's not necessarily
01:11:53.100 the case that
01:11:53.900 he was aiming
01:11:55.160 for what he said
01:11:56.140 he was aiming for,
01:11:57.060 or that he was aiming
01:11:57.980 for what he was selling,
01:11:59.560 as pathological
01:12:00.400 as it might have been.
01:12:01.960 Maybe he was aiming
01:12:02.780 to put a bullet
01:12:03.380 through his head
01:12:04.060 in a bunker
01:12:04.940 in Berlin
01:12:05.840 with the city in flames
01:12:06.940 in Europe and ruins
01:12:07.820 and tens of millions
01:12:08.780 of people dead around him.
01:12:10.640 That seems to be
01:12:11.580 far more accurate
01:12:12.380 to me.
01:12:14.200 And so,
01:12:14.660 we could say
01:12:15.120 the same thing
01:12:15.640 about the utopian
01:12:16.620 radical leftists.
01:12:18.320 It's like,
01:12:19.020 well,
01:12:19.220 hypothetically,
01:12:19.920 they were going
01:12:20.240 to bring in
01:12:20.700 the beatific vision
01:12:21.680 and the dawn
01:12:22.820 of the new utopia.
01:12:24.120 A lot of things
01:12:24.860 had to be sacrificed
01:12:25.920 along the way,
01:12:27.180 and most of those
01:12:28.080 happened to be
01:12:28.660 other people.
01:12:30.280 And so,
01:12:30.640 then we might
01:12:31.020 just reverse that
01:12:31.760 and say,
01:12:32.760 wait a sec,
01:12:33.580 maybe there's
01:12:34.040 a dark part
01:12:34.940 of each of us,
01:12:36.720 including the utopians,
01:12:38.400 who would actually
01:12:39.360 like to sacrifice
01:12:40.400 as many people
01:12:41.320 as possible
01:12:41.960 in the shortest
01:12:42.600 possible order
01:12:43.480 with the most
01:12:44.380 possible brutality.
01:12:46.220 And they're actually
01:12:46.700 looking for a way
01:12:47.720 to make that
01:12:48.680 saleable to the
01:12:49.960 general population
01:12:51.000 or maybe even
01:12:51.860 to themselves.
01:12:52.940 Because who the hell
01:12:53.680 wants to look
01:12:54.320 in the mirror
01:12:54.820 and decide that
01:12:55.800 you're a tyrannical,
01:12:57.360 murderous monster?
01:12:58.820 You want to at least
01:12:59.500 think you're motivated
01:13:00.540 by the highest
01:13:01.280 possible motives.
01:13:02.660 Well,
01:13:02.800 what's the highest
01:13:03.320 possible motive?
01:13:04.140 Well,
01:13:04.820 the beatific vision,
01:13:06.440 the heavenly utopia,
01:13:07.980 the brotherhood of man
01:13:08.860 that justifies
01:13:10.160 every murder
01:13:11.420 and is perhaps
01:13:12.880 created not
01:13:14.060 to bring about
01:13:14.880 that utopia
01:13:16.160 but to justify
01:13:17.080 the damn murders.
01:13:19.520 And that's another
01:13:20.280 thing that we should
01:13:21.000 be very careful about
01:13:22.000 when we lay out
01:13:22.700 our utopian visions.
01:13:24.220 Because we're
01:13:25.060 forgetting that we
01:13:26.240 each have a very
01:13:27.260 dark part
01:13:28.200 and that that
01:13:29.180 dark part has a
01:13:30.220 say in how we
01:13:31.620 construe our
01:13:32.620 fantasies
01:13:33.280 and what motivates
01:13:34.100 us in the world.
01:13:35.540 So that's a
01:13:36.000 secondary danger
01:13:37.800 of the universalist
01:13:39.960 vision of the left.
01:13:41.980 I was thinking
01:13:42.520 about this tonight
01:13:43.280 just before
01:13:43.920 I came out here
01:13:45.420 and I was thinking
01:13:46.600 about this issue
01:13:47.320 of sacrifice
01:13:48.040 and one of the
01:13:48.980 things that I'm
01:13:49.480 trying to do
01:13:49.960 in 12 Rules for Life
01:13:51.040 is return
01:13:52.280 two things.
01:13:53.240 I'm trying to
01:13:54.200 make a case
01:13:54.840 that the Judeo-Christian
01:13:56.580 platform on which
01:13:57.880 our conscious
01:13:59.480 political structure
01:14:00.620 is predicated
01:14:01.420 cannot be
01:14:02.700 eradicated
01:14:03.880 without
01:14:04.300 terrible
01:14:05.020 consequences.
01:14:06.600 Now I know
01:14:07.100 why it's shaky
01:14:07.880 and all of that
01:14:08.580 and that's partly
01:14:09.220 what I was talking
01:14:09.900 to Sam Harris
01:14:10.620 about in great
01:14:11.300 detail.
01:14:12.120 But I don't
01:14:13.340 believe that we
01:14:13.980 can take what
01:14:14.760 we have and
01:14:15.400 tear the story
01:14:16.620 out from
01:14:17.200 underneath it
01:14:17.880 without leaving
01:14:18.680 a void that
01:14:19.420 will be filled
01:14:19.980 by all sorts
01:14:20.640 of pathological
01:14:21.380 things.
01:14:22.300 Now maybe we
01:14:22.940 can't return
01:14:23.680 to the past.
01:14:24.820 We never can
01:14:25.640 really.
01:14:26.400 And so it's
01:14:26.840 a problem that's
01:14:27.540 very difficult
01:14:28.100 to solve.
01:14:29.480 But here I'll
01:14:33.600 tell you something
01:14:34.200 about that story
01:14:35.000 that's really
01:14:35.460 interesting.
01:14:36.860 You know,
01:14:37.340 we already
01:14:38.780 pointed out
01:14:39.360 that you need
01:14:40.880 a vision to
01:14:41.460 kind of move
01:14:41.940 you through
01:14:42.280 life, right?
01:14:43.060 You've got to
01:14:43.660 have a sense
01:14:44.240 that you're
01:14:44.580 moving towards
01:14:45.080 something better
01:14:45.680 and the better
01:14:46.740 that better is
01:14:47.860 the more you
01:14:49.060 might be
01:14:49.380 motivated to
01:14:50.000 pursue it.
01:14:50.500 And so that
01:14:50.780 would have a
01:14:51.260 tendency to
01:14:52.020 transform itself
01:14:52.980 archetypally into
01:14:53.840 something like
01:14:54.420 the kingdom of
01:14:55.480 God or heaven
01:14:56.100 on earth or
01:14:56.780 or the eternal
01:14:57.720 brotherhood of
01:14:58.420 man, some
01:14:59.300 ultimate ideal.
01:15:00.820 And you can't
01:15:01.460 just scrap the
01:15:02.220 ultimate ideal.
01:15:03.360 Not so easily.
01:15:04.780 But then you have
01:15:05.360 the problem of the
01:15:06.020 ultimate ideal
01:15:06.660 demands sacrifice.
01:15:08.680 And then you have
01:15:09.280 the problem, I'm
01:15:10.220 writing the preface
01:15:11.320 to the Gulag
01:15:12.160 Archipelago, the
01:15:12.980 abridged version,
01:15:13.820 the 50th anniversary
01:15:14.600 version, and I
01:15:16.340 have to deliver that
01:15:17.060 in the next couple
01:15:17.700 of days and that's
01:15:18.360 partly why I've
01:15:18.940 been working through
01:15:19.500 the problems that
01:15:20.080 I'm sharing with
01:15:20.640 you tonight.
01:15:21.620 It's like, certainly
01:15:22.800 the Russian
01:15:23.280 revolutionaries were
01:15:24.160 willing to
01:15:24.560 sacrifice everyone,
01:15:25.960 everyone, to
01:15:28.320 their heavenly
01:15:29.640 vision and that
01:15:30.400 started right away
01:15:31.360 as soon as the
01:15:31.820 revolution occurred.
01:15:32.600 It didn't take
01:15:33.020 two years or
01:15:34.160 five years or
01:15:34.800 ten years.
01:15:35.540 It wasn't
01:15:36.000 dependent on the
01:15:36.740 cult of
01:15:37.080 personality.
01:15:38.100 The rounding up
01:15:38.780 and the sacrifices,
01:15:40.020 they were happening
01:15:40.900 right away and on a
01:15:41.980 huge scale.
01:15:43.100 And so it's a
01:15:43.560 tremendous danger.
01:15:45.420 You have to make
01:15:46.240 sacrifices to
01:15:47.760 obtain the
01:15:49.400 vision.
01:15:51.480 Who should you
01:15:54.020 sacrifice?
01:15:56.160 Well, you know, one
01:15:57.280 of the things that's
01:15:57.900 really interesting
01:15:58.540 about the narrative
01:16:00.000 substructure upon
01:16:01.280 which our culture is
01:16:02.220 predicated is that
01:16:03.220 the answer to that
01:16:05.740 question is provided
01:16:07.800 in those stories.
01:16:08.820 It's provided at least
01:16:09.680 in part by the image
01:16:10.700 of the crucifixion,
01:16:11.740 right?
01:16:12.000 Because that's a
01:16:13.000 self-sacrifice.
01:16:15.520 You have to sacrifice
01:16:16.720 something to attain
01:16:17.820 what's necessary.
01:16:18.980 If it isn't going to
01:16:21.160 be you who
01:16:22.260 sacrifices yourself,
01:16:23.920 then it's going to
01:16:24.680 be you who
01:16:25.240 sacrifices someone
01:16:26.320 else.
01:16:27.360 That's what it
01:16:27.980 looks like.
01:16:28.960 The sacrifice might
01:16:29.960 be necessary.
01:16:31.120 Well, and we know
01:16:31.640 sacrifice is necessary,
01:16:33.040 right?
01:16:33.720 There's the archaic
01:16:34.660 idea of sacrifice.
01:16:35.840 You offer something
01:16:36.520 up to God in the
01:16:37.760 hopes that his
01:16:38.680 benevolence will shine
01:16:39.780 on you.
01:16:40.240 And you can think
01:16:40.860 about that as a kind
01:16:41.700 of superstition.
01:16:42.660 And in a sense it
01:16:43.680 is to sacrifice an
01:16:44.900 animal to burn it
01:16:45.840 so that its smoke
01:16:46.640 arises to heaven so
01:16:48.000 God can detect
01:16:48.820 the quality of
01:16:49.480 your sacrifice and
01:16:50.580 determine whether
01:16:51.200 you're living in
01:16:51.820 accordance with a
01:16:53.040 heavenly, what would
01:16:54.640 you call it, with a
01:16:55.640 heavenly ethic.
01:16:57.000 But it's acted out.
01:16:58.040 It's a very sophisticated
01:16:58.820 idea that's acted out
01:17:00.140 first.
01:17:00.700 We all know we have
01:17:01.540 to make sacrifices.
01:17:03.040 When I ask my
01:17:03.800 students, what did
01:17:05.260 your parents sacrifice
01:17:06.520 so you could go to
01:17:07.560 university?
01:17:08.460 Most of them,
01:17:09.760 children of first
01:17:10.860 generation immigrants
01:17:11.740 as they are, can
01:17:13.100 list off a very long
01:17:14.540 list of sacrifices that
01:17:16.140 their parents made.
01:17:17.040 And we know that a
01:17:18.040 hallmark of maturity
01:17:19.040 is to make sacrifice
01:17:20.320 for something that's
01:17:21.580 better in the future.
01:17:23.180 What should you
01:17:24.000 sacrifice?
01:17:25.860 How about not other
01:17:26.900 people?
01:17:28.140 You have to sacrifice
01:17:29.340 something though.
01:17:30.740 Well, if you're not
01:17:31.340 going to sacrifice
01:17:31.940 other people, well,
01:17:33.980 then you offer yourself
01:17:34.800 up as a voluntary
01:17:35.660 sacrifice.
01:17:37.060 Right?
01:17:37.520 And that's part of
01:17:38.060 that underlying story.
01:17:39.400 And that's part of the
01:17:40.260 antidote to the
01:17:41.400 pathology of compassion,
01:17:43.360 let's say, that's part
01:17:46.420 of the ethos of the
01:17:47.360 radical collectivist
01:17:48.460 left.
01:17:49.220 The antidote to that
01:17:50.300 collective version of
01:17:51.600 sacrifice is, as far
01:17:53.080 as I'm concerned, the
01:17:54.480 fundamental notion of
01:17:55.620 individual sacrifice.
01:17:57.140 And what that means is
01:17:58.200 the adoption of
01:17:59.040 individual responsibility.
01:18:01.100 And what's so
01:18:02.200 remarkable about that,
01:18:03.240 I think, is that the
01:18:04.580 story upon which our
01:18:05.840 culture, our functional
01:18:06.820 culture, is predicated
01:18:08.360 is the story that places
01:18:11.440 the fundamental burden
01:18:12.860 for putting the
01:18:14.220 structure of reality
01:18:15.040 right on you as a
01:18:16.720 consequence of the
01:18:17.500 necessity for you to
01:18:18.440 make the proper
01:18:19.000 sacrifices voluntarily.
01:18:21.300 And I think that that
01:18:22.260 story is true.
01:18:24.520 And that we need to
01:18:25.320 understand it.
01:18:26.840 Because we act it out
01:18:28.000 imperfectly, and we
01:18:29.520 deviate from it at our
01:18:30.900 peril, and we need to
01:18:32.220 wake up, and we need to
01:18:33.080 become conscious of
01:18:33.920 exactly what it means.
01:18:36.620 Thank you very much.
01:18:37.660 Thank you.
01:18:37.740 All right, my man.
01:18:45.660 It's good to be back,
01:18:46.680 isn't it?
01:18:47.400 Yeah, it's good.
01:18:48.220 I'm glad you're here,
01:18:49.180 too.
01:18:49.600 The lectures are
01:18:50.460 definitely better with
01:18:51.260 you introducing them.
01:18:53.040 So...
01:18:53.680 Oh, I'm the top
01:18:56.060 lobster.
01:18:56.920 Isn't that nice?
01:19:00.280 All right.
01:19:00.920 There's a ton of good
01:19:01.820 stuff here, so let's
01:19:02.560 just get right to it.
01:19:05.520 Your thoughts on the
01:19:07.400 legalization of
01:19:08.420 marijuana.
01:19:12.760 I think it's worth
01:19:13.840 the experiment, I
01:19:15.660 think.
01:19:16.120 While everyone's
01:19:16.820 already experimenting
01:19:17.660 with it anyways.
01:19:19.580 You know, I think
01:19:20.560 there's a Dutch law,
01:19:23.500 if I remember
01:19:24.460 correctly, that a
01:19:26.520 law has to be scrapped
01:19:27.700 if enough people
01:19:28.420 break it.
01:19:29.900 And I like that law
01:19:31.020 because it's
01:19:32.660 reflective of the
01:19:33.460 will of the people.
01:19:34.380 You know, and let's
01:19:35.960 face it, people have
01:19:37.380 been smoking pot for
01:19:38.440 a long time.
01:19:40.060 Like, really, it's
01:19:40.820 60 years since it's
01:19:42.140 become a widespread
01:19:43.480 recreational endeavor.
01:19:45.720 And contrary to the
01:19:47.240 utterances of most
01:19:48.620 mainstream politicians,
01:19:50.880 the vast majority of
01:19:52.300 people who smoke pot
01:19:53.380 also inhale.
01:19:54.400 So, it's
01:19:58.680 also possible that, you
01:20:01.580 know, it could go off the
01:20:02.980 rails, but it's
01:20:05.440 possible that people will
01:20:06.620 smoke pot instead of
01:20:07.540 drinking, and that would
01:20:09.120 be good because the worst
01:20:10.760 drug is alcohol, by a huge
01:20:13.100 margin.
01:20:14.760 And I'm saying
01:20:17.300 that as someone who
01:20:18.060 actually has a certain
01:20:19.020 fondness for alcohol.
01:20:20.160 So, but alcohol makes
01:20:23.340 people violent.
01:20:24.680 It's the only drug we
01:20:25.560 know of that actually
01:20:26.440 does that.
01:20:27.580 And it's implicated in
01:20:29.360 the vast majority of
01:20:31.900 violent assaults and
01:20:33.220 family assaults and
01:20:34.820 murders.
01:20:37.720 You're much more likely to
01:20:38.620 be the victim of a
01:20:39.540 murder if you're drunk,
01:20:40.720 you're much more likely to
01:20:41.560 be a murderer if you're
01:20:42.740 drunk, et cetera, et
01:20:44.160 cetera.
01:20:44.340 And so, lots of people
01:20:47.180 smoke pot, doesn't really
01:20:49.360 seem to make people
01:20:50.240 aggressive.
01:20:52.480 It's not so good for a
01:20:54.060 minority of people,
01:20:55.160 especially if combined
01:20:56.160 with stimulant use, like
01:20:58.060 there does seem to be
01:20:59.060 some evidence that it
01:20:59.960 tilts some people towards
01:21:01.060 psychosis under some
01:21:02.200 conditions.
01:21:03.000 It's not without its
01:21:03.800 dangers, but neither is
01:21:05.340 driving, which is
01:21:07.080 generally by far the
01:21:08.680 most dangerous thing you
01:21:09.840 do, except to drink with
01:21:11.140 family members.
01:21:11.900 So, and other
01:21:16.320 countries have tried
01:21:17.700 this, you know,
01:21:18.260 Portugal
01:21:19.260 decriminalized all drugs
01:21:20.780 15 years ago, all of
01:21:22.940 them, and Portugal hasn't
01:21:25.140 turned into a
01:21:25.760 catastrophe, and in the
01:21:27.200 U.S. where some of the
01:21:28.520 states where marijuana is
01:21:30.280 being legalized, there's
01:21:31.100 been quite a marked
01:21:31.820 decline in opiate use,
01:21:33.420 and in general
01:21:34.280 criminality, which is
01:21:35.320 quite interesting.
01:21:37.120 So, who knows, maybe
01:21:39.000 we'll get it right.
01:21:39.940 And so, it seems to me
01:21:41.020 that it's probably
01:21:41.760 worth the experiment.
01:21:50.960 I have a feeling I know
01:21:52.580 who submitted this
01:21:53.480 question.
01:21:54.800 If you had Wonder
01:21:56.240 Woman's lasso of truth,
01:21:58.400 and could use it on one
01:22:01.380 person, who would that
01:22:03.560 be?
01:22:04.180 Vladimir Putin.
01:22:05.040 You know, what I'd really
01:22:11.960 like to know, what I'd
01:22:13.400 really like to know about
01:22:14.340 Putin, I think, above all
01:22:15.720 else, it's kind of an
01:22:16.600 obscure thing in some
01:22:17.640 sense.
01:23:18.540 When
01:22:20.140 Solzhenitsyn wrote the
01:22:21.500 Gulag Archipelago, what he
01:22:22.900 was hoping for was that the
01:22:24.260 Russians would return to
01:22:25.520 their Orthodox Christian
01:22:26.780 tradition, and develop their
01:22:28.800 political system organically
01:22:30.180 as a consequence.
01:22:32.560 And what's happened since
01:22:33.640 the fall of the Soviet Union
01:22:35.900 is that Orthodox
01:22:38.200 Christianity has been
01:22:39.440 undergoing a revival in
01:22:40.760 Russia, for better or worse,
01:22:43.560 but certainly a revival.
01:22:45.040 So, that part of Solzhenitsyn's
01:22:46.700 wish has come true.
01:22:49.540 Now, Putin seems to support
01:22:51.380 that revival, and has made
01:22:53.800 gestures in the direction of
01:22:56.060 affiliation with Orthodox
01:22:58.020 Christianity, and I'd like to
01:22:59.960 find out if he does believe
01:23:02.660 that there is a power that
01:23:05.000 he's subordinate to, because
01:23:07.140 that would be a real relief.
01:23:09.800 You know, so, well, one of the
01:23:10.960 things that I learned about,
01:23:13.120 from studying religious
01:23:14.260 structures, from a scientific
01:23:17.460 perspective, let's say,
01:23:19.400 psychological perspective, was
01:23:20.900 that even thousands of years
01:23:25.380 ago, among the Mesopotamians,
01:23:28.120 for example, there was an idea
01:23:30.520 that the ruler, the emperor,
01:23:32.160 who was pretty much an absolute
01:23:33.400 despot, in some sense, was
01:23:35.760 nonetheless responsible, either
01:23:38.960 directly to a higher power, or
01:23:41.900 was a manifestation of that
01:23:43.940 higher power on earth, and had
01:23:45.640 certain ethical obligations as a
01:23:48.160 consequence.
01:23:49.380 And so, see, one of the things
01:23:50.820 that our religious imagination has
01:23:53.280 done over the course of
01:23:55.000 millennia, is abstract out the
01:23:57.480 idea of sovereignty as something
01:23:59.940 in and of itself, and then to
01:24:01.280 make everyone subordinate to that
01:24:03.380 abstract notion, including
01:24:05.280 absolute rulers.
01:24:07.260 And that's a really big deal,
01:24:08.620 because you don't want your ruler
01:24:10.520 to be the highest thing there is,
01:24:12.880 and you certainly don't want him or
01:24:14.300 her to think that way.
01:24:16.740 You want them to be secondary to
01:24:18.920 some other conception of absolute
01:24:21.340 ethical sovereignty.
01:24:22.200 And I'd really like to find out if
01:24:24.460 Putin's rapprochement with
01:24:27.500 Orthodox Christianity is a facade
01:24:30.280 and part of a power game that he's
01:24:32.020 playing.
01:24:32.800 You know, he's using the church
01:24:34.080 cynically as a means to bolster his
01:24:36.200 popularity, or whether there is at
01:24:38.820 least a tiny little corner of him
01:24:41.560 that thinks that he might be playing
01:24:43.560 a part in some broader ethical game.
01:24:47.520 There's been a lot of talk about sex
01:24:58.440 education curriculum since Ford has
01:25:00.940 become the premier.
01:25:03.000 Where do you stand on the
01:25:05.020 curriculum?
01:25:07.580 Do you know specifically?
01:25:08.520 Yes, yes.
01:25:11.200 I think the faster that Ford undoes
01:25:13.820 the radical maneuvers of win, the
01:25:16.360 better.
01:25:17.740 So.
01:25:26.780 Now, that doesn't make me naively
01:25:30.880 optimistic about Ford and the
01:25:32.860 conservatives.
01:25:33.400 And so, let me tell you what I'm
01:25:35.600 hoping for, and what I think we
01:25:36.960 should all hope for.
01:25:39.020 I don't think the conservatives can
01:25:40.740 win, because I think that there has
01:25:43.960 to be a balance between left and
01:25:46.080 right dialogically in the manner that
01:25:48.960 I already described.
01:25:51.140 And so, I hope that that dialogue can
01:25:54.520 reestablish itself properly.
01:25:56.940 And what I'm hoping for from Ford and
01:25:58.980 the conservatives is ordinary
01:26:01.260 incompetence.
01:26:04.080 Right?
01:26:04.860 That's what we've had in Canada, at
01:26:07.000 our government level forever.
01:26:08.620 That's why the country works.
01:26:10.260 Well, see,
01:26:12.520 one of the things that's so
01:26:13.580 remarkable about the founders of the
01:26:15.460 American system was that that's what
01:26:18.280 they were aiming for.
01:26:19.820 Right?
01:26:20.160 They weren't utopians.
01:26:22.080 For the reasons that we already
01:26:23.360 outlined, they thought, God, we're all
01:26:26.800 stupid and incompetent, all of us.
01:26:28.760 And we're kind of malevolent, too.
01:26:30.160 And we're stuck with it, man.
01:26:32.560 Stupidity, incompetence, malevolence,
01:26:34.940 willful blindness.
01:26:36.140 Yeah, we got lots of that, too.
01:26:38.140 Okay, so, and that isn't going to
01:26:39.480 change.
01:26:40.060 And sometimes it will even get worse.
01:26:42.440 So, how can we produce a system that
01:26:44.640 morons who are malevolent can't screw
01:26:47.320 up too badly?
01:26:49.900 He's never done this whole bit when
01:26:51.540 we're in the States, by the way.
01:26:53.520 Just wanted to say that.
01:26:54.320 And so, you know, that the English
01:26:56.460 system, with its common law, and the
01:26:59.260 Canadian system, embedded in the same
01:27:01.840 philosophical background, is sort of
01:27:04.880 predicated on the same principles, is
01:27:06.540 that we want a government that muddles
01:27:09.760 through reasonably well.
01:27:12.200 And that would mean avoids the pitfalls
01:27:14.700 of total ideological commitment.
01:27:16.380 And so I'm hoping that Ford is just as
01:27:20.920 bad as the average politician, and no
01:27:24.100 worse.
01:27:24.800 And that would be a great relief for the
01:27:26.660 next four years.
01:27:28.580 Thank you.
01:27:36.100 What are your thoughts on the New York
01:27:38.280 Times, not including you, on their
01:27:40.220 bestsellers list?
01:27:41.400 Well, mostly my thoughts are of, I think
01:27:44.700 it's extraordinarily comical.
01:27:47.440 And here's why.
01:27:49.320 So, I wrote this book called Maps of
01:27:51.780 Meaning in 1999.
01:27:54.960 And some of you might be familiar with
01:27:56.740 it, because I lectured about it a lot.
01:27:59.040 And so those lectures are online.
01:28:01.100 And if you like Twelve Rules for Life, if
01:28:03.460 you found it useful, let's say, not
01:28:07.200 whether or not you liked it, but if you
01:28:08.580 found it useful, and you're interested
01:28:11.380 in the abstract philosophical concepts
01:28:15.500 that are part and parcel of it, then
01:28:18.140 you might be interested in Maps of
01:28:20.360 Meaning, because that book goes into
01:28:22.940 all of that way more deeply.
01:28:25.240 And so it's a very, very difficult
01:28:26.540 book.
01:28:27.920 But I recorded the audio version and
01:28:32.000 released that on June 12th.
01:28:34.400 And I think the audio version is a lot
01:28:36.420 more accessible, because I could read
01:28:39.100 the sentences, many of which are long
01:28:40.860 and complicated, I could provide the
01:28:42.820 proper intonation and more of a hint
01:28:45.220 towards the meaning.
01:28:46.300 And so I'm hoping that it'll be more
01:28:47.920 accessible.
01:28:48.740 And in any case, it hit the New York
01:28:52.220 Times bestseller list.
01:28:55.080 So, despite the fact that I'm not a New
01:28:58.020 York Times bestseller because of Twelve
01:28:59.840 Rules for Life, because they gerrymandered
01:29:02.020 the damn selection criteria post-hoc for
01:29:04.340 reasons of their own, I'm a New York
01:29:06.560 Times bestseller because of the audio book
01:29:08.620 of Maps of Meaning.
01:29:09.660 So I think that's spectacularly comical.
01:29:11.880 And that's sort of what I think about it.
01:29:20.360 Did you always talk with your hands so
01:29:23.060 much?
01:29:26.560 Always is a very long time.
01:29:29.500 Yeah, I think so.
01:29:30.640 Well, as far as I know, I haven't just
01:29:34.200 learned it recently.
01:29:36.820 So, and it's helpful for some reason.
01:29:39.220 I don't know why.
01:29:41.000 Shape things out, you know.
01:29:42.400 Drawing little triangles in particular.
01:29:46.300 Your voice should possess your body.
01:29:49.740 You know?
01:29:50.780 It's part of being embodied.
01:29:52.980 So, it's part of having your words
01:29:57.200 make themselves manifest in your body.
01:30:00.960 And you do want that to happen.
01:30:02.640 You should act out what you say.
01:30:05.400 So, it's part of that.
01:30:08.580 Does the amount of time teenagers spend
01:30:11.400 on various forms of social media,
01:30:13.400 Facebook, Twitter, Instagram, etc.,
01:30:15.540 at school negatively impact their ability
01:30:18.780 to learn?
01:30:19.500 No, they should probably be on social media
01:30:22.100 instead of being at school.
01:30:27.040 I'm not much of a fan of the current
01:30:29.660 education system.
01:30:30.940 So, I think it's mostly something
01:30:32.580 to escape from.
01:30:36.840 Look, having said that,
01:30:40.180 you know,
01:30:41.100 you don't want to be using something
01:30:45.020 when it's interfering with something else
01:30:47.140 that you know you should do.
01:30:48.220 And so, if a kid's going through school,
01:30:51.580 there's obviously a certain amount
01:30:53.020 of participation in that process
01:30:56.060 that's necessary.
01:31:00.240 Partly so that the kid has a sense
01:31:03.020 of accomplishment,
01:31:04.800 and partly so that they prepare themselves
01:31:06.540 to move forward in the world.
01:31:08.880 And so, if they're using social media
01:31:10.400 impulsively,
01:31:12.000 in an undisciplined manner,
01:31:14.140 and interfering with their own progress forward,
01:31:16.400 then it's a bad idea.
01:31:18.320 Just like it is whenever you do anything
01:31:20.220 that's impulsive,
01:31:21.200 that interferes with your movement forward,
01:31:23.280 and that's indicative of a lack of discipline.
01:31:26.560 So,
01:31:27.060 everything in balance
01:31:29.460 and in the right place
01:31:30.760 is the proper attitude.
01:31:32.620 And if you're a high school student
01:31:34.260 and you're addicted to Twitter,
01:31:36.440 which is a very easy thing
01:31:37.660 to become addicted to,
01:31:39.380 despite its horror,
01:31:40.500 then it would be better
01:31:44.440 to get your priorities straight
01:31:46.180 and discipline yourself.
01:31:48.380 So.
01:31:51.560 What do you mean
01:31:52.460 when you say that
01:31:53.720 atheists aren't really atheists?
01:31:56.500 Well, I can give you an example.
01:32:01.860 So,
01:32:03.720 this is something I've argued
01:32:05.440 with Sam Harris about constantly,
01:32:06.940 because he thinks
01:32:07.760 that he's generated
01:32:08.620 this entire ethic
01:32:09.600 sort of out of his rational mind.
01:32:13.560 So, Harris basically believes
01:32:14.960 that
01:32:15.300 we should act in a way
01:32:18.280 that minimizes suffering.
01:32:19.960 And he tells a little horror story
01:32:25.340 about someone suffering terribly,
01:32:27.480 a fictional person in his account,
01:32:29.880 born in the wrong country,
01:32:31.700 tortured throughout her life
01:32:32.880 by tyrants and rapists
01:32:35.560 and predators
01:32:37.440 and doing nothing in life but suffering,
01:32:41.000 and pointing out
01:32:41.800 that we should try to help people
01:32:43.920 arrange their lives
01:32:44.680 so that that isn't their existence.
01:32:48.440 He thinks we should move away
01:32:49.820 from suffering
01:32:50.560 towards whatever
01:32:51.600 its opposite is,
01:32:52.680 which he conceptualizes
01:32:53.820 somewhat thinly,
01:32:55.300 in my opinion,
01:32:56.360 as well-being.
01:32:58.220 Well, for me,
01:32:59.100 he's just laid out
01:33:00.180 the landscape
01:33:00.820 of heaven and hell.
01:33:05.140 And hell is the place
01:33:06.080 of maximal suffering,
01:33:07.380 and heaven is whatever
01:33:08.680 is the opposite of that.
01:33:10.360 Now, he would object,
01:33:11.880 yeah, but I'm not
01:33:12.620 projecting it into the afterlife,
01:33:15.340 but that's a different issue,
01:33:16.720 because those concepts
01:33:17.680 are much more complicated
01:33:18.740 than concepts
01:33:20.200 that merely encapsulate
01:33:21.440 the afterlife.
01:33:22.940 He believes that
01:33:23.760 it's a fact
01:33:25.740 that everyone can see
01:33:27.100 that that movement
01:33:27.880 is the right way to behave,
01:33:29.480 and that your primary
01:33:30.640 moral obligation
01:33:31.480 is to behave in that manner.
01:33:33.680 But I would say
01:33:34.820 what he's saying
01:33:36.100 is that
01:33:36.600 you should embody
01:33:38.220 the ethic
01:33:39.340 of moving away
01:33:40.660 from hell
01:33:41.140 towards heaven,
01:33:41.840 and that's your primary
01:33:42.900 ethical obligation.
01:33:45.100 And that's already
01:33:46.740 embodied,
01:33:47.560 for example,
01:33:48.100 in the Christian ethos,
01:33:49.180 which says that
01:33:49.940 explicitly.
01:33:51.940 And so that idea
01:33:52.780 has been there
01:33:53.320 for 2,000 years.
01:33:54.540 It's not like Sam
01:33:55.240 came up with it.
01:33:56.800 He has said,
01:33:58.880 well, here's how
01:33:59.760 I came to that conclusion,
01:34:02.100 but there's no reason
01:34:03.080 to assume that he knows
01:34:04.200 how he came to that conclusion,
01:34:05.740 especially given that
01:34:06.580 it's the same damn conclusion.
01:34:08.860 And so,
01:34:09.440 and when pushed,
01:34:10.200 he says,
01:34:10.620 well,
01:34:10.800 the ethic should be
01:34:12.640 based on love.
01:34:14.820 Fair enough.
01:34:15.860 And that you should
01:34:16.460 tell the truth.
01:34:17.600 Truth serving love.
01:34:19.300 Well,
01:34:20.080 that's a Christian concept.
01:34:21.860 Now,
01:34:22.020 it's not just
01:34:22.780 a Christian concept,
01:34:24.200 but it's certainly
01:34:25.240 a Christian concept.
01:34:26.320 It might even be
01:34:26.880 the highest Christian concept.
01:34:28.760 And so,
01:34:29.720 you know,
01:34:30.100 if it,
01:34:30.580 what do they say?
01:34:31.460 If it walks like a duck
01:34:33.060 and quacks like a duck,
01:34:35.020 then it's probably a duck.
01:34:38.220 And so,
01:34:38.780 and you know,
01:34:39.280 I've come to this conclusion
01:34:40.360 partly because of my reading
01:34:41.720 of Carl Jung,
01:34:42.560 and Jung pointed out,
01:34:44.040 and I think he's right,
01:34:45.140 even from a scientific perspective,
01:34:47.560 that most of our ethic
01:34:49.380 is unconscious
01:34:50.980 and built into us,
01:34:53.040 both biologically
01:34:53.840 and then socioculturally.
01:34:55.420 And that,
01:34:55.980 and Nietzsche himself said
01:34:57.040 that every philosophy
01:34:58.600 is an unconscious unfolding
01:35:01.140 of the implicit axioms
01:35:04.240 of the person.
01:35:05.360 It's like,
01:35:05.640 you already have the axioms.
01:35:07.660 What are you doing
01:35:08.280 when you're thinking?
01:35:09.520 You're thinking
01:35:10.280 through an articulation
01:35:11.360 of what you already are.
01:35:13.120 Now,
01:35:13.340 Sam won't accept that
01:35:14.700 because he's bound
01:35:15.800 and determined
01:35:16.840 to indicate
01:35:18.100 that you can come up
01:35:19.180 with a viable ethical solution
01:35:20.800 through rationality alone.
01:35:22.940 But the other problem
01:35:23.700 with that is,
01:35:24.840 well,
01:35:25.100 what exactly,
01:35:26.740 precisely,
01:35:27.560 do you mean by rationality,
01:35:29.000 which is kind of
01:35:29.520 a 17th century concept?
01:35:31.320 What do you mean,
01:35:32.240 how does that instantiate
01:35:33.280 itself neurobiologically?
01:35:34.600 What's the underlying structure?
01:35:35.920 What's the mechanism
01:35:37.220 by which you use,
01:35:38.580 you apply rationality
01:35:39.560 to the world
01:35:40.080 and derive an ethic?
01:35:41.540 Well,
01:35:42.180 all of these things are,
01:35:44.260 I don't believe
01:35:45.220 that the atheists
01:35:47.160 have an answer
01:35:48.240 to those questions.
01:35:49.720 And that's because
01:35:51.100 they're stuck
01:35:52.320 in time
01:35:53.700 300 years ago.
01:35:56.500 So.
01:35:56.720 Do you think
01:35:58.420 you guys moved
01:35:59.340 each other
01:36:00.020 in any direction
01:36:01.000 throughout the talks?
01:36:02.200 Oh, definitely.
01:36:03.240 Yeah, yeah.
01:36:04.240 I mean,
01:36:04.600 I certainly,
01:36:05.180 I certainly,
01:36:06.480 you know,
01:36:07.760 I certainly have
01:36:09.140 no doubt
01:36:09.620 about his
01:36:10.300 good will
01:36:12.120 or at least
01:36:13.180 I have no doubt
01:36:14.060 that he has
01:36:14.540 as much good will
01:36:15.240 as I do.
01:36:16.000 I have doubts
01:36:16.640 about both
01:36:17.240 of our good will
01:36:17.960 because I have doubts
01:36:18.740 about everyone's
01:36:19.520 good will.
01:36:20.480 You know,
01:36:20.840 and you can say
01:36:21.520 that you're full
01:36:22.060 of good will forever
01:36:23.020 but,
01:36:24.500 you know,
01:36:26.020 you're not.
01:36:28.280 So,
01:36:28.880 so yeah,
01:36:30.020 I think the discussions
01:36:30.840 were unbelievably productive
01:36:32.120 and I'm hoping
01:36:33.120 they'll be released
01:36:34.040 in August.
01:36:35.300 That's the plan
01:36:36.100 and after the audio
01:36:38.300 is edited
01:36:38.800 and they're all
01:36:39.300 put together properly,
01:36:40.300 that's the delay.
01:36:41.720 We're also trying
01:36:42.300 to figure out
01:36:42.760 how best to release them
01:36:44.040 but that's a secondary issue.
01:36:45.580 I think the discussions
01:36:46.480 were unbelievably productive
01:36:47.720 but everybody
01:36:48.920 will have a chance
01:36:49.480 to figure that out
01:36:50.080 for themselves
01:36:50.520 if they want.
01:36:51.920 So,
01:36:52.100 and that'll be interesting
01:36:52.820 too,
01:36:53.160 it'll be really fascinating
01:36:54.360 as far as I'm concerned
01:36:55.280 to see what the public judgment
01:36:57.300 is on each talk
01:36:58.840 and then on the,
01:37:00.460 because the talks
01:37:01.020 did develop
01:37:01.520 across all four events.
01:37:03.720 So,
01:37:05.120 but yeah,
01:37:05.700 I think the,
01:37:07.320 and Sam himself,
01:37:08.360 you know,
01:37:08.640 I mean,
01:37:10.200 he's got his point,
01:37:11.540 you know,
01:37:11.880 I mean,
01:37:12.320 certainly the problem
01:37:13.840 with religious thinking
01:37:15.180 is that it can degenerate
01:37:16.360 into fundamentalism
01:37:17.340 and that's a big problem.
01:37:18.580 We also discussed that
01:37:19.380 with Douglas Murray
01:37:20.100 who's particularly concerned
01:37:21.360 about Islamic fundamentalism
01:37:22.820 and the threat it poses,
01:37:24.600 or hypothetically poses,
01:37:25.420 to the West
01:37:26.060 and to itself,
01:37:27.840 for that matter.
01:37:29.700 And certainly
01:37:30.460 it is the case
01:37:31.160 that the problem
01:37:32.620 with religious presupposition
01:37:34.740 is that it can become
01:37:35.860 fundamentalist
01:37:36.740 and degenerate.
01:37:37.780 That's a big problem.
01:37:38.880 Might even be a fatal problem,
01:37:40.300 you know.
01:37:40.560 But the problem
01:37:42.340 with the atheist perspective
01:37:43.600 is that it provides
01:37:45.020 very little antidote
01:37:45.980 to nihilism.
01:37:48.520 And that's,
01:37:49.120 and Sam himself
01:37:50.020 admits that.
01:37:50.720 It's like,
01:37:51.080 you know,
01:37:51.380 atheism is fundamentally
01:37:52.420 a doctrine of negation.
01:37:53.920 There is no God.
01:37:55.560 Okay?
01:37:56.220 What else isn't there then?
01:37:59.080 Well,
01:37:59.420 is there any meaning?
01:38:01.180 Well,
01:38:01.680 meaning is what you create.
01:38:03.020 That would be
01:38:03.380 Nietzsche's answer.
01:38:04.380 No,
01:38:04.640 it's not.
01:38:05.640 That's wrong.
01:38:06.500 You can't create
01:38:07.200 your own meaning.
01:38:08.280 You have an intrinsic nature.
01:38:10.120 You can't,
01:38:10.720 you won't do
01:38:11.480 what you say,
01:38:12.840 you won't do
01:38:13.540 what you order yourself
01:38:14.760 to do.
01:38:15.560 And you won't be interested
01:38:16.880 in what you command
01:38:18.080 yourself to be interested in.
01:38:19.640 You have a nature
01:38:20.260 that you have to take
01:38:21.000 into consideration.
01:38:22.440 And the problem
01:38:23.020 with the atheist doctrine
01:38:24.060 is that it doesn't leave people
01:38:24.900 with anything.
01:38:26.560 You know,
01:38:26.880 we should strive
01:38:27.620 to maximize well-being.
01:38:30.400 It's like,
01:38:31.240 well,
01:38:31.540 go write a symphony
01:38:32.360 about that.
01:38:34.040 You know,
01:38:34.300 erect a cathedral
01:38:36.460 to that.
01:38:37.600 It's weak
01:38:38.760 from a motivational
01:38:40.620 perspective.
01:38:41.340 It doesn't have
01:38:41.760 any grandeur.
01:38:42.560 It doesn't have
01:38:42.920 any beauty.
01:38:43.600 It doesn't have
01:38:43.960 any motive force.
01:38:45.860 And maybe it's true.
01:38:48.780 Maybe it is true.
01:38:50.100 You know,
01:38:50.360 it's not like being
01:38:51.080 an atheist materialist
01:38:52.200 means you're stupid.
01:38:53.760 The atheist materialists
01:38:54.800 have gone a long ways
01:38:55.780 and all of our
01:38:56.520 remarkable technology
01:38:57.780 is at least in part
01:38:58.800 a consequence
01:38:59.440 of that atheist
01:39:00.380 materialistic critique
01:39:01.580 of the superstitious
01:39:03.580 elements of religion
01:39:04.280 and of religious dogma.
01:39:05.220 It's a big deal.
01:39:06.500 You know,
01:39:07.200 but
01:39:07.580 Sam's very concerned
01:39:10.760 about the problem
01:39:11.420 of fundamentalism.
01:39:12.520 He's also
01:39:13.360 concerned about
01:39:14.660 the problem of nihilism
01:39:15.660 but not nearly
01:39:16.600 as concerned.
01:39:17.560 And I think that's
01:39:18.160 where we really differ.
01:39:19.820 You know,
01:39:20.460 we're both concerned
01:39:21.740 about
01:39:22.080 the swings in meaning
01:39:24.200 to absolute meaning
01:39:25.820 with no possibility
01:39:27.140 of deviation,
01:39:28.460 fundamentalism,
01:39:29.360 totalitarianism,
01:39:30.180 and then no meaning
01:39:31.340 and the collapse
01:39:32.020 into nihilism.
01:39:32.880 That end of the spectrum
01:39:34.260 doesn't disturb him
01:39:35.200 as much.
01:39:35.860 It really disturbs me.
01:39:38.240 And so,
01:39:38.700 and so that's partly
01:39:40.020 I think what's oriented
01:39:41.020 us towards
01:39:41.540 different solutions.
01:39:43.220 So,
01:39:44.200 but you know,
01:39:44.860 we had 10 hours
01:39:45.680 of discussion
01:39:46.280 and we probably took it
01:39:47.660 as far as we could.
01:39:49.240 And so,
01:39:50.560 well,
01:39:51.880 we'll see
01:39:52.380 what the consequences,
01:39:53.760 if any,
01:39:54.500 of that are.
01:39:55.480 Well,
01:39:57.920 speaking of meaning,
01:39:59.700 when did you decide
01:40:00.820 to become
01:40:01.440 a male fashion icon?
01:40:09.840 Well,
01:40:10.380 it's a funny thing.
01:40:11.660 You know,
01:40:11.920 one of the rules
01:40:13.160 that I did not write about,
01:40:15.300 which I think might be
01:40:16.040 in my next book,
01:40:16.940 is dress like the person
01:40:19.140 you want to become.
01:40:21.240 So,
01:40:21.800 which is
01:40:22.240 play the part
01:40:23.460 before you're the part.
01:40:24.680 It's something like that.
01:40:25.540 It's act out your ideal.
01:40:26.980 That's another way
01:40:27.560 of thinking about it.
01:40:28.920 And so,
01:40:29.380 I've thought about that.
01:40:30.340 I've never been
01:40:31.320 much into
01:40:32.660 fashion.
01:40:34.860 Although,
01:40:35.360 I bought some decent suits
01:40:36.360 a few years ago
01:40:37.280 when I was on,
01:40:38.800 was working for TV Ontario.
01:40:40.700 And I bought some decent suits
01:40:42.060 for that.
01:40:42.680 And I've always lectured
01:40:43.600 in a suit.
01:40:44.780 And my father,
01:40:45.640 who was the school teacher,
01:40:46.580 always lectured in a suit.
01:40:48.100 And so,
01:40:48.440 I kept that going
01:40:49.360 because,
01:40:49.940 well,
01:40:50.100 partly when I was a young professor
01:40:51.440 and looked very young,
01:40:52.680 it was part of
01:40:53.740 a way that I could
01:40:55.040 put a boundary
01:40:56.660 of authority,
01:40:57.440 let's say,
01:40:57.880 between me
01:40:59.000 and my students.
01:41:00.220 But I also thought
01:41:00.980 that it was appropriate
01:41:02.440 because there is a hierarchy
01:41:04.300 at least
01:41:05.120 of competence,
01:41:06.320 at least in principle
01:41:07.280 in universities,
01:41:08.060 although we're doing
01:41:08.580 everything we can
01:41:09.260 to erase that
01:41:09.900 as rapidly as possible,
01:41:10.920 by making everyone
01:41:12.080 equally incompetent,
01:41:13.620 which is not
01:41:14.080 a very good solution.
01:41:15.160 And then when I had
01:41:18.360 the opportunity
01:41:18.980 to go in these lectures
01:41:21.640 and speak to all
01:41:22.700 these people,
01:41:23.500 I thought,
01:41:23.980 well,
01:41:25.440 I'm going
01:41:26.140 all in.
01:41:27.860 You know,
01:41:28.080 well,
01:41:28.320 seriously,
01:41:28.940 like,
01:41:29.140 why wouldn't you do that?
01:41:30.280 I mean,
01:41:30.520 this is a ridiculous
01:41:31.200 opportunity.
01:41:32.100 It's an absolutely
01:41:32.760 insane opportunity.
01:41:34.040 It's like,
01:41:34.700 all these people,
01:41:35.840 all you people,
01:41:37.020 are coming to
01:41:37.680 these events.
01:41:38.600 It's like,
01:41:38.940 I'm going to do
01:41:39.300 everything I possibly can
01:41:40.580 to make these events
01:41:41.320 go as well as they
01:41:42.260 possibly could.
01:41:43.080 And so everything
01:41:44.100 I could think of
01:41:44.820 that I could put in place,
01:41:45.920 I tried to put in place,
01:41:46.960 and that also meant
01:41:48.040 going to a decent tailor
01:41:49.640 and getting a good suit
01:41:51.160 and thinking,
01:41:52.620 well,
01:41:52.780 that's just,
01:41:53.500 you know,
01:41:53.740 maybe that's,
01:41:54.340 it's not of crucial importance,
01:41:56.820 you know,
01:41:57.220 I could probably come out
01:41:58.140 here in jeans
01:41:58.760 and all of that.
01:42:00.280 And,
01:42:00.700 but,
01:42:02.040 you know,
01:42:04.000 why not go
01:42:05.500 the extra 3%
01:42:07.180 or the extra 4%
01:42:08.520 or whatever,
01:42:09.180 because I want to do
01:42:10.040 everything I possibly can
01:42:11.480 to make this work right.
01:42:13.960 So.
01:42:15.520 Applause
01:42:17.480 All right,
01:42:21.200 I get it,
01:42:21.660 I got to invest
01:42:22.340 in a suit.
01:42:23.040 All right.
01:42:24.480 All right.
01:42:25.320 And it's been fun,
01:42:26.500 actually.
01:42:27.600 So,
01:42:28.080 and lots of people
01:42:28.680 now come to these shows
01:42:29.660 in suits,
01:42:30.340 and some people
01:42:30.900 in three-piece suits,
01:42:31.900 so that's kind of cool
01:42:33.100 to see people
01:42:34.280 dress like grown-ups.
01:42:36.080 Laughter
01:42:36.560 And you told me
01:42:39.040 right before you said it,
01:42:39.900 you designed these shoes,
01:42:41.460 didn't you?
01:42:41.660 Well,
01:42:41.880 sort of.
01:42:42.720 I mean,
01:42:43.260 I bought these
01:42:44.220 from an online shoe company
01:42:45.840 called Undandy,
01:42:48.220 if you care.
01:42:49.380 And they had a lot
01:42:51.120 of mix-and-match styles,
01:42:52.520 and so they're handmade
01:42:53.740 in Portugal,
01:42:54.560 apparently.
01:42:54.920 And so they looked
01:42:57.200 kind of 1920s,
01:42:59.100 I guess.
01:42:59.940 Sort of like a hybrid
01:43:00.860 between 1920s spat
01:43:02.480 and a cowboy boot.
01:43:03.920 And so,
01:43:05.140 I thought,
01:43:06.800 what the hell?
01:43:07.660 And I'm actually
01:43:08.260 kind of happy with them,
01:43:09.200 and my wife told me
01:43:10.160 they glitter quite nicely
01:43:11.340 on stage.
01:43:12.080 Laughter
01:43:12.460 Applause
01:43:14.080 Do you ever fear
01:43:18.960 that you're being used
01:43:20.560 by the conservative
01:43:21.800 political agenda
01:43:23.160 as a political puppet?
01:43:26.460 No.
01:43:27.840 Okay,
01:43:28.580 moving on.
01:43:29.280 Yeah.
01:43:30.400 Well,
01:43:31.100 by who?
01:43:32.820 I mean,
01:43:33.300 you know,
01:43:33.640 there's people on,
01:43:34.940 who are these people?
01:43:36.480 There are people online
01:43:37.700 who clip my videos
01:43:39.280 and use them
01:43:40.300 to promote
01:43:41.640 their particular viewpoints,
01:43:43.060 but there's a lot of people
01:43:45.100 who are doing that,
01:43:46.940 like,
01:43:47.700 for all sorts of purposes.
01:43:49.100 I can't even begin
01:43:50.360 to keep up with it.
01:43:51.340 I think I mentioned already
01:43:52.840 that it's something
01:43:54.620 in the neighborhood
01:43:55.320 of 20,000 new clip videos
01:43:57.920 coming out a month,
01:43:59.300 and so some of those,
01:44:00.680 no doubt,
01:44:01.140 are being used
01:44:03.280 for purposes
01:44:03.880 of which I wouldn't approve,
01:44:05.820 but that's part
01:44:06.860 of the public debate.
01:44:08.500 And so,
01:44:08.780 I'm not,
01:44:10.780 apart from that,
01:44:13.540 I'm not concerned
01:44:14.980 about it.
01:44:16.000 The people
01:44:16.480 that I'm in touch
01:44:17.680 with most,
01:44:19.020 you being one of them,
01:44:20.880 for example,
01:44:21.680 I mean,
01:44:22.320 there's no conspiracy
01:44:24.420 there,
01:44:26.020 except maybe
01:44:28.340 what's an emergent
01:44:29.300 conspiracy,
01:44:30.080 because we've been
01:44:30.720 talking,
01:44:32.060 but I'm concerned,
01:44:34.840 mostly what I'm
01:44:35.780 concerned about
01:44:36.340 is that I'll do
01:44:36.980 something stupid.
01:44:37.800 I'm really concerned
01:44:39.380 about that.
01:44:40.640 I'm less concerned
01:44:41.600 that other people
01:44:42.220 will do stupid things,
01:44:43.320 because they will,
01:44:44.420 and I can't do
01:44:45.020 anything about it,
01:44:45.740 and I'm not trying
01:44:46.840 to stop it
01:44:48.480 or facilitate it.
01:44:49.600 And I think
01:44:50.360 it's gone fine
01:44:51.300 so far,
01:44:52.500 from what I've been
01:44:53.080 able to tell,
01:44:53.760 you know,
01:44:53.960 what I see,
01:44:55.000 I really like coming
01:44:56.000 to do these lectures,
01:44:57.100 and the reason
01:44:57.620 for that is because
01:44:58.460 my experience,
01:44:59.660 and your experience
01:45:00.340 too,
01:45:00.800 you know,
01:45:01.080 has been that
01:45:01.500 these are unbelievably
01:45:03.080 positive events.
01:45:05.280 You know,
01:45:05.980 I know,
01:45:07.320 because I've talked
01:45:08.240 to at least
01:45:09.460 seven or eight
01:45:10.420 thousand people
01:45:11.160 now at these events,
01:45:13.300 you know,
01:45:13.640 and the story
01:45:15.060 is fairly consistent.
01:45:19.180 Most of the people
01:45:20.160 who are coming
01:45:20.720 to these lectures,
01:45:21.800 or buying my book,
01:45:23.120 or listening
01:45:24.000 to the lectures,
01:45:24.660 or watching Ruben,
01:45:25.760 or Rogan,
01:45:26.380 for that matter,
01:45:27.360 are people
01:45:27.800 who are trying
01:45:28.400 to learn things
01:45:30.880 they don't know
01:45:31.520 and make their lives
01:45:32.260 better,
01:45:33.720 but not better
01:45:34.520 in a sort of
01:45:35.120 impulsive,
01:45:35.920 hedonistic way,
01:45:37.080 even though sometimes
01:45:37.980 that's okay,
01:45:38.660 but by expanding
01:45:40.800 the depth
01:45:42.060 of their
01:45:42.580 philosophical acumen
01:45:44.800 and trying to live
01:45:46.620 more responsible
01:45:47.500 and truthful lives.
01:45:49.180 And that's just
01:45:50.340 absolutely great.
01:45:52.060 And if there's
01:45:52.600 some fuzziness
01:45:53.660 at the edges
01:45:54.400 and some trouble,
01:45:55.700 well,
01:45:55.980 there's going to be,
01:45:56.820 and that's just
01:45:57.640 how it is.
01:45:58.560 But what I observe
01:46:00.260 is that,
01:46:03.680 and the way
01:46:04.380 that I experience
01:46:05.140 this is that
01:46:05.800 it's overwhelmingly
01:46:07.060 positive.
01:46:08.480 And so that's
01:46:09.020 good enough
01:46:09.440 as far as I'm
01:46:10.120 concerned.
01:46:17.200 I'll just add
01:46:18.060 this one in.
01:46:18.840 Because you just
01:46:19.580 mentioned something
01:46:20.100 about conspiracies.
01:46:21.440 When we took
01:46:22.480 that picture
01:46:23.000 a couple weeks
01:46:23.600 ago at dinner
01:46:24.220 with me,
01:46:24.760 you,
01:46:25.060 Rogan,
01:46:25.660 and Sam,
01:46:26.880 and Eric Weinstein,
01:46:28.120 and Ben Shapiro,
01:46:29.380 you know,
01:46:29.660 thousands of people
01:46:30.580 retweeted it,
01:46:31.440 loved it,
01:46:31.800 and all that,
01:46:32.160 but then there was
01:46:32.900 this secondary
01:46:33.500 group of people
01:46:34.460 that had all
01:46:35.620 of these conspiracies
01:46:36.900 about us sitting
01:46:38.340 there having dinner.
01:46:39.380 All we did was
01:46:41.740 have dinner.
01:46:43.540 As far as we can
01:46:44.440 tell these people.
01:46:46.340 But I mean,
01:46:46.980 really,
01:46:47.260 we had dinner,
01:46:48.260 right?
01:46:48.480 Well,
01:46:48.960 the thing about
01:46:49.880 that dinner,
01:46:50.540 I really enjoyed
01:46:51.520 that dinner.
01:46:52.040 It was quite fun
01:46:52.600 to see everybody
01:46:53.280 there.
01:46:54.100 It's like generating
01:46:55.620 a conspiracy
01:46:56.620 out of cats.
01:46:59.120 You know,
01:46:59.540 it's like,
01:47:00.320 well,
01:47:00.560 everybody,
01:47:01.220 it's not like
01:47:01.720 the people in
01:47:02.340 that little group
01:47:03.200 were the same.
01:47:04.480 I mean,
01:47:05.520 Eric Weinstein
01:47:06.380 and Joe Rogan,
01:47:07.280 it's like they're
01:47:07.720 not even members
01:47:08.520 of the same species.
01:47:11.040 So,
01:47:12.160 it's not that easy
01:47:14.460 to organize a conspiracy
01:47:15.820 when the people
01:47:17.180 that you're dealing
01:47:17.760 with are marked
01:47:19.740 out mostly
01:47:21.100 by their
01:47:22.240 proclivity
01:47:24.520 for individual
01:47:25.340 endeavors,
01:47:26.040 I would say.
01:47:26.860 It's a weird group
01:47:27.780 to be in,
01:47:28.440 right?
01:47:28.700 Because what we've
01:47:29.960 been trying to figure
01:47:30.640 out,
01:47:31.200 Dave and I in particular,
01:47:32.180 because we spend
01:47:32.660 so much time together,
01:47:33.480 why this intellectual
01:47:35.280 dark web
01:47:36.040 nomenclature
01:47:37.180 emerged
01:47:37.840 and why it
01:47:38.480 sort of stuck.
01:47:39.500 It's like,
01:47:39.920 well,
01:47:40.360 a name doesn't
01:47:41.220 stick unless
01:47:42.000 it names something
01:47:43.540 that at least
01:47:44.760 has more than
01:47:45.800 a nominal existence.
01:47:47.260 And we thought,
01:47:49.180 well,
01:47:49.320 we're all early
01:47:49.900 adopters of this
01:47:50.740 long-form technology
01:47:51.840 and that's probably
01:47:52.920 the biggest issue,
01:47:54.120 really,
01:47:54.520 right there.
01:47:55.060 And so,
01:47:55.460 fundamentally,
01:47:56.080 that's a
01:47:56.400 technological issue.
01:47:57.320 And then the
01:47:58.040 next issue is
01:47:58.900 we like to have
01:48:02.020 long-form discussions
01:48:03.100 apparently.
01:48:04.900 I think because
01:48:06.020 everybody in that
01:48:06.720 group would actually
01:48:07.400 like to learn
01:48:07.980 something and so
01:48:08.800 is capable of
01:48:10.820 and willing to
01:48:11.660 have an actual
01:48:12.400 conversation.
01:48:13.260 You really see that
01:48:13.920 with Rogan.
01:48:15.080 You know,
01:48:15.320 because,
01:48:15.760 and that's one of
01:48:16.160 the things that
01:48:16.600 makes him so
01:48:17.100 attractive is that
01:48:18.020 Joe doesn't care
01:48:19.340 that there's a bunch
01:48:19.960 of things he doesn't
01:48:20.620 know.
01:48:21.460 Well,
01:48:21.700 partly because there's
01:48:22.320 a bunch of things
01:48:22.840 he does know.
01:48:23.560 He's competent in
01:48:24.320 many ways.
01:48:24.860 So he doesn't care.
01:48:25.980 He'll talk to anyone
01:48:26.720 and try to figure out
01:48:27.680 what the hell's going
01:48:28.400 on.
01:48:28.640 And he's skeptical
01:48:29.280 about it,
01:48:29.780 but only because
01:48:30.580 he wants to,
01:48:32.280 he's not skeptical
01:48:33.180 in order
01:48:34.200 to tear something
01:48:35.020 down.
01:48:36.140 He's skeptical
01:48:37.000 so that he can
01:48:37.780 separate the wheat
01:48:38.580 from the chaff.
01:48:40.440 So anyways,
01:48:41.120 sitting there with
01:48:41.660 everybody,
01:48:42.200 you know,
01:48:42.440 it was interesting
01:48:44.020 to watch,
01:48:44.640 and that's mostly
01:48:45.260 what I did,
01:48:45.860 and to watch all
01:48:46.540 these people who
01:48:47.100 have their
01:48:47.460 individual enterprises,
01:48:50.220 let's say,
01:48:51.040 micro-empires,
01:48:52.100 and to discuss
01:48:53.960 what it is that
01:48:55.880 we had in common
01:48:56.660 and where we might
01:48:57.620 be going.
01:48:58.980 But there wasn't
01:49:00.140 anything apart from
01:49:02.020 that that was
01:49:02.620 conspiratorial about
01:49:03.700 it,
01:49:04.000 even though it's
01:49:04.420 fun to pretend
01:49:05.040 that there is.
01:49:06.500 You know,
01:49:06.800 and I thought,
01:49:07.340 I thought,
01:49:07.680 this is kind of
01:49:08.280 like being part
01:49:09.020 of the rat pack
01:49:09.720 back in the 1950s.
01:49:11.260 So kind of a more
01:49:12.460 diverse range of rats,
01:49:14.180 but the same thing.
01:49:16.120 We ain't that cool,
01:49:17.080 Peterson.
01:49:17.560 No,
01:49:17.720 I know.
01:49:18.540 That's why I said
01:49:19.260 sort of like.
01:49:21.080 Although I did
01:49:21.780 eat a giant piece
01:49:22.920 of bacon right
01:49:23.740 in front of Shapiro,
01:49:24.640 which I thought
01:49:25.100 was pretty good.
01:49:27.300 I was like,
01:49:27.980 if he eats it,
01:49:28.740 then we've really
01:49:29.320 made some moves here.
01:49:30.180 All right.
01:49:31.540 Oh,
01:49:31.940 you know what?
01:49:32.220 We only have time
01:49:32.760 for one more,
01:49:33.340 unfortunately.
01:49:34.900 I want to make
01:49:35.540 this a good one.
01:49:37.340 Do you believe
01:49:38.380 in a sixth sense?
01:49:42.500 Do I believe
01:49:43.260 in a sixth sense?
01:49:47.140 Well,
01:49:47.700 no.
01:49:51.780 We should have
01:49:55.760 a different question.
01:49:58.020 Yeah,
01:49:58.220 that was not
01:49:59.220 me using
01:50:00.100 my sixth sense.
01:50:01.340 Okay?
01:50:09.040 I'm using
01:50:09.660 a Peterson pause
01:50:10.580 right now.
01:50:11.240 You like that?
01:50:13.340 All right,
01:50:13.740 here's one
01:50:14.080 that people can take away
01:50:15.080 when they leave here.
01:50:16.420 How do you stop
01:50:17.560 yourself
01:50:18.120 from comparing
01:50:19.140 yourself
01:50:19.920 to other people?
01:50:21.940 Yeah,
01:50:22.180 so that's rule
01:50:23.020 four, right?
01:50:24.200 Compare yourself
01:50:24.860 to who you were
01:50:25.520 yesterday
01:50:25.980 and not to who
01:50:26.680 someone else is today.
01:50:28.680 Well,
01:50:29.260 I don't think
01:50:30.020 you really can
01:50:31.160 stop yourself.
01:50:33.480 And under some
01:50:34.540 circumstances,
01:50:35.380 you shouldn't.
01:50:36.480 You know,
01:50:36.760 like when you're
01:50:37.240 16 or 17,
01:50:38.840 when you're
01:50:39.140 a young person,
01:50:41.520 some comparison
01:50:42.300 of yourself
01:50:42.900 to others is,
01:50:43.960 especially others
01:50:44.720 that are your age,
01:50:45.780 is good because
01:50:46.560 you're not really
01:50:48.340 anything yet.
01:50:49.200 And you need
01:50:50.700 to become
01:50:51.180 who you are.
01:50:52.080 And so,
01:50:53.140 a little admiration
01:50:55.180 and a little mimicry
01:50:56.400 and a little
01:50:57.040 competition,
01:51:00.700 that's all a good
01:51:01.400 thing.
01:51:02.160 I think what happens
01:51:03.340 though as you get
01:51:03.940 older,
01:51:04.280 I really noticed
01:51:04.880 this in my
01:51:05.460 clinical practice
01:51:06.300 when I was
01:51:06.820 helping people
01:51:07.380 try to sort out
01:51:08.100 their lives,
01:51:08.920 especially around
01:51:09.660 30,
01:51:11.180 is that by the
01:51:11.820 time you're 30,
01:51:13.160 if you've
01:51:13.840 established yourself
01:51:15.500 to any degree
01:51:16.120 in the world,
01:51:16.620 and even if you
01:51:17.440 haven't,
01:51:18.500 your life has
01:51:19.580 become so
01:51:20.220 idiosyncratic
01:51:21.280 that comparing
01:51:22.060 yourself to other
01:51:22.700 people isn't all
01:51:23.280 that helpful.
01:51:24.960 You know,
01:51:25.240 I mean,
01:51:25.460 you can do it
01:51:25.940 to sort of
01:51:26.320 keep yourself
01:51:27.020 oriented.
01:51:29.600 You don't want
01:51:30.280 to be too
01:51:30.720 strange because
01:51:32.400 you wander off
01:51:34.480 the beaten path
01:51:35.260 and you get
01:51:35.680 lost,
01:51:36.180 and that's not
01:51:36.760 a good thing.
01:51:37.980 So you want
01:51:38.660 to compare
01:51:39.000 yourself to
01:51:39.740 other people
01:51:40.240 insofar as
01:51:40.900 that keeps you
01:51:41.560 on the beaten
01:51:42.420 path,
01:51:42.900 let's say,
01:51:43.260 but most
01:51:44.920 of the other
01:51:45.320 comparisons just
01:51:46.080 start to become
01:51:46.820 naively
01:51:48.160 counterproductive.
01:51:50.300 Like,
01:51:50.760 you're the only
01:51:51.860 person with
01:51:52.540 your particular
01:51:53.280 physical attributes
01:51:54.800 and your
01:51:55.500 particular
01:51:55.960 intelligence
01:51:57.760 in combination
01:51:58.980 with your
01:51:59.580 temperament.
01:52:00.200 Like,
01:52:00.420 you really are
01:52:00.880 a unique
01:52:01.440 creature in
01:52:02.300 many ways,
01:52:03.020 and you're
01:52:04.100 the only one
01:52:04.460 who had your
01:52:04.980 parents,
01:52:05.720 and you're
01:52:07.240 the only one
01:52:07.640 who has your
01:52:08.060 siblings and
01:52:08.680 your family
01:52:09.260 and your
01:52:10.240 particular
01:52:10.760 moral virtues
01:52:12.680 and failings.
01:52:13.840 You're really
01:52:14.220 an idiosyncratic
01:52:15.820 creature,
01:52:16.620 and you're
01:52:17.040 very,
01:52:17.420 very complex,
01:52:18.240 and so if
01:52:19.580 you take that
01:52:20.160 idiosyncrasy
01:52:21.080 and that
01:52:21.440 complexity,
01:52:22.580 and you
01:52:22.940 compare it
01:52:23.500 casually to
01:52:24.380 someone else,
01:52:25.160 you're missing
01:52:26.100 all sorts of
01:52:26.940 things that are
01:52:27.600 relevant.
01:52:28.640 You know,
01:52:28.860 like,
01:52:29.440 I've had
01:52:29.840 clients,
01:52:30.380 for example,
01:52:30.920 who were
01:52:31.200 fantastically
01:52:31.880 successful at
01:52:32.860 their chosen
01:52:33.500 endeavor,
01:52:34.660 and who
01:52:35.740 lived a
01:52:36.780 fairly
01:52:37.160 comfortable
01:52:39.580 life as a
01:52:41.160 consequence,
01:52:41.720 and,
01:52:43.120 you know,
01:52:43.360 if you
01:52:43.580 looked at
01:52:43.900 them from
01:52:44.180 the outside,
01:52:44.840 you might
01:52:45.100 think,
01:52:45.520 well,
01:52:45.660 there's
01:52:45.960 someone who's
01:52:46.500 got everything,
01:52:47.420 and you might
01:52:48.480 get bitter about
01:52:49.200 that and
01:52:49.600 envious and
01:52:50.260 start to put
01:52:50.780 yourself down
01:52:51.440 and all that,
01:52:51.980 but when you
01:52:52.740 scratch below
01:52:53.320 the surface
01:52:54.000 and not very
01:52:55.100 far in
01:52:56.540 people's lives,
01:52:57.280 and it doesn't
01:52:57.820 matter who
01:52:58.260 they are,
01:52:59.080 you find out
01:52:59.720 that they're
01:53:00.100 carrying their
01:53:00.840 own catastrophe
01:53:02.220 and tragedy
01:53:02.980 and history
01:53:04.180 of betrayal,
01:53:05.620 and,
01:53:06.120 you know,
01:53:06.340 so one of
01:53:07.340 the clients I
01:53:07.880 had who I
01:53:08.340 admired very
01:53:09.040 much had a
01:53:09.560 very harsh and
01:53:10.740 tyrannical
01:53:11.200 father,
01:53:11.760 and another
01:53:12.440 had a son
01:53:13.680 who was
01:53:14.260 seriously
01:53:14.800 mentally ill,
01:53:16.400 and another
01:53:18.440 one who
01:53:18.900 had,
01:53:21.280 well,
01:53:21.680 who was
01:53:22.120 married to
01:53:22.620 someone who
01:53:23.040 had very,
01:53:23.640 very serious
01:53:24.140 health problems,
01:53:25.040 and everyone
01:53:26.380 is contending
01:53:27.800 with their
01:53:28.200 own feelings
01:53:29.560 of inadequacy
01:53:30.680 and fears.
01:53:31.820 People who
01:53:32.300 run successful
01:53:33.080 businesses wake
01:53:33.980 up in the
01:53:34.400 middle of the
01:53:34.820 night,
01:53:35.240 if they're not
01:53:35.600 psychopathic,
01:53:36.440 in a cold sweat,
01:53:37.180 because so
01:53:37.720 many people
01:53:38.180 depend on
01:53:38.740 them,
01:53:38.900 and now
01:53:39.540 and then
01:53:39.780 the economy
01:53:40.460 takes a
01:53:40.960 downturn
01:53:41.440 and they're
01:53:41.760 not sure
01:53:42.180 that they're
01:53:42.500 going to be
01:53:42.820 able to
01:53:43.100 make payroll
01:53:43.760 and life
01:53:44.700 is bloody
01:53:45.240 hard.
01:53:46.460 And to be
01:53:47.180 envious of
01:53:47.840 someone,
01:53:48.840 generally speaking,
01:53:50.180 is also to be
01:53:51.380 very naive
01:53:52.120 because you
01:53:53.140 don't see the
01:53:54.580 totality of
01:53:55.300 their life.
01:53:56.900 And then it's
01:53:57.380 also unfair
01:53:58.300 in a more
01:53:59.700 profound sense.
01:54:00.780 You can run a
01:54:01.740 fair race with
01:54:02.600 yourself because
01:54:03.940 you're the only
01:54:04.560 person who has
01:54:05.380 your particular
01:54:06.180 combination of
01:54:07.520 talents and
01:54:08.040 weaknesses and
01:54:08.760 it's perfectly
01:54:09.640 reasonable for
01:54:10.380 you to try to
01:54:10.940 be better than
01:54:11.600 you were
01:54:12.000 yesterday.
01:54:12.900 It's a fair
01:54:13.600 game, but to
01:54:15.540 match yourself up
01:54:16.760 against someone
01:54:17.420 else is, well,
01:54:18.440 first of all,
01:54:18.880 that's a gateway
01:54:19.420 to envy and
01:54:20.240 that isn't a
01:54:21.020 gateway that I
01:54:21.620 would recommend
01:54:22.140 opening because
01:54:23.320 there are very,
01:54:24.100 very, very, very
01:54:26.500 dark things inside
01:54:27.820 that door.
01:54:28.620 But it's also
01:54:29.280 just not that
01:54:30.000 productive because
01:54:30.900 it's an inaccurate
01:54:32.180 comparison.
01:54:33.700 And so you have
01:54:34.320 to aim up and
01:54:36.400 you might think,
01:54:36.960 well, you could
01:54:37.460 compare yourself
01:54:38.140 with others who
01:54:38.780 are apparently
01:54:39.400 up, and you
01:54:40.620 can do that to
01:54:41.240 some degree, but
01:54:41.900 basically what you
01:54:42.780 want to do is
01:54:43.340 escape from the
01:54:45.700 self-imposed
01:54:46.580 strictures of your
01:54:47.720 own past and
01:54:49.180 continually transcend
01:54:50.240 yourself.
01:54:51.160 And that's just
01:54:51.860 better.
01:54:52.640 It's just a better
01:54:53.360 game.
01:54:54.900 And it's a
01:54:56.800 sufficient game, I
01:54:58.320 think, because you
01:54:59.560 are bearing up
01:55:00.500 underneath your
01:55:01.360 own catastrophes and
01:55:02.920 your own
01:55:03.840 handicaps and
01:55:04.700 able to capitalize
01:55:07.140 on your own
01:55:07.840 virtues.
01:55:08.460 And it isn't
01:55:10.480 what you're like
01:55:10.980 compared to
01:55:11.440 someone else.
01:55:12.100 It's what you're
01:55:13.080 like compared to
01:55:13.660 who you were
01:55:14.060 yesterday.
01:55:15.000 And the other
01:55:15.460 thing about that,
01:55:16.260 and this is a
01:55:16.700 good place to
01:55:17.200 stop, is you
01:55:18.140 can get a long
01:55:19.000 ways doing that,
01:55:20.060 man.
01:55:21.080 Incremental
01:55:21.480 improvements, you
01:55:22.320 make yourself a
01:55:22.960 little bit better
01:55:23.520 every day, like a
01:55:24.480 humble little bit
01:55:25.480 better, the little
01:55:26.120 bit that you can
01:55:26.760 manage.
01:55:27.560 God, you'll be in
01:55:28.300 such better shape in
01:55:29.340 four or five years
01:55:30.100 that you won't be
01:55:30.680 able to believe it.
01:55:31.360 And you do that
01:55:32.240 for 15 years, you'll
01:55:33.320 be a completely
01:55:34.160 different person.
01:55:35.840 And so, and
01:55:37.200 hypothetically, God
01:55:38.840 willing, with a
01:55:39.660 certain amount of
01:55:40.240 luck, a much
01:55:41.380 better person for
01:55:43.020 you and for your
01:55:43.860 family and for your
01:55:44.820 community.
01:55:45.400 And so, you might
01:55:46.360 as well just do
01:55:46.980 that.
01:55:47.320 It's just better in
01:55:48.620 every possible way.
01:55:49.900 Thank you.
01:55:50.420 If you found this
01:56:00.760 conversation meaningful,
01:56:02.160 you might think
01:56:02.740 about picking up
01:56:03.400 Dad's books, Maps
01:56:04.420 of Meaning, The
01:56:05.080 Architecture of
01:56:05.780 Belief, or his
01:56:06.780 newer bestseller,
01:56:07.700 Twelve Rules for
01:56:08.420 Life, An Antidote
01:56:09.320 to Chaos.
01:56:10.380 Both of these works
01:56:11.460 delve much deeper
01:56:12.340 into the topics
01:56:12.980 covered in the
01:56:13.740 Jordan B.
01:56:14.200 Peterson podcast.
01:56:15.760 See jordanbpeterson.com
01:56:17.620 for audio, e-book,
01:56:18.940 and text links, or
01:56:20.260 pick up the books
01:56:20.900 at your favorite
01:56:21.360 bookseller.
01:56:22.500 Next week's podcast
01:56:23.560 is going to be a
01:56:24.460 discussion between
01:56:25.160 Dad and Ben Shapiro,
01:56:26.640 so that's exciting.
01:56:27.880 If you somehow
01:56:28.460 don't know who
01:56:29.080 Ben Shapiro is, he's
01:56:30.300 one of the most
01:56:30.820 recognized individuals
01:56:31.900 on the American
01:56:32.620 political slash
01:56:33.480 journalism scene.
01:56:34.680 He's a lawyer,
01:56:35.580 writer, journalist,
01:56:36.560 political commentator,
01:56:37.620 and author of
01:56:38.280 ten books.
01:56:39.320 He's editor-in-chief
01:56:40.220 for The Daily Wire
01:56:41.040 and the host of
01:56:42.140 The Ben Shapiro Show.
01:56:43.560 Talk to you next
01:56:44.340 week.
01:56:45.100 Hope you have a
01:56:45.640 wonderful week.
01:56:47.220 Follow me on my
01:56:48.220 YouTube channel,
01:56:49.160 Jordan B.
01:56:50.020 Peterson, on
01:56:50.980 Twitter, at
01:56:51.900 Jordan B.
01:56:52.580 Peterson, on
01:56:53.540 Facebook, at
01:56:54.780 Dr. Jordan B.
01:56:55.900 Peterson, and at
01:56:56.900 Instagram, at
01:56:57.720 Jordan.B.
01:56:59.140 Peterson.
01:57:00.080 Details on this
01:57:01.060 show, access to
01:57:02.520 my blog, information
01:57:04.080 about my tour dates
01:57:05.260 and other events, and
01:57:06.500 my list of recommended
01:57:07.540 books can be found on
01:57:09.120 my website,
01:57:10.280 JordanBPeterson.com.
01:57:12.200 My online writing
01:57:13.340 programs, designed to
01:57:14.760 help people straighten
01:57:15.680 out their pasts,
01:57:17.080 understand themselves
01:57:18.040 in the present, and
01:57:19.160 develop a sophisticated
01:57:20.140 vision and strategy
01:57:21.360 for the future, can be
01:57:22.660 found at
01:57:23.180 SelfAuthoring.com.
01:57:25.140 That's
01:57:25.580 SelfAuthoring.com.
01:57:27.660 From the Westwood
01:57:28.420 One Podcast Network.