The Jordan B. Peterson Podcast - December 08, 2019


Resolving the Science-Religion Problem


Episode Stats

Length

3 hours and 32 minutes

Words per Minute

158.14

Word Count

33,618

Sentence Count

1,622

Misogynist Sentences

13

Hate Speech Sentences

39


Summary

Dr. Jordan B. Peterson has created a new series that could be a lifeline for those battling depression and anxiety. We know how isolating and overwhelming these conditions can be, and we wanted to take a moment to reach out to those listening who may be struggling. With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way. In his new series, he provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward. If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better. Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on Depression and Anxiety. Let this be the first step towards the brighter future you deserve.

Season 2, Episode 38: Resolving the Science-Religion Problem. A 12 Rules for Life lecture by Dr. Jordan B. Peterson, recorded in San Jose, CA on January 22, 2019, a week after he delivered a lecture to a large audience in Zurich, Switzerland. In this episode, he discusses the long-standing conflict between religion and science, and how the science-religion problem might be resolved. If you're listening to this podcast before December 17, 2019, a 15% pre-sale discount is available on his first e-course on personality at JordanBPeterson.com/personality.


Transcript

00:00:00.960 Hey everyone, real quick before you skip, I want to talk to you about something serious and important.
00:00:06.480 Dr. Jordan Peterson has created a new series that could be a lifeline for those battling depression and anxiety.
00:00:12.740 We know how isolating and overwhelming these conditions can be, and we wanted to take a moment to reach out to those listening who may be struggling.
00:00:20.100 With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way in his new series.
00:00:27.420 He provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward.
00:00:35.360 If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better.
00:00:41.800 Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety.
00:00:47.460 Let this be the first step towards the brighter future you deserve.
00:00:57.420 Welcome to Season 2, Episode 38 of the Jordan B. Peterson Podcast.
00:01:04.880 I'm Mikhaila Peterson, Dad's daughter and collaborator.
00:01:08.200 Girl who only eats meat and does extended fasting.
00:01:11.980 Believe it or not, my brother is basically completely normal.
00:01:15.280 He's probably adopted.
00:01:16.860 Or he takes after my mom's side. I haven't figured out which.
00:01:19.640 Today's episode is a 12 Rules for Life lecture, recorded in San Jose on January 22, 2019.
00:01:27.220 I've named it Resolving the Science-Religion Problem.
00:01:31.180 Last week, I walked on the beach for the first time with no pain in almost 12 years.
00:01:36.320 I had my ankle replacement re-replaced last January, and that is kind of a hellish surgery.
00:01:41.800 I was awake during it too because I opted out of general anesthesia.
00:01:44.920 About a quarter of the way into the surgery, I regretted that, but hey, I survived.
00:01:50.400 Anyway, the surgeon fixed the problem, the problem being a crooked ankle replacement,
00:01:54.960 installed 10 years prior.
00:01:57.100 Isn't that crazy?
00:01:58.300 So after about a half a mile walking in the sand, I literally cried with appreciation.
00:02:03.480 It was beautiful.
00:02:04.600 I had overlaid my experience of beaches with sadness and hatred and frustration, and that's gone now.
00:02:10.840 It's a hell of a lot easier when you're not in pain.
00:02:13.540 Sometimes the beauty in life is overwhelming.
00:02:16.160 Just wanted to share that with everybody because it was really overwhelming, and it made my day.
00:02:22.200 Exciting news.
00:02:23.540 Dad is launching his very first e-course on December 17, 2019.
00:02:29.360 It's available for pre-sale currently.
00:02:32.080 A lot of people have been asking us for a more structured and condensed resource
00:02:35.400 where they can learn about personality without needing to spend 30-plus hours watching videos,
00:02:41.540 reading resources, etc.
00:02:42.800 So earlier this year, we recorded a new video series that will be packaged as an online course
00:02:48.100 with eight videos, supplementary materials, including lecture notes, additional reading
00:02:52.820 materials and resources, transcripts, a free license to the Understand Myself personality
00:02:57.660 assessment, and an exclusive discussion group.
00:03:00.960 All designed to give you an in-depth look and understanding of your personality.
00:03:05.000 Personality is my favorite topic in psychology.
00:03:07.100 It's worth checking out if you've been intending to learn about personality and want to do
00:03:11.480 it in a concise and structured format.
00:03:13.980 Dad's released a lot of information about personality on YouTube for free, but this is
00:03:17.880 a more concise, structured way of learning and features some information on personality
00:03:21.520 differences between genders, so that's cool.
00:03:24.000 Go to JordanBPeterson.com slash personality to check it out.
00:03:28.280 If you're listening to this podcast before December 17th, 2019, we're currently offering
00:03:33.760 a pre-sale for a 15% discount on the course at $120.
00:03:38.580 If you're interested, this is a great opportunity to get it at a lower price or buy it for someone
00:03:42.960 for Christmas even.
00:03:44.360 Hopefully you find it cool.
00:03:46.020 I did.
00:03:46.440 I sat in on all the lectures, even though I had food poisoning at the time.
00:03:50.660 Imagine getting food poisoning when you only eat one thing.
00:03:53.700 Not ideal, but I still sat through the lectures.
00:03:56.640 They're that good.
00:03:57.640 Check it out at JordanBPeterson.com slash personality.
00:04:10.240 Resolving the Science-Religion Problem.
00:04:12.180 A Jordan B. Peterson 12 Rules for Life Lecture.
00:04:16.440 It's been a while since I talked to a large audience.
00:04:19.580 I talked to an audience in Zurich a week ago, but that was in a completely different time
00:04:24.280 zone.
00:04:25.520 So, and I was just trying to recapitulate in my imagination what it is that I was trying
00:04:35.440 to do with 12 Rules for Life and with my first book, Maps of Meaning.
00:04:41.140 And so I was laying that out, and I think I'll walk through that again.
00:04:43.700 And I kind of get warmed up to the topic.
00:04:46.960 It's a complicated thing to lay out in total.
00:04:55.480 So, the first issue is, I think, that these are issues that everybody knows.
00:05:03.500 But all these issues that I'm going to lay out are related.
00:05:08.000 So, the first issue that everybody knows is that there's a conflict between religion
00:05:13.340 and science.
00:05:14.360 And so, that's a conflict that's torn at the heart of our culture for about 500 years.
00:05:28.700 Especially as scientific progress has become more self-evident.
00:05:34.680 And there's been a lot of really good things about that, obviously.
00:05:39.240 Science has driven a technological revolution, and it's radically improved our standard of
00:05:45.340 living, our material well-being.
00:05:47.720 And it's very difficult not to think of that as a good thing.
00:05:51.920 And one of the things that's really remarkable about that is that it's accelerating, by all appearances.
00:06:00.100 It really took off in the late 1800s.
00:06:03.200 And that it's happening everywhere in the world.
00:06:06.040 And so, there's been an unbelievably rapid economic transformation throughout the world,
00:06:12.400 especially in the last 20 years.
00:06:14.240 So, you know, the bulk of the population in the world now is middle class.
00:06:20.300 And starvation is virtually a thing of the past, although not entirely.
00:06:25.580 And even extreme privation, from a material perspective, is declining very rapidly.
00:06:31.360 The UN defines extreme privation as existence on less than $1.90 a day in today's dollars.
00:06:39.980 And that's declined 50% since the year 2000, which is absolutely phenomenal.
00:06:47.560 And the UN projects that it will be eradicated completely by the year 2030.
00:06:54.840 And just to put things in perspective, in the West, before 1895, the typical person lived
00:07:01.280 on a dollar a day.
00:07:02.860 So that's less than the current UN standard for extreme privation.
00:07:08.660 And so, the technological revolution that's been driven, in large part, by the dawn of
00:07:16.400 scientific thinking has radically altered the West and its standard of living.
00:07:22.760 But now is doing the same thing everywhere else in the world.
00:07:26.440 So, you know, here's another example.
00:07:30.400 The child mortality rate in Africa is now the same as it was in Europe in 1952.
00:07:34.900 So that's just absolutely beyond belief.
00:07:38.740 So, this is a good thing.
00:07:42.440 But the conflict between the scientific viewpoint and the religious viewpoint still exists.
00:07:49.180 Carl Jung, who's a favorite thinker of mine, a psychoanalytic thinker who lived, who wrote and thought
00:08:00.420 through most of the 20th century.
00:08:02.280 He died in 1960.
00:08:06.600 So, I guess for the first six-tenths of the 20th century.
00:08:09.760 He believed that what had happened about 500 years ago was that as science developed out
00:08:16.920 of alchemy, which was like the dream, the alchemy was the dream out of which science emerged.
00:08:23.320 Alchemy was the dream that you could discover a material substance, which was the philosopher's
00:08:28.720 stone that would confer upon everyone health, wealth, and longevity.
00:08:34.380 And, of course, there is no philosopher's stone, but the dream was correct, because the dream
00:08:43.320 was that if we studied the material world with enough care, that we could discover something
00:08:50.500 that would produce wealth and health and longevity.
00:08:54.280 And that did happen.
00:08:55.480 And so, sometimes things that occur in actuality have to be dreamed before they occur.
00:09:05.300 Alchemy had an ethical aspect and a practical aspect, let's say.
00:09:10.340 And the practical aspect exploded up into the scientific revolution and gave us this incredible
00:09:16.000 technological power.
00:09:17.540 But it was Jung's belief that the ethical element remained undeveloped, and that that was dangerous.
00:09:25.480 And that we were now in a situation, and have been for quite a while, where our technological
00:09:31.060 power outstrips our ethical knowledge.
00:09:34.160 And part of that's manifested in an uncertainty about ethics in general, about how to behave,
00:09:41.600 about whether there is even an answer to the question how to behave, about whether or not
00:09:49.240 there is such a thing as an ethic that isn't morally relative or arbitrary in some sense.
00:09:57.000 And I think you see that manifested, that critique manifested most particularly in the postmodern
00:10:02.840 doctrine, which claims with some justification that there's a very large number of ways of
00:10:08.420 looking at the world, and that there's no straightforward way to determine
00:10:15.180 which one of that multitude of ways of viewing the world is correct.
00:10:23.180 Now, there's a corollary problem.
00:10:29.940 You have the religion and science problem.
00:10:32.900 And you have a philosophical problem that David Hume pointed out, which is that it isn't
00:10:38.520 obvious how you can derive an ought from an is.
00:10:42.600 And an ought is an ethic.
00:10:46.500 It's how you should behave.
00:10:47.820 And an is is a description.
00:10:49.760 And you could think of the scientific method as an attempt to describe what is, and the
00:10:57.320 ethical endeavor as an attempt to describe what ought to be.
00:11:02.180 And according to Hume, there was a gap between the two, and it wasn't a straightforward thing
00:11:07.780 to bridge.
00:11:09.200 And that seems to be right.
00:11:10.460 So there's a parallel problem, the is-ought problem, the science-religion problem.
00:11:17.820 And so I was very interested in those parallel sets of problems.
00:11:25.620 And that's what I've been trying to address, and that's what I wrote about in 12 Rules for
00:11:30.320 Life and in Maps of Meaning.
00:11:31.560 Anyway, I started to hypothesize a long while back that, in a similar manner, that there
00:11:44.720 were two ways of looking at the world.
00:11:47.420 You could look at the world as a place of objects, a place of material objects, or you
00:11:54.100 could look at the world as a place to act.
00:11:55.900 And that that's the same idea, reflected in a different way.
00:12:01.600 The world as a place of objects, that's the is-world, that's the world science describes.
00:12:06.480 And the world as a place to act, that's the ought-world.
00:12:09.300 And then I realized, too, and this was partly from studying the psychoanalysts, but also from
00:12:14.300 studying literature.
00:12:16.020 Literature first, and the psychoanalysts second, was that the world as a place to act is laid
00:12:24.480 out in stories, and that maybe the world is like a stage that's set for a drama, you
00:12:33.180 know?
00:12:33.680 You walk in like you walked in tonight, you look at the stage, and there's nothing, there's
00:12:39.200 no characters on it to begin with.
00:12:41.240 There's just the objects on the stage, the speakers and the chair, and the stage is set.
00:12:48.720 And you could know everything about the stage setting from a scientific perspective, and you
00:12:52.860 still would have no idea what drama was about to take place.
00:12:58.020 Yet the purpose of the stage and the purpose of the play is the drama and the stage setting,
00:13:04.600 the objects are in some sense almost peripheral to that.
00:13:07.700 And you know that, too, because you've seen plays and movies, and you know how many different
00:13:11.620 ways you can set a stage for the drama to occur.
00:13:16.780 The question is, well, what constitutes the drama?
00:13:20.060 So I got interested in the drama,
00:13:25.380 because the point of the stage is the play and not the setting.
00:13:36.320 And I started to understand that the drama had a structure, and that the structure, just
00:13:44.840 like the material world has a structure, just like there's a periodic table of the elements,
00:13:49.180 there's a periodic table, in some sense, there's a periodic table of dramatic personae.
00:13:55.520 And if you understand them, you can start to understand how to act in the world.
00:14:01.980 And there is actually a way to act in the world that's not arbitrary.
00:14:05.820 When I was writing Maps of Meaning, which was my first book, I was trying to address a question
00:14:11.180 that was a postmodern question, although I didn't know it at the time.
00:14:15.160 I hadn't conceptualized it at the time.
00:14:17.160 And the question was this.
00:14:18.540 I was obsessed by the fact that the world had divided itself into two armed camps, the
00:14:27.020 armed camp that constituted essentially the Soviet Union, but you could say the entire
00:14:31.440 communist bloc, and then the West.
00:14:34.600 And that each of those two blocs had arranged their societies according to different axiomatic
00:14:40.920 presumptions, and were at odds with one another.
00:14:43.660 And it wasn't obvious, the world wouldn't have divided itself into those two armed camps,
00:14:49.940 if it was obvious which of those two sets of axiomatic presuppositions were more valid,
00:14:57.800 let's say.
00:14:58.680 Or maybe the problem was even deeper than that, which was those were just two arbitrary sets
00:15:04.820 of axiomatic presuppositions out of a very large number of potential sets, and they just
00:15:10.480 happened to be the ones that emerged, and they were at odds with one another.
00:15:13.660 And so I started to study the understructure of the two belief systems to see if one of
00:15:19.800 them had more validity than the other, if I could figure that out, if there was something
00:15:25.100 underneath at least one of them.
00:15:27.660 And what I discovered, I think, was that the belief system that characterized the societies
00:15:35.380 of the West, the underlying belief system, wasn't arbitrary.
00:15:38.200 It was just correct.
00:15:41.000 And I think the fact that the Soviet Union fell apart so precipitously in 1989 is actually
00:15:47.200 evidence of the unplayability of the Soviet game.
00:15:51.040 That's a good way of, that's a really good way of thinking about it.
00:15:53.620 There's some games you can play, and there's some games you can't play.
00:15:56.780 If you iterate some games, they degenerate.
00:15:59.540 And if you iterate others, they improve.
00:16:01.660 And that's actually one of the pieces of evidence that suggests that one system is preferable
00:16:06.960 to another, that you can iterate it across time, and that that's actually not a degenerating
00:16:12.700 game, but an improving game.
00:16:15.340 And, you know, if you have a relationship with someone, it's an iterated game, right?
00:16:19.460 It's iterated because it repeats.
00:16:20.960 If you have a permanent relationship with someone, a marriage, a friendship, a relationship between
00:16:26.740 siblings, relationship between you and your children, you want to be able to interact
00:16:31.120 with them in a manner that doesn't get worse across time.
00:16:33.880 Maybe at least it stays flat, but it would be better if it even got better.
00:16:38.480 That would be lovely.
00:16:40.160 And your successful relationships are at least ones that maintain the status quo, but the
00:16:46.860 great ones improve across time.
00:16:48.520 And so there are ways that you can interact iteratively with people in a manner
00:16:57.660 that sustains that iteration and fortifies it.
00:17:05.080 And so some games aren't playable, and some games are.
00:17:09.200 Some games improve as you play them.
00:17:11.000 And the game that the communists played was a degenerating game.
00:17:14.980 And that meant that there was something wrong with it.
00:17:17.000 And that meant that at least that way of looking at the world wasn't as good as any other
00:17:21.840 way.
00:17:22.960 Which is interesting, because at least it suggests that there's one way of looking at the world
00:17:26.580 that isn't as good as another way, right?
00:17:29.200 And that's a bothersome fact in some sense if you're a moral relativist.
00:17:34.300 Because as soon as there's any evidence that one game isn't as good as another, you've got
00:17:38.320 some evidence that some games are better than others.
00:17:41.640 And so then you have some evidence that there's a rank order, let's say, of games.
00:17:48.000 When Solzhenitsyn wrote about the Nazi Holocaust, he talked about the Nuremberg Trials.
00:17:57.240 And he believed that the Nuremberg Trials were among the most important events of the 20th century.
00:18:03.540 It's a similar, he's making a similar case.
00:18:06.720 And the reason he believed that was because the conclusion of the Nuremberg Trials, you
00:18:15.560 have to think about whether or not you believe this, because there's a cost to believing it
00:18:19.400 and there's a cost to not believing it.
00:18:21.040 So there's no scot-free way out of this conundrum.
00:18:24.700 The conclusion of the Nuremberg Trials was that there were some things that you could not
00:18:30.640 do if you were human without being subject to moral sanction, regardless of your cultural
00:18:37.860 milieu, right?
00:18:39.100 Those were what have come to be known as crimes against humanity.
00:18:42.320 So the Nuremberg Trials made it axiomatic that some things were wrong independently of
00:18:52.920 the moral structure within which you were raised.
00:18:58.240 Or another way of thinking about it was that no matter what moral structure
00:19:04.560 you were raised within, there was something common across all of them
00:19:10.920 that would make the sorts of things that the Nazis did wrong by any reasonable standard.
00:19:18.300 Now you don't have to believe that, but if you don't believe that, then that puts you
00:19:23.040 in an awkward position with regards to the Holocaust, say.
00:19:28.720 Well, that's what I mean by a cost.
00:19:31.820 It's like you either admit that there's something wrong, or you admit that there wasn't anything
00:19:37.640 wrong with the Holocaust, and that it was just an arbitrary cultural decision, one of many
00:19:43.940 such arbitrary cultural decisions, and of no more distinction than any other.
00:19:51.580 And that seems like a, well, it seems like a conclusion that in the main we're not willing
00:19:58.620 to draw, so maybe there is a difference between wrong and right, and that's worth thinking
00:20:08.080 about, and then if, and maybe that difference is real, whatever real might mean.
00:20:15.900 Now, let's think about the problem of how to act.
00:20:21.220 Think about it biologically, because that's what I tried to do when I was writing Maps of
00:20:25.520 Meaning, and in Twelve Rules for Life, there's also a biological approach.
00:20:31.400 I mean, I drew on a lot of religious stories when I wrote both those books, but, which is
00:20:37.800 a strange thing to do if you're also trying to think biologically, because those two things
00:20:42.140 don't necessarily exist harmoniously.
00:20:47.480 You know, the more strict scientific types, Sam Harris being a good example, aren't comfortable
00:20:54.780 with religious presuppositions.
00:20:58.840 And so the attempt to use a physical standpoint derived from physics and a standpoint derived
00:21:06.380 from biology, say, social science for that matter, and to ally that with religious presuppositions
00:21:12.460 is an awkward marriage.
00:21:15.060 But it can be done.
00:21:18.600 I should tell you a story about Jean Piaget.
00:21:20.840 Jean Piaget was probably the greatest child developmental psychologist of the 20th century.
00:21:26.560 He invented the field.
00:21:29.580 He had a messianic crisis when he was a young man, and was tormented by the contradiction
00:21:36.640 between science and religion, and decided that he was going to devote his entire life to
00:21:41.580 rectifying that.
00:21:42.800 And I actually think he managed it to a large degree.
00:21:45.760 You know, it's not generally how Piaget is discussed.
00:21:49.040 When we discuss great people, we generally don't discuss them in their full peculiarity.
00:21:56.540 You know, you remember a while back, Elon Musk was sort of savaged for smoking, you know,
00:22:02.720 one one-thousandth of a joint on Joe Rogan.
00:22:07.140 And I thought it was so comical, because Elon Musk is a very strange person.
00:22:10.780 Now, obviously, you have to be a strange person to build an electric car, and then shoot
00:22:16.940 it on your own rocket out into space.
00:22:20.160 Right?
00:22:21.700 Seriously.
00:22:22.700 Like...
00:22:23.140 Right?
00:22:28.240 Either one of those things is really strange.
00:22:30.720 But the joint probability of doing both is way below zero.
00:22:34.760 So, so I thought it was comical that people went after Musk, because I thought, yes, well,
00:22:41.540 we like our insane geniuses, predictable and normal.
00:22:45.760 It's like...
00:22:47.140 Anyways, Piaget did do a very good job of reconciling religion with science, although people don't
00:22:57.080 really know that, because that isn't how they read Piaget.
00:22:59.760 And I think that's partly because they're afraid of the...
00:23:03.540 If you read someone who's really a genius, the depth of their genius will frighten you.
00:23:09.240 And so you'll only read down into their work till you hit what you can't abide, and then
00:23:15.380 you'll stop.
00:23:16.720 And that's the case with Freud, and it's the case with Jung, for sure, because if you read
00:23:21.780 Jung and you have any sense, you're terrified instantly.
00:23:24.400 And then it's also the case with Piaget.
00:23:29.800 And so, I'll weave in what it was that he discovered.
00:23:34.700 Okay, so, we're going to think about this biologically.
00:23:38.040 So let's say, well, one of the problems you have is what the world is made out of.
00:23:42.480 Now, one of the things that's kind of interesting about human beings is that we really didn't
00:23:45.980 care about that very much for a very long period of time.
00:23:49.580 Right?
00:23:49.980 I mean, we didn't invent science until about 500 years ago.
00:23:54.180 Now, you can argue about that, and you could say that the precursors for the scientific viewpoint
00:23:58.840 were laid down by the Greeks.
00:24:00.760 And then we could say, okay, well, that was 2,500 years ago, or 3,000 years ago.
00:24:05.000 But who cares?
00:24:06.120 If you're thinking biologically, the difference between 500 years and 3,000 years is, that's
00:24:10.840 no difference at all, you know?
00:24:12.340 Because human beings, in their current form, are 150,000 years old.
00:24:18.080 That's the estimate.
00:24:19.260 There's been creatures basically identical to us genetically for 150,000 years.
00:24:24.820 And we diverged from chimpanzees 7 million years ago.
00:24:28.700 And so, you know, and so we were vaguely human for 7 million years.
00:24:32.800 And so, in the span of, let's say, 2 million years, just to, you know, give the later proto-humans
00:24:39.800 their advantage, the difference between 500 years and 3,000 years is completely trivial.
00:24:46.320 The point is, is that we didn't invent anything approximating science, and we weren't concerned
00:24:51.060 about the structure of the material world in any objective sense until, like, yesterday.
00:24:56.780 And really, and we managed, we managed to survive without it.
00:25:02.080 And then, of course, there's the 3.5 billion years of biological evolution that preceded even
00:25:09.940 the emergence of human beings that existed in the absolute absence of anything approximating
00:25:15.640 a scientific perspective, which indicates that, well, A, that that perspective appears
00:25:21.840 to not be strictly necessary from the purpose, from the perspective of survival, and, and B,
00:25:30.440 that, well, that something else, something else was occurring to provide us with the knowledge
00:25:35.840 that we needed during all of that time.
00:25:39.260 And so, when I read Nietzsche decades ago, he talked about how philosophy emerged.
00:25:50.700 So, he was an analyst of philosophers.
00:25:54.080 And it was Nietzsche's idea that what the typical philosopher, perhaps even including him, but perhaps
00:26:04.980 not him, produced was an unconscious recapitulation of their own knowledge, that they felt that what
00:26:12.940 they were doing was coming up with a rationalistic account of the structure of behavior, let's
00:26:17.860 say, but really what they were doing was noticing how they acted, and then describing that and
00:26:24.900 providing it with a rationale.
00:26:27.420 And so, it was bottom up, not top down.
00:26:31.660 So, we might think about that.
00:26:33.960 We might think about that, because one of the things we do know is that however it was
00:26:39.000 that creatures, animals, us, figured out how to act, over the millions of years that they
00:26:45.880 figured out how to act, it was bottom up, not top down.
00:26:50.920 The simpler the animal, the less capable it is of thinking.
00:26:55.060 And so, if you go back far enough, and you don't have to go back that far, you get creatures
00:26:59.140 that can't think at all.
00:27:01.100 Like, there's lots of complex, multi-celled organisms that don't have much of a nervous
00:27:06.020 system, and they don't think.
00:27:07.680 They just act.
00:27:08.560 They just exist.
00:27:10.700 And so, whatever they're doing isn't a consequence of thinking.
00:27:13.180 It's a consequence, the fact that they know what they're doing is a consequence of something
00:27:16.820 other than thinking.
00:27:18.620 And even more sophisticated animals, mammals, let's say, act and don't think.
00:27:27.220 And so, but they act in sophisticated ways.
00:27:32.340 And so, I was reading some of the ethologists about the same time.
00:27:35.700 Ethologists are scientists who study animal behavior.
00:27:39.660 They're not like laboratory behaviorists.
00:27:41.840 They don't put the animals in a controlled environment and do experiments on them.
00:27:46.740 They just use naturalistic observations.
00:27:49.360 So, you can imagine, Jane Goodall was an ethologist, for example, and she studied chimpanzees.
00:27:54.360 This guy named Konrad Lorenz in Scandinavia, he studied geese and other animals, dogs as well.
00:28:02.040 And so, these are people who just, and Dian Fossey, I think it was Fossey that, now, was she
00:28:07.220 gorillas, I think it was gorillas.
00:28:10.120 Yeah, yeah.
00:28:10.980 And so, you go out and watch the animals.
00:28:13.380 And, you know, Goodall, when she was watching chimpanzees, what she'd do is, well, she'd look
00:28:18.620 for regularities, but she also found herself telling stories.
00:28:22.140 You know, so you read Goodall's accounts of chimpanzees, you get stories about what the
00:28:25.880 chimpanzees are like.
00:28:26.740 And she gave them all names.
00:28:27.860 And you can see the personalities of the chimpanzees emerging in her accounts.
00:28:34.240 And that's kind of interesting, because what it suggests is that if you're looking for how
00:28:39.060 a set of creatures, how a set of creatures interact, you tend to look at them as if their
00:28:49.240 personalities acting something out, acting out patterns.
00:28:53.800 Say, like, wolf pack.
00:28:55.640 If you watch wolf pack for a while, you're going to see patterns, regular patterns of
00:29:01.080 behavior that characterize the interactions between the wolf pack.
00:29:04.340 Has to be that case, right?
00:29:05.560 Because otherwise, the wolves couldn't predict one another.
00:29:07.660 If there was no regularity, there'd be no predictability.
00:29:10.620 There'd just be chaos.
00:29:11.980 And so, what happens is the wolves settle into predictable patterns of behavior.
00:29:16.400 The chimpanzees settle into predictable patterns of behavior.
00:29:19.080 And then you can tell stories about those predictable patterns of behavior.
00:29:25.000 And you might even be able to derive rules.
00:29:26.980 You might say, well, it's as if the chimpanzees are acting out this rule.
00:29:30.560 So here's a rule that you might act out if you were a wolf.
00:29:34.080 So let's say you're a male wolf, and there's another male wolf around, and you decide that
00:29:40.120 you're going to have a dominance dispute, right?
00:29:43.320 And so you puff up your fur, and you look rough and tough, and you bare your teeth, and
00:29:47.100 you growl, and you threaten, and you terrify.
00:29:52.620 And maybe you even fight to some degree, but not too much, because you don't want to damage
00:29:57.100 each other.
00:29:58.760 Because the person, the wolf that you're fighting with, you might need tomorrow to bring
00:30:03.460 down a moose with, right?
00:30:06.500 So that's an interesting thing to consider, because it constitutes a limit on the kind
00:30:12.380 of aggression that's allowable in the pursuit of dominance.
00:30:17.760 Anyways, two wolves go at it, and usually what happens is one wolf decides it's not worth
00:30:23.020 the risk, and he rolls over and presents his throat to the victor.
00:30:28.420 And that basically means something like, well, I'm useless and weak,
00:30:34.700 like a prey animal, and you can tear out my throat if you so choose.
00:30:40.540 And the dominant wolf acts out something like, well, I know you're useless and weak, but I
00:30:47.760 might need you to haul down a moose tomorrow, so despite the fact that you're not good for
00:30:52.260 anything, you might as well get up, and we'll get on with it.
00:30:55.200 Now, obviously, the wolves aren't thinking that, but that's how they act.
00:31:00.200 And so if you're watching, and you can think, and you describe how they act, that might be
00:31:05.080 how you describe it.
00:31:06.360 And so I just described it in words.
00:31:09.340 I used what would be approximately the description of a rule.
00:31:14.840 The rule would be, if you're a wolf pack, don't kill each other, because you need the whole
00:31:19.340 pack to pull down your prey.
00:31:21.620 Right, right.
00:31:22.320 So there's a constraint on striving.
00:31:26.080 And you can lay it out in a rule-like manner.
00:31:28.520 And I told a little story, and then I derived a rule from it.
00:31:33.100 And that's how an ethic emerged.
00:31:39.460 So, each of us pursues our own motivated behaviors, but then we aggregate together in groups.
00:31:47.220 And the fact that we aggregate together in groups puts certain restrictions on how it
00:31:53.900 is that we manifest our motivated behavior, partly because we depend on each other, like
00:31:58.920 the wolves depend on each other.
00:32:00.920 And so what that means is that there's an ethic that emerges out of the interactions between
00:32:05.940 those motivations, and that ethic manifests itself in a certain kind of patterned behavior.
00:32:10.880 Now, that's about as far as it goes for wolves, because they don't sit around at the campfire
00:32:15.800 at night and talk about how it is that wolves interact with one another.
00:32:21.780 But human beings do that, see, because we've got this next level of cognitive ability, imaginative
00:32:28.380 ability.
00:32:28.860 It's not just pure abstract thought.
00:32:32.400 It's not like we've got motivation like other animals, and then we've got the patterns of
00:32:38.480 behavior that emerge as a consequence of the interaction of those motivated behaviors.
00:32:44.360 We then have the ability to watch ourselves like we would watch a wolf pack, right?
00:32:49.940 Because an anthropologist, an ethologist can go out in the wild and watch a wolf pack or
00:32:55.380 watch a chimpanzee troop and take notes, say, look, this is what's, here are the patterns.
00:33:00.160 I can write down the patterns.
00:33:01.780 And when I write down the patterns, then I'm telling a story, and out of the story, I can
00:33:05.660 abstract rules.
00:33:07.080 But the chimps aren't doing that.
00:33:08.580 The wolves aren't doing that.
00:33:09.500 But the human being can do that.
00:33:12.360 And then the human being can say, well, it's as if wolves follow these rules, which they
00:33:17.140 don't, because they're wolves, and they don't follow rules.
00:33:19.900 But they do manifest patterns.
00:33:22.120 And the patterns emerged as a consequence of something approximating an evolutionary competition.
00:33:28.820 Now, it's interesting, it's very interesting to think this through.
00:33:33.180 So, Franz de Waal, who's another ethologist, who studies chimpanzees, he's written a bunch
00:33:40.480 of great books.
00:33:41.340 You might be interested in them if you're interested in this sort of thing.
00:33:44.160 He's been interested in the emergence of morality among chimpanzees.
00:33:49.800 And so, one of the things de Waal has found was that, and this is partly why the post-modernists
00:33:59.500 who insist that the fundamental motivator for the structure of hierarchies is power, are
00:34:06.680 wrong.
00:34:08.180 Okay, so I'm going to repeat that.
00:34:10.220 They're wrong!
00:34:11.180 Okay, it's seriously important, because one of the claims that's tearing our culture apart
00:34:20.000 is that our hierarchical structure, which would be our entire culture, is an oppressive patriarchy,
00:34:29.080 and that the people who occupy the positions, especially the higher positions, let's say, in
00:34:35.600 the hierarchy, got there because they exercise power.
00:34:40.340 That's a fundamental claim of Foucault, for example, who's an absolutely reprehensible
00:34:45.460 scholar on about 15 different dimensions.
00:34:48.620 But one of them, one of them is his narrow-minded insistence that power is the only justification
00:34:56.760 for hierarchical position.
00:34:58.520 Now, I think that's patently absurd in the case of human beings, but I won't go there to
00:35:06.500 begin with, because I might as well make the case that it's patently absurd in the case
00:35:10.660 of animals, where it's more plausibly about raw power.
00:35:17.160 De Waal has shown, well, you got the example of the wolf already, it's like, you just can't
00:35:22.200 be savaging up your pack mates, because all you do is demolish the structure within which
00:35:30.440 you live, and that's true for wolves, you know, it's true for chimpanzees.
00:35:38.480 So De Waal has shown in a sequence of observations, and chimps are a good test case, because we're
00:35:47.560 rather chimp-like.
00:35:50.020 Now, well, they're our closest biological relatives, right?
00:35:53.400 So it's only a seven million year gap between us and the common ancestor.
00:35:57.100 So if you're going to look at any animal, and derive conclusions about the basal motivations
00:36:04.620 of human beings, well then, it would be chimps that you would look at.
00:36:07.840 You might look at bonobos too, although the differences between bonobos and chimps have been
00:36:12.400 exaggerated by all appearances.
00:36:14.580 But in any case, you can get to the top if you're the roughest, toughest, meanest, most
00:36:20.460 physically powerful and cruelest chimpanzee.
00:36:24.740 But your rule is unstable, and there's a reason for that.
00:36:30.520 And the reason is that chimpanzees, the males, females as well, but the fundamental hierarchy
00:36:36.500 for chimps is male.
00:36:39.120 Chimpanzees are competitive, but they're also cooperative.
00:36:43.540 They spend a lot of time grooming each other, and they actually have long-term friendships.
00:36:49.060 And it turns out that two chimpanzees that are three-quarters as tough as the toughest
00:36:57.240 chimpanzee make a pretty vicious set of opponents if you get a little bit too tyrannical.
00:37:04.020 And the problem with pure power for the chimpanzees is that if you're a chimp that climbs to
00:37:09.880 the top as a consequence of nothing but psychopathic dominance, you have no allies and no friends
00:37:15.860 because you don't engage in any cooperative behavior, you'll have an off day and your two
00:37:20.700 lieutenants, each of whom is three-quarters as strong as you, will band together and tear
00:37:26.480 you into bits.
00:37:28.760 It's an unstable medium-to-long-term solution.
00:37:34.220 Power.
00:37:36.140 And I see no reason to assume that exactly the same thing isn't the case with human beings.
00:37:42.480 If it doesn't work for chimpanzees and it doesn't work for wolves, why in the world would it work
00:37:48.340 for human beings?
00:37:49.080 Especially when you think that our hierarchies are way more complicated than the hierarchies
00:37:54.760 that characterize animals.
00:37:56.820 And that we do all sorts of things with our hierarchies that animals don't do.
00:38:02.360 I mean, most of you have jobs, and most of those jobs are...
00:38:09.080 While you're performing most of those jobs, you're not doing things that any animal would
00:38:15.440 have done, would do now, or that any human being would have done 300 years ago.
00:38:20.220 Now, there's certain exceptions to that.
00:38:22.380 Some of you might farm, but even if you're doing that, you're not doing it in a way that
00:38:26.000 people did it 300 years ago.
00:38:28.480 Whatever your job is, and it's likely to be very abstract, you're pursuing some goal that
00:38:35.160 is fairly distant from a fundamental biological motivation.
00:38:39.300 It's very abstract.
00:38:40.760 And you're actually creating something of at least sufficient value so that other people
00:38:45.340 will trade with you for it.
00:38:47.980 And the idea that you're going to move up in your hierarchy of production by exercising
00:38:55.160 something like psychopathic power is...
00:38:58.480 I think that's insane to think that.
00:39:02.100 You know, and I think about plumbers, for example.
00:39:07.180 I've used this joke before, but I like it, so I'm going to use it again.
00:39:10.420 And it's like, you know, if the postmodernists were right, this is how you'd hire a plumber.
00:39:16.560 First of all, the plumbers would have all banded together, although they'd be fighting
00:39:19.640 within themselves because, of course, there's nothing but power.
00:39:22.380 But they would have all banded together.
00:39:24.300 And they'd go from door to door, and they'd basically knock on your door and tell you that
00:39:29.100 if you didn't hire them, there was going to be hell to pay.
00:39:32.080 It'd be like mafia plumbers.
00:39:33.800 And then you'd pick the roughest, toughest plumber who's best at exercising power to
00:39:41.500 not fix your pipes, because why would he be interested in fixing your pipes?
00:39:46.880 He'd just be interested in pretending to fix your pipes and taking your money, which is
00:39:51.640 how you'd expect a mafioso plumber to operate.
00:39:55.260 But that isn't how it works, right?
00:39:56.840 I mean, if you hire virtually anyone in your day-to-day life, you suss out their reputation
00:40:03.400 for their ability to provide the service that they promised to provide, which is predicated
00:40:09.220 on their skill, their technical skill, their skill as workmen and craftsmen.
00:40:15.660 It also is predicated on their ability to generally work in some reasonably interactive way with their
00:40:21.900 employees, because otherwise they don't stay in business for that long.
00:40:25.120 And also to treat their customers with a modicum of reciprocal respect, or their reputation
00:40:32.500 is savaged immediately and they fail.
00:40:36.540 And so, this patriarchal structure that we hypothetically all occupy, that's fundamentally
00:40:43.900 predicated on oppression and power, breaks apart into something approximating cooperative
00:40:52.120 competence when you look at any of its sub-components.
00:40:56.600 You know, you don't have power-hungry massage therapists, you know?
00:41:03.600 Well, I don't understand how the entire structure can be a power-dominated oppressive patriarchy
00:41:11.540 if none of the sub-components are.
00:41:14.100 It doesn't make any sense to me.
00:41:15.620 And I'm not saying at all that, within large hierarchies, there isn't room for relatively
00:41:21.720 psychopathic people to now and then manage a certain amount of success.
00:41:26.540 You know that as a hierarchical structure grows in size, pure power-politics players
00:41:33.240 have a higher probability of success.
00:41:36.520 But all that happens to large organizations, when they get completely dominated by people
00:41:41.820 who are using power and politics as a means to climb to the top, is that they precipitously
00:41:46.840 collapse.
00:41:47.900 Because you end up with no one who can actually perform the function that the structure is
00:41:52.900 supposed to perform, and a whole plethora of people who are good at doing nothing but
00:41:57.660 playing politics.
00:41:58.980 And then the company dies.
00:42:00.240 And the thing is, companies die all the time.
00:42:02.840 The typical Fortune 500 company lasts 30 years.
00:42:07.560 And the reason for that often is that it's functional for a while, and then it gets corrupted
00:42:12.860 by internal politics, or it gets blind, or it can't keep up, or whatever it is, and that's
00:42:18.460 the end of it.
00:42:19.080 It falls apart, and, you know, breaks into its constituent elements, and they reformulate.
00:42:24.360 Maybe there's some new companies that come out of it, but it's not a permanent structure
00:42:28.620 by any stretch of the imagination.
00:42:30.320 And so, I don't think there's any reason whatsoever to assume, well, we can think this through.
00:42:37.140 Are our hierarchical structures based on power, or are they based on competence?
00:42:43.820 And look, the answer is, well, they're a bit based on power, you know?
00:42:47.380 Because you shouldn't be naive about this.
00:42:51.160 Hierarchical structures can, and do, become corrupt in some ways.
00:42:57.160 And we have to keep an eye on that all the time.
00:42:59.620 But my sense is that, by and large, things work.
00:43:07.520 You know, I mean, here we are, we're sitting in this hall, and there's 3,000 of us, and
00:43:13.480 you all got here, because your cars worked, and the highways work, sort of.
00:43:20.220 I mean, I was in a traffic jam for like an hour and a half getting here.
00:43:26.740 But, you know, that's what happens when you drive somewhere at 5 o'clock.
00:43:30.400 You know, you can predict that.
00:43:32.320 So they worked, and the lights are on, and they seem to work, and that's no trivial thing.
00:43:37.840 And it looks like the big TV screen is working.
00:43:40.640 And you're all sitting here peacefully, not engaging in an overt power struggle, as far
00:43:48.180 as I can tell, even though, even though there's a hierarchical arrangement of seats, right?
00:43:53.860 You accept that, you know?
00:43:56.680 And you're all, and you accept the fact that, you know, some of you paid a premium for sitting
00:44:02.220 closer, and that that seems to be a reasonable way of distributing somewhat scarce resources.
00:44:07.700 And, you know, everyone in here is behaving peacefully, and, like, all this seems to work
00:44:12.920 quite nicely.
00:44:13.860 And so, since this all works, it's very difficult for me to understand how it's not predicated,
00:44:19.580 at least in large part, on competence.
00:44:22.240 And then, of course, there's the other evidence, which is, well, you know, a lot of you are older
00:44:27.980 than the average person would have been when he or she died throughout the history of the
00:44:33.180 entire human race.
00:44:34.820 And so that seems to be working out pretty good for you.
00:44:37.700 And none of you are skinny, quite the contrary.
00:44:45.660 So there are more obese people in the world now than there are starving people, by quite
00:44:50.460 a substantial margin.
00:44:51.840 And I think that that's worth quite the celebration, even though it's perhaps not exactly optimized.
00:44:57.380 But it's definitely better than the alternative.
00:44:59.980 And, you know, by and large, we're moderately healthy.
00:45:04.520 And most women don't die in childbirth like they used to.
00:45:09.260 And most children don't die within a year of being born like they used to, not very long ago.
00:45:16.020 And so, so things seem to be working.
00:45:23.880 Not so bad, given what a bloody catastrophe life is, and how difficult it is to get things
00:45:30.200 to work, and how fragile people are, and how short-lived we are intrinsically, and how vulnerable
00:45:37.500 we are to suffering.
00:45:38.720 We're not doing so bad, you know?
00:45:41.760 We must be doing something right.
00:45:44.660 And the...
00:45:46.440 Well, this is the thing.
00:45:50.240 This is the thing that, I would say, enrages me about, let's call them universities.
00:45:57.320 And...
00:45:58.680 It's the intellectual and moral laziness of the resentful victimization power post-modern
00:46:18.660 doctrine.
00:46:20.020 Because, look, it doesn't take a genius to figure out that the history of humanity is
00:46:26.320 a bloody nightmare.
00:46:28.440 And, you know, to lay that only at the feet of human beings, I think, is a mistake.
00:46:36.240 Because life in the natural world is a bloody nightmare.
00:46:40.980 And so you can't just blame that on people.
00:46:43.740 You know, like, you can blame it on people to some degree.
00:46:46.240 We take a bad thing, or we take a deadly thing, or a dangerous thing, and we can make it worse.
00:46:52.160 And we definitely do that.
00:46:53.900 But it's not like the problem is simple to begin with.
00:46:56.700 Because it's clearly the case that at every moment, the planet is trying to kill you.
00:47:03.920 And eventually, it will succeed.
00:47:08.880 And so, that's a big problem.
00:47:11.640 And it's reasonable enough for us to group together, and to try to stop that.
00:47:17.840 And while we're doing that, we cause some trouble.
00:47:21.640 You know, we pollute things, and we break things, and we don't play as good an iterating game as we might.
00:47:27.200 But, you know, for creatures that only last eight decades, and have a lot of trouble during all eight of those, we don't do that badly.
00:47:39.360 And we've got our reasons for being as perverse and useless as we are.
00:47:43.520 And all of that shouldn't be laid at our feet, even though we need to take responsibility for it.
00:47:49.940 And so, one of the things that I find extremely disturbing about the emergent hypothesis that our culture is nothing but an oppressive patriarchy,
00:48:01.720 is that it's ungrateful beyond comprehension.
00:48:06.080 And worse, if you adopt that stance, well, it bestows a certain amount of benefit on you.
00:48:15.100 I mean, first of all, if you posit that there's nothing in your culture except what's corrupt,
00:48:21.420 then that immediately elevates you above all of it as that sort of critic, right?
00:48:25.700 It's like, well, I'm taking the high ground, and I'm looking at the entire history of humanity, and calling it unacceptable.
00:48:35.760 It's like, well, that's fine, I suppose, except it isn't necessarily obvious that you could do any better, and you probably haven't.
00:48:43.280 You know, and maybe you could try, although then you might say, well, all that trying does is make it worse.
00:48:49.040 And participating in that terror, doing anything of any utility, is just playing the power game and making it worse.
00:48:56.800 It's like, yeah, well, you know, pardon me for being a bit skeptical about your motivation.
00:49:02.060 I don't think that moral virtue is that easily come by.
00:49:05.600 And I think that there are difficult problems to solve, and that you could contend with the world and try to solve them,
00:49:11.360 instead of complaining about the fact that they haven't been solved properly.
00:49:14.580 And that if you did manage to solve a problem or two, which you might, then maybe you'd have the right to stand up as a more global critic.
00:49:23.760 But until then, well, what's rule six?
00:49:26.380 You should get your house in order before you criticize the world.
00:49:29.860 And that's no simple thing.
00:49:34.100 But it's, so the moral virtue thing, that's annoying, because I don't think that moral virtue should be unearned.
00:49:40.720 You know, this is kind of why I'm an admirer of the doctrine of original sin.
00:49:45.840 You know, I think you're stuck with some concept of original sin, no matter how you think.
00:49:51.100 I mean, I see that in the atheistic environmentalist types, and I know all environmentalist types aren't atheistic,
00:49:56.520 and I know that all environmentalist types aren't reprehensible either.
00:49:59.700 But there's a nice selection of atheist environmental types who are reprehensible.
00:50:03.900 And they're, yeah, yeah, yeah, they're usually the ones that say that the world would be better off if there were fewer people on it.
00:50:11.380 Which is not a sentiment that I find particularly attractive.
00:50:15.720 And it's also one, if, when I meet someone who utters that, I always think two things.
00:50:23.840 One is, well, why are you still here, then?
00:50:27.100 One, that's the first one.
00:50:29.540 And the second one is, just exactly who is it that you're planning on getting rid of,
00:50:36.720 and how exactly would you go about doing that if you had the opportunity?
00:50:42.100 So, I don't find that, I don't find that particularly admirable.
00:50:49.000 And I don't think that there's any sympathy in it.
00:50:51.840 You know, because I think that we should have some sympathy for ourselves.
00:50:55.740 So, that's rule two, is you should treat yourself like you're someone responsible for helping.
00:51:01.260 Or rule three, which is you should make friends with people who want the best for you.
00:51:05.820 It's like, corrupt and useless as you are, you do have a hard lot.
00:51:12.760 You know, and so there's some reason for sympathy.
00:51:16.180 And to say that human beings are nothing but despoilers of the planet,
00:51:19.900 is to miss half the story, which is as fast as we're trying to kill Mother Nature,
00:51:26.580 she is returning the favor in spades.
00:51:30.200 So, that doesn't mean that we should be foolish about it.
00:51:34.620 You know, and a certain balance has to be attained.
00:51:37.640 But it's nice to look at both sides of the equation before you lay out too much judgment.
00:51:43.280 You know, back in the late 1800s, Thomas Huxley, who was Aldous Huxley's grandfather
00:51:50.300 and also a great defender of Darwin, was commissioned by the British Parliament
00:51:55.160 to do a study on oceanic resources, because there was some concern at that point
00:52:00.300 that human beings might be overfishing.
00:52:03.660 And his conclusion was, the oceans were so bountiful and plentiful
00:52:08.300 and human beings so comparatively small in number and powerless
00:52:13.880 that there wasn't a hope in hell that we would ever be able to put a dent
00:52:18.020 in the vast resources of the ocean.
00:52:22.360 That was only at the beginning of the 20th century.
00:52:26.160 You know, we didn't get to the point where we could harvest on an industrial scale
00:52:31.140 until after World War II.
00:52:32.720 You know, that's only about 70 years ago.
00:52:35.720 And so it's only been 50 years, say, maybe a bit more, 60 years, maybe since 1960,
00:52:41.220 that we woke up to the fact that some of our actions had now become powerful enough
00:52:45.280 to be considered on a global scale.
00:52:47.460 And that's within the lifetime.
00:52:48.880 That's within my lifetime.
00:52:50.220 That's within the lifetime of a single person.
00:52:52.420 I don't think we've done such a bad job of waking up since then
00:52:55.560 and starting to understand that, you know, maybe we have a larger-scale moral obligation
00:53:01.120 than we realized before that's proportionate to our technological power.
00:53:05.180 That's another place where a little bit of sympathy might be in order.
00:53:08.760 You know, and I mean, L.A. is a lot cleaner than it used to be in terms of its air quality
00:53:13.280 and so is London.
00:53:14.320 And, you know, we've made a lot of progress, I would say, in a relatively short time
00:53:18.340 trying to clean up the mess that we made when we were trying not to die painfully young.
00:53:23.460 You know, so, all right, well, back to this ethic.
00:53:28.720 So, I'll tell you another study that I really like that's really cool.
00:53:33.880 So, this is a study that was done by a guy named Jaak Panksepp, and he did it with rats.
00:53:39.820 And now and then, I love reading animal experimental work.
00:53:43.800 If you want to study psychology, that's what you should read.
00:53:47.400 You should just read animal experimental work because those people,
00:53:50.140 it's not all good, but some of it's really good.
00:53:52.940 And some of the people who've done it, they were real scientists.
00:53:55.760 And this guy Panksepp, he's one of them.
00:53:58.240 There's another guy named Jeffrey Gray, who wrote a book called The Neuropsychology of Anxiety,
00:54:02.180 which will take you like eight months to read.
00:54:04.020 It's a really hard book, but it's brilliant.
00:54:07.100 Anyways, Panksepp has done a really good job of laying out the fundamental motivational systems
00:54:12.180 and their biology.
00:54:13.220 I'll just tell you a brief story about that.
00:54:15.180 The American Psychological Association just came out with its guidelines for the treatment of men and boys.
00:54:20.340 It's actually, it's like, it's not, they're not guidelines.
00:54:24.020 They're not for treatment.
00:54:25.480 And they're certainly not for the improvement of the health of men and boys.
00:54:28.560 It's an absolutely reprehensible ideological screed on how psychologists have to think politically
00:54:35.320 so they won't be punished by those who accredit them.
00:54:38.900 That's it fundamentally.
00:54:39.960 But one of the claims they made, they made two claims that are beyond comprehension to me.
00:54:46.800 The first one was that aggression is socialized.
00:54:53.020 So that's the first claim.
00:54:54.580 And the second claim is that boys are socialized into aggression by men.
00:55:00.820 Okay, so let's look at those.
00:55:02.300 This relates to Panksepp.
00:55:04.700 So Panksepp outlined a bunch of biological circuits.
00:55:07.640 So human beings, like other mammals, but also like even more archaic animals, evolutionarily speaking,
00:55:18.560 have a variety of fundamental biological circuits.
00:55:21.300 So I can tell you some of them.
00:55:23.000 Some of them are obvious and some of them are somewhat surprising.
00:55:26.420 You have a circuit for pain.
00:55:28.240 Well, there's no surprise.
00:55:30.020 You have a circuit for anxiety.
00:55:32.900 That's a circuit.
00:55:32.900 You know, it's a metaphor, obviously.
00:55:34.820 You have a biological system that mediates anxiety.
00:55:37.840 All of you have it.
00:55:38.940 Animals have it.
00:55:41.440 You have one for something called incentive reward.
00:55:45.520 And that's what moves you towards valued goals.
00:55:48.180 That's basically associated with positive emotion.
00:55:50.860 You have one that satisfies you when you consume.
00:55:53.840 That's a consummatory reward system.
00:55:57.860 Lust.
00:55:58.780 That's another one.
00:56:00.020 Thirst.
00:56:00.660 Hunger.
00:56:02.160 Play.
00:56:03.260 That's cool.
00:56:04.120 That's a fundamental circuit.
00:56:05.420 Panksepp discovered that.
00:56:08.420 There's a circuit for care.
00:56:11.400 That's another one.
00:56:12.500 And there's a circuit for rage.
00:56:14.400 That's not all of them, but that's a good start.
00:56:17.340 Rage.
00:56:18.360 It's there.
00:56:19.280 Right from the beginning.
00:56:21.040 If you do facial expression coding analysis of infants.
00:56:25.400 Say an infant learns to recognize its mother.
00:56:27.940 And then that's around nine months often or even earlier than that.
00:56:32.040 So a person comes into the baby's room and the baby starts to cry because that person isn't the mother.
00:56:39.900 And you think, oh, the baby's sad.
00:56:41.680 It's like, no, the baby isn't sad.
00:56:43.200 The baby is angry.
00:56:45.360 That's why it's turning red.
00:56:46.860 And if you do facial expression coding, it's like the baby is actually, the baby is cursing internally.
00:56:53.220 And you already know this because you know that, like, a two-year-old isn't much older than a baby.
00:57:01.080 And two-year-olds have temper tantrums.
00:57:02.840 And it's not because they're sad, as you can tell perfectly well, if you just watch a two-year-old have a temper tantrum.
00:57:09.020 It's clear that they're completely possessed by rage.
00:57:13.320 And that rage circuit, it's exactly right.
00:57:15.680 That rage circuit is active even before the fear circuit is active.
00:57:19.420 It's activated very early.
00:57:20.880 And one of the things that's the case is that some children are much more aggressive than others.
00:57:26.740 Right from the beginning.
00:57:28.080 And most of those are boys.
00:57:30.600 And not all of them, but most of them.
00:57:32.380 And if you take one-year-olds, two-year-olds, three-year-olds, four-year-olds, all the way up to 16-year-olds, and you group them together, kids who don't know each other.
00:57:42.460 And then you count aggressive actions.
00:57:45.740 The two-year-olds are by far the most aggressive bunch.
00:57:48.740 They kick and hit and bite and steal.
00:57:51.120 Not all of them.
00:57:52.280 But a good chunk of them.
00:57:54.400 And so, 16-year-olds don't do that when you put them together.
00:57:58.080 And look, you're all here, and you're not 16, you're like 30, and you're not doing any of that.
00:58:03.280 And so, two-year-olds, man, they're aggressive little monsters.
00:58:06.680 And it's okay because they're small and soft, and what the hell can they do, you know?
00:58:11.320 So, it's not like, you know, it's not blood warfare among two-year-olds.
00:58:17.980 But it's just because they don't have the sophistication and the weapons.
00:58:20.860 They've got the motivation.
00:58:24.620 And it's really true for a subset of them.
00:58:29.020 So, about 5% of two-year-old boys are hyper-aggressive.
00:58:33.560 But what's cool is that almost all of them are socialized into civilized behavior by the time they're four.
00:58:41.860 And you can define civilized behavior quite nicely using a Piagetian definition: other children will play with them.
00:58:50.320 And that's rule five, by the way.
00:58:52.100 Okay, don't let your children do anything that makes you dislike them.
00:58:55.460 What's your job if you're a parent?
00:58:57.140 Your job as a parent is to socialize your children so that by the age of three, other children will play with them.
00:59:02.760 Because that means they've learned how to engage in reciprocal social interactions.
00:59:06.760 And that'll start to spiral upward.
00:59:08.500 So, if your child is acceptable as a playmate by the age of three, age of four is the limit, by the way.
00:59:14.620 So, you better have it together by then.
00:59:16.240 Otherwise, they get alienated and isolated, and they don't make friends, and then they never recover from that.
00:59:21.680 It's not, it's not good.
00:59:23.520 It's not good.
00:59:24.660 So, most kids are socialized, even the hyper-aggressive ones are socialized into acceptable playmates by the age of four.
00:59:32.500 And it's men, in large part, that do that.
00:59:37.620 Now, what's the evidence for that?
00:59:39.480 Well, there's lots of evidence for it.
00:59:41.020 The evidence among mammals is the use of rough-and-tumble play, for example, among males and their offspring, in order to socialize and civilize them.
00:59:50.220 But the evidence among human beings is that, where do you get aggressive teenagers?
00:59:58.080 What sort of families produce aggressive teenagers?
01:00:03.260 Fatherless families.
01:00:05.760 Right, so let's think about that with regards to what the APA said.
01:00:09.780 Okay?
01:00:10.240 Because what they said was the opposite of the truth.
01:00:13.300 It wasn't even a lie.
01:00:14.700 It wasn't even wrong.
01:00:16.000 It was, they took the truth and then they claimed the opposite.
01:00:20.220 On two dimensions.
01:00:21.520 Number one, aggression isn't socialized.
01:00:25.420 Civilization is socialized.
01:00:28.280 Number two, if men were responsible for the creation of aggression among boys,
01:00:34.320 then fatherless families would produce boys that were more peaceful.
01:00:38.300 And they don't.
01:00:40.200 So what the hell, fundamentally?
01:00:42.600 All right, so you have the behavioral pattern that characterizes civilized behavior among wolves or civilized behavior among chimpanzees or civilized behavior among rats.
01:01:04.340 I'll tell you the rest of the rat story.
01:01:05.740 So, rats like to engage in rough and tumble play.
01:01:10.160 Especially juveniles.
01:01:11.300 Especially juvenile males.
01:01:12.800 And we know they like it because they'll work to do it.
01:01:15.800 So, if a rat has been somewhere and he got to play,
01:01:21.120 and then you put him back there and you make him press a bar to open the door so he can go into where he played,
01:01:25.740 he'll press that bar like mad.
01:01:26.960 And so, that's how you infer motivation among rats.
01:01:30.940 Will the rat work to play?
01:01:32.820 You will.
01:01:34.060 Right?
01:01:35.100 Which is why you'll pay for tickets to a basketball game.
01:01:38.400 Hell, you'll work to watch people play.
01:01:43.380 Right?
01:01:44.120 That's very interesting.
01:01:45.560 It's absolutely, like, that's so important.
01:01:48.520 It's almost unbelievable that you will do that.
01:01:51.780 I mean, because the question is, well, why?
01:01:54.620 That's the first question.
01:01:55.680 Well, seriously, it's like, what the hell?
01:01:57.920 Why are you watching a bunch of pituitary cases, you know, bang a ball so that they can throw it through a hoop?
01:02:03.520 And, like, you'll pay outrageous sums of money to do that.
01:02:09.020 It's like, you're not even throwing the damn ball.
01:02:11.540 You're just, like, watching them do it.
01:02:13.400 Can I have the ball?
01:02:14.520 No.
01:02:15.160 Well, I paid $200 for this ticket.
01:02:17.620 They don't get the ball.
01:02:19.500 It's like, you don't get to play.
01:02:20.740 Okay, well, I'll just watch.
01:02:22.380 It's like, so that's how important it is.
01:02:25.200 That's how important play is.
01:02:26.500 That you'll work to make money to buy tickets to watch people play.
01:02:31.820 Right.
01:02:32.320 Man, that's crazy.
01:02:34.260 So, but it's fundamental.
01:02:36.820 And you think about how much of our entertainment is associated with exactly that.
01:02:41.060 Think, well, are we just doing something random?
01:02:43.180 Or is there something important going on there?
01:02:45.180 I mean, it doesn't look that important.
01:02:46.600 You know, in some sense, it's easy to be cynical about it.
01:02:49.580 But it's foolish because it's crucially important.
01:02:52.460 The fact that we're so wired up to admire fair play that we'll pay for it.
01:02:58.980 We'll pay for the right just to watch it vicariously.
01:03:02.540 That's a testament to the degree to which we're civilized and social.
01:03:07.360 Because a game is something that's civilized and social.
01:03:10.120 So, back to Piaget.
01:03:11.860 Piaget believed that most socialization occurred as a consequence of integrating these underlying motivational systems into iterable games.
01:03:21.900 And that's what you're doing with your kids, right?
01:03:23.440 When you socialize your kids, when you teach them how to take turns, when you teach them how to play a game,
01:03:30.300 then what you're doing is that you're socializing them into iterated, reciprocal interactions with other people.
01:03:39.120 And that's the fundamental aspect of the ethic.
01:03:43.240 That's how an ethic emerges from the bottom up.
01:03:45.880 It emerges in games.
01:03:47.080 That was Piaget's fundamental observation.
01:03:49.220 Much better than the Freudian hypothesis.
01:03:51.060 The Freudian hypothesis was basically that human beings learn to inhibit their aggression.
01:03:56.140 That's not Piaget's model.
01:03:57.580 Piaget's model is, no, no, no.
01:03:59.200 You integrate the aggression into a higher order game.
01:04:01.980 And you know this because you want an athlete, a good athlete, is someone who's got that aggression.
01:04:06.340 But it's directed, right?
01:04:07.580 It's not random.
01:04:08.700 In fact, if you see an athlete manifest random aggression, you're not happy about that, right?
01:04:14.300 Maybe you admire them because they're really competitive.
01:04:17.240 They're highly skilled, really competitive.
01:04:18.940 They're goal-driven.
01:04:20.060 They want to win.
01:04:21.060 But it's all focused on the goal.
01:04:23.140 And, you know, they're cooperating with their teammates.
01:04:25.960 And they're not hogging the ball.
01:04:27.400 They're also facilitating the development of their teammates.
01:04:30.820 And they don't cheat.
01:04:32.180 They don't break the rules.
01:04:33.140 So they're cooperating even with their adversaries.
01:04:36.140 Because they're all playing basketball.
01:04:37.620 And they're all playing by the rules.
01:04:38.800 It's all cooperative.
01:04:40.580 That's an ethic.
01:04:41.440 It's an emergent ethic.
01:04:42.700 And, you know, what we do in our lives, outside of the game, is very game-like.
01:04:49.660 We engage in a cooperative and competitive ethic.
01:04:53.260 And we know the good players.
01:04:55.260 We know the good sports.
01:04:56.360 And that's the bottom-up emergence of an ethic.
01:05:00.060 And it's also the solution to the postmodern conundrum, as far as I'm concerned, or part of the solution.
01:05:05.840 Because the postmodernists claim that there's an infinite way of looking at the world.
01:05:11.680 And there is.
01:05:13.280 And that no way of looking at the world is better than any other way, which is wrong.
01:05:17.800 The first part's right, because the world's very complicated.
01:05:20.140 The second part's wrong.
01:05:21.820 There aren't that many ways of playing a game properly.
01:05:24.920 And you can tell that, too, because even if you watch kids play, if you go out and you watch your kids play,
01:05:29.720 you can tell which kids are good sports and which ones aren't.
01:05:33.080 Right?
01:05:33.640 That's easy.
01:05:34.380 And the kids can tell, too.
01:05:35.840 And so, you can tell, and it doesn't matter what the game is.
01:05:39.360 And if you're a good sport, it's the same across games.
01:05:42.760 Which is really another indication of the emergence of a transcendent ethic.
01:05:48.260 Right?
01:05:48.960 The concept good sportsman is independent of the game.
01:05:55.440 Right?
01:05:56.380 And it's very much what it is that you want.
01:05:59.440 It's the dawning of the behaviors that you want from someone who's sophisticated and reciprocal in their day-to-day life.
01:06:06.820 And it's by no means arbitrary.
01:06:09.700 It's, in fact, very tightly constrained.
01:06:13.060 Back to the rats.
01:06:13.900 So, rats like to wrestle.
01:06:16.620 And so, you can get them to work to wrestle.
01:06:18.580 And so, they work.
01:06:19.240 And then the door opens and out they go.
01:06:20.880 And there's rat A and rat B.
01:06:22.600 And rat A is 10% bigger than rat B.
01:06:25.300 And so, they have a little wrestling match.
01:06:26.820 And rat A, who's the bigger rat, the more powerful rat, pins rat B.
01:06:30.320 And then the post-modern social scientists observing derive their conclusion.
01:06:38.680 Power produces dominance.
01:06:41.200 And they publish it.
01:06:42.600 But they're stupid.
01:06:44.380 And so, so then you have a smart scientist and he thinks, wait a second.
01:06:49.080 You don't just wrestle once if you're a rat.
01:06:51.300 You wrestle a bunch of times.
01:06:53.740 And so, what happens if you get the rats to wrestle more than once?
01:06:57.820 Because that's the issue.
01:06:58.940 See, this is the difference between a psychopath and someone who's not psychopathic.
01:07:04.320 So, the thing about being a psychopath is that it's all for you and none for someone else.
01:07:09.380 But that only works once or twice.
01:07:12.620 You know, depending on the naivety of your target.
01:07:15.880 And if you're a psychopath, you have to move from place to place.
01:07:20.120 Because people catch on to your lack of ability to play fair.
01:07:24.640 And they don't get fooled again.
01:07:26.140 And so, psychopaths drift from place to place.
01:07:30.080 And they can find their mark and their victim.
01:07:32.880 But they can't do it in any stable manner.
01:07:35.340 It's not a good solution.
01:07:38.240 Because people track reputation.
01:07:40.560 And we're really good at that.
01:07:41.940 In fact, there are evolutionary psychologists who believe that we have a module for tracking reputation.
01:07:47.140 A cheating detection module.
01:07:49.640 And that you don't have to activate that very often before people will remember forever that you did.
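The reputation-tracking idea described above can be sketched as a toy simulation: a cheater who takes everything wins big once per partner, then gets refused, while a fair player keeps getting games. All the numbers here (payoffs, pool size, round counts) are illustrative assumptions, not figures from the lecture.

```python
import random

def lifetime_payoff(cheats, partners=10, rounds=30):
    """Total payoff for one agent playing up to `rounds` games against a
    pool of `partners` who each remember being cheated and refuse to replay."""
    burned = set()          # partners who have caught this agent cheating
    total = 0
    for _ in range(rounds):
        candidates = [p for p in range(partners) if p not in burned]
        if not candidates:
            break           # no one will play with the cheater anymore
        partner = random.choice(candidates)
        if cheats:
            total += 10     # grab the whole pie once...
            burned.add(partner)  # ...and that partner never plays again
        else:
            total += 5      # fair split, partner stays willing
    return total

random.seed(0)
fair = lifetime_payoff(cheats=False)
cheat = lifetime_payoff(cheats=True)
print(fair, cheat)  # -> 150 100: cheating caps out once every mark is burned
```

The defector's total plateaus as soon as the whole pool remembers being burned, which is the "works once or twice, then you have to move" dynamic in miniature.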
01:07:55.900 So, you got the big rat and the little rat.
01:08:00.600 Now, the little rat's lost.
01:08:02.180 Now, you separate them.
01:08:03.700 And then you let them come back and play another day.
01:08:06.280 And they still want to.
01:08:07.460 Even the little rat wants to.
01:08:09.120 And so, then the little rat goes out in the play field.
01:08:11.160 And the big rat's there.
01:08:12.080 And the little rat has to ask the big rat to play.
01:08:17.060 And you think, well, how does a rat do that?
01:08:18.940 It's like, well, you've been in dog parks.
01:08:20.560 And you see what dogs do.
01:08:22.080 Maybe you've had dogs.
01:08:23.180 You know how dogs act when they want to play.
01:08:24.940 They sort of bounce around.
01:08:26.360 They kind of look at you.
01:08:27.480 You know, and they bounce.
01:08:29.100 And if you have any sense, you all laugh.
01:08:30.940 Because you recognize that.
01:08:32.760 That's, there's something playful about that.
01:08:34.640 It's like, I'm looking at you, signaling intent.
01:08:37.620 And this is sort of like a little dance.
01:08:39.600 And the idea is, why don't you come and do this with me?
01:08:42.280 And say it's a big dog.
01:08:44.040 And the dog does that.
01:08:45.940 And if you have any sense, you kind of whack it on the side of the head.
01:08:47.580 And it's not really that you hit it.
01:08:49.620 And the dog knows.
01:08:50.320 And it sort of bites you.
01:08:52.020 But not really.
01:08:52.800 Because, you know, the dog knows how to play.
01:08:55.380 If it had owners that, let's say, weren't psychologists.
01:08:59.400 Because the worst dog I ever met was the dog that a psychologist owned.
01:09:03.920 And it couldn't play at all.
01:09:05.100 It was just, it was just, it would just bite you.
01:09:08.040 It's like, what?
01:09:10.420 This is your dog?
01:09:11.900 You're going to have clients?
01:09:15.000 Not good.
01:09:20.060 So anyways, the little rat has to ask the big rat to play.
01:09:23.240 Because the big rat's cool.
01:09:24.240 Because he won once.
01:09:25.200 So he gets to sit there and, like, look cynical.
01:09:27.720 And pretend that he's ignoring the little rat.
01:09:29.460 But he wants to play too.
01:09:30.940 So the little rat bounces around.
01:09:32.300 And the big rat thinks.
01:09:33.580 He tosses away his cigarette butt.
01:09:35.360 And gets off the street corner.
01:09:36.820 And plays.
01:09:39.660 And because he's 10% bigger, he could pin the little rat.
01:09:42.520 And maybe he does that.
01:09:43.540 But if you pair them repeatedly,
01:09:46.800 and the big rat doesn't let the little rat win 30% of the time,
01:09:50.440 the little rat stops asking him to play.
01:09:52.560 And that is so cool.
01:09:55.660 I read that.
01:09:56.560 It just blew me away.
01:09:58.480 I thought, really?
01:09:59.260 You're kidding.
01:09:59.840 You're kidding.
01:10:00.640 There's an emergent ethic of fair play in wrestling rats.
01:10:07.120 That's how fundamental that ethic is.
01:10:10.300 I mean, they're rats for Christ's sake.
01:10:12.160 It's not like they're the world's most ethical animal.
01:10:14.520 Right?
01:10:15.520 They're rats.
01:10:16.980 And still, they have to play fair.
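The iterated-play finding above — the big rat loses its play partner unless it handicaps itself about 30% of the time — can be sketched as a small simulation. The 30% threshold comes from the lecture; the bout counts and quitting rule are illustrative assumptions.

```python
import random

def bouts_played(handicap, threshold=0.30, max_bouts=100):
    """Number of bouts before the little rat quits asking to play.
    `handicap` is the probability the big rat lets the little rat win."""
    wins = 0
    for bout in range(1, max_bouts + 1):
        if random.random() < handicap:
            wins += 1  # big rat lets the little rat pin it this bout
        # after a warm-up of 10 bouts, the little rat tracks its win rate
        # and stops initiating play if it falls below the threshold
        if bout >= 10 and wins / bout < threshold:
            return bout
    return max_bouts

random.seed(42)
print(bouts_played(handicap=0.4))   # generous big rat
print(bouts_played(handicap=0.05))  # stingy big rat
```

A big rat that never handicaps itself loses its playmate at the first check; one that always concedes keeps the game going indefinitely, which is the emergent fair-play constraint in miniature.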
01:10:20.660 So.
01:10:23.980 What was I doing in Maps of Meaning and in 12 Rules for Life?
01:10:27.980 Well, we're motivated creatures.
01:10:32.540 Each of those motivations has to pursue its own goal.
01:10:35.460 Because otherwise we die.
01:10:37.440 But then those motivations have to get integrated.
01:10:39.600 They have to get integrated within you.
01:10:41.360 You kind of got that more or less under control by the time you're about two.
01:10:45.900 You start to become something approximating the integration of your motivations.
01:10:52.700 But that hasn't happened socially yet.
01:10:55.640 That happens between two and four.
01:10:57.700 You have to integrate that integrated structure with everyone else doing the same thing.
01:11:02.540 And how do you do that?
01:11:04.100 Well, you do that by learning how to play.
01:11:06.240 How to play fair.
01:11:07.320 Fair.
01:11:08.500 And there's a reciprocity that goes along with it.
01:11:10.800 I'll give you another example.
01:11:13.500 So here's a game that behavioral economists have been playing with people.
01:11:18.560 So here's the game.
01:11:19.640 So you take person A and you take person B.
01:11:21.560 And you say to person A, look, I'm going to give you $100.
01:11:24.060 And you have to share it with the person next to you.
01:11:26.660 And you can offer them a fraction of it.
01:11:29.020 Whatever fraction you want.
01:11:30.780 And if they take it, you get the $100 and you split it with them.
01:11:34.320 But if they refuse it, you guys get nothing.
01:11:38.080 So you say, okay, well, here's $100.
01:11:39.840 How much are you going to offer the person next to you?
01:11:42.420 Now, a classical economist would say, well, you offer them a buck.
01:11:47.040 Well, why?
01:11:48.460 Well, because you're trying to maximize your own self-interest.
01:11:52.820 Because that's what you are.
01:11:54.040 You're going to maximize your own self-interest.
01:11:56.700 And why would they say no?
01:11:58.440 It's like they get a buck.
01:11:59.860 And that's better than nothing.
01:12:02.080 Why would they say no?
01:12:04.100 So you say, well, will you take a dollar?
01:12:08.740 And the person next to you says, well, they think things they won't say.
01:12:15.780 But what they say is no.
01:12:19.240 But if you actually run this experiment, that isn't what people do.
01:12:23.840 They offer 50%.
01:12:25.760 Cross-culturally, it's somewhere between 40 and 60%.
01:12:29.080 But it's basically 50.
01:12:32.380 50-50.
01:12:33.740 And then the other person accepts it.
01:12:35.560 And you might think, well, that's because you don't need the money.
01:12:39.340 It's like, let's say, you're starving.
01:12:40.960 And you have the $100.
01:12:42.460 And you offer your compatriot here a dollar.
01:12:45.060 And like, he needs something to eat.
01:12:46.140 So it's like, yeah, I'll take the dollar.
01:12:47.340 It's like, no.
01:12:48.380 The poor people are more likely to tell you to go to hell.
01:12:51.660 Because along with having no money, they'd like not to have no pride.
01:12:58.920 Right?
01:12:59.820 You've got to hang on to something.
01:13:01.880 And so you can at least still tell someone to go to hell when they deserve it.
01:13:06.020 And that's something.
01:13:07.700 And so even in the simple behavioral economist games, you get emergent evidence for automatic reciprocity.
01:13:14.660 Well, and why would you offer 50% to the stranger?
01:13:18.800 And the answer is, well, because it's a good rule of thumb.
01:13:21.880 You know, if you want people to play with you across time, if you want to engage in as many games as possible,
01:13:28.480 and you want to participate in the ethic properly,
01:13:30.960 then what you're aiming at is something approximating reciprocity.
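The ultimatum game just described can be sketched in a few lines. The responder's rejection threshold here is an illustrative assumption; the lecture's point is that real responders reject lowball offers, and real proposers cross-culturally offer roughly 40-60% of the pot.

```python
def ultimatum(offer, threshold=0.30, pot=100):
    """Return (proposer_payoff, responder_payoff) for a single round.
    The responder rejects any offer below `threshold` of the pot,
    in which case both players get nothing."""
    if offer < threshold * pot:
        return (0, 0)            # the insulting offer is refused
    return (pot - offer, offer)  # deal accepted, pot is split

# Classical-economics prediction: offer $1 and the responder "should" accept.
print(ultimatum(1))    # -> (0, 0): the responder tells you to go to hell
print(ultimatum(50))   # -> (50, 50): the roughly 50-50 split people actually make
```

The rational-maximizer offer of $1 leaves both players with nothing, while the fair 50-50 split goes through, which is the emergent evidence for automatic reciprocity the lecture points to.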
01:13:34.180 The rules that I outlined, you know, in 12 Rules for Life,
01:13:40.760 that's what they're based on.
01:13:43.240 They're based on this observation.
01:13:44.700 So this is how it works, is that we have these motivational systems,
01:13:48.380 and we get them together, maybe around the age of two.
01:13:51.880 And then we integrate them with the motivational systems of others.
01:13:55.140 We do that by producing games.
01:13:57.540 And the games get more and more sophisticated, more multiplicitous.
01:14:00.420 But there's a game ethic that emerges out of that.
01:14:04.180 And it governs reciprocity.
01:14:05.940 And the game ethic is something like, well, we're all equally valuable players,
01:14:10.500 and everybody deserves a fair shot.
01:14:12.240 And you've got to bring your best skills to the table,
01:14:14.500 but you have to play fair, and you have to play reciprocally.
01:14:16.880 And that works across time.
01:14:19.460 And then you get an archetype that emerges out of that,
01:14:22.620 which is something like the fair player.
01:14:25.900 It's a variant of the archetype of the hero.
01:14:28.780 And that's the thing that people are driven to imitate and admire.
01:14:33.540 And this isn't, this is hardly a mystery.
01:14:35.800 I mean, look at, you think, why do we pay professional athletes
01:14:39.820 the inordinate sums that we pay them?
01:14:42.780 Well, it's possible it's because they're modeling something of crucial importance
01:14:47.280 in a dramatic manner.
01:14:49.240 You know, like a basketball game, a professional basketball game is a very complex drama.
01:14:56.400 And the drama is skill, but also the ethic of fair play.
01:15:02.880 And we're all observing that because we bloody well need to understand it.
01:15:06.820 And then you see that echoed.
01:15:08.340 So you get the emergent ethic, and that's the pattern of behavior.
01:15:12.140 And then you get representations of that.
01:15:14.080 You get abstract representations of it.
01:15:15.880 That's what we're doing when we play these abstract games.
01:15:18.360 It's also what we do when we tell stories, and we make movies, and we present plays.
01:15:23.500 And we do everything that's dramatic, is that we take a look at that behavioral pattern
01:15:28.400 that's emerged, that works, and then we try to represent it,
01:15:31.300 because we need to understand it.
01:15:32.780 It's like, well, what are you like if you're the hero of a story?
01:15:35.760 And what are you like if you're the anti-hero?
01:15:37.880 You're the cheat.
01:15:38.660 You're the crook.
01:15:39.280 You're the deceiver.
01:15:40.080 You're the liar.
01:15:40.720 You're the poor sport.
01:15:43.180 You know, you're the person who fixes the game.
01:15:45.540 And we abstract out those patterns, and then we try to imitate them.
01:15:50.200 And that's what drives our knowledge of the ethic of behavior.
01:15:56.720 We elaborate that up.
01:15:57.900 We elaborate that up into drama, and we elaborate it into ritual,
01:16:01.680 and we elaborate it into religious representations.
01:16:04.880 We have stories that emerge of people who play properly, and people who play improperly.
01:16:11.080 And then you abstract out the essence of that.
01:16:14.380 You say, well, what's it like to play properly?
01:16:16.580 If you were doing it perfectly, what would you be like?
01:16:19.140 Well, you'd be a target for emulation and imitation, rather than a target for rejection.
01:16:24.020 You get the archetype of the hero and the adversary come out of that.
01:16:27.260 It's a completely different way of constructing a knowledge system.
01:16:30.080 And I think that what I've been trying to do in Maps of Meaning,
01:16:34.000 and trying to do in Twelve Rules for Life, is to lay that out.
01:16:36.560 Say, look, there is a system of knowledge that underlies the ethic that we need to adopt
01:16:44.280 to conduct ourselves properly in life.
01:16:46.380 And I'll close with this.
01:16:47.640 Here's one way of conceptualizing it, I think.
01:16:51.480 And this has to do with rule seven.
01:16:57.780 Do what is meaningful, not what is expedient.
01:17:00.080 It also has to do with rule eight, which is, tell the truth, or at least don't lie.
01:17:06.060 Well, it's not so easy to tell the truth, but you can tell when you're lying,
01:17:09.300 and you can stop doing that.
01:17:10.520 And then maybe you approximate the truth across time by doing that.
01:17:16.780 You need to take care of yourself.
01:17:21.300 But then you think, well, what does that mean exactly?
01:17:23.500 It's not an Ayn Rand sort of individualism.
01:17:26.360 And the reason for that is, what self do you mean?
01:17:28.920 There isn't just you.
01:17:31.380 There's you now, and you tomorrow, and you next week.
01:17:34.240 You next month, and you next year, and you in a decade.
01:17:37.420 And you when you're old.
01:17:38.620 You're a community.
01:17:39.900 And you're a creature, unlike other creatures, in that you're aware of your own duration.
01:17:44.980 And so if you're going to treat yourself properly, you're already playing a game.
01:17:48.220 It's a game you play with yourself across time.
01:17:50.540 It's an iterated game.
01:17:51.620 And you better play fair.
01:17:53.160 Because otherwise, the you that is to come is going to suffer for it.
01:17:58.320 And so there's no individual you outside of the community.
01:18:02.760 Because just because of the way you are, you're already a community.
01:18:06.360 And so you have to take that into account.
01:18:08.200 But when you tell your child to play fair and be a good sport, you know, you're doing that
01:18:13.600 partly because then they're better for their teammates.
01:18:15.960 But you're doing it mostly because then they're better for themselves across time, right?
01:18:20.260 And you have an intimation of that.
01:18:22.660 So you've got to treat yourself fairly.
01:18:24.020 So the game's on with you.
01:18:26.780 But then, you know, you, the self, well, what is that?
01:18:29.420 You think, what, are you more important than your family?
01:18:31.680 That isn't how people act.
01:18:33.400 You know, if you take the typical parent, you say, well, look, I'm going to shoot you
01:18:36.860 or I'm going to shoot your son.
01:18:39.340 Well, the typical parent is going to say, well, take me.
01:18:42.640 Well, you think, well, which is you then exactly?
01:18:45.240 Is it, you know, if what's you is what's most dear to you, what you identify with more,
01:18:50.920 well, then you identify more with your kids and a tremendous amount with your parents and
01:18:56.860 your siblings, it's like you're extended.
01:18:59.180 You've got the community of yourself across time, but then you've got your family and then
01:19:02.940 you've got your family across time, right?
01:19:05.620 And then, well, there's that, but then your family's nested inside a community and you've
01:19:09.460 got the same issue there.
01:19:10.580 You've got the community now, all those other people and the community across time.
01:19:14.620 So what's the ethic of fair play?
01:19:16.600 It's like, do what's good for you in a way that's good for the future you, in a
01:19:20.880 way that's good for your family and your future family, in a way that's good for the
01:19:24.980 community and the future community.
01:19:26.900 All of that, that unbelievably complex sequence of nested games, that's what you have to play
01:19:33.020 properly.
01:19:34.120 And you admire people who do play it properly and you can see it even though you might
01:19:38.120 not be able to articulate it, right?
01:19:41.000 It's too complicated.
01:19:43.060 And I would say it's a matter of balancing out things properly.
01:19:46.940 And then to close, I would say this is something to think about too.
01:19:50.460 You know, we have an instinct for meaning.
01:19:52.500 You know, you can get meaningfully engaged in things.
01:19:55.180 You might get meaningfully engaged in a basketball game, for example.
01:19:58.520 You might ask yourself, why?
01:20:00.020 And I shed some light on that.
01:20:01.840 You're watching play progress properly, right?
01:20:05.240 Maybe you follow the team across an entire championship because you don't give a damn
01:20:09.020 about one basket successfully managed.
01:20:14.220 And you don't give a damn about one game successfully won.
01:20:17.040 You want the sequence of games to be won successfully so the championship manifests itself.
01:20:23.620 And you're deeply embedded in that, you know?
01:20:25.420 You might track all the statistics because you care who wins the game that's iterated across
01:20:30.220 time.
01:20:30.600 And you're dramatizing that.
01:20:32.260 You're playing that out.
01:20:33.560 Even though you can't articulate it, you don't really understand it.
01:20:36.460 It still grips you like it should because that is what should grip you.
01:20:42.200 Say, it grips you and it's engaging and it's meaningful.
01:20:46.540 And the reason for that, okay, imagine this.
01:20:49.080 This is the final part of this.
01:20:52.220 Well, let's say you manage to take care of yourself and your family and your community
01:20:55.760 all at the same time.
01:20:57.020 Let's say that you imagine you're successful as a consequence of that.
01:21:00.160 Like, in the broad sense, right?
01:21:01.880 You've got what it takes to live a rich life.
01:21:04.580 Well, what's that going to do to you biologically?
01:21:07.500 It's like, well, isn't that going to make you an attractive partner?
01:21:10.720 Isn't that going to make you an attractive mate?
01:21:13.640 And isn't it the case that if you're an attractive mate, because you act out that pattern,
01:21:17.840 you're more likely to succeed from an evolutionary perspective?
01:21:22.480 You're more likely to reproduce.
01:21:24.820 Your children are more likely to live.
01:21:26.260 And so, what that implies across time is that not only does that ethic exist,
01:21:31.820 and not only do we recognize it, but that the degree to which you're able to manifest
01:21:37.080 that self in your life is associated directly with your long-term success,
01:21:47.640 speaking in the biological time frame.
01:21:52.940 Okay, so imagine that.
01:21:54.960 Then imagine this.
01:21:55.900 Imagine that you have instincts that guide you towards that pattern.
01:21:59.260 I could tell you two of those.
01:22:00.640 One is your conscience.
01:22:03.080 And it says, you've fallen off the path.
01:22:06.200 Well, what's the path?
01:22:07.260 Well, the path is that balanced harmony.
01:22:09.560 And you've deviated from it.
01:22:10.720 You've betrayed yourself.
01:22:11.800 You've cheated yourself.
01:22:13.380 You're not playing the game properly with yourself.
01:22:15.580 And something responds.
01:22:17.080 And so, that's the punishing end of it.
01:22:18.580 And the positive end is, well, if you've got the pattern right, then it's deeply and meaningfully engaging.
01:22:25.480 And that's not an illusion.
01:22:27.480 You know, one of the problems that modern people have is that we think that the sense of meaning is an illusion.
01:22:32.840 Because we think it's arbitrary or constructed.
01:22:35.140 But there's no evidence for that.
01:22:36.560 The evidence is quite the contrary.
01:22:38.660 That that sense of deep, meaningful engagement isn't rational at all.
01:22:42.620 It's way deeper than that.
01:22:44.440 And it signifies that you're where you should be doing what you should be doing.
01:22:50.560 You know, with a bit of a whack from time to time from your conscience.
01:22:54.760 It says, well, walk the straight and narrow path, right?
01:22:58.460 The line between chaos and order.
01:23:01.200 And get everything balanced harmoniously around you.
01:23:03.740 Because that's where you should be and that's what you should be doing.
01:23:06.560 I think that's portrayed in music.
01:23:09.600 You know, music lays out patterns.
01:23:12.180 Layers of patterns.
01:23:13.560 And they're all interacting harmoniously.
01:23:15.420 And you can put yourself in sync with that.
01:23:17.840 You know, physically even.
01:23:19.440 Because you tend to dance to it.
01:23:21.140 And you dance to that and that's meaningful.
01:23:23.120 And it's because you've allied yourself properly with all that multitudinous pattern.
01:23:28.420 And it's a symbolic manner of, what would you say, acting out, being positioned properly in the midst of all that complexity.
01:23:37.660 And that's signified by that sense of meaning.
01:23:40.200 Well, so that's a much better story than a moral relativist story or a nihilist story, a hopeless story.
01:23:48.800 It's like you have a sense of, you have an instinct for meaning, guided by conscience, that puts you in the proper orientation to yourself extended across time.
01:24:00.680 To your family extended across time.
01:24:03.320 And to your community extended across time.
01:24:06.240 And if you attend to that, then you act out things properly.
01:24:10.240 And then, not only do you feel better psychologically.
01:24:12.880 You're facing the obstacles you need to face.
01:24:14.940 You're playing the game properly.
01:24:16.360 That's great psychologically.
01:24:17.840 Gives your life some purpose and some higher meaning and protects you from anxiety and pain.
01:24:22.200 Wonderful.
01:24:23.260 But it's not just psychological.
01:24:24.520 Because if you do that, you're also actually useful.
01:24:29.140 Right?
01:24:29.500 You're taking care of yourself.
01:24:31.220 So you won't die.
01:24:32.340 You're taking care of your family.
01:24:33.700 So they don't suffer unduly.
01:24:35.540 And you're doing that in a manner that actually strengthens the community.
01:24:40.220 Right?
01:24:40.420 You're taking on your role as a sovereign and responsible citizen.
01:24:44.000 And that helps actually.
01:24:45.420 It sets the world right psychologically.
01:24:47.520 But it also sets the world right.
01:24:51.000 So what's so arbitrary about that?
01:24:53.200 I don't see anything arbitrary about that.
01:24:56.760 I think it's long past time that we stopped regarding any of that as arbitrary.
01:25:02.660 It's self-evident.
01:25:04.480 Play the game properly.
01:25:06.580 Right?
01:25:07.120 It's important.
01:25:08.260 Everyone knows it.
01:25:09.540 You do it.
01:25:10.080 Your life is meaningful and worthwhile.
01:25:11.900 And you've got something to wake up for.
01:25:13.780 You've got some anxiety set at bay.
01:25:16.740 At least you've got something worthwhile to do in the face of what you're terrified of.
01:25:21.080 That's something.
01:25:21.820 And then there's important problems to solve.
01:25:24.060 And you're solving them.
01:25:25.740 It's like, well.
01:25:28.340 That's separating the wheat from the chaff.
01:25:30.460 Right?
01:25:31.760 I say, well, there's something to our civilized society that's integral and valuable.
01:25:36.960 And that we need to protect and identify.
01:25:38.840 And then act out and support.
01:25:40.520 And manifest more.
01:25:42.040 And not criticize to death.
01:25:43.780 And leave nothing at all in the ashes.
01:25:46.420 And that's why I was working on Maps of Meaning and 12 Rules for Life.
01:25:51.500 Thank you very much.
01:25:53.500 Going online without ExpressVPN is like not paying attention to the safety demonstration on a flight.
01:26:09.260 Most of the time, you'll probably be fine.
01:26:11.340 But what if one day that weird yellow mask drops down from overhead and you have no idea what to do?
01:26:16.740 In our hyper-connected world, your digital privacy isn't just a luxury.
01:26:20.920 It's a fundamental right.
01:26:22.200 Every time you connect to an unsecured network in a cafe, hotel, or airport, you're essentially broadcasting your personal information to anyone with the technical know-how to intercept it.
01:26:31.480 And let's be clear, it doesn't take a genius hacker to do this.
01:26:34.360 With some off-the-shelf hardware, even a tech-savvy teenager could potentially access your passwords, bank logins, and credit card details.
01:26:42.120 Now, you might think, what's the big deal?
01:26:44.200 Who'd want my data anyway?
01:26:45.740 Well, on the dark web, your personal information could fetch up to $1,000.
01:26:50.140 That's right, there's a whole underground economy built on stolen identities.
01:26:54.400 Enter ExpressVPN.
01:26:56.160 It's like a digital fortress, creating an encrypted tunnel between your device and the internet.
01:27:00.420 Their encryption is so robust that it would take a hacker with a supercomputer over a billion years to crack it.
01:27:06.500 But don't let its power fool you.
01:27:08.320 ExpressVPN is incredibly user-friendly.
01:27:10.660 With just one click, you're protected across all your devices.
01:27:13.680 Phones, laptops, tablets, you name it.
01:27:15.860 That's why I use ExpressVPN whenever I'm traveling or working from a coffee shop.
01:27:20.000 It gives me peace of mind knowing that my research, communications, and personal data are shielded from prying eyes.
01:27:25.720 Secure your online data today by visiting expressvpn.com slash jordan.
01:27:30.720 That's E-X-P-R-E-S-S-V-P-N dot com slash jordan, and you can get an extra three months free.
01:27:37.060 ExpressVPN dot com slash jordan.
01:27:42.740 Starting a business can be tough, but thanks to Shopify, running your online storefront is easier than ever.
01:27:48.800 Shopify is the global commerce platform that helps you sell at every stage of your business.
01:27:52.880 From the launch your online shop stage, all the way to the did we just hit a million orders stage, Shopify is here to help you grow.
01:27:59.680 Our marketing team uses Shopify every day to sell our merchandise, and we love how easy it is to add more items, ship products, and track conversions.
01:28:08.160 With Shopify, customize your online store to your style with flexible templates and powerful tools, alongside an endless list of integrations and third-party apps like on-demand printing, accounting, and chatbots.
01:28:19.220 Shopify helps you turn browsers into buyers with the internet's best converting checkout, up to 36% better compared to other leading e-commerce platforms.
01:28:28.020 No matter how big you want to grow, Shopify gives you everything you need to take control and take your business to the next level.
01:28:33.820 Sign up for a $1 per month trial period at shopify.com slash jbp, all lowercase.
01:28:40.340 Go to shopify.com slash jbp now to grow your business, no matter what stage you're in.
01:28:45.700 That's shopify.com slash jbp.
01:28:48.140 These are the most comfortable chairs we've had.
01:28:53.160 Yeah, geez, wow, look at these chairs.
01:28:54.900 Maybe we'll just stay here.
01:28:56.280 Yeah, this is nice.
01:28:58.720 It's good to be back, brother, right?
01:29:01.140 Yeah, it's good to be back.
01:29:02.800 You feeling good?
01:29:04.360 Yeah, it's good to be here.
01:29:06.100 Thank you all for coming.
01:29:07.140 It's much appreciated.
01:29:08.180 All right, we've got a ton of questions because this is a tech town and they actually know how to use that.
01:29:18.400 They know how to use Slido.
01:29:19.520 That's good.
01:29:20.060 Yeah, it's impressive.
01:29:21.040 I like this first one.
01:29:22.760 Now that you've destroyed Patreon, will the two of you save the internet?
01:29:29.880 No, no, definitely not.
01:29:32.860 There's no saving the internet.
01:29:34.120 I don't know even if we'll save Patreon, you know, because this is part of the not taking the moral high ground too easily.
01:29:45.000 You know, Dave and I, and Harris as well, Dave talked to Sam in particular, but Dave and I talked about what had happened after Patreon banned Carl Benjamin, Sargon of Akkad, and we thought, well, that's just not acceptable.
01:30:00.260 And so we decided to stop using the platform, and I'd been working on a program that had some parallel functions to Patreon that we were producing for a slightly different reason.
01:30:15.880 And we started talking about the possibility that it could be repurposed as a Patreon alternative, which is something that we're working diligently on, and which will happen.
01:30:26.800 But I would never claim that the solution to the problem is going to be straightforward.
01:30:33.380 So, we'll see what we can manage.
01:30:38.800 I think Dave and I will start using the platform in about a month and a half, something like that.
01:30:44.720 But we don't know.
01:30:46.940 Maybe Patreon's time has come and gone already, because there's already ways of supporting individual contributors.
01:30:52.080 Maybe it's not possible to aggregate a tremendous number of creative people together in a single platform without attracting undue negative attention and the immediate probability of this kind of censorial action.
01:31:06.840 We don't know, right?
01:31:08.020 Because these technologies are all so new.
01:31:09.880 You know, it looks like YouTube and Facebook, and Twitter as well, are wrestling now with the conundrum that they've been presented with, which is, well, there's a billion opinions, and some of them are rough.
01:31:28.300 What do we do about the rough opinions?
01:31:30.480 And the answer might be, like, in the frontier heyday of YouTube, when it wasn't run by a giant and increasingly rigid corporation, then anything went.
01:31:42.740 But as soon as it's corporatized and systematized, then that can no longer work.
01:31:49.220 I mean, you think, you really think the internet could have ever started if it would have been regulated to begin with?
01:31:54.520 I mean, just think about it.
01:31:55.420 What drove the internet?
01:31:58.020 Porn.
01:31:58.460 I mean, no, but I'm dead serious, right?
01:32:01.820 It's that that's what happened, is that the internet exploded over its first, say, 10 years of development, of public development, because it was just abs...
01:32:10.160 It was like, what was the percentage of porn on the internet for the first 10 years?
01:32:14.360 It was some ridiculous amount.
01:32:16.520 I didn't know that there was porn on the internet.
01:32:18.080 No.
01:32:23.500 Oh, yeah.
01:32:24.840 See, everyone learns something at a Jordan Peterson event.
01:32:27.700 You see what I'm saying?
01:32:29.400 Yeah.
01:32:31.260 So, I don't know if it's a solvable problem, but we're going to try...
01:32:35.920 I think at least we're going to try to build a platform where, if you're on it, we're not going to kick you off arbitrarily.
01:32:43.100 I think we can promise that.
01:32:44.700 See, let me just say one more thing about that.
01:32:53.080 So, I went and talked to one of the guys who runs one of these big social communication systems, you know, social networks.
01:33:03.800 And they'd faced a lot of pressure because ISIS was using the platform to recruit.
01:33:10.800 Okay, so you've got to ask yourself, if you're a free speech absolutist, you're going to let...
01:33:18.260 If you're in a war, are you going to let the enemies of your state recruit with your platform?
01:33:22.060 And if the answer is no, which seems like a reasonable answer, well, then you've already opened the door, right?
01:33:32.000 That is to say, there are things that could be censored.
01:33:37.100 Then the question is, as soon as you open that door, like, it's Pandora's box.
01:33:42.160 It's like, okay, well, what about what's right next to that?
01:33:45.900 Maybe that would be, like, certain forms of communication about radical religious fundamentalism.
01:33:52.700 Because that's right next door.
01:33:54.380 And then there's something right next door to that.
01:33:56.580 It's like, those lines are really hard to draw.
01:33:59.240 So, it isn't obvious to me how large-scale social networks are going to solve this problem.
01:34:04.280 And we're wrestling with that, trying to come up with a solution that's reasonable.
01:34:09.720 So far, it's something like, you'll be able to stay on the platform unless you break an American law.
01:34:19.020 An actual law, right?
01:34:21.680 And I don't even know if that's a good enough guideline, but, well, it might have to be.
01:34:26.420 But, we'll see.
01:34:27.480 And as for fixing the net, well, that's a no-go, that is.
01:34:36.780 So, we offer whatever content we can manage, and people seem to enjoy it.
01:34:41.660 And that's working pretty well, but that's about all we can manage.
01:34:45.340 And maybe that's all that's manageable.
01:34:50.480 And there's porn out there, so who knew?
01:34:57.520 The APA recently defined traditional masculinity as toxic, conflating virtuous and harmful aspects.
01:35:05.320 How can we reverse this dangerous ideological progression?
01:35:08.440 Oh, the purpose was to conflate the virtuous and the harmful.
01:35:13.940 That's the purpose of the document, is to blur the distinction between the two.
01:35:17.120 And I think the real reason, I'm writing an article about this right now.
01:35:21.400 I think the actual reason was to damage the virtuous.
01:35:25.500 Because that's the best way of doing it, right?
01:35:27.360 If you want to damage the virtuous, what you do is you conflate it with the harmful.
01:35:31.280 It doesn't hurt the harmful any to have it conflated with the virtuous.
01:35:35.640 So, let's say your real motivation is like a seriously deep resentment and spite.
01:35:42.060 And that the best way to manifest that is to take the virtuous and conflate it with the pathological.
01:35:48.860 And to take down the virtuous.
01:35:50.520 And maybe that's because you can't bloody manage it on your own.
01:35:54.100 That's what it looks like to me.
01:35:56.900 Welcome to Season 2, Episode 38 of the Jordan B. Peterson Podcast.
01:36:06.380 I'm Mikhaila Peterson, dad's daughter and collaborator.
01:36:10.640 Girl who only eats meat and does extended fasting.
01:36:14.440 Believe it or not, my brother is basically completely normal.
01:36:17.760 He's probably adopted.
01:36:19.320 Or he takes after my mom's side.
01:36:20.880 I haven't figured out which.
01:36:22.940 Today's episode is a 12 Rules for Life lecture, recorded in San Jose on January 22, 2019.
01:36:29.660 I've named it Resolving the Science-Religion Problem.
01:36:32.880 Last week, I walked on the beach for the first time with no pain in almost 12 years.
01:36:38.800 I had my ankle replacement re-replaced last January.
01:36:41.920 And that is kind of a hellish surgery.
01:36:44.280 I was awake during it too because I opted out of general anesthesia.
01:36:48.040 About a quarter of the way into the surgery, I regretted that.
01:36:51.020 But hey, I survived.
01:36:52.820 Anyway, the surgeon fixed the problem.
01:36:54.940 The problem being a crooked ankle replacement installed 10 years prior.
01:36:59.540 Isn't that crazy?
01:37:00.260 So after about half a mile of walking in the sand, I literally cried with appreciation.
01:37:05.960 It was beautiful.
01:37:07.100 I had overlaid my experience of beaches with sadness and hatred and frustration.
01:37:11.340 And that's gone now.
01:37:13.260 It's a hell of a lot easier when you're not in pain.
01:37:16.520 Sometimes the beauty in life is overwhelming.
01:37:18.620 Just wanted to share that with everybody because it was really overwhelming.
01:37:22.140 And it made my day.
01:37:24.580 Exciting news.
01:37:25.620 Dad is launching his very first e-course on December 17th, 2019.
01:37:31.820 It's available for pre-sale currently.
01:37:34.540 A lot of people have been asking us for a more structured and condensed resource where they can learn about personality without needing to spend 30 plus hours watching videos, reading resources, etc.
01:37:45.260 So earlier this year, we recorded a new video series that will be packaged as an online course with eight videos, supplementary materials, including lecture notes, additional reading materials and resources, transcripts, a free license to the Understand Myself Personality Assessment, and an exclusive discussion group.
01:38:02.880 All designed to give you an in-depth look and understanding of your personality.
01:38:07.440 Personality is my favorite topic in psychology.
01:38:10.120 It's worth checking out if you've been intending to learn about personality and want to do it in a concise and structured format.
01:38:16.400 Dad's released a lot of information about personality on YouTube for free, but this is a more concise, structured way of learning and features some information on personality differences between genders.
01:38:25.340 So that's cool.
01:38:26.480 Go to jordanbpeterson.com slash personality to check it out.
01:38:30.740 If you're listening to this podcast before December 17th, 2019, we're currently offering a pre-sale for a 15% discount on the course at $120.
01:38:40.980 If you're interested, this is a great opportunity to get it at a lower price or buy it for someone for Christmas even.
01:38:46.820 Hopefully you find it cool.
01:38:48.480 I did.
01:38:49.080 I sat in on all the lectures, even though I had food poisoning at the time.
01:38:53.120 Imagine getting food poisoning when you only eat one thing.
01:38:56.160 Not ideal, but I still sat through the lectures.
01:38:59.100 They're that good.
01:38:59.720 Check it out at jordanbpeterson.com slash personality.
01:39:12.680 Resolving the science-religion problem.
01:39:15.000 A Jordan B. Peterson 12 Rules for Life lecture.
01:39:18.880 It's been a while since I talked to a large audience.
01:39:22.020 I talked to an audience in Zurich a week ago, but that was in a completely different time zone.
01:39:27.080 So I was just trying to recapitulate in my imagination what it is that I was trying to do with 12 Rules for Life and with my first book, Maps of Meaning.
01:39:43.580 And so I was laying that out, and I think I'll walk through that again.
01:39:46.140 And I kind of get warmed up to the topic.
01:39:49.420 It's a complicated thing to lay out in total.
01:39:57.920 So, the first issue is, I think, that these are issues that everybody knows.
01:40:05.940 But all these issues that I'm going to lay out are related.
01:40:10.460 So, the first issue that everybody knows is that there's a conflict between religion and science.
01:40:16.820 And so, that's a conflict that's torn at the heart of our culture for about 500 years.
01:40:31.140 Especially as scientific progress has become more self-evident.
01:40:37.140 And there's been a lot of really good things about that, obviously.
01:40:41.680 Science has driven a technological revolution.
01:40:44.060 And it's radically improved our standard of living, our material well-being.
01:40:50.160 And it's very difficult not to think of that as a good thing.
01:40:54.500 And one of the things that's really remarkable about that is that it's accelerating by all appearances.
01:41:01.780 It really took off in the late 1800s.
01:41:05.680 And that it's happening everywhere in the world.
01:41:08.500 And so, there's been an unbelievably rapid economic transformation throughout the world.
01:41:14.960 Especially in the last 20 years.
01:41:17.540 So, you know, the bulk of the population in the world now is middle class.
01:41:22.760 And starvation is virtually a thing of the past.
01:41:26.600 Although not entirely.
01:41:27.460 And even extreme privation from a material perspective is declining very rapidly.
01:41:34.100 The UN defines extreme privation as existence on less than $1.90 a day in today's dollars.
01:41:43.360 And that's declined 50% since the year 2000.
01:41:47.280 Which is absolutely phenomenal.
01:41:50.000 And the UN projects that it will be eradicated completely by the year 2030.
01:41:57.460 And just to put things in perspective, in the West, before 1895, the typical person lived on a dollar a day.
01:42:05.340 So that's less than the current UN standard for extreme privation.
01:42:11.120 And so, the technological revolution that's been driven in large part by the dawn of scientific thinking has radically altered the West and its standard of living.
01:42:25.320 But now is doing the same thing everywhere else in the world.
01:42:28.880 So, you know, here's another example.
01:42:32.000 The child mortality rate in Africa is now the same as it was in Europe in 1952.
01:42:38.100 So that's just absolutely beyond belief.
01:42:41.200 So, this is a good thing.
01:42:43.040 But the conflict between the scientific viewpoint and the religious viewpoint still exists.
01:42:53.640 Carl Jung, who's a favorite thinker of mine, a psychoanalytic thinker who wrote and thought through most of the 20th century.
01:43:04.340 He died in 1960.
01:43:08.620 So, I guess for the first six-tenths of the 20th century.
01:43:12.540 He believed that what had happened about 500 years ago was that science developed out of alchemy,
01:43:20.780 and alchemy was the dream out of which science emerged.
01:43:25.620 Alchemy was the dream that you could discover a material substance, the philosopher's stone, that would confer upon everyone health, wealth, and longevity.
01:43:38.100 And, of course, there is no philosopher's stone, but the dream was correct.
01:43:42.720 Because the dream was that if we studied the material world with enough care that we could discover something that would produce wealth and health and longevity.
01:43:56.760 And that did happen.
01:43:59.160 And so, sometimes things that occur in actuality have to be dreamed before they occur.
01:44:06.500 However, alchemy had an ethical aspect and a practical aspect, let's say.
01:44:12.820 And the practical aspect exploded up into the scientific revolution and gave us this incredible technological power.
01:44:20.020 But it was Jung's belief that the ethical element remained undeveloped and that that was dangerous.
01:44:28.480 And that we were now in a situation, and have been for quite a while, where our technological power outstrips our ethical knowledge.
01:44:36.500 And part of that's manifested in an uncertainty about ethics in general, about how to behave.
01:44:43.660 About whether there is even an answer to the question how to behave.
01:44:50.740 About whether or not there is such a thing as an ethic that isn't morally relative or arbitrary in some sense.
01:44:59.460 And I think you see that critique manifested most particularly in the postmodern doctrine.
01:45:06.140 Which claims, with some justification, that there's a very large number of ways of looking at the world.
01:45:12.460 And that there's no straightforward way to determine which one of that multitude of ways of viewing the world is correct.
01:45:25.640 Now, there's a corollary problem.
01:45:32.080 You have the religion and science problem.
01:45:35.340 And you have a philosophical problem that David Hume pointed out, which is that it isn't obvious how you can derive an ought from an is.
01:45:43.880 And an ought is an ethic.
01:45:48.900 It's how you should behave.
01:45:50.360 And an is is a description.
01:45:52.200 And you could think of the scientific method as an attempt to describe what is.
01:45:59.560 And the ethical endeavor as an attempt to describe what ought to be.
01:46:04.200 And according to Hume, there was a gap between the two.
01:46:08.160 And it wasn't a straightforward thing to bridge.
01:46:11.480 And that seems to be right.
01:46:12.920 So there's a parallel problem.
01:46:14.220 The is ought problem.
01:46:15.220 The science-religion problem.
01:46:17.000 So I was very interested in that, in those parallel sets of problems.
01:46:28.120 And that's what I've been trying to address.
01:46:30.100 And that's what I wrote about in 12 Rules for Life and in Maps of Meaning.
01:46:34.020 I started to hypothesize a long while back that, in a similar manner, that there were two ways of looking at the world.
01:46:49.880 You could look at the world as a place of objects, a place of material objects.
01:46:55.980 Or you could look at the world as a place to act.
01:46:59.580 And that that's the same idea reflected in a different way.
01:47:04.040 The world as a place of objects.
01:47:05.560 That's the is world.
01:47:07.000 That's the world science describes.
01:47:08.920 And the world as a place to act.
01:47:10.340 That's the ought world.
01:47:11.760 And then I realized, too, and this was partly from studying the psychoanalysts, but also from studying literature.
01:47:18.440 Literature first and the psychoanalysts second was that the world as a place to act is laid out in stories.
01:47:27.920 And then maybe the world is like a stage that's set for a drama, you know.
01:47:36.100 You walk in like you walked in tonight.
01:47:38.100 You look at the stage and there's nothing, there's no characters on it to begin with.
01:47:43.700 There's just the objects on the stage.
01:47:46.620 The speakers and the chair and the stage is set.
01:47:50.520 And you could know everything about the stage setting from a scientific perspective and you still would have no idea what drama was about to take place.
01:47:58.260 Yet the purpose of the stage and the purpose of the play is the drama and the stage setting.
01:48:07.080 The objects are in some sense almost peripheral to that.
01:48:10.160 And you know that, too, because you've seen plays and movies and you know how many different ways you can set a stage for the drama to occur.
01:48:18.600 And the question is, well, what constitutes the drama?
01:48:22.360 So I got interested in the drama.
01:48:27.740 And because the point of the stage is the play and not the setting.
01:48:35.840 I got interested in the drama.
01:48:38.960 And I started to understand that the drama had a structure.
01:48:43.440 And that the structure, just like the material world has a structure, just like there's a periodic table of the elements, there's a periodic table in some sense.
01:48:54.240 There's a periodic table of dramatis personae.
01:48:58.360 And if you understand them, you can start to understand how to act in the world.
01:49:04.200 And there is actually a way to act in the world that's not arbitrary.
01:49:07.620 When I was writing Maps of Meaning, which was my first book, I was trying to address a question that was a postmodern question, although I didn't know it at the time.
01:49:17.620 I hadn't conceptualized it at the time.
01:49:19.680 And the question was this.
01:49:21.220 I was obsessed by the fact that the world had divided itself into two armed camps.
01:49:29.280 The armed camp that constituted essentially the Soviet Union, but you could say the entire communist bloc, and then the West.
01:49:36.100 And that each of those two blocks had arranged their societies according to different axiomatic presumptions, and were at odds with one another.
01:49:46.660 And it wasn't obvious. The world wouldn't have divided itself into those two armed camps if it had been obvious which of those two sets of axiomatic presuppositions was more valid, let's say.
01:50:00.740 Or maybe the problem was even deeper than that, which was those were just two arbitrary sets of axiomatic presuppositions out of a very large number of potential sets.
01:50:12.120 And they just happened to be the ones that emerged, and they were at odds with one another.
01:50:16.020 And so I started to study the understructure of the two belief systems, to see if one of them had more validity than the other, if I could figure that out.
01:50:26.800 If there was something underneath at least one of them.
01:50:30.180 And what I discovered, I think, was that the belief system that characterized the societies of the West, the underlying belief system, wasn't arbitrary.
01:50:40.580 It was just correct.
01:50:43.420 And I think the fact that the Soviet Union fell apart so precipitously in 1989 is actually evidence of the unplayability of the Soviet game.
01:50:53.500 That's a good way of, that's a really good way of thinking about it.
01:50:56.080 There's some games you can play, and there's some games you can't play.
01:50:59.260 If you iterate some games, they degenerate, and if you iterate others, they improve.
01:51:03.700 And that's actually one of the pieces of evidence that suggests that one system is preferable to another, that you can iterate it across time.
01:51:12.580 And that that's actually not a degenerating game, but an improving game.
01:51:17.780 And, you know, if you have a relationship with someone, it's an iterated game, right?
01:51:21.920 It's iterated because it repeats.
01:51:23.920 If you have a permanent relationship with someone, a marriage, a friendship, a relationship between siblings, relationship between you and your children,
01:51:31.560 and you want to be able to interact with them in a manner that doesn't get worse across time.
01:51:36.340 Maybe at least it stays flat, but it would be better if it even got better.
01:51:40.840 That would be lovely.
01:51:42.480 And your successful relationships are at least ones that maintain the status quo, but the great ones improve across time.
01:51:50.980 And so there are ways that you can act, interact, iteratively with people in a manner that sustains that iteration and fortifies it.
01:52:07.540 And so some games aren't playable, and some games are.
01:52:11.660 Some games improve as you play them.
01:52:13.480 And the game that the communists played was a degenerating game.
01:52:17.520 And that meant that there was something wrong with it.
01:52:19.440 And that meant that at least that way of looking at the world wasn't as good as any other way.
01:52:25.420 Which is interesting, because at least it suggests that there's one way of looking at the world that isn't as good as another way.
01:52:31.340 Right?
01:52:31.660 And that's a bothersome fact, in some sense, if you're a moral relativist.
01:52:36.740 Because as soon as there's any evidence that one game isn't as good as another, you've got some evidence that some games are better than others.
01:52:44.080 And so then you have some evidence that there's a rank order, let's say, of games.
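The distinction Peterson draws here between degenerating and improving iterated games can be sketched as a toy simulation in the spirit of the iterated prisoner's dilemma. Everything below is illustrative and hypothetical, not anything specified in the lecture: the payoff numbers, the `trust` multiplier, and the strategies are assumptions chosen only to show how repeated cooperation can compound while repeated exploitation erodes the game for both players.

```python
# Toy sketch (hypothetical, not from the lecture): an iterated two-player
# exchange where mutual cooperation compounds a shared "trust" multiplier
# and exploitation erodes it, so one pattern of play improves with
# iteration and the other degenerates.

def play_iterated(strategy_a, strategy_b, rounds=10):
    """Run an iterated game; each strategy sees the opponent's last move."""
    trust = 1.0              # shared multiplier: grows with cooperation, decays otherwise
    total_a = total_b = 0.0
    last_a = last_b = "C"    # assume a cooperative opening
    for _ in range(rounds):
        move_a = strategy_a(last_b)
        move_b = strategy_b(last_a)
        if move_a == "C" and move_b == "C":
            trust *= 1.1                 # mutual cooperation improves the game
            total_a += 3 * trust
            total_b += 3 * trust
        elif move_a == "D" and move_b == "D":
            trust *= 0.5                 # mutual defection degenerates it
            total_a += 1 * trust
            total_b += 1 * trust
        else:                            # one-sided exploitation
            trust *= 0.7
            total_a += (5 if move_a == "D" else 0) * trust
            total_b += (5 if move_b == "D" else 0) * trust
        last_a, last_b = move_a, move_b
    return total_a, total_b

cooperator = lambda opp_last: "C"       # always cooperate
defector = lambda opp_last: "D"         # always defect
tit_for_tat = lambda opp_last: opp_last # reciprocate the opponent's last move

coop_score, _ = play_iterated(cooperator, tit_for_tat)
defect_score, _ = play_iterated(defector, tit_for_tat)
print(coop_score > defect_score)  # cooperative play outscores exploitation over iterations
```

Under these made-up payoffs, the cooperative pairing compounds to a far higher score than the exploitative one, which is the sense in which one game is "playable" when iterated and the other is not.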
01:52:50.460 When Solzhenitsyn wrote about the Nazi Holocaust, he talked about the Nuremberg Trials.
01:52:59.020 And he believed that the Nuremberg Trials were among the most important events of the 20th century.
01:53:06.340 It's a similar, he's making a similar case.
01:53:09.180 And the reason he believed that was because the conclusion of the Nuremberg Trials...
01:53:17.020 You have to think about whether or not you believe this, because there's a cost to believing it, and there's a cost to not believing it.
01:53:23.540 So there's no scot-free way out of this conundrum.
01:53:27.260 The conclusion of the Nuremberg Trials was that there were some things that you could not do if you were human, without being subject to moral sanction.
01:53:38.900 Regardless of your cultural milieu.
01:53:41.360 Right?
01:53:41.560 Those were what have come to be known as crimes against humanity.
01:53:44.780 So the Nuremberg Trials made it axiomatic that some things were wrong, independently of the moral structure within which you were raised.
01:54:00.680 Or, another way of thinking about it was that no matter what moral structure you were raised within,
01:54:10.620 there was something common across all of them that would make the sorts of things that the Nazis did wrong by any reasonable standard.
01:54:20.780 Now, you don't have to believe that.
01:54:22.780 But if you don't believe that, then that puts you in an awkward position with regards to the Holocaust, say.
01:54:29.860 Well, that's what I mean by a cost.
01:54:34.260 It's like, you either admit that there's something wrong, or you admit that there wasn't anything wrong with the Holocaust,
01:54:41.280 and that it was just an arbitrary cultural decision.
01:54:45.440 One of many such arbitrary cultural decisions, and of no more distinction than any other.
01:54:52.680 And that seems like a conclusion that, in the main, we're not willing to draw.
01:55:03.840 So, maybe there is a difference between wrong and right.
01:55:09.260 And that's worth thinking about.
01:55:11.060 And maybe that difference is real.
01:55:15.580 Whatever real might mean.
01:55:17.040 Now, let's think about the problem of how to act.
01:55:23.740 I think about it biologically, because that's what I tried to do when I was writing Maps of Meaning.
01:55:28.340 And in Twelve Rules for Life, there's also a biological approach.
01:55:33.840 I mean, I drew on a lot of religious stories when I wrote both those books,
01:55:37.920 which is a strange thing to do if you're also trying to think biologically,
01:55:42.980 because those two things don't necessarily exist harmoniously.
01:55:49.960 You know, the more strict scientific types,
01:55:54.120 Sam Harris being a good example,
01:55:56.520 aren't comfortable with religious presuppositions.
01:55:58.980 And so, the attempt to use a physical standpoint derived from physics and a standpoint derived from biology,
01:56:09.640 say, social science for that matter,
01:56:11.940 and to ally that with religious presuppositions is an awkward marriage.
01:56:17.480 But it can be done.
01:56:21.020 I should tell you a story about Jean Piaget.
01:56:24.060 Jean Piaget was probably the greatest child developmental psychologist of the 20th century.
01:56:28.980 He invented the field.
01:56:32.000 He had a messianic crisis when he was a young man,
01:56:36.000 and was tormented by the contradiction between science and religion,
01:56:40.580 and decided that he was going to devote his entire life to rectifying that.
01:56:45.260 And I actually think he managed it to a large degree.
01:56:48.860 You know, it's not generally how Piaget is discussed.
01:56:51.140 When we discuss great people,
01:56:53.580 we generally don't discuss them in their full peculiarity.
01:56:59.060 You know, you remember a while back,
01:57:00.940 Elon Musk was sort of savaged for smoking,
01:57:04.460 you know, one one-thousandth of a joint on Joe Rogan.
01:57:09.420 And I thought it was so comical,
01:57:11.000 because Elon Musk is a very strange person.
01:57:13.680 Obviously, you have to be a strange person to build an electric car,
01:57:18.240 and then shoot it on your own rocket out into space.
01:57:22.740 Right?
01:57:24.240 Seriously.
01:57:29.520 Either one of those things is really strange,
01:57:33.180 but the joint probability of doing both is way below zero.
01:57:37.760 So,
01:57:38.520 I thought it was comical that people went after Musk,
01:57:42.840 because I thought,
01:57:43.580 yes, well, we like our insane geniuses,
01:57:46.380 predictable and normal.
01:57:48.220 It's like...
01:57:52.120 Anyways,
01:57:52.860 Piaget did do a very good job
01:57:55.640 of reconciling religion with science,
01:57:58.840 although people don't really know that,
01:58:00.380 because that isn't how they read Piaget.
01:58:02.320 And I think that's partly because they're afraid
01:58:05.320 that if you read someone who's really a genius,
01:58:08.980 the depth of their genius will frighten you.
01:58:11.600 And so you'll only read down into their work,
01:58:15.400 till you hit what you can't abide,
01:58:17.580 and then you'll stop.
01:58:19.180 And that's the case with Freud,
01:58:20.720 and it's the case with Jung,
01:58:22.280 for sure,
01:58:23.040 because if you read Jung,
01:58:24.540 and you have any sense,
01:58:25.600 you're terrified instantly.
01:58:28.900 And then it's also the case with Piaget.
01:58:32.260 And so,
01:58:33.740 I'll weave in what it was that he discovered.
01:58:37.160 Okay, so,
01:58:38.700 we're going to think about this biologically.
01:58:40.260 So let's say,
01:58:41.040 well,
01:58:41.140 one of the problems you have,
01:58:42.560 is what the world is made out of.
01:58:44.940 Now,
01:58:45.220 one of the things that's kind of interesting about human beings,
01:58:47.720 is that we really didn't care about that very much,
01:58:49.760 for a very long period of time.
01:58:52.040 Right?
01:58:52.620 I mean,
01:58:53.080 we didn't invent science until about 500 years ago.
01:58:56.500 Now,
01:58:56.980 you can argue about that,
01:58:58.220 and you could say that the precursors,
01:59:00.180 for the scientific viewpoint,
01:59:01.420 were laid down by the Greeks.
01:59:03.200 And then we could say,
01:59:04.020 okay,
01:59:04.220 well,
01:59:04.340 that was 2,500 years ago,
01:59:06.260 or 3,000 years ago.
01:59:07.380 But who cares?
01:59:08.160 If you're thinking biologically,
01:59:10.000 the difference between 500 years and 3,000 years is,
01:59:13.100 that's no difference at all,
01:59:14.480 you know,
01:59:14.800 because human beings,
01:59:16.460 in their current form,
01:59:17.940 are 150,000 years old.
01:59:20.640 That's the estimate.
01:59:21.700 There's been creatures,
01:59:23.040 basically identical to us,
01:59:24.880 genetically,
01:59:25.560 for 150,000 years.
01:59:27.280 And we diverged from chimpanzees,
01:59:29.420 7 million years ago.
01:59:30.640 And so,
01:59:32.260 you know,
01:59:32.540 and so we were vaguely human for 7 million years.
01:59:35.280 And so,
01:59:35.900 in the span of,
01:59:36.940 let's say,
01:59:37.380 2 million years,
01:59:38.540 just to,
01:59:39.400 you know,
01:59:39.940 give the later proto-humans their advantage,
01:59:44.360 the difference between 500 years and 3,000 years is completely trivial.
01:59:49.100 The point is,
01:59:49.800 is that we didn't invent anything approximating science,
01:59:52.660 and we weren't concerned about the structure of the material world,
01:59:55.760 in any objective sense,
01:59:57.220 until like yesterday.
01:59:58.340 And really,
02:00:01.340 we managed to survive
02:00:03.400 without it.
02:00:04.600 And then,
02:00:04.960 of course,
02:00:05.400 there's the,
02:00:07.420 3.5 billion years of biological evolution,
02:00:10.780 that preceded,
02:00:12.120 even the emergence of human beings,
02:00:14.280 that existed in the absolute absence of anything,
02:00:17.340 approximating a scientific perspective,
02:00:20.380 which indicates that,
02:00:22.180 well,
02:00:22.420 A,
02:00:22.660 that that perspective,
02:00:23.840 appears not to be strictly necessary,
02:00:27.120 from the perspective of survival,
02:00:30.980 and B,
02:00:34.920 that something else was occurring,
02:00:36.920 to provide us with the knowledge that we needed,
02:00:39.420 during all of that time.
02:00:40.760 And so,
02:00:42.080 when I read Nietzsche,
02:00:44.420 decades ago,
02:00:46.460 he talked about,
02:00:49.200 how philosophy,
02:00:52.260 emerged.
02:00:53.120 So,
02:00:54.300 he was an analyst of philosophers.
02:00:57.240 And it was Nietzsche's,
02:00:58.800 idea that,
02:01:00.600 what,
02:01:03.460 the typical philosopher,
02:01:05.260 perhaps even including him,
02:01:06.840 but perhaps not him,
02:01:08.000 produced,
02:01:09.760 was an unconscious recapitulation,
02:01:12.160 of their own knowledge.
02:01:14.160 That they felt,
02:01:15.180 that what they were doing,
02:01:15.960 was coming up with a rationalistic account,
02:01:18.220 of the structure of behavior,
02:01:20.080 let's say.
02:01:20.560 But,
02:01:21.000 really what they were doing,
02:01:22.140 was noticing how they acted,
02:01:24.700 and then,
02:01:26.180 describing that,
02:01:27.260 and providing it with a rationale.
02:01:29.860 And so,
02:01:30.460 it was bottom up,
02:01:31.700 not top down.
02:01:32.740 So,
02:01:34.660 we might think about that.
02:01:36.700 We might think about that,
02:01:37.720 because,
02:01:38.040 one of the things we do know,
02:01:39.800 is that,
02:01:40.500 however it was,
02:01:41.580 that creatures,
02:01:42.700 animals,
02:01:43.360 us,
02:01:44.080 figured out how to act,
02:01:45.900 over,
02:01:46.740 the millions of years,
02:01:48.000 that they figured out how to act,
02:01:49.840 it was bottom up,
02:01:51.600 not top down.
02:01:53.380 The simpler the animal,
02:01:54.840 the less capable it is,
02:01:56.080 of thinking.
02:01:57.500 And so,
02:01:57.860 if you go back far enough,
02:01:59.060 and you don't have to go back that far,
02:02:00.420 you get creatures,
02:02:01.720 that can't think at all.
02:02:03.560 Like,
02:02:04.580 there's lots of complex,
02:02:05.860 multi-celled organisms,
02:02:07.420 that don't have much of a nervous system,
02:02:09.080 and they don't think,
02:02:10.120 they just act,
02:02:11.000 they just exist.
02:02:13.060 And so,
02:02:13.520 whatever they're doing,
02:02:14.340 isn't a consequence of thinking;
02:02:16.760 the fact that they know,
02:02:17.780 what they're doing,
02:02:18.260 is a consequence of something,
02:02:19.340 other than thinking.
02:02:20.800 And even more sophisticated animals,
02:02:24.000 mammals,
02:02:24.760 let's say,
02:02:26.480 act,
02:02:27.320 and don't think.
02:02:29.600 But they act,
02:02:32.700 in sophisticated ways.
02:02:34.820 And so,
02:02:35.400 I was reading,
02:02:36.100 some of the ethologists,
02:02:37.220 about the same time.
02:02:38.280 Ethologists are,
02:02:39.780 scientists,
02:02:40.740 who study animal behavior.
02:02:42.240 They're not like,
02:02:42.780 laboratory behaviorists.
02:02:44.280 They don't,
02:02:45.060 put the animals,
02:02:46.040 in a controlled,
02:02:47.360 environment,
02:02:48.040 and do experiments on them.
02:02:49.200 They just,
02:02:49.940 use naturalistic observations.
02:02:51.780 So you can imagine,
02:02:52.700 Jane Goodall,
02:02:53.400 was an ethologist,
02:02:54.500 for example,
02:02:55.100 and she studied chimpanzees.
02:02:56.820 This guy named
02:02:57.440 Konrad Lorenz,
02:02:58.420 in Scandinavia.
02:03:00.880 He studied geese,
02:03:02.340 and other animals,
02:03:03.140 dogs as well.
02:03:04.480 And so,
02:03:04.800 these are people,
02:03:05.480 who just,
02:03:06.400 and Dian Fossey,
02:03:07.360 I think it was Fossey,
02:03:08.920 now,
02:03:09.320 was she gorillas?
02:03:10.120 I think it was gorillas.
02:03:12.540 Yeah,
02:03:13.060 yeah.
02:03:13.440 And so,
02:03:14.160 you go out,
02:03:14.600 and watch the animals,
02:03:15.720 and you know,
02:03:16.880 Goodall,
02:03:17.260 when she was watching chimpanzees,
02:03:18.960 what she'd do is,
02:03:20.380 well,
02:03:20.700 she'd look for regularities,
02:03:21.900 but she also found herself,
02:03:23.160 telling stories.
02:03:24.600 You know,
02:03:24.840 so you read Goodall's,
02:03:25.820 accounts of chimpanzees,
02:03:27.060 you get stories,
02:03:27.760 about what the chimpanzees are like,
02:03:29.200 and she gave them all names,
02:03:30.500 and,
02:03:30.800 and,
02:03:31.660 you can see the personalities,
02:03:33.340 of the chimpanzees,
02:03:34.920 emerging,
02:03:35.800 in her accounts.
02:03:36.660 And that's kind of interesting,
02:03:37.620 because what it suggests is that,
02:03:39.600 if you're looking at
02:03:44.000 how a set of creatures interact,
02:03:47.800 you tend to
02:03:49.600 look at them as if their personalities are
02:03:52.680 acting something out,
02:03:54.260 acting out patterns.
02:03:56.240 Say, like a wolf pack.
02:03:58.280 If you watch a wolf pack for a while,
02:03:59.760 you're going to see,
02:04:01.020 patterns,
02:04:02.360 regular patterns of behavior,
02:04:04.100 that characterize the interactions,
02:04:05.800 within the wolf pack.
02:04:06.620 Has to be the case,
02:04:07.880 right?
02:04:08.020 Because otherwise,
02:04:08.560 the wolves couldn't predict one another.
02:04:10.120 If there was no regularity,
02:04:11.880 there'd be no predictability.
02:04:13.080 There'd just be chaos.
02:04:14.440 And so what happens is,
02:04:15.380 the wolves,
02:04:16.300 settle into predictable patterns,
02:04:18.240 of behavior.
02:04:18.840 The chimpanzees settle into,
02:04:20.340 predictable patterns,
02:04:20.980 of behavior.
02:04:22.740 And,
02:04:23.300 then you can tell stories,
02:04:25.460 about those predictable patterns,
02:04:26.680 of behavior.
02:04:27.460 And you might even be able,
02:04:28.300 to derive rules.
02:04:29.420 You might say,
02:04:29.920 well it's as if,
02:04:30.740 the chimpanzees are acting out,
02:04:32.300 this rule.
02:04:33.080 So here's a rule,
02:04:34.260 that you might act out,
02:04:35.260 if you were a wolf.
02:04:36.620 So let's say,
02:04:38.440 you're a male wolf,
02:04:40.080 and there's another male wolf around,
02:04:41.500 and you decide,
02:04:42.480 that you're going to,
02:04:43.300 have a dominance dispute.
02:04:45.580 Right?
02:04:45.840 And so you puff up your fur,
02:04:47.320 and you look rough and tough,
02:04:48.420 and you bare your teeth,
02:04:49.440 and you growl,
02:04:50.200 and you threaten,
02:04:51.240 and you,
02:04:52.720 you terrify.
02:04:54.800 And,
02:04:55.580 and maybe you even fight,
02:04:56.960 to some degree,
02:04:57.620 but not too much,
02:04:58.520 because you don't want to,
02:04:59.240 damage each other.
02:05:00.100 Because the wolf
02:05:02.920 that you're fighting with,
02:05:03.920 you might need,
02:05:05.100 tomorrow,
02:05:06.840 to bring down a moose with.
02:05:08.720 Right?
02:05:09.520 So,
02:05:09.860 that's an interesting thing,
02:05:11.120 to consider,
02:05:11.740 because it constitutes,
02:05:12.760 a limit on,
02:05:14.000 on the kind of aggression,
02:05:15.480 that's allowable,
02:05:16.840 in the pursuit of,
02:05:18.440 dominance.
02:05:20.220 Anyways,
02:05:21.060 two wolves,
02:05:21.620 that go at it,
02:05:22.160 and usually what happens,
02:05:23.100 is one wolf,
02:05:24.340 decides it's not worth,
02:05:26.240 the risk,
02:05:26.820 and he rolls over,
02:05:28.260 and presents his throat,
02:05:30.180 to the victor.
02:05:31.600 And that basically,
02:05:32.540 means something like,
02:05:34.460 well I'm useless,
02:05:35.640 and weak,
02:05:37.380 and, like a prey animal,
02:05:38.680 you can tear out
02:05:39.560 my throat,
02:05:41.700 if you so choose.
02:05:43.120 And the dominant wolf,
02:05:45.560 acts out,
02:05:46.500 something like,
02:05:47.560 well I know,
02:05:48.180 you're useless,
02:05:48.840 and weak,
02:05:49.320 but I might need you,
02:05:50.740 to hold down a moose,
02:05:51.840 tomorrow,
02:05:52.380 so despite the fact,
02:05:53.940 that you're not good,
02:05:54.620 for anything,
02:05:55.380 you might as well,
02:05:56.180 get up,
02:05:56.540 and we'll get on with it.
02:05:58.140 Now obviously,
02:05:59.120 the wolves aren't thinking that,
02:06:00.820 but that's how they act.
02:06:02.640 And so if you're watching,
02:06:03.840 and you can think,
02:06:04.660 and you describe how they act,
02:06:06.280 that might be how you describe it.
02:06:08.840 And so I just described it,
02:06:10.440 in words,
02:06:11.820 I used,
02:06:12.640 what would be,
02:06:13.840 approximately,
02:06:15.940 the description of a rule,
02:06:17.280 the rule would be,
02:06:18.740 if you're a wolf pack,
02:06:19.720 don't kill each other,
02:06:20.800 because you need the whole pack,
02:06:22.160 to pull down your prey.
02:06:24.080 Right, right,
02:06:24.780 so there's a constraint,
02:06:26.340 on striving,
02:05:28.440 and you can lay it out,
02:06:30.100 in a rule-like manner,
02:06:30.960 and I told a little story,
02:06:32.380 and then I derived a rule from it.
02:06:35.540 And that's how,
02:06:36.660 how we act,
02:06:38.740 emerged.
02:06:41.900 So,
02:06:43.160 each of us,
02:06:44.240 pursues our own,
02:06:45.420 motivated behaviors,
02:06:46.420 but then we aggregate together,
02:06:49.240 in groups.
02:06:51.020 And the fact,
02:06:52.140 that we aggregate together,
02:06:53.080 in groups,
02:06:54.380 puts certain restrictions,
02:06:55.620 on how it is,
02:06:56.580 that we manifest,
02:06:57.440 our motivated behavior.
02:06:59.580 Partly,
02:07:00.040 because we depend on each other,
02:07:01.220 like the wolves,
02:07:01.860 depend on each other.
02:07:03.380 And so what that means,
02:07:04.400 is that there's an ethic,
02:07:05.280 that emerges,
02:07:06.700 out of the interactions,
02:07:08.140 between those motivations,
02:07:09.580 and that ethic,
02:07:10.500 manifests itself,
02:07:11.580 in a certain kind
02:07:12.380 of patterned behavior.
02:07:14.080 Now that's about,
02:07:14.940 as far as it goes,
02:07:15.620 for wolves.
02:07:16.460 Because they don't sit around,
02:07:17.560 at the campfire at night,
02:07:19.240 and talk about,
02:07:20.160 how it is that wolves,
02:07:21.840 interact with one another.
02:07:24.240 But human beings do that,
02:07:26.040 see,
02:07:26.280 because we've got this next level,
02:07:28.140 of cognitive ability,
02:07:30.140 imaginative ability.
02:07:31.680 It's not just pure abstract thought.
02:07:34.820 It's not like,
02:07:36.120 we've got motivation,
02:07:37.760 like other animals,
02:07:38.740 and then we've got,
02:07:40.000 the patterns of behavior,
02:07:41.400 that emerge,
02:07:42.840 as a consequence of,
02:07:44.100 the interaction,
02:07:45.480 of those motivated behaviors.
02:07:46.840 We then have the ability,
02:07:48.020 to watch ourselves,
02:07:49.720 like we would watch,
02:07:50.940 a wolf pack,
02:07:51.800 right?
02:07:52.400 Because an anthropologist,
02:07:53.680 an ethologist,
02:07:54.380 can go out in the wild,
02:07:56.040 and watch a wolf pack,
02:07:57.760 or watch a chimpanzee troop,
02:07:59.320 and take notes.
02:08:00.540 Say,
02:08:00.800 look,
02:08:02.580 here are the patterns,
02:08:02.580 I can write down the patterns.
02:08:04.400 And when I write down the patterns,
02:08:06.080 then I'm telling a story,
02:08:07.140 and out of the story,
02:08:07.900 I can abstract rules.
02:08:09.700 But the chimps,
02:08:10.360 aren't doing that,
02:08:11.020 the wolves aren't doing that,
02:08:11.940 but the human being,
02:08:12.660 can do that.
02:08:14.760 And then the human being,
02:08:15.980 can say,
02:08:16.260 well it's as if,
02:08:16.800 wolves follow these rules,
02:08:19.040 which they don't,
02:08:20.220 because they're wolves,
02:08:21.220 and they don't follow rules.
02:08:22.520 But they do manifest patterns,
02:08:25.140 and the patterns emerged,
02:08:26.760 as a consequence,
02:08:27.660 of something approximating,
02:08:28.980 an evolutionary competition.
02:08:31.200 Now,
02:08:32.320 it's interesting,
02:08:33.140 it's very interesting,
02:08:34.040 to think this through.
02:08:35.580 So,
02:08:36.500 Franz de Waal,
02:08:38.040 who's a,
02:08:39.220 another ethologist,
02:08:40.580 who studies chimpanzees,
02:08:42.320 he's written a bunch of great books,
02:08:43.720 you might be interested in them,
02:08:45.160 if you're interested in this sort of thing.
02:08:47.060 He's been interested
02:08:48.880 in the emergence of morality,
02:08:50.880 among chimpanzees.
02:08:53.080 And so,
02:08:54.640 one of the things de Waal has found,
02:08:57.540 was that,
02:08:58.560 and this is partly why
02:08:59.980 the postmodernists,
02:09:01.940 who insist that,
02:09:03.800 the fundamental motivator,
02:09:06.120 for the structure of hierarchies,
02:09:07.800 is power,
02:09:08.880 are wrong.
02:09:10.800 Okay,
02:09:11.260 so I'm going to repeat that.
02:09:12.660 They're wrong.
02:09:14.560 Okay,
02:09:15.240 it's seriously important.
02:09:17.440 Because,
02:09:19.120 one of the claims,
02:09:20.600 that's tearing our culture apart,
02:09:23.000 is that,
02:09:24.680 our hierarchical structure,
02:09:27.040 which would be our entire culture,
02:09:29.120 is an oppressive patriarchy,
02:09:31.000 and that the people,
02:09:32.860 who occupy the positions,
02:09:34.900 especially the,
02:09:35.600 the higher positions,
02:09:37.560 let's say in the hierarchy,
02:09:39.100 got there,
02:09:39.880 because they exercise power.
02:09:42.780 That's a fundamental claim of
02:09:44.260 Foucault, for example,
02:09:46.400 who's an absolutely reprehensible scholar
02:09:48.420 on about 15 different dimensions.
02:09:51.080 But one of them,
02:09:51.980 one of them is,
02:09:54.320 his narrow-minded insistence,
02:09:56.960 that power is the only justification,
02:09:59.200 for hierarchical position.
02:10:01.920 Now,
02:10:02.400 I think that's,
02:10:03.360 patently absurd,
02:10:05.120 in the case of human beings,
02:10:07.000 but I won't go there,
02:10:08.600 to begin with,
02:10:09.640 because I might as well,
02:10:10.480 make the case,
02:10:11.160 that it's patently absurd,
02:10:12.620 in the case of animals,
02:10:16.140 where it's more like power.
02:10:19.560 De Waal has shown,
02:10:20.820 well,
02:10:21.240 you've got the example,
02:10:22.220 of the wolf already.
02:10:23.120 It's like,
02:10:23.900 you just can't be
02:10:26.300 savaging your pack mates.
02:10:29.260 Because all you do,
02:10:30.380 is demolish the structure,
02:10:32.100 within which you live.
02:10:34.180 And that's true for wolves.
02:10:36.860 You know,
02:10:38.000 it's,
02:10:38.340 it's true for chimpanzees.
02:10:40.900 So De Waal has shown,
02:10:42.960 in a sequence of observations.
02:10:45.600 And chimps are a good,
02:10:46.980 test case,
02:10:48.700 because,
02:10:49.700 we're rather chimp-like.
02:10:52.220 Now,
02:10:52.840 well,
02:10:53.440 they're our closest
02:10:54.360 biological relatives,
02:10:55.640 right?
02:10:55.860 So it's only a seven million year gap,
02:10:57.900 between us,
02:10:58.520 and the common ancestors.
02:10:59.700 So if you're going to look,
02:11:00.360 at any animal,
02:11:01.620 and,
02:11:02.320 derive,
02:11:03.820 conclusions about,
02:11:05.620 the basal motivations,
02:11:07.260 of human beings,
02:11:08.360 well then,
02:11:08.940 it would be chimps,
02:11:09.700 that you would look at.
02:11:10.300 You might look at bonobos too.
02:11:12.240 Although the differences,
02:11:13.240 between bonobos and chimps,
02:11:14.480 have been exaggerated,
02:11:16.160 by all appearances.
02:11:17.340 In any case,
02:11:19.100 you can get to the top,
02:11:20.320 if you're the roughest,
02:11:21.400 toughest,
02:11:21.960 meanest,
02:11:22.560 most physically,
02:11:23.860 powerful,
02:11:24.840 and cruelest chimpanzee.
02:11:27.200 But your rule is unstable.
02:11:30.700 And there's a reason for that.
02:11:32.980 And the reason is
02:11:33.940 that chimpanzees,
02:11:35.320 the males,
02:11:36.340 females as well,
02:11:37.460 but the fundamental hierarchy,
02:11:38.960 for chimps is male.
02:11:41.600 Chimpanzees are competitive,
02:11:44.520 but they're also cooperative.
02:11:46.000 They spend a lot of time,
02:11:47.700 grooming each other.
02:11:48.780 And they actually have,
02:11:49.920 long term friendships.
02:11:51.420 And it turns out that
02:11:54.440 two
02:11:55.280 chimpanzees
02:11:56.880 that are three quarters,
02:11:57.960 as tough,
02:11:58.940 as the toughest chimpanzee,
02:12:01.100 make a pretty,
02:12:01.960 vicious set of opponents,
02:12:04.180 if you get a little bit,
02:12:05.540 too tyrannical.
02:12:07.340 And the problem,
02:12:08.520 with pure power,
02:12:09.560 for the chimpanzees,
02:12:10.660 is that if you're a chimp,
02:12:11.740 that climbs to the top,
02:12:12.840 as a consequence,
02:12:13.440 of nothing but psychopathic dominance,
02:12:16.160 you have no allies,
02:12:17.660 and no friends,
02:12:18.400 because you don't engage,
02:12:19.480 in any cooperative behavior,
02:12:21.120 you'll have an off day,
02:12:22.340 and your two,
02:12:23.780 lieutenants,
02:12:24.900 each of whom,
02:12:25.740 is three quarters,
02:12:26.620 as strong as you,
02:12:27.560 will band together,
02:12:28.420 and tear you into bits.
02:12:31.220 It's an unstable,
02:12:33.760 medium- to long-term
02:12:35.280 solution.
02:12:36.680 Power.
02:12:38.600 And,
02:12:39.200 I see no reason,
02:12:40.220 to assume,
02:12:40.840 that exactly the same thing,
02:12:42.260 isn't the case,
02:12:42.980 with human beings.
02:12:44.960 If it doesn't work,
02:12:45.980 for chimpanzees,
02:12:46.940 and it doesn't work,
02:12:47.640 for wolves,
02:12:48.840 why in the world,
02:12:50.380 would it work,
02:12:50.880 for human beings?
02:12:51.600 Especially when you think,
02:12:52.780 that our hierarchies,
02:12:54.100 are way more complicated,
02:12:55.920 than the hierarchies,
02:12:57.320 that characterize animals.
02:12:59.100 And that,
02:12:59.820 we do all sorts of things,
02:13:02.540 with our hierarchies,
02:13:03.400 that animals don't do.
02:13:04.820 I mean,
02:13:05.080 most of you have jobs,
02:13:06.500 and most of those jobs
02:13:09.420 are such that,
02:13:11.880 while you're
02:13:12.920 performing them,
02:13:15.320 you're not doing things
02:13:16.780 that any animal would do now,
02:13:20.080 or that any human being,
02:13:21.200 would have done,
02:13:21.680 300 years ago.
02:13:22.660 Now,
02:13:22.880 there's certain exceptions to that.
02:13:24.820 Some of you might farm,
02:13:26.080 but even if you're doing that,
02:13:27.160 you're not doing it in a way,
02:13:28.320 that people did it,
02:13:29.440 300 years ago.
02:13:31.240 Whatever your job is,
02:13:32.500 and it's likely to be very abstract,
02:13:35.140 you're pursuing some goal,
02:13:37.440 that is fairly distant,
02:13:39.320 from a fundamental,
02:13:40.560 biological motivation.
02:13:41.740 It's very abstract.
02:13:43.220 And you're actually creating,
02:13:44.320 something of at least,
02:13:45.960 sufficient value,
02:13:47.000 so that other people,
02:13:47.960 will trade with you for it.
02:13:50.420 And the idea,
02:13:51.160 that you're going to,
02:13:53.040 move up in your hierarchy,
02:13:54.740 of production,
02:13:56.620 by exercising something,
02:13:58.340 like psychopathic power,
02:14:00.120 is,
02:14:00.960 I think, insane.
02:14:04.480 You know,
02:14:04.860 I think about plumbers,
02:14:07.680 for example.
02:14:09.620 I've used this joke before,
02:14:11.180 but I like it,
02:14:11.800 so I'm going to use it again.
02:14:13.640 It's like,
02:14:14.640 you know,
02:14:15.040 if the postmodernists were right,
02:14:17.160 this is how you'd hire a plumber.
02:14:19.040 First of all,
02:14:19.520 the plumbers would have all banded together,
02:14:21.240 although they'd be fighting among themselves,
02:14:23.040 because of course,
02:14:23.820 there's nothing but power.
02:14:24.760 But they would have all banded together,
02:14:26.480 and they'd go from door to door,
02:14:28.260 and they'd basically,
02:14:29.300 knock on your door,
02:14:30.800 and tell you that,
02:14:31.560 if you didn't hire them,
02:14:32.880 there was going to be hell to pay.
02:14:34.540 It'd be like mafia plumbers.
02:14:37.220 And,
02:14:37.800 and then you'd pick the roughest,
02:14:40.200 toughest plumber,
02:14:41.400 who's best at exercising power,
02:14:43.420 to not fix your pipes,
02:14:45.420 because,
02:14:46.060 why would he be interested in fixing your pipes?
02:14:49.340 He'd just be interested in,
02:14:50.800 pretending to fix your pipes,
02:14:52.500 and taking your money,
02:14:53.700 which is how you'd expect,
02:14:54.800 a mafioso plumber to operate.
02:14:57.480 But that isn't how it works,
02:14:59.060 right?
02:14:59.460 I mean,
02:14:59.720 if you hire virtually anyone,
02:15:02.080 in your day-to-day life,
02:15:03.860 you suss out their reputation,
02:15:05.860 for their ability,
02:15:07.120 to provide the service,
02:15:08.620 that they promised to provide,
02:15:10.740 which is predicated on their skill,
02:15:13.280 their technical skill,
02:15:14.440 their skill as,
02:15:15.600 as workmen,
02:15:16.720 and craftsmen.
02:15:18.120 It also is predicated on their ability,
02:15:20.200 to generally work,
02:15:21.620 in some reasonably interactive way,
02:15:23.960 with their employees,
02:15:25.160 because otherwise,
02:15:25.800 they don't stay in business,
02:15:26.780 for that long.
02:15:27.860 And also,
02:15:28.800 to treat their customers,
02:15:30.060 with a modicum of,
02:15:31.660 reciprocal respect,
02:15:33.460 or their reputation,
02:15:35.260 is savaged immediately,
02:15:36.860 and they fail.
02:15:39.000 And so,
02:15:40.000 this patriarchal structure,
02:15:41.660 that we hypothetically all,
02:15:43.880 occupy,
02:15:45.020 that's fundamentally predicated,
02:15:47.180 on oppression and power,
02:15:49.680 breaks apart,
02:15:51.260 into something approximating,
02:15:53.980 cooperative competence,
02:15:55.880 when you look at any of its sub-components.
02:15:59.060 You know,
02:15:59.360 you don't have,
02:16:00.880 power-hungry,
02:16:02.320 massage therapists.
02:16:04.480 You know?
02:16:05.840 Well,
02:16:06.720 I don't understand,
02:16:07.940 how the entire structure,
02:16:09.860 can be,
02:16:11.120 a power-dominated,
02:16:12.700 oppressive patriarchy,
02:16:14.000 if none of the sub-components are.
02:16:16.020 It doesn't make any sense to me.
02:16:18.360 And I'm not saying,
02:16:19.480 at all,
02:16:20.000 that,
02:16:20.920 within large hierarchies,
02:16:22.260 there isn't room,
02:16:23.220 for relatively,
02:16:24.140 psychopathic people,
02:16:25.820 to now and then,
02:16:27.020 manage a certain amount,
02:16:28.180 of success.
02:16:29.140 You know that as a
02:16:30.580 hierarchical structure
02:16:32.020 grows in size,
02:16:33.280 pure power-politics players
02:16:36.100 have a higher probability
02:16:38.220 of success.
02:16:38.760 But all that happens,
02:16:40.220 to large organizations,
02:16:41.440 when they get completely,
02:16:42.980 dominated by people,
02:16:44.420 who are using power,
02:16:45.480 and politics,
02:16:46.220 as a means,
02:16:46.940 to climb to the top,
02:16:48.060 is that they precipitously collapse.
02:16:50.360 Because you end up,
02:16:51.220 with no one,
02:16:51.820 who can actually,
02:16:52.700 perform the function,
02:16:54.320 that the structure,
02:16:55.140 is supposed to perform.
02:16:56.840 And a whole plethora,
02:16:58.220 of people,
02:16:58.720 who are good at doing nothing,
02:17:00.000 but playing politics.
02:17:01.440 And then the company dies.
02:17:03.040 And the thing is,
02:17:03.800 companies die all the time.
02:17:05.280 The typical Fortune 500 company,
02:17:07.920 lasts 30 years.
02:17:10.000 And the reason for that often,
02:17:11.580 is that,
02:17:12.320 it's functional for a while,
02:17:14.140 and then it gets corrupted,
02:17:15.480 by internal politics,
02:17:16.640 or it gets blind,
02:17:17.680 or it can't keep up,
02:17:18.720 or whatever it is,
02:17:19.580 and,
02:17:20.540 that's the end of it.
02:17:21.520 It falls apart.
02:17:22.520 And,
02:17:23.160 you know,
02:17:23.560 breaks into its,
02:17:24.600 constituent elements,
02:17:25.740 and they reformulate.
02:17:26.820 Maybe there's some new companies,
02:17:28.160 that come out of it,
02:17:28.900 but,
02:17:29.560 it's not a permanent structure,
02:17:31.140 by any stretch of the imagination.
02:17:32.740 And so,
02:17:33.900 I don't think,
02:17:34.760 there's any reason,
02:17:35.580 whatsoever to assume,
02:17:37.300 well,
02:17:37.540 we can think this through.
02:17:39.700 Are our,
02:17:40.400 hierarchical structures,
02:17:42.040 based on,
02:17:43.200 power?
02:17:43.860 Or,
02:17:44.280 are they based on,
02:17:45.280 competence?
02:17:46.220 And look,
02:17:46.820 the answer is,
02:17:47.560 well,
02:17:47.720 they're a bit based on power,
02:17:49.240 you know?
02:17:49.840 Because,
02:17:50.140 you shouldn't be naive,
02:17:51.120 about this.
02:17:53.600 Hierarchical structures,
02:17:54.700 can,
02:17:55.560 and do,
02:17:57.680 become corrupt,
02:17:58.720 in some ways.
02:17:59.980 And we have to keep,
02:18:00.860 an eye on that,
02:18:01.460 all the time.
02:18:01.940 But,
02:18:02.400 my sense is,
02:18:04.040 that,
02:18:04.880 by and large,
02:18:06.840 things work.
02:18:09.980 You know,
02:18:10.340 I mean,
02:18:10.820 here we are,
02:18:11.600 we're sitting in this,
02:18:12.880 hall,
02:18:13.460 and there's,
02:18:13.840 3,000 of us,
02:18:15.600 and you all got here,
02:18:17.520 because your cars worked,
02:18:19.340 and,
02:18:20.480 the highways work,
02:18:21.800 sort of.
02:18:22.760 I mean,
02:18:23.720 I was
02:18:24.140 in a traffic jam
02:18:27.040 for like,
02:18:27.420 an hour and a half,
02:18:28.420 getting here.
02:18:29.300 But,
02:18:30.200 you know,
02:18:30.520 that's what happens,
02:18:31.100 when you drive somewhere,
02:18:32.020 at five o'clock.
02:18:32.840 You know,
02:18:33.080 you can,
02:18:33.660 you can predict that.
02:18:34.780 So they worked,
02:18:35.620 and,
02:18:36.180 the lights are on,
02:18:38.020 and they seem to work,
02:18:39.000 and that's no trivial thing,
02:18:40.280 and,
02:18:40.600 looks like the,
02:18:41.580 big TV screen is working,
02:18:43.240 and,
02:18:43.720 you're all sitting here,
02:18:45.200 peacefully,
02:18:46.980 not engaging in,
02:18:48.800 an overt power struggle,
02:18:50.260 as far as I can tell,
02:18:51.340 even though,
02:18:52.060 even though,
02:18:52.680 there's a hierarchical,
02:18:53.920 arrangement of seats,
02:18:55.900 right?
02:18:56.520 You accept that,
02:18:58.600 you know,
02:19:00.440 and you accept the fact,
02:19:01.780 that,
02:19:02.260 you know,
02:19:02.520 some of you,
02:19:03.320 paid a premium,
02:19:04.280 for sitting closer,
02:19:05.300 and,
02:19:05.760 that that seems to be,
02:19:06.660 a reasonable way,
02:19:07.520 of distributing,
02:19:08.620 somewhat scarce resources,
02:19:10.340 and,
02:19:10.960 you know,
02:19:11.380 everyone in here,
02:19:12.340 is behaving peacefully,
02:19:13.560 and,
02:19:14.000 like,
02:19:14.280 all this seems to work,
02:19:15.440 quite nicely,
02:19:16.260 and so,
02:19:16.940 since this all works,
02:19:18.280 it's very difficult,
02:19:19.200 for me to understand,
02:19:20.500 how it's not predicated,
02:19:22.100 at least in large part,
02:19:23.240 on competence,
02:19:24.920 and then,
02:19:25.280 of course,
02:19:25.720 there's the other evidence,
02:19:26.840 which is,
02:19:27.820 well,
02:19:28.280 you know,
02:19:29.400 a lot of you are older,
02:19:30.540 than the average person,
02:19:31.860 would have been,
02:19:32.800 when he or she died,
02:19:34.320 throughout the history,
02:19:35.380 of the entire human race,
02:19:37.520 and so,
02:19:37.880 that seems to be,
02:19:38.700 working out pretty good,
02:19:39.480 for you,
02:19:40.180 and you're,
02:19:41.280 none of you are skinny,
02:19:43.560 quite the contrary,
02:19:45.640 so,
02:19:48.600 there are more obese people,
02:19:50.240 in the world now,
02:19:50.960 than there are starving people,
02:19:52.380 by quite a substantial margin,
02:19:54.140 and I think,
02:19:55.040 that that's worth,
02:19:55.720 quite the celebration,
02:19:56.900 even though,
02:19:57.340 it's perhaps,
02:19:57.920 not exactly optimized,
02:19:59.860 but,
02:20:00.360 it's definitely better,
02:20:01.740 than the alternative,
02:20:02.680 and you know,
02:20:03.280 by and large,
02:20:04.120 we're,
02:20:04.660 moderately healthy,
02:20:06.920 and,
02:20:07.460 most,
02:20:08.760 women don't die,
02:20:09.820 in childbirth,
02:20:10.640 like they used to,
02:20:11.700 and,
02:20:12.060 most children,
02:20:12.680 don't die,
02:20:14.040 within a year,
02:20:14.780 of being born,
02:20:15.640 like they used to,
02:20:17.100 not very long ago,
02:20:18.480 and so,
02:20:19.800 so,
02:20:23.260 things seem to be,
02:20:24.460 working,
02:20:25.720 not so bad,
02:20:27.660 given,
02:20:28.240 what a bloody catastrophe,
02:20:29.700 life is,
02:20:30.340 and how difficult it is,
02:20:32.000 to get things to work,
02:20:33.180 and how fragile people are,
02:20:34.820 and how short-lived,
02:20:36.380 we are,
02:20:37.320 intrinsically,
02:20:38.220 and how,
02:20:39.380 vulnerable,
02:20:40.220 we are to suffering,
02:20:41.380 we're not doing so bad,
02:20:43.560 you know,
02:20:44.160 we must be doing,
02:20:45.360 something right,
02:20:46.860 and,
02:20:47.800 the,
02:20:48.860 well,
02:20:51.500 this is the thing,
02:20:52.760 this is the thing,
02:20:54.000 that,
02:20:54.400 that,
02:20:54.920 I would say,
02:20:55.740 enrages me,
02:20:56.880 about,
02:20:57.960 let's call them,
02:20:59.060 universities,
02:21:00.720 and,
02:21:01.200 it's the,
02:21:11.200 intellectual,
02:21:12.160 and moral laziness,
02:21:13.280 of the,
02:21:14.480 of the,
02:21:15.360 resentful,
02:21:16.760 victimization,
02:21:18.300 power,
02:21:20.240 post-modern doctrine,
02:21:22.180 because,
02:21:23.320 look,
02:21:23.740 it doesn't take a genius,
02:21:25.080 to figure out that,
02:21:26.900 the history of humanity,
02:21:28.680 is a bloody nightmare,
02:21:30.820 and,
02:21:31.220 you know,
02:21:32.640 to lay that only,
02:21:33.960 at the feet of human beings,
02:21:35.400 I think,
02:21:35.800 is a mistake,
02:21:37.400 because,
02:21:39.120 life,
02:21:40.340 in the natural world,
02:21:41.400 is a bloody nightmare,
02:21:43.380 and,
02:21:43.860 so,
02:21:44.060 you can't just,
02:21:44.820 blame that on people,
02:21:46.200 you know,
02:21:46.440 like,
02:21:46.680 you can blame it on people,
02:21:47.720 to some degree,
02:21:48.600 we take a bad thing,
02:21:50.520 or we take a deadly thing,
02:21:52.420 or a dangerous thing,
02:21:53.260 and we can make it worse,
02:21:54.600 and we definitely do that,
02:21:56.480 but it's not like,
02:21:57.480 the problem is simple,
02:21:58.460 to begin with,
02:21:59.080 because it's,
02:21:59.960 clearly the case,
02:22:01.180 that,
02:22:01.880 at every moment,
02:22:03.820 the planet,
02:22:04.760 is trying to,
02:22:05.700 kill you,
02:22:06.360 and,
02:22:08.320 eventually,
02:22:09.520 it will succeed,
02:22:11.260 and so,
02:22:12.080 that's a big problem,
02:22:14.020 and,
02:22:14.560 it's reasonable enough,
02:22:15.760 for us to group together,
02:22:17.680 and to try to,
02:22:18.940 stop that,
02:22:20.300 and while we're doing that,
02:22:22.400 we cause some trouble,
02:22:24.080 you know,
02:22:24.460 we pollute things,
02:22:25.460 and we break things,
02:22:26.440 and,
02:22:26.680 and we don't play as good,
02:22:29.060 an iterated game,
02:22:29.060 as we might,
02:22:29.780 but,
02:22:30.620 you know,
02:22:31.940 for,
02:22:32.700 for creatures,
02:22:33.760 that only last,
02:22:34.560 eight decades,
02:22:35.180 and have a lot of trouble,
02:22:36.820 during all,
02:22:37.920 eight of those,
02:22:39.260 we don't do that badly,
02:22:41.740 and we've got our reasons,
02:22:43.340 for being as perverse,
02:22:44.660 and useless,
02:22:45.260 as we are,
02:22:46.380 and,
02:22:46.780 all of that,
02:22:47.800 shouldn't be laid at our feet,
02:22:49.180 even though we need to take,
02:22:50.360 responsibility for it,
02:22:52.300 and so,
02:22:53.000 one of the things,
02:22:53.680 that I find extremely disturbing,
02:22:55.500 about,
02:22:56.620 the,
02:22:58.800 emergent hypothesis,
02:23:00.740 that our culture,
02:23:01.960 is nothing but,
02:23:02.780 an oppressive patriarchy,
02:23:04.320 is that,
02:23:05.180 it's ungrateful,
02:23:06.480 beyond comprehension,
02:23:08.380 and worse,
02:23:09.420 if you adopt,
02:23:12.200 that stance,
02:23:13.280 well,
02:23:13.540 it bestows,
02:23:14.620 a certain amount,
02:23:15.380 of benefit on you,
02:23:17.520 I mean,
02:23:17.880 first of all,
02:23:19.020 if you,
02:23:19.900 posit that,
02:23:20.740 there's nothing,
02:23:21.440 in your culture,
02:23:22.360 except what's corrupt,
02:23:23.880 then that immediately,
02:23:24.920 elevates you,
02:23:25.700 above all of it,
02:23:26.380 as that sort of critic,
02:23:27.940 right,
02:23:28.720 it's like,
02:23:29.040 well,
02:23:29.140 I'm taking the high ground,
02:23:30.560 and I'm looking at,
02:23:31.960 the entire,
02:23:32.920 history of humanity,
02:23:35.720 and calling it,
02:23:37.160 unacceptable,
02:23:38.220 it's like,
02:23:38.540 well,
02:23:38.640 that's fine,
02:23:39.640 I suppose,
02:23:40.260 except,
02:23:40.700 it isn't necessarily,
02:23:41.780 obvious,
02:23:42.260 that you could do,
02:23:43.060 any better,
02:23:43.900 and you probably haven't,
02:23:45.720 you know,
02:23:45.960 and maybe you could try,
02:23:47.760 although then,
02:23:48.320 you might say,
02:23:48.800 well,
02:23:48.980 just try,
02:23:49.800 all that trying does,
02:23:50.660 is make it worse,
02:23:52.440 and participating,
02:23:53.600 in that terror,
02:23:55.240 doing anything,
02:23:56.160 of any utility,
02:23:57.100 is just playing the power game,
02:23:58.420 and making it worse,
02:23:59.240 it's like,
02:23:59.660 yeah,
02:24:00.200 well,
02:24:00.720 you know,
02:24:01.100 pardon me,
02:24:01.620 for being a bit skeptical,
02:24:02.620 about your motivation,
02:24:04.500 I don't think,
02:24:05.400 that moral virtue,
02:24:06.280 is that easily come by,
02:24:08.080 and I think,
02:24:08.740 that there are difficult problems,
02:24:10.100 to solve,
02:24:10.640 and that you could,
02:24:11.360 contend with the world,
02:24:12.300 and try to solve them,
02:24:13.820 instead of complaining,
02:24:14.760 about the fact,
02:24:15.560 that they haven't been,
02:24:16.240 solved properly,
02:24:17.380 and that if you did manage,
02:24:18.500 to solve a problem or two,
02:24:20.140 which you might,
02:24:20.800 then maybe,
02:24:21.820 you'd have the right,
02:24:22.720 to stand up,
02:24:24.520 as a more global critic,
02:24:26.200 but until then,
02:24:27.500 well,
02:24:27.680 what's rule six,
02:24:28.960 you should get your house,
02:24:29.940 in order,
02:24:30.340 before you criticize the world,
02:24:32.320 and that's no simple thing,
02:24:38.000 so the moral virtue thing,
02:24:39.520 that's annoying,
02:24:40.460 because I don't think,
02:24:41.340 that moral virtue,
02:24:42.020 should be unearned,
02:24:43.820 you know,
02:24:44.140 this is kind of,
02:24:44.600 why I'm an admirer,
02:24:45.480 of the doctrine,
02:24:46.080 of original sin,
02:24:48.280 you know,
02:24:49.120 I think you're stuck,
02:24:50.560 with some concept,
02:24:51.540 of original sin,
02:24:52.480 no matter how you think,
02:24:53.540 I mean,
02:24:53.780 I see that,
02:24:54.340 in the atheistic,
02:24:55.180 environmentalist types,
02:24:56.300 and I know,
02:24:56.840 all environmentalist types,
02:24:58.080 aren't atheistic,
02:24:58.940 and I know,
02:24:59.480 that all environmentalist types,
02:25:00.780 aren't reprehensible either,
02:25:02.140 but there's a nice,
02:25:03.300 selection of atheist,
02:25:05.020 environmental types,
02:25:05.960 who are reprehensible,
02:25:07.320 and they're,
02:25:07.820 yeah,
02:25:08.560 yeah,
02:25:08.780 yeah,
02:25:09.020 they're usually the ones,
02:25:10.280 that say,
02:25:10.680 that the world,
02:25:11.180 would be better off,
02:25:12.120 if there were fewer people,
02:25:13.200 on it,
02:25:13.980 which is not,
02:25:14.700 a sentiment,
02:25:15.280 that I find,
02:25:16.280 particularly attractive,
02:25:18.080 and it's also one,
02:25:23.180 when I meet someone,
02:25:23.180 who utters that,
02:25:24.240 I always think,
02:25:25.180 two things,
02:25:26.300 one is,
02:25:27.180 well,
02:25:27.400 why are you still here,
02:25:29.280 then,
02:25:29.800 that's the first one,
02:25:31.600 and the second one is,
02:25:34.100 just exactly,
02:25:36.700 who is it,
02:25:37.320 that you're planning,
02:25:38.080 on getting rid of,
02:25:39.200 and how exactly,
02:25:40.220 would you go about,
02:25:41.180 doing that,
02:25:41.580 if you had the opportunity,
02:25:44.580 so,
02:25:45.360 I don't find that,
02:25:47.280 I don't find that,
02:25:50.160 particularly admirable,
02:25:51.440 and I don't think,
02:25:52.280 that there's any sympathy,
02:25:53.560 in it,
02:25:54.320 you know,
02:25:54.600 because I think,
02:25:55.420 that we should have,
02:25:56.800 some sympathy,
02:25:57.560 for ourselves,
02:25:58.360 that's rule two,
02:25:59.640 is you should treat yourself,
02:26:00.900 like someone you're,
02:26:02.200 responsible for helping,
02:26:03.680 or rule three,
02:26:04.820 which is,
02:26:05.180 you should make friends,
02:26:06.660 with people,
02:26:07.100 who want the best for you,
02:26:08.260 it's like,
02:26:09.560 corrupt,
02:26:10.380 and useless,
02:26:11.080 as you are,
02:26:12.400 you do have a hard lot,
02:26:15.260 you know,
02:26:15.660 and so there's some reason,
02:26:16.960 for sympathy,
02:26:18.620 and to say,
02:26:19.440 that human beings,
02:26:20.220 are nothing,
02:26:20.780 but the spoilers,
02:26:21.660 of the planet,
02:26:22.680 is to miss,
02:26:23.660 half the story,
02:23:25.140 which is that as fast,
02:26:26.440 as we're trying,
02:26:27.440 to kill mother nature,
02:26:28.960 she is returning,
02:26:30.300 the favor,
02:26:31.040 in spades,
02:26:32.620 so,
02:26:34.360 that doesn't mean,
02:26:35.480 that we should be foolish,
02:26:36.320 about it,
02:26:37.020 you know,
02:26:37.300 and a certain balance,
02:26:38.440 has to be attained,
02:26:39.900 but it's nice,
02:26:41.160 to look at both sides,
02:26:42.500 of the equation,
02:26:43.340 before you,
02:26:44.620 lay out too much judgment,
02:26:45.880 you know,
02:26:46.500 back in the late 1800s,
02:26:48.860 Thomas Huxley,
02:26:51.580 who was Aldous Huxley's,
02:26:52.880 grandfather,
02:26:52.880 and also,
02:26:53.680 a great defender,
02:26:54.480 of Darwin,
02:26:55.220 was commissioned,
02:26:56.340 by the British Parliament,
02:26:58.120 to do a study,
02:26:59.120 on oceanic resources,
02:27:01.000 because there was,
02:27:01.640 some concern,
02:27:02.200 at that point,
02:27:02.860 that human beings,
02:27:03.940 might be overfishing,
02:27:06.120 and his conclusion,
02:27:07.800 was,
02:27:08.400 the oceans,
02:27:09.120 were so bountiful,
02:27:10.200 and plentiful,
02:27:10.880 and human beings,
02:27:11.820 so,
02:27:12.720 comparatively small,
02:27:15.340 in number,
02:27:15.820 and powerless,
02:27:16.500 that there wasn't,
02:27:17.060 a hope,
02:27:17.680 in hell,
02:27:18.300 that we would,
02:27:18.920 ever be able,
02:27:19.740 to put a dent,
02:27:20.580 in the,
02:27:22.320 vast,
02:27:23.400 resources,
02:27:24.020 of the ocean,
02:27:24.840 that was only,
02:27:26.000 at the beginning,
02:27:26.760 of the 20th century,
02:27:28.620 you know,
02:27:28.860 we didn't get,
02:27:29.560 to the point,
02:27:30.300 where we could,
02:27:31.520 harvest,
02:27:32.400 on an industrial scale,
02:27:33.680 until after,
02:27:34.440 World War II,
02:27:35.160 you know,
02:27:35.880 that's only about,
02:27:36.640 70 years ago,
02:27:38.120 and so it's only been,
02:27:39.340 50 years say,
02:27:40.560 maybe a bit more,
02:27:41.540 60 years,
02:27:42.280 maybe since 1960,
02:27:43.680 that we woke up,
02:27:44.600 to the fact,
02:27:45.200 that some of our actions,
02:27:46.440 had now become,
02:27:47.120 powerful enough,
02:27:47.800 to be considered,
02:27:48.560 on a global scale,
02:27:49.900 and that's within,
02:27:50.600 the lifetime,
02:27:51.340 that's within my lifetime,
02:27:52.640 that's within the lifetime,
02:27:53.540 of a single person,
02:27:54.860 I don't think,
02:27:55.420 we've done such a bad job,
02:27:56.620 of waking up,
02:27:57.400 since then,
02:27:58.260 and starting to understand,
02:27:59.560 that you know,
02:27:59.960 maybe we have a,
02:28:00.980 larger scale,
02:28:02.560 moral obligation,
02:28:03.760 than we realized before,
02:28:04.920 that's proportionate,
02:28:05.900 to our technological power,
02:28:07.640 it's another place,
02:28:08.500 where a little bit,
02:28:08.960 of sympathy,
02:28:09.380 might be in order,
02:28:11.140 you know,
02:28:11.520 and I mean,
02:28:12.580 LA is a lot cleaner,
02:28:13.760 than it used to be,
02:28:14.480 in terms of its air quality,
02:28:15.820 and so is London,
02:28:16.600 and you know,
02:28:17.440 we've made a lot of progress,
02:28:18.920 I would say,
02:28:19.540 in a relatively short time,
02:28:20.860 trying to clean up,
02:28:21.820 the mess that we made,
02:28:22.900 when we were trying,
02:28:23.560 not to die painfully young,
02:28:26.640 you know,
02:28:27.160 so,
02:28:28.240 all right,
02:28:29.880 well back to this ethic,
02:28:31.140 so,
02:28:31.920 I'll tell you another study,
02:28:34.380 that I really like,
02:28:35.140 that's really cool,
02:28:36.300 so,
02:28:36.880 this is a study,
02:28:38.120 that was done,
02:28:38.880 by a guy named,
02:28:39.560 Jaak Panksepp,
02:28:40.480 and he did it with rats,
02:28:42.100 and,
02:28:43.020 now and then,
02:28:43.700 I love reading,
02:28:44.600 animal experimental work,
02:28:48.660 if you want to study psychology,
02:28:48.660 that's what you should read,
02:28:49.840 you should just read,
02:28:50.540 animal experimental work,
02:28:51.800 because those people,
02:28:53.320 it's not all good,
02:28:54.140 but some of it's really good,
02:28:55.360 and some of the people,
02:28:56.280 who've done it,
02:28:56.740 they were real scientists,
02:28:58.120 and this guy Panksepp,
02:28:59.540 he's one of them,
02:29:00.680 there's another guy,
02:29:01.320 named Jeffrey Gray,
02:29:02.540 who wrote a book called,
02:29:03.300 The Neuropsychology of Anxiety,
02:29:04.640 which will take you like,
02:29:05.300 eight months to read,
02:29:06.460 it's a really hard book,
02:29:08.080 but it's brilliant,
02:29:09.540 anyways,
02:29:10.040 Panksepp has done a really good job,
02:29:11.700 of laying out the fundamental,
02:29:13.440 motivational systems,
02:29:14.740 in their biology,
02:29:15.660 I'll just tell you,
02:29:16.500 a brief story about that,
02:29:17.620 the American Psychological Association,
02:29:19.840 just came out with its guidelines,
02:29:21.280 for the treatment of men and boys,
02:29:22.780 it's actually,
02:29:23.520 it's like,
02:29:24.980 it's not,
02:29:25.520 they're not guidelines,
02:29:26.460 they're not for treatment,
02:29:27.980 and they're certainly not,
02:29:28.800 for the improvement,
02:29:29.600 of the health of men and boys,
02:29:30.960 it's a,
02:29:31.620 it's an absolutely reprehensible,
02:29:33.640 ideological screed,
02:29:35.020 on how psychologists,
02:29:36.580 have to think politically,
02:29:37.960 so they won't be punished,
02:29:39.600 by those who accredit them,
02:29:41.340 that's it fundamentally,
02:29:42.780 but one of the claims they made,
02:29:45.420 they made two claims,
02:29:46.340 that are,
02:29:46.780 that are beyond comprehension,
02:29:48.720 to me,
02:29:49.720 the first one was that,
02:29:52.900 aggression is socialized,
02:29:55.440 so that's the first claim,
02:29:56.900 and the second claim is that,
02:29:59.220 boys are socialized,
02:30:00.640 into aggression,
02:30:01.600 by men,
02:30:03.000 okay,
02:30:03.600 so let's look at those,
02:30:04.720 this relates to Panksepp,
02:30:07.160 so Panksepp outlined,
02:30:08.540 a bunch of biological circuits,
02:30:10.080 so human beings,
02:30:11.960 like mammals,
02:30:14.080 but also like,
02:30:15.520 even more,
02:30:17.300 what,
02:30:17.860 archaic animals,
02:30:19.680 speaking evolutionarily,
02:30:21.100 have a variety,
02:30:22.040 of fundamental biological circuits,
02:30:23.760 so I can tell you,
02:30:24.720 some of them,
02:30:25.460 some of them are obvious,
02:30:26.420 and some of them,
02:30:26.880 are somewhat surprising,
02:30:28.560 you have a circuit for pain,
02:30:30.760 well there's no surprise,
02:30:32.500 you have a circuit for anxiety,
02:30:35.560 that circuit,
02:30:35.560 you know,
02:30:35.780 it's a metaphor,
02:30:36.360 obviously,
02:30:37.380 you have a biological system,
02:30:38.780 that mediates anxiety,
02:30:40.280 all of you have it,
02:30:41.400 animals have it,
02:30:43.020 you have one for,
02:30:45.740 something called,
02:30:46.540 incentive reward,
02:30:48.060 and that's what moves you,
02:30:49.240 towards valued goals,
02:30:50.640 that's basically associated,
02:30:52.060 with positive emotion,
02:30:53.320 you have one that satisfies you,
02:30:55.140 when you consume,
02:30:56.300 that's a consummatory reward system,
02:30:59.880 lust,
02:31:01.240 that's another one,
02:31:02.460 thirst,
02:31:03.120 hunger,
02:31:04.600 play,
02:31:05.740 that's cool,
02:31:06.360 that's a fundamental circuit,
02:31:07.880 Panksepp discovered that,
02:31:09.700 there's a circuit,
02:31:12.320 for care,
02:31:13.860 that's another one,
02:31:14.980 and there's a circuit,
02:31:15.820 for rage,
02:31:16.900 that's not all of them,
02:31:17.760 but that's a good,
02:31:18.360 that's a good start,
02:31:19.800 rage,
02:31:20.820 it's there,
02:31:21.820 right from the beginning,
02:31:23.480 if you do facial expression,
02:31:25.520 coding analysis,
02:31:26.680 of infants,
02:31:27.860 say an infant learns,
02:31:28.820 to recognize its mother,
02:31:30.700 and then,
02:31:31.420 that's around,
02:31:32.180 nine months often,
02:31:33.340 or even earlier than that,
02:31:35.040 so,
02:31:36.100 a person comes into,
02:31:37.160 the baby's room,
02:31:38.080 and the baby starts to cry,
02:31:39.800 because that person,
02:31:40.800 isn't,
02:31:41.920 the mother,
02:31:42.620 and you think,
02:31:43.120 oh,
02:31:43.280 the baby's sad,
02:31:44.120 it's like,
02:31:44.440 no,
02:31:44.680 the baby isn't sad,
02:31:46.040 the baby is angry,
02:31:47.800 that's why it's turning red,
02:31:49.300 and if you do,
02:31:49.860 facial expression coding,
02:31:51.480 it's like,
02:31:52.120 the baby is actually,
02:31:53.080 the baby is cursing,
02:31:55.140 internally,
02:31:58.820 you already know this,
02:32:00.420 because you know,
02:32:01.120 that like,
02:32:01.480 a two-year-old,
02:32:02.060 isn't much older,
02:32:02.700 than a baby,
02:32:03.680 and two-year-olds,
02:32:04.400 have temper tantrums,
02:32:05.280 and it's not,
02:32:05.720 because they're sad,
02:32:06.980 as you can tell,
02:32:08.080 perfectly well,
02:32:08.740 if you just watch,
02:32:09.740 a two-year-old,
02:32:10.600 have a temper tantrum,
02:32:11.680 it's clear,
02:32:12.480 that they're completely,
02:32:13.540 possessed,
02:32:14.420 by rage,
02:32:15.760 and that rage circuit,
02:32:17.020 it's exactly right,
02:32:18.120 that rage circuit is active,
02:32:19.540 even before the fear circuit,
02:32:21.180 is active,
02:32:21.880 it's activated very early,
02:32:23.300 and one of the things,
02:32:24.520 that's the case,
02:32:25.260 is that some children,
02:32:26.680 are much more aggressive,
02:32:28.100 than others,
02:32:29.240 right from the beginning,
02:32:30.480 and most of those,
02:32:31.600 are boys,
02:32:33.060 and not all of them,
02:32:34.180 but most of them,
02:32:34.900 and if you take,
02:32:35.580 two-year-olds,
02:32:36.200 and you group them together,
02:32:38.000 you find,
02:32:38.640 if you take,
02:32:39.240 one-year-olds,
02:32:39.940 two-year-olds,
02:32:40.540 three-year-olds,
02:32:41.100 four-year-olds,
02:32:41.580 all the way up to 16,
02:32:42.620 and you group them together,
02:32:43.880 they don't know each other,
02:32:44.900 and then you count,
02:32:45.940 aggressive actions,
02:32:48.200 the two-year-olds,
02:32:48.960 are by far,
02:32:49.500 the most aggressive bunch,
02:32:51.180 they kick,
02:32:51.760 and hit,
02:32:52.140 and bite,
02:32:52.620 and steal,
02:32:53.560 not all of them,
02:32:54.920 but a good chunk of them,
02:32:56.800 and so,
02:32:58.160 whereas 16-year-olds,
02:32:59.180 don't do that,
02:32:59.740 when you put them together,
02:33:00.640 and look,
02:33:01.020 you're all here,
02:33:01.820 and you're not 16,
02:33:02.660 you're like 30,
02:33:03.780 and you're not doing any of that,
02:33:05.740 and so,
02:33:06.280 two-year-olds,
02:33:07.020 man,
02:33:07.300 they're aggressive little monsters,
02:33:09.080 and it's okay,
02:33:10.140 because they're small,
02:33:11.040 and soft,
02:33:11.780 and what the hell,
02:33:12.620 can they do,
02:33:13.220 you know,
02:33:13.780 so,
02:33:14.580 it's not like,
02:33:15.940 you know,
02:33:16.920 it's not blood warfare,
02:33:19.620 among two-year-olds,
02:33:20.440 but it's just because,
02:33:21.060 they don't have the sophistication,
02:33:22.420 and the weapons,
02:33:23.240 they've got,
02:33:24.120 they've got the motivation,
02:33:27.660 and it's really true,
02:33:29.580 it's really true for a subset of them,
02:33:31.440 so about 5% of two-year-old boys,
02:33:34.080 are hyper-aggressive,
02:33:36.080 but,
02:33:36.600 what's cool,
02:33:37.300 is that almost all of them,
02:33:38.760 are socialized,
02:33:40.500 into civilized behavior,
02:33:42.520 by the time they're four,
02:33:44.020 and you can define civilized behavior,
02:33:46.440 actually quite nicely,
02:33:47.700 using a Piagetian definition,
02:33:49.880 is that other children,
02:33:50.820 will play with them,
02:33:52.700 and that's rule five,
02:33:54.120 by the way,
02:33:55.100 don't let your children,
02:33:56.140 do anything that makes you dislike them,
02:33:57.900 what's your job,
02:33:58.820 if you're a parent,
02:33:59.580 your job as a parent,
02:34:00.560 is to socialize your children,
02:34:01.860 so that by the age of three,
02:34:03.160 other children will play with them,
02:34:05.300 because that means,
02:34:06.080 they've learned how to engage,
02:34:07.220 in reciprocal social interactions,
02:34:09.200 and that'll start to spiral upward,
02:34:10.960 so if your child,
02:34:12.140 is acceptable as a playmate,
02:34:13.580 by the age of three,
02:34:14.980 age of four is the limit,
02:34:16.600 by the way,
02:34:17.080 so you better have it together,
02:34:18.300 by then,
02:34:19.820 otherwise they get alienated,
02:34:21.160 and isolated,
02:34:21.700 and they don't make friends,
02:34:22.540 and then they never recover from that,
02:34:24.140 it's not,
02:34:24.760 it's not good,
02:34:25.940 it's not good,
02:34:27.020 so most kids are socialized,
02:34:28.740 even the hyper-aggressive ones,
02:34:29.900 are socialized,
02:34:30.820 into acceptable playmates,
02:34:33.900 by the age of four,
02:34:35.420 and it's men,
02:34:37.520 in large part,
02:34:38.440 that do that,
02:34:39.960 now,
02:34:40.500 what's the evidence for that,
02:34:41.940 well,
02:34:42.080 there's lots of evidence for it,
02:34:43.420 the evidence among mammals,
02:34:45.320 is the use of rough and tumble play,
02:34:46.880 for example,
02:34:47.500 among males,
02:34:48.820 and their offspring,
02:34:50.260 in order to socialize,
02:34:52.000 and civilize them,
02:34:52.700 but the evidence among human beings,
02:34:55.040 is that,
02:34:56.200 where do you get aggressive teenagers,
02:35:00.120 what sort of families,
02:35:01.900 produce aggressive teenagers,
02:35:04.060 fatherless families,
02:35:08.200 right,
02:35:08.660 so let's think about that,
02:35:09.800 with regards to what the APA said,
02:35:12.220 okay,
02:35:12.660 because what they said,
02:35:13.820 was the opposite of the truth,
02:35:15.780 wasn't even a lie,
02:35:17.220 wasn't even wrong,
02:35:18.820 it was,
02:35:19.580 they took the truth,
02:35:20.840 and then they claimed the opposite,
02:35:22.920 on two dimensions,
02:35:23.960 number one,
02:35:25.100 aggression isn't socialized,
02:35:26.660 civilization is socialized,
02:35:30.440 number two,
02:35:32.180 if men were responsible,
02:35:33.940 for the creation of aggression,
02:35:35.440 among boys,
02:35:36.800 then fatherless families,
02:35:38.260 would produce boys,
02:35:39.320 that were more peaceful,
02:35:40.920 and they don't,
02:35:42.560 so what the hell,
02:35:44.440 fundamentally,
02:35:45.080 all right,
02:35:56.700 so,
02:35:58.100 you have the behavioral pattern,
02:36:00.340 that characterizes,
02:36:01.180 civilized behavior,
02:36:02.220 among wolves,
02:36:03.000 or civilized behavior,
02:36:03.960 among chimpanzees,
02:36:04.960 or civilized behavior,
02:36:06.140 among rats,
02:36:06.780 I'll tell you the rest,
02:36:07.480 of the rat story,
02:36:08.360 so,
02:36:09.460 rats like to engage,
02:36:11.000 in rough and tumble play,
02:36:12.600 especially juveniles,
02:36:13.640 especially juvenile males,
02:36:15.120 and we know they like it,
02:36:16.440 because they'll work to do it,
02:36:18.240 so,
02:36:20.580 if a rat has been somewhere,
02:36:22.200 and he got to play,
02:36:23.580 and then you put him back there,
02:36:25.040 and you make him press a bar,
02:36:26.240 to open the door,
02:36:26.900 so he can go into where he played,
02:36:28.180 he'll press that bar like mad,
02:36:29.500 and so,
02:36:30.040 that's how you infer,
02:36:31.560 motivation among rats,
02:36:33.320 will the rat work to play,
02:36:35.380 you will,
02:36:36.520 right,
02:36:37.520 which is why you'll pay,
02:36:39.220 for tickets to a basketball game,
02:36:40.860 hell,
02:36:41.140 you'll work to watch people play,
02:36:43.640 right,
02:36:46.560 that's very interesting,
02:36:48.020 it's absolutely,
02:36:49.180 like,
02:36:49.400 that's so important,
02:36:50.960 it's almost unbelievable,
02:36:52.640 that you will do that,
02:36:54.220 I mean,
02:36:54.520 it,
02:36:54.840 because the question is,
02:36:56.020 well,
02:36:56.180 why,
02:36:57.100 that's the first question,
02:36:58.420 well,
02:36:58.700 seriously,
02:36:59.320 it's like,
02:36:59.660 what the hell,
02:37:00.580 what are you watching,
02:37:01.340 a bunch of pituitary cases,
02:37:02.780 go,
02:37:03.260 you know,
02:37:03.600 bang a ball,
02:37:04.320 so that they can throw it through a hoop,
02:37:05.960 and like,
02:37:06.320 you'll pay,
02:37:06.960 outrageous,
02:37:07.560 outrageous sums of money,
02:37:10.800 to do that,
02:37:11.480 it's like,
02:37:11.780 you're not even throwing the damn ball,
02:37:13.980 you're just like,
02:37:14.520 watching them do it,
02:37:15.860 can I have the ball,
02:37:16.960 no,
02:37:17.640 well,
02:37:17.840 I paid $200 for this ticket,
02:37:20.080 they don't get the ball,
02:37:21.960 it's like,
02:37:22.320 you don't get to play,
02:37:23.200 okay,
02:37:23.500 well,
02:37:23.640 I'll just watch,
02:37:24.840 it's like,
02:37:26.180 so that's how important it is,
02:37:27.700 that's how important play is,
02:37:28.920 that you'll,
02:37:29.760 you'll work to make money,
02:37:31.860 to buy tickets,
02:37:32.540 to watch people play,
02:37:34.260 right,
02:37:34.580 man,
02:37:35.180 that's crazy,
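The work-to-play inference described here, judging how much an animal values something by how hard it will press a bar for it, can be sketched as a progressive-ratio measure. This is a minimal illustration with made-up numbers, not data from Panksepp's experiments:

```python
# Sketch of inferring motivation from willingness to work ("work for play").
# Hypothetical schedule: each reward costs twice as many lever presses as the
# last, and the animal keeps responding until the cost exceeds its limit.

def breakpoint_ratio(max_effort: int) -> int:
    """Highest response requirement the animal completes before giving up.
    Progressive-ratio schedule: 1, 2, 4, 8, ... presses per reward."""
    requirement = 1
    last_completed = 0
    while requirement <= max_effort:
        last_completed = requirement   # animal completes this requirement
        requirement *= 2               # next reward costs twice as many presses
    return last_completed

# A rat willing to emit up to 50 presses for one bout of play:
print(breakpoint_ratio(50))  # -> 32
```

The higher the breakpoint, the stronger the inferred motivation; this is the same logic as "will the rat work to play."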
02:37:36.680 so,
02:37:36.880 but it's fundamental,
02:37:39.260 and you think about,
02:37:39.940 how much of our entertainment,
02:37:41.360 is associated with exactly that,
02:37:43.480 think,
02:37:43.840 well,
02:37:43.980 we're just doing something random,
02:37:45.660 or is there something important,
02:37:47.000 going on there,
02:37:47.640 I mean,
02:37:47.820 it doesn't look that important,
02:37:49.580 you know,
02:37:49.800 in some sense,
02:37:50.580 it's easy to be cynical about it,
02:37:52.020 but it's foolish,
02:37:52.840 because it's crucially important,
02:37:54.900 the fact that we're so wired up,
02:37:57.080 to admire fair play,
02:37:59.460 that we'll pay for it,
02:38:01.460 we'll pay for the right,
02:38:02.360 just to watch it vicariously,
02:38:05.180 that's a testament,
02:38:06.060 to the degree,
02:38:07.460 to which we're civilized,
02:38:09.000 and social,
02:38:09.800 because a game,
02:38:10.840 is something,
02:38:11.380 that's civilized,
02:38:12.180 and social,
02:38:12.820 back to Piaget,
02:38:14.320 Piaget believed,
02:38:15.420 that most socialization,
02:38:16.720 occurred as a consequence,
02:38:18.260 of integrating,
02:38:20.300 these underlying motivational systems,
02:38:22.060 into iterable games,
02:38:24.320 and that's what you're doing,
02:38:25.100 with your kids,
02:38:25.720 right,
02:38:27.420 when you socialize your kids,
02:38:28.640 when you teach them,
02:38:29.340 how to take turns,
02:38:30.320 when you teach them,
02:38:31.080 how to play a game,
02:38:32.760 then what you're doing,
02:38:33.880 is that you're socializing them,
02:38:36.060 into iterated,
02:38:38.800 reciprocal interactions,
02:38:40.080 with other people,
02:38:44.040 and that's the fundamental aspect,
02:38:45.680 of the ethic,
02:38:45.680 that's how an ethic,
02:38:46.580 emerges from the bottom up,
02:38:48.320 it emerges in games,
02:38:49.520 that was Piaget's,
02:38:50.400 fundamental observation,
02:38:51.720 much better,
02:38:52.340 than the Freudian hypothesis,
02:38:53.700 the Freudian hypothesis,
02:38:54.620 was basically,
02:38:55.540 that human beings,
02:38:56.500 learn to inhibit their aggression,
02:38:58.580 that's not Piaget's model,
02:39:00.020 Piaget's model is,
02:39:00.980 no,
02:39:01.140 no,
02:39:01.220 no,
02:39:01.380 you integrate the aggression,
02:39:02.940 into a higher order game,
02:39:04.420 and you know this,
02:39:05.120 because you want an athlete,
02:39:06.420 a good athlete,
02:39:07.100 is someone who's got that aggression,
02:39:08.800 but it's directed,
02:39:09.860 right,
02:39:10.040 it's not random,
02:39:11.140 in fact,
02:39:11.600 if you see an athlete,
02:39:12.700 manifest random aggression,
02:39:14.820 you're not happy about that,
02:39:16.400 right,
02:39:16.760 maybe you admire them,
02:39:17.820 because they're really competitive,
02:39:19.680 highly skilled,
02:39:20.520 really competitive,
02:39:21.580 they're goal driven,
02:39:22.620 they want to win,
02:39:23.480 but it's all focused,
02:39:24.540 on the goal,
02:39:25.560 and,
02:39:26.060 you know,
02:39:26.320 they're cooperating,
02:39:27.000 with their teammates,
02:39:27.880 and they're not hogging the ball,
02:39:29.840 they're also,
02:39:30.500 facilitating the development,
02:39:32.640 of their teammates,
02:39:33.280 and they don't cheat,
02:39:34.660 they don't break the rules,
02:39:35.600 so they're cooperating,
02:39:36.460 even with the,
02:39:37.300 their adversaries,
02:39:38.660 because they're all playing basketball,
02:39:40.100 and they're all playing by the rules,
02:39:41.260 it's all cooperative,
02:39:43.100 that's an ethic,
02:39:44.160 it's an emergent ethic,
02:39:46.060 and you know,
02:39:46.600 the things that we do,
02:39:47.600 what we do in our lives,
02:39:49.280 outside of the game,
02:39:50.700 is very game-like,
02:39:52.160 we engage in a cooperative,
02:39:54.540 and competitive ethic,
02:39:56.180 and we know the good players,
02:39:57.720 we know the good sports,
02:40:00.400 and that's the bottom-up emergence,
02:40:01.840 of an ethic,
02:40:03.340 it's also the solution,
02:40:04.900 to the post-modern conundrum,
02:40:06.640 as far as I'm concerned,
02:40:07.500 or part of the solution,
02:40:08.660 because the post-modernists claim,
02:40:11.220 that there's an infinite number of ways,
02:40:12.380 of looking at the world,
02:40:14.180 and there is,
02:40:15.560 and that no way,
02:40:17.080 of looking at the world,
02:40:17.920 is better than any other way,
02:40:19.220 which is wrong,
02:40:20.260 the first part's right,
02:40:21.380 because the world's very complicated,
02:40:22.600 the second part's wrong,
02:40:24.260 there aren't that many ways,
02:40:25.660 of playing a game properly,
02:40:26.820 and you can tell that too,
02:40:28.720 because even if you watch kids play,
02:40:30.540 if you go out,
02:40:31.060 and you watch your kids play,
02:40:32.240 you can tell which kids are good sports,
02:40:33.940 and which ones aren't,
02:40:35.680 right,
02:40:36.020 that's easy,
02:40:36.820 and the kids can tell too,
02:40:38.660 and so you can tell when,
02:40:40.120 it doesn't matter what the game is,
02:40:41.720 and if you're a good sport,
02:40:43.280 it's the same across games,
02:40:45.100 which is really,
02:40:46.380 another indication,
02:40:47.400 of the emergence,
02:40:48.180 of a transcendent ethic,
02:40:50.600 right,
02:40:51.880 the concept of the good sportsman,
02:40:54.200 is independent,
02:40:55.700 of the game,
02:40:57.120 right,
02:40:58.560 and it's very much,
02:40:59.900 what it is,
02:41:00.980 that you want,
02:41:02.520 it's the dawning,
02:41:03.900 of the behaviors,
02:41:04.560 that you want,
02:41:05.360 from someone,
02:41:05.920 who's sophisticated,
02:41:07.020 and reciprocal,
02:41:08.100 in their day-to-day life,
02:41:10.900 and it's by no means arbitrary,
02:41:13.700 it's in fact,
02:41:14.460 very tightly constrained,
02:41:15.440 back to the rats,
02:41:16.460 so rats like to wrestle,
02:41:18.840 and so you can get them,
02:41:20.180 to work to wrestle,
02:41:21.040 and so they work,
02:41:21.680 and then the door opens,
02:41:22.580 and out they go,
02:41:23.320 and there's rat A,
02:41:24.400 and rat B,
02:41:25.020 and rat A,
02:41:25.760 is 10% bigger,
02:41:26.860 than rat B,
02:41:27.860 and so they have,
02:41:28.340 a little wrestling match,
02:41:29.280 and rat A,
02:41:29.920 who's the bigger rat,
02:41:30.760 the more powerful rat,
02:41:31.720 pins rat B,
02:41:33.180 and then the,
02:41:34.940 post-modern,
02:41:36.360 social scientists,
02:41:37.540 observing,
02:41:38.040 derive their conclusion,
02:41:41.080 power produces dominance,
02:41:43.660 and they publish it,
02:41:45.240 but they're stupid,
02:41:46.760 and so,
02:41:47.320 so then you have,
02:41:49.260 a smart scientist,
02:41:50.120 and he thinks,
02:41:50.540 wait a second,
02:41:51.540 you don't just wrestle once,
02:41:52.920 if you're a rat,
02:41:53.740 you wrestle,
02:41:54.480 a bunch of times,
02:41:56.200 and so what happens,
02:41:56.860 if you get the rats,
02:41:57.780 to wrestle,
02:41:58.400 more than once,
02:42:00.180 because that's the issue,
02:42:01.540 unless,
02:42:02.540 see,
02:42:03.160 and this is the difference,
02:42:04.040 between a psychopath,
02:42:05.180 and someone,
02:42:05.640 who's not psychopathic,
02:42:06.760 so,
02:42:07.360 the thing about,
02:42:07.940 being a psychopath,
02:42:08.940 is that it's all for you,
02:42:10.220 and none for someone else,
02:42:11.820 but that only works once,
02:42:13.760 or twice,
02:42:15.320 you know,
02:42:15.880 depending on,
02:42:16.920 the naivety,
02:42:17.640 of your target,
02:42:18.320 and if you're a psychopath,
02:42:19.480 you have to move,
02:42:20.820 from place to place,
02:42:22.660 because people catch on,
02:42:24.020 to your,
02:42:24.900 lack of ability,
02:42:25.880 to play fair,
02:42:27.120 and they don't get fooled,
02:42:28.240 again,
02:42:28.600 and so,
02:42:29.580 psychopaths drift,
02:42:30.760 from place to place,
02:42:32.520 and they can,
02:42:33.520 find their mark,
02:42:34.560 and their victim,
02:42:35.320 but,
02:42:35.820 they can't do it,
02:42:36.560 in any stable manner,
02:42:37.760 it's not,
02:42:38.380 a good solution,
02:42:40.940 because people,
02:42:41.600 track reputation,
02:42:42.960 and we're really good,
02:42:44.060 at that,
02:42:44.380 in fact,
02:42:44.820 there are,
02:42:45.200 evolutionary psychologists,
02:42:46.620 who believe,
02:42:47.120 that we have a module,
02:42:48.160 for tracking,
02:42:49.580 reputation,
02:42:50.380 a cheating detection module,
02:42:52.080 and that you don't have,
02:42:53.060 to activate that,
02:42:54.160 very often,
02:42:54.740 before,
02:42:55.760 people will remember,
02:42:57.020 forever,
02:42:57.600 that you did,
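The reputation dynamic described here — exploitation pays once, after which partners refuse to play — can be sketched as a toy simulation. The payoff numbers and the all-or-nothing memory rule below are illustrative assumptions, not anything from the lecture:

```python
# Toy model of reputation tracking in repeated exchanges (illustrative only).
# A cooperator splits the pie fairly every round; a defector grabs the whole
# pie once, after which the partner remembers and refuses all future play.

def play_rounds(strategy, rounds):
    """Return total payoff for `strategy` against one partner over `rounds`.
    strategy: 'cooperate' (split a 20-point pie) or 'defect' (take all 20 once)."""
    payoff = 0
    trusted = True
    for _ in range(rounds):
        if not trusted:
            break  # the partner stopped playing: no further payoff for anyone
        if strategy == "cooperate":
            payoff += 10   # fair half of the 20-point pie, every round
        else:
            payoff += 20   # grab the whole pie...
            trusted = False  # ...and get remembered forever
    return payoff

# Over many rounds, fair play dominates the one-shot grab.
print(play_rounds("cooperate", 50))  # 500
print(play_rounds("defect", 50))     # 20
```

The one-shot grab only beats cooperation if the game never repeats, which is exactly why, in this picture, the psychopathic strategy requires drifting from place to place.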
02:42:59.360 so,
02:43:01.580 you got the big rat,
02:43:02.480 and the little rat,
02:43:03.040 now the little rat's lost,
02:43:04.620 now you separate them,
02:43:06.280 and then you let them,
02:43:07.200 come back,
02:43:07.520 and play another day,
02:43:08.820 and they still want to,
02:43:09.920 even the little rat,
02:43:10.780 wants to,
02:43:11.580 and so then the little rat,
02:43:12.460 goes out in the play field,
02:43:13.640 and the big rat's there,
02:43:15.700 and the little rat,
02:43:16.900 has to ask the big rat,
02:43:18.100 to play,
02:43:19.080 and you think,
02:43:20.200 well how does a rat,
02:43:20.940 do that,
02:43:21.400 it's like,
02:43:21.680 well you've been in dog parks,
02:43:23.060 and you see what dogs do,
02:43:24.540 maybe you've had dogs,
02:43:25.620 you know how dogs act,
02:43:26.620 when they want to play,
02:43:27.380 they sort of bounce around,
02:43:28.820 they kind of look at you,
02:43:29.960 you know,
02:43:30.260 and they bounce,
02:43:31.220 and if you have any sense,
02:43:32.620 you all laugh,
02:43:33.400 because you recognize that,
02:43:35.580 there's something playful,
02:43:36.500 about that,
02:43:37.080 it's like,
02:43:37.800 I'm looking at you,
02:43:39.100 signaling intent,
02:43:39.960 and this is sort of like,
02:43:40.960 a little dance,
02:43:42.100 and the idea is,
02:43:43.360 why don't you come,
02:43:44.000 and do this with me,
02:43:45.700 and if it's a big dog,
02:43:47.480 and the dog does that,
02:43:48.380 and you kind of whack it,
02:43:49.200 on the side of the head,
02:43:50.020 and it's not really,
02:43:51.360 that you hit it,
02:43:52.080 and the dog knows,
02:43:52.760 and it sort of bites you,
02:43:54.400 but not really,
02:43:55.180 because you know,
02:43:56.480 the dog knows how to play,
02:43:57.920 if it had owners that,
02:43:59.500 I could say, weren't psychologists,
02:44:01.840 because the worst dog I ever met,
02:44:04.180 was the dog that a psychologist owned,
02:44:06.380 and it couldn't play at all,
02:44:09.000 it would just bite you,
02:44:10.460 it's like,
02:44:11.160 what,
02:44:12.860 this is your dog,
02:44:14.320 you're gonna have clients,
02:44:15.760 not good,
02:44:18.440 so anyways,
02:44:23.400 the little rat has to ask,
02:44:24.560 the big rat to play,
02:44:25.700 because the big rat's cool,
02:44:26.780 because he won once,
02:44:27.620 so he gets to sit there,
02:44:29.040 and like look cynical,
02:44:30.160 and pretend that he's ignoring,
02:44:31.280 the little rat,
02:44:31.920 but he wants to play too,
02:44:33.440 so the little rat bounces around,
02:44:34.760 and the big rat thinks,
02:44:36.040 he tosses away his cigarette butt,
02:44:37.800 and gets off the street corner,
02:44:39.240 and plays,
02:44:40.820 and because he's 10% bigger,
02:44:43.700 he could pin the little rat,
02:44:44.980 and maybe he does that,
02:44:46.620 but if you pair them repeatedly,
02:44:49.220 and the big rat doesn't let the little rat,
02:44:51.260 win 30% of the time,
02:44:53.000 the little rat stops asking him to play,
02:44:56.020 and that is so cool,
02:44:58.140 I read that,
02:44:59.080 it just,
02:44:59.400 it just blew me away,
02:45:00.940 I thought really,
02:45:01.740 you're kidding,
02:45:02.300 you're kidding,
02:45:03.100 there's an emergent,
02:45:04.560 ethic of fair play,
02:45:07.340 in wrestling rats,
02:45:08.860 that's how fundamental,
02:45:11.480 that ethic is,
02:45:12.760 I mean they're rats,
02:45:13.880 for Christ's sake,
02:45:14.600 it's not like they're the world's,
02:45:15.860 most ethical animal,
02:45:16.940 right,
02:45:17.520 they're rats,
02:45:19.300 and still,
02:45:20.180 they have to play fair,
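The handicapping result from the rat studies — the larger rat must concede roughly 30% of the bouts or the smaller rat stops asking to play — can be illustrated with a small deterministic model. The quitting rule, bout counts, and check interval below are assumptions for illustration, not parameters from the research Peterson cites:

```python
# Toy model of the rat-wrestling finding (illustrative assumptions throughout):
# the smaller rat keeps initiating play only while its share of wins stays
# at or above the ~30% figure cited in the lecture.

def bouts_played(little_win_share, max_bouts=200, min_share=0.3, check_after=10):
    """Count bouts before the smaller rat quits.
    little_win_share: fraction of bouts the bigger rat concedes.
    The little rat quits once its cumulative win share drops below min_share
    (checked only after check_after bouts, so early noise is ignored)."""
    little_wins = 0
    for bout in range(1, max_bouts + 1):
        # the big rat concedes just often enough to hold the agreed share
        if little_wins < little_win_share * bout:
            little_wins += 1
        if bout >= check_after and little_wins / bout < min_share:
            return bout  # the little rat stops asking to play
    return max_bouts

# A big rat that never loses kills the game fast; one that concedes
# ~35% of bouts keeps the game going for as long as we simulate.
print(bouts_played(0.0))   # 10
print(bouts_played(0.35))  # 200
```

The point of the sketch is only that the iterated game, not any single bout, is what the 30% concession preserves.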
02:45:23.280 so,
02:45:24.520 what was I doing,
02:45:27.680 in Maps of Meaning,
02:45:29.000 and in 12 Rules for Life,
02:45:30.440 well,
02:45:32.160 we're motivated creatures,
02:45:34.420 each of those motivations,
02:45:36.300 has to pursue its own goal,
02:45:37.840 because otherwise we die,
02:45:39.900 but then those motivations,
02:45:41.060 have to get integrated,
02:45:43.420 within you,
02:45:44.200 you kind of got that,
02:45:45.240 more or less under control,
02:45:46.400 by the time you're about two,
02:45:48.380 you start to become,
02:45:50.260 something approximating,
02:45:52.520 the integration of your motivations,
02:45:54.860 but that hasn't happened socially yet,
02:45:58.280 that happens between two and four,
02:46:00.160 you have to integrate,
02:46:01.380 that integrated structure,
02:46:02.620 with everyone else,
02:46:03.700 doing the same thing,
02:46:04.980 and how do you do that,
02:46:05.960 well you do that,
02:46:07.120 by learning how to play,
02:46:08.700 how to play fair,
02:46:10.480 and there's a reciprocity,
02:46:12.540 that goes along with that,
02:46:13.260 I'll give you another example,
02:46:15.180 so here's a game,
02:46:16.940 that behavioral,
02:46:18.680 economists have been playing,
02:46:20.120 with people,
02:46:21.020 so here's the game,
02:46:22.120 so you take person A,
02:46:23.120 and you take person B,
02:46:24.020 and you say to person A,
02:46:24.900 look,
02:46:25.120 I'm going to give you a hundred dollars,
02:46:26.640 and you have to share it,
02:46:27.640 with the person next to you,
02:46:28.880 and you can offer them,
02:46:30.300 a fraction of it,
02:46:31.480 whatever fraction you want,
02:46:32.720 and if they take it,
02:46:35.040 you get the hundred dollars,
02:46:36.080 and you split it with them,
02:46:36.800 but if they refuse it,
02:46:38.300 you guys get nothing,
02:46:40.540 so you say,
02:46:41.020 okay,
02:46:41.200 well here's a hundred dollars,
02:46:42.360 how much are you going to offer,
02:46:43.540 the person next to you,
02:46:44.860 now a classical economist would say,
02:46:47.360 well you offer them a buck,
02:46:49.500 well why,
02:46:50.960 well because you're trying to maximize,
02:46:52.220 your own self-interest,
02:46:54.260 because that's what you are,
02:46:56.500 you're going to maximize,
02:46:57.960 your own self-interest,
02:46:59.180 and why would they say no,
02:47:00.800 it's like they get a buck,
02:47:02.320 and that's better than nothing,
02:47:04.660 why would they say no,
02:47:06.560 so you say,
02:47:07.500 well will you take a dollar,
02:47:11.120 and the person next to you says,
02:47:13.560 well they think things they won't say,
02:47:17.020 but what they say is no,
02:47:21.380 but what happens,
02:47:23.260 if you do this experiment,
02:47:24.720 is that people don't do that,
02:47:26.300 they offer 50%,
02:47:28.200 cross-culturally,
02:47:29.500 it's somewhere between 40 and 60%,
02:47:32.200 but it's basically 50,
02:47:34.800 50-50,
02:47:36.420 and then the other person accepts it,
02:47:38.000 and you might think,
02:47:38.500 well that's because,
02:47:39.540 you don't need the money,
02:47:41.860 it's like let's say,
02:47:42.640 you're starving,
02:47:43.440 and you have the hundred dollars,
02:47:44.940 and you offer your,
02:47:46.180 compatriot here a dollar,
02:47:47.500 and like he needs something to eat,
02:47:48.600 so it's like yeah,
02:47:49.160 I'll take the dollar,
02:47:49.800 it's like no,
02:47:50.880 the poor people are more likely,
02:47:52.700 to tell you to go to hell,
02:47:54.120 because along with having no money,
02:47:58.280 they'd like not to have no pride,
02:48:01.040 right,
02:48:02.200 you got to hang on to something,
02:48:04.320 and so you can at least still tell someone,
02:48:06.540 to go to hell when they deserve it,
02:48:08.460 and that's something,
02:48:10.080 and so even in the simple behavioral economist games,
02:48:14.020 you get emergent evidence,
02:48:15.920 for automatic reciprocity,
02:48:17.880 well and why would you offer 50% to the stranger,
02:48:21.260 and the answer is well,
02:48:22.440 because it's a good rule of thumb,
02:48:24.260 you know if you want people,
02:48:25.620 to play with you across time,
02:48:27.720 if you want to engage in as many games as possible,
02:48:31.080 and you want to participate in the ethic properly,
02:48:33.440 then what you're aiming at,
02:48:34.500 is something approximating reciprocity,
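The ultimatum game described above is easy to simulate. The fairness thresholds below are an assumed distribution, chosen only to show why the economist's one-dollar "rational" offer fails once responders are willing to refuse:

```python
# Toy ultimatum game (illustrative; the thresholds are assumptions, not data).
# Each responder rejects any offer below their personal fairness threshold,
# in which case both players get nothing.

def proposer_payoff(offer, thresholds):
    """Average payoff to a proposer offering `offer` out of 100,
    against responders who accept only if offer >= their threshold."""
    accepted = sum(1 for t in thresholds if offer >= t)
    return (100 - offer) * accepted / len(thresholds)

# Responders who demand at least 20-40 out of 100 (assumed distribution).
thresholds = [20, 25, 30, 30, 35, 40]

print(proposer_payoff(1, thresholds))   # the "rational" $1 offer: 0.0
print(proposer_payoff(40, thresholds))  # a near-even split: 60.0
```

Under any distribution where low offers get refused, the near-even split earns more than the minimal one, which is the sense in which reciprocity is a good rule of thumb.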
02:48:37.420 the rules that I outlined,
02:48:40.740 you know in 12 Rules for Life,
02:48:43.840 that's what they're based on,
02:48:45.680 they're based on this observation,
02:48:46.940 so this is how it works,
02:48:48.260 is that we have these motivational systems,
02:48:50.860 and we get them together,
02:48:52.720 maybe around the age of two,
02:48:54.340 and then we integrate them,
02:48:55.520 with the motivational systems of others,
02:48:57.600 we do that by producing games,
02:48:59.920 and the games get more and more sophisticated,
02:49:01.960 more multiplicitous,
02:49:03.020 but there's a game ethic,
02:49:05.100 that emerges out of that,
02:49:06.680 and it governs reciprocity,
02:49:08.320 and the game ethic is something like,
02:49:10.220 well we're all equally valuable players,
02:49:12.940 and everybody deserves a fair shot,
02:49:14.600 and you got to bring your best skills to the table,
02:49:16.700 but you have to play fair,
02:49:18.080 and you have to play reciprocally,
02:49:19.380 and that works across time,
02:49:20.620 and then you get an archetype,
02:49:24.040 that emerges out of that,
02:49:25.080 which is something like,
02:49:26.560 the fair player,
02:49:27.880 it's a variant of the archetype of the hero,
02:49:30.800 and that's the thing that people are driven to imitate,
02:49:34.720 and admire,
02:49:36.140 and this isn't,
02:49:36.940 this is hardly a mystery,
02:49:38.220 I mean,
02:49:38.720 look at,
02:49:39.780 you think,
02:49:40.280 why do we pay professional athletes,
02:49:42.420 the inordinate sums that we pay them,
02:49:45.340 well it's possible,
02:49:46.880 it's because they're modeling something of crucial importance,
02:49:50.200 in a dramatic manner,
02:49:52.520 you know,
02:49:52.740 like a basketball game,
02:49:54.500 a professional basketball game,
02:49:55.880 is a very complex drama,
02:49:58.280 and the drama is skill,
02:50:02.420 but also the ethic of fair play,
02:50:05.300 and we're all observing that,
02:50:06.560 because we bloody well need to understand it,
02:50:09.220 and then you see that echoed,
02:50:10.780 so you get the emergent ethic,
02:50:13.080 and that's the pattern of behavior,
02:50:14.680 and then you get representations of that,
02:50:16.540 you get abstract representations of it,
02:50:18.340 that's what we're doing when we play these abstract games,
02:50:20.760 it's also what we do when we tell stories,
02:50:22.740 and we make movies,
02:50:24.820 and we present plays,
02:50:25.940 and we do everything that's dramatic,
02:50:27.640 is that we take a look at that behavioral pattern,
02:50:31.200 that's emerged,
02:50:32.000 that works,
02:50:32.620 and then we try to represent it,
02:50:33.740 because we need to understand it,
02:50:35.240 it's like,
02:50:35.600 well,
02:50:35.880 what are you like if you're the hero of a story,
02:50:38.200 and what are you like if you're the anti-hero,
02:50:40.340 you're the cheat,
02:50:41.100 you're the crook,
02:50:41.740 you're the deceiver,
02:50:42.540 you're the liar,
02:50:43.680 you're the poor sport,
02:50:45.580 you know,
02:50:45.780 you're the person who fixes the game,
02:50:47.540 and we abstract out those patterns,
02:50:50.200 and then we try to imitate them,
02:50:52.660 and all of that drives our knowledge of the ethic of behavior,
02:50:59.160 we elaborate that up,
02:51:00.360 we elaborate that up into drama,
02:51:02.560 and we elaborate it into ritual,
02:51:04.160 and we elaborate it into religious representations,
02:51:08.020 you have stories that emerge of people who play properly,
02:51:11.600 and people who play improperly,
02:51:13.520 and then you abstract out the essence of that,
02:51:16.740 you say,
02:51:17.060 well,
02:51:17.220 what's it like to play properly,
02:51:19.040 if you were doing it perfectly,
02:51:20.280 what would you be like,
02:51:21.600 well,
02:51:21.720 you'd be a target for emulation,
02:51:23.580 and imitation,
02:51:24.740 rather than a target for rejection,
02:51:26.480 you get the archetype of the hero,
02:51:27.840 and the adversary,
02:51:28.700 come out of that,
02:51:29.720 it's a completely different way,
02:51:30.980 of constructing a knowledge system,
02:51:33.240 and I think that,
02:51:34.400 what I've been trying to do,
02:51:35.560 in Maps of Meaning,
02:51:36.440 and trying to do in 12 Rules for Life,
02:51:38.040 is to lay that out,
02:51:39.000 say,
02:51:39.200 look,
02:51:40.700 there is a system of knowledge,
02:51:44.000 that underlies the ethic,
02:51:45.680 that we need to adopt,
02:51:46.840 to conduct ourselves properly in life,
02:51:48.720 and I'll close with this,
02:51:49.840 here's one way of conceptualizing it,
02:51:51.980 I think,
02:51:52.340 and this has to do with,
02:51:56.900 rule seven,
02:52:00.200 do what is meaningful,
02:52:01.180 not what is expedient,
02:52:02.540 it also has to do with rule eight,
02:52:04.080 which is,
02:52:04.760 tell the truth,
02:52:05.380 or at least don't lie,
02:52:08.320 well,
02:52:08.960 it's not so easy to tell the truth,
02:52:10.540 but you can tell when you're lying,
02:52:11.760 and you can stop doing that,
02:52:12.980 and then maybe you approximate the truth,
02:52:14.640 across time,
02:52:15.260 by doing that,
02:52:20.200 you need to take care of yourself,
02:52:23.760 but then you think,
02:52:24.460 well,
02:52:24.600 what does that mean exactly,
02:52:25.940 it's not an Ayn Rand sort of individualism,
02:52:28.720 and the reason for that is,
02:52:30.260 what self do you mean,
02:52:32.160 there isn't just you,
02:52:33.980 there's you now,
02:52:34.800 and you tomorrow,
02:52:35.540 and you next week,
02:52:36.720 you next month,
02:52:37.640 and you next year,
02:52:38.620 and you in a decade,
02:52:39.880 and you when you're old,
02:52:41.120 you're a community,
02:52:42.360 and you're a creature,
02:52:43.720 unlike other creatures,
02:52:44.860 in that you're aware of your own duration,
02:52:47.280 and so if you're going to treat yourself properly,
02:52:49.060 you're already playing a game,
02:52:50.640 it's a game you play with yourself,
02:52:52.020 across time,
02:52:52.980 it's an iterated game,
02:52:54.320 and you better play fair,
02:52:55.780 because otherwise,
02:52:56.680 the you that is to come,
02:52:58.480 is going to suffer for it,
02:53:00.780 and so there's no individual you,
02:53:04.000 outside of the community,
02:53:05.160 because just because of the way you are,
02:53:07.340 you're already a community,
02:53:08.820 and so you have to take that into account,
02:53:10.940 when you tell your child to play fair,
02:53:13.240 and be a good sport,
02:53:15.260 you know,
02:53:15.540 you're doing that partly,
02:53:16.480 because then they're better for their teammates,
02:53:18.220 but you're doing it mostly,
02:53:19.420 because then they're better for themselves,
02:53:21.460 across time,
02:53:22.460 right,
02:53:22.720 and you have an intimation of that,
02:53:25.120 so you've got to treat yourself fairly,
02:53:27.220 so the game's on with you,
02:53:29.220 but then you know,
02:53:29.900 you,
02:53:30.400 the self,
02:53:30.980 well what is that,
02:53:31.880 you think,
02:53:32.320 what, are you more important than your family,
02:53:34.220 that isn't how people act,
02:53:35.840 you know,
02:53:36.100 if you take the typical parent,
02:53:37.440 you say,
02:53:37.780 well look,
02:53:38.500 I'm going to shoot you,
02:53:39.380 or I'm going to shoot your son,
02:53:41.780 well the typical parent,
02:53:42.980 is going to say,
02:53:43.560 well take me,
02:53:45.100 well you think,
02:53:45.520 well which is you then,
02:53:46.980 exactly,
02:53:47.660 is it,
02:53:48.420 you know,
02:53:48.760 if what's you,
02:53:50.320 is what's most dear to you,
02:53:51.840 what you identify with more,
02:53:53.800 well then you identify,
02:53:55.700 more with your kids,
02:53:57.200 and a tremendous amount,
02:53:58.560 with your parents,
02:53:59.220 and your siblings,
02:54:00.420 it's like you're extended,
02:54:01.620 you've got the community of yourself,
02:54:03.040 across time,
02:54:03.620 but then you've got your family,
02:54:05.080 and then you've got your family,
02:54:06.380 across time,
02:54:07.820 right,
02:54:08.080 and then well,
02:54:08.680 there's that,
02:54:09.300 but then your family's nested,
02:54:10.660 inside a community,
02:54:11.680 and you've got the same issue there,
02:54:13.040 you've got the community now,
02:54:14.240 all those other people,
02:54:15.280 and the community,
02:54:16.160 across time,
02:54:17.220 so what's the ethic,
02:54:18.340 of fair play,
02:54:19.160 it's like do what's good for you,
02:54:20.960 in a way that's good,
02:54:22.000 for the future you,
02:54:23.080 in a way that's good,
02:54:23.960 for your family,
02:54:24.840 and your future family,
02:54:26.340 in a way that's good,
02:54:27.220 for the community,
02:54:28.000 and the future community,
02:54:29.360 all of that,
02:54:30.560 that unbelievably complex,
02:54:32.460 sequence of nested games,
02:54:34.440 that's what you have,
02:54:35.180 to play properly,
02:54:36.580 and you admire people,
02:54:37.740 who do play it properly,
02:54:38.820 and you can see it,
02:54:39.940 even though you might not,
02:54:40.780 be able to articulate it,
02:54:43.220 right,
02:54:43.460 it's too complicated,
02:54:45.780 and I would say,
02:54:46.480 it's a matter of,
02:54:47.200 balancing out things properly,
02:54:49.080 and then to close,
02:54:50.420 I would say,
02:54:50.920 this is something,
02:54:51.540 to think about too,
02:54:52.900 you know,
02:54:53.100 we have an instinct,
02:54:54.020 for meaning,
02:54:54.940 you know,
02:54:55.220 you can get,
02:54:55.860 meaningfully engaged,
02:54:57.320 in things,
02:54:57.640 you might get meaningfully,
02:54:58.600 engaged in a basketball game,
02:55:00.100 for example,
02:55:00.960 you might ask yourself,
02:55:01.940 why,
02:55:02.480 and I shed some light,
02:55:03.800 on that,
02:55:04.280 you're watching play,
02:55:05.900 progress properly,
02:55:07.060 right,
02:55:07.960 maybe you follow the team,
02:55:09.140 across an entire championship,
02:55:10.580 because you don't give a damn,
02:55:11.540 about one basket,
02:55:13.300 one basket,
02:55:15.220 successfully managed,
02:55:16.760 and you don't give a damn,
02:55:17.720 about one game,
02:55:18.600 successfully won,
02:55:19.480 you want,
02:55:20.940 the sequence of games,
02:55:22.460 to be won successfully,
02:55:24.000 so the championship,
02:55:25.060 manifests itself,
02:55:26.040 and you're deeply,
02:55:26.620 embedded in that,
02:55:27.520 you know,
02:55:27.880 you might track,
02:55:28.480 all the statistics,
02:55:29.440 because you care,
02:55:30.660 who wins the game,
02:55:31.680 that's iterated,
02:55:32.440 across time,
02:55:33.280 and you're dramatizing that,
02:55:34.700 you're playing that out,
02:55:36.140 even though you can't,
02:55:36.920 articulate it,
02:55:37.700 you don't really understand it,
02:55:38.900 it still grips you,
02:55:40.300 like it should,
02:55:41.440 because that is,
02:55:42.280 what should grip you,
02:55:44.660 say,
02:55:45.640 it grips you,
02:55:47.280 and it's engaging,
02:55:48.120 and it's meaningful,
02:55:48.960 and the reason for that,
02:55:50.480 okay,
02:55:50.780 imagine this,
02:55:51.520 this is the final part,
02:55:52.480 of this,
02:55:54.660 well,
02:55:55.140 let's say,
02:55:55.500 you manage to take care,
02:55:56.560 of yourself,
02:55:57.020 and your family,
02:55:57.660 and your community,
02:55:58.360 all at the same time,
02:55:59.500 let's say,
02:56:00.000 that you imagine,
02:56:00.660 you're successful,
02:56:01.520 as a consequence of that,
02:56:02.660 like in the broad sense,
02:56:03.860 right,
02:56:04.320 you've got what it takes,
02:56:05.560 to live a rich life,
02:56:07.040 what's that going to do,
02:56:07.940 to you biologically,
02:56:09.980 it's like,
02:56:10.240 well,
02:56:10.360 isn't that going to make you,
02:56:11.180 an attractive partner,
02:56:13.180 isn't that going to make you,
02:56:14.200 an attractive mate,
02:56:15.900 and isn't it the case,
02:56:17.300 that if you're an attractive mate,
02:56:18.680 because you act out,
02:56:19.580 that pattern,
02:56:20.180 that you're more likely,
02:56:21.220 to succeed,
02:56:23.140 from an evolutionary perspective,
02:56:24.900 you're more likely,
02:56:25.580 to reproduce,
02:56:27.260 your children,
02:56:27.720 are more likely to live,
02:56:28.800 and so what that implies,
02:56:29.760 across time,
02:56:30.520 is that not only,
02:56:31.140 does that ethic exist,
02:56:34.200 not only do we recognize it,
02:56:35.860 but that the degree,
02:56:38.000 to which you're able,
02:56:38.860 to manifest that self,
02:56:40.520 in your life,
02:56:41.580 is associated directly,
02:56:43.520 with your long term success,
02:56:46.680 speaking on the biological,
02:56:52.340 time frame,
02:56:55.820 okay,
02:56:56.220 so imagine that,
02:56:57.400 then imagine this,
02:56:58.340 imagine that you have instincts,
02:56:59.620 that guide you,
02:57:00.520 towards that pattern,
02:57:01.480 I could tell you two of those,
02:57:03.080 one is your conscience,
02:57:05.600 and it says,
02:57:06.240 you've fallen off the path,
02:57:08.720 and what's the path,
02:57:09.720 well the path,
02:57:10.500 is that balanced harmony,
02:57:12.040 and you've deviated from it,
02:57:13.180 you've betrayed yourself,
02:57:14.260 you've cheated yourself,
02:57:15.820 you're not playing the game,
02:57:16.860 properly with yourself,
02:57:18.000 and something responds,
02:57:19.540 and so that's the punishing end of it,
02:57:21.180 and the positive end is,
02:57:23.480 well if you've got the pattern right,
02:57:25.180 then it's deeply,
02:57:26.280 and meaningfully engaging,
02:57:27.920 and that's not an illusion,
02:57:29.980 you know one of the problems,
02:57:31.040 that modern people have,
02:57:32.060 is that we think that,
02:57:32.920 the sense of meaning,
02:57:33.900 is an illusion,
02:57:35.340 because we think it's arbitrary,
02:57:36.580 or constructed,
02:57:37.600 but there's no evidence for that,
02:57:39.480 the evidence is quite the contrary,
02:57:41.060 that that sense of deep,
02:57:42.380 meaningful engagement,
02:57:43.660 isn't rational at all,
02:57:45.000 it's way deeper than that,
02:57:46.780 and it signifies,
02:57:48.640 that you're where you should be,
02:57:51.400 doing what you should be doing,
02:57:53.020 you know with a bit of,
02:57:54.420 a whack from time to time,
02:57:56.000 from your conscience,
02:57:57.220 it says well,
02:57:58.720 walk the straight and narrow path,
02:58:00.580 right,
02:58:00.780 the line between chaos and order,
02:58:03.600 and get everything balanced,
02:58:04.920 harmoniously around you,
02:58:06.340 because that's where you should be,
02:58:07.720 and that's what you should be doing,
02:58:09.100 I think that's portrayed in music,
02:58:12.040 you know,
02:58:12.640 music lays out patterns,
02:58:14.620 layers of patterns,
02:58:16.000 and they're all interacting harmoniously,
02:58:17.860 and you can put yourself,
02:58:19.080 in sync with that,
02:58:20.300 you know physically even,
02:58:21.880 because you tend to dance to it,
02:58:23.600 and you dance to that,
02:58:24.600 and that's meaningful,
02:58:25.560 and it's because,
02:58:26.100 you've aligned yourself properly,
02:58:28.320 with all that multitudinous pattern,
02:58:31.000 and it's a symbolic manner of,
02:58:34.140 what would you say,
02:58:35.380 acting out,
02:58:36.440 being positioned properly,
02:58:38.260 in the midst of all that complexity,
02:58:40.200 and that's signified,
02:58:41.500 by that sense of meaning,
02:58:44.500 well so that's a much better story,
02:58:46.120 than a moral relativist story,
02:58:48.840 or a nihilist story,
02:58:50.380 a hopeless story,
02:58:51.180 it's like you have a sense of,
02:58:52.960 you have an instinct for meaning,
02:58:54.740 guided by conscience,
02:58:56.460 that puts you,
02:58:58.080 in the proper orientation,
02:59:00.400 to yourself,
02:59:01.780 extended across time,
02:59:03.460 to your family,
02:59:04.440 extended across time,
02:59:05.760 and to your community,
02:59:07.140 extended across time,
02:59:08.680 and if you attend to that,
02:59:11.020 then you act out things properly,
02:59:12.680 and then,
02:59:13.220 not only do you feel better,
02:59:14.560 psychologically,
02:59:15.320 you're facing the obstacles,
02:59:16.660 you need to face,
02:59:17.380 you're playing the game properly,
02:59:18.820 that's great psychologically,
02:59:19.980 gives your life some purpose,
02:59:21.760 and some higher meaning,
02:59:22.660 and protects you from anxiety,
02:59:23.980 and pain,
02:59:24.640 wonderful,
02:59:25.720 but it's not just psychological,
02:59:27.540 because if you do that,
02:59:29.060 you're also actually useful,
02:59:31.580 right,
02:59:31.960 you're taking care of yourself,
02:59:33.680 so you won't die,
02:59:34.800 you're taking care of your family,
02:59:36.160 so they don't suffer unduly,
02:59:38.240 and you're doing that,
02:59:39.100 in a manner that actually,
02:59:40.480 strengthens the community,
02:59:42.640 right,
02:59:42.860 you're taking on your role,
02:59:44.120 as a sovereign,
02:59:44.940 and responsible citizen,
02:59:46.600 and that helps actually,
02:59:48.080 it sets the world right,
02:59:49.260 psychologically,
02:59:49.980 but it also sets the world right,
02:59:53.300 so what's so arbitrary about that,
02:59:56.240 I don't see anything arbitrary about that,
02:59:59.180 I think it's long past time,
03:00:01.480 that we stopped regarding any of that,
03:00:03.700 as arbitrary,
03:00:05.100 it's self-evident,
03:00:06.920 play the game properly,
03:00:08.960 right,
03:00:09.580 it's important,
03:00:10.700 everyone knows it,
03:00:12.000 you do it,
03:00:12.540 your life is meaningful,
03:00:13.540 and worthwhile,
03:00:14.000 and you've got something to wake up for,
03:00:16.080 and you've got,
03:00:17.060 you've got some anxiety held at bay,
03:00:19.180 at least you've got something worthwhile to do,
03:00:21.040 in the face of what you're terrified of,
03:00:23.400 that's something,
03:00:24.800 and then there's important problems to solve,
03:00:26.500 and you're solving them,
03:00:28.120 it's like,
03:00:28.660 well,
03:00:28.820 that's separating the wheat from the chaff,
03:00:32.880 right,
03:00:34.220 I say,
03:00:34.720 well,
03:00:34.800 there's something to our civilized society,
03:00:37.280 that's integral,
03:00:38.720 and valuable,
03:00:39.400 and that we need to protect,
03:00:40.520 and identify,
03:00:41.260 and then act out,
03:00:42.040 and support,
03:00:42.600 and manifest more,
03:00:44.480 and not criticize to death,
03:00:46.220 and leave nothing at all,
03:00:48.140 in the ashes,
03:00:50.580 and that's why,
03:00:51.340 I was working on,
03:00:52.180 maps of meaning,
03:00:52.880 and 12 rules for life,
03:00:54.560 thank you very much,
03:00:56.480 these,
03:01:06.020 these are the most comfortable chairs we've had,
03:01:08.200 yeah,
03:01:08.680 geez,
03:01:09.180 wow,
03:01:09.580 maybe we'll just stay here,
03:01:11.360 yeah,
03:01:11.700 this is nice,
03:01:12.600 it's good to be back,
03:01:14.940 brother,
03:01:15.300 right,
03:01:16.220 yeah,
03:01:16.620 it's good to be back,
03:01:17.820 you feeling good,
03:01:19.380 yeah,
03:01:20.100 it's good to be here,
03:01:21.140 thank you all for coming,
03:01:22.160 it's much appreciated,
03:01:23.580 all right,
03:01:29.800 we got a ton of questions,
03:01:30.820 because this is a tech town,
03:01:32.100 and they actually know how to use,
03:01:33.380 they know how to use Slido,
03:01:34.540 that's good,
03:01:35.060 yeah,
03:01:35.220 it's impressive,
03:01:36.060 I like this first one,
03:01:37.480 now that you've destroyed Patreon,
03:01:40.780 will the two of you,
03:01:41.820 save the internet,
03:01:43.560 no,
03:01:45.160 no,
03:01:45.600 definitely not,
03:01:47.880 there's no saving the internet,
03:01:49.600 I don't know,
03:01:51.780 even if we'll save Patreon,
03:01:53.640 you know,
03:01:54.000 because,
03:01:55.160 this is part of that,
03:01:57.300 not taking the moral high ground,
03:01:58.880 too easily,
03:01:59.920 you know,
03:02:01.100 Dave and I,
03:02:01.940 and Harris as well,
03:02:03.480 Dave talked to Sam,
03:02:04.740 in particular,
03:02:05.200 but Dave and I talked about,
03:02:07.180 what had happened after,
03:02:08.440 Patreon banned,
03:02:10.480 Carl Benjamin,
03:02:12.500 Sargon of Akkad,
03:02:13.420 and we thought,
03:02:14.020 well,
03:02:14.140 that's just not acceptable,
03:02:15.280 and so we decided to,
03:02:18.620 stop using the platform,
03:02:20.340 and,
03:02:20.640 I'd been working on a program,
03:02:23.860 that was,
03:02:26.040 had some parallel functions,
03:02:27.860 to Patreon,
03:02:28.740 that we were producing,
03:02:29.740 for a slightly different reason,
03:02:31.060 and,
03:02:31.340 and we started talking about,
03:02:32.720 the possibility,
03:02:33.480 that it could be repurposed,
03:02:34.500 as a Patreon,
03:02:35.940 alternative,
03:02:37.680 which is something,
03:02:38.940 that we're working diligently on,
03:02:40.440 and which will happen,
03:02:41.800 but,
03:02:42.180 I would never claim that,
03:02:45.040 the solution to the problem,
03:02:47.240 is going to be straightforward,
03:02:49.580 so,
03:02:50.760 we'll see,
03:02:52.280 what we can manage,
03:02:53.800 I think Dave and I,
03:02:54.740 will start using the platform,
03:02:56.040 in about,
03:02:57.460 a month and a half,
03:02:58.500 something like that,
03:02:59.640 but we don't know,
03:03:01.940 maybe Patreon's time,
03:03:03.340 has come and gone already,
03:03:04.360 because there's already ways,
03:03:05.500 of supporting individual contributors,
03:03:07.460 maybe it's not possible,
03:03:08.840 to aggregate,
03:03:09.800 a tremendous number,
03:03:11.300 of creative people together,
03:03:12.460 in a single platform,
03:03:13.620 without,
03:03:14.280 attracting,
03:03:15.880 undue negative attention,
03:03:17.420 and the immediate,
03:03:18.840 probability of this kind,
03:03:20.140 of censorial action,
03:03:21.840 we don't know,
03:03:22.860 right,
03:03:23.040 because these technologies,
03:03:24.040 are all so new,
03:03:25.700 you know,
03:03:26.320 it looks like,
03:03:27.520 YouTube and,
03:03:28.840 Facebook are wrestling,
03:03:30.420 now,
03:03:31.240 in,
03:03:31.560 and Twitter as well,
03:03:33.020 wrestling now,
03:03:34.100 with the conundrum,
03:03:35.320 that they've been presented with,
03:03:36.700 which is,
03:03:37.540 well,
03:03:38.840 there's a billion,
03:03:39.540 opinions,
03:03:40.900 and some of them,
03:03:41.820 are rough,
03:03:42.760 what do we do,
03:03:44.300 about the rough opinions,
03:03:46.120 and the answer might be,
03:03:47.360 like in the,
03:03:48.620 frontier,
03:03:49.880 heyday of YouTube,
03:03:51.340 when it wasn't run,
03:03:52.460 by a giant,
03:03:53.480 and increasingly,
03:03:54.480 rigid corporation,
03:03:55.720 then anything,
03:03:56.960 went,
03:03:57.760 but as soon as it's,
03:03:59.800 corporatized,
03:04:00.760 and systematized,
03:04:02.000 then that can,
03:04:02.920 no longer work,
03:04:04.220 I mean,
03:04:04.480 you think,
03:04:05.180 you really think,
03:04:05.760 the internet,
03:04:06.480 could have ever started,
03:04:07.540 if it would have been,
03:04:08.200 regulated to begin with,
03:04:09.580 I mean,
03:04:09.780 just think about it,
03:04:10.480 what drove the internet,
03:04:12.580 porn,
03:04:14.660 I mean,
03:04:14.980 no,
03:04:15.180 but I'm dead serious,
03:04:16.520 right,
03:04:16.820 is that,
03:04:17.180 that's what happened,
03:04:18.220 is that the internet,
03:04:19.040 exploded,
03:04:20.040 over its first,
03:04:21.720 say,
03:04:22.100 10 years of development,
03:04:23.220 of public development,
03:04:24.040 because it was just,
03:04:24.840 it was like,
03:04:26.020 what was the percentage,
03:04:27.080 of porn on the internet,
03:04:28.300 for the first 10 years,
03:04:29.360 it was,
03:04:29.820 some ridiculous amount,
03:04:31.260 I didn't know,
03:04:32.380 that there was porn,
03:04:32.960 on the internet,
03:04:33.100 oh,
03:04:35.300 yeah,
03:04:40.000 see,
03:04:40.260 everyone learns something,
03:04:41.340 at a Jordan Peterson event,
03:04:42.700 you see what I'm saying,
03:04:44.040 yeah,
03:04:45.360 so,
03:04:46.900 I don't know,
03:04:47.460 if it's a solvable problem,
03:04:49.040 but we're,
03:04:49.700 we're going to try,
03:04:50.920 I think at least,
03:04:51.980 we're going to try,
03:04:52.480 to build a platform,
03:04:53.560 where,
03:04:54.120 if you're on it,
03:04:55.180 we're not going to,
03:04:55.980 kick you off,
03:04:56.980 arbitrarily,
03:04:58.120 I think we can promise that,
03:04:59.700 see,
03:05:06.160 let me just say,
03:05:07.140 one more thing,
03:05:07.640 about that,
03:05:08.520 so I went and talked,
03:05:09.920 to,
03:05:10.800 one of the guys,
03:05:11.700 who runs one of these,
03:05:12.560 big,
03:05:13.880 social,
03:05:15.340 communication,
03:05:16.860 systems,
03:05:17.480 you know,
03:05:17.960 social networks,
03:05:19.100 and they'd faced,
03:05:21.340 a lot of pressure,
03:05:22.760 because ISIS,
03:05:23.520 was using the platform,
03:05:25.260 to recruit,
03:05:27.000 okay,
03:05:27.360 so you got to ask yourself,
03:05:28.420 if you're a free speech,
03:05:29.320 absolutist,
03:05:31.800 you're going to let,
03:05:33.120 if you're in a war,
03:05:33.920 you're going to let,
03:05:34.460 the enemies of your state,
03:05:35.920 recruit with your platform,
03:05:37.660 and if the answer is no,
03:05:41.700 which seems like,
03:05:42.600 a reasonable answer,
03:05:44.480 well then you've already,
03:05:45.380 opened the door,
03:05:46.180 right,
03:05:47.020 say there are things,
03:05:48.480 that should,
03:05:50.440 could,
03:05:51.340 be censored,
03:05:52.420 then the question is,
03:05:53.860 as soon as you open that door,
03:05:55.100 like you,
03:05:55.880 it's Pandora's box,
03:05:57.160 it's like,
03:05:57.580 okay,
03:05:57.780 well what about,
03:05:58.200 what's right next to that,
03:06:00.920 maybe that would be,
03:06:01.940 like,
03:06:03.800 certain forms of communication,
03:06:05.560 about radical,
03:06:06.400 religious fundamentalism,
03:06:07.740 because that's right next door,
03:06:09.380 and then there's something,
03:06:10.320 right next door to that,
03:06:11.560 it's like,
03:06:12.640 those lines are really hard to draw,
03:06:14.520 so it isn't obvious to me,
03:06:15.940 how large scale social networks,
03:06:18.040 are going to solve this problem,
03:06:19.700 and we're wrestling with that,
03:06:21.540 trying to come up with,
03:06:22.480 a solution that's reasonable,
03:06:23.720 so far,
03:06:25.620 it's something like,
03:06:27.260 you'll be able to stay on the platform,
03:06:31.080 unless you break an American law,
03:06:33.440 an actual law,
03:06:35.940 right,
03:06:36.600 and I don't even know,
03:06:37.660 if that's a good enough guideline,
03:06:39.060 but,
03:06:39.580 well,
03:06:40.200 it might have to be,
03:06:41.600 but,
03:06:41.960 we'll see,
03:06:44.000 and,
03:06:44.460 as for fixing the net,
03:06:45.920 well,
03:06:46.140 that's a,
03:06:49.520 that's a no-go,
03:06:50.640 that is,
03:06:51.300 so,
03:06:52.940 we offer whatever content,
03:06:54.360 we can manage,
03:06:55.120 and people seem to enjoy it,
03:06:56.640 and,
03:06:57.320 that's working pretty well,
03:06:58.440 but that's about all we can manage,
03:07:00.320 and,
03:07:00.940 maybe that's all that,
03:07:03.180 maybe that's all that's manageable,
03:07:06.000 and there's porn out there,
03:07:09.140 so who knew,
03:07:09.800 the APA recently defined,
03:07:14.520 traditional masculinity as toxic,
03:07:17.440 conflating virtuous,
03:07:18.820 and harmful aspects,
03:07:20.300 how can we reverse,
03:07:21.520 this dangerous,
03:07:22.680 ideological progression,
03:07:23.480 oh,
03:07:23.640 the purpose was,
03:07:24.280 the purpose was to conflate,
03:07:26.560 the virtuous,
03:07:27.240 and the harmful,
03:07:28.960 that's the purpose of the document,
03:07:30.440 is to blur the distinction,
03:07:31.460 between the two,
03:07:32.400 and I think the real reason,
03:07:33.880 I'm writing an article,
03:07:34.680 about this right now,
03:07:36.620 I think the actual reason,
03:07:38.080 was to damage the virtuous,
03:07:40.420 because that's the best way,
03:07:41.680 of doing it,
03:07:42.200 right,
03:07:42.360 if you want to damage the virtuous,
03:07:43.820 what you do is,
03:07:44.340 you conflate it with the harmful,
03:07:46.280 it doesn't hurt the harmful,
03:07:47.720 any,
03:07:48.000 to have it conflated,
03:07:48.880 with the virtuous,
03:07:50.660 so let's say,
03:07:51.600 let's say your real motivation,
03:07:53.220 is like,
03:07:53.840 a seriously deep,
03:07:55.520 resentment,
03:07:56.380 and spite,
03:07:57.700 and that the best way,
03:07:58.980 to manifest that,
03:07:59.860 is to take the virtuous,
03:08:01.200 and conflate it,
03:08:02.360 with the pathological,
03:08:03.740 and to take down the virtuous,
03:08:05.420 and maybe that's because,
03:08:06.340 you can't bloody manage it,
03:08:07.720 on your own,
03:08:09.100 that's what it looks like,
03:08:10.120 to me,
03:08:10.400 so,
03:08:12.180 I think it's an,
03:08:13.860 I mean,
03:08:14.220 I was,
03:08:15.020 you know,
03:08:16.400 the American Psychological Association,
03:08:22.120 put out a variety,
03:08:23.120 of very solid,
03:08:23.940 scientific journals,
03:08:25.120 for a very long period of time,
03:08:26.460 and,
03:08:27.440 a couple of my friends,
03:08:28.340 sent me,
03:08:30.040 some of the articles,
03:08:30.900 that are going to be published,
03:08:31.840 in the next couple of months,
03:08:32.880 in some of the flagship journals,
03:08:34.260 like American Psychologist,
03:08:35.440 and they're,
03:08:36.880 they're social justice oriented,
03:08:39.220 right to the damn core,
03:08:41.100 and so that organization,
03:08:42.100 has become thoroughly corrupt,
03:08:44.920 and it's really,
03:08:45.560 an appalling thing to see,
03:08:46.740 because I like to be,
03:08:47.640 in a clinical psychologist,
03:08:48.800 and like the,
03:08:49.640 the training I had at McGill,
03:08:51.440 was top rate man,
03:08:52.840 it was,
03:08:53.680 it was very,
03:08:54.640 stringently,
03:08:55.760 stringently scientific,
03:08:57.060 I mean,
03:08:57.300 clinical psychology,
03:08:58.100 isn't exactly a science,
03:09:00.340 any more than medicine is,
03:09:01.980 or any more than engineering is,
03:09:03.260 because,
03:09:03.840 there's practical engineering,
03:09:05.440 in it,
03:09:05.780 right,
03:09:06.340 I mean,
03:09:06.540 if you're a clinical psychologist,
03:09:07.920 you're helping,
03:09:08.720 you're trying to help people,
03:09:09.780 have better lives,
03:09:11.260 you're not exactly,
03:09:12.220 trying to make them,
03:09:13.220 more mentally healthy,
03:09:14.460 it's,
03:09:14.860 and even mental health,
03:09:15.860 is a very tricky thing,
03:09:17.040 to define,
03:09:17.720 because you can't tell,
03:09:18.740 if it's normative,
03:09:19.480 or ideal,
03:09:20.520 like it's a weird mixture,
03:09:21.620 of normative,
03:09:22.240 and ideal,
03:09:22.900 but despite all that,
03:09:24.800 you know,
03:09:25.460 the clinical psychology programs,
03:09:28.500 for a long time,
03:09:29.480 they were very,
03:09:30.260 rigorous,
03:09:33.060 they're hard to get into,
03:09:34.200 and they were rigorous,
03:09:34.880 and if you came out of them,
03:09:36.380 you came out of them,
03:09:38.140 better than you,
03:09:38.940 were when you went in,
03:09:40.520 you came out of them,
03:09:41.300 more knowledgeable,
03:09:42.100 and you came out of them,
03:09:43.660 tougher,
03:09:44.180 and more resilient,
03:09:44.940 because you had to learn,
03:09:45.780 how to cope with people's problems,
03:09:47.180 and not take them home,
03:09:48.220 and to be mature enough,
03:09:49.340 to handle it,
03:09:49.980 and it was really good,
03:09:51.280 and,
03:09:52.480 for a long time,
03:09:53.420 you know,
03:09:53.620 when anybody ever asked me,
03:09:55.280 how to find a psychologist,
03:09:56.540 I would say,
03:09:57.060 well,
03:09:57.340 find an American Psychological Association,
03:09:59.880 accredited,
03:10:02.540 clinical research program graduate,
03:10:04.820 because you can be,
03:10:05.760 virtually certain,
03:10:06.760 that they will be,
03:10:07.720 trained enough,
03:10:09.360 and smart enough,
03:10:10.100 to at least be competent,
03:10:11.980 you know,
03:10:12.680 so you've got a good crack at it,
03:10:14.140 and they're just throwing that,
03:10:15.480 all away,
03:10:16.080 it's really,
03:10:17.420 despicable,
03:10:18.400 in my estimation,
03:10:19.520 and it makes me ashamed,
03:10:21.060 of my profession,
03:10:22.060 and,
03:10:22.320 it's,
03:10:23.400 it's,
03:10:24.080 it's really too bad.
03:10:29.240 This got the third highest upvotes.
03:10:32.160 Do you season your beef,
03:10:33.800 and if so,
03:10:34.560 what do you use?
03:10:40.040 I have such a stupid life.
03:10:48.080 Salt.
03:10:52.320 And,
03:10:55.400 and for those of you that care,
03:10:56.980 my daughter,
03:10:57.760 who,
03:10:59.260 invented this appalling diet,
03:11:01.900 doesn't even use salt.
03:11:06.620 So,
03:11:07.820 it's pretty bare bones,
03:11:09.300 man.
03:11:10.840 So,
03:11:11.760 that's that.
03:11:12.820 But,
03:11:13.180 you know,
03:11:13.600 we've discovered that there are,
03:11:16.660 there's,
03:11:17.320 what would you call it?
03:11:20.140 There's variety in water,
03:11:22.320 you can have,
03:11:23.380 here,
03:11:23.700 here's the variety,
03:11:24.680 you can have hot water,
03:11:26.180 that's sort of like tea,
03:11:27.520 without the tea,
03:11:28.740 you can have cold water,
03:11:31.020 that's a different kind,
03:11:32.040 and you can have sparkling water,
03:11:33.940 that's three kinds of water,
03:11:36.560 and beef,
03:11:37.780 and salt,
03:11:38.740 in my case,
03:11:39.520 so.
03:11:42.280 Living the dream,
03:11:43.180 man.
03:11:43.560 Yeah.
03:11:46.660 Actually,
03:11:47.100 the best thing that I saw all year,
03:11:48.560 was the night that we had,
03:11:49.880 that dinner that I'm sure many of you saw the picture of,
03:11:51.660 where it was Sam and Joe and Shapiro and all of us,
03:11:54.660 and watching you and Rogan sit across from each other,
03:11:57.600 each one of you eating like a 50 ounce tomahawk,
03:12:01.760 like Fred Flintstone style,
03:12:03.700 I was like,
03:12:04.160 I am going to remember this.
03:12:07.720 All right,
03:12:08.220 we got a couple along this line,
03:12:10.060 it's sort of a theme here.
03:12:11.020 As a leader of a tech company,
03:12:13.360 how can you break out of the SJW mold of expected values,
03:12:17.500 without risking your whole company?
03:12:19.160 Fire all your HR people.
03:12:31.380 Well,
03:12:31.860 the first,
03:12:32.560 that's a really,
03:12:33.320 that was a really nasty thing to say,
03:12:35.360 which,
03:12:35.600 that I just said,
03:12:37.120 but true.
03:12:39.080 Well,
03:12:43.180 the first thing is,
03:12:43.980 you have to decide if that's what you,
03:12:45.580 if that's what you want to do.
03:12:47.360 Like,
03:12:47.740 you have to actually make that decision,
03:12:49.200 and then,
03:12:49.540 and then you have to act it out at all the small,
03:12:51.960 in all the small ways.
03:12:53.900 You know,
03:12:54.300 you just,
03:12:55.320 you're just not going to go there.
03:12:57.240 You're going to decide not to go there,
03:12:58.720 and take,
03:12:59.340 take your,
03:13:00.980 fight your battles along the way.
03:13:02.900 You know,
03:13:03.220 and so there'll be no talk about diversity,
03:13:05.060 and there'll be no talk about inclusivity,
03:13:06.080 and there'll be no talk about equity,
03:13:08.820 and there'll be no talk about white privilege,
03:13:11.460 and what else?
03:13:12.040 There'll be no talk about microaggressions,
03:13:14.420 all of that stuff.
03:13:15.380 And I can tell you,
03:13:16.380 the tech people in here,
03:13:18.220 if you think you can let any of that dogma into your company,
03:13:22.720 without letting all of it in,
03:13:24.220 you are naive beyond belief.
03:13:26.520 And if you think that what happened to the universities,
03:13:28.420 won't happen to your company,
03:13:29.640 then you have the same problem,
03:13:32.560 because it will.
03:13:33.840 Because this,
03:13:34.600 like,
03:13:35.140 this set of ideas is not trivial.
03:13:39.380 And you let any of it in.
03:13:41.320 There isn't letting some of it in.
03:13:43.560 You know?
03:13:44.520 That isn't how it works.
03:13:46.260 And if you think you'll curry favor with the public,
03:13:48.660 by playing that game,
03:13:50.940 you will,
03:13:51.540 in the short run,
03:13:52.320 with some people,
03:13:53.260 but you'll risk your company.
03:13:55.340 And,
03:13:55.540 and,
03:13:56.440 and if it's not something that you agree with,
03:13:58.260 then you'll risk yourself, too.
03:13:59.640 Because eventually,
03:14:00.660 you'll either come up against it,
03:14:03.220 and kowtow,
03:14:05.280 or you'll come up against it,
03:14:07.240 and lose.
03:14:09.460 So,
03:14:10.580 so,
03:14:13.540 the first thing you have to do,
03:14:14.680 is decide that you're not going to do that.
03:14:16.280 And then,
03:14:16.840 you have to make sure that you don't do it at all.
03:14:20.000 Because there's no doing it a little bit.
03:14:22.840 So,
03:14:24.000 you know,
03:14:25.500 I'm worried, too,
03:14:26.380 that in the U.S.,
03:14:27.280 that this has gone far enough,
03:14:30.360 so that the STEM fields themselves,
03:14:31.980 are going to be at risk,
03:14:32.840 over the next 10 years.
03:14:34.360 I mean,
03:14:34.720 you know already,
03:14:35.380 in California,
03:14:36.980 if,
03:14:37.760 if you are a faculty member,
03:14:39.480 in one of the STEM fields,
03:14:40.820 before you get hired,
03:14:42.000 at a California university,
03:14:43.460 you have to write,
03:14:44.860 and sign,
03:14:45.700 a diversity document.
03:14:49.360 So,
03:14:50.340 and that's the same for promotion.
03:14:52.160 And what the hell,
03:14:52.900 that has to do with the STEM fields,
03:14:55.060 is beyond me,
03:14:55.920 except,
03:14:56.820 you know,
03:14:57.320 I think that part of,
03:14:58.900 what's driving all of this,
03:15:00.800 like,
03:15:01.300 what drove the APA statement,
03:15:03.160 is an absolute hatred,
03:15:04.800 for competence itself.
03:15:07.140 And,
03:15:07.740 the STEM types,
03:15:08.980 you can,
03:15:09.260 that's why it's,
03:15:14.420 that's why it's all power.
03:15:16.640 You know,
03:15:16.960 that's why the,
03:15:17.620 the discussion is all about power.
03:15:19.620 Well,
03:15:19.900 you,
03:15:20.080 you,
03:15:20.380 you occupy a,
03:15:21.780 high level position,
03:15:22.880 in your hierarchy.
03:15:24.120 Why?
03:15:24.820 Power.
03:15:25.780 Well,
03:15:26.100 what else could it be?
03:15:27.280 Could be competence.
03:15:28.740 Well,
03:15:28.920 God,
03:15:29.300 no,
03:15:29.600 not that,
03:15:30.380 not competence.
03:15:31.360 We can't admit,
03:15:32.240 that there's such a thing,
03:15:32.980 as competence,
03:15:33.700 because it also means,
03:15:34.560 we'd have to admit,
03:15:35.520 that there's such a thing,
03:15:36.360 as incompetence,
03:15:37.220 and maybe that characterizes,
03:15:38.980 like me,
03:15:39.980 for example.
03:15:41.460 So,
03:15:42.180 we're gonna,
03:15:42.780 instead of me dealing,
03:15:44.620 instead of me dealing,
03:15:47.360 with my own,
03:15:48.780 pathetic incompetence,
03:15:50.240 I'm just going to,
03:15:51.040 attack the notion,
03:15:52.060 of competence itself,
03:15:53.080 and I'm gonna do that,
03:15:54.000 by conflating it,
03:15:54.880 with power.
03:15:56.600 And so,
03:15:58.680 you know,
03:15:59.040 that,
03:15:59.300 that's great,
03:15:59.920 except when you need,
03:16:00.680 to build a bridge,
03:16:01.920 that stands up,
03:16:03.280 because bridges,
03:16:04.360 don't stand up,
03:16:05.440 because power hungry,
03:16:06.680 madmen built them.
03:16:08.140 They stand up,
03:16:09.040 because competent people,
03:16:10.240 built them.
03:16:10.940 You know,
03:16:11.280 if you build bridges,
03:16:12.360 if you're,
03:16:12.920 if you're,
03:16:13.720 if you're,
03:16:13.800 if you're a power hungry,
03:16:18.060 madman,
03:16:19.620 then when you build,
03:16:21.400 when you make concrete,
03:16:22.500 you put too much sand in it,
03:16:24.000 because sand is cheap.
03:16:25.580 That's what happened,
03:16:26.540 that's what,
03:16:26.940 that's what happened,
03:16:27.980 in the Soviet Union,
03:16:29.360 in many cases,
03:16:30.120 what happens in all sorts of,
03:16:31.700 countries,
03:16:32.220 where corruption is the norm.
03:16:34.280 It's like,
03:16:34.580 that's fine,
03:16:35.060 your building will stand up,
03:16:36.080 till there's an earthquake,
03:16:37.460 and then it'll fall down.
03:16:39.920 It's like,
03:16:40.300 well that's what happens,
03:16:41.020 when you have engineers,
03:16:41.700 that run off power.
03:16:43.840 If you have engineers,
03:16:44.540 that run off competence,
03:16:45.700 then when,
03:16:46.540 then things stand up,
03:16:47.740 and that's how you can tell,
03:16:48.580 they're competent,
03:16:49.140 and that's why the,
03:16:50.180 politically correct people,
03:16:51.260 hate engineers.
03:16:52.500 Is because you can tell,
03:16:53.880 when an engineer is competent,
03:16:54.960 because they build something,
03:16:55.640 that works.
03:16:56.340 And that's really annoying,
03:16:57.480 because it's evidence,
03:16:58.200 that there's competence.
03:16:59.660 And that's annoying,
03:17:00.440 because that means,
03:17:01.140 there's evidence,
03:17:01.760 that there's incompetence,
03:17:03.100 and your existence,
03:17:05.080 is the evidence.
03:17:05.840 And so instead of rectifying,
03:17:15.200 you'll just get rid of,
03:17:16.900 the idea of competence,
03:17:18.040 and then everything,
03:17:18.820 will fall down around you,
03:17:19.940 but,
03:17:20.580 well then you'll all be equal,
03:17:21.680 in the rubbish,
03:17:22.440 so that'll be fine.
03:17:23.540 You can accomplish,
03:17:24.480 your equity goal,
03:17:25.360 that way too.
03:17:26.020 What kind of music,
03:17:33.140 do you listen to?
03:17:36.240 I was really trying,
03:17:37.380 to work a segue,
03:17:38.060 on that one.
03:17:39.360 There was nothing there.
03:17:44.360 I listen to all sorts of music,
03:17:46.300 I mean,
03:17:47.100 what,
03:17:47.600 I don't remember,
03:17:48.400 there was a famous jazz musician,
03:17:49.860 who said there was only,
03:17:50.980 I don't think,
03:17:51.700 was it Miles Davis?
03:17:52.600 I don't remember.
03:17:54.900 There's only two kinds of music,
03:17:56.120 bad music,
03:17:56.720 and good music.
03:17:58.180 And so that,
03:17:59.600 sort of runs across the genres.
03:18:01.720 You know,
03:18:01.960 and I listen to all,
03:18:03.220 sorts of music.
03:18:04.100 I listen to swing,
03:18:05.040 and jazz,
03:18:05.540 and old country,
03:18:07.000 from the 1950s,
03:18:08.620 and country swing,
03:18:09.960 from the 1930s,
03:18:11.080 and modern rock and roll,
03:18:13.160 and like,
03:18:14.760 I even listen to a little bit,
03:18:15.720 of hip hop.
03:18:16.260 It's not really my genre,
03:18:17.440 but not much,
03:18:18.800 you know.
03:18:19.660 It's,
03:18:20.060 I don't understand,
03:18:21.720 the genre very well,
03:18:22.600 but,
03:18:23.960 it wasn't,
03:18:26.420 it wasn't a Northern Albertan thing,
03:18:28.140 man.
03:18:31.060 And classical music,
03:18:32.660 and like,
03:18:33.200 I like a lot,
03:18:34.040 I like a very diverse range of music,
03:18:36.080 and,
03:18:36.180 and,
03:18:37.000 and I listen to it,
03:18:38.980 very loud,
03:18:40.120 especially when I'm driving around,
03:18:41.560 which I really need to do much more of.
03:18:44.580 So,
03:18:45.600 yeah.
03:18:48.500 Was mainstream media always this bad?
03:18:51.320 No.
03:18:51.500 Or are we just seeing it more now?
03:18:53.360 It's way worse.
03:18:54.820 Yeah,
03:18:55.140 it's way worse.
03:18:56.320 If you,
03:18:56.840 all you have to do,
03:18:57.500 is go to the library,
03:18:58.320 and pick up a Time magazine,
03:18:59.340 from like,
03:18:59.900 1973.
03:19:02.060 Like,
03:19:02.480 there,
03:19:02.660 it's a miracle.
03:19:03.400 You can't even believe,
03:19:04.080 the bloody thing existed.
03:19:05.220 It was like,
03:19:05.660 a hundred pages long,
03:19:06.680 and it was thick.
03:19:07.620 You know,
03:19:07.860 it's like,
03:19:08.400 it's,
03:19:08.880 it's three-eighths of an inch thick,
03:19:10.380 and when you open it,
03:19:11.720 it doesn't look at all like People magazine.
03:19:14.500 There's no photographs,
03:19:16.160 it's all text,
03:19:17.220 and the text is small,
03:19:19.080 and the words mean things,
03:19:21.660 and they were,
03:19:22.180 they were written by journalists who were paid,
03:19:24.580 and they were fact-checked,
03:19:25.800 and like,
03:19:26.260 yeah,
03:19:26.520 no,
03:19:26.700 it's way worse than it was,
03:19:27.700 but I,
03:19:28.200 I think a lot of that is,
03:19:29.580 a technological,
03:19:30.700 it's a,
03:19:31.040 it's a side effect of technological revolution.
03:19:33.980 You know,
03:19:34.620 rather than pointing fingers at journalists,
03:19:37.020 or even at,
03:19:37.820 at networks,
03:19:38.680 let's say,
03:19:39.660 you know,
03:19:40.060 that all of the mainstream media,
03:19:42.140 is completely done,
03:19:45.140 in by,
03:19:46.820 the new media,
03:19:48.140 right?
03:19:48.520 So the print journalists,
03:19:50.460 what are you going to put your print online,
03:19:53.700 with,
03:19:54.620 and compete head-to-head,
03:19:56.000 against 150,000 blogs,
03:19:58.340 that's not going to work,
03:19:59.840 and your television network,
03:20:02.560 you're going to compete against,
03:20:04.000 online,
03:20:05.560 on-demand video,
03:20:07.200 no,
03:20:07.920 and your radio,
03:20:08.940 is going to compete against podcasts,
03:20:10.880 no,
03:20:12.360 so,
03:20:13.340 what I think,
03:20:14.260 what we're seeing,
03:20:15.040 is that,
03:20:15.500 that old media,
03:20:16.680 is being done in,
03:20:17.420 by the new media forms,
03:20:18.780 and as it's done in,
03:20:20.100 it,
03:20:20.380 its quality,
03:20:21.180 is deteriorating,
03:20:22.260 and its proclivity,
03:20:23.460 to exaggerate,
03:20:25.600 the polarization,
03:20:27.700 increases,
03:20:28.600 because it's the only way,
03:20:29.700 of gaining attention,
03:20:30.840 in a,
03:20:31.360 declining market,
03:20:32.540 so,
03:20:34.120 rather than,
03:20:35.860 you know,
03:20:36.200 a set of moral failings,
03:20:38.260 which it also is,
03:20:39.420 to some degree,
03:20:40.120 I think it's just,
03:20:41.340 the inevitable consequence,
03:20:42.740 of,
03:20:43.180 a tech,
03:20:44.880 an unbelievable,
03:20:45.820 technological revolution,
03:20:46.960 and YouTube's a great example,
03:20:48.280 right,
03:20:48.500 because,
03:20:49.280 think,
03:20:50.700 what is it,
03:20:51.640 YouTube has everything,
03:20:54.060 a network has,
03:20:55.120 but a network doesn't have,
03:20:56.540 many of the things,
03:20:57.460 that YouTube has,
03:20:58.660 so,
03:20:59.340 bandwidth is expensive,
03:21:00.400 for a network,
03:21:01.180 and it's free,
03:21:01.780 for YouTube,
03:21:02.920 and you need,
03:21:03.600 a huge organization,
03:21:04.420 to manage a network show,
03:21:06.100 and you need,
03:21:06.540 like,
03:21:06.660 three people,
03:21:07.240 to manage a YouTube show,
03:21:08.940 and YouTube has,
03:21:10.280 two billion potential viewers,
03:21:12.700 and so,
03:21:14.040 how in the world,
03:21:15.080 can,
03:21:15.680 can a television network,
03:21:17.240 compete with that,
03:21:18.120 it can't,
03:21:19.240 and podcasts,
03:21:19.880 are the same thing,
03:21:21.240 so,
03:21:22.400 so yes,
03:21:23.020 the media is way worse,
03:21:24.000 than it used to be,
03:21:24.700 but it's not surprising,
03:21:26.340 I mean,
03:21:26.600 Time Magazine,
03:21:27.080 doesn't even pay,
03:21:27.780 its journalists anymore,
03:21:29.480 so in all,
03:21:30.580 even the big newspapers,
03:21:31.780 have lost,
03:21:32.420 most of their,
03:21:33.300 good journalists,
03:21:34.020 they've lost,
03:21:34.460 all their fact checkers,
03:21:35.600 they're,
03:21:35.760 they're,
03:21:36.060 they're dying,
03:21:37.420 for lack of money,
03:21:38.260 they can't monetize anything,
03:21:39.580 because Google has eaten up,
03:21:40.940 all the advertising revenue,
03:21:42.760 So I think that quality degeneration is a consequence of just this insane technological revolution that's progressing, and, you know, who knows what's going to happen as a consequence of that. One of the things that I've seen that's been really strange is that it isn't so much that everybody is in their own bubble. It's not that we've grouped together in bubbles; it's that each of us has a bubble now. I watch people use the million channels of YouTube, and it's like everybody has their own news reality, right? And that's a strange thing too.
03:22:26.200 One of the things we didn't understand, I think, about the classical media, say the TV stations of the 1970s and 1980s, the 1970s in particular, when there were really only three TV stations, is that we had no idea how uniting that was. Because we were all doing the same thing, at least from time to time. We all watched the same news; we all viewed the culture, at least more or less, through the same lens, and it was mediated by people who were reasonably respectable and competent. And that's just blown apart, and so there seems to be a real loss in that.
03:23:02.960 And it's so strange that as the number of ways of communicating easily have multiplied, the possibility of communicating with everyone at once has disappeared. It's so strange to see that the blowing apart of a limitation actually eradicated something of tremendous utility. Because there were times, and this was true continually in the 60s and the 70s, when one event could grip the whole nation. And that was actually one of the ways of defining a nation, and that's disappearing incredibly quickly, and it's not clear that we can handle the fragmenting consequences of that.
03:23:48.020 So, we've got work to do with this platform, huh? What inspires you?

03:24:00.920 This... this is good. Yeah.
03:24:08.680 So, Dave mentioned that I think this is the 116th city that I've been in since my book came out a year ago, so it's basically a year anniversary, I think to the day, actually.
03:24:28.160 So, what's inspiring? Well, first of all, you know, I discovered this year that there's a huge market for discourse at the highest level that I can manage. That doesn't necessarily mean it's high-level discourse, but it's the best that I can manage, and there's a market for that. That's really interesting.
03:24:50.500 One of the things, and Dave and I have talked about this, is that this group of us has been given this nomenclature, the intellectual dark web, and that happened accidentally, and we've been trying to puzzle through what caused that grouping. One of the things that characterizes everybody who's in that category is that each of us has an independent media platform and isn't beholden to anyone for it, so there's an independence there, financial and practical. But then there's another issue too, which is that there isn't a single person in that group who thinks their audience is stupid. And that's also, I think, at least in part a consequence of the new technology.
03:25:38.140 Because if you run a television station, the technology makes your audience seem stupid to you, and there are a bunch of reasons for that that are worth thinking through. The first reason is that you can't produce a show and assume that your audience has seen any other show that you've ever produced, so you have to assume that you're talking to someone who has no memory, right? And then you have to appeal to the lowest common denominator. And then the bandwidth is extraordinarily expensive, so you can't explain anything in detail. So basically, if you run a television station, you're talking to someone who has attention deficit disorder and Alzheimer's, right? And so you can't help but think of your audience as not very bright, when you're looking at them through an aperture that's only this wide. And then with the new technology, with YouTube and with podcasts, it's like, well, the doors are open, man.
03:26:32.940 And so Rogan starts to experiment with three-hour podcasts. Who the hell would have ever thought that was a good idea? You know? Right? So, you all think it's a good idea. And the same thing happened with entertainment. The rule on network TV for a long time was that you can't expect an audience to have more than a 90-minute attention span; a movie is as long as they can ever manage. And then, you know, Game of Thrones comes out. It's like 200 hours long, there are 50 or 60 plot lines, and people have absolutely no problem whatsoever following it. And so it turns out that we're way smarter than we thought we were.
03:27:13.740 One of the things that's inspired me over the last year and a half or so is the realization of that. And then, you know, I've talked to about 250,000 people now in this way, and it's amazing that people will come out, like... is it Saturday? What, what day is it?

03:27:36.000 I think today is Wednesday. I've been busy too.

03:27:40.600 Oh, I have no idea what day it was. I mean, I'm not even sure what time it is.

03:27:45.340 Do not put that on Twitter, please.
03:27:49.780 Anyways, look, it's remarkable that 3,000 people will come out to have a discussion like this. And I was rereading 12 Rules for Life today, because I was trying to update my memory again, and, you know, it's a way easier book than Maps of Meaning was, but it's still a hard book. I didn't think that when I was writing it, but when I reread it tonight, I thought, oh, this book is a lot harder than I thought it was. And yet it's sold three million copies now. So that's inspiring.
03:28:28.240 It's inspiring to see how much of a market there is for a serious discussion about how we could get our individual lives together and make things, like, solidly better. That's really inspiring, and it's really real, as far as I can tell. I can't imagine anything better than that, which is why I keep doing this. I mean, it's ridiculously exciting to do it, but it's also remarkable. I can't believe that it keeps rolling along, and that I can have these sorts of discussions with a very large number of people, and millions of people online. You know, I've always talked about the most serious things that I could formulate in my classes, and then that worked on YouTube, which was quite the damn shock, and it's worked in podcasts, and then it works in this format. That's really something. So, yeah.
03:29:31.440 Well, it's inspiring for me to be back at it with you. So on that note, I'm going to get out of the way and make some noise for Jordan Peterson, everybody.

03:29:42.200 Thank you. Thank you, everyone. It was very nice talking with you, and thanks for coming. Good night.
03:29:51.620 If you found this conversation meaningful, you might think about picking up dad's books, Maps of Meaning: The Architecture of Belief, or his newer bestseller, 12 Rules for Life: An Antidote to Chaos. Both of these works delve much deeper into the topics covered in the Jordan B. Peterson podcast. See JordanBPeterson.com for audio, e-book, and text links, or pick up the books at your favorite bookseller. And check out JordanBPeterson.com slash personality for information on his new course. I really hope you enjoyed this podcast. If you did, please leave a rating at Apple Podcasts, a comment, a review, or share this episode with a friend. Thanks for tuning in, and talk to you next week.

03:30:28.340 Follow me on my YouTube channel, Jordan B. Peterson; on Twitter, at Jordan B. Peterson; on Facebook, at Dr. Jordan B. Peterson; and on Instagram, at Jordan.B. Peterson. Details on this show, access to my blog, information about my tour dates and other events, and my list of recommended books can be found on my website, JordanBPeterson.com. My online writing programs, designed to help people straighten out their pasts, understand themselves in the present, and develop a sophisticated vision and strategy for the future, can be found at selfauthoring.com. That's selfauthoring.com.
03:31:08.500 From the Westwood One Podcast Network.
03:31:19.280 In today's chaotic world, many of us are searching for a way to aim higher and find spiritual peace. But here's the thing: prayer, the most common tool we have, isn't just about saying whatever comes to mind. It's a skill that needs to be developed. That's where Hallow comes in. As the number one prayer and meditation app, Hallow is launching an exceptional new series called How to Pray. Imagine learning how to use scripture as a launch pad for profound conversations with God, how to properly enter into imaginative prayer, and how to incorporate prayers reaching far back in church history. This isn't your average guided meditation. It's a comprehensive two-week journey into the heart of prayer, led by some of the most respected spiritual leaders of our time. From guests including Bishop Robert Barron, Father Mike Schmitz, and Jonathan Roumie, known for his role as Jesus in the hit series The Chosen, you'll discover prayer techniques that have stood the test of time, while equipping yourself with the tools needed to face life's challenges with renewed strength. Ready to revolutionize your prayer life? You can check out the new series, as well as an extensive catalog of guided prayers, when you download the Hallow app. Just go to Hallow.com slash Jordan and download the Hallow app today for an exclusive three-month trial. That's Hallow.com slash Jordan. Elevate your prayer life today.