The Jordan B. Peterson Podcast - July 14, 2022


270. Deception and Psychopathy | Robert Trivers


Episode Stats

Length

1 hour and 20 minutes

Words per Minute

113.3

Word Count

9,133

Sentence Count

577

Misogynist Sentences

14

Hate Speech Sentences

19


Summary

In this episode, Dad speaks with the American evolutionary biologist Robert Trivers. In 1971, Trivers proposed the theory known as "reciprocal altruism," which is perhaps his most well-known theory. Since then, he has published numerous articles, essays, and books. He was awarded the 2007 Crafoord Prize in Biosciences for his analysis of, and contributions to, the theory of social evolution, conflict, and cooperation. Among his contributions is his explanation of self-deception as an adaptive evolutionary strategy, which is the focus of some of this episode. Dr. Trivers also served as the undergraduate advisor to Dr. Heather Heying and Dr. Bret Weinstein, who are both well known to the audience that frequents these dialogues.


Transcript

00:00:00.960 Hey everyone, real quick before you skip, I want to talk to you about something serious and important.
00:00:06.480 Dr. Jordan Peterson has created a new series that could be a lifeline for those battling depression and anxiety.
00:00:12.740 We know how isolating and overwhelming these conditions can be, and we wanted to take a moment to reach out to those listening who may be struggling.
00:00:20.100 With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way in his new series.
00:00:27.420 He provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward.
00:00:35.360 If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better.
00:00:41.800 Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety.
00:00:47.460 Let this be the first step towards the brighter future you deserve.
00:00:50.980 Welcome to episode 270 of the Jordan B. Peterson podcast. I'm Michaela Peterson.
00:00:59.980 In this episode, Dad spoke with the American evolutionary biologist, Robert Trivers.
00:01:05.600 In 1971, in a published paper, Trivers proposed the theory known as reciprocal altruism, which is perhaps his most well-known contribution.
00:01:14.280 Since that time, he's published numerous articles, essays, and books.
00:01:17.360 He was awarded the 2007 Crafoord Prize in Biosciences for his analysis of, and contributions to, the theory of social evolution, conflict, and cooperation.
00:01:27.800 Among his contributions is his explanation of self-deception as an adaptive evolutionary strategy, which is the focus of some of this episode, which is fun.
00:01:37.700 Robert and Dad also talked about genuine victims and reducing vulnerability, reciprocal altruism, the danger of undermining a core belief, and more.
00:01:45.360 Thank you for listening, and enjoy the episode.
00:02:01.380 Hello, everyone. I'm pleased today to have with me Dr. Robert Trivers.
00:02:05.500 Dr. Trivers is an evolutionary biologist who concentrates on social theory based on natural selection and on evolutionary genetics.
00:02:13.520 These happen to be the backbones of all biology.
00:02:17.080 His early work focused on reciprocal altruism, which we will talk about in some detail today,
00:02:22.860 the evolution of sex differences in all species, the sex ratio at birth, parent-offspring conflict, kinship and sex ratio in social insects,
00:02:32.520 and a theory outlining the nature of self-deception and its operation in the service of deceit,
00:02:38.820 which in itself can confer, however temporarily, certain advantages.
00:02:44.020 He then devoted 15 years of his life with Austin Burt to reviewing the vast topic of selfish genetic elements in all species,
00:02:52.160 except bacteria and viruses.
00:02:54.020 These are genes that do not benefit the individual with the genes,
00:02:57.300 but spread because they have a within-individual selective advantage.
00:03:00.940 In 2011, he published a popular book, Deceit and Self-Deception:
00:03:06.840 Fooling Yourself the Better to Fool Others, published in the U.S. as The Folly of Fools.
00:03:12.220 It's been translated into 11 languages, including Korean, Chinese, and Taiwanese,
00:03:16.680 and is widely regarded as a definitive treatment of the subject.
00:03:20.140 In 2015, he published a personal memoir, Wild Life: Adventures of an Evolutionary Biologist,
00:03:27.300 translated into Spanish and Polish.
00:03:28.960 A side note, Dr. Trivers also served as the undergraduate advisor to Dr. Heather Heying
00:03:34.280 and Dr. Bret Weinstein, who are both well-known to the audience that frequents these dialogues.
00:03:39.840 Welcome, Dr. Trivers.
00:03:41.320 It's very good of you to agree to talk with me today.
00:03:47.340 Thank you, sir.
00:03:49.180 So I thought, my pleasure, I thought we'd start with reciprocal altruism.
00:03:54.820 There'll be lots of people who are listening and watching that don't know what that means,
00:03:58.940 and it's a crucial idea.
00:04:01.140 And so I'd like you to outline, well, first to define it, and then to outline your thoughts about it, if you would.
00:04:06.420 Well, W.D. Hamilton, who I always regarded as the only greater social theorist, evolutionary social theorist than myself,
00:04:21.700 had already laid out in detail, in 1964, the argument for altruistic behavior. So-called altruistic behavior is something that lowers your own reproductive success, which is called fitness.
00:04:41.580 But I never liked the term fitness, because it had connotations that could get in the way of your understanding,
00:04:51.340 whereas reproductive success directly described what we're talking about, the number of surviving offspring you left.
00:04:59.140 So an altruistic act is one which lowers your production of surviving offspring and raises the production of surviving offspring of the recipient of your altruism.
00:05:15.720 Now, if you're related, then the gene or genes involved may enjoy a net benefit.
00:05:24.600 So, indeed, you're related to your children typically by a half, and yet you invest in them as a key vehicle to your reproductive success.
00:05:40.600 But Hamilton extended the system laterally, so nephews and nieces might not be direct descendants of yours,
00:05:50.920 but still, you could be related to them by a quarter, let's say, in which case the benefit would have to be greater than four times the cost for the behavior to be selected.
00:06:06.200 So that was the first step.
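[Editor's aside: the kin-selection arithmetic Trivers summarizes here is usually written as Hamilton's rule: altruism is favored when r × B > C, with r the relatedness, B the benefit to the recipient, and C the cost to the actor. A minimal sketch, with an illustrative function name of my own:]

```python
def hamilton_favored(r: float, benefit: float, cost: float) -> bool:
    """Hamilton's rule: an altruistic act is selected when r * B > C."""
    return r * benefit > cost

# Nieces and nephews, related by a quarter: the benefit to the
# recipient must exceed four times the cost to the actor.
assert hamilton_favored(0.25, benefit=5.0, cost=1.0)
assert not hamilton_favored(0.25, benefit=3.0, cost=1.0)
```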
00:06:11.080 Now, when I started becoming a biologist in 1967, I took a year as a special student at Harvard to make up for the complete lack of an undergraduate education in biology and was then accepted into graduate school.
00:06:36.680 In any case, I thought it was obvious that there was a second kind of altruism in which I did something nice for you, and at a certain point in the future, you did something nice back to me.
00:06:54.980 So it was reciprocal.
00:06:56.280 And as long as benefits were greater than cost, which one assumes they would be, otherwise you wouldn't be selected under any regime for them, then there could be a net benefit of this transfer.
00:07:15.340 The problem with reciprocal altruism was what happens with the so-called cheater.
00:07:23.720 That is, the system automatically selects for someone that receives the benefit but doesn't bother to reciprocate.
00:07:33.600 Well, if they don't reciprocate at all, you cut off any future altruism toward them.
00:07:40.480 And so each act of failure on their part results in a source of altruism being cut off.
00:07:50.100 So the more interesting phenomenon is where you cheat.
00:07:55.060 That is, you give back less than you got, but you're still giving back a benefit.
00:08:02.720 So they still receive a benefit.
00:08:07.760 They just don't receive the benefit that they, quote, ought to or would if the system was egalitarian and fair.
00:08:20.900 And those words, fair and just and so on, I felt, actually emerged from reciprocal altruism precisely because they evaluated the costs inflicted versus the benefits received.
00:08:44.340 And so you had subtle cheaters, which reciprocated to a degree and enough so that you enjoyed a net benefit, but not as much as you, quote, deserved.
00:08:59.640 And so that was a dynamic of reciprocal altruism: cheater detection. The detection itself wasn't so difficult, but how to interact with the individual
00:09:21.320 So as to change his or her behavior, that was a more interesting problem.
00:09:29.920 So a colleague, I think a graduate student at Harvard, had happened to write a paper reviewing the emotions associated with altruistic behavior.
00:09:47.820 He didn't have any particular theoretical orientation or evolutionary, but he reviewed the subject.
00:10:00.300 And I remember going to him.
00:10:03.160 I took a class on morality, something like that, specifically in order to learn enough about human behavior related to reciprocity to flesh out my paper.
00:10:17.820 But at the first class, I saw that the graduate student who was a teaching assistant had already written a paper doing exactly what I hoped to learn by sitting in on the class.
00:10:36.300 So I went to him and asked if I could sit and read his paper, let's say, in his office and take notes.
00:10:48.020 And he said, hell, I'll give you a copy.
00:10:51.020 And I thought, almighty God, that's an act of altruism that I can surely benefit from.
00:11:00.620 And so he gave me a copy.
00:11:02.540 I promptly dropped out of the course and actually molded the second half, i.e., the content of my paper, on it; the different categories were just drawn straight out of his work and reorganized a bit so as to fit the logic I was pushing for.
00:11:32.540 I hope that's not too complicated or too much detail.
00:11:37.100 No, no, no, it's exactly right.
00:11:39.220 So why concentrate specifically on reciprocal altruism?
00:11:45.360 I mean, obviously tied up in that is cooperation, mutual benefit.
00:11:51.060 And then also you discuss the problem of cheating and cheating deception or detecting cheating.
00:11:58.080 And deception, of course, is a way of making cheating difficult to detect.
00:12:03.080 And so you're focusing your biological inquiry on what we would intuit as moral issues.
00:12:10.340 Moral issue associated with cooperation, the moral issue associated with deviation from that cooperation.
00:12:17.560 Is that a reasonable way of looking at what you've been doing?
00:12:21.300 Well, what I did when you say what I've been doing, I actually did write a paper called reciprocal altruism 30 years later and tried to bring the subject up to date.
00:12:42.160 But in general, I didn't do that on any of my papers.
00:12:46.640 That is, I wrote the paper and that was it.
00:12:50.720 I wrote a paper on parent-offspring conflict.
00:12:54.000 I don't think I've ever written a second one.
00:12:56.180 I wrote a paper on haplodiploidy and the evolution of social insects, where I took kinship theory, Hamilton's kinship theory, and applied it rigorously to the unusual situation of ants, bees, and wasps, where males only have one set of chromosomes, they're haploid, and females have two, they're diploid.
00:13:23.560 And that leads to unusual degrees of relatedness.
00:13:27.400 Indeed, it's the only case in nature, other than identical twinning, where you're more related to someone than to your own offspring, namely, full sisters.
00:13:48.960 In a haplodiploid system, you're related to them by three quarters, but you're only related to your brothers by one quarter.
00:13:58.800 So it cancels out to give you half.
00:14:02.560 And that's how people thought about it for a couple of years.
00:14:08.820 But it's obvious to me that you don't average them.
00:14:13.840 If one is three quarters and the other is a quarter, then you're selected to invest much more heavily in those that you're related by three quarters, and much less than those that you're related to by one quarter.
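[Editor's aside: the relatedness figures Trivers cites fall out of simple bookkeeping; a sketch of the arithmetic, not part of the conversation:]

```python
# Relatedness coefficients under haplodiploidy. A father is haploid,
# so all his daughters share his entire genome; a son develops from
# an unfertilized egg and has no father at all.
r_sister_to_sister = 0.5 * 1.0 + 0.5 * 0.5   # paternal half identical; maternal half shared with probability 1/2
r_sister_to_brother = 0.5 * 0.0 + 0.5 * 0.5  # no paternal overlap; maternal half shared with probability 1/2

assert r_sister_to_sister == 0.75
assert r_sister_to_brother == 0.25
# Averaging the two gives 0.5, the same as relatedness to one's own
# offspring. But as Trivers argues, selection does not average: it
# favors investing far more in the three-quarters relatives.
assert (r_sister_to_sister + r_sister_to_brother) / 2 == 0.5
```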
00:14:30.160 How do you envision the relationship between reciprocal altruism and the structures of society and the moral structures that guide society?
00:14:43.920 I mean, I would almost be thinking off the top of my head.
00:14:49.360 It's been quite some time since I've thought about it.
00:14:53.920 You mentioned in your introduction that I peeled off 15 years of my life or whatever it was to master selfish genetic elements.
00:15:06.140 That's that's selection below the level of the individual.
00:15:12.620 Hamilton was conscious of it, but it was Austin Burt, my co-author, who was the first to see really deeply into the literature and start to reorganize it.
00:15:28.520 So the entire subject of genetics, evolutionary genetics, was reorganized around the concept of selfish genes and the conflict they have with others, both between closely related individuals and within individuals.
00:15:51.020 So now you're asking me the relationship between reciprocal altruism and sort of society wide phenomenon and so on.
00:16:03.920 And I can't I can't boil it down to a simple argument.
00:16:10.340 Yep.
00:16:11.640 In 2002, there was the selected papers of Robert Trivers. Not the collected papers, because not everything was worth collecting.
00:16:21.020 Just the selected.
00:16:24.980 But I did something.
00:16:29.040 Unique.
00:16:30.380 Generally, for selected or collected papers, people would write a short introduction to how they happen to write the paper and then you would get the paper and then there would be a short introduction for your next paper and then the paper.
00:16:47.380 What I added was what you're asking for, which is I would write a short introduction, then the paper, then there would be a short section on progress since then, since the paper is published.
00:17:06.120 Uh, that was often fairly brief.
00:17:12.120 Um, and, uh, I can't remember what I said about, uh, reciprocal altruism.
00:17:21.260 Now I, I would have to go get the book to, uh, to check it out.
00:17:29.120 But I'm afraid I can't reason for you at the level you would like in terms of reciprocal altruism and societal organization and so on.
00:17:46.900 I think it's obvious that societies are not, I mean, they're, they may be partly based on kinship, but, uh, uh, only partly and, and often much more on patterns of cooperation that, uh, evolve or are generated and sustain themselves.
00:18:12.900 Why would you stress cooperation as a basis for social organization, say, rather than competition or kinship for that matter?
00:18:24.220 Well, I mean, I mean, it's, it's partly just, you know, what was available at the time when I wrote on reciprocal altruism, if you want to, if you want to bring it all the way back to the evolution of reciprocal altruism, which was my first paper published.
00:18:44.260 Uh, and indeed Harvard broke their usual rule and they allowed me, uh, I just had, uh, a thesis that consisted of three chapters.
00:18:58.320 One was reciprocal altruism.
00:19:00.940 The other was parental investment in sexual selection.
00:19:03.920 Then Harvard had a rule that you had to have at least one chapter that was empirical.
00:19:10.820 Uh, and that meant lab work, which I had no intention whatsoever of doing.
00:19:17.740 I knew nothing about labs and I had no ability there or field work.
00:19:22.380 Well, now field work is, uh, much more congenial, especially if you're interested in social behavior, social theory.
00:19:32.140 So watching baboons, which I did with Irv DeVore in, uh, Kenya and Tanzania back in 72,
00:19:41.380 or going to, uh, uh, Haiti and, and Jamaica with my advisor, Ernest Williams, who was the expert on tree climbing lizards.
00:19:53.700 So in fact, I did my third chapter on the green lizard, uh, the largest of the Anolis tree-climbing lizards.
00:20:03.920 There's seven species.
00:20:05.220 Well, there's really eight, uh, in Jamaica.
00:20:08.120 Uh, so I had to go back down to Jamaica at regular intervals.
00:20:14.720 I would come back to Harvard and, and, and work for three months during the semester of teaching and so on.
00:20:21.580 Then you'd get a month between semesters and I would fly down to Jamaica and that set up a lifelong bond between me and Jamaica,
00:20:31.440 where I've lived, uh, at least 20 years of my life, uh, married onto the island, or "stole a woman off the island," as another Jamaican expression goes, and, uh, have four children by her.
00:20:49.720 Dr. Trivers, how did you get interested in deception?
00:20:52.960 And how did you get interested in deception and then in self-deception and the relationship between the two?
00:21:00.280 Let me step back one second, sir.
00:21:03.280 You asked about, uh, competition versus cooperation.
00:21:09.560 Yes.
00:21:11.400 I mean, you competition even applies between different cooperative enterprises.
00:21:18.860 They cooperate within their entity, but they compete with other similarly structured entities, uh, so as to maximize their reproductive success.
00:21:33.120 Uh, now competition was well known.
00:21:37.640 I mean, it was the basis of life where we're out there competing all the time.
00:21:43.200 Uh, cooperation was, um, a subtler problem with which you had to figure out what the competitive natural selection advantage was.
00:21:57.600 So now you just asked about deceit and self-deception, but sir, I'm afraid, uh, blame it on, uh, 78 years of, uh, mentation, if you will.
00:22:10.060 So, uh, I, I do know that by 76, in other words, reciprocal altruism was 71, parental investment was 72, uh, Trivers and Willard, uh, which was an interesting theory about, uh, tending to produce sons under these conditions and then, uh, daughters under those conditions.
00:22:38.960 So, for example, if you look at a human hierarchy, uh, people at the top end tend to produce sons, uh, with higher frequency, but people at the bottom end tend to produce daughters at higher frequency.
00:22:56.180 Well, every child has a mother and a father.
00:23:00.980 So we know that the aggregate reproductive success of females must equal the aggregate reproductive success of males.
00:23:10.720 So if high class males are doing better, then it makes sense that lower class women are doing better.
00:23:19.240 So as to balance out the equation.
00:23:22.840 And so they get some advantage of, uh, these men, i.e. they mate with them.
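[Editor's aside: the bookkeeping behind "balancing out the equation" can be sketched with hypothetical numbers; all figures below are invented for illustration:]

```python
# Every child has exactly one mother and one father, so offspring
# summed over all males must equal offspring summed over all females.
offspring_of_high_status_males = 8   # high-status males out-reproduce
offspring_of_low_status_males = 2
offspring_of_high_status_females = 4
offspring_of_low_status_females = 6  # lower-status females pick up the balance

assert (offspring_of_high_status_males + offspring_of_low_status_males
        == offspring_of_high_status_females + offspring_of_low_status_females)
```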
00:23:30.460 Um, now, um, I wrote, I wrote the foreword or something like that for some book,
00:23:48.240 in 1975 or 76, and I'm afraid the title has slipped my brain, but I slipped in a theory of, uh, deceit and self-deception there.
00:24:04.260 Uh, just, there were just two sentences in there, something about, uh, you know, deceptions obviously favored, but, uh, um,
00:24:15.860 You know, the best way to hide it from others is, first of all, to hide it from yourself.
00:24:24.320 And then you don't give off any of the cues that are associated with deception.
00:24:32.740 Incidentally, I have paid attention to the literature on cues.
00:24:37.820 And it's very interesting to me that the most general cue for deception is a slight raise in your pitch of your voice.
00:24:51.820 That seems to be all but universal.
00:24:54.500 Uh, so if, if you can listen carefully enough to hear when someone's voice rises a bit, they're more likely to be practicing, uh, deception.
00:25:11.860 So, I practiced as a clinician for a long time, and as a research psychologist, and I was very interested in how people deceived themselves and the kinds of psychopathologies that emerged.
00:25:29.880 And, so here's a, here's a hypothesis.
00:25:32.660 Let me ask you what you think of it.
00:25:34.480 What I noticed often was that when people received information that contradicted one of their explicit beliefs, the information often manifested itself emotionally.
00:25:51.260 And so, imagine that a husband comes home with lipstick on his collar, and the wife sees that and becomes agitated as a consequence.
00:26:02.100 But then refuses to think through what the implications of that might be.
00:26:10.080 And so, the self-deception isn't one fully thought out proposition versus another.
00:26:17.320 It's a fully thought out set of propositions, say about marital stability, or at least partially thought out, versus an emotional cue of uncertain significance that has to be unpacked with difficulty.
00:26:32.100 And avoidance of that is tantamount, at least to some form of self-deception.
00:26:39.320 How does that strike you?
00:26:40.540 Well, I get lost in your argument there.
00:26:44.100 Go, go back to your example.
00:26:47.240 There's lipstick on his, uh, collar.
00:26:49.760 Right, and now she has a vision in her mind of, let's say, marital stability and harmony.
00:26:57.480 And it's pretty fleshed out.
00:27:00.260 But now all she's got that contradicts that is one piece of visual evidence and an emotion, which might be anger, anxiety, and so forth.
00:27:10.120 And she can hold on to her pre-existing belief with no work.
00:27:15.880 In order to transform that belief, she's going to have to do a tremendous amount of exploration and investigation.
00:27:23.160 And so she can just not do that.
00:27:24.900 What is her pre-existing, quote, belief here?
00:27:28.680 I'm getting lost already.
00:27:30.980 It might be a fantasy of marital harmony.
00:27:33.540 Okay, so she has a fantasy of marital harmony.
00:27:38.200 She has a piece of evidence that's inconsistent with marital harmony.
00:27:44.800 Now what?
00:27:46.660 Well, there's a very large potential range of meanings of that piece of evidence.
00:27:54.080 And so for her to transform that into something differentiated enough to alter her fantasy would take a tremendous amount of effort.
00:28:05.060 And so she can just not do that.
00:28:08.200 And that's passive self-deception, which I think is the most common type.
00:28:14.840 Passive self-deception?
00:28:16.520 Yes.
00:28:17.100 You know something's up.
00:28:18.760 You know there's an elephant under the carpet.
00:28:21.000 But you decide not to look.
00:28:22.780 So I'm getting confused now.
00:28:25.660 You've already seen it.
00:28:28.060 You've seen the lipstick there.
00:28:30.840 So you've seen evidence that suggests that he's in a semi-intimate relationship with another woman.
00:28:42.420 At least to the point of kissing her or being kissed by her.
00:28:48.680 Now I'm confused with what you're saying.
00:28:51.220 She's put a lot of time and effort into the theory of her marriage that she already holds.
00:28:59.920 So it's the idea that that marriage is stable and loving, let's say, is a predicate for many of her memories.
00:29:08.840 It's a presumption for many of her current activities.
00:29:12.980 And it's the basis for her future plan.
00:29:16.000 And so to investigate that means that she would have to go through all the work of modifying all those representations.
00:29:24.620 It's not like the evidence contains that modification.
00:29:28.980 It's just an error message.
00:29:31.940 It's hard to unpack an error message.
00:29:34.380 What's the error message?
00:29:36.920 Well, the error message would be the lipstick and then the negative emotion that goes with its visual apprehension.
00:29:45.300 What's the error?
00:29:47.160 The error is that the presumption of her fantasy that her marriage is stable and loving and, let's say, also monogamous.
00:29:55.840 So that's an error, you're saying.
00:29:58.940 And now she's got a message that she's in error.
00:30:03.160 So now what's the...
00:30:04.820 Right.
00:30:05.180 But that's all she has.
00:30:06.740 That's the problem, right?
00:30:08.140 And all she has is a message that she's in error.
00:30:11.920 It doesn't contain much other information.
00:30:14.200 And it's going to be extremely hard for her to reconstruct all that theorizing she's done based on the assumption, erroneous assumption of monogamy.
00:30:26.500 And so it's easiest just not to do it.
00:30:30.000 Yes.
00:30:30.380 And then what's the cost of her not doing it?
00:30:33.940 Well, I would say the immediate cost is virtually nothing.
00:30:37.800 And that's another advantage.
00:30:39.720 But the long-term cost is that she's building a future reality based on a mispresumption.
00:30:48.400 And so she can't...
00:30:51.340 She may make errors, for example, about her financial security going into the future, or even the presence of her partner.
00:30:58.780 And that's a huge potential cost.
00:31:03.460 She's going to underestimate, for example, the danger of divorce or of him leaving.
00:31:08.860 In order to avoid facing the pain and the...
00:31:15.940 Yes.
00:31:16.740 Well, I think this idea maps onto your hypothesis that at times the right hemisphere and the left hemisphere can be delivering contradictory messages.
00:31:29.860 So, if the left is linguistic and generates up detailed propositional arguments, let's say, that get held with some certainty, and the right emotionally signals error,
00:31:44.500 then there's a tremendous amount of work that has to be done in order to unpack that error and remake all those propositional presumptions.
00:31:56.540 It's really hard.
00:31:57.980 That's what I don't understand in your argument.
00:32:02.060 I don't understand why there's a huge amount of work where she's got to unpack a whole endless series of assumptions or arguments just because it is one thing.
00:32:17.940 All right.
00:32:18.680 Let's go back to the situation.
00:32:21.040 Let's go back to the situation.
00:32:51.040 It seems to me, I mean, she has a simple decision.
00:32:56.320 Does she confront him over it and say, Joe, what the hell is this?
00:33:04.880 Or confront him whatever way she wants to.
00:33:09.020 Okay, so imagine the complexity of that confrontation.
00:33:15.860 And this is the sort of thing that I saw a lot in my clinical practice.
00:33:19.660 So, because he lied about that, she no longer knows whether anything he's told her or anything he's done is true or real.
00:33:32.380 Well, it's because she's violated this basic presupposition of trust.
00:33:38.960 And so, part of the reason she's going to have a major emotional reaction to that is that she now doesn't know whether she can trust anything about him and may have to reevaluate all her perceptions of him, even those that are part of the past.
00:33:57.120 Yeah, go ahead.
00:34:00.180 So, what?
00:34:01.240 Well, so there's a tremendous amount of work associated with that.
00:34:06.360 You know, and part of what our certainties do, as far as I can tell, is inhibit anxiety and doubt, almost by definition.
00:34:14.480 And so, now if you've discovered that you can't trust someone because they violated a fundamental presumption, then every part of the way you look at the world that's predicated on that trust has now become unstable.
00:34:31.100 Well, now that is such a strong statement.
00:34:35.040 You say everything that is associated with that violation of trust is now subject to reevaluation and so forth and so on.
00:34:50.380 That's, I don't know, you know, what's the foundation of your argumentation towards me? And I respect it.
00:35:02.360 Is it your clinical experience, dealing with people who come to you and talk to you about these kinds of things?
00:35:14.520 So, you know, I think that's a good objection.
00:35:19.080 And so, let me, let me propose something in relationship to that because I think that's a crucial objection that you made.
00:35:27.160 So, one of the things that I wrestled with formulating when I was thinking about self-deception was the relationship of one belief to another.
00:35:42.820 So, imagine that, and this is something you could object to, imagine that some beliefs are more fundamental than others, and that fundamentalness is a reflection of how many other beliefs depend on that belief.
00:36:00.720 It's like a definition of fundamental, it's a hierarchy.
00:36:05.580 And so, some beliefs are trivial because almost nothing depends on them, but other beliefs are absolutely fundamental because everything that you're doing depends on their validity.
00:36:17.800 And so then, well, depending on how deep the belief is.
00:36:22.220 Eating dinner?
00:36:23.880 Come on, brother.
00:36:25.080 Well, okay, but fair enough, good objection.
00:36:29.360 But, you know, if you deal with someone who's profoundly depressed because something that was crucial to them was devastated, they will often have a tremendous amount of difficulty doing even those basic things.
00:36:43.620 That's true, brother.
00:36:44.660 I grant you that people can suffer to the point where they've got a problem eating, they've got a problem digesting.
00:36:54.300 I mean, the actual digestive system may be hindered or altered by the kind of mental stress or mentation that's going on.
00:37:06.700 Let me give you an example of that with depression.
00:37:11.140 Go ahead.
00:37:11.700 So, this is often what, okay, this is often what happens with people who are very depressed.
00:37:18.620 Let's say they have a minor argument with their son, someone they love.
00:37:24.420 Just a minor argument.
00:37:26.520 And then they think, well, I acted really badly in that argument.
00:37:31.600 I really hurt my son's feelings.
00:37:34.340 Only a terrible person would hurt someone's feelings.
00:37:37.900 I must be a terrible person.
00:37:39.840 I'm a terrible person now, and I'm going to be a terrible person in the future, and there's nothing that can be done about it.
00:37:48.380 And that's the sort of thinking that leads to suicide.
00:37:51.260 And you can see the person going down the hierarchy of their beliefs, right, from the little argument, which is nothing, all the way down to something that is so basic to their self-concept that if it's challenged, they want to die.
00:38:06.000 That happens a lot in real depression, real severe depression.
00:38:10.620 That happens continually.
00:38:12.040 It's almost like the hallmark of the illness.
00:38:15.300 Well, that's interesting.
00:38:17.760 I have been thinking about depression, you know, personally.
00:38:22.840 I was presenting that conception of depression, you know, that cascade of doubt that I outlined as an illustration of what people are motivated to avoid when they practice self-deception.
00:38:40.580 They don't want to start unraveling because they don't know where the unraveling will end.
00:38:46.980 So they're afraid of that.
00:38:50.980 And that's partly why they won't investigate.
00:38:54.900 Yes, you know, off the top of my head, I would agree with that kind of argument.
00:39:01.920 They don't want to pursue reality very far when it's easier to flip part of it and be unconscious, or try to become unconscious, of the flip you're making.
00:39:26.100 Now, in your book on self-deception, you outlined some of the social costs of self-deception, say, in relationship to warfare and talked about the way that the biases that we have to perhaps reject contradictory information can produce catastrophic consequences, say, at the policy level.
00:39:52.960 You said, for example, that leaders and the people they purport to lead are often extremely over-optimistic at the beginning of a war, and also have a proclivity to derogate and minimize the strength of their enemy.
00:40:11.040 So, when you did your work on self-deception, did you draw any ethical conclusions from it?
00:40:23.540 I mean, as an evolutionary biologist, you see it as a strategy in a sense, but it's a strategy that has a lot of costs.
00:40:32.300 Initially, initially, I was very much down on both deception and self-deception.
00:40:41.060 I was very much biased towards the truth and honesty.
00:40:48.080 Then, I think, it was when I saw the degree to which deception was advantageous.
00:40:58.000 I mentioned in the book lots of examples from other animals of deceptive behavior and even morphology.
00:41:11.320 And then, self-deception, I was against, you know, doubly, if you will, because you're deceiving yourself.
00:41:22.760 So, you're both a victim and a victimizer, as I imagined it.
00:41:29.700 Then, I came to kind of relax about both of them.
00:41:34.380 I saw situations in which deception is something I would practice consciously, you know.
00:41:44.620 But again, it might have to be a fairly serious situation in which you would have to construct a serious lie to get out of it.
00:41:58.960 And I know there have been situations in my life not too, too long ago where I've spent a lot of time constructing a deception
00:42:10.820 that gives off the minimal amount of cues so that it's hard to detect, if you will.
00:42:19.720 When I walked my clients through situations where they had to construct deceptions to avoid, let's say, some serious consequence
00:42:30.980 or maybe to gain some serious advantage, which often backfired in the long run,
00:42:38.420 one of the things that seemed useful to do was to trace back into their story the events that led to the necessity of the deception.
00:42:52.840 There's a Canadian songwriter who wrote a line that struck me in this regard.
00:42:59.520 He said,
00:43:00.100 There is no decent place to stand in a massacre.
00:43:04.360 And my response would be, well, you should unpack the actions that led you to be there to begin with, right?
00:43:14.100 Because sometimes there's no good way out of something, but there might have been a good way of not having it arise in the first place.
00:43:24.060 There's no decent place to stand in a massacre.
00:43:28.960 Does that include the victims?
00:43:31.640 Well, that's a good question, isn't it?
00:43:34.360 You know, if you've been a victim, let's say, you've been a victim, and genuine victim,
00:43:40.880 I saw this in my clinical practice a lot as well.
00:43:45.980 Despite the genuineness of your victimization,
00:43:48.820 it still might be useful asking yourself if you did anything that you could change
00:43:55.660 that increased the probability of that victimization.
00:44:00.320 Increased the probability that it did happen?
00:44:02.980 Yes, that you were victimized.
00:44:05.860 You know, you might say, for example, you might want to address your vulnerabilities.
00:44:11.600 You know, so here, let me give you an example, okay?
00:44:15.200 Imagine that you, I had clients like this.
00:44:18.900 They were women who had been in sequential abusive relationships.
00:44:23.660 Yes.
00:44:24.100 Okay, so they were victims, and often of extremely violent and sometimes psychopathic men.
00:44:31.480 Yes.
00:44:32.720 And what, but what I would help them do, because I couldn't deal with the psychopathic men,
00:44:37.980 they weren't there, was to unpack elements of their actions and assumptions
00:44:44.240 that might have increased the probability that they would enter into those relationships.
00:44:49.420 I can remember cases, though not in much detail now, where I noticed a particular woman
00:45:00.220 who seemed to go from abusive relationship to abusive relationship.
00:45:06.900 So she would sometimes flee an abusive relationship literally by changing the city she lived in.
00:45:17.000 This is in the U.S., but then what was so striking to me was I would visit her or see her.
00:45:27.440 She was not an intimate friend of mine, but a close friend, if you will, or potentially close friend.
00:45:35.020 And I would see, by God, she's gone and found somebody else that's abusive in this new situation.
00:45:44.740 So she's drawn to them in some way, or at least she's not averse to them.
00:45:51.000 You know, she doesn't have her guard up, and she may indeed be attracted to some element of them
00:46:00.520 that is associated with them being abusive.
00:46:04.060 I'm sure, given your clinical practice, you must have examples of this kind of stuff.
00:46:11.540 Here's something to think about in that regard.
00:46:15.080 Oftentimes, women who find themselves in those situations
00:46:19.580 aren't sophisticated enough, for one reason or another,
00:46:24.460 to distinguish between power, aggression, and competence.
00:46:31.120 And so when they see someone acting aggressively,
00:46:34.060 they infer competence.
00:46:38.580 So part of unpacking that would be to help them distinguish between those two,
00:46:43.360 to know that there is a distinction between the raw expression of power and competence.
00:46:49.840 Well, there certainly is.
00:46:52.200 I had no idea that that confusion was involved in the behavioral problem we're talking about.
00:47:03.060 But certainly, Jesus Christ, there's a difference between competence and abuse.
00:47:08.500 Okay, well, let me, I was speaking with an evolutionary psychologist, David Buss.
00:47:17.260 Yes.
00:47:17.840 Just yesterday.
00:47:19.540 And David Buss has looked at the relationship between dark triad behavior.
00:47:25.760 So that's narcissism, Machiavellianism, and aggression.
00:47:32.280 I might have the third one wrong, but that's basically it.
00:47:35.240 Now, younger women, younger inexperienced women, are much more likely to be attracted to dark triad guys.
00:47:46.200 And that's partly, as far as I can tell, because they haven't had the experience to distinguish between narcissism, let's say,
00:47:55.800 and success, and the confidence that comes with success.
00:48:00.860 And that's an example of that inability to distinguish between aggression and competence.
00:48:06.160 Now, the aggression might be necessary to deal with free riders and cheaters.
00:48:14.520 So you're saying, are you, that some of these aggressive men might be attractive to women precisely because they would be hard on the malevolent types you're talking about?
00:48:25.760 Yes, exactly.
00:48:26.920 Exactly that.
00:48:28.700 The capacity for male aggression is necessary, and it's part of what makes men attractive to women.
00:48:36.160 You can see that in their fantasies.
00:48:38.880 So the most common forms of pornographic fiction that women read feature surgeons, pirates, vampires, billionaires, and unfortunately, I can't remember the other one.
00:48:55.820 But they're men who have, you could say power, but that's not it.
00:49:02.200 I don't believe that.
00:49:03.140 I believe it's something like competence and the ability to use aggression when necessary.
00:49:10.100 And then the narcissistic men, they're parasitizing that in some sense.
00:49:17.800 They're mimicking that, and that's why they're attractive to inexperienced women.
00:49:25.600 Plausible?
00:49:27.300 Sure.
00:49:28.020 When I see the political arguments that take place now, the accusation that the male hierarchy is a, let's say, oppressive patriarchy, right?
00:49:43.560 An exploitative structure.
00:49:46.260 What I see in that is partially this inability to distinguish between power and competence, and also failure to understand when aggressive action is necessary and desirable.
00:50:03.400 And that seems related to the free rider problem.
00:50:07.680 Yes, I hear you.
00:50:08.960 No objections to that?
00:50:13.100 None so far.
00:50:15.920 All right.
00:50:16.760 So let me go back to the idea of belief dependencies.
00:50:21.320 So, you know, we all say that some things are more important to us than others.
00:50:29.920 We hold them more dear.
00:50:33.040 So, the integrity of those core beliefs, I think, is related directly to the inhibition of negative emotion.
00:50:49.120 Inhibition of what?
00:50:50.300 Of negative emotion, anxiety particularly, and doubt.
00:50:59.380 So, the beliefs that are more important to us are beliefs that other beliefs depend on, and when they're threatened, it's very emotionally destabilizing.
00:51:11.860 And very hard on us from a physical perspective as well.
00:51:19.700 Because when our core beliefs are disrupted, we don't know what to do, and therefore we have to prepare to do everything.
00:51:27.700 That's the emergency response, right?
00:51:31.820 That's fight or flight, I suppose.
00:51:33.780 But it's very physiologically costly.
00:51:35.760 And so, I think sometimes people engage in self-deception so that they don't have to undermine their core beliefs and dysregulate themselves like that.
00:51:48.880 That seems plausible to me, sir.
00:51:52.000 The unfortunate problem seems to be that the long-term consequences of that are often not good.
00:52:00.820 But, you know, you ignore a profound danger at your peril.
00:52:08.660 It saves you from the psychophysiological exhaustion in the present.
00:52:13.700 But, if the problem is really there, things unravel really badly in the future.
00:52:20.520 Yes.
00:52:21.660 So, we could say that self-deception has its advantages, even as an adaptive strategy.
00:52:32.020 And I think the idea that it can serve deception, as a handmaiden, say, is a powerful idea.
00:52:40.820 But it doesn't look like it's an optimal strategy.
00:52:45.100 And so, one of the things I wanted to ask you is, is there justification in evolutionary biology for, you know, you said strategies compete, right?
00:52:57.860 And so, does that mean that there's an optimal strategy that we approximate, that we have an intuition of, even?
00:53:07.480 God, I would tend to doubt that there's an optimal one.
00:53:13.140 First of all, if there were an optimal strategy, why isn't everybody adopting it?
00:53:19.920 Well, you may answer me by saying, well, it's adaptive, but not in all situations.
00:53:31.800 Fine.
00:53:32.720 But, if it's always adaptive in these situations, you would expect them to get matched up together fairly quickly over time, wouldn't you?
00:53:45.460 That's a good objection.
00:53:47.360 Yeah.
00:53:47.700 It's a tough one, right?
00:53:49.160 I mean, you've also talked about runaway sexual selection.
00:53:54.140 Okay, so, this is an answer to the problem you just posed, possibly.
00:53:59.680 I mean, one thing that does seem to have been selected for, that operates across a very wide range of contexts, at least in human beings, is something like general intelligence.
00:54:11.220 You know, and the cortical expansion that produced that.
00:54:14.860 And that's been selected by women.
00:54:16.620 So, I would say there's a domain-general ability that might have been selected for, that worked in most situations, and that was greater intelligence, at least with humans.
00:54:28.100 And that doesn't address the ethical issue exactly.
00:54:32.860 And you said it was mostly being selected by women, female choice?
00:54:39.180 Well, they're choosier.
00:54:40.440 Yes, and they're socially brighter.
00:54:45.360 And they are also less likely to agree that a low-intelligence sexual partner would be acceptable.
00:54:55.780 According to David Buss, the evolutionary psychologist I was speaking with just yesterday.
00:55:05.300 So, they are, women do appear to be exerting more selection pressure on general cognitive ability.
00:55:11.780 But I also wonder if there's not an ethical equivalent to that, that's something like, well, the capacity for reciprocal altruism.
00:55:21.640 So, what's your, what are you then saying?
00:55:24.720 That females are a positive selective force for reciprocal altruism?
00:55:31.780 Well, yeah, well, I'm, that's not as strong an argument as the one for, let's say, intelligence.
00:55:39.000 It's harder to define the central element of reciprocal altruism than the central element of intelligence.
00:55:46.820 It's harder technically.
00:55:48.100 But, you know, there is an idea in evolutionary biology, sort of implicit, that women select men who are higher, as high as they can manage in the status hierarchy.
00:56:02.080 And that hierarchy is constructed as a consequence of the exercise of power.
00:56:09.260 And I think that's wrong and dangerous, that idea.
00:56:13.300 I don't think those hierarchies, the male hierarchies that influence female selection, are based on power.
00:56:22.100 I think they're based on something more like competence.
00:56:25.860 And I think it's associated with this capacity for reciprocal altruism.
00:56:30.080 Because you want, if you're a woman, you want a man who's productive, but also generous.
00:56:36.560 Certainly, I don't have anything to say against what you just said.
00:56:40.140 I don't know what David Buss was arguing.
00:56:47.680 Well, I would say something similar.
00:56:50.120 I mean, your book on deception and self-deception is very interesting.
00:56:55.220 Because you point out, in many, many ways, how deception can confer at least a temporary advantage, but often a more permanent advantage.
00:57:06.680 And so, it makes the case that something like honesty is selected for much more difficult to make.
00:57:18.160 But I also wonder if there's a utility in differentiating between deception and mimicry in animals.
00:57:28.480 No, terminologically.
00:57:31.640 But I don't understand why mimicry is not an example of deception.
00:57:40.700 Let's say we're talking about moths or butterflies and predators, birds.
00:57:49.180 So, you will have some butterflies that are perfectly tasty to birds, but there are a couple that are not, that have a poison that they ingest when they're caterpillars,
00:58:14.060 which they retain in adulthood, so that they're distasteful and poisonous.
00:58:20.180 So, then they attract, so to speak, mimics.
00:58:25.520 Because now, if you're a related species, so you're similar in appearance already, but you don't happen to have the poison,
00:58:36.200 then you evolve to be, to resemble that species more and more in order to gain the benefit that they have from having the poison.
00:58:49.420 So, the predator makes the assumption that you've got the poison because you look exactly like the species that has the poison.
00:59:03.080 And then they've done work, but I, you know, it's long ago, disappeared from my memory.
00:59:08.480 They've done work on the relative frequency of the two kinds, and there are situations in which the mimic can be, you know, five or ten times as frequent as the model, as it's called,
00:59:27.040 and they're gaining a benefit, and they're inflicting a marginal cost on the model.
00:59:35.240 There are other situations in which, as the mimics rise in number, they inflict a cost on the model, because now birds snap up the model.
00:59:51.900 Of course, they spit it out, but that doesn't help the model.
00:59:56.800 That just means the bird doesn't swallow the poison.
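The frequency-dependent logic Trivers describes here can be put in a toy sketch. Everything in it is an illustrative assumption (the `attack_rate` function and all numbers are invented for the example, not figures from the literature he cites): the protection the warning pattern confers erodes as tasty mimics come to outnumber the poisonous model.

```python
# Toy sketch of Batesian mimicry (all numbers are illustrative assumptions):
# a predator's willingness to attack the shared warning pattern rises with
# the fraction of tasty mimics among the insects bearing that pattern.
def attack_rate(mimics, models):
    """Probability a predator risks attacking the shared pattern."""
    tasty_fraction = mimics / (mimics + models)
    return 0.05 + 0.9 * tasty_fraction  # rare mimics: ~5% risk; common mimics: ~95%

# Rare mimics ride almost free on the model's protection...
print(attack_rate(10, 1000))
# ...but common mimics erode it, inflicting a cost on the model as well,
# since the predator now attacks everyone wearing the pattern more often.
print(attack_rate(1000, 100))
```

Note that the cost falls on model and mimic alike: the attack rate is a property of the shared pattern, which is why a too-frequent mimic degrades the model's defense.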
01:00:00.860 You could make a similar case for narcissists.
01:00:06.420 Imagine that the model is someone competent and confident, and maybe assertive because of that, and productive and generous,
01:00:19.740 but the mimic just mimics the confidence and assertiveness.
01:00:24.320 And then there is a cost inflicted on the model, because if there are enough narcissistic mimics,
01:00:32.700 then the existence of the model starts to become doubtful.
01:00:38.140 You're absolutely right.
01:00:39.880 And there's a rich literature on that, and I used to study it very carefully,
01:00:45.460 because I was interested in the, you know, the interaction between deception and detection of deception.
01:00:53.920 And, of course, you toss in self-deception to bypass the initial problem.
01:01:02.560 Did I tell you about the fact that deceivers have a slightly higher pitch to their voice?
01:01:10.120 Yes, you mentioned that.
01:01:12.420 That's very interesting.
01:01:13.300 It's a very good rule of thumb.
01:01:14.740 Do you know anything about the physiological mechanisms?
01:01:18.080 Is that a stress response, or did anybody study why that is?
01:01:22.540 It's stress due to fear of detection.
01:01:27.840 Okay.
01:01:28.180 That's believed to cause you to contract your belly when you're making a sound,
01:01:38.480 and so it rises in pitch.
01:01:42.100 As a side note, an interesting story from your book,
01:01:47.420 you talked about one species of butterfly that could lay five different kinds of eggs
01:01:54.760 to mimic five different kinds of poisonous butterfly species.
01:02:00.920 Right.
01:02:01.740 So, let's talk about mimicry for a minute, and deception.
01:02:07.340 Because it's such, they're so tightly interwoven.
01:02:11.820 Human beings are very imitative.
01:02:15.320 And so, someone growing up can choose to mimic a particular model, let's say.
01:02:21.600 And that model might be someone competent, or it might be someone narcissistic.
01:02:28.140 And so, you could mimic competence and become competent,
01:02:33.300 or you could mimic narcissism and become narcissistic.
01:02:37.640 And that's partly why I think maybe there's a useful distinction to be made between mimicry and deception.
01:02:44.600 I mean, not in the cases you raised, with the butterflies, but in the human case.
01:02:53.920 Because there are psychological mechanisms involved, the problem becomes a lot thornier.
01:03:00.740 What's the definition of a narcissist?
01:03:04.300 You can define it by personality.
01:03:06.840 So, narcissists tend to be extroverted, with a lot of positive emotion, and disagreeable.
01:03:19.880 So, very little empathy, and more likely to be aggressive.
01:03:25.960 So, and that is a masculine pattern to some degree, because men are more extroverted than women,
01:03:35.000 especially in assertiveness, and they're less agreeable than women.
01:03:41.360 It's the extremes, though, when you get the extremes there, you have something like temperamental narcissism.
01:03:46.480 And then, if they're low in conscientiousness, that's even worse.
01:03:51.220 Because then, well, they're neither productive, nor dutiful, nor honest.
01:03:56.420 None of those.
01:03:57.540 And maybe that's psychopathy.
01:03:59.780 Maybe.
01:04:00.980 It's not clear.
01:04:02.580 And then, you could think about it socially, is that a narcissist is someone who assumes
01:04:09.620 his or her status is higher than those around them would claim.
01:04:18.380 There's a very important literature on psychopaths, because I found it was transformative when I read these papers.
01:04:32.920 They were written by Canadian mathematicians, and there was a psychopathy scale that someone had invented.
01:04:47.040 Robert Hare, I believe, invented the scale.
01:04:50.200 Well, psychopaths, there are violent psychopaths, and they are, of course, of considerable danger,
01:05:00.160 but they are outnumbered by non-violent psychopaths, and their definition, I think, has to do with
01:05:11.400 lack of empathy, lack of feeling for others.
01:05:18.660 A non-violent psychopath, according to this Canadian literature, they studied...
01:05:26.160 Was it the propagation of exploitative psychopathic behavior in populations where there was no
01:05:34.680 punishment for free riders?
01:05:37.220 The reason I ask is because I was thinking that...
01:05:42.340 Oh, Krupp!
01:05:44.580 Was that the name of the people?
01:05:46.560 No, you were mentioning someone else in Canada, right?
01:05:51.300 Yeah, Hare developed the scale, but he didn't do the work you're speaking of.
01:05:55.520 Okay, Krupp.
01:05:57.480 So, let me check the document now.
01:06:01.420 I believe the paper you're referring to is Nepotistic Patterns of Violent Psychopathy: Evidence for Adaptation?
01:06:10.860 Whether Krupp et al. also did the work on its being held at a 1% to 3% frequency in a population, I'm not sure.
01:06:19.440 There, the notion was that the psychopath is held in a frequency-dependent equilibrium.
01:06:29.700 In other words, it doesn't go down to zero, because when there's only 1% that's a psychopath,
01:06:38.260 they're positively selected compared to the general population.
01:06:42.920 However, when they reach 3% frequency, they're already bumping up against the upper boundary,
01:06:50.980 so they're selected against.
01:06:52.780 You could also imagine that when their frequency declines in a given population,
01:06:59.700 that people are much less alert to the possibility of psychopathy.
01:07:05.320 And so, then, the deceptions that they engage in are less likely to be detected,
01:07:11.380 and they spring back into existence.
01:07:15.200 Well, yes, but remember that that's automatically true of these psychopaths,
01:07:24.660 because they're held between a 1% and 3% frequency, which is low.
01:07:29.600 So, as I mentioned in just a little paragraph there, in other words,
01:07:37.660 when you're a psychopath, you're always encountering the general population,
01:07:41.980 but since the percentage is only 1% to 3%,
01:07:50.240 a member of the population itself is only experiencing a psychopath
01:07:58.760 once every several generations.
01:08:06.400 So, selection is bound to be weaker on the detectors of psychopaths,
01:08:14.180 although the fact that it doesn't rise above a 3% frequency
01:08:20.980 does suggest that at that frequency, there's too damn many of them,
01:08:28.040 and so people start paying attention.
01:08:30.760 The fact that you just laid out, too,
01:08:34.800 that even when psychopaths are relatively successful in a population,
01:08:40.620 they don't exceed 3%,
01:08:43.260 also indicates that that psychopathic exploitation,
01:08:49.500 which might be regarded as the purest expression of arbitrary, selfish power,
01:08:56.460 is actually not a very good strategy.
01:09:00.400 Well, I'd rather be part of the 97%
01:09:06.540 that isn't psychopathic.
01:09:10.820 I just found the whole argument of Krupp very powerful,
01:09:16.920 because these psychopaths sure acted as if they'd been under selection,
01:09:23.360 because they favored their relatives.
01:09:27.700 Yeah, I didn't know that about psychopaths.
01:09:29.540 I didn't know that, see, because,
01:09:31.200 well, psychologists, when they're talking about psychopaths,
01:09:35.360 they tend to generally assume that
01:09:37.640 there's zero social relatedness
01:09:41.100 governing their behavior.
01:09:42.600 The fact that they favor kin is extremely interesting.
01:09:47.100 I wasn't aware of that.
01:09:48.760 Yes, indeed.
01:09:51.060 And, classically, you know, psychopathy is a negative trait,
01:10:01.200 where you and I, who aren't psychopaths, have the positive trait.
01:10:06.760 It's just like any other case where there's a negative trait
01:10:10.380 that's held at a low frequency,
01:10:14.500 but basically, it's being forced down to zero.
01:10:18.740 But what this literature said was, for these psychopaths,
01:10:23.400 it's not being forced down to zero.
01:10:25.820 If it gets down to one, it's being bounced back up.
01:10:30.160 It probably oscillates between one and three percent, or something like that.
01:10:38.200 But when it gets infrequent enough, it's definitely beneficial enough
01:10:43.720 to stay in the population, you know.
01:10:48.960 Well, it could well be that the effectiveness is dependent on the low frequency.
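Trivers's bounce-back argument is, in effect, negative frequency-dependent selection, and it can be sketched with a standard replicator update. Everything here is an illustrative assumption: the linear fitness function and the 3% equilibrium point are chosen only to echo the 1% to 3% band discussed above, not parameters from Krupp's paper.

```python
# Negative frequency-dependent selection sketch (illustrative numbers):
# psychopaths ("cheaters") do better than average when rare, worse when
# common, so their frequency is pulled toward a low internal equilibrium.
def cheater_fitness(p):
    """Cheater fitness: above 1.0 below the 3% equilibrium, below 1.0 above it."""
    return max(0.0, 1.0 + 0.2 * (0.03 - p) / 0.03)

def simulate(p0, generations=200):
    """Iterate the replicator update from starting frequency p0."""
    p = p0
    for _ in range(generations):
        w_c = cheater_fitness(p)            # cheater fitness at current frequency
        w_bar = p * w_c + (1 - p) * 1.0     # population mean fitness (others = 1.0)
        p = p * w_c / w_bar                 # replicator dynamics
    return p

# Whether cheaters start rare or common, the frequency settles at 3%:
print(round(simulate(0.001), 3))  # → 0.03
print(round(simulate(0.10), 3))   # → 0.03
```

Below the equilibrium, cheaters out-reproduce everyone else and rebound; above it, detection and retaliation (folded here into the falling fitness curve) push them back down, which is the "bounced back up" dynamic described above.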
01:10:59.960 So, you might wonder what happens to societies where the psychopath incidence exceeds, say, five percent.
01:11:08.440 Like, do you suppose those societies get exceptionally punitive?
01:11:13.020 What do you suppose happens to knock down the psychopath percentage?
01:11:16.640 That's a good question, my friend.
01:11:20.520 And I don't have any answer off the top of my head.
01:11:26.200 So, this paper that you cited, one of the conclusions is that,
01:11:30.680 in light of Wakefield's 1992 definition of mental disorder,
01:11:37.840 evidence that psychopathy retains nepotistic design features
01:11:42.420 is at odds with psychopathy being a mental disorder,
01:11:47.260 given that a diagnosis of mental disorder tends to be
01:11:50.340 positively associated with the victimization of genealogical kin.
01:11:55.780 It's not the case with Krupp's work.
01:11:58.820 Absolutely not.
01:12:00.540 And you see, figure after figure, with different relatedness categories,
01:12:10.020 showing that the psychopaths are treating them better.
01:12:16.060 Oh, right.
01:12:17.520 It's how much they score on the psychopathic scale.
01:12:22.420 So the higher they score on the psychopathic scale,
01:12:26.080 the more they are biased towards relatives.
01:12:30.540 Yeah, that's really something. That's really remarkable.
01:12:33.580 I didn't know that.
01:12:34.660 And that is a huge finding,
01:12:37.240 and I can see why it showed up on your radar.
01:12:42.480 Yes.
01:12:43.220 And it's also a very interesting definition of mental disorder,
01:12:47.180 that you can tell if someone suffers from a mental disorder
01:12:52.060 because they victimized genealogical kin.
01:12:56.760 Yes.
01:12:58.060 That's what you think.
01:12:58.900 That's not the only criterion, obviously,
01:13:01.660 but it's an interesting criterion for a definition.
01:13:04.880 Yes.
01:13:05.160 And it's not true of psychopaths.
01:13:09.220 They don't victimize their kin.
01:13:11.060 Do you suppose that the exacerbation of nepotism
01:13:16.360 is actually part of a psychopathic strategy?
01:13:20.860 Do you think you could go that far?
01:13:24.080 Well, this paper has made me wonder about the relationship
01:13:30.560 between nepotism as a phenomenon and psychopathy per se.
01:13:36.160 So do you suppose that if psychopaths are more likely to be nepotistic,
01:13:42.420 is the reverse true?
01:13:45.120 Are radical nepotists more likely to be psychopathic?
01:13:51.140 Whether radical nepotists are like that, that I don't know.
01:13:57.580 What do you think the best evolutionary theories have to say
01:14:02.920 about the persistence of homosexuality, especially among men?
01:14:07.980 One of the most interesting subjects has to do
01:14:16.280 with the repression of homosexual tendencies,
01:14:24.080 and it applies especially to Jamaica,
01:14:28.880 because Jamaica is violently against homosexuals.
01:14:36.060 They're murdered, and they have to hide, and so on.
01:14:48.440 But anyway, what was I going to say about homosexuality?
01:14:54.660 Evolutionary theory.
01:14:56.120 Yeah.
01:14:58.960 So there was an excellent paper published in Georgia, the state of Georgia in the U.S.
01:15:07.380 You documented that in your book.
01:15:10.100 I know the study, I believe.
01:15:13.660 They gave them a homophobia scale,
01:15:16.920 and then measured their penile erectile response to homosexual pornography,
01:15:23.340 and found that those who were most homophobic according to the scale
01:15:28.600 showed a significant increase in erectile function
01:15:36.700 during exposure to the homosexual pornography,
01:15:41.800 quite distinct from those who were less homophobic.
01:15:45.940 Yes, and what they did was, they showed them six six-minute films,
01:15:52.360 and the first was of a man and a woman making love,
01:15:58.540 and then they graphed the penile growth
01:16:03.900 in the homophobic and the non-homophobic,
01:16:07.940 and they were statistically identical.
01:16:12.840 Then they showed them a six-minute movie of two women making love,
01:16:18.780 and it started taking off like the man-and-woman film,
01:16:23.100 but for whatever reason, it sort of leveled off.
01:16:28.660 Again, there was no difference between the two categories of men.
01:16:33.680 Now,
01:16:34.080 the third
01:16:34.720 one was
01:16:35.980 the interesting
01:16:36.660 one.
01:16:37.540 That's where
01:16:38.160 they showed
01:16:38.660 them six minutes
01:16:39.580 of two men
01:16:40.780 getting it
01:16:41.960 on,
01:16:42.620 and the
01:16:44.480 non-homophobic
01:16:46.080 men showed
01:16:47.480 a small and
01:16:48.680 statistically
01:16:49.380 insignificant
01:16:50.740 increase in
01:16:52.880 penis size,
01:16:56.100 so in other
01:16:57.420 words,
01:16:58.080 they didn't
01:16:58.520 respond at
01:16:59.240 all statistically,
01:17:01.820 whereas the
01:17:03.400 homophobic
01:17:04.240 men started
01:17:05.940 and then
01:17:07.480 they climbed
01:17:08.760 and they
01:17:09.360 kept climbing
01:17:10.320 and they
01:17:11.300 got two
01:17:11.900 thirds of
01:17:12.500 the way
01:17:12.920 up to
01:17:14.280 the level
01:17:14.960 at which
01:17:15.760 they responded
01:17:16.600 to two
01:17:17.220 women,
01:17:18.400 and yet
01:17:19.740 they denied
01:17:21.540 any
01:17:22.240 homosexual
01:17:23.600 tendencies.
01:17:25.300 These were
01:17:25.740 all men
01:17:26.320 that rated
01:17:27.240 themselves as
01:17:28.360 never had
01:17:29.880 a homosexual
01:17:30.460 experience,
01:17:32.680 never had a
01:17:33.580 homosexual
01:17:34.020 fantasy,
01:17:34.980 and never
01:17:36.700 even had
01:17:38.040 a homosexual
01:17:39.320 thought,
01:17:40.800 or so
01:17:41.400 they said.
01:17:43.080 So,
01:17:43.960 that was a
01:17:45.340 fascinating
01:17:45.900 result.
01:17:46.960 That's H.E.
01:17:48.460 Adams,
01:17:49.200 I believe,
01:17:50.200 "Is Homophobia
01:17:51.320 Associated with
01:17:53.160 Homosexual
01:17:53.760 Arousal?"
01:17:55.220 Probably it
01:17:56.080 is.
01:17:57.480 Are you
01:17:58.220 aware of
01:17:58.720 that paper?
01:17:59.860 Only since,
01:18:01.200 I think I've
01:18:02.200 heard of the
01:18:02.820 paper before,
01:18:03.720 but I
01:18:04.240 reacquainted
01:18:05.180 myself with
01:18:06.120 it when I
01:18:06.700 was reading
01:18:07.180 through your
01:18:07.760 book today.
01:18:08.860 Right.
01:18:09.900 I don't know
01:18:10.420 if it's been
01:18:11.000 replicated,
01:18:12.180 but perhaps,
01:18:14.140 I don't know,
01:18:14.780 I don't know
01:18:15.300 that.
01:18:16.180 No,
01:18:16.720 I,
01:18:16.820 well,
01:18:17.480 I don't know
01:18:18.220 of it being
01:18:18.820 replicated,
01:18:19.680 but,
01:18:20.140 you know,
01:18:21.040 again,
01:18:21.480 I'm at
01:18:23.080 the stage
01:18:24.680 in life
01:18:25.160 where I'm
01:18:25.880 not keeping
01:18:26.480 up with
01:18:27.180 new
01:18:28.960 work.
01:18:29.820 Well,
01:18:30.320 I really
01:18:30.680 enjoyed talking
01:18:31.400 to you,
01:18:31.840 and I really
01:18:32.660 benefited from
01:18:33.440 your work
01:18:34.040 scientifically and
01:18:35.900 practically,
01:18:36.640 and I
01:18:37.380 appreciate very
01:18:38.320 much the fact
01:18:38.940 that you
01:18:39.200 talked to me
01:18:39.700 today.
01:18:41.040 All right,
01:18:41.920 Jordan,
01:18:42.360 if there's
01:18:42.840 nothing else,
01:18:43.900 it was a
01:18:44.860 pleasure talking
01:18:45.800 to you,
01:18:46.500 and God
01:18:47.420 bless you
01:18:47.860 and keep
01:18:48.300 you.
01:18:49.240 Thank you
01:18:49.580 very much,
01:18:50.060 sir.
01:18:50.460 Very nice
01:18:50.940 talking with
01:18:51.500 you.
01:18:53.000 Going online
01:18:53.860 without ExpressVPN
01:18:54.840 is like not
01:18:55.700 paying attention
01:18:56.260 to the safety
01:18:56.800 demonstration on a
01:18:57.700 flight.
01:18:58.460 Most of the time,
01:18:59.200 you'll probably
01:18:59.720 be fine,
01:19:00.340 but what if
01:19:01.260 one day that
01:19:01.920 weird yellow
01:19:02.560 mask drops down
01:19:03.440 from overhead
01:19:03.920 and you have
01:19:04.820 no idea what
01:19:05.600 to do?
01:19:06.200 In our
01:19:06.500 hyper-connected
01:19:07.240 world,
01:19:07.680 your digital
01:19:08.200 privacy isn't
01:19:09.000 just a luxury,
01:19:10.020 it's a
01:19:10.380 fundamental right.
01:19:11.320 Every time you
01:19:11.860 connect to an
01:19:12.400 unsecured network
01:19:13.280 in a cafe,
01:19:14.240 hotel,
01:19:14.780 or airport,
01:19:15.600 you're essentially
01:19:16.140 broadcasting your
01:19:17.040 personal information
01:19:17.860 to anyone with
01:19:18.820 a technical
01:19:19.300 know-how to
01:19:19.920 intercept it.
01:19:20.660 And let's be
01:19:21.100 clear,
01:19:21.520 it doesn't take
01:19:22.260 a genius hacker
01:19:22.900 to do this.
01:19:23.860 With some
01:19:24.240 off-the-shelf
01:19:24.840 hardware,
01:19:25.420 even a tech-savvy
01:19:26.340 teenager could
01:19:27.120 potentially access
01:19:27.940 your passwords,
01:19:28.980 bank logins,
01:19:29.660 and credit
01:19:30.260 card details.
01:19:31.300 Now,
01:19:31.680 you might think,
01:19:32.460 what's the big
01:19:33.000 deal?
01:19:33.360 Who'd want my
01:19:34.000 data anyway?
01:19:34.900 Well,
01:19:35.300 on the dark
01:19:35.720 web,
01:19:36.060 your personal
01:19:36.660 information could
01:19:37.440 fetch up to
01:19:37.980 $1,000.
01:19:39.320 That's right,
01:19:40.060 there's a whole
01:19:40.660 underground economy
01:19:41.580 built on stolen
01:19:42.580 identities.
01:19:43.560 Enter ExpressVPN.
01:19:45.320 It's like a
01:19:45.780 digital fortress,
01:19:46.800 creating an
01:19:47.280 encrypted tunnel
01:19:47.980 between your
01:19:48.600 device and the
01:19:49.320 internet.
01:19:50.020 Their encryption
01:19:50.480 is so robust
01:19:51.600 that it would
01:19:52.080 take a hacker
01:19:52.620 with a supercomputer
01:19:53.480 over a billion
01:19:54.480 years to crack it.
01:19:55.660 But don't let
01:19:56.240 its power fool you,
01:19:57.360 ExpressVPN is
01:19:58.320 incredibly user-friendly.
01:19:59.660 With just one
01:20:00.460 click,
01:20:00.820 you're protected
01:20:01.260 across all your
01:20:02.120 devices.
01:20:02.840 Phones,
01:20:03.340 laptops,
01:20:03.900 tablets,
01:20:04.440 you name it.
01:20:05.020 That's why I use
01:20:05.800 ExpressVPN whenever
01:20:06.860 I'm traveling or
01:20:07.900 working from a
01:20:08.560 coffee shop.
01:20:09.140 It gives me peace
01:20:09.840 of mind knowing
01:20:10.460 that my research,
01:20:11.540 communications,
01:20:12.320 and personal data
01:20:13.300 are shielded from
01:20:14.160 prying eyes.
01:20:15.140 Secure your online
01:20:15.980 data today by
01:20:16.940 visiting
01:20:17.320 expressvpn.com
01:20:18.780 slash jordan.
01:20:19.900 That's
01:20:20.220 E-X-P-R-E-S-S-V-P-N
01:20:22.520 dot com slash jordan
01:20:23.660 and you can get an
01:20:24.440 extra three months
01:20:25.300 free.
01:20:26.340 expressvpn.com
01:20:27.440 slash jordan.
01:20:28.180 We'll be right back.
01:20:34.320 Bye.