TRIGGERnometry - June 13, 2021


Jordan Peterson & Heather Heying: Identity, Religion, Death


Episode Stats

Length

1 hour and 58 minutes

Words per Minute

167.14256

Word Count

19,746

Sentence Count

1,161

Misogynist Sentences

29

Hate Speech Sentences

18
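The words-per-minute figure above follows directly from the word count and the runtime. A minimal sketch of that arithmetic (the function name is ours, and the exact runtime in seconds is an assumption; "1 hour and 58 minutes" is a rounded figure):

```python
def words_per_minute(word_count: int, duration_seconds: float) -> float:
    """Speaking rate: total words divided by the runtime in minutes."""
    return word_count / (duration_seconds / 60.0)

# 19,746 words over roughly 118 minutes lands near the ~167 WPM listed above.
rate = words_per_minute(19746, 118 * 60)
```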


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
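The per-sentence counts above can plausibly be reproduced by running each transcript sentence through the Hugging Face `transformers` text-classification pipeline with the two model IDs named here. This is a sketch under assumptions: the positive label strings ("misogynist", "hate") vary by model card, and both helper functions are ours, not part of any library.

```python
def classify_sentences(sentences, model_id):
    """Score each transcript sentence with a Hugging Face text-classification
    model. Requires `pip install transformers`; downloads weights on first use."""
    from transformers import pipeline
    clf = pipeline("text-classification", model=model_id)
    return clf(sentences)  # list of {"label": ..., "score": ...} dicts

def count_flagged(results, positive_label):
    """Count sentences whose top predicted label matches `positive_label`."""
    return sum(1 for r in results if r["label"] == positive_label)

# Sketch of the pipeline that would yield the stats above (label names assumed):
# misogyny = classify_sentences(sentences, "MilaNLProc/bert-base-uncased-ear-misogyny")
# hate = classify_sentences(sentences, "facebook/roberta-hate-speech-dynabench-r4-target")
# misogynist_count, hate_count = count_flagged(misogyny, "misogynist"), count_flagged(hate, "hate")
```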
00:00:00.060 Before we begin, we'd like to say that in our opinion it is not suitable for children or for those of you who may have a nervous disposition.
00:00:18.340 Hello and welcome to this very special live episode of TRIGGERnometry.
00:00:25.440 I'm Francis Foster.
00:00:26.800 I'm Konstantin Kisin.
00:00:27.900 And this is a show for you if you want honest conversations with fascinating people.
00:00:33.760 I don't think it gets any more fascinating than the two people we have for you today.
00:00:37.640 You know them all, evolutionary biologist Dr. Heather Heying and of course Dr. Jordan B. Peterson.
00:00:43.260 We're going to bring them in in a second.
00:00:45.180 Before we do, I should just tell you very quickly the format of our conversation today.
00:00:49.400 We're going to talk for about an hour to an hour and 15 minutes.
00:00:52.460 It will be us in conversation with Jordan and Heather.
00:00:54.880 Then we're going to have a quick break and then we're going to take all your questions.
00:00:58.600 If you want to submit a question for us to put to either of our guests or for us to discuss as a group, there's a PayPal link below in the description of the video.
00:01:07.560 Or you can just send a super chat.
00:01:09.140 Our team here will collate everything and then we'll ask the most suitable and the best questions about an hour and 15 minutes from now.
00:01:15.740 So, Francis, I think without any further ado, we should get started.
00:01:19.140 Couldn't agree more.
00:01:20.320 Our first guest is evolutionary biologist Dr. Heather Heying.
00:01:24.460 Welcome to TRIGGERnometry.
00:01:26.160 Thank you so much, guys.
00:01:27.500 I'm thrilled to be here.
00:01:29.300 And our next guest is Dr. Jordan Peterson.
00:01:33.100 Welcome to TRIGGERnometry.
00:01:34.660 Welcome back.
00:01:35.840 Good to see you again.
00:01:37.220 It's great to have you back.
00:01:38.440 Listen, let us get started straight away.
00:01:41.020 There's an issue that we've been talking about on the show for a long time.
00:01:46.860 It's something that has really become very important.
00:01:49.640 We talk about identity politics, but there's a deeper thing that's going on.
00:01:54.000 Let me summarize it as briefly as I can.
00:01:57.440 The idea is at some point 50 to 60 years ago, something happened where increasingly you started to see an increase in fatherlessness,
00:02:07.620 increasing breakdown of the family as people describe it.
00:02:10.380 And many people, including one of our former guests, Mary Eberstadt, argue there was a sexual revolution that triggered that.
00:02:16.600 Others have other theories.
00:02:17.780 But nonetheless, many people say the product of all of that has been the focus on identity.
00:02:26.220 The breakdown of the family has resulted in many of the things that we're seeing now.
00:02:31.280 Identity politics, obsession with finding your own community that's not really about your family because you don't really feel attached to yours, to the one that you have, etc.
00:02:41.900 Heather, it's your first time on the show.
00:02:43.760 It's such a pleasure.
00:02:44.880 Tell us, what do you make of all of that?
00:02:46.740 As I look at your cat flicking its ear to my long question.
00:02:50.640 What do you make of all of that?
00:02:53.480 And what are your thoughts on that issue?
00:02:57.100 There's about eight different topics in what you asked, right?
00:02:59.740 Since you raised the cat who's partially visible on camera here, cats are, of course, sexually reproducing organisms, as all mammals are, as all vertebrates are, with a couple, a handful of exceptions.
00:03:12.280 But cats don't really have fathers.
00:03:15.080 I mean, they do at the genetic level, but they don't at the social or behavioral level the way that humans do.
00:03:21.000 And so, you know, there's an ongoing live question as to exactly how monogamous humans really are.
00:03:28.300 And in polygynous cultures, children don't know their fathers as much.
00:03:33.600 It's at least a different kind of relationship.
00:03:35.040 But the cultures that we are living in are at least monogamous by expectation.
00:03:42.080 And, you know, that doesn't suggest that there isn't a whole lot of cheating that goes on, of course.
00:03:46.160 But, you know, what does it mean to have a father then?
00:03:48.400 What does it mean for me to say, for instance, that this cat doesn't have a father?
00:03:52.560 You know, he had a relationship with a mother who nursed him and who taught him some things about being a cat.
00:03:59.120 And then they probably never met again.
00:04:01.640 Whereas in humans, the relationship with both parents is critical for most to emerge into adulthood as, you know, as able humans, regardless of what sex we are.
00:04:16.360 So, you know, was it the loss of fathers in the family, the breakdown of the nuclear family that contributed to identity politics?
00:04:26.320 That's not one of the things that I tend to point to, but I think it certainly contributed.
00:04:30.160 I often go a little bit more recently, you know, 30 years ago-ish, we start to have this perfect storm of postmodernism in academia gaining ground, of legal drugs being shoved at children.
00:04:47.560 And it's different ones for girls and boys because we see different so-called failings in boys and girls.
00:04:53.120 It's the screens, it's the helicopter parenting, but, you know, add into that mix having only one parent in a household so that you don't have the tension, the back and forth, the ability for two parents to actually both have their eyes on a problem, you know, be it behavior that a child's engaging in or, you know, whatever else and discuss it.
00:05:16.060 And, yeah, that's going to make things much harder for children growing up.
00:05:20.900 I'm curious.
00:05:21.560 Jordan.
00:05:22.260 Yeah.
00:05:22.700 Sorry, Heather.
00:05:23.460 Jordan, what about you?
00:05:24.400 You've talked about the sexual revolution and its impact in the past.
00:05:27.180 What do you make of all of this?
00:05:28.580 Well, I think there are two questions that are sort of importantly nested in your question.
00:05:35.720 And one is the issue of identity per se and why that's become a question.
00:05:42.360 And then the second question is why has identity politics arisen as an answer to that question?
00:05:50.520 I would say the first issue has arisen because of technological transformation in large part.
00:05:57.540 And you could say that reproductive technology and its consequent effects on the family would be one branch of that.
00:06:05.000 But, look, if you look at more traditional societies, and so let's say those are societies that aren't transforming technologically at the same rate that the world is transforming now,
00:06:16.440 the issue of identity doesn't come up that much because there isn't much choice.
00:06:21.960 You do what your parents did.
00:06:24.700 If you're a man, you do what your father did.
00:06:26.360 If you're a woman, you do what your mother did.
00:06:28.740 And the question doesn't arise because the options aren't there.
00:06:32.840 Well, now we have options.
00:06:35.680 And we don't know how many there are, what the limits are, and we don't even know what the options are to some degree because things change so quick we can't even keep up with them.
00:06:45.520 I can't keep up with my computer, let alone the world.
00:06:49.540 I mean, it's so interesting, well, and distressing in some sense, and this is partly a function of age.
00:06:55.500 I'm surrounded by gadgets I don't understand that are changing faster than I can adapt to that are smarter than me.
00:07:02.800 And that's just one, and I'm actually fairly technologically astute, you know, like I mastered computers, well, to some degree, at least I could use them when I was in my 30s.
00:07:14.440 And, you know, I was sort of on the edge of, I would say, the generation that managed that, at least to some degree.
00:07:20.020 But the issue of identity comes up when the pathway isn't fixed.
00:07:25.220 Well, it's not fixed.
00:07:26.740 Okay, so then, see, Nietzsche said at one point that the issue of morality per se, the question of what is moral, doesn't emerge until you compare many moralities.
00:07:38.540 It doesn't emerge as a question.
00:07:40.720 The question before that is what's right and wrong within a given moral system, not is there morality as such.
00:07:46.660 That only comes up when the idea becomes questionable.
00:07:50.240 Well, now we have this problem.
00:07:51.920 What is identity?
00:07:53.620 Well, and then, because no one knows the answer to that off the top of their heads and has never generated an articulated and lengthy thought-through response to that,
00:08:04.440 someone can say, well, it's your felt sense of gender.
00:08:08.800 It's like, well, oh, maybe.
00:08:12.700 No, actually.
00:08:14.680 That's a very small subset of what it is.
00:08:18.120 That's a very sparse theory.
00:08:20.720 But, you know, your theory of a helicopter is sparse.
00:08:24.660 And so, you know what I mean?
00:08:26.200 If I asked you to draw a helicopter, you'd draw something like a four-year-old would produce.
00:08:29.840 Because what the hell do you know about helicopters?
00:08:31.240 And if you think identity is less complex than a helicopter, well, you're wrong.
00:08:36.080 And so, you know, the issue comes up because we have so many paths to choose from.
00:08:40.700 And then these political answers come up and no one knows what to do with them exactly.
00:08:45.300 Because no one can articulate the rationale for the traditional modes of being.
00:08:51.340 That's not easy.
00:08:52.520 You know, like that's hard.
00:08:54.620 Is the family a patriarchal structure?
00:08:57.140 Well, I don't know.
00:08:59.460 Maybe.
00:08:59.780 Maybe, well, no, upon consideration over many years.
00:09:04.100 Here's a counter-argument.
00:09:05.200 But it's not like you can just generate that on the fly.
00:09:08.520 And so, you know, we fall prey to these casual critiques in some sense.
00:09:13.180 But they're less casual than our ability to articulate our, you know, implicit beliefs.
00:09:18.260 And so we get caught out all the time.
00:09:20.740 Technological transformation makes us uncertain of who we are.
00:09:23.600 And partly by providing us with more freedom.
00:09:26.060 And then answers arise to the questions that are driven by that.
00:09:29.820 And we have a hard time making philosophical sense of them or defending ourselves against them.
00:09:34.780 You know, I've been trying to think through this idea that human social institutions are fundamentally predicated on the arbitrary expression of power.
00:09:42.520 And it's like, well, yeah, sort of.
00:09:47.080 But wait.
00:09:48.440 No.
00:09:49.840 No.
00:09:50.740 Fundamentally, that's wrong.
00:09:51.820 But it's not like it's simple to think through.
00:09:53.580 And we can return to that later.
00:09:56.540 Heather, before Francis moves on to the next little bit, I was going to ask you, is the family a patriarchal structure?
00:10:02.820 I don't see it that way.
00:10:08.740 And there are compelling arguments in evolutionary biology space as to how it is that males wield power in the outside world more often in a nuclear family situation than females do.
00:10:28.060 But there's a question then of what your definition of power is.
00:10:31.800 You know, this will seem like we quickly fall into a semantic trap.
00:10:35.560 But it's, you know, if you're going to argue, you know, patriarchy.
00:10:39.140 Arche is about power.
00:10:40.620 That is what, you know, patriarchal refers to.
00:10:43.640 So you have to define power.
00:10:45.860 And if women tend to have power within the family, within social systems, women tend to engage in hierarchies in which the competition is more covert.
00:11:03.380 Men tend to engage in competition and hierarchies that are more overt.
00:11:03.380 And so, you know, that is the power that is more overt.
00:11:07.940 More overt, of course it is.
00:11:09.800 It's so visible, right?
00:11:10.560 You can point to it and say, well, that's where the power is.
00:11:12.940 But the power that women tend to wield is more cryptic, more covert.
00:11:17.540 And, you know, that's partially because we are sexually dimorphic.
00:11:22.580 We do have, you know, some history of polygamy in our past such that men are on average bigger and stronger and all of this.
00:11:30.540 And so in engagement between the sexes, women aren't going to use power, aren't going to use physical power.
00:11:37.140 That wouldn't make any sense.
00:11:38.260 That would be a losing move.
00:11:39.600 And so once you start down that road where women are going to be more likely to be able to compete on something like an even playing field,
00:11:50.680 if they're going to be using psychological or social tools as opposed to physical tools when engaging with men,
00:11:56.760 then also you expect, and we see this, there's plenty of evidence in the psychological literature as to, you know,
00:12:04.340 the kinds of competition that women engage in being more covert.
00:12:08.400 So all of that is sort of a background answer to, you know, is the system we live in patriarchal?
00:12:14.620 Only if the only kind of power you're interested in is the overt expressions of power, especially raw physical power.
00:12:20.680 Well, it's also, it's also whether or not you think that the fundamental principle that structures social organizations is in fact the expression of power.
00:12:29.400 And it's not.
00:12:30.700 That's just wrong, I believe.
00:12:32.520 I think it's antithetical to the truth.
00:12:35.400 What do you think it is?
00:12:36.100 Well, if you look even at primates like chimps, and chimps are a lot more violent.
00:12:41.760 The males are a lot more violent than human males, at least reactively.
00:12:45.660 And their social structures are more vicious and violent than human social structures.
00:12:50.680 The alpha male, so to speak, the top chimp, is constantly engaged in grooming and reconciliation, and in stable chimp societies attends to the needs of the females and the infants to a large degree.
00:13:07.780 And so what de Waal has demonstrated quite nicely is that even among chimpanzees, social structures predicated on the arbitrary expression of power, which we could define as the use of force to compel others to act against their wishes, let's say, or against their intrinsic desires, produce very unstable social hierarchies that are ripe for revolution.
00:13:31.340 And the chimps that rule for a long time are markedly cooperative.
00:13:36.820 Now, they're still capable of exerting force, but force isn't the animating principle of the society, which is the claim that social relations among humans are predicated on power.
00:13:48.620 That's the claim.
00:13:49.360 I mean, unless we weasel out of the definition of power, power is when I get you to do something you don't want to do or else.
00:13:57.320 And that's that's not the basis for human social institutions.
00:14:01.080 It's an aberration.
00:14:02.220 And people who use power to obtain and maintain their positions are by and large incompetent.
00:14:11.740 I agree.
00:14:13.080 Yep.
00:14:13.200 I think that's exactly right.
00:14:15.400 And just exactly as you say, alpha males are not brutes, in non-human primates and in human primates where the systems are functional.
00:14:25.440 There may be other males who are brutes, but it's not the alpha.
00:14:28.320 Alphas use affiliative behaviors and conciliatory behaviors and cooperation to accrue loyalty, which then is a way of maintaining power.
00:14:40.780 And, you know, that's only sort of the male hierarchy space.
00:14:44.180 And, of course, I think one of the modern problems that we are experiencing is that male hierarchies and female hierarchies both exist.
00:14:54.980 They follow different rules.
00:14:56.460 There are different rules of engagement.
00:14:57.700 It's a very rare species that actually has both, because typically in other species one or the other sex disperses, and you also don't have multiple males and multiple females in a group together.
00:15:10.440 Baboons being an interesting exception to that.
00:15:13.300 But for us, we've combined, you know, we now live in this attempting to be egalitarian world where we've got this, you know, call it six million years since we split with the ancestors of bonobos and chimps.
00:15:27.780 Six million years of evolution within our particular lineage in which we've got female hierarchical relationships and male hierarchical relationships.
00:15:37.080 And now, I don't know, post-industrial, something since at least agriculture.
00:15:42.220 So at a maximum 10,000 years old, we're combining these two hierarchies and we're pretending it's just going to be easy and fine.
00:15:50.980 And we have to figure it out.
00:15:52.400 Like we have to figure out how to work together.
00:15:54.300 But imagining that we're just going to default into male style of hierarchy and it's going to be cool with all the women or that we're going to just default into female style of hierarchy and it's going to be cool with all the men is an amazing error.
00:16:07.740 And in fact, what I feel like we see is this like pendulum swing from, well, we were just doing it the male way because women came into sort of the external realizations of power later.
00:16:16.800 And that wasn't really working for everyone.
00:16:19.600 And at the moment, we're seeing this pendulum swing over into really female styles of dominance relationship.
00:16:26.840 And it's equally dysfunctional.
00:16:29.020 It's just something we don't have as much history with recognizing, but that doesn't make it any better.
00:16:33.640 You know, this, no, the future is all of us.
00:16:37.740 And that's a very controversial thing to say, especially nowadays, Heather.
00:16:43.520 Right.
00:16:43.840 But don't you think part of this problem, and I'll direct this question first at Jordan, is that we now think that we are above animals.
00:16:52.080 We think that we are evolved, that we are higher, superior, more intelligent.
00:16:57.420 But isn't that one of our main failings, not acknowledging the fact that we are part of the animal world?
00:17:02.960 Well, I think it's very useful to inform your ideas about what constitutes human beings by looking at carefully controlled experiments on animals and observations of animals.
00:17:18.120 Because, well, if you look at our morphology, our physiology, I mean, we share organs pretty much all the way down the vertebrate chain.
00:17:27.060 You know, there's a lot about us that's like other creatures.
00:17:29.660 Now, you know, there's important differences, too.
00:17:31.500 So you look across species and you try to make comparisons where that seems useful.
00:17:34.860 And sometimes that can clue you into how things originated.
00:17:39.000 So de Waal, for example, who primarily studied chimpanzees, has looked at the emotional basis of morality as it emerged in chimpanzees.
00:17:48.680 And before that, people like Jaak Panksepp, who's a brilliant animal psychologist, essentially a researcher of rats, showed that even rats engage in fair play when you put them together in repeated play bouts.
00:18:04.340 So if a big rat can dominate a little rat by the expression of power, which he can do if he's about 10% bigger, he will do that if you put them together and they play once.
00:18:14.140 But if they play repeatedly, if the big rat doesn't let the little rat win 30% of the time, at least, the little rat will stop playing.
00:18:23.800 And the proclivity for reciprocity has been well documented in many mammalian species.
00:18:32.820 And so we can see that our behavioral patterns have roots that are very deep, and that's useful.
00:18:39.360 It helps clarify the philosophical discussion.
00:18:41.460 It helps us understand ourselves better.
00:18:42.960 It helps us understand that moral behavior, for example, isn't merely a consequence of rational decision-making or a rational inference, let's say.
00:18:52.140 It's much more deeply rooted in us than that.
00:18:55.420 And so, now, having said that, I would also say there are many important differences between human beings and other animals,
00:19:02.900 not least our linguistic capacity and our incredible capacity for complex mimicry, which is unparalleled, I would say.
00:19:10.620 In our visual ability, there's lots of things about us that are quite unique.
00:19:13.980 But, you know, you try to derive your wisdom where you can, and you're a fool if you ignore biology.
00:19:18.880 And it's a lot easier to ignore it because it's actually difficult to study.
00:19:22.180 And you have to work at it, and so it's easier, especially now, especially if you're driven by utopian social theory that implies that human beings are infinitely malleable,
00:19:33.160 and you want to make them malleable in your mold of what constitutes the ideal.
00:19:37.400 Well, then you can ignore biology and, you know, call people biological essentialists and other such idiocy.
00:19:42.500 And we see a fair bit of that.
00:19:46.780 And, Heather, what do you think?
00:19:48.700 Do you think by turning our back on the animal world, thinking that, you know, that blank slatism, et cetera, et cetera,
00:19:55.620 what we've actually done is create a whole raft of problems for ourselves that didn't exist before?
00:20:02.820 We're absolutely creating a whole raft of problems for ourselves.
00:20:06.200 I would add to what Jordan said another, I guess, another way in, another way in to point out how similar we are to many other species out there.
00:20:17.520 That what Jordan's pointing out is our shared history, our shared phylogenetic history, evolutionary history,
00:20:22.400 such that we can look at non-human primates and see in chimps and bonobos, who are our two closest extant relatives,
00:20:29.660 many similarities, you know, ability to, you know, and yet chimps and bonobos are so different socially.
00:20:35.880 You know, one of them, bonobos, tend very much towards explicit affiliation and peacemaking.
00:20:42.580 And one of them, chimps, not as much, with the big caveat, as Jordan indicated before, that the alpha is actually a conciliatory figure.
00:20:50.940 And you go farther back, you know, you go into other mammals.
00:20:54.060 And, yes, because we all share this sort of founding thing for which we are named, mammary glands,
00:21:01.340 that obligates us to maternal care, and that obligates us to sociality and relationship, and it brings in love.
00:21:08.700 And, therefore, we get this sort of cascading series of events wherein we have closer relationships
00:21:14.340 than we're at least obligated by anatomy and physiology before the evolution of milk and lactation.
00:21:21.700 But I would say that that's actually not even the most interesting thing, that we can also look to birds.
00:21:29.600 And our most recent common ancestry with birds was some, you know, reptile-y thing that bears no resemblance to any extant reptile now,
00:21:37.460 you know, many hundreds of millions of years ago.
00:21:39.360 So birds and mammals are not all that closely related, but in some species of birds, in crows, in parrots, in a variety of lineages of birds,
00:21:48.440 we also find what looks like love, and long-term relationships, and fair play, and exchanging of ideas.
00:21:57.200 So what is common?
00:21:59.000 How is it that we have converged between, like, crows, and parrots, and dolphins, and elephants, and wolves, and chimps, and humans?
00:22:07.720 And I'm leaving some things off that list.
00:22:09.900 But that list is, like, okay, social, but there's a lot more things than that that are social.
00:22:14.760 Long-lived, generational overlap, so you can have exchange of ideas and learning between generations, and long childhoods.
00:22:22.980 So that, you know, these organisms are born or hatched, depending on if they're mammals or birds, without yet fully being what they're going to be.
00:22:30.900 You know, compare that to, I mean, any mammal is somewhat what we call altricial, which is to say born, you know, not fully able yet to do everything it's going to do,
00:22:43.800 because it's going to need milk from its mother.
00:22:45.540 But a horse, within a half an hour of being born, is able to run around and keep up with the herd-ish, because if not, it's going to get eaten.
00:22:53.540 Whereas the more altricial you're born, the more helpless you're born, the more software you are.
00:23:00.660 So, no, we're not blank slates.
00:23:02.800 But, and this is something that Brett and I talk about a lot, and we've actually written into the book that we have coming out in the chapters on childhood and on parenthood and relationship.
00:23:11.640 We're not blank slates, but we are, we argue, the blankest slates of any species on the planet.
00:23:17.260 Like, when we are born with whatever genetics we have on board, we have the most potential of any species on the planet to become anything else.
00:23:26.320 Like, our fate, our fate as individuals, is less guaranteed at birth than even for a dolphin or an elephant or a parrot, and certainly than for a squid or a cedar or, you know, a mouse.
00:23:39.740 So, yeah, we aren't blank slates, but we have tremendous potential to become almost anything.
00:23:47.380 And that's not at odds with the pushback that you guys, you know, all of you and I as well are giving to identity politics.
00:23:55.580 Because that, you know, that, I think, identity politics is a failure of an understanding of emergence, that the whole is greater than the sum of the parts.
00:24:07.760 It is a reductionist approach, that the identity politics says, you know, I am white and female and middle-aged and American and, you know, over-educated.
00:24:19.560 And, okay, all those things are true, but they don't even begin to encompass a description of me, and I could do the same thing for any of you or any other human being.
00:24:27.240 And that is a reductionist, sort of metric-laden approach to an understanding of what humans are.
00:24:33.300 It's an insistence.
00:24:34.760 It's an insistence, and it's an innumeracy, too.
00:24:38.000 Like, if you can count it, then it's real.
00:24:40.180 And, no, sometimes the most important things aren't countable, and sometimes the things that you can count don't matter.
00:24:45.660 So, let's be our whole best selves as opposed to reducing us to the things that you can label.
00:24:53.280 I wanted to weave a couple of the themes that we've talked about together there, particularly technology and the sexual dimorphism that is inherent in us.
00:25:02.300 Do you think there will come a point where technologically we're capable of eradicating that, should we choose to?
00:25:08.680 There seems to be a lot of people who'd like to do that.
00:25:11.640 Sexual dimorphism?
00:25:13.240 Yeah.
00:25:13.880 Or to compensate for it somehow in a way that it doesn't manifest itself in the way that it currently does, where one sex is stronger and therefore, quote-unquote, more oppressive, and the other one is, you know, you know what I'm talking about.
00:25:27.580 Yeah, no, of course.
00:25:28.580 I mean, I think this is just, can I swear here?
00:25:32.140 Absolutely.
00:25:32.780 Absolutely.
00:25:34.100 This is just batshit crazy, right?
00:25:36.180 I grew up in the 70s and 80s, and all options were on the table for me.
00:25:41.300 And granted, I grew up on, you know, the West Coast of America with parents who, you know, looked at me and said, okay, we weren't expecting that in a girl, but cool, you want to be mathy, you want to be sportsy, like, go for it.
00:25:54.160 You know, do those things.
00:25:55.400 And it was because we, the world seemed to be willing to abandon those regressive gender roles that weren't necessary anymore without ever pretending that me being good at math meant I wasn't a girl, right?
00:26:09.920 Like, that's the thing that we are emerging into now.
00:26:13.420 It's this regressive, backward slide into stereotypes that I thought we were free of.
00:26:21.340 Like, actually, no, men can't gestate or lactate or give birth or any of that.
00:26:27.000 And if you are a particularly caregiving man who wants to be the primary parent for your child, that's awesome, but you still don't get to breastfeed because that's not what men do, right?
00:26:38.980 So, like, we can get rid of a lot of the layer on top that we weren't able to before.
00:26:46.700 And, you know, there's a tension, as you, Jordan, have pointed out, you know, like, we've got all this freedom and you push it too far and you get people really, really confused and unable to make choices among all of the things that are downstream of the freedom.
00:27:00.140 But if we can just hold on to the reality at base and then say, okay, now, you know, if you're a scientist, that doesn't make you male-ish.
00:27:10.280 And if you're an artist, that doesn't make you female-ish.
00:27:13.300 And we, like, we should be able to live in that world.
00:27:16.860 I feel like that is the world that we were creating until, you know, frankly, until sort of the 90s when this grip on academia that looked postmodern and poststructuralist was happening.
00:27:29.260 And then it bled out into the schools of ed and the media and Hollywood and big tech and every place else.
00:27:40.100 And, Jordan, do you think the problem comes from the fact that we don't believe in religion anymore?
00:27:48.140 And if you don't believe in religion, I don't know who said this, then you'll believe in anything.
00:27:51.460 People need to believe in something.
00:27:53.560 And as a result, what we've taken is identity politics as a way to belong to a group.
00:27:58.320 Or do you think it's far more complex than that?
00:28:02.080 Well, there's a New Testament doctrine that you render unto Caesar what is Caesar's and unto God what is God's.
00:28:09.560 And I actually think that's a very astute psychological statement.
00:28:14.760 And now it depends on whether or not you believe that human beings have a religious instinct.
00:28:18.760 And I think broadly that the answer to that is that they do, for a variety of reasons.
00:28:25.940 First of all, religious experience is elicitable by biochemical means.
00:28:33.320 So that's pretty powerful evidence right there.
00:28:35.960 But there are all sorts of religious-like experiences that we all pursue.
00:28:40.880 So we feel awe to beauty.
00:28:44.300 We feel awe to the ideal.
00:28:46.380 We participate in, like, mass worship of sports prowess.
00:28:52.420 We like to see people hit the target, which is the opposite of sin, which means to miss the target.
00:28:58.260 We get enraptured by a gospel-oriented rock performance in a stadium.
00:29:06.320 And we do that with other people.
00:29:08.340 And it fills us with enthusiasm, which is to be filled with God, because that's what enthusiasm means.
00:29:15.000 We feel awe towards certain actions that people undertake, particularly if they're radically altruistic or radically truthful.
00:29:23.920 We're possessed by the desire to mimic people that we see act out an ideal that we don't understand, etc., etc.
00:29:33.620 And, like, I've made a list of about 50 of these things that I'm not going to go through all of them.
00:29:37.780 But so you imagine there's an instinct that orients us towards the ideal.
00:29:42.660 And that would be part of the consequence of us having to emerge from this insufficient biological substructure that Heather was talking about
00:29:51.300 and move towards whatever we could be.
00:29:53.740 We need a vision of what that is.
00:29:55.960 And I believe that vision is deeply influenced by biological factors.
00:30:00.340 Reciprocity, for example, which would emerge, again, from the phenomena of our social nature, our protracted childhood dependence, etc.
00:30:09.720 It's an emergent property of that.
00:30:11.660 All of that is reflective of a religious instinct.
00:30:14.520 So now then you might posit societies have to organize themselves so that that religious instinct is fulfilled without it disrupting other functions or without other functions being disrupted by it.
00:30:28.220 So the religious system collapses.
00:30:30.620 The need for it doesn't disappear.
00:30:32.940 And so then what happens, as far as I can tell, is it just plummets down the value hierarchy until it hits a point where it can find some satisfaction.
00:30:40.600 So politics becomes religion.
00:30:43.420 And then look out.
00:30:44.460 See, you know, maybe we have some sense of an absolute ideal.
00:30:49.320 We have the need for an absolute ideal.
00:30:51.460 And maybe you make that abstract and you view that as God.
00:30:55.940 And God is distant and ineffable and undefinable and abstract and eternal.
00:31:00.360 He's not local to your country and your place like a figure like Stalin or Mao.
00:31:08.780 And maybe if you don't have something to project that ideal onto, something abstracted, let's say, and perhaps even something communal,
00:31:18.080 then it starts to contaminate other functions.
00:31:23.480 And I think this is where the new atheists went so wrong.
00:31:27.520 They assumed that if we just dispensed with all that idiotic superstition, we'd all become, you know, avatars of Richard Dawkins-style logic,
00:31:36.960 assuming we had the capability, which we generally don't.
00:31:40.040 And I just don't see that that's true.
00:31:42.380 I don't see any evidence that that's the case.
00:31:44.600 I think what happens is we become—we dispense with the catastrophic superstition of Catholicism
00:31:52.280 and end up swallowing something much more ungainly and horrible.
00:31:57.520 Jordan, I want to latch on to that because we've been talking about religion and, you know, how we've turned our back on religion.
00:32:06.160 Don't you think a lot of these crises come from the fact that we don't yet know in the West how to address the issue of death?
00:32:14.060 And you can see the way we dealt with COVID.
00:32:16.460 To me, it seemed like a moral crisis.
00:32:18.520 It seemed like a general panic.
00:32:20.220 Like, we don't know how to address the fact that we're mortal.
00:32:25.200 And as a result of that, we just seem to be scrabbling around for anything, don't we?
00:32:29.480 Including identity politics.
00:32:30.920 Well, I think there's some truth in that.
00:32:33.640 I mean, one of the existential facts that compels us to ask religious questions is the fact of death.
00:32:44.480 And so—and that's why I also think you can't eradicate religion, for that matter, because what are you going to do?
00:32:50.120 Are you going to eradicate people's concern with death?
00:32:52.640 I don't think so.
00:32:54.420 You know, and I've been at funerals, and people stand around, and they don't know what to say to each other.
00:32:59.140 Not really.
00:32:59.860 Maybe they can give each other a hug.
00:33:01.320 That's something, isn't it?
00:33:02.640 But they don't know what to say, and that's because we're all speechless in the face of mortality.
00:33:07.220 But that doesn't erase the question.
00:33:09.420 And the question begs an answer.
00:33:11.200 What's it all for?
00:33:13.140 Well, we're trying to figure that out in our collective religious ideation, let's say.
00:33:16.920 So, for example, when I look at Christianity, I think, well, at minimum, speaking as a psychologist, the figure of Christ is the consequence of a centuries-long, a millennia-long discussion of what constitutes the ideal human being.
00:33:31.000 Now, you might ask, is it any more than that?
00:33:33.900 That would be a metaphysical question.
00:33:35.800 But it's certainly at least that.
00:33:38.400 And as that ideal, well, have you got a better solution to that problem?
00:33:43.360 You know, you think about how it's expressed.
00:33:45.520 That ideal is expressed in music.
00:33:47.600 It's expressed in art.
00:33:48.720 It's expressed in literature.
00:33:50.160 It's expressed in architecture.
00:33:51.920 It's expressed in dogma and spiritual belief and experience and commitment and faith, all of that.
00:33:56.880 It's a very multidimensional conception.
00:33:59.720 And, you know, you can dispense with it rationally.
00:34:03.020 But, well, then what?
00:34:06.800 You got something better?
00:34:08.420 Well, no, you don't.
00:34:09.640 You have something that looks like a set of arbitrary axiomatic assumptions that are enforced by fiat.
00:34:16.300 And that's not a—like, part of Christianity, let's say, was an exploration of what the ideal was.
00:34:22.720 There was at least that exploratory element.
00:34:24.820 I mean, I know things degenerate into insistence.
00:34:27.340 But it wasn't—you really think all of that was insistence by force?
00:34:32.520 I don't think that's a plausible explanation in the least.
00:34:35.300 I mean, Christ isn't the sort of character that a power-hungry maniac would dream up to exercise arbitrary control over a docile population.
00:34:45.160 It's just—that just doesn't make sense.
00:34:47.520 I'm going to give our ancestors and us more credit than that.
00:34:50.600 You know, we were at least trying to figure out how to be, ideally, in the face of our catastrophic mortality.
00:34:57.280 And that question, that's not going away.
00:34:59.220 Heather, I can't see your facial expressions at the moment, which is deeply frustrating to me, because I suspect—I can't tell whether there's a lot of agreement or disagreement with you on this.
00:35:11.860 Tell me, from an evolutionary biologist perspective, what is the function of religion?
00:35:17.880 Do we need it to thrive as a species?
00:35:21.100 Well, I'm going to punt the first part of that a bit, because it warrants its own dedicated conversation.
00:35:30.420 But I agree—I agree very much with what Jordan was saying.
00:35:35.240 And I think it actually indicates that the problems with modernity, in part, are about a decohering.
00:35:43.600 So, I also agree with the problems with the new atheists.
00:35:49.660 Atheist is a term that I have used to describe myself, but never new atheist.
00:35:54.080 And that comes with the full understanding that religion has adaptive value.
00:35:59.860 That if it did not, we wouldn't find every single culture in the history of humanity having had some form of religion.
00:36:07.920 And what does it mean for us to be abandoning religion so quickly?
00:36:12.220 Well, we can see part of it.
00:36:14.540 You know, add this to the perfect storm of why it is that people are grasping for meaning.
00:36:20.440 Because, sure, any system that becomes orthodox, that becomes the representation of the sacred, will begin to become corrupt.
00:36:30.160 Like, this is the nature of systems.
00:36:32.700 And they will need to be renewed.
00:36:34.740 And there needs to be tension.
00:36:36.420 And, you know, again, one of the concepts that Brett and I explore in our forthcoming book: there needs to be tension between orthodoxy and heterodoxy, between the sacred and the shamanistic.
00:36:47.120 And, you know, and without this system that people are raised into and they can say, I see that, I see the sacred there, I see the holy, and maybe I want to go to it and maybe I want to turn away.
00:37:02.940 People are just unable to even know where to find meaning.
00:37:07.120 And so, you know, mind, body, and soul.
00:37:09.780 I don't know, I don't have a brilliant grasp of what all is meant by soul, although much of it is in what Jordan is talking about, in the ineffable, in awe, in glory, what we feel when we experience a piece of music that just moves us viscerally.
00:37:25.120 But even that, you know, that word that I just used at the end, that is a recognition that they all tie together, visceral, it's body, right?
00:37:34.560 And so much of what we're experiencing now, the decohering of not just society, but I think of individuals, is people thinking, well, I don't need any of the stuff of the past.
00:37:47.580 I don't even need to be tied to my own body.
00:37:49.840 I can pretend that the thing that I am from the very beginning is simply a social assignment.
00:37:56.780 No.
00:37:57.440 Like, all of this is who we are.
00:37:59.860 Everything about our mind and our soul is embodied as well.
00:38:04.220 And imagining otherwise is wrong.
00:38:08.480 So, you know, I admitted that I was going to punt the question of exactly what religion is for evolutionarily up front, and I did.
00:38:15.980 But what I did say clearly was, given that every culture ever known has had some form of religion, we know there is adaptive value in it.
00:38:26.240 And there is yet to be a culture that is completely without religion, that is succeeding widely.
00:38:35.220 And, you know, there will be those, and Jordan is in a much better position to speak to this than I am, who knows more history than I am.
00:38:40.880 There will be those who will argue perhaps, you know, that China is an exception.
00:38:44.480 But I think that some of what is passing for ideology there, for instance, is tantamount to a new religion.
00:38:53.840 You know, something will fill the vacuum.
00:38:56.500 Christianity is growing faster in China than it did in Rome when it first emerged.
00:39:00.780 So, you know, that's interesting, and the suppression is probably helping it.
00:39:06.760 No doubt.
00:39:08.040 It seems that there's an emergent issue.
00:39:10.660 Heather, maybe you can tell me what you think about this if you don't mind.
00:39:13.620 So, you know, we have to organize ourselves in relationship to values because we can't figure out what to pursue unless we rank order things because we can't pursue everything at once.
00:39:23.760 And so we have to rank order things in terms of their value and their approachability.
00:39:27.540 So then imagine that, well, out of that we can abstract something like value as such, right?
00:39:33.720 And then you might say, well, there's an implicit understanding there that there are higher and lower values.
00:39:39.520 And the whole value hierarchy is oriented around whatever value is.
00:39:44.340 And you could say that the drive to God is the drive to understand the central factor that organizes all value.
00:39:53.200 Or you could say it's representative of the thing that's at the top of a properly structured value hierarchy.
00:39:59.660 We all kind of have an intimation of that.
00:40:01.580 I mean, everyone I ever met upbraids themselves for not being all they could be.
00:40:07.880 Well, compared to what exactly?
00:40:10.480 Well, something, right?
00:40:11.400 Compared to the ideal that we hold.
00:40:12.780 What's that?
00:40:13.520 Compared to the ideal that we hold in our heads.
00:40:15.580 Yes.
00:40:15.960 Well, and it's an odd ideal because what it does is whack you when you've deviated from it.
00:40:22.020 And then your conscience tortures you.
00:40:24.000 Now, conscience can go astray.
00:40:25.940 You know, it can be too severe and it can bring you down.
00:40:29.080 So it's not an unerring guide.
00:40:30.780 But it's an interesting phenomenon because it sort of operates independently.
00:40:35.060 And it's not subject to your will exactly because, you know, wouldn't we all like to be able to just dispense with the dictates of conscience?
00:40:43.740 It's no, you compare yourself to an implicit ideal and you torture yourself to the degree that you deviate.
00:40:51.000 And you might rationalize and lie and deceive and try to ignore it and all of those things.
00:40:55.800 But I've never seen that work.
00:40:58.900 And so that's another example, I think, of something that's akin to, well, that's partly the function of religion from a social and biological perspective.
00:41:06.980 It's like it orients you towards a collective ideal.
00:41:09.840 I think that's absolutely right.
00:41:12.500 If I may tell an anecdote from my deep history, maybe the first time that I was in Madagascar, Brett and I were in Madagascar, before I was doing my research there.
00:41:22.440 And we'd been on a, I don't remember the numbers, but it was something like a 43-hour bus ride that took us only 430 kilometers across the southern half of Madagascar.
00:41:33.280 It was excruciating in every regard.
00:41:35.220 But we met a young man who was on the bus with us, who invited us at our end point to have dinner with his family.
00:41:44.760 And it was at the home of a female judge.
00:41:47.400 This was the early nineties, female judge and her husband, whose work, line of work, I don't remember.
00:41:53.360 And they were, you know, solidly middle-class Malagasy, which still bears almost no relationship to what middle-class looks like in the WEIRD world, the Western, educated, industrialized, rich, democratic world.
00:42:03.940 Um, but we were having, um, a very interesting meal, mostly in French.
00:42:09.700 Um, we spoke almost no Malagasy and they spoke almost no English.
00:42:13.680 Um, until one of them, it was an extended family.
00:42:18.600 There's like eight or nine of the Malagasy family, and Brett and me as young, you know, between college and graduate school at that point.
00:42:25.060 So 22, 23, something, one of them asked us what our religion was.
00:42:31.180 And we said, we didn't have any, we didn't say we were atheists.
00:42:35.120 And Brett actually does not, will not use that word about himself.
00:42:38.480 Um, we said we didn't have any and they recoiled in horror and they almost kicked us out of the house.
00:42:44.800 And if we hadn't already had an hour, hour and a half with them, establishing rapport and establishing that we were decent human beings, I think they would have.
00:42:54.680 And the conversation that continued forward was them trying to figure out how we could possibly be relied on to be moral if we didn't have religion.
00:43:03.620 And, you know, I think having, having been forced into that conversation by well-meaning Malagasy people in my early twenties, really, you know, forced into full consciousness.
00:43:15.920 How is it?
00:43:17.160 How is it then that if I am not answering to a God that I am going to be the, who do I answer to?
00:43:25.300 Right.
00:43:25.480 And, you know, is it this sort of invisible pyramid of values?
00:43:29.180 What is it exactly?
00:43:30.340 And it can't be just you.
00:43:32.640 It has to be, you know, in, in my case, I'm very lucky and, you know, I had Brett and I had each other then and we still do.
00:43:39.420 And so, you know, to have someone else to whom you can always go and say, tell me if I have done wrong.
00:43:45.960 You tell me if I have done wrong and you, you know, you don't hold back and better if you have a few of those people.
00:43:52.560 Because, you know, absent imagining that someone is up there who can see all, you need, you need more than just your own honesty, no matter how honest you are.
00:44:05.480 Francis, aren't you glad you and I have each other?
00:44:07.320 No, to put it quite simply.
00:44:12.040 But, Jordan, I was going to ask.
00:44:14.760 So we are currently experiencing a pandemic.
00:44:18.380 Do you think that's going to see a reawakening of religion as we're confronted with our own mortality, as we're reminded that actually we are not these omnipotent beings?
00:44:28.120 We don't have control over Mother Nature.
00:44:30.660 We can't even control something as simple as a virus.
00:44:34.580 Which religion is the question, isn't it?
00:44:36.140 Yeah.
00:44:37.400 Well, Dr. Randy Thornhill, who studied the relationship between infectious disease and political belief, would predict that people will become more conservative as a consequence of this.
00:44:51.940 And so that's one possibility.
00:44:53.720 You know, I think that when you're faced with a challenge, the issue of what constitutes the ideal and what constitutes deviation from that starts to loom large, right?
00:45:07.600 And there's a reason for that, which is when you're subjected to a catastrophe, one of the logical things to do is to figure out where you went wrong so it didn't happen again.
00:45:18.120 And to figure out where you went wrong is to do something like what Heather described doing with Brett, except internally, right?
00:45:27.720 You posit an ideal in your imagination, and you make that into an avatar.
00:45:33.300 That avatar becomes your judge, and you have a conversation with it, something like that.
00:45:38.240 Or maybe that's all played out at the level of emotion, but that's not optimal.
00:45:41.680 You want to be able to articulate it.
00:45:43.260 And then you call yourself on it, you say, well, maybe if I was a better person, this wouldn't happen to me.
00:45:49.220 And, you know, sometimes that's not true.
00:45:52.620 Sometimes you just got run over by a bus, driven by a maniac, right?
00:45:56.740 It's just, you're in the wrong place at the wrong time, and tough luck for you.
00:46:00.520 But by and large, if you take account of those things that you can take account of, the desire to not reproduce past errors that produced a catastrophe is a pretty useful approach.
00:46:15.800 And we're very, we abstract these things up to a very, very high level.
00:46:20.020 You know, that's one of the things that's very peculiar about human beings is, well, another thing that God is, is the voice of the omniscient collective within.
00:46:28.800 And, you know, you might think, you might object, well, that's not God.
00:46:33.200 It's like, well, these things are not as obvious as they look.
00:46:36.140 I mean, the conscience, again, calling to you from within.
00:46:40.040 I mean, here's, here's part of what that is.
00:46:42.920 So imagine that you're subject to social pressure of all sorts, right?
00:46:46.560 And you see that social structure abstracted as well in literature and stories.
00:46:53.920 And so that's like a concentrated social structure.
00:46:56.800 And you're just completely bombarded by that information all the time.
00:47:01.540 And you build an internal representation of the spirit that that characterizes.
00:47:07.100 And it's what calls you out on your own misbehavior.
00:47:10.980 But that spirit is something that has aggregated itself over God only knows how long.
00:47:17.600 Since the dawn of consciousness itself.
00:47:19.760 A long, long time.
00:47:22.940 And so distinguishing that from something omniscient that's watching you, that's not so easy.
00:47:29.120 And you might say, well, that's still not a metaphysical claim.
00:47:31.980 It's like, fair enough.
00:47:33.200 But when you tangle in the fact that consciousness itself generated that, at least in part, and that you have no idea whatsoever what consciousness is, then you start to skirt the edges of a profound metaphysics of, like the omniscient observer, pretty closely.
00:47:51.660 And those things aren't easily rationalized away, not as far as I can tell.
00:47:58.320 The deeper I've looked into it, the more peculiar and odd it becomes, especially when you start to think about consciousness itself.
00:48:07.120 Heather, I want to give you the last word on this because I want to move on to something a little bit different.
00:48:11.980 So is there anything you'd like to add?
00:48:13.240 You know, hours' worth, but maybe we move on because that was a great place to end, I think, on consciousness.
00:48:19.900 All right.
00:48:20.240 Well, listen, in addition to both of you being experts in your field and really making a significant contribution to the sort of abstract discussion, there's something else that we wanted to talk to you about, which I think is very relevant in the current climate.
00:48:33.560 And, Jordan, I'm sure you'll argue that partly this is a product of technology.
00:48:37.820 Both of you have made a stand at one point or another for something that you felt was important to make a stand about, whether it was principle, whether it was a line being crossed, whether it was an approach you just felt was not appropriate.
00:48:51.380 And I think increasingly all of us in our society are confronted with the fact that there are certain things that are being violated on the one hand that we're unwilling to see violated and say nothing.
00:49:03.380 And on the other hand, it's very easy at the same time to descend into the sort of social media driven desire to destroy people with facts and logic, quote unquote, and have these pointless, meaningless, superficial, trivial battles with each other all the time.
00:49:21.380 And I was wondering, as you two people who've managed to be at the very, you know, in the very heart of this argument that is raging in Western society, yet conduct yourself with a lot of dignity.
00:49:35.140 How do we, as individuals, pick our battles so that we can distinguish between making a stand on something that's really important without finding, per that Nietzsche quote, that the abyss is staring so much back into us that we become the monster that we're fighting?
00:49:55.240 There's a lot to say.
00:49:58.000 And there's not going to be a one-size-fits-all answer.
00:50:00.560 Definitely pick your battles, and don't spend time wading in and mucking about in battles that you have decided you're not going to fight.
00:50:11.680 You know, it used to be before social media was so ubiquitous that authors would say, you know, I only read a couple of reviews or I don't read reviews at all because getting that into my head is not helpful for my creative process.
00:50:25.040 And I think there's a lot of truth in that, that we need to be able to hear our critics, certainly, you know, certainly as a scientist, you know, being open to ideas that challenge what it is that you currently think is true is absolutely paramount, right?
00:50:43.580 So how do you walk that fine line where you can be open enough to hear reasonable challenge from good faith interlocutors and not find yourself just, you know, pelted with tomatoes hurled by, you know, sometimes real people, but often not even, right?
00:51:04.900 So, you know, the fact that much of the chaos online doesn't even seem to be organic actually for me makes it easier to say, you know what, off, just not doing that because I'm not turning off people.
00:51:21.840 I'm turning off some combination of people and chaos.
00:51:25.140 And I suspect actually, Jordan, that you and I and Brett are effective in some ways in this realm for similar reasons, that even though you were a professor also for many, many years, it was, I think, your clinical psychology practice and your relationship with your patients that gave you such a deep sense of the humanity of people.
00:51:51.520 And in Brett's, in my case, it was the particular institution we were at and the way that we were able to teach.
00:51:56.980 And so it was our relationship with our students, wherein you see that so many people in their interactions with other human beings actually kind of maybe don't even regard the person on the other side as a full human.
00:52:09.080 And, you know, we see the dehumanization certainly in, you know, the Brexit era and the Trump era, and, you know, just each side picking a moment, a person, a side, and accusing, you know, fully half of the other, fully half of the population of not quite being real, honestly, you know, deplorable or reprehensible or incompetent or, you know, whatever it is.
00:52:36.280 And I think that having been face-to-face with so many, you know, patients in your case, Jordan, and students in my case who I just came to know so well.
00:52:49.900 You know, what I say is that teaching actually made me much less of a misanthrope than I used to be.
00:52:56.120 I used to have almost no hope for humanity.
00:52:57.940 And people who were superficially familiar with what happened at Evergreen would be like, are you kidding me?
00:53:04.700 It was those yahoo students who, you know, took Brett out, and you were like, no, actually, the yahoo students were indoctrinated by yahoo faculty who were in charge there.
00:53:14.520 That was the indoctrinated faculty and staff who did that.
00:53:18.460 And the cowardly administrators.
00:53:20.180 Oh, for sure, the cowardly administrators.
00:53:22.620 Yes, yes.
00:53:23.180 But none of our students behaved that way.
00:53:27.680 Our students were loyal to a person because we treated them, we treated them like whole human beings.
00:53:35.580 And, you know, what I used to say at Evergreen was, my God, the bar for being an educator at the college level should be, you know, something real.
00:53:43.460 You know how to communicate it.
00:53:44.720 And you fundamentally believe in the humanity of your students.
00:53:47.160 And I'm shocked at how many people fail one or two or all of those.
00:53:52.280 And so I do think that for those of us who do have something real to share, do know how to communicate it, and do fundamentally believe in the humanity of the people on the other side, that comes across.
00:54:04.140 It's clear that we're real and human and actually, you know, fallible, but doing our best.
00:54:10.360 And it probably also makes us more susceptible to garbage arguments from bad faith actors that come our way.
00:54:18.100 And so it may be, for me, I say, you know what, I am willing to just, you know, shut the curtain a lot of the time, because I know that I engage truly and honestly and really with people when I know they're actually people.
00:54:32.080 And I therefore don't have to engage with a lot of the social media stuff.
00:54:36.560 And Jordan, how do you manage to keep your head when all around you are losing theirs?
00:54:45.800 Well, it isn't self-evident that I did.
00:54:51.020 You know, I mean, first of all, battles pick you much as you pick them, maybe more.
00:54:56.600 You know, when I made the first video I made, I was playing around with YouTube.
00:55:01.300 I put my lectures on YouTube.
00:55:02.700 I had a little bit of experience with TV in Ontario.
00:55:05.140 You know, so I was dabbling.
00:55:07.720 And I didn't know what YouTube was, neither did anybody else.
00:55:11.880 You know, I didn't know it was going to be the world's biggest television network by an order of magnitude.
00:55:17.180 I thought it was a place for cute cat videos.
00:55:20.720 It still is, to be fair.
00:55:22.920 Yeah, well, fair enough, you know.
00:55:24.660 Maybe that says something positive about people, too.
00:55:27.960 So, but I got swept up.
00:55:30.500 And, you know, in many ways, it's been absolutely devastating.
00:55:36.040 So, now, I made, I decided a long time ago that I was going to say what I thought.
00:55:42.860 You know, that I was going to try to orient myself.
00:55:45.440 And I think I had experiences that were similar.
00:55:47.720 Heather said something very interesting.
00:55:49.140 You know, that she's much less misanthropic than she might have been, had she not been a teacher.
00:55:53.840 A caring teacher.
00:55:54.800 Well, you know, the spirit that animates our social institutions is a lot more in the nature of caring teacher with deviation than it is power-hungry tyrant.
00:56:05.200 I really believe that.
00:56:06.800 I mean, you have to be extraordinarily cynical not to see at least some of that.
00:56:10.940 And so, the people I've known that were competent in their positions, in competent institutions, were mentors who encouraged.
00:56:21.460 And that's a good relationship to have with people.
00:56:23.880 And it's the opposite of the arbitrary expression of power.
00:56:26.420 And it's something you do, both of you, the mentor and the mentee, let's say, the teacher and the student.
00:56:31.440 You do that in the service of a higher ideal.
00:56:34.200 And I've never met anyone that I admired who exploited their students, who considered that a power.
00:56:39.860 I mean, I've met professors who didn't put their graduate students' names on research papers or were stingy about credit.
00:56:48.500 But it's not like they were thriving.
00:56:51.300 It just doesn't work.
00:56:53.480 So, and when do you stay silent?
00:56:55.700 Well, you stay silent until the weight of having to lie is worse than the danger of speaking.
00:57:03.480 And I'm not fond of deceptive silence or outright lies.
00:57:10.820 In my clinical practice, I never, ever, ever, ever saw anyone get away with anything.
00:57:20.440 And so, that was horrible.
00:57:22.820 It's horrible to think that that's true.
00:57:25.420 It's true.
00:57:26.920 You think you got away with it.
00:57:28.900 You just don't know the causal chain that led to your demise.
00:57:33.480 So, you know, that plus my understanding, too, that one of the factors that sustained and generated totalitarian atrocity was the willingness of ordinary people to swallow things they knew to not be true.
00:57:51.960 That was insisted upon by the sophisticated observers of totalitarian states that I found most influential.
00:57:59.040 You swallow enough lies, your state becomes a monster.
00:58:03.480 And so, you know, I've upbraided myself continually for bringing the chaos of the world into my nice, tight, tidy little family.
00:58:11.940 And it's been terrible.
00:58:13.360 But, you know, in my defense, I would say, I'm trying to stave off something worse.
00:58:21.780 And if you think that it can't get worse, that just means you don't know anything.
00:58:27.240 So, you know, it's up to everybody in their conscience, you know.
00:58:34.360 And I think it's a matter of figuring out what to be afraid of.
00:58:37.520 I'm more afraid of silence and lies.
00:58:42.100 I think they're more terrifying.
00:58:44.460 As am I.
00:58:44.840 I mean, you know that I was born in the Soviet Union, and I've seen the process whereby people have to make those decisions every day.
00:58:52.460 And actually, that's very much the reason we asked you both this question, because you are both remarkable people in your own right.
00:58:59.580 And you've, as you say, you've suffered terribly, both of you, for what you took a stand on.
00:59:05.800 But also, people would argue you've been rewarded, and I would argue you've had both, right?
00:59:09.500 You've been punished and rewarded at the same time.
00:59:12.640 But there are a lot of people who, maybe they're wrong, but they do feel, you know, we get the messages every day.
00:59:18.500 I'm sure you get heaps of messages every day from people saying, look, you know, I hate the fact that my kids are being indoctrinated at school, but I also want to be able to feed them.
00:59:28.960 And if I lose my job speaking out, then, okay, great, they're not being indoctrinated anymore.
00:59:33.360 Or maybe they still are, but I'm also not able to feed them anymore.
00:59:36.620 How do ordinary people who are not college professors, let's say, who may not become a YouTube host or whatever it is,
00:59:45.840 who maybe don't feel that they want to be a public person as a result of making some kind of stand, how do they square those two conflicting desires, Heather?
00:59:56.580 That's an easy question for you.
00:59:58.200 Yeah, exactly.
00:59:59.520 Right. We're all trying to work it out.
01:00:01.160 You're right.
01:00:01.560 I'm sure, I'm sure that all of our inboxes look similar in this regard, and it's heartbreaking.
01:00:07.960 And I do, when Evergreen was actively blowing up, I was saying, you know, I completely understand why all of the people who are coming to us privately to say,
01:00:24.220 hey, I see what's going on, and I support you, but I can't have your back publicly.
01:00:30.880 I felt very generous to them then, and I feel less generous to them now.
01:00:36.720 And I actually am a little surprised that that's the direction it's gone.
01:00:40.280 What is clear is if there is a fever pitch, if there is a moment, then you need a critical mass of people to stand up, or you can't stop it.
01:00:50.120 I don't know what the equivalent of that is when there's not a precipitating event, right?
01:00:56.140 Because most of social media is just this slow burn, and I don't know what it would mean to stand up en masse there.
01:01:03.640 I don't think there is any virtual equivalent in that slow burn space.
01:01:08.560 But when there is an event and someone is, you know, someone's the witch, when someone's a witch, and there are 30% of people in the organization who are privately saying, oh, my God, this is really dangerous and bad.
01:01:23.200 And half of them are saying to the would-be witch, I know you're not a witch, but I can't stand up.
01:01:28.560 If that 15%, you know, and I'm making up numbers here, but, you know, if some substantial fraction, substantial minority of people can actually see what's going on, and I think that's actually a conservative estimate, and then even 50% of those people said, this will not stand, not in my name, no, then those events can get turned back.
01:01:51.180 And, you know, I don't, in the particular case of my story, I don't know that the institution was savable, even if 15% of the faculty and staff had gone public with, you know, and we know that at least that number of people actually saw what was going on and didn't approve.
01:02:10.300 But in some cases, systems aren't that far gone, and they would be savable, and it requires courage.
01:02:19.120 And that wasn't easy for me to feel back when we were losing two tenured positions, and that was all we knew.
01:02:27.140 And it feels easy to, you know, it may feel like it's easy for me to say now with the hindsight of four years and, you know, effectively being funded by the crowd.
01:02:38.120 And it's, you know, it's both less security and vastly more security.
01:02:44.080 Most people won't end up in, you know, being able to transition in that way, nor will they want to, exactly as you said, Constantine.
01:02:50.520 So how can you encourage people to stand up in defense of the system that they are in, you know, the functionality of the institution that they're at,
01:03:00.320 such that it can continue on without being captured, without any more witch hunts?
01:03:04.440 You know, that's the piece of it that I don't exactly know.
01:03:08.960 No, it has to be something like, don't lie.
01:03:13.420 You know, like, if you're being called upon to lie, or to bury your head in the sand, well, where's that going to end?
01:03:22.480 It's not going to stop.
01:03:25.280 It isn't how it works, as far as I can tell.
01:03:27.420 So, it has to be something like that.
01:03:31.140 And, I mean, I'm not saying these are easy decisions.
01:03:33.940 I'm certainly not saying that I made the right decision.
01:03:36.300 I had hoped that they'd have to cart my skeleton out of the University of Toronto.
01:03:40.340 You know, I liked my job a lot.
01:03:42.060 It was just fine.
01:03:44.980 And, you know, it's been nothing but chaos and upheaval since then.
01:03:48.500 I mean, I have been rewarded, but I haven't been rewarded with security, something like that.
01:03:56.560 That's not the right word, exactly.
01:03:59.360 Stability, maybe.
01:04:00.500 Yeah, well, it's high risk, high reward, fine.
01:04:02.560 But I would have settled for much less reward and much less risk.
01:04:06.600 So, and I was perfectly happy with that.
01:04:08.840 Now, I'm not ungrateful for what I've accrued in the meantime, but it's not an improvement, that's for sure.
01:04:15.300 And I wouldn't lecture people about, you know, their moral obligation, because you've pointed out the problem.
01:04:22.080 It's the proximal cost and the distal reward, right?
01:04:25.620 You know, you see a system degenerating, and you're going to say something, and the cost to you is going to be bloody high.
01:04:31.920 And that impact is going to be perhaps not even measurable.
01:04:35.580 But then you run into the problem that Heather just described, which is, well, if 70% of you don't like what's going on, and you don't say anything, then, you know, where is it going to end?
01:04:46.260 And people have to make those moral decisions for themselves.
01:04:49.380 The only thing that I know, perhaps, is that lying doesn't seem like a good idea.
01:04:55.280 And being forced to, particularly, you know, this is where I took my stand.
01:05:00.500 I'm not saying other things that other people want me to say.
01:05:03.680 It's like, up yours.
01:05:05.160 You go and live the consequences of your speech, and I'll live the consequences of mine.
01:05:10.840 And I'm not going to live the consequences of your speech.
01:05:13.980 And that's where my government went too far, as far as I was concerned.
01:05:16.820 It's like, no, no compelled speech.
01:05:19.020 I don't give a damn what your rationale is.
01:05:22.540 Because I don't, why should I trust your rationale when you're transgressing against something so sacred?
01:05:30.180 Why would I possibly trust your rationale?
01:05:33.200 There isn't anything more sacred than the word.
01:05:36.880 I mean, unless you want to just dispense with our accrued morality in its entirety,
01:05:41.880 the one thing we've more or less agreed on in the, you know, relatively Christian West,
01:05:49.080 or perhaps the Judeo-Christian West, is that the word is sacred.
01:05:53.320 Well, if you want to throw that out, be my guest.
01:05:56.160 But you're not getting me to throw that out.
01:05:59.120 So now, whether I'm right about that or not, well, probably, you know, right, Christ.
01:06:03.540 There's lots of ways I'm not right.
01:06:05.040 I can tell you that.
01:06:05.920 But lying is not a good idea.
01:06:10.120 Well, you say that lying is not a good idea.
01:06:13.160 Do we need to redefine the meaning of courage?
01:06:17.240 In particular, when we look at courage, what it used to mean in generations past, going to war,
01:06:22.720 where really, you look at a lot of the courageous people now, they tend to be whistleblowers.
01:06:27.460 They tend to be people who stand out from the crowd and say that they don't agree.
01:06:32.840 Do we need to redefine what it means to be courageous?
01:06:37.260 I think that's always been the case.
01:06:39.080 I mean, if you look at the prophetic tradition in the Old Testament, that's just one story after another of society degenerating towards a tyrannical state and a prophetic voice emerging, you know, representative of God in some abstract sense, saying,
01:06:53.560 be careful, you're deviating from the proper path.
01:06:57.580 And if you don't think there are going to be consequences, you're just not paying enough attention.
01:07:03.460 Well, you know, how do you distinguish a false prophet from a true prophet?
01:07:06.940 I don't know.
01:07:07.440 Maybe the false prophet tells you what you want to hear.
01:07:09.700 I guess I would just add to that, that I think physical courage and mental courage are related, that there's a reason that we use one word for both,
01:07:18.680 and that in the modern world, there is far less opportunity to manifest physical courage or to know what it would be like.
01:07:26.680 And, you know, it's been much discussed, and I agree with the sense that, you know, the lack of war is both beautiful and, in some ways, demasculinizing for society, not having that as even a possibility on the horizon.
01:07:45.600 And, you know, there's plenty of things that girls and women do as well that require physical courage, but it tends to look different.
01:07:53.200 And in our disembodied, behind-screens, you know, especially during the COVID era, a-physical lives,
01:08:03.900 without having any sense of what physical courage might be or what physical risk might be, of course we're now also more confused about mental courage.
01:08:12.900 And so, you know, being told things like silence is violence and also speech is violence.
01:08:18.260 So, you know, the whole thing just falls apart very quickly.
01:08:23.240 Before we go to our break and the Q&A, Jordan, I just want to come back to you on this because I'm not satisfied with the conversation we've had so far,
01:08:30.800 and that's maybe because there is no answer, but I just feel like I've had so many conversations with people where I've said exactly the same thing to them.
01:08:39.080 Lying is bad.
01:08:40.180 It has consequences.
01:08:41.320 If we carry on down this path, we will go to a very bad place, and we're already in a very bad place.
01:08:47.900 And the answer is, yeah, you're just exaggerating.
01:08:50.480 It's just the pendulum swinging the other way.
01:08:53.000 It will swing back.
01:08:54.200 Everything will be fine.
01:08:55.820 And even if it's not, well, at least I get to survive today.
01:08:59.200 You go and die.
01:09:00.000 I'll survive today.
01:09:01.660 What does one say to something like that?
01:09:05.120 They might be right.
01:09:06.260 You know, you just can't tell, can you?
01:09:09.660 I mean, let's say the pendulum does swing back.
01:09:13.320 Well, maybe it swings back because people stopped it from swinging too far.
01:09:17.360 But then the people who said it's going to swing back will say, well, we said it was going to swing back, and look, there it went.
01:09:22.680 Well, that's why you can't, in some sense, make your moral determinations as a consequence even of the analysis of an outcome.
01:09:29.640 I think that's partly why something like faith is necessary.
01:09:34.360 So, do you have faith in truth?
01:09:36.920 Well, prove to me that truth is the best path.
01:09:39.860 Well, I can't.
01:09:41.300 That's not provable, and for exactly the reasons that you just laid out.
01:09:44.660 It's a decision you make at some point, and you weigh the evidence.
01:09:49.580 You stake your life on, you know, you're either incoherent and all over the place, which is its own catastrophe,
01:09:54.900 or you stake your life on a certain set of principles.
01:09:57.780 You do that, you have to do one of those two things.
01:10:01.780 You're either incoherent, or you're coherent.
01:10:04.100 If you're coherent, you stake your existence on some axiomatic principles.
01:10:07.520 They're basically axioms of faith.
01:10:09.520 And you have to decide what those are.
01:10:12.660 And then you live out the consequences of that.
01:10:15.260 And that's each individual soul's journey.
01:10:19.740 And I'm not even sure you can convince people, in some sense.
01:10:24.240 I mean, you can talk to people, and you can tell them what you think,
01:10:26.600 and you can let them make up their own mind.
01:10:28.200 It's one thing I learned as a therapist, and I've watched therapists on TV all the time.
01:10:31.740 They're always telling their clients what to do.
01:10:33.560 I didn't tell my clients what to do.
01:10:35.320 I didn't know what the hell they should do.
01:10:37.880 I mean, they were different people than me.
01:10:41.040 And I didn't want to take ownership of their victories or responsibility for their defeats.
01:10:47.560 You know, that's not a good thing.
01:10:49.060 And so I was always trying to help them figure out what they should do.
01:10:52.740 And, you know, insofar, I think, as I've been successful in my lecturing capability,
01:10:58.340 that's because I'm doing that for the people I'm talking to.
01:11:01.860 I'm not saying you have to believe this.
01:11:03.780 You know, it's...
01:11:06.060 People wrestle with their own conscience.
01:11:09.040 But, you know, if you're forced to violate your conscience, something's up.
01:11:16.180 Either your conscience is malfunctioning, or the thing that's forcing you to violate it is becoming tyrannical.
01:11:22.460 I don't...
01:11:22.980 What are the other options?
01:11:24.020 It's very, very true.
01:11:27.840 Right.
01:11:28.360 We are going to take a very short advert break now from our sponsors.
01:11:33.200 And what we will do is we're going to collate all of your questions that you want to ask Jordan and Heather.
01:11:38.180 And then when we come back, we'll be able to ask as many of your questions as possible.
01:11:42.800 We won't be able to get through all of them, but we will do our best to pick the ones that we can.
01:11:48.320 We'll see you in a few minutes, guys.
01:11:52.220 Francis, do you like biscuits?
01:11:55.000 Stupid question.
01:11:56.240 If you like biscuits as much as him, you have to try Zingy Berry Bakery.
01:12:01.480 They're a small, family-run bakery that make award-winning sweet cookies and savory crackers.
01:12:07.140 Francis will explain how many awards they've won, won't you, Francis?
01:12:11.540 Their sumptuous cookies are made with whole grain oats and real butter,
01:12:15.760 while their savory crackers are made with whole grain oats and are both wheat and dairy-free.
01:12:20.640 They've got a brilliant offer.
01:12:21.860 All you have to do is enter our code, which is, of course, triggered on your first order.
01:12:26.620 And you'll not only get 10% off, they'll give you free delivery as well.
01:12:30.540 That's 10% off and free delivery on your first order with our code, which is triggered.
01:12:34.860 Go to ZingyBerryBakery.co.uk.
01:12:39.140 The link is in the description.
01:12:40.900 It's ZingyBerryBakery.co.uk.
01:12:43.460 And get your biscuits today.
01:12:45.180 I think I've eaten too many biscuits.
01:12:47.160 Never heard him say that before.
01:12:48.320 Hey, Konstantin.
01:12:51.180 Do you like talented designers and tech developers from Latin America?
01:12:55.460 Not this again.
01:12:56.760 I told you, we're not hiring your cousin.
01:13:00.280 Ay, Dios mio.
01:13:01.380 Que Pedro alto dice.
01:13:03.380 I'm talking about an online tech platform for clients to hire highly skilled and vetted freelance developers
01:13:09.640 and designers from Latin America called Cloud Devs.
01:13:14.600 Oh, that sounds great.
01:13:16.040 Tell me more about that.
01:13:17.040 The distinctive feature of Cloud Devs is that all of the highly vetted professionals they offer
01:13:22.660 have at least five years of experience.
01:13:25.800 Plus, you get matched with a suitable talent in just 24 to 48 hours.
01:13:31.340 And I bet their rates are competitive as well.
01:13:33.460 Of course, customers can access a remote pool of highly qualified senior developers
01:13:39.500 at the very competitive rate of 40 US dollars per hour.
01:13:44.360 In fact, we should replace our producer with a Cloud Devs talent.
01:13:48.960 Pedro?
01:13:49.600 No, stop oppressing my people.
01:13:52.980 His name is Juan Carlos and he's not my cousin.
01:13:57.340 He's my uncle.
01:13:58.380 Seriously though, Cloud Devs offer a 14-day risk-free trial period
01:14:04.340 so all their customers can ensure they land the ideal talent for them.
01:14:08.840 Cloud Devs are offering Trigonometry fans a limited time discount of 200 US dollars.
01:14:14.480 Just mention Trigonometry when you place your order.
01:14:17.600 Go to clouddevs.com to find out more.
01:14:21.080 Seriously, please go to clouddevs.com.
01:14:24.340 My family really need the work.
01:14:26.560 Hey, Constantine, do you love Trigonometry?
01:14:30.760 Of course.
01:14:31.600 Incredible interviews, hilarious live streams, hard-hitting satire, plus my handsome jawline.
01:14:37.980 Whatever takes away from your hairline.
01:14:39.980 But if you do love Trigonometry and you want to support us,
01:14:43.280 there's only one place to do that and that's on Locals.
01:14:46.860 Yes, Locals is a brilliant platform that has been incredibly supportive to our show
01:14:51.680 and other problematic creators.
01:14:53.980 The great thing about Locals is that it's a community for people who love Trigonometry.
01:14:59.140 That's right.
01:14:59.620 It's a place for you to hang out with like-minded people, share thoughts, memes and discuss the show.
01:15:04.340 You can enjoy it for free, but it also gives you the option of supporting us for as little as $7 a month.
01:15:11.020 And if you want to give more, you can.
01:15:12.940 We have incredible rewards for our higher-tier supporters as well.
01:15:16.280 We've got everything from mugs, monthly group calls and one-on-two chats with me and KK.
01:15:21.300 Get in.
01:15:22.280 Join our community by hitting the link in the description and the pinned comment below.
01:15:27.000 See you there, guys.
01:15:28.000 Hey, everybody.
01:15:32.140 Welcome back.
01:15:33.040 We're going to bring Heather and Jordan back into the conversation now and ask them some of your questions.
01:15:38.680 Anton, if you can move us around.
01:15:39.980 Here we go.
01:15:40.720 I apologize to both of you for those cringy ads, but it's our brand.
01:15:47.660 Jordan, this is one for you, I think, but I'm sure, Heather, you'll have something to say on this as well.
01:15:51.900 Rob Evans from Australia, I think, says, it seems to me that the descent of the West into Marxism stroke communism is inevitable at this point.
01:16:00.460 Unfortunately, particularly with the COVID pandemic, he says, being used to accelerate it.
01:16:04.860 What can we do about this and how can we reverse it if we can?
01:16:08.460 Well, I don't think it's inevitable.
01:16:13.620 So I think it's not a foregone conclusion by any stretch of the imagination.
01:16:23.920 I mean, if you look at the United States, for example, there's a fair bit of power on both sides of the political spectrum.
01:16:31.300 And the battle's raging, but it isn't obvious that one side has won.
01:16:34.360 So, and I think the Americans have negotiated their way through worse crises than this.
01:16:40.380 I mean, it isn't obvious to me that this is worse than, well, certainly not worse than the time that Nixon was president.
01:16:47.640 That period of time, say, from 66 to about 74, it was pretty, it was much like it is now, except worse, I would say.
01:16:57.840 So I'm not hopeless about this.
01:17:00.780 I think that, and I do think a lot of this is a consequence of radical technological transformation.
01:17:07.460 That's a more unsettling phenomenon in some sense.
01:17:12.040 But I don't really have any answers about what to do with that.
01:17:16.280 Francis, let me tack on to this question.
01:17:18.500 Maybe, Heather, you want to jump on in this because Sherry says,
01:17:21.800 what do you guys think is the best way to peacefully protest against critical race theory in our tax-funded public schools?
01:17:28.360 And that's part of the same conversation, I feel.
01:17:31.760 Yeah.
01:17:33.960 I guess peacefully protest brings up images of people in the streets,
01:17:40.820 and I don't think that that is how CRT is going to get disappeared from the schools.
01:17:46.720 I think that, well, I think we need our schools of ed back.
01:17:52.740 Yeah, good luck with that.
01:17:54.260 Yeah, and that's a long-term solution that is the only one that will stick.
01:18:00.840 So in the moment, I imagine that, again, your inboxes may also look like mine in this regard.
01:18:07.200 But I certainly hear from a lot of parents and teachers, both, whose kids are in schools, both K-12 and higher ed,
01:18:17.200 but especially K-12, saying, this is a disaster.
01:18:21.220 What do we do?
01:18:22.800 And I guess I would go back.
01:18:25.480 It's not an operational answer, but I would go back to my answer from before and say,
01:18:30.040 be assured that if you are seeing this, either as a parent or as a teacher or staff or administrator at some school that you are affiliated with,
01:18:40.040 you are not the only one, and you're not the only one who's concerned.
01:18:44.640 I feel certain, and it's impossible to do the numbers from where I'm sitting,
01:18:50.240 but I feel certain that it's never just one person, and it's often a significant minority,
01:18:55.200 and sometimes it's actually a majority of people at an institution who don't like what's happening
01:19:01.140 but are putting their heads down and just trying to do what they perceive as their job
01:19:05.800 or just trying to get their kids to the school because they don't feel like they can make trouble,
01:19:09.240 but they're actually dying inside.
01:19:12.920 They really cannot tolerate what is happening. So, find each other.
01:19:18.880 Figure out some way to find each other, and this is one of the few places where I would say,
01:19:22.640 I am with you, Jordan, on the hyper-novel technology being a big part of the crisis that we're living through right now,
01:19:29.660 and that's the thing that makes this different from Nixon era.
01:19:33.000 It's just not like that because they didn't have social media.
01:19:37.200 But one of the things that our hyper-connected, hyper-novel technology allows us to do is potentially find people
01:19:44.340 and talk privately and establish some rapport and figure out what to do.
01:19:52.000 And I think with regard to in the schools, know that there are many of us out here who see it and are doing what we can with bigger platforms,
01:20:02.840 but figure out who else, where you are, can actually see this as well.
01:20:09.760 And, you know, just start almost every, and this is not about CRT in schools, but like when I just engaged,
01:20:15.680 I'm here in Portland, Oregon, which is sort of like the epicenter of ridiculous in a lot of ways.
01:20:20.500 And it's, I love it.
01:20:21.480 I love Portland.
01:20:22.360 I love the people here.
01:20:23.400 And there's just a tiny minority of yahoos who are just driving the narrative and driving a lot of chaos here.
01:20:30.620 And when I talk to a barista, a waitress, you know, the person at the front desk at physical therapy, you know,
01:20:41.200 when I just talk to any, you know, at UPS, whatever, and just give any indication when they say something standard and generic
01:20:49.880 that I don't actually fit into the standard generic model that they're assuming is what you have to do to do small talk now,
01:20:58.100 almost everyone responds with relief and wants to have a real conversation.
01:21:02.900 Almost everyone.
01:21:04.060 I think, I think that actually it's a majority of us who want to be having a real conversation and want this to stop.
01:21:10.140 I think the evidence for that is pretty clear is that the extremists on both sides are a very small minority.
01:21:17.460 I wouldn't underestimate the utility of writing a letter.
01:21:20.220 Write a letter to the school board.
01:21:21.440 At least, you don't even have to send it.
01:21:23.520 At least you get your arguments down and articulate them.
01:21:26.200 That's something.
01:21:26.840 And then maybe once you do that, you know, you can write the letter and you can let it sit for a week and you can think about it.
01:21:32.180 Think about it.
01:21:32.880 Make sure you got your words right as much as you possibly can.
01:21:35.960 Then maybe send it.
01:21:37.100 It's like there's not much risk there.
01:21:38.920 And, you know, we do have political structures.
01:21:41.760 People aren't involved in them.
01:21:43.180 And I got involved in political structures in Canada a bit when I was younger.
01:21:46.800 And they're so starving for people's participation that it's beyond belief.
01:21:51.420 And so you can make an effect locally in your community or in your state or province or nationally, for that matter, if you want to put in the time.
01:22:00.540 So those institutions, well, that's what we have.
01:22:04.040 And so unless you want to generate new institutions, you could try using those.
01:22:08.080 And people think they won't get listened to.
01:22:11.140 And sometimes they don't.
01:22:12.560 But letters have more effect than you think.
01:22:15.700 So that's something, at least, isn't it?
01:22:18.120 And it's not going to expose you to a tremendous amount of risk, especially if you word it carefully.
01:22:22.240 And you write before you're so angry that you can't control yourself.
01:22:26.900 In England, we love a strongly worded letter.
01:22:31.020 It's a national pastime.
01:22:32.520 It is a national pastime to find something that mildly pisses us off and then write a strongly worded letter about it.
01:22:38.700 Yes.
01:22:38.940 Well, thank God for England.
01:22:40.460 So the next question is from the rather charmingly named Toey Redback.
01:22:48.820 It's a dreadful name, but it's a good question.
01:22:50.520 Do our big brains have a downside that we overthink things and impose behavior contradictory to our nature, e.g. veganism and gender, et cetera?
01:23:01.860 So over to you, Heather, and then we'll go to Jordan with that one.
01:23:05.620 Yeah.
01:23:05.980 I mean, of course, there's a tradeoff there.
01:23:08.920 And just at the most maybe banal level, they're really expensive to run.
01:23:14.800 You know, our brains are the most expensive organ in the body to run.
01:23:18.040 And, you know, once we started growing our brains big, we needed more food, and then we needed to spend more time getting food, and then we needed to spend time figuring out new technologies in order to get our food to be more accessible to us.
01:23:31.780 We started cooking, for instance.
01:23:33.300 And you get a sort of cascade of events that is really hard to stop at the point that you start going down the we're going to solve this intellectually road.
01:23:46.080 On the other hand, you know, we don't tend to solve things with brute force.
01:23:51.380 We're not the brawniest ape or primate or mammal.
01:23:56.540 That's not what we do.
01:23:57.440 So, you know, is it, is there a tradeoff?
01:24:02.300 Yes.
01:24:03.460 Can we figure out how to, I mean, I feel like a broken record here, but, you know, can we return ourselves to something of an integrated, coherent whole?
01:24:13.780 And remember that we are actually fully embodied beings with emotion, in addition, you know, fully embodied with emotion and not just analysis and intellect.
01:24:24.260 It's not just the logic and the words and the, you know, typing furiously while hunched over your phone, right?
01:24:30.580 It's, you know, we're here in these bodies, going out into nature, feeling all of our senses and having a sense of awe sometimes, of love, of anger, of spite even.
01:24:42.880 You know, all of these are part of what it is to be human.
01:24:45.620 And it is, I think, again, a failure that is best described as reductionist to say that's, you know, our big brains is what we are.
01:24:53.660 It's the only thing that we are.
01:24:54.760 That's what makes us human.
01:24:56.020 No, it's all of this.
01:24:57.320 And it's the integration of all of the parts that makes us human.
01:24:59.720 And so, yeah, trade-offs, for sure.
01:25:02.340 How to evade the trade-offs?
01:25:03.760 No perfect way, but remembering that we are more than the sum of our parts and that we are a holistic whole is an approach.
01:25:13.460 And, Jordan, what is your view on this question?
01:25:15.740 Well, you guys kind of pointed to it earlier.
01:25:17.820 I mean, we're the only creatures that have discovered our temporal finitude in an explicit sense.
01:25:24.640 There's evidence from other animals that they have something akin to grief.
01:25:28.860 And maybe grief itself.
01:25:30.740 There isn't a lot of evidence that they conceptualize themselves as finite beings.
01:25:36.120 That's a big difference.
01:25:38.200 And it might be a fatal difference in some sense.
01:25:40.740 I mean, if you look at religious stories like the story of Adam and Eve, you know, when our eyes were opened, we discovered that we were going to die.
01:25:49.760 And that's the fall, essentially.
01:25:53.100 And we've never recovered from that.
01:25:54.940 And you can view the whole Bible, in some sense, as an attempt to redress that problem.
01:25:58.980 And the story there is something like, well, a sufficiently noble, ethical endeavor might be sufficient to remove the sting of death.
01:26:11.920 And maybe that's true.
01:26:15.300 So, the big brain has a price, right?
01:26:18.100 We're very self-conscious.
01:26:20.800 And that's great, in some sense, because we're aware of who we are.
01:26:24.300 But it's not like it's not a fall.
01:26:29.220 It's a fall, right?
01:26:30.600 And for the writers of the Bible, that was the beginning of history, was the dawn of self-consciousness.
01:26:37.840 And history is, in some sense, the attempt to erect a culture that protects us from our death.
01:26:45.840 So, you know, on the upside, we live a long time.
01:26:49.780 And our lives are less brutish, you might claim, than those of most animals.
01:26:56.280 And so forth, you know, we've gained, but not without a price.
01:27:01.860 And it isn't obvious that we can bear the price.
01:27:03.760 It's not obvious.
01:27:07.260 A former guest of the show, one Michaela Peterson, says, go dad, in the chat.
01:27:12.360 So, Michaela, good to see you again.
01:27:14.440 Also, a few of you are surprised to discover you can ask questions.
01:27:17.280 We mentioned it at the beginning.
01:27:18.480 There is a link in the bottom.
01:27:19.580 You can send us a PayPal, send a super chat, and we'll get to your question, if we possibly can.
01:27:24.940 Robin asked a question.
01:27:25.960 And this is coming back to the very beginning of our discussion, which is the breakdown of the family.
01:27:30.300 There is quite a persuasive argument.
01:27:33.180 Particularly, I think it's more likely to be on the political right.
01:27:36.260 The introduction of the welfare state in the United States is what fueled the breakdown of the family.
01:27:44.000 Do either of you have any strong feelings or thoughts on that issue?
01:27:46.940 Well, as a conservative, in some sense, which I think I became at least in part because I was a social scientist and learned about the law of unintended consequences, my view is that you mess with complex functional systems at your peril.
01:28:04.640 And if you think that your well-meaning intervention is going to have the positive effect that you think it will, and only that effect, you're blinded by your own ignorance of your ignorance.
01:28:16.300 It's a really profound form of ignorance.
01:28:18.860 And so I think there were more factors driving the breakdown of the family.
01:28:25.120 And we should also point out, it's like, well, what do you mean by the breakdown of the family?
01:28:28.740 If you lived in the 1900s or the 1800s, the late 1800s, let's say, the probability that you were going to lose a parent before you were at the age of 10 was pretty damn high.
01:28:41.900 And so the intact two-parent family with kids that lived and parents that lived, you know, for a substantial amount of time is a relatively novel and modern phenomenon.
01:28:54.460 Certainly, it's a 20th century phenomenon, as far as we can tell.
01:28:57.740 But, Jordan, I would argue that, and obviously I wouldn't argue with you about it, because you're actually an expert on this, but psychologically, I would imagine the impact of losing a parent to a natural cause like disease or whatever is different to the experience of feeling that your dad, let's say, walked out on you before you've even been born.
01:29:19.020 Well, look, you can decompose that in a lot of ways.
01:29:21.820 You know, I don't know if you can make a blanket case that it's worse to have a father who leaves or one that dies.
01:29:28.180 Those are both pretty bad.
01:29:29.980 Sure.
01:29:30.160 There is perhaps less of a risk of feeling that it's your fault or that something could have been done about it, arguably.
01:29:37.940 But putting that aside, like, I think if you look at the data on the development of children, the evidence that children who have intact two-sex, two-parent families do better is overwhelming.
01:29:52.680 Now, I don't think we know enough to specify the fact that it has to be two sex, like, you know, a male and a female.
01:30:01.160 I suspect that that's the case for a variety of complex reasons, but that doesn't mean that two people of the same sex couldn't make a good go of it.
01:30:08.740 It also doesn't mean that single parents can't make a go of it.
01:30:12.560 But on aggregate, it's better to have a mother and a father.
01:30:16.340 And I think the reasons for that are self-evident, actually.
01:30:19.480 It's like, well, because who are you going to model your masculinity from if you're a boy and your femininity from if you're a girl?
01:30:27.220 You'll get echoes of that in the same-sex parent, but it's not enough for creatures that are as developmentally complex as us.
01:30:35.960 And so it's kind of stunning to me that this is a discussion that we actually have to have.
01:30:41.820 It's like, well, and especially when you also consider something we didn't talk about, relationship to the patriarchal structure of the nuclear family.
01:30:50.480 It's like, I don't know how twisted you have to be to think that a man's commitment long-term to monogamy and child-raising over multiple decades is an expression of his desire for arbitrary patriarchal power.
01:31:06.240 That's preposterous.
01:31:08.260 It's exactly the opposite.
01:31:10.120 Obviously.
01:31:12.680 So the fact that we, I can't believe that we're actually confused about this.
01:31:17.140 Like, where is the selfish advantage precisely?
01:31:21.540 The selfish advantage is manifested by psychopaths.
01:31:28.120 They mate promiscuously, like mosquitoes, and their offspring live or die, and that's how it is, and there's a niche for that.
01:31:38.020 And that's the arbitrary expression of power.
01:31:40.740 So, we're so demented about so many fundamental things that it just staggers me.
01:31:47.480 We swallow things that aren't, like, incomplete theories.
01:31:53.160 They're absolute preposterous falsehoods.
01:31:56.180 And yet, to oppose them means that there's something wrong with you ethically and politically.
01:32:01.720 I just can't see how you can make a case that, on aggregate, a family without a father and a mother is better than or equal to a family that has both.
01:32:11.480 How can that possibly be true?
01:32:13.900 You don't have to make a case.
01:32:15.220 You just have to say that the people who do make that case are bad people, which is what happens.
01:32:19.760 Heather, do you want to weigh in on this?
01:32:21.600 Yeah.
01:32:21.880 First, I'm just delighted.
01:32:24.520 I've never heard a promiscuous mating system identified as being like mosquitoes before Jordan, so that made me laugh.
01:32:33.080 The one thing that I would add, and I basically agree with what you've said here,
01:32:42.280 is that, you know, all else being equal, two parents is better than one, and two parents of opposite sex is going to provide more diverse input to the developing child.
01:32:57.420 And therefore, if you have a loving relationship between two women or between two men who are raising children, then, you know, one thing in their favor is they had to really, really want those children, whereas many heterosexual parents end up parents without necessarily wanting kids, right?
01:33:17.160 You can happen into parenthood being hetero, and you can't if you're homosexual.
01:33:22.180 But I think it's more important if you're raising children as a homosexual couple of either sex to make sure that you have adults around of the sex that you are not, so that they are providing some kind of model for your children of what it is to be the other sex.
01:33:41.260 And that's, you know, this is not some sort of a bioessentialist imagining that, like, all men are like this and all women are like this, but that there are different ways of doing things.
01:33:50.740 You know, even down to, on average, women being more agreeable than men, and, you know, just seeing what that looks like in real time, like, the more exposure to the more diverse kinds of people you give your children, the more likely they are to be able to wade through life later on.
01:34:10.420 And especially if your, you know, if their natal home only has men or only has women, make sure you introduce them to some women or men.
01:34:17.960 I can give you an example of that.
01:34:20.540 It's a bit offside, but not exactly.
01:34:23.820 Girls with brothers are less likely to suffer sexual assault.
01:34:28.440 Right.
01:34:29.060 Why?
01:34:31.960 Well, it's hard to say exactly, but it's partly because they, their exposure to men from an early age allows them to read the social situations more carefully, more accurately.
01:34:46.860 It's something like that.
01:34:47.880 I'm sure I'll get, like, lambasted for this, but that is a fact.
01:34:52.700 So, and, you know, we do know that there are differences between men and women.
01:34:57.980 What's that?
01:34:58.680 Is it not also the fact that they're perceived as having protection?
01:35:02.640 You know, it works later in life, even.
01:35:05.380 Yeah.
01:35:05.800 It's a protracted effect.
01:35:07.200 So, you know, that might be part of it to begin with, but it seems to be longer lasting than that.
01:35:12.380 I mean, it's not so preposterous to claim that you're more likely to deal effectively with people that you understand more deeply.
01:35:21.380 Right.
01:35:21.600 How dare you, sir?
01:35:23.180 Yeah, well.
01:35:23.900 An extraordinary statement.
01:35:25.220 Look, I do think it's an indication of how confused we are by how much change is occurring, that all these things have become open to question.
01:35:37.320 It might also be the case that, you know, that's a consequence of, well, a university education that isn't sufficiently cognizant of the debt that it owes to the past.
01:35:48.780 Yeah, no, and I guess actually picking up on something that you said earlier with regard to what's going on at universities, I think, you know, you indicated that people who are concerned about their school boards could fill those spots because people are hungry, like the political environment is hungry for people at that level.
01:36:11.420 And I think this is part of what happened in higher ed, that no one who didn't have to went and, you know, pursued governance.
01:36:19.560 You know, we're all expected, all us academics are expected to do a certain amount of governance.
01:36:22.780 But if you were one of the scientists with big grants, you normally got an exemption.
01:36:27.240 Or if you, you know, if you otherwise had a lot going on, that was how the university basically paid you.
01:36:33.740 Like, you don't have to do the governance.
01:36:35.260 So who was left in all the governance spots?
01:36:37.780 People who couldn't do the research.
01:36:39.160 The people who were otherwise uninterested or incapable of bringing new ideas to the university and, frankly, to some degree, who were driven to bring ideology in.
01:36:52.500 And so, you know, we absolutely need people who are creative and analytical and logical and compassionate and actually know something and know how to communicate it and believe in the humanity of other people to be in positions on school boards and in governance.
01:37:05.320 And, you know, that work doesn't feel like almost anyone's highest and best use, but we need real good people in those positions.
01:37:15.700 And the next question is by another awful name.
01:37:20.400 He's actually a big fan.
01:37:21.340 Stop slagging off our audience.
01:37:23.660 His name is I'll Fight You Naked.
01:37:25.420 Come on.
01:37:26.060 That's not a good name.
01:37:27.320 I don't know if that says more about him or the channel.
01:37:29.180 Anyway, but it's a very good question.
01:37:31.460 His question is women seem more vulnerable to the woke religion, I think, because it hijacks the mama bear circuits.
01:37:38.560 Is there any psychological intervention or an evolutionary perspective that could be used to at least curtail this?
01:37:46.040 And that's over to you, Heather, and then Jordan.
01:37:49.440 Yeah.
01:37:50.780 Great question.
01:37:51.520 I'm actually familiar with I'll Fight You Naked as well.
01:37:54.360 He writes in our Q&A as well.
01:37:57.300 I'm assuming it's a he with that name somehow.
01:38:02.720 Yeah.
01:38:03.800 It's not a very feminine approach, is it, Heather?
01:38:06.120 No.
01:38:06.680 No.
01:38:07.080 No.
01:38:07.320 Not so much.
01:38:11.780 Women, both, you know, to go back to what I was saying, you know, right at the beginning of our time together today,
01:38:17.260 women being more likely to engage through covert competitive means than men with their overt competitive means,
01:38:25.200 and also being more likely to be agreeable, which is related but not the same thing,
01:38:31.820 are, you know, and a third thing, more likely to have their entire world wrapped up in their social universe,
01:38:39.480 as opposed to men who are more likely to also, say, have a shop, be tinkering, like have something else going on.
01:38:45.640 And all of those things, I think, make women more susceptible to groupthink, to social pressure,
01:38:55.240 and to actually, at the point that they have adopted some new ideology, or just even just a piece of it,
01:39:02.680 to feeling compelled to spread it to others.
01:39:06.820 And I wouldn't actually put it that way. I feel like the mama bear instinct, to the degree that that's something we can all immediately grok,
01:39:16.420 is one of the best, and frankly, I don't think capturable, pieces of womanhood.
01:39:25.560 That, you know, protection of your children, and of course fathers do this too,
01:39:29.300 but protection of your children at all costs is actually what I think we need to be finding.
01:39:37.120 I was just actually talking with Brett about this last night.
01:39:39.520 Like, we need to get the mothers to discover what it is that is being done to the children.
01:39:44.520 And once the mothers stand up, once the mothers stand up and say, actually, no, you don't get my kids.
01:39:51.060 You do not get to do that to my kids.
01:39:53.980 That is the thing that will make women disagreeable.
01:39:56.680 And what we need is disagreeability.
01:39:58.280 We need people to stand against the crowd enough that then the people who won't go disagreeable say,
01:40:03.940 oh, wait, but there's a lot of people I like and respect who are going this way,
01:40:07.880 so I don't have to be disagreeable to go this way.
01:40:10.200 I'm going to follow that new line over there.
01:40:12.380 And it's going to be the one that is actually, you know, honest and honorable and protective of children
01:40:18.000 and allows us, you know, freedom of expression and exploration of ideas.
01:40:23.980 Yeah, agreeable means that you like your kin, but it doesn't necessarily mean that you're that fond of perceived predators.
01:40:33.200 Yeah.
01:40:33.640 Right?
01:40:33.920 So that's a weird thing.
01:40:35.400 You know, we don't understand.
01:40:37.380 You think, well, you love your family.
01:40:40.060 Therefore, you're a loving person.
01:40:41.480 It's like, no, wait a minute.
01:40:43.240 Wait a minute.
01:40:43.740 It's not that simple.
01:40:45.340 You love your family.
01:40:46.880 Well, what do you do when you perceive a threat to them?
01:40:49.900 You don't love the threat and love your family at the same time.
01:40:53.320 That's very unlikely.
01:40:54.620 And so agreeableness, a very complicated personality dimension.
01:40:58.260 It has marked advantages and disadvantages at every position along the distribution.
01:41:04.760 You know, like disagreeable people are rather self-centered, let's say,
01:41:08.640 and they have a tendency to be, in the extreme, they have a tendency to be antisocial.
01:41:13.300 But there's pathologies associated with extremely high levels of agreeableness, too.
01:41:17.240 And we don't understand how that'll manifest itself precisely in the political realm,
01:41:21.880 because this is all new territory in some sense.
01:41:24.900 So what does female political pathology look like?
01:41:28.200 Well, it doesn't exist.
01:41:29.420 It's like, right.
01:41:31.080 Sure.
01:41:32.080 Mm-hmm.
01:41:32.380 No, that's wrong.
01:41:34.060 You know, we know that the covert kinds of aggression that Heather was talking about,
01:41:40.080 it's reputation destruction that characterizes feminine aggression.
01:41:45.020 And if you've, you know, what's the joke from The Simpsons?
01:41:47.580 There's nothing meaner than a roving pack of 13-year-old girls.
01:41:51.980 And, you know, the boys, and my daughter used to comment on this,
01:41:55.760 now and then my son would have a fight with one of his friends.
01:41:58.840 They'd actually have a fight, and then they'd make up.
01:42:00.960 But if the girls got their knickers in the knot, let's say,
01:42:04.580 Jesus, all hell would break loose for a long time, and often on social media.
01:42:10.520 So, you know, is it reasonable to see some of this reputation destruction culture
01:42:14.800 as a manifestation of female political pathology?
01:42:17.940 It's like, I don't know.
01:42:19.200 But I do know that, because I've studied antisocial behavior,
01:42:22.940 and I've been in touch with leading experts in that area,
01:42:25.960 that that is the most extreme form of female aggression.
01:42:31.720 And so, and it scales.
01:42:33.620 That's the thing that's interesting about it.
01:42:35.220 You can't punch someone on Twitter, much as you might want to,
01:42:39.280 but you can sure savage their reputation.
01:42:43.100 And so it scales.
01:42:44.240 And that's, we don't know what to make of that,
01:42:47.320 but to turn a blind eye to it seems to me to be prematurely foolish.
01:42:53.300 Heather, I was going to, let me just follow up before I go to a question.
01:42:56.060 Just, I want to talk a little bit more about this.
01:42:58.360 You've written in the past about toxic femininity,
01:43:00.680 which I think is what Jordan is alluding to a little bit there.
01:43:04.040 Do you think, you know, this obsession with toxic masculinity and toxic,
01:43:08.760 do you think that's a useful frame,
01:43:11.100 or do we need to de-sex this and just go,
01:43:13.820 there are some behaviors that are productive and some behaviors that are unproductive,
01:43:18.060 and we just need to try and steer society towards the productive ones,
01:43:21.560 rather than sort of focusing on which sex it's more in tune with?
01:43:27.020 No, I actually, so obviously the,
01:43:30.120 the pointing of the finger at any kind of masculinity and declaring it toxic
01:43:34.420 is a disaster and foolish and often far worse than foolish,
01:43:39.420 like actually a kind of diabolical behavior.
01:43:41.480 And there will be, there will be places,
01:43:44.840 I would say where approaching it without reference to the sex that tends to engage
01:43:49.460 in a particular thing will be useful,
01:43:50.860 but there is utility in understanding that sex roles have been around for millions of years
01:43:58.480 and that, you know,
01:44:00.460 men are more likely to engage in competitive interactions overtly in front of you
01:44:06.260 where you can see it and when it's over, it's over and you know it.
01:44:09.120 And women are more likely to engage in competitive interactions covertly,
01:44:13.760 behind the scenes, behind the people involved perhaps,
01:44:16.780 and you may not even know that you're playing.
01:44:19.240 And, you know, those different dynamics,
01:44:21.240 it's not that women can't do the overt thing and men can't do the covert thing,
01:44:24.960 but understanding that there's actually a sort of evolutionary basis
01:44:30.100 for the types of hierarchy and competition
01:44:34.040 that men and women engage in and that they are different is,
01:44:37.800 I think, important, you know,
01:44:39.300 and it's not because we want to reify what's male and what's female,
01:44:46.000 but because we want to understand so that we can move past the regressive parts of it
01:44:50.740 and those that we can't move past understand them
01:44:53.800 so that we can actually create some kind of, frankly,
01:44:56.740 shared hierarchy so that we all can all work together
01:44:58.900 without pretending that, you know,
01:45:00.920 the male way of doing things is always wrong
01:45:03.000 and therefore it's going to be the female way
01:45:05.440 and, you know, and we can't even point out when there's a pathology there,
01:45:09.540 as Jordan is pointing out, you know, or vice versa.
01:45:12.140 You know, we need to integrate the best of the male typical approach
01:45:16.060 and the female typical approach,
01:45:17.360 and I don't know exactly what that looks like,
01:45:19.480 but pretending that there's no difference isn't the right way,
01:45:22.580 I'm sure of that.
01:45:23.140 Well, attempting to integrate them would be a start,
01:45:26.960 and we seem to be quite far off that at the moment,
01:45:29.000 but let me ask this question.
01:45:30.400 It's from Important John, another great name,
01:45:34.580 and his question touches on the fact that there's obviously a lot of
01:45:39.360 what you might call conspiracy theories floating around at the moment,
01:45:43.120 the idea that, you know, COVID was created by Bill Gates
01:45:46.580 and all this other stuff,
01:45:48.280 and Heather, I know that there are some conspiracies
01:45:50.960 which have now increasingly turned out to be actually true.
01:45:54.300 Lab leak is something we'll be talking with Brett about on Tuesday.
01:45:57.520 But from a psychological perspective, Jordan,
01:46:01.000 what is the appeal of a conspiratorial view of the world?
01:46:06.560 Why are people drawn to conspiracies?
01:46:12.400 I think it's often a manifestation of low agreeableness, distrust.
01:46:19.040 It's simple and explanatory, so that's helpful.
01:46:24.960 We tend to personify things.
01:46:27.940 It's easy to see malevolence and intent
01:46:30.340 where there's just ignorance and randomness,
01:46:32.980 and because it's important to identify malevolent intent,
01:46:36.420 we might have a tendency to over-perceive it,
01:46:39.120 especially in the out-group, let's say.
01:46:43.560 But I think there's additional factors at play right now,
01:46:46.780 like, it's not that easy to distinguish conspiratorial thinking
01:46:51.600 from just creative thinking, right?
01:46:54.940 So there's, something happens,
01:46:56.460 there's a bunch of explanations why it might happen.
01:46:59.280 Now, you might have a cockeyed conspiratorial version,
01:47:02.660 and then you go talk to your friends and your family,
01:47:05.040 and they sort of tune you in,
01:47:06.460 but let's say it's COVID and you're isolated.
01:47:09.540 Well, you know, every one of us has got a proclivity
01:47:12.600 towards our own particular form of insanity,
01:47:14.920 like we blow out at our weakest point when we're stressed,
01:47:18.620 and the way that we deal with that
01:47:20.200 is by banging ourselves up against other people.
01:47:22.740 It's like, what do you think of this?
01:47:24.140 I think that's stupid.
01:47:25.820 Oh, okay, well, maybe I'll have to go rethink it.
01:47:29.340 Well, that's part of what keeps our conspiratorial thinking
01:47:32.300 under wraps, right?
01:47:33.880 We have to share it with other people,
01:47:35.240 and if they think it's preposterous,
01:47:37.400 then we go back and rethink it.
01:47:38.800 It's distributed cognition in the social space.
01:47:44.200 Now you can find like-minded people
01:47:46.220 for your conspiracy instantly,
01:47:47.680 and they're not even real people.
01:47:49.860 Often they're disembodied,
01:47:51.240 and so we also have no idea what that's doing
01:47:54.740 to human social cognitive political thinking.
01:48:00.160 We don't know,
01:48:01.240 and that's another consequence
01:48:02.300 of this radical transformation in technology.
01:48:05.420 We don't even know what it does to people
01:48:06.960 to have to speak over video instead of in person,
01:48:11.500 what that does to discourse.
01:48:12.600 We have no idea what Twitter does to thought.
01:48:15.100 You have to compress your thought
01:48:16.280 into a maximum number of characters.
01:48:19.820 Well, does that bias you
01:48:22.220 towards the expression of impulsive aggression?
01:48:24.720 Kind of looks like it.
01:48:27.000 You know, does it increase the probability
01:48:28.640 that you're going to speak out
01:48:30.320 kind of in the privacy of your own cocoon,
01:48:33.400 like you're behind the wheel of an automobile
01:48:34.960 and shielded when something irritates you?
01:48:37.380 Could easily be.
01:48:38.960 These are mass communication networks
01:48:41.320 with huge effects,
01:48:42.560 and we have no idea
01:48:44.980 what they do to us psychologically.
01:48:47.460 And we'll never figure it out,
01:48:48.840 because by the time we study it,
01:48:50.180 they'll have transformed
01:48:50.900 into something completely different.
01:48:53.040 So it's tough.
01:48:54.960 It's a tough one.
01:48:57.160 Heather?
01:48:59.500 I agree.
01:49:00.240 Don't assume malevolence
01:49:01.480 where you can assume ignorance.
01:49:03.540 Yeah, that's true.
01:49:05.040 That's a good mental hygiene approach
01:49:08.340 to conspiratorial thinking.
01:49:10.000 It's probably stupidity.
01:49:14.080 Yeah, I agree with that.
01:49:17.000 And also, not following from that,
01:49:25.120 and I know you're having a conversation,
01:49:26.900 as you said, with Brett on Tuesday.
01:49:28.000 Currently, alternative approaches
01:49:34.320 to thinking through what we are being told
01:49:36.560 are being branded as conspiracy
01:49:38.640 as a way to shut down the conversation.
01:49:41.640 And so this is, you know,
01:49:43.240 this is the death of science.
01:49:45.800 This is exactly the same people
01:49:47.500 who are branding things conspiracy theory
01:49:49.660 are saying, hashtag follow the science.
01:49:52.340 And these aren't people
01:49:53.200 who would recognize a hypothesis
01:49:54.460 that was alternative to the only one
01:49:56.200 that they're allowing us to talk about
01:49:57.600 if it hit them in the face.
01:49:59.700 So, you know,
01:50:01.660 there's a weaponization of words,
01:50:04.440 you know, again,
01:50:05.660 but specifically, you know,
01:50:08.260 as of five, ten years ago,
01:50:10.480 we all knew that we were most afraid
01:50:12.280 of being called racist.
01:50:13.840 And now it's understood
01:50:16.300 that you certainly don't want to be called
01:50:17.660 a conspiracy theorist.
01:50:19.020 But, hey, remember
01:50:20.440 that every big idea
01:50:22.340 and probably most of the small ones, too,
01:50:24.720 that we now understand to be true
01:50:26.280 were once promoted
01:50:27.860 by one or a couple of people
01:50:30.080 and they were roundly mocked
01:50:31.660 and perhaps stigmatized
01:50:33.020 and perhaps even ostracized
01:50:35.000 or even executed
01:50:35.900 for even saying the thing.
01:50:38.380 And so, yes,
01:50:39.160 LabLeak is a perfect example
01:50:40.400 and there are more in COVID space
01:50:42.020 and there are more all the time.
01:50:43.860 And what social media does
01:50:45.540 is it gives the bullhorn
01:50:46.600 to the orthodoxy.
01:50:50.220 To those who are claiming
01:50:51.160 they have no power at all,
01:50:52.440 but actually have the power
01:50:53.640 to shout down any dissent
01:50:55.100 to the reigning narrative.
01:51:00.380 You know, if you look at the way science works,
01:51:02.900 you think about it this way,
01:51:05.120 well-meaning people
01:51:06.780 doing everything they possibly can
01:51:08.460 to rein their egos in,
01:51:10.620 generate hypotheses
01:51:11.720 about why something happens.
01:51:13.860 99 out of 100 times,
01:51:15.580 they're wrong.
01:51:16.480 Right?
01:51:18.660 And now,
01:51:19.080 if you're right 1% of the time
01:51:20.900 and that's cumulative,
01:51:22.140 man, you're just,
01:51:22.980 your knowledge is progressing
01:51:23.960 so damn fast.
01:51:24.960 You can't even keep up with it.
01:51:26.240 But your error rate is still 99%
01:51:28.260 and that's with the best method
01:51:29.680 we possibly have
01:51:30.660 to detect, you know,
01:51:31.820 some facet of the truth.
01:51:33.960 In the political space,
01:51:35.380 well, it's much harder.
01:51:36.640 So the error rate
01:51:37.500 is going to be much, much greater.
01:51:40.320 And so,
01:51:42.000 and of course,
01:51:42.620 everybody's tails in a knot
01:51:43.960 because they've been locked down
01:51:45.320 for a year and a half
01:51:46.340 and God only knows
01:51:47.500 how insane we are
01:51:48.620 because of that
01:51:49.280 and what the consequence
01:51:50.080 of that's going to be.
01:51:51.400 I mean, I certainly don't think
01:51:52.860 that the lockdown
01:51:54.480 is going to result
01:51:55.400 in a net,
01:51:57.100 in net life saving.
01:51:58.560 I doubt that very much.
01:52:00.920 Now, you know,
01:52:01.540 we'll probably never have the data
01:52:03.080 and I can understand
01:52:04.680 that the Western world
01:52:06.700 and the rest of the world
01:52:07.900 for that matter
01:52:08.380 didn't want to see
01:52:09.040 their medical system
01:52:10.080 completely swamped.
01:52:12.300 But,
01:52:12.640 but,
01:52:15.420 I doubt very much
01:52:17.780 that we save people.
01:52:19.080 We'll see.
01:52:19.680 I don't know.
01:52:20.380 I don't know.
01:52:21.520 We save some people
01:52:22.540 at the cost of others
01:52:23.380 is what you mean.
01:52:24.020 We save some people
01:52:24.980 in the short term
01:52:25.680 at the cost of many others
01:52:26.900 in the long run.
01:52:28.060 Agreed.
01:52:28.520 And look,
01:52:29.260 those sorts of trade-offs,
01:52:30.140 we make those sorts of trade-offs
01:52:31.340 because that immediately evident
01:52:34.040 catastrophe, mortality,
01:52:37.020 is a lot more difficult
01:52:38.780 to deal with psychologically
01:52:39.960 than, you know,
01:52:41.500 a 5% increase
01:52:42.840 in the rate of alcoholism
01:52:44.160 or something like that,
01:52:45.100 which is basically invisible.
01:52:46.800 I'm not castigating
01:52:48.100 the people who did this.
01:52:49.180 Well,
01:52:49.360 the people who did this
01:52:50.760 were us.
01:52:51.520 So,
01:52:52.260 castigate away.
01:52:53.020 And we have time
01:52:56.340 for one last question.
01:52:58.040 We've got a couple,
01:52:58.760 actually.
01:52:58.980 We started late,
01:52:59.640 remember?
01:53:00.040 Oh, yeah,
01:53:00.360 we did start late.
01:53:01.400 So,
01:53:01.740 I will leave that question.
01:53:03.840 I'll ask it now.
01:53:04.840 It's the most important question
01:53:06.160 of our time.
01:53:06.840 It's from Elana.
01:53:08.920 And we ask this to all our guests
01:53:10.700 on the live shows.
01:53:12.360 Heather and Jordan,
01:53:13.420 what's your favourite biscuit
01:53:14.600 and wine?
01:53:16.940 Biscuit?
01:53:18.120 Yes.
01:53:18.680 Cookie.
01:53:20.660 Would you like to take
01:53:24.260 this one first,
01:53:24.920 Jordan?
01:53:25.480 Well,
01:53:25.800 at the moment,
01:53:26.340 I'm still mostly eating
01:53:27.440 a carnivore diet,
01:53:28.400 so I have these meat chips
01:53:29.960 that are a substitute
01:53:31.220 for everything
01:53:32.180 that might vaguely
01:53:33.180 be dessert-like.
01:53:34.600 So,
01:53:35.340 I mean,
01:53:36.000 if I,
01:53:37.480 the unwanted image
01:53:39.680 of strawberry cheesecake
01:53:43.520 with chocolate drizzling
01:53:45.020 comes to mind,
01:53:45.660 but that's not precisely
01:53:46.520 a cookie.
01:53:46.980 It's just an indication
01:53:48.220 of my state of deprivation
01:53:49.800 for having endured
01:53:51.080 this bloody diet
01:53:52.040 for so long.
01:53:54.620 Heather?
01:53:55.340 Cookie.
01:53:55.980 I think it's probably
01:53:57.980 the most American answer,
01:53:59.420 but a good,
01:54:01.060 chewy chocolate chip cookie
01:54:02.040 is probably the best.
01:54:03.740 Still warm from the oven,
01:54:05.040 crisp on the edges,
01:54:07.000 melty on the inside.
01:54:08.780 It's pretty good.
01:54:10.160 You could be a voice
01:54:10.860 on there, too,
01:54:11.960 as far as I'm concerned.
01:54:13.820 We're going to,
01:54:14.780 let's stop,
01:54:15.680 because Jordan looks like
01:54:16.720 he's about to go off
01:54:17.660 his diet,
01:54:18.480 the way he's vigorously
01:54:19.780 nodding there.
01:54:21.180 Listen, guys,
01:54:22.260 before we ask you
01:54:23.380 the last question,
01:54:24.440 can I just thank you both
01:54:25.620 from the bottom of our hearts
01:54:26.940 for joining us,
01:54:27.680 for having this conversation?
01:54:28.880 It's been an absolute pleasure.
01:54:30.120 Our audience
01:54:30.700 have been so grateful
01:54:32.660 that we've been able
01:54:33.360 to host it.
01:54:34.460 So thank you,
01:54:35.500 Jordan,
01:54:35.800 and thank you,
01:54:36.280 Heather,
01:54:36.500 for coming on the show.
01:54:37.600 It's been an absolute pleasure,
01:54:38.560 and I hope we can do it
01:54:39.300 again sometime.
01:54:40.980 Francis,
01:54:41.400 do you want to ask,
01:54:41.900 maybe we can end
01:54:42.500 on a positive.
01:54:43.160 Should we ask this one
01:54:43.840 from April?
01:54:44.200 This is from April,
01:54:46.240 and the question is,
01:54:48.200 April Hill,
01:54:48.980 how we avoid apathy,
01:54:50.540 but also not cave
01:54:51.820 into negative thought
01:54:53.020 when informed
01:54:54.140 of all that is wrong?
01:54:55.660 Basically,
01:54:56.180 how do you avoid apathy
01:54:57.760 if you watch trigonometry,
01:54:59.460 is the question.
01:55:05.200 Early in the pandemic,
01:55:06.520 when Brett and I
01:55:07.020 started live streaming,
01:55:08.000 I started saying
01:55:08.880 at the end,
01:55:09.980 get outside.
01:55:10.480 And that is the thing
01:55:13.600 that I do
01:55:14.460 to avoid apathy,
01:55:16.240 is as much as,
01:55:17.860 you know,
01:55:18.060 I learned from you guys
01:55:19.080 and Jordan,
01:55:20.100 like all of you
01:55:21.360 and many others,
01:55:23.840 still,
01:55:24.560 the less time
01:55:25.300 on the screens,
01:55:26.240 the better,
01:55:27.100 and that even holds
01:55:28.700 for just taking in text.
01:55:30.140 I still prefer to read
01:55:31.180 through, you know,
01:55:31.880 actual books,
01:55:32.560 not on screens,
01:55:33.520 and I read a lot,
01:55:35.160 but the more time
01:55:36.640 I'm outside,
01:55:37.860 actually,
01:55:38.520 you know,
01:55:38.880 pursuing speed,
01:55:40.460 in my case,
01:55:41.160 because I just love
01:55:41.960 going fast outside
01:55:43.440 in nature
01:55:44.080 and also being
01:55:44.820 totally still
01:55:45.820 outside in nature
01:55:46.880 and letting nature
01:55:47.620 come alive around me.
01:55:49.220 That gives a renewed
01:55:50.560 sense of meaning
01:55:51.860 and purpose
01:55:52.580 and, frankly,
01:55:54.680 direction,
01:55:55.980 such that I can
01:55:56.620 come back inside
01:55:57.720 to this kind of work
01:55:58.740 with less apathy
01:56:00.780 and more purpose.
01:56:03.620 Jordan?
01:56:04.820 Well,
01:56:05.260 I probably find
01:56:06.140 most of what's
01:56:07.440 renewing in my
01:56:08.380 relationships
01:56:08.960 with my family
01:56:09.780 and friends,
01:56:10.480 and so,
01:56:11.680 you know,
01:56:12.040 to the degree
01:56:12.520 that I'm able
01:56:13.080 to engage in that,
01:56:14.940 that's almost
01:56:15.820 always to the good.
01:56:17.860 So,
01:56:18.740 apathy as such,
01:56:19.840 well,
01:56:20.060 I find inactivity
01:56:21.200 much more
01:56:22.320 horrifying
01:56:23.580 than activity.
01:56:25.000 So,
01:56:25.580 I don't know
01:56:26.380 how to answer
01:56:27.000 the question
01:56:27.420 in a broader sense
01:56:28.160 because I just
01:56:28.800 can't stand
01:56:29.520 not being
01:56:30.860 engaged in something.
01:56:31.980 And so,
01:56:33.540 I guess,
01:56:34.680 in some sense,
01:56:35.240 that question
01:56:35.700 doesn't come up.
01:56:37.080 If it's hopelessness,
01:56:40.180 well,
01:56:40.580 you know,
01:56:40.820 in some sense,
01:56:41.440 things are hopeless,
01:56:42.280 but you move
01:56:42.900 ahead regardless.
01:56:44.620 And that's the
01:56:45.120 nature of life,
01:56:45.960 isn't it?
01:56:46.420 You know that
01:56:47.000 in the final analysis,
01:56:48.240 in some sense,
01:56:48.920 it's hopeless,
01:56:49.540 you're going to
01:56:49.960 disappear.
01:56:50.940 I don't mean to be
01:56:51.980 pessimistic about that,
01:56:53.120 but it does speak
01:56:53.920 to things we've
01:56:54.540 already talked about
01:56:55.220 in this episode,
01:56:56.340 and you persevere
01:56:57.280 to the degree
01:56:58.020 that you're capable of
01:56:59.280 because the
01:57:00.640 alternative is
01:57:01.320 far worse
01:57:01.920 than that.
01:57:04.460 And not just
01:57:05.340 for you,
01:57:05.800 for everyone else
01:57:06.460 around you
01:57:06.980 as well.
01:57:10.220 Well,
01:57:10.700 I said let's
01:57:11.200 finish on a high.
01:57:11.900 We didn't quite
01:57:12.520 get there.
01:57:13.300 I sure put the
01:57:14.080 kibosh on that.
01:57:15.700 We should have
01:57:16.240 finished with the
01:57:16.780 biscuit question.
01:57:17.680 I knew it.
01:57:18.180 I knew it.
01:57:19.480 Guys,
01:57:20.080 thank you so much
01:57:21.160 for speaking with us.
01:57:22.300 It's been an
01:57:22.660 absolute pleasure,
01:57:23.540 and thank you all
01:57:24.340 for watching,
01:57:25.560 being here with us,
01:57:26.600 asking the questions.
01:57:27.960 As you know,
01:57:28.740 we will be back
01:57:29.620 on Tuesday
01:57:30.720 with Bret Weinstein
01:57:32.040 talking about
01:57:32.800 the lab leak and COVID
01:57:33.780 and many other things.
01:57:35.100 But for now,
01:57:35.840 let's say again
01:57:36.460 thank you to
01:57:36.920 Jordan Peterson,
01:57:37.620 thank you to
01:57:38.060 Heather Heying,
01:57:38.800 and thank you
01:57:39.280 for watching,
01:57:39.880 and we will see
01:57:40.620 you very soon
01:57:41.520 with another episode
01:57:42.500 or Raw show.
01:57:43.700 All of them go out
01:57:44.260 at 7pm UK time.
01:57:45.480 Take care.
01:57:46.140 See you soon, guys.
01:57:47.720 We hope you've
01:57:48.680 enjoyed this
01:57:49.340 incredible interview.
01:57:50.920 Remember to
01:57:51.380 subscribe and
01:57:52.560 hit the bell
01:57:53.120 button so that
01:57:54.140 you never miss
01:57:54.980 another fantastic
01:57:56.180 episode.
01:57:57.060 And if you believe
01:57:57.940 that the work
01:57:58.580 we do here
01:57:59.380 at Trigonometry
01:58:00.160 is important,
01:58:01.400 support us
01:58:01.960 by joining
01:58:02.520 our Locals
01:58:02.940 team.
01:58:07.560 Thank you.