Aspen Ideas Festival: From the Barricades of the Culture Wars
Episode Stats
Length
1 hour and 38 minutes
Words per Minute
173.5
Summary
In this episode, Dr. Jordan B. Peterson speaks at the Aspen Ideas Festival about what it means to be a public intellectual in the 21st century. Dr. Peterson is a clinical psychologist and a professor at the University of Toronto. He has written two books, Maps of Meaning and the best-selling 12 Rules for Life, which is currently being translated into 40 languages, and he delivered five lectures in the Big Ideas series on Canadian public television. In this conversation, he discusses the technological revolution of YouTube and podcasts, why he believes gender isn't a social construct, his opposition to compelled-speech legislation in Canada, his defamation lawsuit against Wilfrid Laurier University, the failures of the universities, masculinity, and the relationship between order and chaos. You can support these podcasts by donating to Dr. Peterson's Patreon, and you can watch his new series on depression and anxiety at Daily Wire Plus.
Transcript
00:00:00.000
Hey everyone, real quick before you skip, I want to talk to you about something serious and important.
00:00:06.000
Dr. Jordan Peterson has created a new series that could be a lifeline for those battling depression and anxiety.
00:00:12.000
We know how isolating and overwhelming these conditions can be, and we wanted to take a moment to reach out to those listening who may be struggling.
00:00:19.000
With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way in his new series.
00:00:27.000
He provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward.
00:00:35.000
If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better.
00:00:41.000
Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety.
00:00:47.000
Let this be the first step towards the brighter future you deserve.
00:00:59.000
You can support these podcasts by donating to Dr. Peterson's Patreon, the link to which can be found in the description.
00:01:06.000
Dr. Peterson's self-development program, Self Authoring, can be found at selfauthoring.com.
00:01:13.000
[Intro music: The Jordan B. Peterson Podcast]
00:01:39.380
So I assume you're all here to talk about the early work of Carl Jung and this man's carnivorous
00:01:51.360
diet and the Soviet art he collects. No, in all seriousness, I'm really excited to be here with
00:01:58.700
you. We've never met before. Your official title is that you are a clinical psychologist and a
00:02:04.960
professor at the University of Toronto. You've written two books, one called Maps of Meaning
00:02:10.120
and the best-selling 12 Rules for Life, which is currently being translated into 40 languages.
00:02:15.540
But this description does not capture what you've become, which is a kind of phenomenon.
00:02:20.720
When I was reading 12 Rules for Life in a cafe, and in the locker room of my gym when it was sitting out
00:02:27.180
on a bench, people were coming up to me and saying, this book saved my life. And yet there are other
00:02:33.120
people in the country, including some of my fellow journalists, who insist that you are
00:02:37.140
actually a gateway drug to the far right. So I'm excited to be here with you. Not the myth of you,
00:02:43.680
but with the man. And I'm hoping we can use this hour or so to talk about your views on meaning,
00:02:50.660
on gender, on feminism, God, higher education. I'm sure we can solve all of that in under an hour.
00:02:56.540
So I want to start with the book 12 Rules for Life, which I'm hoping some of you have read.
00:03:01.700
Here are some of the messages in that book. Gender isn't a social construct. People should strive
00:03:08.260
for meaning in their lives, not happiness. Life is suffering, but there are ways to transcend it.
00:03:14.320
Stand up straight. Make your bed. Now, all of this to me seems pretty commonsensical. And yet I don't
00:03:22.320
think that there is a Canadian in the world that I've read more think pieces about. I don't think
00:03:27.600
it's a stretch to say that you are sort of the most loved and loathed public intellectual in the
00:03:33.320
Western world at the moment. So I'm wondering if you can talk a little bit about what that's like
00:03:38.980
and your understanding of it. You've just come from two days in Vancouver where, at events with Sam
00:03:45.480
Harris, you talked for over two hours about the question of truth. And 5,000 people showed up to
00:03:51.260
those events, not exactly a sexy Beyonce concert. What's going on? How do you understand it?
00:04:00.260
Well, I think you don't want to underestimate the role that technological transformation is
00:04:06.820
playing in this. You know, I've been thinking about YouTube and podcasts quite intensely for
00:04:15.600
about two years. So I started putting my university lectures on YouTube in 2013. And I did that for
00:04:26.140
a variety of reasons, mostly curiosity and the drive to learn. And I've found that if I want
00:04:32.640
to learn a technology, the best way to do it is to use it. And I'm always learning new technologies
00:04:37.340
because, well, not that that makes me particularly unique. And I had some success with my lectures
00:04:44.560
on public television in Canada. So I did some lectures with a series called Big Ideas on Canadian
00:04:51.220
Public Television. And there are about 200 of those lectures, done by 200
00:04:58.600
different people, and I did five of them. And they were regularly in the top 10 of the most viewed
00:05:02.900
lectures. And so I knew that there was some broader market for, let's say, ideas. And I thought, well,
00:05:12.800
I might as well put my lectures up on YouTube and see what happens. And then by April of 2016,
00:05:20.060
I had a million views. And I thought, huh. The only reason people are watching these is because they
00:05:27.820
want to watch them because they're actually really hard. And a million of something is a lot. If you,
00:05:34.600
if you sell a million copies of your book, well, first of all, that never happens, right? I mean,
00:05:39.220
it's very, very rare. You're very happy. You never have your scientific papers cited a million
00:05:45.560
times. You rarely have a million dollars. It's a very large number. And I thought, well,
00:05:49.600
this room excepted. Well, fair enough. Fair enough. And it's, of course,
00:05:53.360
it's not as uncommon as it once was, but it's still a significant number. And I didn't really have any
00:05:57.860
way of calibrating that. I thought, well, what am I supposed to do now that I hit a million views?
00:06:01.820
How am I supposed to conceptualize that? What is this YouTube thing anyways that was once a repository
00:06:06.680
for cute cat videos? So what does it mean to have a million views on it? And so I
00:06:13.260
really started to think about it, because, you know, there were a lot of people commenting as well.
00:06:17.320
And they were into the lectures and following them avidly. And I thought, okay, so what is this YouTube
00:06:27.000
exactly? And I thought, well, for the first time in human history, the spoken word has the same reach
00:06:36.940
as the written word. And not only that, no lag to publication and no barrier to entry. That's a major
00:06:44.220
technological revolution. That's a Gutenberg revolution. That's a big deal. This is a game
00:06:49.620
changer. And then it was soon after that that I discovered the podcast world, which is about 10
00:06:55.020
times as big as the YouTube world. And the podcast world is also a Gutenberg revolution, except it's
00:07:00.960
even more extensive because the problem with books and videos is that you can't do anything else while
00:07:06.840
you're doing them, right? When you're reading, you're reading. When you're watching a video, you know,
00:07:12.260
you can be distracted, but you have to pay attention to the video. But if you're listening
00:07:16.220
to a podcast, you can be driving a forklift or a long haul truck, or you can be exercising or doing
00:07:23.420
the dishes. And so what that means is that podcasts free up, say, two hours a day for people to engage
00:07:29.860
in educational activities that they wouldn't otherwise be able to engage in. And that's about
00:07:34.960
one eighth of people's lives. So podcasts hand people one eighth of their life back to engage in high
00:07:41.120
level education. So then I thought, well, people actually want to do this. There's a massive market
00:07:50.280
for high level intellectual engagement that's much deeper and more desperate, let's say, than anyone
00:07:55.860
suspected. We really saw that in Vancouver. You know, I mean, the discussion I had with Sam Harris,
00:08:03.160
the two discussions, we talked about the relationship between facts and values, and science and religion
00:08:09.280
more peripherally. But the dialogue was conducted at the level, I would say, approximately at the level
00:08:15.920
of a pretty rigorous PhD defense. And we were only supposed to talk for an hour and then go to Q&A,
00:08:23.060
but the crowd didn't want us to stop. And so we talked the first night for two and a half hours,
00:08:26.500
and the second night for two and a half hours, and the crowd was 100% on board the entire time.
00:08:32.480
And it wasn't because Sam was winning or I was winning. Neither of us, in fact, were trying to win.
00:08:38.080
We were trying to learn something. And we were actually trying to learn something. We weren't just
00:08:42.300
pretending to do that. And, you know, the place erupted at the end. And I think one of the things I've
00:08:49.420
realized in the last couple of days, as I've been thinking this through, is that the narrow bandwidth
00:08:53.340
of TV has made us think we're stupider than we are. And so people have a real hunger for deep
00:08:59.200
intellectual dialogue. And that can be met with these new technologies. And that has revolutionary implications.
00:09:08.720
I wonder about, you love to quote this line, this Nietzsche line, that anyone who has a why to live
00:09:15.400
for can endure almost any how. What's your why? What is driving you? You are the busiest man,
00:09:23.100
I mean, to get you here. You know, I think you're like, wherever you were last night, in Portland
00:09:27.360
tomorrow. Like, I don't know how you're alive, frankly, right now. What is driving you? Like,
00:09:32.800
what is this relentless drive? What are you pushing toward?
00:09:38.820
I'm trying to, well, I spent 15 years writing the first book I wrote, which is called Maps of
00:09:46.240
Meaning. And it's akin to 12 Rules for Life, although it's a much more difficult book.
00:09:51.840
The audio version of that book is out now, by the way. It's been out since June 12th. And
00:09:56.560
I would say, if you liked 12 Rules, or were interested in it, you could try that. I think
00:10:01.900
the audio version is much more accessible, because it's a difficult book. Getting the cadences of the
00:10:07.920
sentences right is an aid to comprehension. I spent 15 years writing that book, about three hours a day
00:10:14.580
writing, and a lot more time reading. And I was interested in solving a problem, which was,
00:10:22.420
I was interested in the great atrocities of the 20th century, the ones that were committed on the
00:10:27.940
right, and the ones that were committed on the left. But I was interested in it psychologically.
00:10:33.780
And what that meant was, had I been there, what could I have done to not participate?
00:10:45.540
And so that's what I've been trying to figure out. Because for me, what happened in Nazi
00:10:50.980
Germany, and what happened in the Gulag Archipelago, and in Maoist China, many places, was sufficient
00:10:57.540
definition of hell. Convincing, as well. And I wanted to understand what the opposite of that was.
00:11:06.740
And not sociologically, or politically, or economically, because I think that in the final
00:11:11.460
analysis, those levels of explanation are insufficient. But psychologically. How is it that you
00:11:18.100
must conduct yourself in the world, so that if the opportunity to participate in such things arises,
00:11:25.140
you won't? And, you know, when the Holocaust museums went up, there was a motto that went
00:11:32.020
along with them, which was, never forget. And I thought, yeah, fair enough, but you can't remember
00:11:38.900
what you don't understand. And so I wanted to understand it.
00:11:43.940
You see, when people read history, they either read it as a detached observer, or they
00:11:48.180
tend to read it as, well, maybe the heroic protagonist. People like to imagine that they
00:11:55.300
would be Schindler in Schindler's List. But that's wrong, because the probability that you'll be the
00:12:02.180
perpetrator is much higher, even if merely the perpetrator who's ensconced in silence, when
00:12:07.700
silence is not the appropriate thing. So I wanted to, having figured out what constituted hell, and the
00:12:13.540
pathway to that, which would be, I suppose, the cowardice and resentment
00:12:20.260
that produce either complicity in those events, or failure to oppose them when they emerge.
00:12:28.580
I wanted to understand what the opposite of that was. Because I think that's what needs to be learned
00:12:33.460
from what happened in the 20th century. And so that's why I wrote Maps of Meaning, was to understand that
00:12:38.260
and to lay out what the opposite was. And then that turned out to be extremely helpful to me, and then to
00:12:44.740
the people I started to teach about that, because it's useful to know what the opposite of hell
00:12:49.940
is. And I've been teaching those things to people since 1993. So that's 25 years. And the response from
00:12:59.140
the students has always been the same sort of response that I'm getting now, absent some of the
00:13:04.500
negative characterizations, let's say, which have emerged for particular reasons. But the students
00:13:10.660
have always said one of two things. And this is the vast majority of them. This isn't cherry-picked responses.
00:13:17.540
It's been the same everywhere. They tell me, and this is the same response I get from my audiences
00:13:23.780
now too, is they say, you've given me words to explain things, to explain and understand things
00:13:29.940
that I always knew to be true. Or I was in a very dark place for one of the seven reasons that people
00:13:37.700
might be in a dark place. Alcohol, or drugs, or failure of relationships, or lack of vision, or nihilism,
00:13:43.780
or hopelessness, or depression, or anxiety, you know, all the pitfalls that people can encounter.
00:13:49.140
And I've been developing a vision for my life, and trying to adopt responsibility, and trying to be
00:13:54.660
careful with what I say. And things are way better. And that's what drives me. So, you know, it's so
00:14:01.460
interesting watching what's happening, because, you know, you said, I'm the most loathed and the most
00:14:06.660
loved man. It's like, I'm loathed by a very small percentage of very noisy people. And there
00:14:13.140
are people who either don't, or haven't, or won't take a look at what I'm doing, partly because it
00:14:21.620
doesn't fit within their conceptual scheme. You know, whenever I'm interviewed by journalists
00:14:26.900
that have the scent of blood in their nose, let's say, they're very willing and able
00:14:36.500
to characterize the situation I find myself in as political. But that's because they can't see the
00:14:42.500
world in any other manner than political. And the political is a tiny fraction of the world, and what
00:14:47.380
I'm doing isn't political. It's psychological, or philosophical, or theological. The political element
00:14:54.420
is peripheral. And if people come to the live lectures, let's say, that's absolutely self-evident.
00:15:02.500
That's not what they're about. That isn't why people are there. That isn't what they talk to me
00:15:05.780
about afterwards. It's fundamentally irrelevant. The only reason this ever became political is because
00:15:13.300
in Canada, our provincial and federal governments had the unspeakable arrogance to propose compelled
00:15:23.300
speech legislation in a British common law system, where that had never been done ever, even once.
00:15:28.900
And despite the fact that your Supreme Court in 1942 made some such things unconstitutional.
00:15:36.420
Just explain to people here what actually happened, which is that you opposed this law,
00:15:41.860
which was going to compel you, you say, to use preferred pronouns of people that are transgender.
00:15:48.500
Is that accurate? It's accurate, but partial. So there were provincial laws that were
00:15:55.220
already in place to compel this sort of thing, but a federal law had been generated. And I went and read
00:16:01.300
the policy guidelines within which the federal law was to be interpreted. And those were produced by the
00:16:06.340
Ontario Human Rights Commission, which is a radical leftist inquisition, fundamentally. And they had
00:16:13.380
documented a very large number of policies that would make any sensible person's hair stand
00:16:19.540
on end if they read them, which they didn't, but I did. And not only did I read them, I understood them.
00:16:26.020
And having read them and understood them, I made videos, just one night, I got up at about three in
00:16:31.140
the morning because it was really bothering me for a variety of complicated reasons, including the fact
00:16:35.620
that a number of my clinical clients had been bullied into states of ill mental health by radical social
00:16:41.140
justice warriors at their various workplaces. And this was long before I was embroiled in any of this
00:16:45.860
controversy, by the way, so it wasn't a sampling bias. And at the same time,
00:16:52.660
the university, my university, had the gall, the unmitigated gall, to mandate, um, unconscious bias
00:17:01.140
retraining for their human resources staff, despite the fact that unconscious bias measurements are not
00:17:06.820
reliable or valid, even by the testimony of their formulators, and despite the evidence that there
00:17:12.340
is no data whatsoever lending unconscious bias retraining programs even the vaguest shred of
00:17:19.540
credible outcome. So I made these videos, and because I was annoyed about this, and I thought,
00:17:24.740
well, what'll happen if I make a video? And so...
00:17:28.660
Well, this is one of the things that I feel, or maybe you can answer it for us. I feel that because of this
00:17:36.020
incident, you are often characterized, at least in the mainstream press, as being transphobic. If you had a
00:17:41.860
student come to you and they said, I was born female, I now identify as male, I want you to
00:17:48.900
call me by male pronouns. Would you say yes to that?
00:17:51.860
Well, it would depend on the student, and the context, and why I thought they were asking me, and what I
00:17:57.620
believe their demand actually characterized, and all of that. Because that can be done in a way that's
00:18:03.060
genuine and acceptable, and a way that's manipulative and unacceptable. And if it was genuine and
00:18:09.620
acceptable, then I'd have no problem with it. And if it was manipulative and unacceptable, then not a chance.
00:18:16.020
And you might think, well, who am I to judge? Well, first of all, I am a clinical psychologist,
00:18:22.660
and I've talked to people for about 25,000 hours. And I'm responsible for judging how I'm going
00:18:28.980
to use my words. I judge it the same way that I judge all the interactions that I have with people,
00:18:33.460
which is to the best of my ability, and characterized by all the errors that I'm prone to. So, you know,
00:18:40.260
I'm not saying that my judgment would be unerring, but I have to live with the consequences. So,
00:18:44.740
I'm willing to accept the responsibility. But also, to be clear about this,
00:18:50.820
that never happened. I never refused to call anyone by anything that they had asked me to call them
00:18:57.380
by. And so, although that's been reported multiple times, it's a complete falsehood. And it had nothing
00:19:03.860
to do with the transgender issue, as far as I was concerned. And besides that, if it had
00:19:09.940
only to do with the transgender issue in Canada, the probability that this would have had the impact
00:19:15.780
that it had is zero. So that wasn't about that at all. It was about something far more, far deeper,
00:19:23.220
and far more insidious. And everyone knew it, which is why it didn't go away. What should have happened is,
00:19:29.220
there should have been a bit of controversy around it, maybe even a protest, and everyone's attention
00:19:33.940
should have gone away like a week later. And that didn't happen even a little bit. So there's more
00:19:39.700
going on here, as I knew, far more going on here than this little bill
00:19:44.100
would have revealed. One of your rules in 12 Rules for Life is, I hope I'm getting this right,
00:19:50.020
choose your words carefully. It'd be ironic if I got that one wrong. Be precise in your speech. Okay,
00:19:54.740
be precise in your speech. Which is, you know, you got it right. Okay, sort of. Yeah, well,
00:19:58.820
you got the gist of it. That's the crucial thing. One of the things that's happened to you in the past
00:20:02.820
two years is that every utterance of yours, and Caitlin alluded to this in her introduction,
00:20:08.340
is analyzed, maybe manipulated. How do you live with that reality? Well, how do you even have the
00:20:18.820
confidence to sort of continue to, from my perspective, rush into the breach on all sorts
00:20:25.380
of what have become third rail issues, knowing that so much of what you say is going to be
00:20:31.540
mischaracterized? And then I have a follow-up to that. Well, I mean, about 25 years ago, 30 years ago,
00:20:40.020
maybe 1985. I guess, how long ago is that? It's a long time. Some years. Yeah. I decided
00:20:48.500
that I was going to be very careful with what I said. Like, I noticed that when I was thinking
00:20:53.220
through some of these ideas that I already described, trying to understand what tilted
00:20:58.260
people towards vengefulness and cruelty, I was contemplating that personally, you know,
00:21:09.220
what would tilt me towards that, or what did tilt me towards that. And at the same time,
00:21:13.780
I developed a, what would you call it, an acute awareness of my speech. That was partly because
00:21:20.820
I'd asked a question, eh? And when you ask yourself a question, if you really ask a question,
00:21:25.380
you start thinking up the answer, whether you want to think it up or not. And the answer
00:21:30.020
that you might generate might bear very little resemblance to the answer that you would like
00:21:34.500
to generate. And I'd asked myself a question, which was, well, what's the pathway out of this hell,
00:21:40.660
let's say? And how might I be tangled up in that? And one of the things I started to realize was that
00:21:45.300
I wasn't very careful with what I said, and that that seemed in some way to be related to that.
00:21:49.780
It's not surprising because, you know, it's not really obvious that the Nazis, for example,
00:21:54.100
were all that careful about what they said in terms of its relationship to the truth. Quite the
00:21:59.140
contrary. And the same with the ideologues in the Soviet Union. And so the idea that there was some
00:22:03.700
relationship between carelessness in speech, lies and deception, and that sort of thing,
00:22:08.900
or self-aggrandizement, or any of the things that you can indulge in if you're careless with your
00:22:13.220
speech. And the weakening of your character to the point where you might get tangled up in great
00:22:19.380
and terrible sociological movements, that seemed to me to be reasonable. And many people had commented
00:22:25.620
on that, like Solzhenitsyn, for example. And so I started to experience discomfort with what I was
00:22:32.660
saying. And what seemed to happen was that I started to realize and could feel it. I was reading Carl
00:22:38.180
Rogers at the same time, and he actually suggested that psychotherapists pay attention to exactly this
00:22:43.860
sort of thing. I started to understand that many of the things I was saying weren't true. I didn't really
00:22:50.020
believe them. They weren't really my thoughts. They made me feel weak when I said them.
00:22:55.140
Can you give an example? That's a good question. Can I give you an example? Oh, maybe I would engage
00:23:01.860
in an argument with someone at a bar on an intellectual issue for the purpose of displaying
00:23:08.740
my intellectual superiority, or at least hypothetically displaying it, you know. So, you know, sometimes
00:23:14.420
people like to argue, and they like to argue because, hypothetically, they would like to win.
00:23:19.540
So you don't mean, though, that you were mouthing platitudes?
00:23:22.180
Oh, sure. I was doing that. Yeah, definitely. Oh, yes, all the time. And sometimes they
00:23:26.900
weren't even platitudes, you know. They might have been things that I picked up in books that
00:23:30.740
weren't cliches. But they weren't mine. I didn't have any right to them. Like, just because you read
00:23:38.020
something doesn't mean you have a right to it. You have to understand it. And understanding something
00:23:42.340
that's deep means a deep transformation. It means you have to live it. And so just because you know a
00:23:47.460
philosophical concept and you can say it doesn't give you the right to utter it as if it's yours.
00:23:51.860
You have to earn that. And I was a smart kid. And so my head was full of ideas that I hadn't earned,
00:23:57.540
and I could lay them out. But that doesn't mean they were mine or me. And so there was a falsity
00:24:02.420
in expressing them. And so I couldn't tell for a while because I would say things, and part of me would
00:24:08.180
be all critical about what I was saying. You don't believe that. That's not accurate. It's kind of a lie.
00:24:13.940
It was saying that to almost everything I said. And I took a risk. I thought,
00:24:18.580
okay, I'm going to assume that the part of me that's critical about what I'm saying is right.
00:24:24.100
Even though that was terrible because, really, it often meant I could hardly speak.
00:24:30.660
And then I learned to only say things that didn't make me feel weak. And then I decided that that's
00:24:36.420
what I was going to do. So I've been careful with what I've been saying for a long time.
00:24:40.100
I'm having a hard time with what you're saying right now. Because
00:24:46.260
shouldn't the test be, I'm only saying things that are true? Not, I'm only saying things that
00:24:53.060
don't make me feel weak. What am I misunderstanding in that formulation?
00:24:56.740
Well, what you're misunderstanding in part is how do you know the things that you're saying aren't true?
00:25:01.780
And I would say one of the ways you know is that they weaken you. And you can learn that. You can learn to feel that.
00:25:08.020
Going online without ExpressVPN is like not paying attention to the safety demonstration on a flight.
00:25:14.340
Most of the time, you'll probably be fine. But what if one day that weird yellow mask drops down
00:25:19.300
from overhead and you have no idea what to do? In our hyper-connected world, your digital privacy
00:25:24.580
isn't just a luxury. It's a fundamental right. Every time you connect to an unsecured network in a cafe,
00:25:30.020
hotel, or airport, you're essentially broadcasting your personal information to anyone with the technical
00:25:35.220
know-how to intercept it. And let's be clear, it doesn't take a genius hacker to do this. With some
00:25:40.180
off-the-shelf hardware, even a tech-savvy teenager could potentially access your passwords, bank logins,
00:25:45.620
and credit card details. Now, you might think, what's the big deal? Who'd want my data anyway?
00:25:50.820
Well, on the dark web, your personal information could fetch up to $1,000. That's right,
00:25:55.860
there's a whole underground economy built on stolen identities. Enter ExpressVPN. It's like a digital
00:26:01.940
fortress, creating an encrypted tunnel between your device and the internet. Their encryption
00:26:06.580
is so robust that it would take a hacker with a supercomputer over a billion years to crack it.
00:26:11.460
But don't let its power fool you. ExpressVPN is incredibly user-friendly. With just one click,
00:26:16.580
you're protected across all your devices. Phones, laptops, tablets, you name it.
00:26:20.820
That's why I use ExpressVPN whenever I'm traveling or working from a coffee shop. It gives me peace of
00:26:25.780
mind knowing that my research, communications, and personal data are shielded from prying eyes.
00:26:30.900
Secure your online data today by visiting expressvpn.com slash jordan. That's E-X-P-R-E-S-S
00:26:37.860
vpn.com slash jordan, and you can get an extra three months free. Expressvpn.com slash jordan.
00:26:48.020
Starting a business can be tough, but thanks to Shopify, running your online storefront is easier
00:26:52.900
than ever. Shopify is the global commerce platform that helps you sell at every stage of your business,
00:26:58.260
from the launch your online shop stage all the way to the did we just hit a million orders stage,
00:27:02.980
Shopify is here to help you grow. Our marketing team uses Shopify every day to sell our merchandise
00:27:08.340
and we love how easy it is to add more items, ship products, and track conversions. With Shopify,
00:27:13.860
customize your online store to your style with flexible templates and powerful tools,
00:27:18.420
alongside an endless list of integrations and third-party apps like on-demand printing, accounting,
00:27:23.540
and chat bots. Shopify helps you turn browsers into buyers with the internet's best converting
00:27:28.340
checkout, up to 36% better compared to other leading e-commerce platforms. No matter how big
00:27:33.780
you want to grow, Shopify gives you everything you need to take control and take your business to the
00:27:38.260
next level. Sign up for a one dollar per month trial period at shopify.com slash jbp, all lowercase.
00:27:45.460
Go to shopify.com slash jbp now to grow your business no matter what stage you're in. That's shopify.com slash jbp.
00:27:55.860
Carl Rogers talked about this a lot in his work in psychotherapy. He said that one of the primary
00:28:01.620
roles of a psychotherapist was to be congruent, and what he meant by that was that
00:28:07.620
there was no disjunction between what you felt in a situation, let's say, and what you said. That it
00:28:17.140
was all one piece. And that was an embodied unity, not merely a conceptual unity. So I really do think
00:28:24.100
that there's something to it. So you almost mean psychologically weak, not weak in terms of power.
00:28:32.020
Yeah. I mean morally weak. Weak in character, that sort of thing.
00:28:37.780
Yeah. That's what I mean. Okay. Yeah. And so, you know, I got very careful with what I
00:28:43.780
said. And at the same time, I was spending a tremendous amount of time writing. And so I was
00:28:48.980
very careful with what I wrote. So in Maps of Meaning, I think I rewrote every sentence in that
00:28:53.140
book at least 50 times. And so that's great. Every sentence? Yeah, well, that's for sure.
00:28:59.700
Oh yeah, that's for sure. And now, you know, I'd take the sentence out and then I'd write
00:29:03.700
a bunch of variants of it. And then I would pick the variant that was best. And then I would try to
00:29:08.260
come up with all the arguments I could about why the sentence was stupid.
00:29:11.060
Please don't tell me you still do this. Yeah. I still do this when I'm writing.
00:29:14.900
Okay. Did you do that, 50 versions of every sentence, in 12 Rules for Life also?
00:29:25.220
I meant to be precise in my speech. It's okay. It was more like 15 with 12 Rules
00:29:29.220
for Life. So it was less, but I'm a better writer than I was then. So I didn't have to do it quite
00:29:33.540
as often. So I kept writing it until I couldn't make the sentences any better. That doesn't mean
00:29:38.900
they were good. It just meant that I got to the point where if I was rewriting them, it wasn't
00:29:42.820
obvious that the rewrite was better than the original sentence. So then I had to stop.
00:29:46.420
So my question a few minutes ago was how has knowing that you're going to be intentionally,
00:29:53.940
your words are going to be sort of intentionally torqued, how has that changed you?
00:30:01.700
You know, it's exaggerated the care, but, you know, I had been quite careful and the evidence
00:30:07.540
for that is quite clear. So, you know, when all of this political
00:30:10.740
controversy swirled around me, well, it still does. Maybe it's even
00:30:18.500
exaggerated to some degree, but it was very intense in Canada for a good six months and
00:30:23.540
people were going over what I had put on YouTube with a fine-tooth comb. And there were 200 hours
00:30:29.620
of videos there. And you think, well, with some creative editing and with motivation in mind,
00:30:35.220
you'd think if you went over 200 hours of someone's lectures, you could find a smoking pistol,
00:30:39.780
even if you had to chop out a sentence, no one found anything. And the reason for that was
00:30:44.100
there wasn't anything there. That's why they didn't find it. And so I had already been
00:30:49.140
very careful. And I discussed all sorts of unbelievably contentious issues, you know,
00:30:54.900
because my classes were very intense. Like the Maps of Meaning class in particular,
00:31:00.420
you know, part of what I was trying to do
00:31:06.100
with my students was to convince them that had they been in Nazi Germany in the 1930s,
00:31:12.100
they wouldn't have been on the side of the good, right? That's a hell of a thing to drag people
00:31:17.300
through, but it's statistically overwhelmingly likely. So it was a very serious class and certainly a
00:31:23.780
place where you could step badly at any given moment. You know, I talked about gender differences
00:31:28.820
and the biological substructure of consciousness and all these things that could easily become
00:31:33.700
politically contentious. But as I said, there weren't any smoking pistols, but now for the last
00:31:38.740
two years, I've been even more careful and I have people watching me, you know, I mean, my family
00:31:44.180
watches me and what I'm doing, they keep very careful track of it. And if I deviate a little bit
00:31:51.060
from how they think I should have behaved, then they tell me. And I have
00:31:56.740
friends who are doing the same thing and I listen to them. Do you feel that you deviated from how you
00:32:01.540
should behave when you said of, I think it was, um, Mishra in the New York Review of Books?
00:32:07.380
No. Well, let me just share what you said, which is, uh, I'm trying to be precise in my speech,
00:32:12.260
but I believe you said, what did you say? That he was a
00:32:16.180
sanctimonious prick and if he were in the room... I said you're a sanctimonious prick and if you're here,
00:32:17.940
I'd slap you. Yeah, so you don't regret that? Not a bit. Okay. And I'll tell you why. Okay.
00:32:23.300
Well, look, it's really complicated. You know, I have this friend who's a native carver
00:32:30.980
and he comes from a very rough background, like way rougher than you think.
00:32:37.300
And maybe some of you have come from rough backgrounds or you know people who've come from
00:32:41.460
them, but he comes from a plenty rough background. And I started working with him, buying his art 15
00:32:47.060
years ago. And he was a survivor of residential schools in Canada. And we got pretty close.
00:32:52.420
And, uh, he helped me design the third floor of my house. And anyways, the long and
00:32:57.940
short of it was that I got inducted into his family about two and a half years ago in this big ceremony
00:33:04.420
up on a native reservation in northern Vancouver. And, uh, you know, we've been
00:33:11.780
through a lot together and a lot of it's been pretty rough. And you know, this
00:33:17.700
Mishra, or whatever the hell his name was, had the temerity to say that I was
00:33:22.740
romancing the noble savage. It's like, watch your step, buddy. You don't know what the hell you're
00:33:27.780
talking about. Not even a bit. And so had I been a left-leaning, uh, personage, and he had made
00:33:35.700
a comment like that, there would have been hell to pay. So, which isn't to say that I'm a right leaning
00:33:40.660
personage, by the way. So I don't regret it a bit. I think that what he said was absolutely
00:33:45.380
reprehensible and that he should have been called out on it. And so I don't regret it at all. Now,
00:33:49.860
people said, you know, maybe it would have been better for me not to have made that comment. And
00:33:54.580
it's possible that they're right. But I actually thought about it and I thought,
00:33:59.140
there's no excuse for that. You don't know what you're talking about. You're meddling with things
00:34:02.740
you don't understand. And you're making a casual aspersion, not only on me, but on my noble savage
00:34:08.580
friend. It's like, yeah, no. So speaking of things that people have said, um, sort of to defame you,
00:34:16.100
uh, you're currently suing Wilfrid Laurier University, um, because you'll correct me if I'm
00:34:22.180
wrong, but I think administrators there in their meeting with Lindsey Shepard, who was a TA who
00:34:27.380
showed a clip of you, they sort of interrogated her, accusing her of creating a hostile teaching
00:34:32.980
environment for showing a clip of you in her classroom. And during that interaction, which
00:34:38.260
she recorded, they compared you to Hitler. No, they compared me to Hitler or Milo Yiannopoulos.
00:34:44.740
Excuse me. Right? No, it's important. And the reason it's important is because,
00:34:49.220
look, these people, two, one of them- And just to finish that question,
00:34:53.940
maybe you'll braid this in. You are one of the most outspoken
00:34:57.220
champions, I would say, of free speech right now. I would like for you, if you can,
00:35:02.820
to sort of grapple a bit with believing in free speech so strongly, and yet also suing
00:35:10.660
this university for slander. Yeah, well, um, so first of all, they said playing
00:35:19.220
a clip of Jordan Peterson was like playing a clip of Hitler or Milo Yiannopoulos.
00:35:27.380
And I thought, well, let's go a little easy on the Hitler comparisons there. Guys, we might want to
00:35:32.740
save that for when it's really necessary. Because it's sacrilegious to
00:35:39.140
use an insult like that, except in situations where it's justified. It's not appropriate to use a
00:35:46.180
catastrophe like that casually, especially when you're doing it under the guise of moral virtue.
00:35:51.060
There's no excuse for it. And then the second thing is, you're a professor, both of you.
00:35:57.220
Get your damn words straight. Which is it? Am I Hitler or Milo Yiannopoulos? Seriously, those are not the
00:36:03.300
same people, in case you didn't notice. One of them was the worst barbarian in the 20th century, with the
00:36:08.900
possible exception of Stalin and Mao. And the other one is a provocateur trickster who's quite
00:36:16.260
quick on his feet and is, what would you say, stirring things up in a relatively
00:36:25.220
non-problematic way. They're not the same creature. And so to combine them in a single careless
00:36:32.660
insult during an administrative, what would you call it, investigation, which was entirely unwarranted,
00:36:38.820
by the way, and was predicated on an absolute lie. There hadn't been a student complaint,
00:36:42.820
as the university admitted. There was no excuse for that. And if they weren't professors, then, well,
00:36:46.980
it wouldn't have been so bad. But they were. And the reason that I sued them, there's a whole bunch
00:36:51.060
of reasons. I mean, the Hitler comparison and the Milo Yiannopoulos comparison were only two
00:36:57.060
of about 40 things that they tarred me with. And they're all listed in the deposition. And the
00:37:03.300
only reason I brought the lawsuit forward, what, seven months later, or something like that, was
00:37:08.500
because of what happened with Lindsay Shepherd. What happened to her at Wilfrid Laurier
00:37:14.500
is absolutely inexcusable. Everything they did to her was predicated on a lie. Then the university
00:37:21.540
apologized, and so did the professor. And then he lied during his apology, which was a
00:37:28.980
forced apology anyways, and therefore of very little utility. They were subject to no disciplinary action,
00:37:34.260
even though the statutes of the university required it. And they made Lindsay Shepherd's life a living
00:37:38.660
hell, even after they apologized to her and told her that she did nothing wrong, and that they hadn't
00:37:42.980
followed their own procedures. So I read her deposition, and I actually read it on YouTube, where it's got
00:37:47.780
about 500,000 views, by the way. And I thought, you people haven't learned anything. You've learned
00:37:53.380
absolutely nothing. And so if one lawsuit doesn't convince you, maybe two will. And then with
00:37:59.460
regards to free speech, it's like, free speech is still bounded inside a structure of law. And these
00:38:05.860
people broke the law, or at least that's my claim. So I don't see the contradiction there at all. You can't
00:38:10.580
just slander someone, defame them, lie about them. You can't incite people to crime. There's all sorts of
00:38:16.180
reasonable restrictions on free speech that are already codified, essentially, in the British
00:38:21.620
common law system. But Wilfrid Laurier learned nothing. But this isn't over yet. But isn't it
00:38:28.100
creating a chilling effect, which is something that those of us who care so much about free speech
00:38:32.980
want to sort of stay away from? You could say that these sorts of defamation lawsuits are a really,
00:38:39.780
really dangerous, slippery slope. And I'm sort of surprised you don't see it that way.
00:38:43.620
Well, you know, I do see it that way, which is why I spent seven months thinking about it before I
00:38:49.620
decided to do it. But I thought that there's always risk in every decision. There's the risk of doing
00:38:56.740
something, and there's the risk of not doing something. And both of those risks are usually
00:39:01.540
catastrophic in every decision you make in life. It's like I weighed up the risks, and I thought,
00:39:08.740
nope, the risk here of not doing something is greater than the risk of doing something.
00:39:13.140
And had they shown any sign... Look, one of the things that Wilfrid Laurier did in the aftermath
00:39:18.340
of this scandal, which, by the way, was the biggest scandal that ever hit a Canadian university,
00:39:22.500
by a large margin. And it was an international scandal. I rarely go places where people haven't
00:39:27.380
heard about this. And so it was a big deal. And they had plenty to learn. And they learned nothing.
00:39:32.740
They set up a panel, hypothetically, to clarify their position on free speech and its relationship
00:39:39.460
to inclusivity, et cetera. And the only two people on the panel who were advocating for the free speech
00:39:44.900
position resigned in frustration. And I know that because I know who they are. And so, well, that's
00:39:52.020
just one of the pieces of evidence that they didn't learn anything. And then they continued to mistreat
00:39:56.740
Shepherd, continually. Like, her deposition, it's like a novel of stupidity. You know? It's like...
00:40:04.900
And my sense was, had there been any sign whatsoever of, let's call it, true apology and procedural
00:40:15.940
rectification, that she would have left them alone, and so would have I. But there was zero. In fact,
00:40:22.420
if anything, what they did was double down and go underground. Here's our apology. Here's our
00:40:29.220
procedures. That's what they showed the world. Here's how nothing at all has changed. It's like,
00:40:34.660
no, not good enough. Since we're on the subject of universities, you recently said that what
00:40:40.420
universities have done is beyond forgiveness. I wonder if you can explain what you mean by that.
00:40:45.940
And a second connected question is, should we... I'll put it starkly. Should we abolish universities?
00:40:54.820
No, they'll do that themselves. Okay. Let's hear a little bit about what they've
00:41:00.180
done that you think renders them beyond forgiveness. Well, they're overwhelmingly
00:41:08.660
administratively top heavy. And they don't spend any more money on the faculty than they did
00:41:16.580
30 years ago. And the cost of that administrative top heaviness, which is well documented, not by me,
00:41:24.340
by other people, and it's been accelerating over the last 20 years,
00:41:28.500
has been a radical increase in tuition fees, especially compared to the radical decrease in
00:41:34.900
price of most things over the last 20 years. Now, so they've become administratively top heavy.
00:41:41.140
The way, and this is especially true in the United States, the way that's been managed is that
00:41:46.660
unsuspecting students are given free access to student loans that will cripple them through their 30s
00:41:53.940
and their 40s. And the universities are enticing them to extend their carefree adolescence for a four
00:42:02.500
year period at the cost of mortgaging their future earnings in a deal that does not allow for escape
00:42:09.220
through bankruptcy. So it's essentially a form of indentured servitude. There's no excuse whatsoever for
00:42:14.660
that. It means the administrators have learned how to pick the future pockets of their students.
00:42:19.700
And because they also view them in some sense as sacred cash cows and fragile, let's say, because you
00:42:26.580
might wonder why the students are being treated like they're so fragile. It's like, well, we don't want
00:42:30.740
them to drop out now, do we? Because if they drop out, then we don't get our hands on their future
00:42:35.860
earnings in a way that they can't escape from. And that cripples the economy because the students come
00:42:40.900
out overladen with debt that they'll never pay off right at the time when they should be at the peak
00:42:45.940
of their ability to take entrepreneurial risks. So they can't do that because they're too crippled by
00:42:50.580
debt. And so that's absolutely appalling. They're gerrymandering the accreditation processes so that the
00:42:56.340
degree no longer has credible value. They're enabling the activist disciplines,
00:43:01.780
which have zero academic credibility whatsoever in my estimation. And I'm perfectly willing to defend
00:43:07.380
that claim. And by enabling the activist disciplines, they're allowing for the distribution
00:43:15.940
of this absolutely nonsensical view that Western society is fundamentally a patriarchal tyranny, which is
00:43:22.980
absurd on at least five dimensions of analysis, but is becoming increasingly the thing you have to
00:43:28.900
believe if you're allowed to speak in public. Um, what else? Well, that's a good start.
00:43:35.540
They're not teaching students to read critically. They're not introducing them to
00:43:40.420
great literature. They're not teaching them to write. It's like the list goes on and on and on.
00:43:45.300
Do you think in a way that you are a symbol of higher education's failure? Meaning the reason
00:43:52.500
maybe that people are showing up, 5,000 people, to listen to you, and it's going to be 20,000 in London
00:43:58.500
in July is because there aren't that many people who unironically are talking about what it is to live
00:44:06.580
a good life and asking questions about how to live a meaningful one. If you would say that in most
00:44:12.100
universities, I feel that you would be laughed out of the room. Well, it would depend on how you said
00:44:17.140
it and to whom, but if you say it to students, then they're so happy to listen to you that
00:44:21.460
they can hardly stand it. Because even the most cynical students come to university hoping that
00:44:25.940
there's something there worth learning. And the reason that they're exposed to great literature,
00:44:31.220
for example, because there is such a thing, it's not all power claims, is because great literature
00:44:37.540
contains the key to wisdom and you need wisdom in order to live without undue suffering. So, yes,
00:44:44.500
I mean, would I say that what's happened to me is a reflection of the failure of the
00:44:48.820
universities? It is in part, although I did teach this. And not just you, the whole intellectual dark
00:44:54.660
web. The fact that people listen to Sam Harris talk for hours and Michael, I mean, all of these,
00:45:01.540
these people. Well, I think, you know, you want to go for the simple
00:45:05.940
solutions before you go for the complex ones. And you want to go for the solutions that are associated
00:45:11.060
with ignorance rather than malevolence first. And I would say that we don't want to underestimate the
00:45:16.740
degree to which what's happening on YouTube and with podcasts is a consequence of a technological
00:45:24.020
revolution. Like I've known for years that the universities underserved the community,
00:45:28.900
because for some reason we think that university education is for 18 to 22 year olds, which is a
00:45:34.260
proposition that's so absurd that it's absolutely mind-boggling that anyone ever
00:45:39.140
conceptualized it. It's like, you know, why wouldn't you take university courses throughout your entire
00:45:44.660
life? I mean, what, you stop searching for wisdom when you're 22? I don't think so. You don't even start
00:45:50.260
usually until you're like in your mid-twenties. So I knew the universities were underserving
00:45:55.380
the broader community a long time ago, but there wasn't a mechanism whereby that could be rectified
00:46:00.580
apart from, say, books. And of course that was part of the rectification. So I think you don't
00:46:05.060
want to underestimate the technological transformation, but then, and then I would also say, I mean,
00:46:10.580
I was teaching this in university, you know, so it isn't like there isn't anybody in university
00:46:15.700
still teaching this sort of thing. There, there are plenty of qualified professors who are still doing
00:46:20.260
a good job, but they're being pushed out very rapidly, and terrified as well, by the activist
00:46:26.820
disciplines. You speak and write a lot about how masculinity is in crisis. What are some of the
00:46:33.380
main signs of it? And then we'll open it up to questions soon. And is Trump a symbol of that crisis
00:46:41.220
or a corrective to it? Well, I don't really think that masculinity is in crisis. I think that
00:46:45.620
to the degree that masculinity per se is regarded as toxic, that will produce a crisis,
00:46:52.180
which isn't the same thing. Um, I think there's a crisis of meaning, let's say, in our
00:46:59.140
culture, but that's not new. That's been the case for quite a long time, but I don't think
00:47:03.620
it's specific to masculinity. That's been a story that's kind of aggregated around me. And the way that
00:47:09.860
happened was, well, the people who don't like what I'm saying, look at my audience and they say,
00:47:14.820
oh, well, he's speaking mostly to men. Therefore he must be speaking to men. It's like, well, no,
00:47:19.300
the baseline rate for YouTube utilization is about 80% male. So the fact that most of the people who
00:47:24.740
are watching me on YouTube were male is an artifact to some degree of the fact that most of the people
00:47:30.500
who watch YouTube are male. Now it may also be that the sorts of things that I'm saying are more
00:47:36.420
pertinent to men. Although I'm not convinced of that. Most of my students throughout my university
00:47:41.460
career have been women, because psychology, you know, is dominated by women to a great degree.
00:47:47.060
And ever since I published my book, the proportion of people who are coming to my lectures that is
00:47:52.740
female is reliably increasing. It's probably up to about 35% now, I would say, from about
00:47:59.140
20%. So I don't think it is a message that's particularly germane to men, although it is germane
00:48:05.860
to men. And I don't think, I don't think that there's like an independent crisis of masculinity.
00:48:12.900
There might be a crisis of concepts of masculinity. And I think that's hard on young men in some ways.
00:48:18.660
And the reason for that is, you know, you're supposed to be duty-bound as a
00:48:24.580
virtuous person to buy the doctrine of the tyrannical patriarchy. It's like, well, look, first of all,
00:48:30.580
every hierarchical system tends towards tyranny. That's a universal truism. And our structures have
00:48:38.420
the same problem, obviously. And we have to be eternally vigilant so that they don't devolve into
00:48:44.260
tyranny. But that doesn't mean that they are tyrannies and always have been. And of course,
00:48:48.980
also compared to what? Compared to your hypothetical ideological utopia? Yes. Compared to every other
00:48:56.820
society that's ever existed on the planet, including most of the ones that exist now?
00:49:02.420
Definitively not. But anyways, if you buy that idiot unidimensional idea, which is a pathological
00:49:08.980
error, and you see your culture as a tyrannical patriarchy, then you see any attempt
00:49:15.780
to move up that hierarchy as a manifestation of patriarchal tyranny. Now, the problem is
00:49:21.300
that a lot of the ways you move up a modern functional hierarchy is through competence.
00:49:27.220
And if you take young men, it doesn't happen as much with young women for reasons we can go into,
00:49:31.620
but if you take young men and you say, every manifestation of your desire to move up the
00:49:36.340
hierarchy is nothing but proof of your participation in the tyrannical patriarchy,
00:49:40.820
then you tend to demoralize them, which is exactly what you're trying to do, by the way,
00:49:44.580
if you take that stance to begin with. Because I really think that at the bottom
00:49:49.780
of the most pathological manifestations of the collectivist dictum
00:49:53.860
is an assault on the idea of competence itself. And that's another unforgivable sin that the
00:49:58.580
university has committed. Look, there's no doubt that human hierarchies are error-prone,
00:50:05.220
and they tilt towards tyranny. Obviously. But that doesn't mean that they are unidimensionally
00:50:13.140
patriarchal tyrannies. They're neither patriarchal nor tyrannies. But that's received wisdom now,
00:50:18.740
and to question it means that you're a misogynist fascist. So I tell young men, it's like,
00:50:26.260
no, no, no. It's like, there's something to competence, man.
00:50:30.660
Well, I'll just- Speaking as a woman who has read your book, and I'm with you
00:50:35.540
for so much of it, and then you start to lose me when you talk about archetypes. The way you talk about
00:50:43.380
archetypes in the book, and again, forgive me if I'm being slightly imprecise, but I'm trying to gloss it for an
00:50:48.020
audience who might not have read it, is that in this sort of Jungian archetypal world, chaos is
00:50:54.660
feminine, order is masculine, and the subtitle of your book is an antidote to chaos. So as a woman
00:51:02.820
reading that, you know, I'd like for you to explain to me maybe what I'm missing there, because that's
00:51:09.620
when you started to lose me a little bit as a reader. Why does there need to be an antidote to the
00:51:14.980
feminine in that way? Well, there has to be an antidote to anything that's manifesting itself in
00:51:21.380
excess. And it's chaos that's manifesting itself in excess at the moment in our culture.
00:51:29.540
And so that's what I decided to address in this book. And mostly that was because, I suppose,
00:51:37.380
it was addressed at least in part to younger people. And what younger people have to contend with,
00:51:43.140
generally speaking, is an excess of chaos, because they're not very disciplined.
00:51:47.620
And so, you know, we kind of have this idea, and
00:51:55.620
let me see if I can put this properly, that you have a certain delightful, wonderful,
00:52:00.900
positive freedom as a child, and then that's given up as you approach adulthood. But the truth of the
00:52:06.660
matter is that you have a lot of potential as a child, but none of that is capable of manifesting itself
00:52:12.580
as freedom before you become disciplined. And discipline is a matter of the imposition of
00:52:17.060
order. And the order is necessary, especially for people who are hopeless and nihilistic. And lots of
00:52:22.260
people are hopeless and nihilistic. Way more people than you think. And part of that is because no one's
00:52:27.620
ever really encouraged them. And so the book is, in part, a matter of encouragement. It's like,
00:52:32.660
lay a disciplinary structure on yourself. Get the chaos in check. And then you can
00:52:38.980
move towards a state that's freer, because it's discipline first. Like, look, if you're going to
00:52:42.900
become a concert pianist, there's going to be several thousand hours of extraordinarily disciplined
00:52:48.340
practice. That's the imposition of order on your potential, let's say. But what comes out of that is
00:52:53.220
a much grander freedom. And so virtually every freedom that you have in life that's true freedom
00:52:58.660
is purchased at the price of discipline. And so, because I think that it's nihilism and hopelessness that
00:53:06.500
constitute the major existential threat, especially to young people at the moment,
00:53:11.300
then I was concentrating on the necessity of discipline and order. So, the issue with
00:53:16.340
regards to the metaphysical or symbolic representation of chaos as feminine, well, that's a very complex
00:53:23.220
problem. And the first thing you have to understand is that there's no a priori supposition that
00:53:30.260
order is preferable to chaos in any fundamental sense. They're both constituent elements of reality.
00:53:36.020
You can't say one's bad and the other's good. You can say that they can become unbalanced,
00:53:40.660
and that's definitely not good. Too much chaos is not good, obviously. Too much order is not good,
00:53:47.860
equally obviously. Those are the two extremes that you have to negotiate between. And I'm not making a
00:53:53.700
casual claim with regards to the idea that reality is an amalgam of chaos and order. I don't think that
00:54:00.180
there is any more accurate way of describing the nature of reality. That's the most fundamental,
00:54:06.820
maybe not the most fundamental truth, but certainly there are two
00:54:12.820
fundamental truths. Reality is composed of chaos and order, and your role is to mediate between them
00:54:21.700
successfully. That's metaphysical and symbolic truth. But it's more than that, because
00:54:28.820
that's actually how your mind and your brain are organized. Not only conceptually, but emotionally,
00:54:35.780
motivationally, and physiologically. Now, I don't really understand how that can be, because it isn't
00:54:41.540
obvious to me how the most fundamental elements of reality can be chaos and order. But the evidence that
00:54:48.980
that is the case is overwhelming. I can give you a quick example, which is quite interesting. So,
00:54:55.940
you have two hemispheres. There's a reason for that. The fundamental reason for that is that one of them
00:55:02.740
is adapted for things you don't understand. That's, roughly speaking, the right hemisphere. And the other
00:55:08.260
is adapted for things that you do understand. That's the left hemisphere. And so that's a chaos order
00:55:13.380
dichotomy. And the fact that you're adapted to that, that the very structure of your brain
00:55:21.940
reflects that bifurcation, indicates, as far as I can tell, beyond a shadow of a doubt, because it's also
00:55:29.700
characteristic of non-human animals, many of them, that that differentiation is fundamentally true in some
00:55:38.820
sense. Now, you might ask, well, why is that conceptualized as masculine versus feminine?
00:55:44.260
Because it's not male versus female. By the way, those are not the same thing, because one's conceptual.
00:55:50.740
That's extraordinarily complicated. I think the reason is, is that we're social cognitive primates,
00:55:56.660
and that our fundamental cognitive categories, our a priori cognitive categories, are
00:56:04.660
masculine, feminine, and child. It's something like that. That's the fundamental structure of reality.
00:56:09.940
Because we're social creatures, and we view reality as something that's essentially social in its
00:56:16.660
nature. And then when we started to conceptualize reality outside the social world, which wasn't very
00:56:21.700
long ago, by the way, and which is something that animals virtually don't do at all, we used those a priori
00:56:27.300
social categories as filters through which we interpreted the external world. And we're sort of stuck with that,
00:56:34.260
in some deep sense. And you might say, well, why do we have to be stuck with that? It's like,
00:56:38.820
well, because some things are very difficult to change. Like, if you go watch a story, and the
00:56:47.220
characters in this story slot themselves into those archetypal categories, then you'll understand the
00:56:52.340
story. And if they don't, you won't. Because your understanding is predicated on application of the
00:56:57.220
archetypal a priori to the story. You wouldn't understand it otherwise. So you can't get under that. There's no
00:57:03.220
under that, not if you want to remain human. And I can give you a quick example. I like to use Disney movies
00:57:12.580
for a variety of reasons, mostly because everybody knows them. The evil
00:57:18.500
queen in Sleeping Beauty is not an accidental character. She's the way she is,
00:57:23.860
because we understand her. And the reason we understand her is because we see the world through
00:57:28.580
the categories that I just laid out. And you can say, well... But are you saying she has to be a
00:57:32.420
queen and not a king? No, if she was an evil king, she'd be different. She'd be like Scar in The Lion
00:57:37.700
King. Just as evil, man. But not the same character. Right? Yeah.
00:58:56.420
I guess I'm struck that it seems like a
00:59:02.100
lot of your intellectual project is reasserting difference in an age where we're told that everything
00:59:09.300
is the same. Yeah, but it's stupid. Okay, well, look, I'm sorry to be so blunt, but
00:59:17.380
the problem with some of this is that some of it's willful
00:59:24.020
blindness, and some of it's just ignorance. So let me just lay out a couple of things. So,
00:59:30.100
for example, I've been taken to task alongside, let's say, James Damore, who had actually been highly
00:59:35.700
influenced by my videos and my classes before he did what he did at Google. You know,
00:59:42.020
I've studied personality differences between men and women for 25 years and written papers on the
00:59:46.900
topic. And it's actually an area of expertise of mine, and substantial expertise too, not
00:59:52.020
pseudoscience expertise. Thank you very much. I'm not a pseudoscientist. So my publication record puts
00:59:58.580
me in the top 0.5% of psychologists. So I'm not a pseudoscientist by any stretch of the imagination.
01:00:04.980
And I have 10,000 citations, which is not a million, but it's a lot, and a hundred published
01:00:10.100
papers. So let me lay out one of the personality differences between men and women,
01:00:15.700
because it's worth understanding. And you might say, well, there can't be personality
01:00:19.540
differences between men and women, because that's anti-feminist. It's like, no, it's not.
01:00:24.260
We might have to actually understand that there are differences between men and women,
01:00:27.860
so that we can let men and women make the choices they're going to make without
01:00:31.540
subjecting them to undue manipulation. Okay. So one of the reliable differences between men and
01:00:37.060
women cross-culturally is that men are more aggressive than women. Now, what's the evidence
01:00:42.580
for that? Here's one piece of evidence: there are about 10 times as many men as women in prison. Now,
01:00:48.820
what's that? A sociocultural construct? It's like, no, it's not a sociocultural construct. Okay.
01:00:54.660
Here's another piece of data: women attempt suicide more than men, by a lot. And that's because
01:01:00.740
women are more prone to depression and anxiety than men are. And there's reason for that. And
01:01:05.060
that's cross-culturally true as well. They're more likely to try to commit suicide, but men are way
01:01:10.100
more likely to actually commit suicide. Why? Because they're more aggressive, so they use lethal means.
01:01:17.860
Okay. So now the question is, how much more aggressive are men than women? And the answer is,
01:01:25.140
not very much. So the claim that men and women are more the same than different is actually true.
01:01:30.900
But this is where you have to know something about statistics to actually understand the way the
01:01:34.740
world works, instead of just applying your a priori ideological presupposition to things that are
01:01:40.900
too complex to fit in that rubric. So if you drew two people out of a crowd, one man and one woman,
01:01:50.740
and you had to lay a bet on who was more aggressive, and you bet on the woman,
01:01:54.260
you'd win 40% of the time. Okay. So that's quite a lot. It's not 50% of the time, which would be no
01:02:00.580
differences whatsoever, but it's quite a lot. So there's lots of women who are more aggressive than
01:02:05.060
lots of men. So the curves overlap a lot. So there's way more similarity than difference. And this is
01:02:11.460
along the dimension where there's the most difference, by the way. Right? But here's the
01:02:16.500
problem. You can take small differences at the averages of two distributions; the distributions move
01:02:25.620
off to the side, and then all the action is at the tail. So here's the situation. You don't care about
01:02:31.940
how aggressive the average person is. It's not that relevant. What you care about is who is the most
01:02:39.860
aggressive person out of 100. Take 100 people, and you take the most aggressive person, because
01:02:46.020
that's the person you better watch out for. And what's the gender? Men. Because if you go three
01:02:53.620
standard deviations out from the mean on two curves that overlap but are slightly disjointed,
01:02:59.860
then you derive an overwhelming preponderance of the overrepresented group. And that's why men are
01:03:06.180
about 10 times more likely to be in prison. It has nothing to do with socialization.
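To make the tail arithmetic concrete, here is a minimal numerical sketch of the point, assuming two normal distributions of equal spread and an illustrative effect size of d = 0.36, a value chosen so the "bet on the lower-mean group" comes out near 40%; it is not a figure quoted in the talk:

from scipy.stats import norm

d = 0.36  # assumed standardized mean difference (Cohen's d), illustrative only

# Probability that a random draw from the lower-mean group exceeds a random
# draw from the higher-mean group: the difference of two unit normals has
# standard deviation sqrt(2).
p_lower_wins = norm.cdf(-d / 2 ** 0.5)
print(f"bet on the lower-mean group, win {p_lower_wins:.0%} of the time")

# Over-representation of the higher-mean group beyond increasingly extreme
# cutoffs (in SD units from the overall midpoint). The curves mostly overlap,
# but the ratio in the tail grows as the cutoff moves out.
for cutoff in (1, 2, 3, 4):
    tail_hi = norm.sf(cutoff - d / 2)  # tail mass of the group with mean +d/2
    tail_lo = norm.sf(cutoff + d / 2)  # tail mass of the group with mean -d/2
    print(f"{cutoff} SD cutoff: {tail_hi / tail_lo:.1f}x over-representation")

A pure normal-tail model like this understates real-world selection effects such as imprisonment, which compound several traits at once, so it does not reproduce the exact 10x figure; it only shows the direction of the effect.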
01:03:12.340
And then there are other differences too. So it turns out that differences in aggression and agreeableness
01:03:18.900
also predict differences in interest. And so it turns out that men are more interested
01:03:26.580
in things, on average, than women are, and women are more interested in people, on average. And that's actually
01:03:31.620
the biggest difference that's been measured between men and women. It's nothing to do with ability.
01:03:36.660
It has to do with interest. And so the way that manifests itself is that women are more likely to
01:03:42.980
go into disciplines that are characterized by the care of others. And you can tell that by the way
01:03:48.500
occupations are segregating. All you have to do is look at the data for like 15 minutes. Women
01:03:53.940
overwhelmingly dominate health care. And that's accelerating, by the way. And men dominate
01:03:59.940
engineering, let's say. And so you say, well, that's sociocultural. It's like, no, it's not. And here's the proof.
01:04:08.980
So now what you do, because you want to test this hypothesis, right? And believe me,
01:04:14.660
the other thing that you want to understand is that left-leaning psychologists generated this data.
01:04:21.060
And you think, well, how do you know that? That's easy. There are no right-leaning psychologists.
01:04:30.740
And that's been well documented. And so people have published this data despite their ideological
01:04:38.340
proclivities and despite the fact that this is not what they expected to find or what they wanted
01:04:43.300
to find. So what you do now is you you stack countries by how egalitarian their social policies
01:04:49.700
are, right, from the least egalitarian to the most. And you say, well, the Scandinavian countries are the
01:04:54.820
most egalitarian. And by the way, if we don't agree on that, then there's no sense having this
01:04:59.220
discussion at all because we don't agree on what egalitarian means. If you don't think that what the
01:05:04.180
Scandinavians have done has been a move in the direction of egalitarianism, then I have no idea
01:05:10.020
what you mean by egalitarianism. Now, you could say, well, they haven't done it perfectly. It's like,
01:05:14.340
yeah, yeah, that's true. But it's not relevant to this argument. So what you do is you stack countries by
01:05:20.260
how egalitarian their social policies are, and then you look at occupational and personality
01:05:25.380
differences between men and women as a function of the country. And what you find is,
01:05:32.420
as the country becomes more egalitarian, the differences between men and women increase.
01:05:41.780
They don't decrease. And so what that means is that the radical social constructionists
01:05:48.020
are wrong. And it's not a few studies with a couple of people done by some half-witted
01:05:52.980
psychologists in some tiny little university. It's population level studies that have been published
01:05:58.260
in major journals that have been cited by thousands of people. It's not pseudoscience.
01:06:04.740
It's not questioned by mainstream psychometricians and personality theorists.
01:06:10.340
We figured this out back in like 1995. Everyone thought it was settled. And so what's the big problem?
01:06:16.580
Well, who knows what the big problem is? The outcome is not exactly the same between the genders.
01:06:22.980
It's like, well, who says it has to be? And more importantly, and this is something to ask yourself
01:06:27.540
constantly, just who the hell's going to enforce that? And just exactly how are they going to enforce
01:06:32.980
that? And believe me, it's not going to be in some manner that you like. Because there are differences
01:06:37.620
between men and women. And if you leave them alone, those differences manifest themselves in different
01:06:42.260
occupational choices. That's the other finding. This is a newer one. As the societies become more
01:06:47.860
egalitarian, the occupational differences between men and women maximize. And what that means is that fewer
01:06:55.060
and fewer women go into the STEM fields. Now, no one wanted that. No one predicted it. No one was
01:07:01.060
hoping for it. It actually flew in the face of, I would say, the most established psychological theories,
01:07:06.260
because my presupposition, certainly, 20 years ago, was that what would happen as we made societies
01:07:12.820
more egalitarian would be that men and women would converge. That's not what happened. The biological
01:07:17.220
differences maximized as we eliminated the sociocultural differences. And so maybe you don't
01:07:22.900
like that. It's like, that's fine with me. I didn't say I liked it. But whether or not I like a piece of
01:07:28.340
data has very little bearing on whether or not I'm liable to accept it. You know, I'm trying to look at the
01:07:33.460
damn scientific literature and to draw the conclusions that are necessitated by the data.
01:07:39.700
And then you can say, well, the whole thing is suspect because it's the construction of
01:07:45.700
the patriarchal tyrants who generated the Eurocentric scientific viewpoint. It's like you want to have
01:07:51.300
that conversation, then go to an activist discipline and have it because it's not the sort of conversation
01:07:56.740
that anyone sensible would engage in. So I'd love to open up the room to
01:08:03.300
questions, please, sensible questions, and please keep them short, but genuine questions. Someone
01:08:11.540
with a microphone will find you if you raise your hand.
01:08:24.100
Good evening. My name is Prera. I wanted to understand a little bit of your view,
01:08:28.820
more on the fact that, not fact, but at least observation, that over generations and generations,
01:08:35.540
or at least what I've heard and seen in my family, that women are told about
01:08:42.660
their position in the home and men are told their position is to work and be a little more aggressive,
01:08:50.660
you know, the social conditioning. So how does that play a role? Because I didn't hear that
01:08:57.620
being a dimension of reaching these conclusions.
01:09:01.300
Well, I've never claimed that the differences between men and women are 100% biologically determined.
01:09:07.700
They're biologically influenced. The radical constructionists make the opposite claim, that there
01:09:11.940
are no biological differences between men and women. It's like, well, first of all, that's so preposterous that
01:09:18.100
it barely even requires an answer. But you might specify it a bit and say,
01:09:22.260
no, there are no biological differences that manifest themselves psychologically.
01:09:26.580
And that's not quite as preposterous, but it's also incorrect. It's obviously the case that all sorts
01:09:31.780
of things about sex roles and gender roles, let's say, are conditioned by sociocultural mechanisms,
01:09:38.500
because human beings are very, very plastic. And so the manner in which those biological differences
01:09:46.100
manifest themselves in a culture is radically influenced by the nature of the culture.
01:09:50.980
But that doesn't mean that the biological influences don't exist. So...
01:09:55.060
But are you saying, should we be countering that sort of traditional...
01:10:04.100
So what I still didn't understand is like, at one point, you're saying it's not necessarily biological, but...
01:10:14.740
Yeah, at one point, you're saying that it's not necessarily biological or inherent, if I had to paraphrase it.
01:10:21.700
Yeah, but it's very unclear. Maybe one hour is very short, and maybe it needs a larger
01:10:28.500
discussion. It seems that it's easy to deduce that these are inherent differences which
01:10:34.580
exist, and social conditioning wasn't taken as a parameter to arrive at that.
01:10:39.860
Well, that's controlled for by the comparison between societies that have different levels of
01:10:43.540
egalitarianism built into their social structure. That's all taken care of in the analysis.
01:10:49.060
If the biological differences manifest themselves maximally, where the sociocultural influences
01:10:54.020
to equalize gender are maximal, then obviously the biological differences are powerful and profound.
01:10:59.860
It's conclusive. So it's taken into account in the data analysis.
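For what it's worth, the kind of cross-country analysis being described can be sketched in a few lines; the values below are entirely made up for illustration and are not data from the studies mentioned:

from scipy.stats import pearsonr

# Hypothetical illustrative values: an egalitarianism index per country
# (higher = more egalitarian social policy) and a measured sex difference
# in some trait, in standardized units. Placeholders, not findings.
egalitarianism = [0.20, 0.35, 0.50, 0.62, 0.71, 0.83, 0.90]
sex_difference = [0.25, 0.28, 0.33, 0.35, 0.41, 0.47, 0.52]

# The radical social-constructionist prediction is a negative correlation
# (more egalitarian -> smaller differences); the finding described here
# is a positive one.
r, p = pearsonr(egalitarianism, sex_difference)
print(f"r = {r:.2f}, p = {p:.3f}")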
01:11:06.260
So that's why you stack up the countries by the egalitarian nature of their social policies,
01:11:12.020
is to control for the sociocultural influence. And so, you know, you've got to admit,
01:11:16.260
you've got to just think it through for a minute.
01:11:17.780
What you would have expected, theoretically, is that the societies that
01:11:24.900
are the least egalitarian would have the biggest differences between men and women,
01:11:29.380
and that as the societies got more and more egalitarian, those differences would get smaller,
01:11:33.540
and maybe even disappear. But that isn't what happened. Exactly the opposite is what happened.
01:11:39.700
They maximized in the most egalitarian societies. Therefore, the social constructionist position,
01:11:46.980
the radical social constructionist position is wrong. It's wrong. It's been refuted,
01:11:54.980
which is partly why the radical social constructionists have taken the legislative
01:11:58.900
route to impose their viewpoint. They lost the scientific war, but then, well, then we can just
01:12:03.860
attack science. It's like, well, it's science itself that's suspect. It's like, well, then quit using
01:12:08.500
your iPhones. Right? Well, if you're going to have your convictions, man, lay them out in your life. If you
01:12:15.060
think the scientific process is suspect and tyrannical and oppressive and all that, then quit using the
01:12:21.700
products that it produces. You don't get to have your cake and eat it too. Let's go to this young
01:12:27.860
woman right here. Yeah. And then we'll go to you. Hi, my name's Julia, and I recently read in the New York
01:12:34.020
Times an article about your comments on forced monogamy. What are your comments on how that was perceived by the
01:12:40.340
public and specifically the left? Great question. Well, I think it was enforced monogamy, though.
01:12:48.980
Enforced monogamy. Yeah. Yeah. Enforced monogamy. First of all, that's a technical term, by the way,
01:12:53.300
that's been used in the anthropological literature for 100 years. And the journalist, who was not stupid,
01:12:59.460
knew that perfectly well and reported the story the way she reported it despite that. But what's even more
01:13:06.260
surreal than that about that story is that if you're going to try to undermine someone's credibility,
01:13:13.060
like, and do it effectively, you should attribute to them an extreme view that some person
01:13:19.940
somewhere actually holds. Okay. And so the view that was attributed to me was something like,
01:13:26.500
I want to... Handmaid's Tale. The road to The Handmaid's Tale. Yes. I want to
01:13:29.780
find useless men and distribute women to them at the point of a gun so that they don't become violent.
01:13:36.020
It's like, no one has ever believed that, ever, anywhere, certainly including me. Except Margaret
01:13:41.780
Atwood. Well, right. That's right. She wrote a book about that. But so, you know, it's just absolutely
01:13:46.980
preposterous. And it's preposterous in a bunch of ways because she interviewed me for two days and we
01:13:52.580
talked about that for about two minutes. And it was a peripheral conversation. And it's an
01:13:58.340
anthropological truism generated primarily by scholars on the left, just so everyone's clear
01:14:04.420
about it, that societies that use monogamy as a social norm, which by the way is virtually every
01:14:10.820
human society that's ever existed, do that in an attempt to control the aggression that goes along
01:14:16.740
with polygamy. It's like, oh my God, how contentious can you get? It's like, well,
01:14:22.180
how many of you are in monogamous relationships? Well, the majority. How is that enforced?
01:14:26.820
I think this is a very polyamorous room, try to look around.
01:14:32.900
It was desperate. That's what it looked like to me. But the problem is it was also desperate
01:14:38.420
and amateurish. It's like she could have done a much better job with a much less extreme
01:14:42.740
characterization. It's like, oh yes, I want to take women at the point of a gun and distribute them
01:14:47.700
to useless men. It's so stupid. Partly because, like, if she would have been reasonable,
01:14:54.180
and she knew this too, one of the things I've told men, specifically, over and over and over and over,
01:15:01.940
is if you're being rejected by all the women that you approach, it's not the women.
01:15:19.220
Right. So, because, you know, these characters, like the guy
01:15:24.340
that mowed down those people in Toronto, he ends up blaming women. And he's blaming more than women,
01:15:29.140
in some sense. He's blaming the structure of being for producing women that reject him. It's like,
01:15:33.860
and so that's part of what makes him violent. It's like, well, what the hell's wrong with him?
01:15:38.100
You know, he's got it completely backwards. If everyone you talk to is boring,
01:15:45.140
it's not them. Right. And so if you're rejected by the opposite sex, assuming that you're heterosexual,
01:15:53.460
then you're wrong. They're not wrong. And you've got some work to do, man. You've got some difficult
01:16:01.060
work to do. And there isn't anything that I've been telling, let's say, young men, that's clearer
01:16:07.060
than that. You know, it's actually something I've been criticized for by people on the left,
01:16:13.300
because they think I don't take structural inequality, for example, and so forth, into account
01:16:18.180
sufficiently. What I've been telling people is, take the responsibility for your failure onto yourself.
01:16:24.180
And that certainly applies, well, especially when you're trying to form a relationship,
01:16:28.580
and you're getting rejected left, right, and center. It's like, that's a hint that you have some work
01:16:34.580
to do. Now, it also might be a hint that you're just young and useless, and why the hell would
01:16:38.660
any... Absolutely. Why the hell would anybody have anything to do with you? Because you don't
01:16:43.460
have anything to offer. You know, so, but that's rectifiable, and partly, even maturity rectifies
01:16:50.180
that. But so not only was that, what would you call it, accusation surreal and absurd, made by
01:16:59.300
a journalist who knew perfectly well what I was suggesting, and chose to misrepresent
01:17:04.340
it anyways. The conclusion that people derived from that is exactly the
01:17:10.340
opposite of what I've been suggesting, in particular to young men. So, it's absolutely preposterous.
01:17:15.940
Uh, yes, where the microphone is. Yes. Professor Peterson. Oh. I have... Hi, Bari. Hi. So good to see you up there.
01:17:28.100
You too. I teach students. I teach trans students, and I'm often asked to call people the singular
01:17:38.020
they. It started probably about four years ago. It struck me as very odd. I'm 52. And some of them,
01:17:48.740
you can tell that it's coming from a very deep place, and that's how they feel, and they deeply
01:17:54.100
need to be called they. Some of them, my horse sense says that they're kind of enjoying giving me
01:18:02.740
a certain shock, and that there's a certain theatrical aspect. It's my horse sense that
01:18:08.020
there's a certain épater le bourgeois aspect to it. I kind of feel it, and I'm probably right.
01:18:14.580
But I can't know. I'm a linguist. I'm a person. And my general feeling has been, whatever they ask,
01:18:23.300
just go with it, and let's change our usage of the pronouns, because we have a lot to do. Now, what you
01:18:29.700
said was interesting. You said that the way that you make the difference in deciding these cases
01:18:36.260
is based on the fact that you have psychological training, and you can tell. What I want to know is,
01:18:43.700
for my own elucidation, and also because I think many of us wondered, but then it kind of went by,
01:18:49.220
how do you know? Now, I want to specify. I'd rather you didn't recount the whole episode of how
01:18:57.780
ridiculously you were treated amidst that whole controversy. Sure. Three-quarters of the room
01:19:03.300
knows. I sympathize with you. I thought it was ridiculous. I want to know specifically, because
01:19:10.020
I'm a linguist. You have psychological training. How would you know? And if you hear, I'm almost done.
01:19:16.980
Oh yeah, no problem. If you hear a tiny bit of skepticism in my voice, you're correct. However,
01:19:23.460
I am open to being convinced. Based on your training, which is immense, how would you know
01:19:30.980
which students to discount as opposed to which ones to go along with? Okay. Well, first of all,
01:19:36.500
I wouldn't know, right? Which is partly why your skepticism is justified. But I have to be
01:19:42.820
responsible for what I say based on my willingness to take responsibility for my judgment. So I would be
01:19:48.180
willing to do that despite the fact that I might be wrong. But having said that, in any reasonable
01:19:54.020
situation, I would err on the side of addressing the person in the manner that they requested to
01:19:59.620
be addressed. But that's not the issue for me. The issue is that now I'm compelled by law to do so. It's
01:20:05.780
like, no, not doing it. Not now, because it's compelled by law. So that's the end of the game as far
01:20:12.820
as I'm concerned, because there is no excuse for compelling it by law. That's my position. And
01:20:19.540
I think there's all sorts of reasons for that. I don't think it was an isolated legislative
01:20:24.580
move. I think it's part and parcel of a whole sequence of legislative moves that have been made
01:20:29.700
and that continue to be made in Canada. I think it's an attempt by a certain,
01:20:36.500
what would you say, radical ideology to gain the linguistic upper hand, which I think is a
01:20:42.020
terrible thing to allow. So I had lots of reasons for rejecting the legislation. But it had...
01:20:48.740
...about how your psychological training would make the difference.
01:20:53.940
That's very interesting. We're talking about expertise here. And my ears pricked up when you
01:21:00.100
talked about how there is a way of thinking that would allow us to decide. I know some of my students...
01:21:05.540
No, there's a way of thinking that would allow me to decide for me.
01:21:08.580
No, us to decide for us. Surely you have a larger mission than just what's going on in your own
01:21:14.180
head. And I mean that. No, I had a perfectly straightforward mission, which was there's
01:21:17.620
no damn way I was going to say those words when I was compelled to by law. That was my mission.
01:21:22.260
You weren't trying to model for the rest of us a way of thinking it was really only about you?
01:21:28.180
Well, it was about me and the law. I thought the lawmakers had gone too far. They'd stepped out of
01:21:34.500
their appropriate territory into the domain of linguistic freedom. And as far as I was concerned,
01:21:39.860
I wasn't going to put up with that. And so if people were happy about that and wanted to follow the
01:21:43.940
example, that was fine with them. But for me, it was something... and that was the statement.
01:21:49.780
I'm not doing this. And then if people can draw their own conclusions from that, maybe they want
01:21:54.580
to do it. I mean, and I've spoken with no shortage of trans people. And my proclivity has been without
01:22:00.820
exception so far to address them in the manner that seems most socially appropriate under the
01:22:05.540
circumstances. Now, you asked a specific question, which was, do I have special expertise that I might
01:22:13.780
share with other people? Because you're doing Martin Luther. And I think that these issues are
01:22:19.220
a little subtler than those. And so I'm just waiting. Well, what makes you think that you're
01:22:23.940
doing the kids that are grandstanding any favours by going along with their manipulation? Because I
01:22:28.020
can't decide which ones those are. I just have my gut instincts and that's not good enough.
01:22:31.700
Look, fair enough. But you have a type one and type two error problem. So one error is that you don't
01:22:37.060
call students what they deserve to be called. That's one error. And the other error is that you
01:22:42.900
call students what they want to be called even though they don't deserve it. And so what you're
01:22:47.620
trying to do optimally is to minimize both those errors. And to do that, you have to take a middle
01:22:52.420
route. Now, what you've decided to do, and I'm not criticizing it, is you've decided to allow for the
01:22:57.700
possibility 100% of one of those errors because you think it's a less significant error. And you know,
01:23:03.860
you might be right, but it's not like you're acting in an error-free manner. You've just decided to
01:23:08.340
minimize one form of error at the expense of the other. Because I would say you're allowing,
01:23:13.460
what would you call it, attention-seeking and somewhat narcissistic undergraduates to gain the
01:23:17.540
upper hand over you in your class. Now, believe me, it's not a criticism. It's not a criticism. I
01:23:23.940
understand why you're doing it. No, but isn't John just erring on the side of
01:23:29.140
generosity and compassion? I have one more thing to say because I'm not going to take up any more
01:23:33.620
space. Okay. Are you saying that psychological theory has nothing to teach us about this? Because
01:23:39.940
you're talking around my question. You're gorgeously articulate. You're smarter than me. Does psychology
01:23:45.780
have anything to teach us or not? Yes or no? On this question. I don't think that it has anything to
01:23:52.340
teach. I don't think it has anything to offer that I could teach you without... let me think. So it's
01:24:00.500
just too complicated? No, no, it's not that. Well, it is that in part, because it's not
01:24:05.380
easy to articulate the principles, the unerring principles, by which you would make such a categorical
01:24:10.660
judgment, right? Because those are very situation-specific problems. You know, and it's part of
01:24:16.820
the problem of how to make a generic moral truth apply to a very individual situation.
01:24:24.740
And the problem in the sorts of situations that you're describing is generally the devil's in the
01:24:28.820
details, right? You have all these students, the ones that you just laid out, they vary in their
01:24:34.740
attitude towards their self-professed gender from the ones who are grandstanding to
01:24:40.500
some degree, let's say, to the ones that are very serious. And you have to make a judgment
01:24:44.580
in the moment that is dependent on the variables that present themselves in a very complex way in
01:24:49.300
that situation. And I understand why you took the pathway that you took, and it's perfectly reasonable
01:24:54.820
to do so. My point was that you don't minimize all the errors by doing so. It's fine. It's still a fine
01:25:01.300
way of approaching it. It isn't. My point was that, because of my psychological acumen, I would say,
01:25:07.060
that the experience that I've derived is that I would be comfortable in making the judgment and
01:25:12.180
taking the consequential risk. I'm not saying I'd be correct. That's not the same thing at all.
01:25:17.940
I'm willing to suffer the consequences of my error. That's not the same thing as being right.
01:25:23.940
And so if I feel that a student is manipulating me, then I'm not going to go along with it. Now,
01:25:27.860
I might be wrong about that and actually hurt someone who's genuinely asking for something that they need.
01:25:33.780
But I'm also, what would you say, sensitive to the error of allowing manipulation to go unchecked.
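The trade-off being described can be made concrete with a small sketch; the probabilities below are entirely made up for illustration:

# One error: refusing a sincere request (a miss). The other error: going
# along with a manipulative one (a false accept). A policy is defined by
# how often it complies in each case.
def expected_errors(p_sincere, p_comply_if_sincere, p_comply_if_not):
    miss = p_sincere * (1 - p_comply_if_sincere)
    false_accept = (1 - p_sincere) * p_comply_if_not
    return miss, false_accept

p = 0.9  # assumed share of sincere requests, purely hypothetical

# Always comply: no misses, but every manipulative request succeeds.
print(expected_errors(p, 1.0, 1.0))  # -> (0.0, 0.1)

# A fallible case-by-case judge trades some misses for fewer false accepts.
print(expected_errors(p, 0.8, 0.2))  # -> (0.18, 0.02)

Neither policy is error-free; they just distribute the error differently, which is the point being made.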
01:25:41.700
So, aha, you're back. No, John. Okay. And then there could be a two-hour
01:25:49.220
podcast about this on your wonderful podcast which everyone should listen to.
01:25:52.580
Everything you're saying is very well put, but it's awfully slippery. And I know you can do better.
01:26:10.740
Thank you, Bari. And thank you both for this really interesting conversation,
01:26:19.460
which is not like most of the conversations we've had here at the Ideas Festival.
01:26:24.180
This is my first one, so I have no idea. Great.
01:26:30.500
So, Dr. Peterson, there are a million questions that I'd like to ask you. I'm only going to ask one,
01:26:35.700
obviously. I'm a psychologist. I'm a social psychologist with a clinical background.
01:26:41.300
And the thing that I think I'd like to most hear about right now at this moment is
01:26:50.420
the very noisy, small percentage of people who oppose you. Have you thought about something they
01:26:57.540
might be right about? Something they might actually have a point about that you hadn't thought of,
01:27:06.180
but you've started to think they might actually have a point?
01:27:10.020
Um, I don't know if I've started to think about the point that they have that I didn't think about
01:27:16.340
before. I mean, people have been characterizing me as right-wing. It's like, I'm not right-wing,
01:27:24.580
so the characterization isn't very helpful. And one of the things I do all the time in my public
01:27:30.340
lectures is make a case for the utility of the left. So the case can be made quite rapidly.
01:27:39.860
If you're going to pursue things of value in a social environment, you're going to produce a
01:27:43.460
hierarchy. It's unavoidable, because some people are better at whatever it is that you value.
01:27:50.420
And so when that lays itself out socially, it will produce a hierarchy.
01:27:54.500
The hierarchy has a necessity if you're going to pursue the things of value, but it has a risk.
01:28:01.540
The risk is that it'll ossify and become corrupt. That's risk number one. And risk number two is that
01:28:06.420
when you produce the hierarchy, you're going to dispossess a number of people, because there'll
01:28:10.660
be lots of people in the hierarchy who aren't good at it, and they'll be dispossessed. So you need a
01:28:16.020
political voice for them. That's the left. So I make that case over and over. Now, what the right
01:28:23.140
does is say, yeah, but we still need the hierarchy. It's like, yes, you still need the hierarchy.
01:28:27.860
The reason we need the political dialogue is because we need the hierarchy, and we can't let
01:28:32.340
it get out of control. And the way to balance those two competing necessities isn't by only having the
01:28:41.940
hierarchy or dissolving the hierarchy. You have to live with the tension, because the situation keeps
01:28:49.300
shifting. So the way you live with the tension is by talking. You say, well, here's the current state.
01:28:55.060
The hierarchy needs to be tweaked this much because it's getting too tyrannical, and it's dispossessing
01:29:00.900
too many people. So we need to tweak it so that it's not as corrupt and so that it's a little bit
01:29:05.460
more open. And we have to talk about that all the time. And that's what the right and the left do. It's not
01:29:09.780
the only thing they do, because they also talk about the necessity of borders. That's the other fundamental
01:29:14.340
thing that they do. The dialogue has to continue so that we can have the hierarchies and utilize them as
01:29:20.420
tools without allowing them to descend into tyranny. Okay, so I made a case. I made a case on the web.
01:29:26.180
I did a talk at the University of British Columbia, a left-wing case for free speech, as if that's so
01:29:32.340
difficult to make. I mean, that's the sort of case that was made until like 2014 or something like that.
01:29:39.140
So the left-leaning types have all sorts of things that are correct to say. Now, the problem is,
01:29:47.460
one of the problems of the left, and this is another thing that I talk about all the time
01:29:51.140
in my public lectures, by the way, is we have a problem. We know how to put a box around the
01:29:56.020
extremists on the right. Basically, we say, oh, you're making claims of ethnic or racial superiority.
01:30:01.700
You're not part of the conversation anymore. What do we do on the left? Nothing. That's not good,
01:30:08.740
because there's an issue. Can the left go too far? Yes. When? Oh, we don't know. Oh, that's not a very
01:30:20.980
good answer. Now, you could say, well, then it's up to the moderate leftists to figure that out,
01:30:26.180
so they can dissociate themselves from the radicals, and it is up to them, but that's actually not a very
01:30:31.540
good answer either, because it's all of our problem. Centrists don't know how to
01:30:36.260
reliably identify the too-radical left. Right-wingers don't know how, and it's partly
01:30:41.780
because I think it's actually conceptually more complex. Like, with the radical right,
01:30:46.260
you can kind of lay it down to one dimension. Oh, racial superiority. Nope. Sorry, you're out of
01:30:51.780
the conversation. But that's Milo, who you mentioned before. Well, I didn't say I was a fan of Milo.
01:30:57.300
No, but you called him a prankster. Well, he is a prankster, mostly. Yeah, but he's also a racist.
01:31:02.500
Well, possibly, yeah. I haven't followed Milo that carefully, you know, so, and it's possible that he is.
01:31:08.500
I mean, it's hard to tell what Milo is exactly. He's a very complicated and contradictory person,
01:31:13.460
destined to implode, which is exactly what happened. Well, there's just no way you can be that
01:31:19.300
contradictory a person and manage it. It's just not possible. There were just too many things happening
01:31:23.860
at the same time for anyone to ever manage. So, but on the left, you know, I don't know what it is.
01:31:31.540
I think the left becomes toxic. One of the things that makes the left unacceptable is demands for
01:31:37.780
equality of outcome. It's like, nope, you crossed the line, man. That's not an acceptable demand.
01:31:42.580
And that's increasingly a moderate leftist demand as well. Now, I don't know,
01:31:46.900
it might be more complex. It might be that there's four things that you have to demand on the left
01:31:51.860
that all of a sudden makes what you're doing unacceptable. And we don't know what those four
01:31:56.180
things are. And so I actually think it's a conceptual problem as well as an ethical problem. We don't know
01:32:00.980
how to bind the necessary left so that the radicals don't dominate
01:32:09.300
counter-productively. And if you don't think that the radical leftists can dominate counter-productively...
01:32:19.300
No, that I agree with, but the idea that it's so clear on the right is not clear to me. I mean,
01:32:24.980
look at the Trump administration. Oh, I don't think that it's necessarily applied very
01:32:30.180
clearly, but at least conceptually it's more... But we can point it out. Well, we can point it out
01:32:35.380
better. So, and it, I mean... Do you think that's because of World War II? Yes.
01:32:42.740
Yeah, that helped quite a lot, actually. Yeah. Yeah. But the thing is that the communist
01:32:47.700
catastrophes don't seem to have made it any clearer on the left. Yes. And so, and now that's another thing that
01:32:53.140
the universities have done that's unacceptable, by the way, the intellectual class, I would say,
01:32:57.220
is that it's never come to terms properly with the fact that the intellectual class as a whole was
01:33:02.420
supportive of the communist experiment. And it was an absolutely catastrophic failure on every
01:33:09.300
measure of analysis. People say, well, that wasn't real communism. It's like,
01:33:13.380
you really shouldn't ever say that, because what it means is this. It's the most
01:33:17.620
arrogant statement that a person can make. It means that, had I been in the position of Stalin,
01:33:22.900
with my proper conceptualization of the Marxist utopia, I would have ushered in the utopia.
01:33:31.140
That's what it means. And it's like, no. First of all, if you actually were that good-spirited,
01:33:36.660
and you're not, by the way, if you were, you would have been eliminated so fast after the revolution
01:33:42.900
occurred that, well, it would have killed you. Because that's what happened. It's what
01:33:48.420
happened. Like, all the well-meaning people after the Russian revolution, the small minority of
01:33:52.900
people that were genuinely well-meaning, they were dead, like, within two or three years. So,
01:33:58.260
"that wasn't real communism." I think I'm seeing zero, as in zero questions,
01:34:01.300
zero time, zero something. One more question. Really? Okay.
01:34:07.140
I know, several people do. Can we take a few, and he'll answer them shortly? Like, maybe two more?
01:34:15.780
Okay. Let's go here, and the front row right here. Yes, but make it very, very short.
01:34:21.300
Very short. I just want to thank you for coming, and I'm honored. Very important. Great mentor, great help to
01:34:31.540
me, and a lot of people that I've been sharing your work with. I have two books here, and I would
01:34:36.900
like you to sign them for me. Okay, I could do that. Yes, he will do that after, I'm sure. Yes.
01:34:43.540
Professor Peterson, this is akin to the question that the young woman over there asked, but over,
01:34:50.260
if you could get in a self-reflective mode over the course of your life and career to date,
01:34:54.580
what could you say, honestly, to us about where you felt you've been most wrong, and what provoked
01:35:03.700
that self-assessment? I'm not thinking about how I've been wronged. You have to...
01:35:08.900
Not how you've been wronged. How you've been wrong, like an error in your thinking, where you sort of...
01:35:15.300
Oh, I was wrong about the Big Five personality theory for about five years, so I know that's not
01:35:19.940
very interesting to any of you, but I didn't like it at all. It was brute force, statistically derived.
01:35:26.260
It wasn't theoretically interesting. I didn't like it at all, but I was wrong about that,
01:35:31.220
because the science was well done. What else have I been wrong about? Well, you asked for profound
01:35:38.820
examples of being wrong, and in my field, that's actually a profound example, because that's one of
01:35:42.980
the major theories in the field. You're thinking about more interesting examples.
01:36:01.940
What have I changed radically? Oh, well, you know, when I was a kid, I was an avid socialist.
01:36:07.460
I was wrong about that. But more specifically, I was wrong about that, because I thought that
01:36:15.220
there were questions that I wanted answered that that doctrine could answer,
01:36:19.220
and it wasn't just that socialism didn't make the answers emerge. It was that it was the
01:36:24.900
wrong level of analysis. And so that was a major source of error. It was sort of the source of error
01:36:30.340
that the journalists who are going after me are making. They think everything's political. It's like,
01:36:34.420
no, it's not. There's lots of levels of analysis, and the political is one. And I learned eventually
01:36:39.460
that the political wasn't the right level of analysis for the questions that I was interested
01:36:43.380
in addressing. And that was a major error. It took me years to sort that out and to
01:36:51.060
figure out what the consequence was. I was wrong about the significance of religious ideas, because when
01:36:56.900
I was a kid, you know, 13 or so, I was smart enough at that point to see the
01:37:03.380
contradiction between an evolutionary account of the origin of human beings and, say, a scriptural
01:37:08.740
account. And so I just dispensed with that in this sort of new atheist move. And, you know, I threw the
01:37:14.660
baby out with the bathwater, and I was really wrong about that, like profoundly wrong about that. And I'm
01:37:19.620
sure I'm wrong about a bunch of other things, but I'll figure out what some of those are as we go ahead. So that's
01:37:24.740
three things. Those are big things. So, you know, if I thought more, I could come up with other
01:37:31.620
examples, but those are pretty big things that I was wrong about. Thank you all so much. Clearly
01:37:37.540
an hour and a half is not enough with you. But thank you so much for your time. Thank you very much.