The Jordan B. Peterson Podcast


507. The Insanity of Woke Psychologists | Lee Jussim


Summary

Lee Jussim is a distinguished professor of psychology at Rutgers University, and he has served as chair of the Department of Psychology and, separately, of the Department of Anthropology there. In this episode, we discuss his work on left-wing authoritarianism and why it's so important.


Transcript

00:00:00.000 So the podcast today took a turn back to the psychological, which is an improvement over
00:00:20.340 the political, as far as I'm concerned, generally speaking, likely because the topic of concentration
00:00:27.760 has more long-lasting significance, all things considered.
00:00:32.220 So in any case, I spoke today with Lee Jussim, and Lee is a distinguished professor of psychology
00:00:40.080 at Rutgers, and he's been the chair there of the Department of Psychology and separately
00:00:45.480 of Anthropology, which is a peculiar happenstance that we discuss in the podcast.
00:00:51.360 I was interested in Lee's work because there's a lot of trouble in the field of social psychology.
00:00:57.760 A lot of the claims of the field are not true.
00:01:00.480 Now, you've got to expect that in scientific inquiry because a lot of the things we believe
00:01:06.920 are false, and the whole reason that we practice as scientists is to correct those falsehoods.
00:01:12.680 And it's also the case that much of what's published is not going to be true because the
00:01:18.120 alternative would be that everything that was published was a discovery that was true, and
00:01:23.100 we'd be overwhelmed by novelty so fast that it would be untenable if that ever happened.
00:01:30.000 Lee is one of the rarer social psychologists who's actually a scientist, and he's done a
00:01:38.820 lot of interesting and also controversial work.
00:01:42.880 That's partly how you can tell it's interesting and valid because it also is controversial.
00:01:48.020 One of the things he's established, which is of cardinal importance, is that our perceptions
00:01:55.140 of other people are not mostly biased, right?
00:01:59.160 This is, the contrary claim is rather preposterous, which is that all of the categories that we
00:02:04.720 use to structure our interactions with other people are based on the power distortion of our
00:02:13.080 perceptions, let's say, which is essentially a Marxist and postmodern claim.
00:02:18.780 And Lee became infamous, at least in part, because he showed that our perceptions, our stereotypes,
00:02:25.020 if you will, are mostly accurate.
00:02:27.540 There are sources of bias, and they do enter into the process, and they're relevant, but that's
00:02:33.840 a very different claim than that the foundations of our perceptions themselves are indistinguishable
00:02:39.500 from the biases we hold as motivated agents.
00:02:42.820 And so, his work is extremely important.
00:02:45.740 It's core to the culture war that is tearing us apart.
00:02:50.200 So, if you're interested in the definition of perception, the relationship between perception
00:02:59.640 and reality, and the analysis of bias in a manner that's credible, then pay attention to this
00:03:08.260 podcast and get things cleared up.
00:03:11.940 So, I guess we might as well get right to the point.
00:03:14.640 And the first thing I'm curious about is, and this is something I think that can be like
00:03:18.860 fairly definitively laid at the feet of social psychologists, was that there was an absolute
00:03:24.100 denial that anything like left-wing authoritarianism existed, even conceptually, literally until
00:03:30.500 2016.
00:03:31.640 Yeah, that's right.
00:03:32.160 It was like-
00:03:32.740 For 60 years.
00:03:33.540 I came across that, and I thought, what do you mean there's no such thing as left-wing
00:03:37.900 authoritarianism?
00:03:38.720 We know that.
00:03:39.420 It's like, that's insane.
00:03:41.780 It's the same.
00:03:42.280 That's absolutely insane.
00:03:43.000 It's insane.
00:03:43.460 It's insane.
00:03:44.560 Yeah.
00:03:44.840 And then there were a couple of papers published in 2016 on left-wing authoritarianism in the Soviet
00:03:50.220 Union.
00:03:51.120 That was the first breaking of that.
00:03:53.960 Damn, I did a master's.
00:03:55.100 I supervised a master's thesis at that time.
00:03:57.600 It was a very good thesis on left-wing authoritarianism because we showed that there were statistical
00:04:04.320 clumps of reliably characterizable left-wing authoritarian beliefs that did, in fact, associate
00:04:13.180 statistically, and that identifiable groups of people with identifiable temperamental proclivities
00:04:19.080 did hold.
00:04:19.760 So I really wanted to follow up on that because it was a very rich potential source of new
00:04:25.900 information.
00:04:27.180 But my academic career exploded at that point.
00:04:29.940 It became impossible.
00:04:31.400 So-
00:04:31.540 Well, people have taken that ball and run with it.
00:04:34.260 Yeah, yeah.
00:04:34.820 So, well, tell us about it.
00:04:36.880 What have you found?
00:04:38.020 Well, okay.
00:04:38.540 How do you-
00:04:39.100 Let's start with some definitions.
00:04:40.700 Yeah.
00:04:40.980 Like, what constitutes left-wing as opposed to right-wing authoritarianism, let's say?
00:04:45.160 Right.
00:04:45.320 So, there are measurement issues across the board, but that is with respect to both left
00:04:53.500 and right-wing authoritarianism.
00:04:55.000 There are questionnaires, commonly used questionnaires to assess right-wing authoritarianism and to assess
00:05:03.540 left-wing authoritarianism.
00:05:04.660 They're different.
00:05:05.180 They're different.
00:05:07.460 The reason-
00:05:12.080 Let me give a little context.
00:05:14.160 For a long time, people tried to develop a non-partisan authoritarianism scale.
00:05:26.540 It's authoritarianism.
00:05:27.800 Right.
00:05:28.100 It's a psychological construct rather than a political one.
00:05:31.480 Right.
00:05:31.780 And they couldn't really do it.
00:05:33.240 Right.
00:05:33.400 Because one of the core toxic elements of authoritarianism is a motivation to crush, deprive
00:05:43.100 of humanity and human rights, one's political opponents.
00:05:46.800 Right.
00:05:47.240 So, you need to assess either right or left-wing authoritarianism vis-a-vis the attitudes towards
00:05:54.560 one's opponents in order to measure the construct.
00:05:57.280 Okay.
00:05:57.600 So, that's the-
00:05:58.200 Okay.
00:05:58.440 That's a very interesting-
00:05:59.660 That's a very interesting definition, though, because you're pointing to the fact that
00:06:03.260 arguably, and tell me if you think this is right, the core of authoritarianism, which,
00:06:10.900 as you said, can't be measured outside the political, isn't precisely political.
00:06:15.560 It's your attitude towards those who don't agree with you.
00:06:18.900 Yes, it is.
00:06:19.320 But you have to have some beliefs for that to be-
00:06:21.240 I didn't say can't.
00:06:22.100 I say they have not succeeded.
00:06:24.340 Actually, one of my current graduate students is, for her master's thesis, in the process of
00:06:30.420 trying to develop a non-partisan authoritarianism scale-
00:06:33.900 Based on that idea.
00:06:34.460 Yes, based on that idea.
00:06:35.800 I don't know if she's going to succeed, but yes.
00:06:36.920 Okay, so I'm thinking about that clinically.
00:06:38.420 It's like, well, that's where you'd start to look at overlap between cluster B personality
00:06:44.780 psychopathology, narcissism, borderline personality disorder, histrionic, because those are the
00:06:50.860 people who are very likely to elevate their own status at the cost of other people, including
00:06:58.560 their children, and those they purport to love.
00:07:02.160 So the first step to do that is to develop scales, survey questions, that
00:07:10.200 adequately get at left or right-wing authoritarianism, and then correlate them with things measuring
00:07:15.260 narcissism or sadism or whatever.
00:07:18.000 People have done that on the left, and it does correlate with left-wing authoritarianism.
00:07:24.400 I don't know, you know, you never know for sure the limits of your own knowledge, so I
00:07:29.600 don't know if anyone has even tried to do this on the right, or maybe they have, and it doesn't
00:07:35.640 actually correspond with narcissism on the right.
00:07:38.120 You know, it corresponds with other things on the right, but not so much with, well, if
00:07:43.840 there's evidence on narcissism correlating with right-wing authoritarianism, I don't know
00:07:48.260 it.
00:07:49.980 Nothing at the moment comes to mind.
00:07:52.080 I have a memory of a memory of something associated with that, because I've tried to follow the
00:07:57.380 literature, but I've definitely seen it emerge on the left.
00:08:01.800 Yeah.
00:08:02.020 Correlations on the right, well, from what I remember, and I'm vague about this because
00:08:09.000 I can't give you sources, is that dark tetrad traits stand out quite markedly as associated
00:08:15.620 with authoritarianism, and I thought that was somewhat independent of whether it was left
00:08:20.260 or right, but I can't provide the sources at the moment.
00:08:22.920 I review them in the new book I wrote, We Who Wrestle With God.
00:08:26.900 There's a lot of reference to the dark tetrad personality constellations and the political
00:08:32.300 manifestations.
00:08:33.760 But okay, but you've been studying it.
00:08:35.360 Okay, so when we looked at the way we developed our measure, because I'd like to know how you
00:08:40.300 developed yours, is we took a very large sample of political opinions and then factor analyzed
00:08:46.660 them to find out if we could identify first clumps of left-wing and clumps of right-wing
00:08:51.640 belief, which you can clearly identify.
00:08:54.140 And then to look within the left-wing constellation to see if there is a reliable subcategory of
00:09:00.520 clearly authoritarian proclivities.
00:09:02.760 And we found, you know, we found the biggest predictor of left-wing authoritarianism was low
00:09:06.900 verbal intelligence.
00:09:08.360 It was a walloping predictor, negative 0.40.
00:09:13.120 Immense predictor.
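A rough illustration of the kind of factor analysis described here, a minimal sketch assuming a respondents-by-items matrix of political opinion ratings; the column names, factor count, and simulated data are placeholders rather than the actual measure or results:

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items = 1000, 20
# Placeholder data; in practice these would be survey responses (e.g., 1-7 Likert ratings).
responses = pd.DataFrame(
    rng.integers(1, 8, size=(n_respondents, n_items)).astype(float),
    columns=[f"item_{i:02d}" for i in range(n_items)],
)

# Fit a small number of factors; items loading strongly on the same factor form
# a statistical "clump" of covarying beliefs (e.g., a candidate authoritarian subfactor).
fa = FactorAnalysis(n_components=3, rotation="varimax")
fa.fit(responses.values)

loadings = pd.DataFrame(
    fa.components_.T,
    index=responses.columns,
    columns=["factor_1", "factor_2", "factor_3"],
)
print(loadings.round(2))

# Per-respondent factor scores can then be correlated with other measures
# (e.g., a verbal-ability score) to estimate predictors of each factor.
scores = fa.transform(responses.values)
```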
00:09:14.740 Yeah.
00:09:15.120 So that's something to, because, you know, one of the things we talked about at the beginning
00:09:19.020 of the podcast was that some of these ideas sound good in the absence of further critical
00:09:25.360 evaluation.
00:09:25.780 So that you might say, well, if you lack the capacity for deep verbal critical evaluation,
00:09:31.760 what apparently moral ideas would appeal to you?
00:09:35.600 And, well, you can imagine that there might be a set of them and one of them would be, well,
00:09:40.460 don't be mean to people who aren't like you.
00:09:42.460 You know, which is a perfectly good rule of thumb.
00:09:44.800 Yes, absolutely.
00:09:45.280 That doesn't mean it's the, it doesn't mean that everyone who says that's what they're
00:09:49.260 for are, in fact, agitating on behalf of that principle.
00:09:52.620 Absolutely.
00:09:52.820 Okay, so back to your research.
00:09:54.100 Yeah, yeah, yeah.
00:09:54.540 Okay, so first of all, let me be clear.
00:09:56.280 We're, other than my student, Sonia, who is trying to develop a nonpartisan authoritarianism
00:10:03.020 scale, the work that we have done using either left-wing or right-wing authoritarianism scales
00:10:09.180 are scales developed by other people.
00:10:10.900 We haven't developed the scales.
00:10:12.640 So for left-
00:10:13.080 And do you think there are good scales now for left and right-wing authoritarianism?
00:10:19.240 Adequate scales?
00:10:21.100 Adequate for right, yes, and pretty good for left.
00:10:25.360 Even though left, the research on left is much more recent.
00:10:29.200 Right, right.
00:10:29.500 You might think it would be, therefore, less well-established.
00:10:33.020 There are two teams, one led by Luke Conway and a different one led by Tom Costello,
00:10:39.180 that have done a lot of very good work, both psychometric, sort of statistical assessment of how things
00:10:46.360 hang together, and also validity assessment of their two slightly different, somewhat different
00:10:52.260 scales.
00:10:52.960 You can tell if someone's belief is part of a set of identifiable beliefs.
00:10:58.840 If they hold that belief, the fact they hold that belief predicts reliably that they hold
00:11:04.040 another belief, right?
00:11:05.160 And then you want to see a pattern like that emerge across a lot of people.
00:11:08.440 Then you see that there are associations of ideas, right?
00:11:11.980 Those would be something like the manifestation of an ideology.
00:11:16.200 You want to see if that's identifiable, what its boundaries are, that it can be distinguished
00:11:20.600 from other clumps of ideas.
00:11:22.100 So left could be distinguished from right.
00:11:24.140 This can all be done statistically and very reliably.
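A minimal sketch of the belief-clustering check just described, assuming numeric survey responses: if holding one belief reliably predicts holding the others, the items correlate with each other and internal consistency (Cronbach's alpha) is high. The data below are simulated placeholders.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of numeric responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

rng = np.random.default_rng(1)
latent = rng.normal(size=(500, 1))          # a shared underlying attitude
noise = rng.normal(scale=0.8, size=(500, 6))
candidate_scale = latent + noise            # six items all driven by that attitude

# Inter-item correlations: holding one belief predicts holding the others.
print(np.corrcoef(candidate_scale, rowvar=False).round(2))
# High alpha suggests the items hang together as one coherent clump.
print(round(cronbach_alpha(candidate_scale), 2))
```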
00:11:27.140 Yeah.
00:11:27.440 Right.
00:11:27.760 Now, it wasn't done by social psychologists from the end of World War II till 2016.
00:11:32.440 Right.
00:11:32.880 Right.
00:11:34.320 Shameful lacuna in the history of political analysis within the psychological community.
00:11:40.560 It shocked me when I first discovered it.
00:11:42.680 Me too.
00:11:43.140 Me too.
00:11:43.380 It was shocking.
00:11:44.600 Yeah, yeah, yeah.
00:11:45.280 Really?
00:11:45.800 Talk about blind spots.
00:11:47.040 Oh, my God.
00:11:47.660 I mean, I know.
00:11:48.060 It's like, oh, do you guys miss Mao and Stalin?
00:11:51.160 Yeah, I know, right.
00:11:51.900 How do you miss that?
00:11:52.540 How do you miss that?
00:11:53.280 That's like, it's fairly obvious.
00:11:54.660 And then they're denied.
00:11:55.440 You're a social psychologist.
00:11:58.200 The biggest pathological social movements of the 20th century had their existence denied
00:12:06.580 for 70 years.
00:12:09.080 Right, right.
00:12:09.480 Mind-boggling.
00:12:10.240 It's mind-boggling.
00:12:10.960 It's mind-boggling.
00:12:11.740 It's just, I've never recovered from the discovery.
00:12:14.280 Yes.
00:12:14.600 It took me like a year to even believe it was true.
00:12:17.380 Okay, so you're using other people's questions.
00:12:19.920 Yes.
00:12:19.940 So what's your approach?
00:12:21.180 What are you, how are you investigating this?
00:12:22.340 So it does depend on the study.
00:12:23.940 So this is one good one that I think I can describe shortly, quickly.
00:12:29.900 The, we administered cartoons, like political cartoons, as if they were memes, like social
00:12:41.520 media memes, to an online sample, about a thousand people, and asked them how much they liked the
00:12:52.080 cartoons and memes, and we told them to vote for their
00:13:01.160 favorite, because the one that received the most votes, we would actually post on social
00:13:07.140 media.
00:13:07.700 Now, that was a lie.
00:13:09.140 It was a deception, and we explained that at the end.
00:13:11.600 But we wanted them to believe that when they were selecting something, that this was, as
00:13:18.020 close as we could get to a behavior.
00:13:19.360 It was close to them posting it.
00:13:21.780 They believed their vote could influence what we posted.
00:13:24.640 Right, right.
00:13:24.740 So it was a real-world outcome.
00:13:26.000 A real-world, quasi-behavioral.
00:13:27.580 Of something that would be promoted.
00:13:28.160 Rather than just like liking or disliking.
00:13:30.300 Right, right.
00:13:30.960 Or self-report that they believe something.
00:13:33.160 That's right.
00:13:33.620 So two of the, I'm going to describe two of the cartoons, which were quite a contrast
00:13:37.980 to each other.
00:13:38.440 We actually had a set, kind of like the first, and a set like the second.
00:13:41.880 Okay.
00:13:42.160 But I can describe the two quickly enough.
00:13:45.700 The first was actually a political propaganda cartoon from the Soviet Union.
00:13:52.380 We didn't tell them that, from the 1930s, 1940s, anti-American propaganda.
00:13:56.180 But we didn't tell them that, we just presented the cartoon, which showed a long-distance shot
00:14:03.100 of this, in the top panel was a long-distance shot of the Statue of Liberty.
00:14:09.040 The bottom panel was a close-up of her head and her crown, and the spires of the crown
00:14:16.720 were KKK members.
00:14:20.800 People dressed in KKK, whatever.
00:14:24.340 Right, right, right.
00:14:25.400 So the true nature of American liberty.
00:14:28.020 Yeah, American liberty is just.
00:14:29.340 Right, so that's a power reference.
00:14:29.840 It's typical Marxist rule.
00:14:31.620 Yeah, yes, right.
00:14:32.240 Okay.
00:14:32.880 That was one.
00:14:33.680 Yeah.
00:14:34.140 And then the second was an image of a diverse group of people.
00:14:42.920 People, different racial and ethnic groups, wearing clothes for different professions.
00:14:49.720 Might be a bus driver, or a businessman, or a secretary, or a teacher, or whatever.
00:14:54.140 There were a whole bunch of different kinds of people in obviously different roles, kind
00:14:59.800 of in a crowd with their arms around each other under an American flag.
00:15:04.840 Sort of pluralistic diversity.
00:15:06.860 That's kind of a humanistic form of diversity.
00:15:09.240 And then we simply asked people, you know, we asked them, which ones do you like the most?
00:15:12.760 Which ones do you want to share on social media?
00:15:15.760 And who?
00:15:17.360 So was that a benevolent left view?
00:15:19.940 Yeah, sort of.
00:15:20.340 And an authoritarian left view.
00:15:21.940 Yes, exactly.
00:15:22.760 Right, right.
00:15:23.120 That's right.
00:15:23.500 Yeah, yeah, yeah.
00:15:24.860 Demonizing America versus, you know.
00:15:26.820 We did find in our analysis that there were, there's a liberal left that's clear, and there's
00:15:31.800 an authoritarian left.
00:15:32.940 And the liberal left, this is part of our investigation, the liberal left isn't, how
00:15:40.080 did we figure that out?
00:15:42.160 The liberal left doesn't partake of the attitudes of the radical authoritarian leftists.
00:15:48.660 But they're the ones that I also think that they're sort of oblivious.
00:15:51.780 Denies.
00:15:52.380 Yes.
00:15:52.540 They're oblivious.
00:15:53.060 Yes, I think that's true.
00:15:54.020 Okay.
00:15:54.620 And that is what we found.
00:15:55.860 That's what we found in the study.
00:15:56.900 I got a research idea.
00:15:57.980 Yes, yes.
00:15:58.920 That's relevant to this.
00:15:59.760 Well, with regards to these questionnaires, it's something that I wanted to do.
00:16:02.400 You know, the large language models track statistical probability.
00:16:06.000 So you can take those left-wing questionnaire sets, and you can ask ChatGPT, here's an item,
00:16:12.500 or here's three items, generate 30 more, and it does it.
00:16:16.540 Yeah, yeah, yeah.
00:16:17.220 Right?
00:16:17.460 So if you wanted to improve the statistical reliability of the measures, so you can imagine,
00:16:22.240 take the measures that already exist, put them in clumps of three in ChatGPT, have
00:16:28.640 it expanded out to like 300 items, administer it to 1,000 people, and distill it.
00:16:34.880 Because the thing-
00:16:35.480 That's a great idea.
00:16:36.280 This will speed things up radically, because the thing about the large language models is
00:16:40.000 they already have the statistical correlations built in.
00:16:43.180 When you ask ChatGPT to generate 40 items that are conceptually like these four, that's
00:16:50.180 what it does.
00:16:51.160 Yeah.
00:16:51.360 It's not an opinion.
00:16:53.040 This is great.
00:16:53.580 You can use ChatGPT to purify the questionnaires, and you can do that on the left and on the
00:16:58.340 right.
00:16:59.020 And you can, it takes like 10 minutes instead of two years.
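A hedged sketch of the item-expansion idea being proposed, assuming the OpenAI Python client is installed and an API key is configured; the model name, prompt wording, and seed items are illustrative assumptions, and the administer-and-distill step would follow separately:

```python
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY is set in the environment

# Illustrative seed items, not the actual scales discussed here.
seed_items = [
    "People who disagree with my political views should be silenced.",
    "Some groups deserve to lose their rights because of their beliefs.",
    "It is acceptable to harass political opponents online.",
]

prompt = (
    "Here are three survey items intended to measure authoritarian attitudes:\n"
    + "\n".join(f"- {s}" for s in seed_items)
    + "\nGenerate 30 additional items measuring the same construct, one per line, "
    "phrased as statements a respondent could rate on a 1-7 agreement scale."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)

candidate_items = [
    line.lstrip("- ").strip()
    for line in response.choices[0].message.content.splitlines()
    if line.strip()
]
print(len(candidate_items), "candidate items generated")
# Next step (not shown): administer the candidates to a large sample and keep
# the items with the strongest item-total correlations and factor loadings.
```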
00:17:02.020 I'm going to bring this back to Sonia.
00:17:03.040 This is great.
00:17:04.060 Sonia is a fan of your podcast.
00:17:06.460 She's, I'm sure she's going to see this.
00:17:08.380 I'll probably talk to her before, but hi, Sonia.
00:17:11.420 We should be doing this with all the questionnaires.
00:17:13.720 Like, it's the same with narcissism.
00:17:15.380 Yeah, yeah.
00:17:15.540 See, the other thing you could do with ChatGPT is you could say, here's 20 items significant
00:17:21.400 of narcissism, okay?
00:17:23.600 Which is the central item, and can you generate 20 items that are better markers of that central
00:17:29.060 tenancy?
00:17:29.720 And the thing is, it can do it, because it's mapped the linguistic representations.
00:17:34.160 Yeah, yeah.
00:17:34.700 So all the factor structure is already built into the ChatGPT systems.
00:17:38.720 Like, all of it.
00:17:39.720 That's great.
00:17:40.140 Yeah, so, okay, so let, this is one of the things I would pursue if I still had a research
00:17:46.160 lab, right?
00:17:46.780 These things are hard to pursue without having that infrastructure in place.
00:17:50.660 But I think this would radically speed up the process of, and also make it much more reliable
00:17:58.380 and valid, right?
00:17:59.860 Yeah, I think that's right.
00:18:00.680 Yeah, yeah, okay.
00:18:01.820 Yeah, we'll have to try it.
00:18:02.760 We'll have to try it.
00:18:03.240 Yeah, yeah, yeah, yeah, yeah.
00:18:04.520 Try it out.
00:18:05.200 Absolutely.
00:18:05.620 Okay, absolutely.
00:18:06.260 All right, so back to, all right, so now you've got people voting for one,
00:18:10.120 yes, and it was exactly, as you described before we went down the large language model
00:18:15.900 path, that liberals who are not, so we use statistical regression, we can separate out
00:18:21.680 being liberal but not authoritarian from being a left-wing authoritarian but not liberal.
00:18:27.780 Yeah.
00:18:28.000 Liberalism predicted endorsement of the sort of humanistic diversity image.
00:18:32.100 Right, right.
00:18:32.280 The people together are under an American flag.
00:18:34.200 We're all, you know, we're all different, but we're all in it together.
00:18:36.520 We love America, blah, blah, blah.
00:18:37.860 As Christmas approaches, there's a beautiful opportunity to keep God at the center of this
00:18:43.300 season through Hallow, the number one prayer app.
00:18:45.740 They're launching a special Advent prayer challenge called For God So Loved the World
00:18:49.380 that promises to be a transformative spiritual journey.
00:18:52.500 This challenge features an incredible lineup of guides.
00:18:55.320 Survivalist Bear Grylls and Jonathan Rumi from The Chosen will lead you through a severe
00:18:59.460 mercy, a powerful story of faith and divine love.
00:19:02.220 You'll join Pastor Francis Chan and Jeff Cavins for profound scripture reflections, while
00:19:07.160 actor Kevin James explores the spiritual classic Divine Intimacy.
00:19:11.240 The experience is enriched with beautiful Advent music from award-winning artists including
00:19:15.260 Gwen Stefani, Lauren Daigle, and Matt Marr.
00:19:18.160 It's more than just daily prayer.
00:19:19.680 It's an invitation to experience God's love, mercy, and healing in a deeper way.
00:19:24.340 Don't wait.
00:19:25.060 Get three months of Hallow free at hallow.com slash Jordan.
00:19:28.300 Make this a time of putting God at the center and experience the good that comes from it.
00:19:32.000 That's hallow.com slash Jordan for three months free.
00:19:37.240 It was left-wing authoritarianism that powerfully predicted endorsement of the Soviet propaganda.
00:19:43.280 The Statue of Liberty is KKK.
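A minimal sketch of the regression logic described above, with liberalism and left-wing authoritarianism entered as simultaneous predictors of endorsement of each image; the variable names and simulated data are placeholders, not the study's data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "liberalism": rng.normal(size=n),
    "lwa": rng.normal(size=n),  # left-wing authoritarianism score
})
# Placeholder outcomes standing in for endorsement ratings of the two images.
df["endorse_unity_image"] = 0.5 * df["liberalism"] + rng.normal(size=n)
df["endorse_soviet_image"] = 0.6 * df["lwa"] + rng.normal(size=n)

# Entering both predictors at once estimates each effect holding the other constant,
# which is how being liberal can be separated from being a left-wing authoritarian.
X = sm.add_constant(df[["liberalism", "lwa"]])
for outcome in ["endorse_unity_image", "endorse_soviet_image"]:
    model = sm.OLS(df[outcome], X).fit()
    print(outcome)
    print(model.params.round(2))
```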
00:19:45.760 And so the questionnaires predicted that?
00:19:47.600 Yes, yes.
00:19:48.360 Oh, that's good.
00:19:49.140 That's good.
00:19:49.480 Yeah, yeah.
00:19:49.860 It was, it was, it's a great study, so.
00:19:52.400 Another thing you might want to do is take that questionnaire, do an item analysis with
00:19:57.040 regards to preference and rank order the items in terms of their predictive validity in relationship
00:20:01.840 to the cartoon because you might be able to see which of the items are central.
00:20:05.800 Are the most, yeah.
00:20:06.400 Yeah, especially if you saw that pattern across multiple cartoons.
00:20:09.120 Yeah, yeah.
00:20:09.940 Yeah, yeah.
00:20:10.460 Okay, okay, okay.
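A rough sketch of the item analysis just suggested, correlating each questionnaire item with the cartoon-endorsement outcome and rank-ordering items by predictive validity; the data and item names are simulated placeholders:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n = 1000
items = pd.DataFrame(
    rng.normal(size=(n, 10)),
    columns=[f"lwa_item_{i:02d}" for i in range(10)],
)
# Placeholder outcome standing in for endorsement of the propaganda cartoon;
# here it is driven mostly by the first three items.
endorsement = items.iloc[:, :3].mean(axis=1) + rng.normal(scale=0.5, size=n)

# Correlate every item with the outcome and rank them: items at the top are
# candidates for being "central" to the construct as it relates to this behavior.
item_validity = items.corrwith(endorsement).sort_values(ascending=False)
print(item_validity.round(2))
```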
00:20:12.220 Yeah, so that's one.
00:20:13.040 That's, yeah.
00:20:13.700 Yeah.
00:20:14.060 We did this kind of thing.
00:20:14.980 Well, how many studies have you done now on left-wing authoritarianism?
00:20:17.900 Well, it's a lot.
00:20:19.380 I mean, it's a lot.
00:20:20.700 And we include it in almost everything.
00:20:22.700 And we include measures of left and right-wing authoritarianism in most of the studies we've
00:20:27.620 been conducting.
00:20:27.840 Right, right, right.
00:20:28.680 So, the most recent splash, and I think that's what got your staff member interested in having
00:20:39.800 me on here, were three experimental studies assessing the psychological impacts of common
00:20:52.420 DEI rhetoric.
00:20:52.420 Right, right.
00:20:53.280 And we did it with three types of, three different kinds of DEI rhetoric.
00:21:02.580 Yeah, those are probably the studies that I ran across.
00:21:02.580 I remember that now.
00:21:03.520 Yeah, that's fairly recent, and they've made more of a splash than I would have expected.
00:21:08.740 Well, it's one thing to say that DEI programs work.
00:21:11.900 It's another thing to say they don't work, and it's a completely different thing to say
00:21:15.520 they do the opposite of what, yeah, that's not good.
00:21:18.620 And it seems to me highly probable.
00:21:20.780 So, you know, suicide prevention programs, the kind the government's always running, they
00:21:25.220 make suicide rates go up.
00:21:27.080 Well, because, why?
00:21:29.220 Well, you're advertising and normalizing suicide, right?
00:21:35.160 And you think, well, we're going to put up a prevention program.
00:21:38.100 It's like, first, are you clinically trained?
00:21:41.920 Second, did you do the research?
00:21:44.780 Third, did you ever stop to consider that your conceptualization of the problem might be
00:21:49.400 inadequate in relationship to its solution?
00:21:51.900 Yeah.
00:21:52.280 There's so many things like this that happen.
00:21:54.480 Clinicians have become, the research-oriented clinicians have become very, very sensitive
00:21:58.560 to such things because it's frequently the case that a well-meaning intervention will
00:22:03.160 make things worse.
00:22:03.960 Right.
00:22:05.140 And then you might ask why.
00:22:07.520 It's like, well, there's 50,000 ways something could be worse.
00:22:11.360 And like, one way, it could be better.
00:22:13.200 And so just, it's an overwhelmingly high probability that whatever you do to change something that
00:22:19.200 works makes it worse.
00:22:20.520 Yeah.
00:22:20.860 Right.
00:22:21.440 Okay.
00:22:21.840 So now, so do you, what was your evidence that the DEI interventions made?
00:22:28.540 What was made worse?
00:22:30.500 What interventions and what was your evidence linking them?
00:22:33.580 Yes.
00:22:33.840 Okay.
00:22:34.280 So let me walk through, let me qualify this a little bit.
00:22:42.500 We examined the rhetoric that is common to many DEI interventions.
00:22:50.400 ChatGPD could do a very good job of that, by the way.
00:22:53.240 Kind of.
00:22:53.940 The problem is a lot of the materials used in DEI trainings aren't publicly available.
00:23:00.960 So it's actually hard, and we can say they're common to things we had access to, but we
00:23:06.720 don't, a lot is not publicly available.
00:23:09.660 Okay, so where do you find it?
00:23:10.500 And that's an important limitation, well, hold on, that's an important limitation that
00:23:14.060 your listeners, viewers should understand.
00:23:16.780 It's not like we evaluated the effectiveness of the DEI training program instituted by the
00:23:23.040 HR department of the city of Milwaukee.
00:23:25.480 We didn't do that.
00:23:26.340 We took the intellectual ideas from three different kinds of sources, anti-racism rhetoric, anti-Islamophobia
00:23:35.040 rhetoric, and anti-caste, the Hindu caste system, anti-caste oppression rhetoric.
00:23:41.480 And there are, for race, we used passages from Kendi's How to Be an Antiracist and from
00:23:51.980 DiAngelo's White Fragility.
00:23:53.500 These books were widely required reading at colleges.
00:23:57.480 You know, sometimes she's paid $40,000 a session to come in and give her training.
00:24:06.120 So we also actually used this sort of large language model, this sort of language network
00:24:13.860 analysis to examine the extent to which this type of rhetoric was common throughout the
00:24:20.300 training materials we had access to, and it was very common.
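A rough sketch of one way to check whether a given passage of rhetoric resembles the language in a corpus of training materials, using sentence-embedding similarity; the model name, example texts, and use of the sentence-transformers library are assumptions for illustration, not the NCRI's actual language-network method:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # placeholder model choice

stimulus_passage = (
    "White people raised in Western society are conditioned into a "
    "white supremacist worldview. Racism is the norm."
)
training_excerpts = [
    "Participants will learn how unconscious bias shapes every interaction.",
    "Racism is ordinary, not aberrational, in our institutions.",
    "This module covers safe lifting techniques for warehouse staff.",  # unrelated control
]

stim_vec = model.encode(stimulus_passage, convert_to_tensor=True)
corpus_vecs = model.encode(training_excerpts, convert_to_tensor=True)
similarities = util.cos_sim(stim_vec, corpus_vecs)[0]

# Higher cosine similarity suggests the stimulus language is common in the corpus.
for text, score in zip(training_excerpts, similarities):
    print(f"{float(score):.2f}  {text}")
```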
00:24:24.120 Yeah, okay, okay, fine, fine.
00:24:25.660 So you used that as a validation technique.
00:24:27.000 You know what, I have on my phone, just so, I have this here.
00:24:30.140 So let me give an example from the race.
00:24:37.560 And this is just a short excerpt.
00:24:39.220 So people would read, so they would read, say, an anti-racist passage or a control passage.
00:24:46.100 The control passage in these studies, in two out of the three, was about how to grow corn
00:24:50.560 on the farm.
00:24:51.320 It was completely separate.
00:24:52.980 And this is only a short excerpt of a longer passage.
00:24:55.740 Yeah, okay.
00:24:56.160 White people, this is the anti-racism.
00:24:58.840 White people raised in Western society are conditioned into a white supremacist worldview.
00:25:04.540 Racism is the norm.
00:25:05.980 It is not unusual.
00:25:07.500 Okay.
00:25:07.720 So this went on for a full paragraph.
00:25:09.160 And it was quotes smoothed together with a little writing by us of Kendi and DiAngelo.
00:25:15.980 Yeah.
00:25:16.100 Okay.
00:25:16.640 All right.
00:25:17.220 So they then were presented with a very brief scenario in which a college admissions officer
00:25:28.000 interviews an applicant, and ultimately the applicant is rejected from admission.
00:25:35.520 Okay.
00:25:36.220 That's the whole scenario.
00:25:37.180 Yeah.
00:25:37.360 I mean, the words are slightly different because I'm doing that piece from memory, but that's
00:25:40.680 basically the whole scenario.
00:25:43.620 They were then asked a series of questions assessing how much perceived racism and bias
00:25:50.440 was-
00:25:51.840 Was the causal factor.
00:25:52.660 Yeah, yeah, yeah.
00:25:53.680 What, you know, on the part of the admissions officer.
00:25:55.560 Yeah, yeah, yeah.
00:25:56.060 Okay.
00:25:56.300 And what we found is when they got the Kendi-DiAngelo essay, they claimed to have seen or observed
00:26:07.900 the admissions officer committing more microaggressions, treating the applicant more unfairly, and that
00:26:18.860 the admissions officer was more biased.
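A minimal sketch of the basic between-condition comparison being described, the anti-racism passage versus the neutral control passage on perceived-bias ratings, with simulated placeholder numbers rather than the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Simulated 1-7 perceived-bias ratings; the real study used participant responses.
control_ratings = rng.normal(loc=3.0, scale=1.0, size=300)    # read the corn-growing passage
treatment_ratings = rng.normal(loc=3.6, scale=1.0, size=300)  # read the anti-racism passage

t_stat, p_value = stats.ttest_ind(treatment_ratings, control_ratings)

# Cohen's d as a simple effect-size estimate for the condition difference.
pooled_sd = np.sqrt((control_ratings.var(ddof=1) + treatment_ratings.var(ddof=1)) / 2)
d = (treatment_ratings.mean() - control_ratings.mean()) / pooled_sd
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, d = {d:.2f}")
```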
00:26:20.480 Okay, so I'm going to put on my devil's advocate hat.
00:26:25.660 Yes.
00:26:25.880 And I'm going to play Robin DiAngelo, despite wearing this Trump badge, and I'm going to say,
00:26:35.140 well, the effects of institutional racism are so pervasive that they even invaded your experimental
00:26:43.060 material, and the consequence of being exposed to the contents of my writing, speaking as
00:26:49.260 Robin D'Angelo, was that the scales fell from the eyes of your experimental subjects, and
00:26:54.380 they were able to perceive the racism that we claimed was there in a manner they couldn't
00:26:58.820 before.
00:26:59.240 So, yes, that is probably what DiAngelo would say.
00:27:04.520 Actually, I can tell you a little bit about what Kendi did say, because he was asked about
00:27:07.680 it.
00:27:08.020 He did not say that.
00:27:10.340 If someone said that, I would say, well, in our scenario, none of that was evident.
00:27:14.500 You had to read that into the scenario.
00:27:16.460 And that is the point.
00:27:17.780 How do you know that your own implicit bias didn't stop you from seeing the bias that was
00:27:24.020 there?
00:27:25.220 Because anyone can look at the scenario.
00:27:27.280 People didn't even have racial information about the admissions officer and the applicant.
00:27:33.160 So, you don't-
00:27:33.880 Okay, so you regarded it as highly probable
00:27:40.040 that they were reading that into the situation.
00:27:45.120 Okay, let me ask you a couple more technical questions.
00:27:47.240 Yes, yes.
00:27:47.620 Okay.
00:27:48.820 How much of this material were they exposed to before they did the evaluation?
00:27:52.520 About a paragraph.
00:27:53.440 Oh, just a paragraph.
00:27:54.700 Just a paragraph.
00:27:55.040 How soon before the evaluation?
00:27:57.280 Pretty soon.
00:27:57.900 Okay.
00:27:58.260 Do you have any idea what the lag time, like if you did a dose response study, so to speak?
00:28:04.340 Yeah.
00:28:05.080 Is there a decay?
00:28:06.880 Like how permanent are the effects?
00:28:08.780 I know I couldn't expect you to do all that in one study, but it's germane, right?
00:28:12.760 Yeah.
00:28:13.480 Well, it is kind of.
00:28:16.080 So, on the narrow issue of how long do the effects we observed in the study last, we didn't study that.
00:28:25.040 Right, right.
00:28:25.140 So, I have no answer to that.
00:28:26.420 Yeah, of course.
00:28:27.040 Okay.
00:28:27.200 But given that we observed the effects that we did, the sort of people concocting racism where there was no evidence of it, on the basis of a very minor intervention, that's like reading a single paragraph.
00:28:44.940 Right.
00:28:45.260 It at least raises the possibility that when people are in a culture or organizational context in which this type of rhetoric is pervasive,
00:28:57.540 that they are constantly being exposed or primed to think about race in these terms, and because of the steady diet of this kind of rhetoric, the effects are likely to be more enduring than anything we could possibly observe.
00:29:17.780 Right, right.
00:29:18.080 Fair enough.
00:29:18.500 Well, I would also say probably you evaluated some of the weaker systemic effects of that kind of rhetoric, because it isn't merely exposure to the rhetoric, it's the fact that post hoc detection of such things as microaggressions, let's say, is radically rewarded by the participants in those ideological systems.
00:29:40.040 Absolutely.
00:29:40.280 That being even more, that's a more powerful effect.
00:29:42.900 So, you got it with weak exposure fundamentally.
00:29:46.020 Okay, so.
00:29:46.400 Right, and no reward, right?
00:29:47.580 I mean, you're pointing to the social reward.
00:29:49.600 Exactly, exactly.
00:29:50.160 Yeah, yes, yes, yes.
00:29:51.660 So, I would say, the weakness of your intervention demonstrated the power of the rhetoric.
00:30:00.880 Okay, what did Kendi have to say about this?
00:30:02.980 He described us as racist pseudoscientists.
00:30:06.220 Oh, yeah, oh, yeah.
00:30:07.040 Okay, well, that pretty much covers the territory.
00:30:09.620 Did he say why?
00:30:11.940 Or was that unnecessary?
00:30:12.920 Yeah, you know, that quote—
00:30:15.800 How was he at wasting money?
00:30:17.140 My sense is that he was particularly good at that.
00:30:20.060 So, yeah, university money, spent counterproductively.
00:30:23.280 Well, I think most of his was from actually—what's his name?
00:30:25.640 Jack Dorsey from Twitter.
00:30:28.860 Yeah, I think gave him $10 million.
00:30:30.540 So, at least it wasn't state money, right?
00:30:32.500 Yeah, right, right, right.
00:30:33.660 Okay, well, then we can just let it go.
00:30:35.800 So, okay, okay.
00:30:40.220 Okay, you said that produced quite a splash.
00:30:42.560 Yes.
00:30:43.260 Including enhanced probability of being on this podcast, for example.
00:30:46.740 Yes, yes.
00:30:47.080 So, I'd followed your work for a long time before coming across that.
00:30:50.360 So, what effect has it had?
00:30:53.960 When was the study published, first of all?
00:30:55.860 Well, so—
00:30:56.700 And is it a sequence?
00:30:57.700 Is it a single study?
00:30:58.740 No, it's three studies.
00:30:59.660 Three studies.
00:31:00.120 So, it's essentially the same structure for an anti-Islamophobia intervention and an anti-caste oppression.
00:31:08.460 And it's essentially the same results.
00:31:09.980 There's little minor differences.
00:31:11.360 But it's essentially the same pattern of results.
00:31:13.660 They're not published.
00:31:15.180 So, these studies I conducted in collaboration with the NCRI.
00:31:21.460 NCRI is the Network Contagion Research Institute.
00:31:24.800 They are a freestanding research institute that started out mostly doing research along the lines of this sort of large language model stuff that you were talking about earlier, analysis of social media and analysis of radicalism, conspiracy theories, hate, sort of groups and individuals mobilizing online.
00:31:50.360 And they've done it with all sorts of stuff.
00:31:52.040 They've done it with COVID conspiracies.
00:31:54.300 They've done it with QAnon.
00:31:55.520 They've done it with Islamophobia.
00:31:57.640 They've done it with anti-Hindu hate.
00:31:59.260 They've done it with anti-Semitism.
00:32:02.020 They were the first group of any kind, as far as I know, in the summer of 2020, the height of the George Floyd social justice protests, which, as you remember, the rhetoric on the left—this is consistent with what you were talking about earlier, about how the left is—the reasonable left is in complete denial of the far left.
00:32:24.300 It is literally true that most of the protests were peaceful.
00:32:29.300 Whenever someone would present evidence of some protests not being peaceful at all, like firebombing a police station or capturing downtown Seattle or all sorts of, you know, setting—by creating, sort of setting the stage for lawlessness, you would have looting and robberies that weren't really part of the protests, but people were taking advantage of the sort of police-free zones and stuff.
00:32:58.900 When you would talk about that, the response was, this is all just—
00:33:03.900 Of course.
00:33:05.040 Oh, yeah.
00:33:05.580 I talked to moderate Democrats who told me that Antifa was a figment of the right-wing imagination.
00:33:12.100 Yes, right.
00:33:12.480 I thought—but, you know, there's something weird about that that's very much worth pointing out, I believe, is that we radically underestimate the effect a very small minority of people who are organized can have in destabilizing a society.
00:33:28.280 So, for example, in the flux of the aftermath of World War I, Russia was chaotic enough so that a very small minority of people, that would be the Bolsheviks, destabilized and captured the entire country.
00:33:43.840 So, even if the true radicals on the left are 3 percent, say, well, 97 percent of them are peaceful, it's like, fair enough, but you're—
00:33:55.120 Yes.
00:33:55.720 You're suffering from the delusion that a demented minority is harmless.
00:33:59.720 That means fewer abandoned carts and more completed sales.
00:34:28.320 Our marketing team uses Shopify every day to sell our merchandise, and we love how easy it is to add more items, shift products, and track conversions.
00:34:36.960 In today's market, growth-minded businesses need a commerce solution that's just as flexible and dynamic as they are.
00:34:42.900 Whether your customers are browsing your website, scrolling through social media, or wandering into your store, Shopify ensures you're ready to sell.
00:34:50.440 Ready to elevate your business?
00:34:52.100 Upgrade your business today and get the same checkout Daily Wire uses.
00:34:55.280 Sign up for your $1 per month trial period at shopify.com slash jbp, all lowercase.
00:35:01.720 Go to shopify.com slash jbp today to upgrade your selling.
00:35:05.660 That's shopify.com slash jbp.
00:35:10.920 And that's seriously wrong.
00:35:13.480 Okay, so this, enter the NCRI.
00:35:16.400 So, in summer of 2020, when this was all happening, the "mostly peaceful protests" framing, complete denial in the mainstream media
00:35:30.960 that the NCRI, this is the first project I did with them, produces an analysis finding that the far-left groups, not conventional liberals or Democrats,
00:35:48.360 but these far-left radical groups were exploiting the earnest commitment to anti-racism or the social justice on the part of people justifiably upset about George Floyd's murder
00:36:06.400 and the implications about that for racism beyond that, but these far-left groups were exploiting that to both gin up supporters and to mobilize online,
00:36:19.840 this is all occurring on social media, to capture protests, to ratchet up and inspire more aggressive violence at the protests.
00:36:31.040 So this, you know, that's exactly what you'd expect.
00:36:33.540 Of course that's going to happen.
00:36:34.840 I know, right.
00:36:35.520 Clearly.
00:36:35.940 Yes, right.
00:36:36.420 If you believe in criminals.
00:36:37.960 Yeah, right.
00:36:38.600 Right.
00:36:38.780 Right.
00:36:39.060 Okay.
00:36:39.240 So then NCRI, in this report, would link the increased online activity.
00:36:50.380 You know, there would be memes like ACAB, All Cops Are Bastards.
00:36:53.420 You know, so there'd be things like that.
00:36:54.880 Um, and some of the groups were actually using social media to coordinate their, you know, the sort of violent protest activities.
00:37:06.060 So, live, I'm making this up, but it was this kind of thing.
00:37:13.280 People would be, you know, at these protests on their phones.
00:37:16.660 They would get instructions from some sort of central place that the cops were over here, so everybody needs to go over there.
00:37:25.860 Mm-hmm.
00:37:26.020 And that's how they would have, so they were getting tactical instructions live via social media in addition to sort of ginning up the rhetoric to garner support and adherence.
00:37:38.280 Okay, so they, before they brought me on, maybe two or three months before, the NCRI had posted a report on how far-right groups do essentially the same thing.
00:37:50.940 You know, sort of mobilize online using memes and catchphrases and, you know, garner adherents, you know, gain adherents and stuff.
00:37:59.600 So, um, they bring me on, we do this thing, and this paper on the far-left, which really looks to me, it looked to me like the far-left groups were seeking to ignite an actual revolution.
00:38:15.860 Well, that is what they do.
00:38:17.600 I know, right.
00:38:18.420 Yes, this doesn't seem far-fetched, right?
00:38:20.360 They can dance in the ashes that way.
00:38:22.340 Right, yes.
00:38:22.800 Yeah, yeah, yeah.
00:38:23.440 The real criminal psychopaths, the short-term guys, the narcissists, they thrive in chaos.
00:38:29.280 Yeah, yeah.
00:38:29.720 Because their, their niche is chaos, right?
00:38:34.720 It was, yes.
00:38:35.940 So, I was kind of new to that at the time, but in hindsight, yes, absolutely.
00:38:40.700 Yeah, yeah.
00:38:40.820 Yeah, well, it's a shocking thing to know.
00:38:43.220 The NCRI, to no credit to me, I'm an academic, I'm a professor, I don't do this kind of thing, had access to journalists, which I don't, at the New York Times and Washington Post, who ran stories on this report.
00:38:59.760 And it was the first time there was any acknowledgement in the mainstream media that there was any level of violence and danger in the protests.
00:39:11.420 I felt really good about it.
00:39:12.420 This was like September 2020.
00:39:14.420 We did the work over the summer, the thing came out, but that report is not published in a peer-reviewed journal.
00:39:20.840 NCRI has its own website.
00:39:22.780 And they publish these reports, kind of like old times.
00:39:25.020 And that's where your studies were, were what?
00:39:27.640 Yes and no.
00:39:28.260 So, as of right now, that's where they are.
00:39:30.700 They're available on the NCRI website.
00:39:32.660 Okay, and who did that?
00:39:33.720 Was it a postdoc, a doctoral student?
00:39:35.640 There was a bunch of us. Well, it was me and two of my grad students, and both of my grad students also work closely with the NCRI.
00:39:46.940 Yeah.
00:39:47.200 And then there were a series of analysts at the NCRI, including their head researcher.
00:39:51.080 Okay.
00:39:51.300 So, a bunch of us are co-authors on this.
00:39:53.580 We have this, so I've not been working with them for several years.
00:39:57.100 And it took a while for us to get used to each other.
00:39:59.520 You know, their strength is this online social media, large language model, topic network stuff, you know, with an eye towards threats and conspiracy theories and hate.
00:40:11.500 And my strength is conventional social science surveys.
00:40:14.100 Yeah, that's a nice overlap.
00:40:15.540 It is.
00:40:15.860 Yeah, it's a nice overlap.
00:40:16.740 We needed to figure out the best synergies.
00:40:19.360 Yeah, yeah, no doubt.
00:40:19.680 It took a while, but we have this rhythm.
00:40:22.100 So, why that approach with regards to the dissemination of this information, this particular experimental information, rather than the more standard journal approach?
00:40:31.040 Yeah, so, one of the things, first, let me give context, a little more context.
00:40:37.620 So, our rhythm is, first, we post stuff as a, essentially as a white paper, as a report on the NCRI site.
00:40:43.900 It gets some attention, some public vetting, we get some feedback on it, and then we scale it up for peer review.
00:40:49.580 Well, that's not unlike doing a pre-release on a convention.
00:40:52.680 That's like doing a pre-print.
00:40:54.240 Yes, okay.
00:40:54.860 Now, it's a little different.
00:40:57.460 It's different.
00:40:58.340 It is like, I have taken to calling it a homespun pre-print.
00:41:02.300 And here's why I call it a homespun pre-print.
00:41:05.160 It's like a pre-print in that it's a report of empirical studies that is posted online that haven't been peer-reviewed.
00:41:12.120 Yeah.
00:41:12.320 It is unlike a conventional pre-print in that it is, and this is the answer to your question, why did we do it this way rather than make it for peer review?
00:41:20.620 Yeah, yeah, yeah.
00:41:20.860 It's part of the answer.
00:41:22.260 It is, even though some of it is highly technical, a lot of the worst of the technical stuff is stripped down so that it is comprehensible to the lay intelligent audience.
00:41:35.660 Mm-hmm.
00:42:05.660 In this day and age, a two-year lag to publication, it's completely insane.
00:42:10.780 It's crazy.
00:42:11.460 That's right.
00:42:11.960 You spend 30% of your time writing grant applications that go nowhere, and there's a two-year lag to publication of papers that almost no one is likely to read.
00:42:20.540 That's right.
00:42:21.060 Yeah, that's right.
00:42:21.680 How the hell have you not been cancelled?
00:42:25.800 Why is that?
00:42:26.920 Because it's weird.
00:42:27.980 There have been repeat attempts to cancel me that have failed.
00:42:31.200 Okay, well, so why don't you tell me and everybody else, first of all, why you're, what would you say, why you so richly deserve cancelling?
00:42:41.680 That's the first issue.
00:42:44.300 And then the next issue, which is of equal importance, is how you've managed to not have that happen.
00:42:50.380 Because that's actually really hard.
00:42:53.080 So, because if people try to cancel you, especially given the things that you've researched and have insisted upon and said, if people try to cancel you, there's an overwhelming probability in academia in particular that that will be successful.
00:43:09.700 So, let's start by talking about the sorts of things that you've been pointing to in, well, in academia in general, and then more specifically in psychology and social psychology.
00:43:22.660 Sure.
00:43:24.500 There are probably too many of these attempts for me to go through, so I'm going to pick one.
00:43:29.440 Yeah, pick the cream of the crop.
00:43:30.540 Yeah, this is probably the cream of the crop.
00:43:32.080 Okay.
00:43:32.260 It is, I refer to, so I have a very active Substack site, Unsafe Science, and I have several posts on this.
00:43:45.100 You can find it under the POPs Fiasco Racist Mule Trope.
00:43:51.620 There's a whole series on this.
00:43:52.980 Okay, so what is POPs?
00:43:54.360 POPs is Perspectives on Psych Science, one of the very prestigious journals within the field of psychology for publishing reviews and commentaries and the like.
00:44:04.800 The short version is that I was invited by the editor to do a commentary on a main paper that was critical.
00:44:20.040 Well, the main paper by a psychologist named Hommel, Bernhard Hommel, was critical of prior work in psychology advocating for diversity in a variety of ways.
00:44:33.300 The nature of his critique was that much of the rhetoric in psychological scholarship around diversity was narrowly focused on, and the terms are constantly changing, underrepresented, minority, minoritized, disadvantaged, oppressed groups.
00:44:56.780 And that from a scientific standpoint—
00:44:58.800 Intersectionally—
00:45:00.020 Yeah, yeah, right, exactly.
00:45:02.360 That's right.
00:45:02.860 And so—
00:45:04.220 Intersectionally deprived.
00:45:05.800 And there was a recent article which argued that on scientific grounds, we need to do exactly that.
00:45:13.400 Hommel's critique was really multiple, but two of his key points were that, well, for some types of things it's irrelevant.
00:45:22.140 Diversity is irrelevant for certain kinds of theoretical scientific tests.
00:45:25.140 And then the other point is that if diversity matters, it matters for scientific purposes, it matters extremely broadly, and it's not restricted to underrepresented groups.
00:45:37.540 And a very simple example would be if you would compare a study based on undergraduate psychology students versus one based on a nationally representative sample.
00:45:47.320 The research based on the nationally representative sample is going to be broader and more generalizable and more credible.
00:45:54.680 A nationally representative sample represents the population.
00:45:59.360 It's not focused entirely on any subset of the population.
00:46:02.500 That would be a very simple example of Hommel's point.
00:46:05.500 I was asked to do a commentary.
00:46:07.860 I did.
00:46:09.220 And there's—okay, there's a distinction there, too, that we should draw.
00:46:12.040 Clearly, it's the case that if you want to draw generalizable conclusions about human beings from a study, that the study participants should be a randomly selected and representative sample of the population to whom you're attempting to generalize, obviously, because otherwise it doesn't generalize.
00:46:31.540 That's very different than making the case that underrepresented groups should be preferentially hired or employed or promoted or unspecified.
00:46:43.420 Yes, completely different, completely different, completely different.
00:46:45.980 That was sort of part of Hommel's critique.
00:46:48.140 Yeah, yeah.
00:46:48.900 But I guess so, again, the editor invited me to publish a commentary on this exchange.
00:46:55.080 And the title of my commentary was—it eventually got published—is Diversity is Diverse, because there's lots of different kinds of diversity.
00:47:05.820 And if we're arguing for diversity on scientific grounds, then what the science needs to be is fully representative of the—whether it's the participants or the topics, or it goes way beyond oppression.
00:47:21.900 I mean, oppression is a part of that and shouldn't be excluded, but it's only one piece of that.
00:47:27.900 So I basically was in agreement with Hommel's critique and augmented it.
00:47:32.440 As part of that, I critiqued progressive academic rhetoric around diversity as disingenuous and hypocritical.
00:47:45.800 And the way I framed that, the way I captured it, was using a quote from Fiddler on the Roof.
00:47:53.100 So in Fiddler on the Roof, which is about early 20th century Jewish life in the—
00:47:59.280 One of the great movies of all time.
00:48:01.040 Right, yeah, yeah.
00:48:01.440 Everyone should watch it.
00:48:02.360 And probably its most famous song is Tradition, which is about the importance of tradition and keeping the community together.
00:48:09.260 But then there were exceptions.
00:48:11.120 So there's an interlude in the song Tradition where the—whatever—the villagers get into an argument because one chimes in, there was the time he sold him a horse but delivered a mule.
00:48:26.220 And I used that to frame my discussion of progressive disingenuousness around—
00:48:34.980 They all disintegrate into factions, arguing in the middle of this song about unity, you know, when that comes out.
00:48:41.360 When that comes out, that's right.
00:48:42.520 Yeah, yeah.
00:48:42.900 That's right.
00:48:43.580 And I argued in this paper that the way and the reason that's a good metaphor for progressive rhetoric around diversity is that diversity sounds—you know, superficially, it sounds good to a lot of people.
00:48:56.200 Right, because who doesn't want to be included?
00:48:58.720 No matter what group you're a member of, the idea that someone is advocating for diversity, you—you know, it's kind of appealing.
00:49:07.960 And so, for example—
00:49:09.440 Yes, with two seconds of thought, it's a positive thing.
00:49:12.100 Yes, with two seconds of thought, it's a positive thing.
00:49:14.180 Or that people should be free of arbitrary exclusion.
00:49:17.660 Yeah, of arbitrary exclusion.
00:49:18.520 That's right.
00:49:19.240 That's right.
00:49:19.780 And, for example, one thing you might think—one might think if one had a little bit of knowledge is that, especially in the social sciences and humanities, but really in academia writ large, there's hardly anyone who is not left of center.
00:49:33.800 I mean, the range goes from sort of center left to the far, far left.
00:49:38.060 I have a former—
00:49:40.020 Yes, and that's very well documented.
00:49:41.620 It's very—
00:49:42.140 No one disagrees with that claim.
00:49:45.080 Well, so, Nate Honeycutt, my former student, he's now a research scientist at FIRE, the Foundation for Individual Rights and Expression, did a dissertation on this.
00:49:54.480 He surveyed almost 2,000 faculty nationwide at the top colleges and universities and found that being on the left at all was about 90, 95 percent, but 40 percent self-identified as radicals, activists, Marxists, or socialists.
00:50:18.520 Yeah.
00:50:19.320 40 percent.
00:50:20.220 So, this is the extreme left.
00:50:22.160 This is no longer just like Democrats or liberals.
00:50:25.780 This is nearly half on the far left.
00:50:28.640 And that was a sample of how many people?
00:50:30.180 It was almost 2,000.
00:50:31.280 Yeah.
00:50:33.160 Are you tired of feeling sluggish, run down, or just not your best self?
00:50:36.640 Take control of your health and vitality today with Balance of Nature.
00:50:39.940 With Balance of Nature, there's never been a more convenient dietary supplement to ensure you get a wide variety of fruits and vegetables every day.
00:50:46.440 Balance of Nature takes fruits and vegetables, they freeze-dry them, turn them into a powder, and then they put them into a capsule.
00:50:51.820 The capsules are completely void of additives, fillers, extracts, synthetics, pesticides, or added sugar.
00:50:57.760 The only thing in Balance of Nature fruit and veggie capsules are fruits and veggies.
00:51:01.700 Right now, you can order with promo code JORDAN to get 35 percent off your first order, plus get a free bottle of fiber and spice.
00:51:07.840 Experience Balance of Nature for yourself today.
00:51:09.980 Go to balanceofnature.com and use promo code JORDAN for 35 percent off your first order as a preferred customer, plus get a free bottle of fiber and spice.
00:51:17.600 That's balanceofnature.com, promo code JORDAN for 35 percent off your first preferred order, plus a free bottle of fiber and spice.
00:51:24.520 Now, how many faculty members at colleges and universities do you suppose there are in the United States, approximately?
00:51:35.700 Do you have any idea?
00:51:36.780 I have looked into this.
00:51:38.100 It's hundreds of thousands.
00:51:39.840 I don't know the number.
00:51:40.980 Okay, so when I say—
00:51:41.460 I don't remember.
00:51:42.340 I have looked into it.
00:51:43.200 It's very large.
00:51:43.860 So that means there's 80,000 academic activists who are being employed full-time in the United States.
00:51:50.260 I don't know if you could go that far, because he looked at the top colleges and universities.
00:51:54.680 If you wanted to generalize to all colleges and universities, you would have to include community colleges and, you know, primarily liberal arts.
00:52:02.460 Do you think they'd be less?
00:52:06.460 I don't know.
00:52:07.960 Biased?
00:52:07.960 Okay, we don't know.
00:52:08.860 I don't know.
00:52:09.340 Okay, so it's not 80,000, but it could easily be 50,000.
00:52:13.020 Yes, yes, yes.
00:52:14.020 Okay, so that's a number I want to return to.
00:52:16.240 Okay, okay.
00:52:17.240 Yeah, because there's implications.
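To make the arithmetic in that exchange concrete, here is a rough back-of-the-envelope sketch in Python. The total-faculty figure is a stand-in for "hundreds of thousands," and applying the survey's 40 percent rate outside the sampled top institutions is an assumption, not something the survey itself establishes.

```python
# Back-of-the-envelope sketch of the estimate discussed above; the inputs are assumptions.
total_faculty = 200_000   # hypothetical stand-in for "hundreds of thousands" of U.S. faculty
far_left_rate = 0.40      # share self-identifying as radicals, activists, Marxists, or socialists
                          # in Honeycutt's sample of ~2,000 faculty at top schools

print(f"Naive extrapolation: {total_faculty * far_left_rate:,.0f}")  # 80,000

# If the rate is lower outside elite institutions, the estimate shrinks but stays large:
for rate in (0.30, 0.25):
    print(f"At {rate:.0%} of {total_faculty:,} faculty: {total_faculty * rate:,.0f}")
```

At 25 percent of 200,000 faculty the figure is 50,000, which is roughly where the conversation lands.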
00:52:19.200 So one might think, if someone is advocating for diversity, given the extreme political skew and given the extent to which academia deals with politicized topics,
00:52:28.400 that there would be an embrace of people, an attempt to bring into academia professors, researchers, scholars, teachers from across the political spectrum.
00:52:39.000 That has never gotten any traction in academia, and in fact, it's gone in the complete opposite direction.
00:52:44.720 If you go back 50, 60 years, I think it's fair to describe the way academia has functioned as producing a slow-moving purge of conservatives, and even people in the center and libertarians, from its ranks.
00:52:56.820 So my point in this commentary was using things like that as examples of the disingenuousness of progressive rhetoric around diversity, that it wasn't really diversity in the broadest sense.
00:53:12.980 It was a very narrow—
00:53:15.000 See, that's actually the fundamental flaw of intersectionality, is intersectionality devolves into combinatorial explosion almost immediately, right?
00:53:24.600 Because once you start combining the categories of oppression—your list of combinations: black, women, gay, etc.—every time you add another variable to that multiplicative list, you radically decrease the pool of people who occupy that intersection, right?
00:53:45.140 Right.
00:53:45.280 But there's also an infinite—there's literally an indefinite—this is your point—an indefinite number of potentially relevant group categories, right?
00:53:55.480 Yes, oh, yes, yes.
00:53:55.920 So how in the world are you going to ensure that every possible combination of every possible group category is—you can't even measure it, much less ensure it?
00:54:06.440 Yeah, you can't do that.
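To make the combinatorial-explosion point concrete, here is a minimal sketch; the dimensions and the number of categories in each are purely illustrative assumptions, not anyone's official taxonomy.

```python
# Each added identity dimension multiplies the number of intersectional "cells"
# and shrinks the average number of people in any one of them.
population = 330_000_000  # rough U.S. population, just for scale

dimensions = [
    ("race", 5),
    ("sex", 2),
    ("sexual orientation", 6),
    ("gender identity", 10),
    ("disability status", 2),
    ("religion", 8),
]

cells = 1
for name, categories in dimensions:
    cells *= categories
    print(f"after adding {name:<20} {cells:>7,} cells, "
          f"~{population / cells:>12,.0f} people per cell on average")
```

With just these six made-up dimensions there are already 9,600 cells, and the list of candidate dimensions is open-ended, which is the point being made above.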
00:54:07.160 So there's this underlying insistence, which you're pointing to, I believe, that there are privileged categories of oppressed people, right?
00:54:16.180 And it's a weird thing, right?
00:54:18.120 It's like, why is it that it's race and sex?
00:54:22.260 And you might think, well, those are the most obvious differences between people, and maybe you can make that case.
00:54:27.040 But then it's also gender, which is a very weird insistence, because whether the idea of gender is a valid—I don't think the idea of gender is a valid idea at all.
00:54:37.500 I think it's super—it's, what would you say?
00:54:40.280 It's a warped misconceptualization of everything that's captured by temperament much more accurately and precisely.
00:54:47.640 We can talk about that.
00:54:48.780 But also sexual orientation, I can't see at all why that would emerge as a privileged category of oppression alongside something like sex.
00:54:57.680 Like, it could, but it's not obvious why.
00:55:00.260 Okay, so you're pointing some of—and then you said, well, there's important elements of diversity, especially intellectually.
00:55:07.800 That's right.
00:55:10.280 Like adequate distribution of political or ethical views across the spectrum—that's completely off the table.
00:55:16.480 Yeah, it's completely off the table.
00:55:17.440 It's like rejected.
00:55:18.680 It's worse than off the table.
00:55:20.200 So that was my paper.
00:55:23.920 And there's more to the story than this, but to keep this succinct, eventually what happened was almost 1,400 academics, probably mostly psychologists, signed an open letter denouncing—
00:55:45.440 So my paper was one of several commentaries.
00:55:50.480 All of the commentaries were critical of this oppression framing of diversity.
00:55:56.220 All of them, yeah.
00:55:57.120 All of them were.
00:55:57.760 Okay.
00:55:58.020 And this was in PoPS?
00:55:59.400 It was in, yeah, Perspectives on Psych Science, yes.
00:56:01.440 Okay, so I just want to provide people some background on this, and correct me if I get any of this wrong.
00:56:06.120 So scientists publish in research journals, and they generally publish articles of two types.
00:56:13.100 One type would be a research study, an actual experiment, let's say, or a sequence of experiments.
00:56:20.120 And the other, I guess there's two other types.
00:56:23.140 There's reviews and there's commentaries.
00:56:25.460 And then there's a variety of different journals that scientists publish in, and some of those cover all scientific topics—Science and Nature.
00:56:35.780 The world's premier scientific journals used to do that before they became woke institutions.
00:56:40.480 And then there are specialized journals that cover fields like psychology, and then there are sub-specialized journals.
00:56:47.920 And the less specialized the journal, all things considered, the more prestigious it is.
00:56:54.120 Anyways, that's where scientists publish.
00:56:57.160 And they do publish commentaries on each other's material, especially if it's a review of something contentious or something that's emerging in a field.
00:57:04.800 And now, this journal, Perspectives on Psychological Science, there's also an interesting backstory here, because that's an American Psychological Society journal.
00:57:14.640 Yes.
00:57:15.100 Okay, so there's two major organizations for psychologists, especially research-oriented psychologists in North America.
00:57:24.100 There's the American Psychological Association, which has its journals.
00:57:28.240 And then a newer organization, which is now a couple of decades old, American Psychological Society.
00:57:34.020 And the American Psychological Society was actually set up, at least in part, because the American Psychological Association had started to become ideologically dominated, particularly in the leftist and progressive direction, and that that was having an arguably negative effect on research, reliability, accuracy, and probability of publication.
00:57:57.300 That was set up 25—
00:57:59.440 Okay, so that's a little off-kilter.
00:58:00.920 Okay, okay.
00:58:01.480 Yes, I do know this history.
00:58:02.760 So, in the first place, APS started out as the American Psychological Society.
00:58:08.580 They changed their name to the Association for Psychological Science in an attempt to be broader.
00:58:14.420 And what triggered the breakaway of APS from APA in the 90s, maybe this was?
00:58:23.800 90s, yeah, I think so.
00:58:25.000 Yeah, it wasn't political.
00:58:26.980 It was that the scientists who formed APS believed that APA was too focused on clinical practice and practitioner issues.
00:58:38.300 Right.
00:58:38.500 And it was becoming unscientific, but not because of the politics.
00:58:42.400 Well, okay, so let—yeah, yeah, yeah, fair enough.
00:58:45.620 But, see, I was watching that happen because I knew some of the people who were setting up the APS at the time.
00:58:51.840 And my sense, though, also was that part of the reason that the APA was tilting in a more and more clinical direction was because there was an underlying political ethos that was increasingly skeptical of science as the privileged mode of obtaining valid information.
00:59:10.640 Yeah, I think that's fair.
00:59:11.520 I think that's fair, yes.
00:59:12.540 Okay, okay, okay.
00:59:12.820 So the proximal cause was the overemphasis on the clinical.
00:59:18.320 Yes.
00:59:18.560 But, you know, it's also the case, as you've seen, that certainly clinical psychology and the whole therapeutic enterprise has taken a cataclysmic turn in the woke direction, especially in the last 10 years.
00:59:34.540 It's been absolutely devastating, and I don't know, is social psychology—I think you could probably say the same thing about social psychology.
00:59:42.880 Maybe you could say a deep—maybe that's even worse.
00:59:45.740 Anyways, we can get into that.
00:59:46.620 Well, it's probably worse politically, but it's probably not worse practically because social psychologists don't really—aren't responsible for helping anybody get on with their lives.
00:59:55.020 I mean, they're responsible for teaching students and things.
00:59:58.020 They're not—typically, they're not—
00:59:59.880 They are responsible for implicit bias.
01:00:02.360 That's all we can—you're going to get me.
01:00:04.240 You are going to get me distracted.
01:00:06.100 You started by asking me to tell the story of my cancellation attack.
01:00:08.840 Yeah, yeah, yeah.
01:00:08.860 Let's continue with that.
01:00:09.840 Yes, that's it.
01:00:10.720 Okay, so now you're—there's 1,400 people who write a letter—
01:00:14.820 Yes, declaring all of us, me as well as the other commentators, we're all racists.
01:00:22.380 Yeah.
01:00:22.960 The editor should be fired, and our articles should be taken down.
01:00:26.900 They shouldn't be published.
01:00:27.060 Right, so I presume that these 1,400 are a subset of the 50,000 activists that are out there right now.
01:00:34.240 I'm curious about the 1,400, too, because you often see legacy media headline news that 1,400 scientists have signed some petition.
01:00:44.020 Yes.
01:00:44.300 But then when you look into it, you know, it's often—I know the distinction between graduate student and, let's say, full-fledged scientist is murky.
01:00:54.140 Yeah.
01:00:54.320 But part of the issue is always, well, exactly, who were these 1,400 people, right?
01:00:59.540 And out from under which rocks did they climb?
01:01:02.780 And so, who were the 1,400?
01:01:04.880 Like, roughly speaking, who were these people that signed it?
01:01:08.220 So, it was 1,400.
01:01:09.140 I mean, I didn't recognize many of the names.
01:01:10.820 Yeah, okay, okay.
01:01:11.100 But if you assume the first five or 10 names are the likely organizers, those were all well-established psychologists, especially social psychologists.
01:01:21.140 Okay, social, okay.
01:01:22.200 Yes.
01:01:22.420 Okay, okay.
01:01:22.820 Yeah, they were social psychologists.
01:01:23.260 So, you got a backlash from—
01:01:25.540 A huge backlash.
01:01:26.400 And part of the accusation, for me in particular, was that by using this line from Fiddler on the Roof—"there was the time he sold him a horse but delivered a mule"—as a frame for progressive disingenuousness around diversity, I was comparing black people to mules.
01:01:45.160 Oh, yeah.
01:01:45.760 Oh, yeah.
01:01:46.160 I see.
01:01:46.800 I see.
01:01:47.400 And so, that drove—
01:01:49.500 That was your subtext, was it?
01:01:54.140 Yeah, that was explicit in part of the denunciation.
01:01:54.140 Right, right.
01:01:54.800 And so, this is an immediate firestorm.
01:01:59.940 This was when?
01:02:01.620 What year did this happen?
01:02:02.920 2022.
01:02:04.240 Oh, yeah.
01:02:04.640 Okay, so this is very new.
01:02:05.860 There is a—actually, part of this backstory is very interesting.
01:02:08.780 The editor of the journal at the time is a European psychologist named Klaus Fiedler.
01:02:16.240 Klaus Fiedler is very accomplished.
01:02:18.540 He's unbelievably honored.
01:02:20.120 Hundreds of journal articles, multiple editorships and awards.
01:02:23.920 He was the editor overseeing all this.
01:02:28.200 And my and the other commentaries that he eventually accepted started out as simple reviews.
01:02:37.400 So, when Hommel submitted his paper, it was subjected to peer review.
01:02:41.680 I was one of the peer reviewers.
01:02:43.560 Oh, yeah.
01:02:43.920 So, one way or the other—Fiedler so liked the reviews that he asked all of us to scale them up to full-length articles.
01:02:54.060 Scientists publish their research findings and their reviews of the literature in scientific journals.
01:03:00.380 And one of the ways that the quality of these articles is vetted is by submitting the manuscripts, before they're published, to—well, first of all, the editor reviews them to see if they're even vaguely possibly suitable for publication in that particular journal.
01:03:19.500 On the basis of, let's say, topic and quality and apparent integrity of research.
01:03:25.040 Then they're sent out to experts in that area, multiple experts, for analysis.
01:03:33.100 And that's part of the quality control process.
01:03:35.300 And that's worked—that worked pretty well up until about 2015, I would say, or maybe even spectacularly well, all things considered.
01:03:44.440 So, that's the peer review process.
01:03:46.760 And what's—what happened in this case was the reviews of this—the peer reviews of this particular article were of sufficient quality so that the editor decided that they might—
01:03:57.600 Invite them as commentary.
01:03:58.480 Right.
01:03:58.760 They might turn into standalone pieces with some development.
01:04:02.380 But I warned Fiedler, the editor, in my review, before anyone had the idea that a version of my review would get published, that if he accepted Hommel's critique of the way in which psychologists write and think about diversity—what they've been advocating with respect to diversity—he would be at heightened risk of people coming after him, demanding the papers be retracted, and coming after his job.
01:04:27.200 This is in my review.
01:04:29.840 And, Jordan, that is exactly—
01:04:31.380 Was that part of—was that included when it was published, or was that—
01:04:35.800 I don't remember.
01:04:36.560 I'd have to go by—I don't—I think I may have taken it out because it wasn't really appropriate because the commentary wasn't—it was about the exchange.
01:04:46.340 It wasn't a message to the editor.
01:04:47.720 Yeah, yeah, yeah.
01:04:47.980 No, fine.
01:04:48.580 I mean, it's not necessarily the case that it would stick.
01:04:51.020 Yeah, so, firestorm. APS—the, like, executive director, the executive committee of APS, whatever that group is—put an immediate kibosh on this. It was all going to be published as a discussion forum.
01:05:11.700 That's how Fiedler framed it, it's a discussion forum about diversity issues.
01:05:15.680 They put an immediate halt.
01:05:18.900 Okay, who's they?
01:05:20.300 It's the officers of the American—of the Association for Psychological Science.
01:05:26.080 Okay, so now they're broadly overseeing the group of journals that publish under their aegis.
01:05:31.720 Yes, that's right.
01:05:32.060 Okay, but they generally don't have an editorial say.
01:05:35.240 No, they don't.
01:05:36.100 Right, and shouldn't.
01:05:37.200 And shouldn't, right.
01:05:38.480 But the editor is, to some extent, beholden.
01:05:42.300 I mean, that's who he's working for.
01:05:45.300 So—
01:05:45.740 Right, but it's still the case that generally they don't do such things.
01:05:48.500 Yeah, they don't do this, right.
01:05:49.760 Partly because often—well, they don't have the specialized expertise, at least in part, which is partly why they hire the editors to begin with, to whom they then give pretty much carte blanche.
01:06:01.100 Yes, right.
01:06:01.920 As they should, because that's part of academic freedom.
01:06:04.420 That's right.
01:06:04.980 Right.
01:06:05.520 Yes.
01:06:05.840 Okay, but they decided that they were not going to proceed with the publication.
01:06:09.240 Well, or—
01:06:10.560 So, the open letter had two main demands.
01:06:14.460 They weren't even requests, they were demands.
01:06:15.940 That Fiedler be fired, and the papers be retracted.
01:07:37.160 Okay.
01:07:38.340 They conducted what looked to me, and really to all of us involved, like a kangaroo court, you know, into what happened.
01:07:48.140 They concluded that Fiedler had somehow violated editorial ethics and norms.
01:07:55.000 Which is a serious accusation.
01:07:57.900 Yes.
01:07:58.660 Like a career-ending accusation, if it's true.
01:08:01.660 Yes.
01:08:02.320 Well, he's had a very nice career since, so it did not succeed.
01:08:06.180 Well, that's good, but that doesn't detract from the seriousness of the allegation.
01:08:13.060 The fact that he was able to successfully wend his way through the thicket.
01:08:17.340 Yes, exactly.
01:08:18.280 That's right.
01:08:18.800 So, he was ousted almost immediately.
01:08:21.940 And then the papers, mine included, that were part of Fiedler's discussion forum.
01:08:27.620 And that had been published.
01:08:29.160 They had been accepted, but not published.
01:08:30.840 Oh, I see.
01:08:31.200 Okay.
01:08:31.600 Okay.
01:08:31.860 Accepted, but not published.
01:08:33.240 That's right.
01:08:33.440 Well, how the hell did the complainants get access to the papers?
01:08:38.460 Like, how did they know what the papers were if they hadn't been published?
01:08:41.300 Well, someone must have, you know, maybe through the—the editorial process is largely online,
01:08:48.000 so I'm sure they could have accessed the papers through the online editorial process.
01:08:53.120 I'm sure they could have asked Fiedler for the papers.
01:08:54.980 Had they asked us for the papers, I would have—
01:08:56.620 Oh, they weren't in secret.
01:08:57.500 They weren't secret.
01:08:58.160 Yeah, they weren't secret.
01:08:58.860 I mean, people want them read—they're publishing their papers so that people read them.
01:09:02.340 I was just curious because it's strange that a brouhaha of that sort would emerge prior to publication.
01:09:08.820 But it's—there was quasi-publication.
01:09:11.260 Yeah, well, there was—right, exactly.
01:09:12.800 It was accepted, but not published.
01:09:14.640 So, they ousted him almost immediately.
01:09:17.120 And then the papers—they brought in two special editors to figure out what to do with the papers accepted as part of the discussion forum.
01:09:27.400 And—
01:09:28.580 And who were these special editors, and what made them special?
01:09:32.160 Well, there was Simine Vazire and E.J. Wagenmakers.
01:09:36.040 And both of them—I think Simine is now the head editor at Psychological Science.
01:09:42.340 So, they both have had long careers advocating, with some success, for upgrading the quality and credibility and rigor of psychological science.
01:09:55.180 They both have made important contributions that way.
01:09:57.980 Okay, yeah, yeah.
01:09:58.720 And so, I think that's why they were brought in.
01:10:00.840 They had a certain cachet as able to figure out what to do.
01:10:04.840 I think that's what the APS directors believed.
01:10:08.580 On what grounds do you think this investigation was—how was the progression of this investigation justified?
01:10:18.620 I mean, there's no established precedent in the scientific community for re-evaluating an editorial decision based on political objection, right?
01:10:29.280 Like, there's no—we'll re-evaluate if 500 people sign a petition.
01:10:33.140 Like, this isn't the domain of rule or principle or tradition, right?
01:10:38.080 So, it's—what's the fear here, do you think?
01:10:42.600 These 1,400 people signed this petition, which is something that takes like two seconds and costs you nothing and has no risk to you whatsoever.
01:10:50.380 And so, it's not an ethical statement of any profundity unless you're an activist.
01:10:53.880 So, what was it, do you think, that raised people's hackles about the mere fact that these complaints had been raised?
01:11:02.480 To this second, I don't really know.
01:11:05.280 Like, from their perspective—
01:11:06.340 Are you willing to speculate?
01:11:07.780 Well, so, sure.
01:11:10.720 The main object of Hommel's critique was a black or biracial social psychologist at Stanford, Steven Roberts, and Roberts denounced the whole process as racist.
01:11:36.760 Publicly.
01:11:37.500 Okay, okay.
01:11:38.580 Publicly.
01:11:39.060 And I do think that—
01:11:42.220 On what grounds?
01:11:44.520 The mere fact of questioning the diversity agenda constitutes racism.
01:11:49.920 Well, I would say probably had three main grounds.
01:11:51.880 Yeah.
01:11:53.060 That was one of them, absolutely.
01:11:55.080 You know, you criticize this.
01:11:56.400 This shows—
01:11:57.760 That you're racist.
01:11:58.380 The racism, and so on, that is pervasive throughout psychology.
01:12:01.340 Right.
01:12:01.680 That would be one grounds.
01:12:02.440 The second ground was my use of this—me comparing blacks to mules with the, you know, "there was the time he sold him a horse but delivered a mule."
01:12:11.220 And then the third was there was a considerable—so, Fiedler offered—
01:12:16.100 Kind of missing the point of that.
01:12:17.600 I know, yeah, right.
01:12:18.900 Fiedler offered Roberts the opportunity to respond to the full set of papers, which were generally supporting Hommel's critique, which was really about diversity in general.
01:12:33.440 But its jumping-off point was a prior paper by Roberts.
01:12:38.260 Okay.
01:12:38.720 Okay, got it.
01:12:39.320 But it gave Roberts a chance to reply to the critiques.
01:12:43.080 But that—there was a considerable back and forth between Roberts and Fiedler about whether, when, and how to publish Roberts' response.
01:12:55.400 Okay.
01:12:57.140 Fiedler was probably kind of a pain in the ass.
01:12:58.980 But, I don't know, in my experience, editors—I don't know how many times—I don't have enough fingers and toes to count the number of times I have subjectively experienced editors' comments as pains in the ass.
01:13:12.160 Hmm.
01:13:13.420 But—
01:13:14.300 One—at least once per paper submitted.
01:13:17.600 Yeah, yeah, yeah, right?
01:13:19.300 But whatever.
01:13:20.760 So, but those were his grounds for denouncing all of us as racist.
01:13:24.820 Fiedler made his life difficult.
01:13:26.220 This whole critique of diversity is a testament to white supremacy, pervasive in psychology, and me comparing black people to mules.
01:13:34.620 Yeah, okay, got it.
01:13:35.300 Right, that was the grounds.
01:13:37.320 And you asked me to speculate.
01:13:39.760 I have no—I don't have—I have at best very circumstantial evidence.
01:13:44.280 I may not even have circumstantial evidence.
01:13:45.600 I strongly suspect—I would really like to test this in the lab or in surveys—that liberals, especially white liberals, are so wracked with guilt and shame over the bona fide history of white supremacy and discrimination and oppression in the United States, in Europe, and especially in the UK.
01:14:04.820 It's more about colonialism, right?
01:14:06.940 So wracked with guilt that there is a vulnerability to just believing anything a person from one of these oppressed, stigmatized groups says in denouncing others.
01:14:21.180 Yeah, well, it's a very quick and easy way to signify the fact that you're not part of the oppressor camp.
01:14:26.280 That's right.
01:14:26.540 Yes, well, has no one—has that not been formally tested as a hypothesis?
01:14:30.980 If it has, I don't know.
01:14:32.000 Well, it needs to be.
01:14:32.940 It definitely needs to be.
01:14:33.880 It totally needs to be.
01:14:34.880 Well, it's something like—more broadly speaking—it's a mechanism of gaming the reputation domain, right?
01:14:47.220 Because obviously our reputations are probably, arguably, the most valuable commodity, so to speak, that we possess.
01:14:54.660 And every system of value is susceptible to gaming in a variety of ways.
01:15:02.980 And one way of gaming the reputational game is to make claims of reputational virtue that are risk-free, broad, immediate, and cost-free.
01:15:14.340 Right, and for me, if you're accused of something—accused of transgressing against a group towards whom I feel guilt—I can signify my valor as a moral agent by also denouncing you.
01:15:30.500 And it costs me nothing, right, which is a big problem, right?
01:15:34.580 It's like, maybe it's the problem of our time.
01:15:37.580 It's a very big problem.
01:15:38.760 It's a huge problem.
01:15:39.500 Well, especially now, because there's something else that's happened, right, is that groups of denunciators can get together with much greater ease than they ever could.
01:15:49.300 Yes, because of social media.
01:15:50.120 And the effort necessary to make a denunciation has plummeted to zero, and the consequences of making a false denunciation are also zero.
01:16:01.180 Yes, zero.
01:16:01.980 This is not good.
01:16:03.100 It's like denunciation firestorm time, and that's certainly happening.
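A toy expected-payoff sketch of the mechanism being described, with made-up numbers: when joining a denunciation takes no effort and carries no penalty if the accusation turns out to be false, the expected payoff to the denouncer is positive whether or not the accusation is warranted.

```python
def expected_payoff(signaling_benefit, effort_cost, false_accusation_penalty, p_accusation_true):
    """Expected value of joining a denunciation, from the denouncer's point of view."""
    return signaling_benefit - effort_cost - (1 - p_accusation_true) * false_accusation_penalty

# Hypothetical "old world": real effort, real blowback if the accusation is false.
print(expected_payoff(signaling_benefit=1.0, effort_cost=0.5,
                      false_accusation_penalty=2.0, p_accusation_true=0.5))  # -0.5

# The situation described above: effort ~0, penalty for being wrong ~0.
print(expected_payoff(signaling_benefit=1.0, effort_cost=0.0,
                      false_accusation_penalty=0.0, p_accusation_true=0.5))  # 1.0
```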
01:16:07.140 Well, so, you know, I mostly agree, certainly in the short term, the personal consequences of engaging in this sort of denunciation behavior are non-existent.
01:16:20.580 But the consequences are not non-existent.
01:16:24.740 So the credibility and trust and faith in academia has been in decline for a very long time.
01:16:33.580 People hate this kind of stuff.
01:16:35.640 Yeah, well, just because something's advantageous for some people in the short run does not mean that it's good for the whole game in the medium to long run.
01:16:44.160 Right.
01:16:44.680 That's for sure.
01:16:45.520 Yes, that's right.
01:16:46.240 That's exactly right.
01:16:46.880 Well, that's actually, I think, in some ways, the definition of an impulsive moral error.
01:16:51.260 Like, if it accrues benefit to you in the short run, but does you in in the medium run, that's not a very wise strategy.
01:16:57.780 Yes.
01:16:58.240 Right.
01:16:58.540 And that's what impulsive people do all the time.
01:17:01.220 Yes.
01:17:01.940 Right, right.
01:17:02.500 That's even the definition of what constitutes a temptation.
01:17:06.620 I was recently listening to your interview for this podcast with Keith Campbell on narcissism.
01:17:14.540 Yes.
01:17:15.240 And that was one of the things you talked about, this sort of impulse control and short-term benefits versus long-term benefits, especially regarding social relations.
01:17:24.000 Yeah, yeah.
01:17:24.520 Right?
01:17:24.940 Yeah, yeah.
01:17:25.840 Reputation's a long-term game.
01:17:27.400 Oh, it's a long-term game.
01:17:28.700 Yeah, yeah.
01:17:29.260 And there has been emerging evidence that people high in left-wing authoritarianism, sort of extreme.
01:17:37.180 Now that we all agree that that exists.
01:17:39.280 It exists.
01:17:39.700 I know.
01:17:39.880 Which already started in 2016.
01:17:41.760 I know, I know.
01:17:42.760 Yeah.
01:17:43.120 That's a whole back story.
01:17:44.200 That's for sure.
01:17:44.960 So, but it's correlated with narcissism, and with this pleasure that people take in the sort of cancel culture that has emerged.
01:17:57.300 I mean, the right is not immune to cancel culture type activities, but it emerged primarily originally on the left.
01:18:03.480 Any place infiltrated by narcissists is going to be susceptible to exactly that, and the narcissist will use whatever political stance gains them the most immediate credibility.
01:18:13.620 That's right.
01:18:13.920 Completely independent of the validity of the ideological stance.
01:18:16.660 Yeah.
01:18:16.840 See, one of the things I, we'll get back to the story right away.
01:18:19.880 Yeah.
01:18:20.060 See, one of the things I've observed—this is very interesting, because I've talked to a lot of moderate progressives, let's say, or actually even genuine liberals within the Democratic Party.
01:18:33.480 Yeah, yeah.
01:18:33.640 Congressmen and senators, many of them.
01:18:36.280 And I've been struck by one thing, and I'm curious about what you think about this.
01:18:40.960 We know that a tilt towards empathy—so agreeableness, trait agreeableness—tilts you in a liberal direction and maybe in a progressive direction.
01:18:49.400 And there are concomitants of being more agreeable on the personality side.
01:18:54.000 But I think one of them is that the moderates that I've talked to always denied the existence of the pathological radicals on the left.
01:19:04.660 And I've really thought, I mean, this is to a man for a moment.
01:19:08.740 Yeah, yeah.
01:19:09.240 And I think what it is, I think it has something to do with the unwillingness or inability of the more liberal types to have imagination for evil.
01:19:19.640 Like, I would make the case that you could validly interpret most criminals—those whose criminal history is sporadic and short—as victims.
01:19:35.660 They're, they've come from abusive families, alcoholic families, often multi-generationally, antisocial families, etc.
01:19:45.580 But there's a subset of criminals.
01:19:48.400 It's 1% of the criminals, 65% of the crimes.
01:19:51.400 There's a subset of criminals who are not victims.
01:19:53.700 They are really monsters.
01:19:54.880 And I don't think there's any imagination for the monstrous among the compassionate left.
01:20:01.180 It's all victims.
01:20:02.380 It doesn't matter how egregious the crime.
01:20:05.820 Now, I would have, that's something I would have tested as a social psychologist if I still had an active research lab, which I don't.
01:20:12.820 But the problem is—we know from simulations that networks of cooperators can establish themselves in a way that's mutually beneficial and productive.
01:20:23.440 But that if a shark is dropped into a tank of cooperators, then the shark takes everything.
01:20:30.000 So the problem with being agreeable and cooperative is that the monsters can get you.
01:20:34.740 And if you're temperamentally tilted towards denying the existence of the monster, so much the worse.
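The "shark in a tank of cooperators" point can be illustrated with a minimal iterated prisoner's dilemma sketch. This is not the specific simulation literature being alluded to, just an illustration of the dynamic: an unconditional defector strips an unconditional cooperator bare, while a conditional cooperator (tit-for-tat) limits the damage.

```python
# Standard prisoner's-dilemma payoffs: (my move, their move) -> my score.
PAYOFF = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def play(strategy_a, strategy_b, rounds=200):
    score_a = score_b = 0
    last_a = last_b = "C"  # each strategy sees "C" as the opponent's opening move
    for _ in range(rounds):
        move_a, move_b = strategy_a(last_b), strategy_b(last_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        last_a, last_b = move_a, move_b
    return score_a, score_b

always_defect    = lambda opp_last: "D"        # the "shark"
always_cooperate = lambda opp_last: "C"        # the naive cooperator
tit_for_tat      = lambda opp_last: opp_last   # cooperate first, then mirror the opponent

for label, partner in [("naive cooperator", always_cooperate), ("tit-for-tat", tit_for_tat)]:
    shark_score, partner_score = play(always_defect, partner)
    print(f"shark vs {label}: shark={shark_score}, partner={partner_score}")
```

Against the naive cooperator the defector takes the maximum payoff every round; against tit-for-tat it gains almost nothing over mutual defection, which is the sense in which conditional cooperation protects the network.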
01:20:42.280 Now, I made that case because you talked about the relationship between narcissism and left-wing authoritarianism.
01:20:48.120 I mean, narcissism shades into sadism as well.
01:20:51.800 And so this is a very big problem, especially with online denunciation.
01:20:56.420 Okay, so back to 2022.
01:21:50.620 Now there's debate about whether these papers are going to proceed to publication.
01:21:59.160 Right.
01:21:59.360 And there's allegations made against the people who wrote the reviews.
01:22:01.820 Yes, absolutely.
01:22:02.260 We're all racist, and the whole thing is racist, and an abuse of editorial power, and all these accusations.
01:22:08.380 Right, and the editor loses his position.
01:22:10.080 He loses his position, and these two special editors are brought in.
01:22:13.340 Okay.
01:22:13.900 Negotiations go on for almost two years.
01:22:15.940 Like, what are they negotiating about?
01:22:18.580 Who's going to—so part of Roberts' denunciation, public denunciation of all of us, was he posted the draft of his commentary response that was headed for the discussion forum,
01:22:38.160 and the full set of emails he exchanged with Fiedler about publishing it.
01:22:44.960 And those are typically confidential communications between an editor and an author, and so—
01:22:53.160 Or at least typically private.
01:22:55.300 Yes, right.
01:22:55.940 They're typically private.
01:22:56.780 So that added to the difficulty on the part of the special editors to decide what to do,
01:23:10.260 because they didn't want to just publish those.
01:23:13.900 Roberts didn't agree not to at first.
01:23:16.480 Fiedler—they wanted Fiedler's permission to publish the correspondence.
01:23:24.380 He wouldn't grant it.
01:23:26.100 So why did Roberts have such an outsized say in all this?
01:23:29.820 Like, that isn't how the scientific process generally works.
01:23:32.420 So they—once APS blew up the journal by firing Fiedler, so there was no—
01:23:41.820 Right, right, which is like an admission of fault.
01:23:43.800 So—and about two-thirds of the editorial board resigned when he was ousted.
01:23:52.140 So—
01:23:52.520 That was protest resignation.
01:23:53.760 Yeah, I don't know whether it was protest.
01:23:55.860 We know they resigned, whether it was protest or not.
01:23:58.740 Yeah, okay, okay, okay.
01:23:59.080 So they were—
01:24:00.840 Maybe they also thought it was trouble they didn't need.
01:24:03.340 Yeah, right.
01:24:03.900 Right, yes.
01:24:04.160 I mean, these are generally—
01:24:05.540 If you're—
01:24:05.760 Right.
01:24:06.180 When you're working for a scientific journal, you're not doing it for the money, right?
01:24:10.580 It's a lot of work, and the editors—was he paid?
01:24:14.460 Was that his full-time job?
01:24:15.680 It was not his full-time job, and I don't know whether he was paid.
01:24:17.880 Right, right.
01:24:18.460 Okay, so that just illustrates the point, is that people are doing this because that's
01:24:22.980 actually what you do as a scientist.
01:24:24.720 There's not a lot of—you know, you—it's a prestigious position, and you meet people.
01:24:29.700 You have a certain say over the direction the field might go, and those are perks.
01:24:34.500 But generally, people do this like they do peer review because it's part of the tradition
01:24:40.100 of scientific activity.
01:24:42.680 Right.
01:24:42.880 Right, right.
01:24:43.720 Right, that's right.
01:24:44.880 And so you can see why people might bail out if it was going to just be nothing but
01:24:49.200 reputation catastrophe.
01:24:50.700 Right, exactly, right.
01:24:51.280 Because they'd be thinking, why the hell am I going to expose myself to, like, this dismal
01:24:55.640 risk when there's, like—it's already hard, and there's very little upside.
01:25:00.200 Right, exactly, right, okay.
01:25:01.260 So the journal was a mess for a long time.
01:25:05.500 And these editors—and there was this exchange between the editors, Roberts, Fiedler, and
01:25:13.620 the other contributors, myself and the other contributors, about whether and when to publish
01:25:19.640 it.
01:25:19.960 And again, this went on for almost two years.
01:25:22.160 So there was, like, first a discussion, we're going to publish it.
01:25:25.540 Then there was radio silence.
01:25:27.240 Well, it turns out we've run into an obstacle.
01:25:29.260 Can we resolve—and it just went on for almost two years.
01:25:32.160 Eventually, that was resolved, and it was all published.
01:25:36.020 It's all published.
01:25:37.540 And, you know, your original question was framed as, you can't believe I haven't been
01:25:44.580 subject to a cancellation attack.
01:25:45.740 I have.
01:25:46.400 I have.
01:25:47.200 You then asked, well, how did you survive it?
01:25:49.400 Yeah.
01:25:49.580 So let me add this little punchline.
01:25:51.140 At the time that all this was happening, my immediate associate dean—so I was chair
01:25:59.400 of the psychology department at Rutgers, and psychology at Rutgers is in the School of Arts and Sciences.
01:26:06.480 The School of Arts and Sciences has a dean.
01:26:08.960 Under the—but the School of Arts and Sciences at Rutgers is gigantic.
01:26:12.480 Even as chair, I had very little direct contact with the dean.
01:26:15.460 The dean was doing big deanly things.
01:26:18.200 But the department chairs have a lot of contact with an associate chair.
01:26:22.220 So there might be an associate chair for the sciences.
01:26:25.000 Associate dean?
01:26:25.900 Associate dean.
01:26:26.660 Yeah.
01:26:26.880 Sorry.
01:26:27.100 Associate dean.
01:26:27.860 Yeah.
01:26:28.000 Sorry.
01:26:28.420 Sorry.
01:26:28.740 Associate dean.
01:26:29.560 So there'd be an associate dean for math and for STEM, associate dean for social science,
01:26:35.220 and associate dean for humanities.
01:26:36.980 I had a lot to do with the associate dean for social sciences, who was a psychologist from
01:26:41.600 the psychology department.
01:26:43.440 Okay.
01:26:44.260 So I never actually had this conversation exactly with him, but I'm pretty sure he knew about
01:26:49.460 the whole thing.
01:26:50.800 A year—so at the end of my term, so this is now 2023, I go on sabbatical.
01:26:57.500 Remember, this event occurred—the PoPS event occurred in 2022.
01:27:01.800 It's not until almost two years later that this stuff was published.
01:27:05.040 So I complete my term as department chair in 2022.
01:27:08.640 2022-23, I go on sabbatical.
01:27:10.880 It's still not published.
01:27:11.920 Um, and then at the end of that sabbatical term, the associate dean approaches me with
01:27:21.880 an offer to chair the anthropology department.
01:27:25.720 Okay, so this is very weird.
01:27:27.360 Yeah, definitely.
01:27:28.300 It's very weird.
01:27:29.480 There was an internal political snafu, which is beyond the scope of this discussion, and
01:27:34.720 they couldn't appoint an internal chair, and they wanted an external—you know, the department
01:27:40.900 needed a chair.
01:27:42.320 The dean's office had a lot of faith and confidence in my ability because—
01:27:48.520 Despite this.
01:27:49.080 Despite the—
01:27:49.860 Because of it.
01:27:51.000 One of the things they said to me was, you know, this is going to be a difficult situation
01:27:56.380 because the department is not going to be happy about having an outside chair imposed on them,
01:28:01.340 but we know you have a thick skin.
01:28:04.140 Hmm.
01:28:04.620 Wow.
01:28:05.000 And I parlayed that into a very large raise.
01:28:08.760 Jordan, it was one of the best things I've ever done.
01:28:12.040 So not only did I escape cancellation, I parlayed it into an improvement in the quality—
01:28:20.000 This is a good thing for people to know, too.
01:28:21.860 You know, if you've watched my podcast, you know, because I say this all the time, that
01:28:27.160 mythologically speaking, that every treasure has a dragon, right?
01:28:32.300 And that's a representation of the world because the world is full of threat and opportunity.
01:28:38.020 And the co-association of the dragon and the treasure is a mythological trope indicating
01:28:45.720 that there's opportunity where there's peril.
01:28:48.480 But there's a corollary to that, which is a very interesting one, which is where there's
01:28:53.540 peril, there's opportunity.
01:28:55.860 And so you might think when something negative happens to you, let's say on the social side
01:29:01.400 that you become the brunt of a cancellation attempt, you might think, oh my God, my life's
01:29:05.660 over.
01:29:06.020 It's like, yeah, that's one possible outcome.
01:29:08.780 That's the same outcome as, you know, ending up as dragon toast, let's say.
01:29:13.700 But the other outcome is that you find the treasure that's associated with the dragon.
01:29:17.720 And that can definitely happen.
01:29:19.980 And that's a good thing to know because it means that when things become shaky around
01:29:24.720 you, one of the things you can validly ask yourself is, there's something positive lurking
01:29:29.820 here if I had the wisdom to see it and the, what would you say, the capacity for transformation
01:29:36.960 necessary to allow the challenge to change me?
01:29:41.920 Yeah, that's right.
01:29:42.880 Jordan, I wouldn't wish that experience.
01:29:45.900 At the time that was happening, it was horrible.
01:29:48.060 Yeah.
01:29:48.480 I wouldn't wish it on anyone.
01:29:50.680 In hindsight, it has made me a better person.
01:29:54.900 And I would not, I wouldn't undo it now if I could.
01:29:59.080 Yeah.
01:29:59.260 Well, you know what Nietzsche said, if it doesn't kill you, it makes you stronger.
01:30:04.780 Here.
01:30:04.900 Now, unfortunately, there's an if.
01:30:07.020 Well, seriously, right?
01:30:08.300 Yes, absolutely.
01:30:08.740 So, and the if is that the dragon is real.
01:30:11.020 It's not a game.
01:30:11.940 Yeah.
01:30:12.360 Right.
01:30:12.520 Well, no, FIRE, the same outfit, Foundation for Individual Rights and Expression, keeps
01:30:17.400 a faculty-under-fire database of faculty who have been subject, usually to mobs, sometimes
01:30:24.360 to administrative investigations, seeking to punish them for what should have been legitimate
01:30:29.120 academic speech, protected by academic freedom or even free speech.
01:30:33.340 At U.S. state colleges, they're subject to the First Amendment, which means they shouldn't
01:30:38.460 be in the business.
01:30:39.460 However, sympathetically.
01:30:40.580 Well, yeah.
01:30:41.500 Well, yeah.
01:30:41.960 But they have documented that hundreds of faculty have been fired for what should have
01:30:46.900 been legitimately protected speech.
01:30:49.780 So your point about-
01:30:50.560 We have to get rid of the remaining conservatives.
01:30:52.520 Well, I will.
01:30:54.120 Or liberals.
01:30:55.320 Your metaphor about the dragon is dead on.
01:30:57.320 That there's no guarantee, you know, people have lost their livelihoods running into these
01:31:02.140 dragons.
01:31:03.020 So that's not-
01:31:04.040 Yeah, well, the other thing, so there's some concrete recommendations that can be brought
01:31:07.600 out of that, too.
01:31:08.260 I would say, like, if you find yourself in serious trouble, this is one of the things
01:31:17.280 I learned from dealing with, like, very dangerous people in my clinical
01:31:17.280 practice, let's say.
01:31:18.660 Dangerous and unstable people, it's a very bad idea to lie when you're in trouble.
01:31:24.100 Like, it's a seriously bad idea.
01:31:25.960 And so if the mob or the monster comes for you, your best defense is extremely cautious,
01:31:35.260 plain truth.
01:31:36.680 Now, that's very different than trying to, what would you say, strategize and manipulate
01:31:42.380 your way out of a difficult situation.
01:31:44.520 It's also very different than apologizing.
01:31:46.460 And my experience on the woke mob cancellation side is, if you lie in your own defense or falsify
01:31:54.980 your speech, you're in serious trouble.
01:31:57.320 And if you apologize, a different mob will just come for you.
01:32:01.200 That'll be the post-apology mob that comes for you.
01:32:03.880 It's not a good idea.
01:32:05.540 So, you know, what we've been outlining here is the fact that if you're in serious social
01:32:11.280 peril, there's two outcomes.
01:32:12.840 One is that, perversely enough, in retrospect, it might turn out to be an opportunity and
01:32:20.280 one you wouldn't forego now that you know the consequences.
01:32:24.240 That's not impossible, but it's difficult.
01:32:26.880 The other one is, is you're seriously done.
01:32:29.440 And so then the question is, what can you do to maximize the possibility of the former
01:32:34.740 and minimize the latter?
01:32:35.920 And those are some things that I know.
01:32:38.080 So, okay.
01:32:38.940 Okay, so let's back up a bit then.
01:32:43.420 We still haven't exactly described why the cancellation attempts weren't successful for
01:32:51.280 you.
01:32:51.940 Now, you said you demonstrated your ability to keep a calm head under fire and that you
01:32:57.880 did that well enough so the university actually recognized that and that turned out to be of
01:33:02.460 substantive benefit to you.
01:33:03.760 But we don't know why it was that you maintained a calm head under fire or how you did that
01:33:09.760 without, well, having the reputation damage that was certainly directly implied by the
01:33:17.400 accusation take you out.
01:33:19.760 Like, do you have, was it good fortune?
01:33:22.380 Were there things you did right?
01:33:23.800 Like, how do you assess that?
01:33:25.980 Yeah, yeah.
01:33:26.420 So, that was not my first, as I mentioned at the beginning, I, this was not my first go
01:33:32.540 around with this kind of thing.
01:33:34.360 It helps to have some experience.
01:33:36.340 It helps to have done some reading.
01:33:38.960 People have addressed this—there are some good articles and essays out there about what to
01:33:43.140 do when you're subject to these attacks.
01:33:45.280 Some of them make very good points.
01:33:47.940 Um, and so, about six months ago, I, again, I posted an essay on my Substack.
01:34:00.260 What's the name of your Substack?
01:34:01.400 Unsafe Science.
01:34:02.300 Unsafe Science.
01:34:02.980 It's called "My Vita of Denunciation."
01:34:05.520 Okay.
01:34:06.000 And it's called "My Vita of Denunciation" because it goes through several of these
01:34:11.640 sorts of attacks that I have been through; in the first place, it also goes through
01:34:16.740 the tactics.
01:34:17.980 It's a short version.
01:34:19.060 I have a longer version in a different place, but it goes through a short version of how
01:34:23.100 to deal with these attacks.
01:34:24.300 So, the very first piece is that if you're, if you find yourself in the midst of such an
01:34:33.840 attack, go silent.
01:34:36.240 Go silent.
01:34:37.940 Do not engage.
01:34:39.620 Do not engage with your attackers because nearly all of these cancellation type attacks
01:34:46.580 are massive, brutal, and short.
01:34:49.540 Right, right, right.
01:34:50.400 Two weeks.
01:34:51.160 Yeah, and most.
01:34:51.780 That's right.
01:34:52.400 Yeah, yeah, yeah, yeah.
01:34:53.420 And people forget.
01:34:54.500 That's the weird thing, because the present is so large.
01:34:57.600 Yes.
01:34:58.120 You're going to panic.
01:34:58.920 Yes.
01:34:59.320 Yeah, yeah.
01:34:59.860 Don't panic.
01:35:00.720 Don't panic.
01:35:01.300 That's right.
01:35:01.800 Don't panic.
01:35:02.540 Don't assume that it's going to be successful.
01:35:04.700 That's right.
01:35:05.100 Yeah, because people, they might be interested in you today, but they weren't interested
01:35:09.960 in you yesterday.
01:35:10.780 Right.
01:35:11.080 They probably won't be interested in you tomorrow.
01:35:13.080 And it's just, it's like, as a kid, we used to go to the beach and body surf, and occasionally
01:35:17.840 a wave that was way bigger than you could handle would come, and there was nothing you could
01:35:22.000 do except let it wash over you and knock you around, and you come out, and it washes
01:35:25.760 you on shore, and you're fine.
01:35:26.480 As long as you don't do anything to make it worse.
01:35:28.420 Yeah, yeah.
01:35:29.640 Like apologize, for example.
01:35:31.100 Well, you know, I would add this.
01:35:33.240 If you genuinely, in your heart of hearts, believe you have done something wrong, then
01:35:39.020 maybe you should apologize.
01:35:40.280 Yeah, yeah.
01:35:40.660 But you should not apologize.
01:35:42.640 Let me add something to that.
01:35:43.940 Yeah, yeah.
01:35:44.460 No, not if you genuinely believe it, because you might not be your own best defender.
01:35:50.360 That's why you have a Fifth Amendment.
01:35:52.480 No, seriously.
01:35:53.500 Yeah.
01:35:54.140 Conscientious, guilt-prone people will accuse themselves.
01:35:58.300 So then I would say, if you feel that you've done something wrong, remember the presumption
01:36:04.540 of innocence before provable guilt.
01:36:07.220 Remember that.
01:36:08.020 It applies to you, too.
01:36:08.800 And then go talk to five or six people that you trust, and lay out the argument on both
01:36:13.940 sides, and see if they think you're the bad guy.
01:36:17.440 That's good.
01:36:17.940 I agree with that.
01:36:18.560 You need that.
01:36:19.080 Yes, that's good.
01:36:20.180 Yeah.
01:36:20.580 I'm completely on board with that.
01:36:21.780 So don't assume that you're morally obligated to apologize, even if you feel guilty.
01:36:27.140 That's right.
01:36:27.580 Because your guilt feelings are not an unerring indication of your guilt.
01:36:32.260 That's right.
01:36:32.760 And may distort how you think about your culpability.
01:36:34.860 Yes, definitely.
01:36:35.680 Yeah, no, that's a very good point.
01:36:37.280 That is, see, this is why I think, too, the cancel mob is particularly effective against
01:36:42.140 genuine conservatives.
01:36:43.980 Because genuine conservatives tilt towards higher conscientiousness.
01:36:47.300 And it's very easy to make conscientious people feel guilty.
01:36:51.180 Right, right.
01:36:52.000 So that could be weaponized.
01:36:53.540 Okay, so go silent.
01:36:55.360 Yeah, go silent.
01:36:56.420 Including, you can always apologize in a month after you've thought it through.
01:37:01.340 Absolutely.
01:37:01.940 If anyone still cares.
01:37:03.240 That's right.
01:37:03.640 Yeah.
01:37:04.340 Okay, go silent.
01:37:05.520 Go silent.
01:37:06.280 Yeah.
01:37:06.980 Record everything.
01:37:08.100 Yeah, that's for sure.
01:37:10.480 Right?
01:37:11.420 And you're...
01:37:11.940 Everything.
01:37:12.480 Everything.
01:37:12.840 You don't know how you're going to use it.
01:37:16.720 You may use it to defend yourself going forward, depending on how things unfold.
01:37:22.820 You may decide after the wave of the attack passes that you want to counterattack.
01:37:31.480 Yeah, right.
01:37:32.240 You want some level...
01:37:33.620 Carefully and strategically.
01:37:34.440 Carefully and strategically.
01:37:36.420 And by recording everything, you have the raw material to damn your attackers.
01:37:41.980 Yeah, yeah, yeah.
01:37:42.440 So that's...
01:37:43.440 Right?
01:37:43.680 So you go silent.
01:37:44.900 Yeah, that's especially true if someone's interviewing you.
01:37:48.500 Yeah.
01:37:48.920 It's like, record all of it.
01:37:50.640 Record all of it.
01:37:51.480 Yeah.
01:37:51.800 Record all of it.
01:37:53.440 Seek allies.
01:37:54.800 Yeah.
01:37:56.280 Because you may feel alone.
01:37:59.640 Yeah.
01:37:59.920 Mobs are very good at coming after somebody who seems alone.
01:38:04.240 But if you can...
01:38:06.000 If you have networks, support networks, activate those networks.
01:38:11.180 If you don't have them, and, you know, if you're in the intellectual type of professions,
01:38:15.080 whether it's academia or mainstream media, could be in something else, you probably have
01:38:19.580 a support network.
01:38:20.800 Let them know what's going on.
01:38:22.740 Most...
01:38:23.700 My experience has been, at least the kind of networks that I have, they will...
01:38:28.060 People will stand up for you.
01:38:29.480 I mean, I had numbers of people writing essays that got posted in some pretty good places.
01:38:36.720 Real clear politics, I think, was one.
01:38:38.760 Yeah, yeah, yeah.
01:38:39.800 On this PoPS fiasco.
01:38:42.920 So, actually, of all places, the Chronicle of Higher Ed did a great...
01:38:48.400 Some great reporting on it.
01:38:49.920 And it really kind of damned the mob and the...
01:38:52.760 Right.
01:38:52.900 That's also why you need that time of silence, is to muster your resources.
01:38:57.800 Yes.
01:38:57.900 Yes.
01:38:58.240 And you could also assume, even if people are nervous in the aftermath of the accusations
01:39:04.180 for two or three days or a week, even, they may come to their senses as the temperature
01:39:11.560 drops.
01:39:12.260 Yes.
01:39:12.680 That's right.
01:39:13.280 Yes.
01:39:13.540 That's absolutely right.
01:39:15.420 And then...
01:39:15.940 Right.
01:39:16.160 So, go silent.
01:39:17.740 Record everything.
01:39:18.860 Activate your support networks.
01:39:20.180 Yeah.
01:39:20.300 Yeah.
01:39:21.200 And then, again, it depends on the situation.
01:39:24.740 It's going to vary from person to person and situation to situation.
01:39:29.020 It depends, in part, on what your skills and resources are.
01:39:34.520 But then you are ready to either defend yourself and or counterattack.
01:39:39.080 Okay.
01:39:39.500 And I don't...
01:39:40.080 Jordan, I don't know how many essays I posted on Unsafe Science surrounding this event.
01:39:46.260 One of them is titled, There Is No Racist Mule Trope.
01:39:49.620 So, the argument, the grounds for denouncing me as a racist for comparing black people to
01:39:55.140 mules was that there was a historical trope of making an equivalence between black people
01:40:03.440 and mules.
01:40:04.520 This is...
01:40:04.800 Roberts was the...
01:40:06.020 Presented this.
01:40:07.260 And he had one reference to support this.
01:40:10.560 Right.
01:40:10.960 Which I was not familiar with.
01:40:12.160 Yeah.
01:40:12.540 So, I tracked it down.
01:40:14.200 That's what you say.
01:40:15.040 I know.
01:40:15.660 Let's see what the article actually says.
01:40:18.500 Yeah.
01:40:18.680 This article was a really good article.
01:40:22.180 And what it documented was that there was a historical linkage between black people and
01:40:30.160 mules because originally, American blacks were overwhelmingly in the American South, in
01:40:36.260 the agrarian South.
01:40:37.360 And so, the mule was a symbol of both the kind of work that was done in the South, this agricultural
01:40:47.700 work.
01:40:48.700 Manual work.
01:40:49.200 And it was a symbol of the flawed liberation of black people from slavery because one of
01:40:56.820 the promises that they never delivered on was 40 acres and a mule.
01:41:00.180 Right.
01:41:00.380 And even though that was never delivered on for a very long time until you had the mass
01:41:06.140 migration into the North, the black people living in the American South, you know, aspired
01:41:12.720 to be successful farmers.
01:41:13.960 And getting a mule was one way to have a successful farm.
01:41:18.280 And so, you would see images, paintings, even, you know, if you go to southern museums, there's
01:41:26.980 some very famous paintings of black people in fields with a mule pulling a wagon or a, I
01:41:34.140 don't know, you know.
01:41:35.780 Plow?
01:41:36.100 Yeah, yeah, yeah, like a plow or, yeah, yes, right.
01:41:38.720 That's very, very common.
01:41:40.400 And in fact, the mule figures fairly largely in African-American folk stories from the American
01:41:50.560 South.
01:41:50.940 So, he documents all this.
01:41:52.100 Yeah.
01:41:52.360 So, much so that the mule really became a symbol of people who were oppressed and part
01:42:00.580 of the liberation of people who were oppressed.
01:42:02.740 So that when, after Martin Luther King's assassination, his casket was carried in a wagon
01:42:11.640 pulled by mules.
01:42:13.280 Okay.
01:42:13.660 So, there is, isn't that a myth, right?
01:42:15.500 Oh, so it's, okay.
01:42:16.840 So, given all that, it's less surprising that that speculation might have arisen.
01:42:22.040 In relationship to your analogy.
01:42:24.080 Right, right, right, right.
01:42:25.520 Things you find out too late.
01:42:27.260 I know, yes, right, right.
01:42:28.480 So, so, but, but it is ironic because the, you know, mule is the symbol of the liberation
01:42:34.740 from the oppression rather than the oppression.
01:42:37.260 Right, right, right.
01:42:37.840 You know, right, so.
01:42:38.600 Okay, so let me ask you a question about strategy there, too.
01:42:41.480 You know, like, I've spent a lot of time strategizing with people because that was a big part of my
01:42:47.420 clinical practice, but in terms of silence and then mustering your support network, right,
01:42:59.240 and then you said, well, you can, you can start your defense.
01:43:03.840 It's like, my sense is that a good offense is a very strong defense, right?
01:43:12.580 Because you can, if you're careful, now, you know, you can defend yourself or you can turn
01:43:18.020 the tables, and I would say, if you're turning the tables because you're angry, that's not a good
01:43:23.820 idea because you're going to make mistakes in your strategizing, right?
01:43:26.780 I think you can distinguish the search for justice and truth from the search for revenge
01:43:33.020 by the intermediating role of especially resentment.
01:43:37.280 If you're resentfully angry, your head isn't clear.
01:43:40.640 But if you can quell that and you want to establish the truth and you can do that with a certain
01:43:48.120 amount of detachment, then a good defensive strategy is offense.
01:43:52.460 It's like, what's actually, you can flip the table, so to speak, and the problem with
01:43:58.740 a defense is there's something, well, there's something defensive.
01:44:03.040 There's a better defensive, exactly, yes, exactly, yes, exactly.
01:44:05.580 Instead of, well, I might have made a mistake, when you're speaking it's like, no, no, you're seriously
01:44:10.680 wrong.
01:44:11.240 Yeah, yeah.
01:44:11.700 And in a manner that's actually detrimental to the cause you purport to be putting forward.
01:44:16.900 Yeah, okay.
01:44:17.460 Yeah, well, so that and some of the prior experiences fueled what were then my very early interests
01:44:31.340 in left-wing authoritarianism and far-left radicalization and its consequences.
01:44:40.220 And so I've been doing all sorts of studies on that.
01:44:44.460 All right, look, we have to stop this part of the discussion, even though there's like
01:44:49.340 50 other things I want to talk to you about, but we'll continue.
01:44:53.100 I'm going to, I think, focus the discussion on the Daily Wire side.
01:44:56.780 You guys listening on YouTube know about this, that we do another half an hour there.
01:45:00.740 I think I'm going to talk about categorization and implicit bias and delve a little bit more
01:45:07.720 into social psychology's role for better or worse in promoting many of the policies, the
01:45:16.040 DEI policies, for example, and justifying them hypothetically on scientific grounds.
01:45:21.240 I want to delve into that because it's definitely been social psychologists who've been particularly
01:45:27.020 interested in the issue of implicit bias, even though to some degree that notion came from
01:45:33.660 the clinical world, including from people like Carl Jung, who were very interested in
01:45:38.100 the idea of complexes and implicit association back in the 1920s.
01:45:42.880 Anyways, there's a veneer of scientific respectability that's been laid over the diversity,
01:45:49.400 inclusivity, and equity claims, the notion of implicit and systemic bias.
01:45:53.680 And that's always bothered me because I think the social psychologists have done a terrible
01:45:58.360 job distinguishing between categorization, which is like the basis of perception itself,
01:46:04.840 and bias, because you can't consider categorization itself to be bias.
01:46:09.820 It's like, that's insane.
01:46:11.340 That's insane.
01:46:12.340 Even though the postmodernists really do make that claim.
01:46:15.560 And Lee's done work too, looking at the accuracy of such things as so-called stereotypes, because
01:46:22.160 what's the difference between a stereotype and a category?
01:46:25.960 Like, that is a hard question.
01:46:27.800 You could spend a thousand years trying to figure that out.
01:46:30.980 Anyways, I think that's what we'll delve into if you want to join us on the Daily Wire side.
01:46:35.200 And so thank you very much, sir, for, well, for offering what you know and also your story
01:46:40.260 to the more general public.
01:46:42.560 And join us on the Daily Wire side if you want to continue with the discussion.
01:47:01.540 Thank you.
01:47:05.460 Good night.
01:47:05.560 Good night.