TRIGGERnometry - November 24, 2019


James Lindsay and Peter Boghossian: Social Justice is a Mind Virus


Episode Stats

Length

1 hour and 3 minutes

Words per Minute

195.15

Word Count

12,304

Sentence Count

569

Misogynist Sentences

15

Hate Speech Sentences

22
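The words-per-minute figure above is presumably just the word count divided by the running time in minutes. A minimal sketch of that arithmetic, assuming a duration of 1:03:03 (the listing only says "1 hour and 3 minutes", so the exact seconds are a guess):

```python
# Recompute the "Words per Minute" stat from the other episode stats.
# Assumption: the duration is 1:03:03 (3783 s); the listing rounds it to
# "1 hour and 3 minutes", so the seconds figure is an estimate.
word_count = 12_304                        # "Word Count" above
duration_seconds = 1 * 3600 + 3 * 60 + 3   # assumed 1:03:03

wpm = word_count / (duration_seconds / 60)
print(f"{wpm:.2f} words per minute")       # prints "195.15 words per minute"
```

With that assumed duration, the result agrees with the listed figure to two decimal places.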


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 Hello and welcome to TRIGGERnometry. I'm Francis Foster. I'm Konstantin Kisin. And this is a
00:00:09.960 show for you if you're bored with people arguing on the internet over subjects they know nothing
00:00:14.820 about. At TRIGGERnometry, we don't pretend to be the experts, we ask the experts. You may remember
00:00:20.560 that a few months ago we had Helen Pluckrose on the show, who was one of the co-authors of the
00:00:25.660 grievance studies papers, and I'm pleased to say today we have her co-conspirators, James Lindsay
00:00:31.400 and Peter Boghossian. Welcome to TRIGGERnometry. Just to interrupt, we're in a very special secret location. Oh,
00:00:37.040 yeah, I forgot to say, different location. Well, here we are. Social justice. You guys have just been
00:00:41.840 speaking at the Speaking Truth to Social Justice conference. That's correct. We were
00:00:46.240 there, we enjoyed it. Most of our audience know what social justice is about, but for anyone who doesn't,
00:00:51.480 Just give us a quick lowdown before we get into the meat of it.
00:00:55.140 What is social justice?
00:00:57.020 Isn't that a great thing, social justice?
00:00:59.300 Justice, I mean, that's a good thing, isn't it, James?
00:01:01.000 Social justice is two things at once, and that's the trick.
00:01:04.200 So you have this nice thing that's about trying to overcome disenfranchisement
00:01:09.480 and discrimination and racism, sexism, and so on.
00:01:13.360 So trying to make society actually more fair to treat people with respect
00:01:17.200 based on mostly their immutable characteristics,
00:01:19.560 Although they dip into things like class a little bit.
00:01:22.920 Hopefully class is still mutable.
00:01:25.800 And then you have this ideology that's gone in service to this goal and kind of just upended the thing.
00:01:31.360 Like to give you an idea of how upended it is, you can take some of – there's a fellow on Twitter who has gone through some of the biggest books of kind of the social justice academic pantheon.
00:01:41.260 And he's taken Judith Butler, for example, and starts listing: who does Judith Butler cite?
00:01:47.740 And it's, you know, this postmodern philosopher, that postmodern philosopher, this lunatic, that lunatic.
00:01:53.540 And then you have this guy, John Rawls, this philosopher who has this very famous book called A Theory of Justice and really lays out what a socially just society would look like from a liberal perspective.
00:02:03.900 How many times is he cited? Zero.
00:02:05.860 They almost never talk about the other side of what social justice means, which is what most people think social justice means, which is what most people actually agree with and support.
00:02:15.720 Whether they're on the right, whether they're on the left, whether they're somewhere in the center, whatever their politics, most people agree that a more fair society is better and something worth trying to generate.
00:02:28.260 And so you've got this kind of two things.
00:02:30.260 Like in my talk yesterday, I pointed out that even in their own books, they admit we're doing something different than what most people think.
00:02:37.820 We call ours critical social justice to distinguish it from mainstream standpoints.
00:02:42.740 And, you know, some people call it social justice to try to tap into its true commitments, but we mean something else. So there are two things. There's this idea that society can be more fair. And then there's an ideology that claims to be working for that in a very specific way. That's basically postmodernism, the philosophy of postmodernism, and grievance studies as we took it on.
00:03:04.500 And the argument you made yesterday is that essentially they're trying to have a revolution.
00:03:09.920 Yeah, they're revolutionaries, social revolutionaries.
00:03:12.400 The goal is, don't take my word for it, go look.
00:03:16.020 It's like everything they write.
00:03:17.220 They want to take the current system, which would be liberal systems, advanced democracies, tear down how they work,
00:03:24.120 and replace them with a social justice system, which is rooted in critical theory, which is openly not liberal, anti-liberal, in fact.
00:03:34.500 And so you're saying you're talking about anti-liberal.
00:03:38.040 They are not keen on freedom of speech.
00:03:41.180 They tend to see freedom of speech as a right wing issue or my favorite term is a racist dog whistle.
00:03:46.700 I never knew what a dog whistle was.
00:03:48.700 And now all I hear are dog whistles.
00:03:50.880 Why is it the freedom of speech issue seen as right wing to social justice people?
00:03:55.440 That's a good question.
00:03:56.680 Can I linger on something?
00:03:58.500 Linger away, Peter.
00:03:59.640 Linger away.
00:04:00.100 So James mentioned John Rawls, who's an American philosopher, contemporary American philosopher, recently died.
00:04:06.540 The thing about John Rawls and the theory of justice is that he says that these ideas for what a just society is are rationally derivable.
00:04:15.100 And he gives you experiments that you can do that are very easy to construct.
00:04:18.920 You can do them in five minutes if you want.
00:04:20.120 But the basic idea there is that every single individual is able to use—there's a univocality of reason.
00:04:27.540 Excuse me.
00:04:28.060 Reason speaks in one voice.
00:04:29.200 So we're able to rationally derive principles of social justice.
00:04:34.480 I don't think that the social justice advocates with an uppercase S and J, I don't think their principles are rationally derivable at all.
00:04:42.860 I don't think they're univocal.
00:04:44.300 I don't think an independent, rational inquirer could come to the same principles.
00:04:49.600 They're explicitly irrational.
00:04:51.120 I mean, that's all.
00:04:51.800 That's exactly right.
00:04:52.880 Explicitly irrational.
00:04:53.820 And I think that's a fundamental difference, but that leads into your question is if the problem with speech or the idea of speech is that you could take this in many different directions, but one is if your principles are not rationally derivable, then you need some way to enforce them.
00:05:14.780 You need some mechanism there to pick up the slack between whatever the belief is
00:05:20.800 and the reason that you have for believing that.
00:05:23.600 And part of that is you just don't allow speech, you don't allow debates.
00:05:28.920 You consider debates platforming.
00:05:31.600 I think Judith Butler in an article she co-authored called it
00:05:35.140 non-consensual co-platforming.
00:05:38.740 It sounds like some kind of sexual assault.
00:05:41.760 Yeah, so the idea is that one person's article was put up beside another person's article, and one was a pro and one was a con, which is normally...
00:05:53.020 That's debate.
00:05:54.160 It's how we... I'm not saying it's the best way to do it, but that's the traditional way to do it.
00:06:00.000 There's no violence there, and in my talk, I spoke about how they conceptualize speech as violence, which is part of the problem.
00:06:08.900 And when my girlfriend does it, it's certainly...
00:06:11.080 Yeah, so they have mechanisms to enforce, but she doesn't actually mean speech is violence.
00:06:17.580 She just doesn't like him, that's all.
00:06:19.280 Okay, well then maybe there's a different recipe for that.
00:06:23.880 So they have all these mechanisms, they have bias response teams, and then these mechanisms act to limit not just freedom of speech, but what they're really limiting is cognitive liberty.
00:06:36.240 Yeah.
00:06:36.380 Yeah. And so the way that I view it, it comes down to how you are free to pursue your truth.
00:06:43.680 And in the social justice universe, there is a certain truth that is an ideologically driven truth.
00:06:50.440 It is a dogma. And that's just what it is. And anything that prevents you from thinking in one way.
00:06:58.240 So if we can limit speech so you won't think this way, that's a good thing.
00:07:02.400 But it's even more than that. And this is something you guys talked about.
00:07:05.180 I remember we had Douglas Murray on the show about two or three weeks ago, and he was talking about... I asked him, do you feel like we live in a society where it is now dangerous to say the truth?
00:07:15.320 But actually what you guys were talking about is that the whole point of social justice is to erode the concept of truth as an idea at all, to teach us all that there is no such thing as truth.
00:07:26.300 That's correct.
00:07:26.700 Truth is this invention of the patriarchy because this is who had power.
00:07:32.020 And truth is a negotiation or a piece of combat, essentially, between competing power structures.
00:07:39.780 So there is no truth anymore, according to these people.
00:07:43.200 It's not quite no truth.
00:07:44.980 It's that truth is relative to who's saying it and why they're saying it.
00:07:48.980 And so truth is a contrivance of power.
00:07:51.360 And truth is a political construct, is the way they see it.
00:07:54.100 So there's no truth in the sense that, you know, the correspondence theory of truth, as it's called, where what we call true somehow corresponds to reality.
00:08:01.320 There's no truth in that regard within this.
00:08:04.400 Or as, you know, some of the more moderate, even the old postmodern thinkers like Richard Rorty put it, the truth might be out there, but we can't get to it.
00:08:11.540 But it was kind of the idea.
00:08:12.880 We're stuck in our biases and everybody has biases and everybody's biases are basically equally bad.
00:08:18.460 So if there's truth out there, fine, but we can't get to it.
00:08:22.180 So, to give them as much credit as they deserve: the truth is inaccessible.
00:08:29.500 So all we have left are politics, which...
00:08:33.400 So if we run with our idea for a second, let me just finish this thing, Francis.
00:08:37.600 And I put this question to Helen when we had her on the show.
00:08:40.840 If we were living in Nazi Germany right now, there would be some scientific truths that would be the product of the power structures of the time, right?
00:08:52.360 We would believe scientifically that black people are inferior.
00:08:56.080 Phrenology.
00:08:56.760 Right.
00:08:57.160 So is there not some truth to this idea that what we believe to be true, including scientifically, is a product of the power structures of the day?
00:09:06.340 Is there not some truth to that?
00:09:07.540 So it's possible to call things scientific truths that don't qualify.
00:09:13.380 What you often find, and this is kind of a norm that's breaking down, and maybe it needs to break down, I don't know.
00:09:18.060 But I always think back at times like this to Carl Sagan, who's very famous.
00:09:21.760 He's a scientist and also a science popularizer.
00:09:25.120 And unfortunately, he's mostly known for being the latter.
00:09:28.440 Somebody who spoke about science very eloquently, wrote about science very beautifully.
00:09:32.520 And Carl was not particularly well-received for doing this.
00:09:35.960 So, within scientific communities, most scientists, I mean, you do have your arrogant git here and there, lots of them actually, but there's this overarching kind of norm or understanding that everything that it's doing is provisional.
00:09:51.620 And, I mean, when I was doing physics, physics is sort of special because physics is really hard.
00:09:56.000 Anything that's not, you know, 20 or 30 years old is kind of provisional, you know, maybe unless it has something like five- or six-sigma confirmation levels, extremely high confirmation levels.
00:10:07.260 So there's sort of this attitude that there's science, which is this very cautious, self-doubting process.
00:10:17.140 And then you have kind of the image of science, or almost like some people call it scientism, is like science going too far.
00:10:25.160 It's like what you would see out of a science journalist that barely understands the science.
00:10:28.380 And, you know, it's like they figure out some chemical can inhibit the growth of a cancer cell or something like that in a Petri dish.
00:10:35.300 And the cell happened to be for like a turtle or something like that.
00:10:39.100 And then they write, you know, like bananas cure cancer.
00:10:41.820 And it's like science says.
00:10:43.340 So the thing that gets called science may not actually be science.
00:10:48.520 Once you acknowledge that, then you can say, yeah, they had a point.
00:10:51.920 Things that were recognized as carrying scientific authority and promoted as such and still are promoted as such can be used as tools of power because, well, you'd be a fool to argue with science.
00:11:07.060 Therefore, it's right.
00:11:08.100 Therefore, you have to defer to that.
00:11:10.360 And that's actually kind of a power.
00:11:12.320 I think there are a few keys here.
00:11:16.740 One is the provisional nature, which Jim just talked about.
00:11:20.540 And the other is there's the science and then there's the policy implications from that science.
00:11:24.980 So there are just data points.
00:11:28.300 And I'm just talking to Michael Shermer about this, about how if you look at global climate change,
00:11:36.580 the vast majority of beliefs about global climate change fall along party lines.
00:11:42.260 But that shouldn't be the case.
00:11:44.360 Those are just factual.
00:11:45.460 What you do about it should maybe even, of course, fall along party lines.
00:11:50.360 So there are facts.
00:11:52.600 Those facts are just, you know, does, is global warming anthropogenic, for example?
00:11:59.600 It's a very complicated question, but we seem to have a convergence among people who have PhDs in climate science.
00:12:06.220 The answer to that is yes.
00:12:07.620 And what we do about it is a different question.
00:12:09.440 So, you know, the whole Nazi Germany thing, you take that as provisionally true, and then you ask what political inferences one can draw from that.
00:12:18.020 And that is a function of the pre-existing ideology and the framework in which that fits.
00:12:22.540 Right. So these postmodern thinkers, you know, they were looking accurately, well, not according to their critics at the time, their scholarship was a bit sloppy, apparently.
00:12:33.040 But we'll give them a lot of credit.
00:12:35.360 We'll say they were looking fairly accurately at ways that the history of how science was being used and, of course, finding wrong answers.
00:12:43.900 The idea with science, of course, is that you go from wrong answers to less wrong answers to less wrong answers.
00:12:49.200 It's not that you go to right answers.
00:12:50.740 It's less wrong all the way down.
00:12:53.500 And so this is happening.
00:12:55.000 But if the answer is just less wrong, it's still wrong.
00:12:57.420 And so you had, you know, these theorists like Foucault in particular.
00:13:00.600 And Michel Foucault was very interested in saying, oh, well, we thought about homosexuality
00:13:04.100 this way in this era, and that was wrong, and it caused all these problems.
00:13:07.700 And then we got a little bit more knowledge, and we thought about homosexuality in this
00:13:13.400 other way, or it was also madness.
00:13:15.900 He did homosexuality and madness; those were the two big things that he investigated like
00:13:20.540 that.
00:13:21.500 And now, you know, we thought about it this way, and it still caused all these problems.
00:13:24.980 Then we thought about it this way later, and it still caused all these problems.
00:13:27.920 But if you look, it's like the scale of the problems actually just keeps getting smaller.
00:13:32.480 It does cause problems, but the scale of the problems is shrinking.
00:13:35.980 But he doesn't say that.
00:13:36.780 He says, oh, well, there were problems, and then there were still problems, and then there were still problems.
00:13:39.920 So it's just a cynical read on the history of progress.
00:14:07.960 And touching on science as well,
00:14:17.300 you were talking in both your speeches about some very exciting developments that
00:14:22.400 the social justice movement has done in science, including woke engineering. Woke engineering, which
00:14:27.580 I found particularly brilliant, actually. Because driving over a bridge isn't exciting enough.
00:14:32.340 Yeah, you want to make it more exciting. Yeah, I was thinking of woke helicopters.
00:14:36.460 So what is woke science, and how has it actually infected the scientific community? So the idea
00:14:43.920 there is that, again, who is asking the questions, and for what purpose? So science asks questions,
00:14:49.780 tries to answer them. Who is asking those questions and for what purpose are they asking
00:14:53.800 them? That's what's at the heart of what's going on. And so it is a realignment of the
00:14:58.760 commitments of who's going to get to ask the questions and what the purposes are. And their
00:15:02.560 purposes are going to be social justice, emancipation, or liberation, which means following
00:15:06.960 their rules. So something like woke engineering would, first of all, prioritize changing the way
00:15:17.600 the engineering is taught, the way the engineering is engaged, and maybe there's something to this
00:15:21.460 in some degree, so that more women, particularly, would be interested in
00:15:27.440 becoming engineers. Is that not quite a sexist concept, this idea that you have to change
00:15:32.520 engineering to get women to be interested? Yeah, Helen Pluckrose is really, um, keen to say that
00:15:38.600 what this actually does is just continue the old sexist idea that men are the default human.
00:15:45.160 And if women aren't making choices in the same proportions as men, then something must be wrong.
00:15:51.820 That used to be the view. So this is really common: to get the same thing backwards and then kind of go the opposite direction with it.
00:15:58.580 It used to be that, you know, women didn't have access or something like that or were told that they couldn't.
00:16:06.400 You know, you're not good at math because you're a girl kind of attitude.
00:16:09.660 And now it's the system itself must be the thing that's keeping women out.
00:16:14.660 oh, we socialize women to think that they're bad at math, or we socialize women not to be
00:16:19.300 interested in working with bridges and helicopters or, you know, city streets or whatever it happens
00:16:25.760 to be. And if we just change how we educate people, especially women, but also boys so that
00:16:31.560 they don't get too encouraged, you know, however, then we can kind of twist the system so that more
00:16:37.180 and more women will become interested. See, in other words, they think that it's these weird
00:16:40.860 social forces. Systemic social forces. Like blank-slatism. Right. It can't possibly
00:16:44.880 be that there's some psychological difference on average, not in any specific
00:16:48.600 individual. Because they're all biology denialists. Right. Well, the evidence, I mean,
00:16:52.840 the Baron-Cohen studies on infants and stuff, and it's very clear
00:16:56.820 that there are fundamental... Yeah, that's if you value evidence. Yeah. So on average,
00:17:00.660 there seem to be some differences. And of course, now
00:17:04.720 we're talking in statistics, and people freak out when you talk in statistics because they see,
00:17:08.740 Oh, on average, it's different.
00:17:10.000 That means you think that they're different.
00:17:11.100 Well, not for most people.
00:17:12.360 Most people land in the overlapping zone in the middle.
00:17:15.780 When you get out on the ends, which is where you have, in particular, your high performers in any given field, who's going to become an engineer?
00:17:22.000 Are the people who are worst at it?
00:17:23.160 No, the people who are best at it.
00:17:24.480 And so the highest performing people with engineering talent are likely to become engineers in many cases.
00:17:30.540 And if it happens that the averages are even slightly different, the statistical bell curves, you're going to end up having more men, say, than women who are interested.
00:17:42.200 Same reason why prisons are filled with men, right?
00:17:44.760 The most violent people are men, therefore that's who fills the prisons, right?
00:17:48.360 And so this approach to thinking about the world that's taken up by social justice is what's called social
00:17:54.360 constructivist. And so what it believes is that that can't possibly be the result of anything
00:17:59.040 innate to biology in men and women; that has to be the result of how society treats men and women.
00:18:06.460 And so if we can just shatter society and rebuild it to where we don't treat men and women differently
00:18:12.200 in any way... Yeah, we should probably linger on that, because I'm sure you have a question, because
00:18:16.820 that's such a weird idea. It's properly weird. But I mean, it is. It's almost like... the term for
00:18:23.040 it is defamiliarization; they often use that concept. Helen describes it really well:
00:18:29.060 it's like an alien came down and has never seen a sexually reproducing species before, a sexually
00:18:35.960 dimorphic species before, and is trying to, without interacting and not reading, you know, the science
00:18:42.480 or anything, just watching from above, trying to figure out why this is happening. And all they can
00:18:47.360 see are the social interactions, and they say, oh well, boys tend to play with trucks and Legos or whatever,
00:18:53.300 and girls tend to play with these other, you know, dolls and whatever. And so clearly the parents are
00:18:58.120 giving them the dolls, and their grandparents are buying them this kind of toy versus that kind of
00:19:02.260 toy, and they're being told, this is what it's like to be a boy. You know, there's a viral video
00:19:06.780 around; if you haven't seen it, you should watch it like 30 times, where two little boys
00:19:10.340 are stepping on a trash can lid, and it pops up, hits them in the face, and they're
00:19:15.880 laughing and laughing. Everybody's like, boys! Yeah. So they would
00:19:21.300 look at this and say, well, someone has clearly taught those boys that violence is fun and funny,
00:19:28.060 and so we have to figure out who's teaching them that. And they think that it's not just their
00:19:33.280 parents or grandparents, but a thing that's pervasive through all of us. We're all complicit,
00:19:38.940 they say, in this system of socialization. And so basically, imagine that every possible behavior
00:19:45.200 you can imagine is somehow learned, by being taught it by society. There's nothing innate; we're the
00:19:53.100 only animal without any kind of instinct, right? And if you can somehow change the institutions
00:19:57.500 in society, you can create an equality of outcome. Right. And so, let me be fair, this
00:20:04.100 isn't totally crazy, because we are a species that's, like, hyper-social. Ultra-social,
00:20:11.520 I think is the phrase for it. So there is a lot of actual, you know, social influence in these
00:20:15.680 things. And if you look at history, it's very clear. You can mold and bend that. The thing is,
00:20:21.380 if there's an underlying biology, which shouldn't be a controversial point,
00:20:25.840 then there are also limits to how far that can be molded and bent. And that's where they
00:20:30.480 go off the rails. Almost everybody, when we get into this conversation, thinks, oh,
00:20:34.880 there's the biology people and then there's the social construction people. That's an incorrect
00:20:39.440 characterization. There are the social construction people, and then there are the biology-plus-social-
00:20:44.020 construction people, and then you have a handful of weirdos that are like, no, it's all biology.
00:20:48.200 A handful of weirdos that basically nobody respects. Almost everybody that's reasonable
00:20:52.720 on the topic recognizes that biology plays some influence. Covariant, yeah. And so
00:20:59.380 socialization plays some role, and those interact somehow. And then you have these social
00:21:03.420 constructionists who are like, nope. They're the weirdos on the other end. Yeah, the weirdos on the
00:21:07.020 other end, but there's not a few of them. And so we're talking about this, and, you know,
00:21:14.540 there are some merits to what they're saying, not a lot, but some merits. Why is it so
00:21:18.940 dangerous? Why can't we just say, look, you know, if these people have these ideas, that's fine,
00:21:23.660 they're more than welcome to them, and it's a broad church? You should read some of the accounts; they're
00:21:28.480 charming, um, online, where somebody's, you know, talking about the challenges and trials and
00:21:33.680 tribulations of raising a gender-neutral child. They decide that they're going to raise their
00:21:37.860 child without any reference to gender identity. They're, you know, not going
00:21:42.320 to put them in blue or pink or whatever, the different trucks and toys, and they're
00:21:45.640 like, why is my little girl always choosing dolls, no matter how many times I buy her a truck? Or why
00:21:51.000 is she putting a dress on the truck? You know, we try not to have gendered behavior, and she just
00:21:57.460 keeps choosing it. What do I do? She must be picking it up from society, everywhere. So when you start to...
00:22:04.120 How is it so dangerous? Um, I don't know how trying to force something on somebody that
00:22:14.740 doesn't match whatever their proclivities are doesn't result in something that's at least
00:22:21.220 suboptimal psychologically. Like, I grew up in the southeastern U.S., which is a little bit behind
00:22:26.340 the times. And so in the Bible, maybe you've, I don't know if you've read that book. It's an
00:22:30.720 interesting book. Hey man, I'm from Russia. We're still living in the biblical times.
00:22:36.620 Well, in the Bible, you may know that they mention, like, 27 times that being left-
00:22:40.440 handed is a sin. It's something of the devil; it's sinister, or whatever. So I actually know
00:22:44.720 one person who, because he was born left handed, had his left hand tied to his body until he was
00:22:50.660 something like seven years old for most things, especially when he was at school or having to do
00:22:54.460 writing tasks, so he had to learn how to use his right hand even though he's left-handed. And
00:22:58.580 there are enough people who have gone through this that, um, there have
00:23:03.820 been some studies on these people, and what you find out is that they're almost always,
00:23:07.280 as adults, ambidextrous, so they can use both hands very well. The thing is that all the evidence
00:23:14.340 we have points to the idea that they will never be as good with either of their hands as
00:23:19.800 they would have been with their left hand if you'd just leaned into, right, what their natural
00:23:24.700 proclivity was. And so there's the threat there of, um, well, you're limiting people. Limiting people,
00:23:32.300 yeah. There are further threats, of course, which is: if you think that the mechanism for
00:23:36.820 something is... if you have the wrong diagnosis for the disease, the prescription you get
00:23:43.940 is probably going to be wrong. So if you believe that, oh, my child
00:23:48.420 won't conform to the non-gender roles that I'm trying to force upon him or her, uh, what do I do?
00:23:54.960 It must be society. Now you have this mission: well, I have to break society, because it's perverting my
00:24:00.180 child. And so at that point you're potentially limiting your child, but you also get bent on this
00:24:07.740 mission that tries to break apart the structures of society. But if the structures of society have
00:24:12.720 some correlation to the same underlying problems that you're seeing in your
00:24:19.700 child, you can't break them without forcing it on people. And when you start forcing it on people,
00:24:24.380 all kinds of weird stuff happens. So let me ask you this, Peter, because I found it very interesting:
00:24:29.320 you talked about social justice as being a mind virus, or a cult. I think you also... Was it
00:24:35.220 you that said that? Oh, I was the virus guy. Oh, you were the virus guy. Yeah, we both said
00:24:39.120 the ideas before. It's a pretty obvious idea. And they said
00:24:43.040 that they were the virus. Oh, they said that. Okay, even more. So
00:24:47.080 these ideas are dangerous, right? Let's take it as given.
00:24:51.840 How has this mind virus spread
00:24:55.540 so quickly and so contagiously? Because
00:24:59.160 That is a great question. Well, this is the thing. I'm afraid no one really talks
00:25:03.200 about this. But is it that they found a way to weaponize our empathy? Is it, in a word,
00:25:08.640 something that Douglas Murray would say, that in a world without God we have found a new religion?
00:25:14.520 Is it this mystical thing that we have a craving for? Why is it so powerful in spreading?
00:25:22.340 That's a great question. I've thought about it quite a bit. Thank you.
00:25:26.000 One thing we should probably talk about is the substitution hypothesis. Why don't you take a stab
00:25:44.920 at that, and then I'll... Oh yeah, so that's the Douglas Murray idea. So the substitution hypothesis
00:25:48.580 is this idea that, um, you know, people have left traditional religion, so now they're looking
00:25:55.440 to get those psychological and social needs met by a non-traditional religion, in this case by
00:26:01.240 politics, for most people. I think that there's something to that, but not a lot,
00:26:06.580 because the way that normally gets approached is, well, people need meaning in their lives. Yeah,
00:26:10.860 they're looking for meaning, and they used to find meaning in God. Now they can't find meaning in God,
00:26:14.620 so they're going to find meaning in social activism, or whatever it happens to be, their
00:26:17.780 political stuff. And I don't want to say, well, that's not it, because it's one of these
00:26:22.460 things. There are multiple factors involved in why people turn to religions if you look at the
00:26:27.120 psychology of religion, and one of them is that, and that's certainly going to be an issue. I think
00:26:31.520 that you're going to find more of an issue there because as their economies go increasingly towards
00:26:36.760 service, those are less fulfilling jobs, and people are looking to find meaning not so much
00:26:42.480 because they lost God, but because they're losing meaning in their careers. But what I ultimately
00:26:47.020 think is that the root of this is the problem of feeling out of control, which is a result of our
00:26:51.900 polarization. And so we're so polarized, we're so afraid, Helen and I have called this existential
00:26:57.660 polarization, that we're so afraid that the other side will get power and misuse it,
00:27:02.620 that it'll cause existential threats. That's what we see with Extinction Rebellion going on right now,
00:27:06.740 it's extinction, no kidding, that we can't possibly let the other side have any power.
00:27:12.900 And that's a crisis of feeling out of control. And so when you feel a crisis of feeling out of
00:27:17.780 control, then you want to reassert control. Now, we can add in another element. Our fourth
00:27:23.800 companion, Mike Nayna, who's doing a film about us and our work. He has this really interesting
00:27:30.580 hypothesis right now that video clips work kind of like miracles in today's social media economy.
00:27:38.960 So, when did all of this blow up and spread so quickly? It was right after, really right after
00:27:43.460 the Black Lives Matter thing blew up. So you have these videos, a minute and a half long,
00:27:48.680 or a minute long, or 30 seconds long. You don't really know what happened. And then a narrative
00:27:52.800 is attached to it. Turns out later, apparently, that the narrative that was spun widely was not
00:27:57.620 correct. Not at all correct. But this spreads like wildfire. So now you have many people who
00:28:03.960 feel like society is out of control. They feel like there's probably a lot of race issues,
00:28:08.760 especially after coming out of the United States, Obama's presidency. The conservatives today in
00:28:12.740 the U.S. like to pretend that they weren't racist as hell during that, and I'm like, what? I was,
00:28:17.820 what? And some of them weren't, not all of them, of course, but holy shit, I mean, they were burning
00:28:23.240 black effigies, and, you know, I saw people with bumper stickers driving around where I live that
00:28:28.800 said stuff like "it's called the White House for a reason," and the Confederate flag, I mean, there's
00:28:35.060 Confederate flags everywhere, and they were really bad during, during Obama, uh, his presidency, all
00:28:40.740 through the South. So there's a lot of people who are aware that there's still this racist resentment,
00:28:45.140 and that's, that is, that's racist resentment, there's nothing else to call it. And then all of a sudden
00:28:49.960 you see racist cops, and that's the narrative that spun off the miracle story video, and boom, that
00:28:56.980 crisis of control just goes nuts to everybody. And all of a sudden you have these people, critical
00:29:03.640 race theorists, who since the 1980s have been writing down exactly the explanation for this
00:29:08.280 society as hidden racism. Racism never got better from Jim Crow era. It never got better even from
00:29:14.940 slavery. It just kept putting on a nicer and nicer white mask. They literally call it that.
00:29:21.640 So it's all still there. All the same resentment, all the same inequality, all the same racism as
00:29:27.120 during slavery. Jim Crow are still there. They just present themselves nicer. And then all of
00:29:32.200 a sudden you have proof. You have this miracle video that's proof of that theory. And people
00:29:38.120 very rapidly turned to the critical theory, to start, uh, people on the left in particular, who
00:29:44.240 were most worried about these issues, to start getting an explanation. And so these critical
00:29:48.800 race theorists, and in gender you have, you know, the equivalents with queer theory, post-colonial
00:29:54.760 theory, and just right across the board, all of these different, different approaches, they all of a
00:29:58.740 sudden had the answers that a panicking left was looking for. And then you'd had the condition,
00:30:05.700 which was the shooting in Ferguson that was then cast as a racist event. Of course, you also
00:30:11.600 had the Trayvon Martin shooting, which, I don't know if that was racist or not, that one's really
00:30:16.840 morally ugly. Um, so you have this kind of series of events, and so you have lots of people all of
00:30:23.080 a sudden saying, we finally have an explanation. And genuinely, I think a lot of the people on, on
00:30:27.940 the left in general, and certainly all of the social justice reactionary type people, um, firmly
00:30:33.820 believe that things like critical race theory and the rest of this social justice scholarship
00:30:37.700 is like having discovered the germ theory of disease, but
00:30:41.980 for bigotry. And so they think we now have the diagnosis
00:30:45.800 and this is the thing we have to fix. And in many ways, it's a simple
00:30:50.100 solution to an incredibly complex problem in society. It's an over-simple
00:30:53.760 solution. Yeah. And that's why it won't work. I mean, it
00:30:57.920 completely gets, again, it gets everything backwards.
00:31:01.480 It starts by, well, we're going to reify differences in identity. We're going to make differences in identity matter more. We're going to make it more difficult for people of different races to talk to one another and find common ground, because we have to be constantly aware of the eggshells or landmines you might step on when you interact across any line.
00:31:20.060 Because, as Robin DiAngelo tells us, racial dynamics are always present between people of different races.
00:31:28.000 And the question is not, did racism manifest in the situation?
00:31:31.860 It's how did racism manifest in the situation?
00:31:35.280 It's assumed that the racism's there.
00:31:38.060 So if we speak across any racial line, the racism's there.
00:31:40.980 And it's just a matter of time or a matter of digging to discover how it actually is.
00:31:45.580 Every time you do a joke about me, it's because I'm brown.
00:31:48.040 Well, that is actually true.
00:31:49.580 Yeah. Now we have, we have created a generation of people, or almost a generation of people, for whom everything is a problem. Everything. There's racism in every interaction. There's racism in every film. There's racism. And if you don't see it, it's because you're a racist.
00:32:06.500 I'll give you an idea of how ridiculous this is.
00:32:08.920 Last night during one of the panels, now I'm going to admit this and somebody's going to go looking for it and I'm just totally, I'm canceled.
00:32:14.100 I was sitting there kind of like this with my hands between my legs and my index finger and thumb were close together while the other three fingers were resting on the chair.
00:32:23.780 And it suddenly became, I was just kind of like out of it.
00:32:25.920 It's like a long day.
00:32:27.840 And I suddenly realized, because I'd been holding my own finger like this, and I just kind of let them fall apart.
00:32:31.860 And I suddenly realized, oh, no, I'm doing the OK symbol between my legs at this.
00:32:39.220 Which is now the white nationalist symbol because 4chan.
00:32:41.500 Because 4chan, because it's a stupid thing.
00:32:43.680 And so it's like I accidentally ended up in that somehow.
00:32:47.480 And then I realized I was in it.
00:32:48.960 And then I was like, we're in the middle of a panel.
00:32:52.360 And all I can think about is, shit, how do I undo this smoothly?
00:32:58.440 What's going to happen if somebody notices this?
00:33:00.880 and it's like, it's taken up, it's ridiculous, and now I'm, I'm devoting cognitive energy to this
00:33:08.500 that I didn't even have to give to anything at that point. And so now multiply, not me, but multiply
00:33:14.760 this by, you know, everybody that's constantly got to be aware of frankly stupid bullshit all the
00:33:19.800 time. Yeah. I can't negotiate these kinds of things. I want to add to that, because there was a video,
00:33:24.520 I think it was an MSNBC guy, he's a black guy.
00:33:27.420 And someone made the symbol behind him.
00:33:30.120 And even if it was manufactured by 4chan, what a horrible thing for that guy.
00:33:39.240 He's just a guy who happens to be black, who some, can I swear on your show?
00:33:45.180 Yeah, I already did.
00:33:46.040 fucking asshole, some racist prick is making a symbol, even though these other idiots made
00:33:53.980 the whole thing up in the meeting, like how horrible, he's just a guy doing his job and
00:33:58.280 then he has to deal with all this bullshit, so even though, and this is part of the problem,
00:34:04.440 even though they manufactured it, if you're constantly looking for things like that, excuse
00:34:11.260 me, what happens is, when you're in a situation in which someone does it intentionally and there's a
00:34:17.520 black guy there, like, that's obviously a problem. Well, this is, that's the thing, though, it's not
00:34:23.180 obviously a problem, and here's what I mean, and this is a, this does create a real problem. How do
00:34:28.280 you know that that guy wasn't just cashing in on the joke, knowing he's going to go viral? Maybe he's
00:34:32.780 not. No, no, no, listen. But even if he was, because this is really freaking important, because if it's
00:34:37.600 just a troll, and everybody wants to be a troll now because you can go viral, yeah, now we've lost
00:34:44.080 the ability to tell, is this guy being racist? So now we've, we're losing the ability to detect
00:34:50.560 genuine racism where it occurs, because we have to pay attention to stupid shit. Yeah, that's
00:34:55.440 certainly true, and from the book, that's certainly true. And how horrible for that guy. I mean, what's
00:35:02.720 horrible for the guy is that he has, it's, it's not even the symbol, or even if necessarily the guy
00:35:07.440 was a racist, it's that he no longer can tell, and he has to go through this whole ordeal. Now he's,
00:35:12.080 now the subject, this now goes viral over something ridiculous with his face in it, and so now he has
00:35:18.080 to deal with, was it racist, wasn't it racist, people talking, oh my gosh, people saying, oh, that
00:35:22.840 was horrible, just like Pete was doing just now, and then other people saying, oh, downplay it, kind
00:35:26.580 of like I was doing just now. And it's like everything just becomes morally ambiguous,
00:35:30.620 and something that should be relatively clear and easy to pick out, is this something racist
00:35:35.360 happening or not? Sometimes it's a little ambiguous, but it's usually fairly easy, it used to be fairly
00:35:40.260 easy to tell a lot of times. Now it's kind of, as you were mentioning, the dog whistles, now it's like
00:35:44.780 this universe of ambiguous dog whistles. You don't know if somebody's pulling a joke, you don't know if
00:35:48.920 somebody's just being stupid, you don't know if it's genuinely racist, you don't know if it's, like, a
00:35:52.600 racist joke at the same time. It's impossible to tell. So when you destroy the ability to detect
00:35:58.760 genuine racism and you make it all ambiguous, you start attacking the wrong targets, you start going
00:36:03.360 after the wrong things, you start just wasting resources. If the guy was a troll and they start
00:36:07.740 wasting resources trying to fix his racism, he's having a laugh, he's just laughing the whole way
00:36:12.040 through it, and it made him famous, maybe infamous. But at the same time, the whole thing is a tragedy.
00:36:17.520 The thing that bothers me about it, on a broader level as well, at the level of society, is
00:36:22.820 this hyper-racialization of society forces us to think in terms of race all the time. It's the
00:36:28.480 That's the exact opposite of what you're supposed to do.
00:36:30.100 That's exactly right.
00:36:30.640 And I catch myself doing it.
00:36:32.420 I was driving to one of our interviews a couple of weeks back, and there was a black girl waiting to cross the road.
00:36:39.380 And I literally found myself thinking, I better stop and let her pass because she's black.
00:36:44.420 Correct.
00:36:45.220 Now, there is no more racist thinking than that, that you have to treat people differently because of their skin color.
00:36:51.620 And we've imbibed it.
00:36:53.580 We've imbibed it because of all this bullshit.
00:36:55.640 Yeah, that's exactly right.
00:36:56.580 They get everything backwards.
00:36:57.900 And so the real threat there, Helen talks about this frequently, is that we're going to see as a pushback continues to mount, because it is already mounting, as it continues to mount, we run a real risk now of actually resurrecting, and this is their biggest fear, resurrecting genuine racism.
00:37:14.000 That's correct.
00:37:14.440 I mean, I didn't get to it. I wanted to talk about it, but the conversation got away from us last night on one of the panels, so I can mention it here.
00:37:21.480 There's a great talk I watched a long time ago on YouTube by Dan Ariely, and he's talking about, frankly, it's a psychological mechanism, and he's talking about why does Catholic confession exist?
00:37:34.240 What's it for?
00:37:34.920 And so psychologically, what they figured out is probably the root of that is that people sort of
00:37:41.020 have a, on their moral calibration, they kind of have a, kind of a "fuck it" line. You know, like, I'm
00:37:46.240 trying to be a good person, but it's hard, I'm trying to be a good person. You can think about
00:37:48.920 this with people trying to stick to their diet. I'm eating clean, I'm eating clean, I'm eating good,
00:37:52.960 I'm not eating crap. And then you have a couple days where you travel, and you had too many beers,
00:37:57.640 and you're eating cake, and, wow, that ribeye looked good, or whatever it happens to be, depending
00:38:01.340 on what you're into, and you overeat and you eat badly, and you try to get back on track, and
00:38:06.420 eventually you cross the line and you just go bad enough, and then you're off your diet completely.
00:38:10.400 And this turns out to be with all moral behaviors. You have a line where, oh well, I'm not a good
00:38:14.900 person anymore, so fuck it, I'll just not be a good person. And what Ariely was discussing was that
00:38:19.640 the discovery was, like, confession gets you in front of a mechanism that resets you above the
00:38:23.940 line, so then you're trying to be good again. Wow, which is really interesting. Yeah, well, here's the
00:38:28.880 thing. You're a racist, you're a racist, you're a racist, you have to think in racial ways. Sooner
00:38:32.420 or later people are like, well, fuck it, I'm a racist. And then where are they going? Yeah, and there's no
00:38:37.740 mechanism to come back to the other side. Yeah, there is no... Right, the mechanism they offer is
00:38:42.020 awful. So most, it's not like, you know, the Catholic confession, I grew up Catholic, it's not that
00:38:47.160 painful. You go in there, you talk to the priest, hopefully nothing weird happens, and then, you know,
00:38:52.760 You're like, forgive me.
00:38:54.280 It's funny that someone else makes a pedophile joke from the show, Francis.
00:38:58.160 Forgive me, Father, for I have sinned.
00:38:59.680 It has been like two years since I was here.
00:39:03.180 You're supposed to be every two weeks.
00:39:04.500 And it's like, I know I'm going to have to say a rosary for that.
00:39:08.360 Shit.
00:39:09.100 And so, you know, but you do the whole thing.
00:39:10.840 It's not really that difficult of a process.
00:39:12.520 And boom, you're back over the line.
00:39:13.960 With this, it's you have to sign up for a lifelong commitment to soul searching and political activism.
00:39:18.360 And if you step out of line so much as ever, we're going to remind you of what a terrible person you are forever.
00:39:23.400 So you have no way to get back over the line.
00:39:24.940 So now we're in this real risk where, you know, traditional gender norms are starting to become cool again on certain parts of it.
00:39:31.520 That's horrifying.
00:39:32.480 And then genuine racism.
00:39:33.960 You're going to get people below that fuck it line.
00:39:35.520 You're like, oh, yeah, well, let's do the symbol everywhere, you know, whatever it is.
00:39:38.680 Let's just go nuts.
00:39:40.420 And why can't I say the N-word to whoever I want?
00:39:43.560 And it's just you really don't want to break whatever societal detente exists that keeps most people above those lines trying to do it right, even if they don't always measure up.
00:39:56.500 You really don't want lots of people to start falling below that line or decide that that line doesn't matter anymore.
00:40:01.500 Fuck it.
00:40:01.880 Okay, I'm a racist.
00:40:02.760 Let's go.
00:40:03.000 So you're talking about forgiveness, really.
00:40:04.880 That's what you're talking about.
00:40:05.880 Yeah, there's a mechanism for redemption or forgiveness.
00:40:08.840 And those things, I mean, if you can cancel somebody for tweets they made 20 years ago.
00:40:13.340 Even when they were a teen.
00:40:14.380 As a teenager.
00:40:15.200 Like they wrote some stupid thing when they were 16 years old, and now they're in like their 30s or whatever.
00:40:20.100 I guess they couldn't have written a tweet that long ago.
00:40:22.200 But, you know, you get this.
00:40:23.220 They uncover some dire injury or something.
00:40:25.660 Oh, he wrote this or that.
00:40:27.040 If you can cancel somebody for that, you deny their ability to grow, their ability to change, their ability to try to get on the right track and stay above the bad line.
00:40:36.520 And you offer them what forgiveness is in that.
00:40:40.060 There's no, oh, well, you messed up because you were a teenager.
00:40:42.020 We're all stupid when we're teenagers.
00:40:43.100 We all did stupid stuff when we were teenagers, whatever it is.
00:40:45.240 Even me saying we all did stupid stuff when we were teenagers,
00:40:47.600 somebody will clip this out of context possibly and say,
00:40:50.460 oh, this is a guy with something to hide.
00:40:52.400 Well, if you're going to do that, please make it go viral for our show.
00:40:54.940 Thank you very much.
00:40:55.960 But I was going to say as well, James, you know, there's, you know,
00:40:59.980 what you're talking about, but it's the impact on your mental health as well.
00:41:04.020 If you were going to look through every single thing that you were saying
00:41:07.960 and analyze it at every single moment, I mean,
00:41:10.500 That's just a fast track to getting anxiety and an inevitable breakdown.
00:41:14.080 And not only that, that assumes that there's some kind of moral stability.
00:41:18.660 Right.
00:41:19.000 So for all you know, the moral norm could change.
00:41:22.020 I mean, the moral norms have changed.
00:41:23.520 I'm 53.
00:41:24.220 The moral norms have changed very radically.
00:41:26.860 I'll give you an example.
00:41:27.920 I was taught when I was a kid, and I've asked other people my age
00:41:31.660 and similar experiences, that when you shook hands with a woman,
00:41:34.920 you're supposed to shake.
00:41:36.160 You're just supposed to offer your hand like this.
00:41:37.860 All right.
00:41:38.080 And you weren't supposed to, you know, with a guy that was a different kind of.
00:41:43.280 Now, today, that could possibly be interpreted as sexist.
00:41:46.900 Right. Right. And so because you're treating people, you know, even something as little as a handshake.
00:41:52.400 But that could be today. The zeitgeist is race and gender and things with identity-level salience.
00:41:59.400 But maybe tomorrow it's something we can't imagine.
00:42:01.660 My own prediction for this has been that we'll start, if a vestige of cancel culture exists, the next moral frontier will be factory farming.
00:42:13.100 Oh, that guy ate factory-farmed meat, cancel him.
00:42:15.520 Oh, that woman, that person.
00:42:17.200 So the moral norms always change.
00:42:19.180 So as I said before about tying the left hand down, there's something that's incredibly psychologically alienating, because you're not allowed to speak, as in your first question about censorship.
00:42:29.520 and thus, as I said in my talk, you can't form those authentic relationships
00:42:32.980 because you don't know what anybody means and everybody's walking on eggshells
00:42:36.280 and you're not looking at people as people.
00:42:38.080 Oh, there's a black woman across the street.
00:42:39.740 I have to let her go.
00:42:40.720 And I catch myself doing that too and I have to just rub that out
00:42:45.000 because I'm in that milieu where I'm just constantly bombarded by that.
00:42:49.220 Like this is how we have to look.
00:42:52.580 This is how we have to think of people.
00:42:54.440 It's a really toxic way to think.
00:42:57.580 instead of just a person. Oh, it's a black person. Oh, there's a gay person. Right, and you talked about
00:43:02.040 the, you talked about the way that that's anxiety-inducing, and, uh, yeah, I don't know if that's by
00:43:08.840 design or not, but we were just talking a few moments ago about how you can have a, I think
00:43:13.320 that the root of this is people feeling a crisis of control. Yeah, they feel out of control, which
00:43:17.720 will give them anxiety, and now they feel even more out of control, and, um, that becomes kind of a, you
00:43:24.560 know, a problem that just kind of keeps going. From the perspective of having studied religious
00:43:28.820 psychology before getting into all of this business, that's what I really got into before
00:43:32.940 I got swept up in woke land, I was learning that one of the, I got really interested in
00:43:39.660 conversion mechanisms. How do people convert to a religion or a cult? How do they get induced?
00:43:44.600 And the mechanism is nearly always the same, whether you do it with a child or whether you
00:43:48.260 do it with some adult. With children, it's obviously much easier is that you induce a
00:43:52.500 state of vulnerability, and then you give them a pathway out of the vulnerability through the
00:43:55.400 ideology or the cult. So here, you give them identities to lean into. Oh, well, you have
00:44:00.820 anxiety. Well, you're an anxious person. You're not a person with anxiety. That's something that
00:44:05.080 could be treated. They'd be medicalizing it. That's the Foucaultian idea that you can't do
00:44:09.460 that. So you're an anxious person. So now this is who you are. You are anxiety. And so you give them
00:44:33.020 And then, you know, the way to do it is to lean into that victimhood, just lean into it and come with us and be activists.
00:44:39.460 And we can fight this problem that makes you an anxious person, even though that's the thing that's causing them to be anxious.
00:44:44.940 So you give somebody a, you induce vulnerability, you manipulate their vulnerability, make them feel usually bad or guilty or something, or afraid, fear of hell, for example, you might think about.
00:45:00.440 And then you give them a path out of that.
00:45:02.720 Oh, well, isn't hell scary, blah, blah, blah.
00:45:04.540 Well, if you just believe in Jesus, then you don't go to hell.
00:45:07.200 Free pass away from that.
00:45:08.620 He died for you.
00:45:09.360 You don't have to go.
00:45:09.880 And so it's a way to create – anytime you have this kind of emotional manipulation that taps into some path of vulnerability, oh, you've been complicit in racism.
00:45:21.140 Don't you feel guilty about that, because you're white and that means you're comfortable and you don't have to think about what it's like to be a racial minority and all the crap they have to put up with.
00:45:28.640 Well, did you know that if you take up anti-racism, you can do something about that.
00:45:33.540 It's not even – you're not allowed to do it to make yourself feel better.
00:45:36.560 So it's like constantly self-flagellating.
00:45:38.640 That'd be positioning yourself or myself, I guess, as a good white if I did it.
00:45:43.960 And that's verboten.
00:45:43.960 You can't do that.
00:45:44.720 But you give people, you say, oh, well, you can end the system that's creating this problem.
00:45:49.940 All you have to do is sign up for a lifelong commitment of soul searching for how you're
00:45:52.740 complicit in it and spread it to all your friends.
00:45:55.920 I think it makes apostasy more difficult.
00:45:58.680 Yeah.
00:45:59.200 Oh, sure.
00:46:00.060 Well, we've got about 15 minutes left.
00:46:02.460 And I want to end on a more positive note if we can.
00:46:05.920 This wasn't positive?
00:46:08.680 You guys haven't spent like two years living in critical theories.
00:46:13.080 But we need someone to dive into it and take the hit for the rest of us to maybe look for a way out.
00:46:18.620 And you guys have just written a book, which is How to Have Impossible Conversations.
00:46:21.540 That's right.
00:46:23.260 I've had a good look at it.
00:46:24.560 It's a very interesting book.
00:46:25.660 So chart for us some of the paths out of this.
00:46:30.100 because I put it to you that the vast majority of the public intuitively agree with what we're
00:46:36.520 talking about here. Yeah. The exhausted majority. Yeah. I talked about it in my comedy show in
00:46:41.120 Edinburgh Festival this year. I talked about the fact that most people are terrified to say what
00:46:44.700 they think and they are. I mean, that is a fact, right? So most people get this, but I feel like
00:46:50.800 on an individual level, none of us really know what to do about it. How do we get out of it? I
00:46:55.720 mean, the show we try and do is about trying to create something that gives an out to all these
00:47:00.240 feelings, and a solution. So tell us, how do we get out? Well, the solution that I advocated yesterday
00:47:05.540 is hearkening back to the Greek value of parrhesia, which means speaking truth in the face of danger,
00:47:11.260 speaking boldly and speaking bluntly and speaking honestly. But before we can even do that, we need to
00:47:17.680 think about what it, how to speak to people again, because we're losing that. Because, as you said, I
00:47:24.880 agree completely, we're walking on eggshells, people are afraid to say what's on their mind, so
00:47:30.160 they need some basic tools for how to have a conversation. So the, for example, really listen
00:47:37.800 to someone, try to figure out what they're saying. If you don't understand, ask. Place the burden of
00:47:42.980 understanding on yourself. Don't say, oh, that wasn't clear. Say, oh, I'm not sure I understand that, to
00:47:47.440 place the burden of understanding on yourself. But hold on, Peter, sorry to interrupt, but I just, I'm
00:47:51.820 listening through the prism of an ordinary person, and I'm going, well, look at you, you did your
00:47:57.400 parrhesia, you spoke the truth, look what happened to me, right? Look what happened to me, look what
00:48:02.260 happened to us. We get called Nazis because we have conversations with people like you, right? Uh,
00:48:06.520 we're alt-right adjacent, all this crap, right? How many people, we had a guy who was fired from his
00:48:12.540 job in a supermarket because he shared a comedy routine about religion from Billy Connolly, one
00:48:18.620 of the greatest comedians in the history of this country, on his Facebook. He got fired. Okay, so, so,
00:48:23.640 right, so don't tell me about parrhesia. Sorry, let's take a step back. So Socrates had parrhesia.
00:48:30.140 They killed him. The state killed him. I mentioned in my talk, Malala, uh, she was shot in the head. There
00:48:35.560 is, and you're not telling this, I'm being honest, I'm being honest about the nature of the solution.
00:48:40.900 Yeah, there's no wand that you can give, and we're going to get out of this mess. This is a catastrophe.
00:48:45.520 Yeah. And so, you know, if you look at what they do to us, and the fact that, that you guys are painted as
00:48:52.460 Nazis for talking to us, I'm Jewish, by the way. Yeah, well, that's my favorite. Yeah, that doesn't
00:48:59.400 protect you, right? Because you've seen what happens to, to Bret Weinstein and Ben Shapiro, etc. That's
00:49:05.260 not a prophylactic against me. It's good that you don't deny your, your Nazi nature, because
00:49:09.320 that would be Nazi fragility.
00:49:13.740 Nazi fragility.
00:49:17.740 But we have to be honest that if you want to dig your way out of this, you're going to
00:49:22.620 pay a price for this.
00:49:23.600 There's just no way.
00:49:25.140 You can't sugarcoat this to people.
00:49:26.840 In fact, that itself is parrhesia.
00:49:28.520 Why don't you get in on that?
00:49:29.360 Because you're not selling it, man.
00:49:30.980 That's horrifying.
00:49:33.160 So that was kind of my point.
00:49:34.780 So go get shot, guys.
00:49:35.900 The argument there, before you jump in, is, 400 years ago,
00:49:40.680 the four of us, and there's a few more of us in this room right now, who would,
00:49:44.340 we'd all be burned at the stake, right? Right. Right now, our tongues pulled out, right? You've
00:49:48.380 lost your job. I have my job, but I've lost considerable, right? We, we, you know, the whole
00:49:55.320 comedy industry in this country hates us because we are right-wing or whatever, right? You know, all
00:50:00.100 of us have paid some price, but we're not burned at the stake. No. We haven't been kicked off YouTube.
00:50:05.720 We can still make a living from doing this, you know. Isn't, isn't, isn't it kind of all right? I
00:50:11.100 mean, we can, it is kind of all right, actually. So people reach out to me with this all the time,
00:50:14.820 like, what do I do? I'm terrified that my friends and my family, that, I hear that, more, I hear, my,
00:50:19.480 my, the two things I hear a lot are, my job, and my friends and my family are going to get rid of me,
00:50:23.300 one or the other. And what do I do? Well, you mentioned 400 years ago people being burned
00:50:27.920 at the stake. You know how that ended is people getting together and having conversations like
00:50:31.900 this, but not with cameras. They got together actually in small groups, often in bars where
00:50:36.880 there was a lot of ambient noise or pubs or whatever, and spoke in low voices. And so what
00:50:41.000 I tell people, thinking back to that, they created the Reformation. They created liberal movements,
00:50:48.100 liberalism itself. And out of those conversations, they couldn't be overheard because it was
00:50:53.360 treason to speak against the monarchy or the episcopate. And so I tell people all the time,
00:51:02.340 Reach out to people.
00:51:04.080 Pay attention.
00:51:04.740 I mean, we all have social media right now unless you get banned from it.
00:51:07.460 And pay attention.
00:51:08.500 If you're small, you're not going to get banned probably unless you say something pretty ridiculous.
00:51:12.700 Pay attention.
00:51:13.600 Find people who are speaking out against this.
00:51:16.620 Follow them.
00:51:17.860 Pay attention to who interacts with them.
00:51:19.560 Reach out to those people and start making your own social network.
00:51:22.880 Speak in direct message behind the scenes.
00:51:25.740 Get on WhatsApp, which is encrypted.
00:51:28.320 Meet up with each other, whether it's virtually or whether it's in person.
00:51:30.760 and start having the conversation so that you can find your feet, understand what you need
00:51:35.320 to understand, and realize that you're not alone, you're not crazy, and there's
00:51:40.020 lots of us. That's a more hopeful message. Once you've found your feet, then you
00:51:45.420 can start to speak with that courage. Yes, because you know you have a fallback, you have a group
00:51:50.140 who's going to back you up. You know you're coming from a place where you've kind of figured
00:51:53.360 some things out and you're not just shooting in the dark. So I tell people to do that all the
00:51:58.020 time. The thing is, there is safety in numbers. If more and more people start
00:52:04.680 speaking up, then it's not as risky for any of us who are speaking up. And there really is something
00:52:11.320 to that. You just have to know whose voice not to listen to. Yeah, you know, you just have to know.
00:52:19.780 We are obviously not Nazis. We're not even Nazi-adjacent. It's ludicrous. And for someone to
00:52:27.760 call you that, they kind of opt out of an adult conversation. So there's no reason we ought to
00:52:33.440 listen to them. I mean, there's already a norm on Twitter that started that if somebody, I mean,
00:52:36.840 I don't necessarily advocate this, of course, but it's got some weight behind it. If somebody
00:52:44.500 has pronouns in their bio, you pretty much disregard whatever they say.
00:52:47.320 Instantly. Every time I see some vitriol, some particular nastiness, click on the profile,
00:52:53.260 invariably pronouns. And it doesn't have to be that. I mean,
00:52:56.100 And anybody who slings abuse at you on Twitter, do what I do.
00:52:58.940 Mute them right away.
00:52:59.940 They gave abuse or they give abuse to other people.
00:53:02.660 Like if you respond to me and somebody abuses you, that person's there.
00:53:05.900 I'm never going to see that person again.
00:53:07.480 I don't care if they're like you disagreed with me and they're on my side.
00:53:10.340 If they sling abuse, mute, gone.
00:53:12.800 And if enough people are doing this long enough, it will create a norm to where slinging abuse isn't getting people what they want.
00:53:19.700 And they'll burn themselves out and it just won't go anywhere.
00:53:23.260 So I really encourage liberal use of that as well.
00:53:26.700 And then you just have to endure the fact that people are going to say, oh, you don't have a conversation, blah, blah, blah.
00:53:31.440 No, I won't have conversations in the wrong forum with people whose goal is to sling abuse at me.
00:53:36.800 That's not a conversation.
00:53:37.820 It's something different.
00:53:38.360 And I'll add to that.
00:53:39.300 My golden rule is if someone has harassed me and there's a considerable number of people or is currently harassing me, I will not have a conversation with them.
00:53:49.140 That's just not an option anymore.
00:53:50.560 But we talked about the road out of this just to kind of finish when we say, find your, I said, find your feet, you know, get with people and find your feet.
00:53:56.320 There are three things really that have to be done.
00:53:57.980 That's what we tried to do yesterday.
00:54:00.080 And each of us kind of spoke to it differently.
00:54:02.520 But we've got to expose what's going on here.
00:54:05.460 That's where I was talking about the picture on the social justice box, the diversity and treat everybody with respect doesn't match the contents, which is this kind of totalitarian ideology.
00:54:16.000 They're different.
00:54:16.840 So expose that.
00:54:17.700 Make it clear how.
00:54:19.020 Go read some of their stuff and then share it in context.
00:54:21.580 It's horrifying.
00:54:22.620 People don't, like you said, most people won't agree with it once they see what it really is.
00:54:26.360 You just have to get past that branding, which is really good.
00:54:29.300 Second is if you have the time and energy, maybe not everybody does, try to explain it.
00:54:33.780 If you guys have a show, have people on who can explain it.
00:54:35.740 If you can read a few of their books or key papers or articles that people are writing on outlets like Areo Magazine or Quillette or whatever it happens to be,
00:54:45.520 learn to explain what's going on with some of this. Again, people don't really agree with it. And
00:54:50.320 then, if you can, start looking for alternatives. If you care about social justice... I get
00:54:53.840 asked this a lot: I care about social justice, but I don't like social justice warriors. What do I do?
00:54:57.820 Look for and start articulating other ways. How can you do the right thing? Like you're saying,
00:55:04.080 it's like, maybe we shouldn't be thinking about race all the time. Maybe this whole thing,
00:55:08.460 putting race first, maybe we should ratchet that back down. Maybe we should ignore people who tell
00:55:12.960 us that we have to pay attention to that all the time. And so you want to look for, but we still
00:55:17.520 have to care and take people seriously. So maybe the lesson out of that is let's listen a little
00:55:21.200 more than we did before. Let's take it a little more seriously and not be so quick to dismiss if
00:55:25.260 that's still a problem. I'm not sure if that's even still a huge problem. Maybe it is, but fine.
00:55:30.540 It's good advice in any case. And then let's investigate, but not this listen and believe
00:55:34.680 crap, not shut up your story's been told. No, we'll listen. We'll check it out and you can help
00:55:40.120 us check it out, and it's all fine. So look for alternatives that aren't just coming from this
00:55:46.700 place of ideology, that is, the liberal approaches. So that's what Helen was talking about. Like,
00:55:51.340 let's just find ways to do the thing. Let's take their complaints seriously, figure out what's
00:55:56.620 real inside their complaints, and let's address those things and steal the whole thing from them.
00:56:00.120 And what advice would you give to somebody who's working in a particular industry or a company that
00:56:06.340 is overtly woke? That's a great question. So I don't ever
00:56:11.060 advocate anybody stick out their neck. If you have the ability to do it, what
00:56:20.440 I often tell people when they ask me that, "Oh, I have to go to this diversity training, what
00:56:23.480 should I do?", I say: take notes. Take very detailed notes and understand exactly what
00:56:29.940 they're saying. Try to get a very clear picture. If you try to challenge it on the spot, they're
00:56:33.760 just going to make life difficult for you. It was no mistake that one of the papers we started to
00:56:37.120 write but didn't have time to finish was on things like: if you go to diversity training, we'll give
00:56:41.240 you a survey. Did you like diversity training? If you say no, you have to take more diversity training.
00:56:44.740 Or you get sent to diversity training, and he gets called in as a character witness, and you're like,
00:56:50.000 "Do you think he needed diversity training? He's a great guy, it was misunderstood." Now you need
00:56:54.420 diversity training, you know? So if you call it out on the spot, once this stuff's been
00:56:58.240 institutionalized, you're just really causing yourself trouble. There's nothing you can do
00:57:01.580 that way in a work environment. Start to understand it. Go out for drinks with your colleagues who
00:57:06.700 also don't like it. Start talking, because they won't talk either. I have another idea for you
00:57:11.400 guys. As you know, someone recently bought all our video equipment for us. That is actually someone
00:57:15.600 who works in that very woke industry. So if you want to change things, check your number. She's
00:57:20.100 got a Patreon page, is all I'm saying. Yeah, I'm saying that too. Yeah, just give them money. Sounds
00:57:25.540 good to us. Yeah, that will solve the problem. The magic wand is to give them money. But no, um,
00:57:31.280 seriously, start talking to each other. And if you challenge it directly, it's the wrong way. Start
00:57:35.940 understanding it so you can explain it, you can expose it, and then you can start articulating
00:57:40.640 alternatives. Yeah, and we just wrote a piece about how important it is to listen. And it really does
00:57:44.980 start with listening, because people will either say, well, they don't actually believe that. Well,
00:57:49.120 actually, they do. And you should listen to them. And our piece argued that you should
00:57:54.520 listen to them and believe them, not believe the content of their speech, but believe that they
00:57:58.340 believe it. And then once you do that, you can say, okay, now I have a much more clear
00:58:02.400 understanding of this. In a professional context, the idea is the same when you
00:58:06.540 have to go to these trainings. Just take really good notes and listen and really understand
00:58:10.020 what it is. What is this ideology? Once you really get it, then
00:58:14.480 you can start speaking up because you're coming from a place where you can't
00:58:18.300 easily be shut down or refuted. So start paying attention. Look into what they're saying.
00:58:22.200 Oh, well, this comes from this book.
00:58:23.200 Go read that book.
00:58:24.000 See what's there.
00:58:24.760 Well, that's been one of the great things of doing the show for us is we've educated ourselves on these issues.
00:58:29.080 So when we get invited to write or speak about it, we are very comfortable that we actually know what we're talking about.
00:58:34.520 Right.
00:58:34.900 That's key.
00:58:35.480 Speaking of which, did you like our book, How to Have Impossible Conversations?
00:58:38.740 Yes, we did.
00:58:39.440 Very much so.
00:58:40.260 Yeah.
00:58:40.560 And we recommend that people get it.
00:58:43.180 Yeah, definitely give them money.
00:58:46.080 We set all that up in advance, just recommending each other's stuff.
00:58:49.100 Listen, guys, the one question in the spirit of educating ourselves and having an opportunity for our audience to do the same is we always like to give our guests an opportunity to talk about something that's left field.
00:59:00.420 So the last question we ask, and we'll ask it of each of you separately, is what is the one thing that we're not talking about that we should be talking about?
00:59:08.700 In the area of social justice?
00:59:09.820 In any area whatsoever.
00:59:11.680 Anything in the whole world.
00:59:12.820 Anything in the whole world.
00:59:16.000 I mean, some people are talking about the thing that I would say.
00:59:19.100 And this isn't what anybody would expect.
00:59:22.080 And nuclear power, there should be a lot more conversation on nuclear power.
00:59:26.140 I hear all this stuff about renewables and green energy and, you know, great, you know.
00:59:31.480 But there's so much resistance, especially from environmentalists, which is so strange, to nuclear power.
00:59:38.160 I think we need lots more open-ended, clear discussion that defers to the science.
00:59:43.240 You've got to, again, educate yourself a little bit.
00:59:45.640 But people need to get more clear on actually in general how safe and clean nuclear power is and how it actually is something implementable on large scale rather quickly.
00:59:57.600 And if we're looking at environmental problems and taking them seriously, as I think we should, it's something that just keeps falling in this crack because everybody doesn't like to talk about nuclear.
01:00:07.380 So you probably didn't expect that. We are not talking enough about nuclear power, and we should be.
01:00:11.920 Well, it's one of the safest forms of energy.
01:00:14.300 And I say this as someone whose wife grew up right next to Chernobyl.
01:00:17.040 So, yeah, it is very, very safe.
01:00:18.880 What about you, Peter?
01:00:19.920 I'll throw you a curveball.
01:00:21.600 One of the best things I've done as a parent is I've placed my kids in an immersion program at the age of three.
01:00:30.880 A language immersion program.
01:00:32.180 A language immersion program.
01:00:33.860 Thanks.
01:00:34.360 Mandarin.
01:00:35.280 So if someone has a child, when I was in school, you got it in 7th, 8th, and 9th grade.
01:00:41.220 That's way too late. It's got to be as young as possible, so that they don't even know that they're...
01:00:45.860 they don't even have an awareness that they're learning it. So that's been a gift that's paid
01:00:50.980 a remarkable dividend since I've done that. People thought I was insane too, but that's one thing I
01:00:56.020 would suggest if someone's a parent: put their kid as early as possible in an immersion
01:01:00.020 program, language immersion. And you've picked the right language as well. You're really
01:01:03.380 future-proofing there. Well, you know, it was a good decision. Let me add on my nuclear
01:01:10.100 thing. Nuclear boats. I don't remember what the number is of big tanker ships or carrier ships,
01:01:17.280 not like aircraft carriers. Those are already nuclear. But cargo ships and cruise liners and
01:01:22.740 things like that. I don't remember what the number is of the biggest ships that we run,
01:01:26.440 but it's something like the seven biggest ships that we run put out an amount of exhaust pollution
01:01:32.820 from burning diesel equal to all the cars on the planet, or something like that. But boats are
01:01:38.180 the easiest thing to put nuclear reactors in to drive them, pretty much anything. We already have
01:01:41.860 all the technology. So not just even nuclear power, but nuclear-powered vehicles. All of our
01:01:47.560 big-scale sea transport can be done with nuclear power and cut down. If you're worried about
01:01:53.700 atmospheric emissions, there are actually workable solutions if you just go nuclear.
01:02:00.480 You'll never win a Nobel Peace Prize with that kind of talk.
01:02:03.380 I should win a Nobel Peace Prize for my role
01:02:06.620 in exposing the crap in the universities.
01:02:08.440 Well, there you go, the campaign begins
01:02:10.640 here. Well, listen, guys,
01:02:12.480 thank you so much for coming on, we really appreciate
01:02:14.540 your time and your thoughts
01:02:15.820 follow both Peter and James on Twitter
01:02:18.280 they're very good and if you do follow them
01:02:19.940 it's a very mentally healthy
01:02:22.160 contribution to your intake if you follow
01:02:24.400 you guys, so we'll put the links in below
01:02:26.260 get the book, How to Have Impossible
01:02:28.300 Conversations, it's absolutely fantastic
01:02:30.000 and we will see you again in a week's time
01:02:32.000 See you later, guys.