TRIGGERnometry - February 07, 2024


Why Critical Thinking is Dead - Peter Boghossian


Episode Stats

Length

1 hour and 8 minutes

Words per Minute

176.6

Word Count

12,027

Sentence Count

861

Misogynist Sentences

13

Hate Speech Sentences

19
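The stats above can be reproduced from the raw transcript text. A minimal sketch follows; the exact tokenization and sentence-segmentation rules used to build the table are not stated, so the naive rules here are assumptions:

```python
import re

def episode_stats(transcript: str, duration_minutes: float) -> dict:
    """Compute basic transcript statistics: word count, sentence count,
    and words per minute. Splitting rules are simplistic placeholders."""
    # Word count: whitespace-separated tokens.
    words = transcript.split()
    # Sentence count: naive split on runs of ., !, ? terminators;
    # the pipeline that produced the table may have used a smarter tokenizer.
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    return {
        "word_count": len(words),
        "sentence_count": len(sentences),
        "words_per_minute": len(words) / duration_minutes,
    }
```

With the full transcript and an exact duration of roughly 68.1 minutes, 12,027 words works out to about the 176.6 words per minute shown above.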


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Peter Boghossian is back and better than ever! He's back with a new podcast called "Street Epistemology" in which he challenges random people on the street to think about the way they're thinking and try to make them change their mind. In this episode, we talk about the importance of critical thinking and why we should all be taught critical thinking.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
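The per-sentence misogyny and hate-speech counts in the stats table could be produced along these lines. This is a sketch, not the actual pipeline: the label strings and the aggregation rule are assumptions about how each checkpoint's outputs were tallied, and the commented-out classifier calls require `transformers` plus a download of the named models:

```python
def count_positive(labels, positive_label):
    """Tally sentences whose predicted label matches the target class."""
    return sum(1 for label in labels if label == positive_label)

# Sketch of the classification step (requires the `transformers` package
# and downloads of the checkpoints named above; the "misogynist" label
# string below is an assumption about that model's output schema):
#
# from transformers import pipeline
# misogyny_clf = pipeline("text-classification",
#                         model="MilaNLProc/bert-base-uncased-ear-misogyny")
# labels = [misogyny_clf(sentence)[0]["label"] for sentence in sentences]
# misogynist_sentences = count_positive(labels, "misogynist")
```

The same tally would be repeated with facebook/roberta-hate-speech-dynabench-r4-target to get the hate-speech sentence count.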
00:00:00.720 So passive failure, we fail to teach kids critical thinking.
00:00:04.880 Active failure is when you teach them stuff that is actually wrong and makes things worse.
00:00:09.120 Well, that's the situation we're in now.
00:00:11.640 So if you indoctrinate the teachers, you don't need to indoctrinate the kids because the teachers will take care of it for you.
00:00:16.740 When you don't learn the other side of the issue, you have an artificial confidence in the things that you believe.
00:00:21.700 The core bedrock beliefs of this ideology, there's no other way to say it.
00:00:27.880 They're so fucking stupid.
00:00:29.100 They're so fucking idiotic.
00:00:32.020 Peter Boghossian, welcome back.
00:00:33.800 It's been so long since we had you on the show.
00:00:35.660 Thank you.
00:00:35.920 It's a pleasure.
00:00:36.620 It's great to have you back.
00:00:38.000 And since we last had you on, you've been doing these amazing what you call street epistemology videos, which sounds fancy.
00:00:44.760 What you do is you get a bunch of random people on the street and you get them to start thinking about the way they're thinking.
00:00:52.380 Correct.
00:00:52.680 We did it earlier with you today in the street here in Westminster.
00:00:55.500 Right in front of the Supreme Court.
00:00:56.680 Right in front of the Supreme Court.
00:00:57.680 Supreme Court is kind of irrelevant in this country.
00:01:00.260 It's not like in the U.S.
00:01:01.200 Yeah.
00:01:01.320 So we don't give a shit.
00:01:03.060 But street epistemology, very, very interesting.
00:01:06.280 Some of your videos have gone super viral with it because it's fascinating to see how people respond to being made to think, being challenged on how they think, being encouraged to change their views about things.
00:01:18.240 We're presented with evidence.
00:01:19.260 What have you learned from doing those videos?
00:01:23.540 Oh, boy.
00:01:24.240 Reid and I have gone all around the world and we've done this.
00:01:28.380 We're about to actually to go to Taiwan and New Zealand.
00:01:31.040 We've gone to Puerto Rico.
00:01:32.220 We've been to Australia.
00:01:35.780 Here in London, we did them.
00:01:37.540 Well, one of the things we've learned is that people in London are pretty chill about it.
00:01:41.700 They're pretty relaxed.
00:01:42.740 In the United States, particularly on college campuses, students are looking for a reason to be offended.
00:01:47.740 They want to be offended.
00:01:49.000 I've learned so we have lines.
00:01:52.940 We put lines on the pavement, strongly disagree, disagree, slightly disagree, neutral, and on the other side.
00:01:59.940 And one of the biggest things I've learned is that kind of reinforced what I had already been thinking before is that people will stand on a line not based upon the evidence they have, but based upon some moral reason.
00:02:16.420 So good people stand on this line, I'm standing on this line, I'm a good person.
00:02:22.100 Good people should stand on this line, I should stand on this line, I am standing on this line, I'm a good person.
00:02:28.440 That's something.
00:02:29.320 So people kind of outsource their thinking to their tribe.
00:02:33.140 Is that kind of what you're saying?
00:02:35.500 Well, they find their tribe based upon a moral reason to believe in it.
00:02:39.880 Oh, I see.
00:02:40.440 So people will think about what it means to them to be a good person, and then they'll align their confidence in a belief.
00:02:48.960 So we'll throw out a claim, trans women should be in women's sports, the two-state solution is the best solution, and they'll stand on the line that they think a moral person should stand on.
00:03:00.680 And it's a fascinating experience.
00:03:04.840 So here's some of the other things that I've learned.
00:03:07.400 One of the things that we did in our street epistemology exercise, when you have a claim, if somebody goes to the strongly disagree and someone goes to the strongly agree,
00:03:16.740 we will ask them to write down the reasons that they think that the other person has for holding that position.
00:03:24.260 And we'll ask the person like we do, what's your best reason for believing that?
00:03:27.220 What do you think his best reason is for abortion to be legal in the first trimester?
00:03:37.160 What is, now Frances, you have to listen to see if she's correct.
00:03:40.860 Take a guess what you think his best reason is.
00:03:44.180 I think it's going to be something about how bodily autonomy for women.
00:03:50.600 Bodily autonomy for women.
00:03:52.860 Is it bodily autonomy for women?
00:03:55.040 Don't say what it is.
00:03:56.160 Sort of, yeah, kind of.
00:03:58.960 Sort of or yeah?
00:04:01.440 Is it bodily autonomy?
00:04:03.060 Yes, it is bodily autonomy, yeah.
00:04:04.580 Okay.
00:04:05.680 Don't show him.
00:04:06.900 What do you think her best reason is for being against abortion in the first trimester?
00:04:12.940 It's a killing of innocent life.
00:04:15.380 Is he correct?
00:04:19.060 To a certain extent.
00:04:20.300 Yeah, I think that's pretty, okay.
00:04:26.980 So this is excellent.
00:04:28.900 So you understand his argument.
00:04:31.700 So you turn around, show it to him.
00:04:35.720 You want, not agree with it, but you understand it.
00:04:38.200 I understand it, yeah.
00:04:39.060 And do you understand, not agree?
00:04:41.320 Okay, I'm going to grab these.
00:04:47.980 What would it take you?
00:04:49.940 Yeah.
00:04:50.680 What would either she have to tell you, or you have to learn to move you to the agree?
00:04:56.880 I'm going to ask you the same question, but to the disagree.
00:05:01.120 Okay.
00:05:01.640 One, Matt.
00:05:02.460 Just one.
00:05:02.980 I don't think I can, and because I believe, sorry, I forgot your name.
00:05:07.560 Isabel.
00:05:07.860 I believe Isabel's position is fundamentally an absolutist position, which is, and maybe
00:05:14.880 I don't know this, but which is that it should always be, if a child, if there is, if a child,
00:05:21.800 if there is a child in the womb or however you want to classify it, it should always be
00:05:26.200 brought to, you know, there should be, there is no excuse or no reason for abortion ever.
00:05:31.960 So, so, what is preventing you from moving to the agree?
00:05:37.940 What, what, what piece of information, and again, I'm going to ask you the exact same
00:05:42.060 questions.
00:05:44.040 So, okay, I'm going to you first, I'm going to you first.
00:05:48.340 So what would you have to learn to move you to the disagree?
00:05:52.300 Nothing, because I've already looked into this subject and I know human life begins at
00:05:55.520 conception and I know that every human life has value.
00:05:58.440 Okay.
00:05:58.700 What would you say to someone on that end who is unwilling to move?
00:06:05.040 I think it would be worth looking at the impact of abortion on women, because I think often
00:06:10.020 that position, and I don't mean to make obviously any judgments about, sorry, I've forgotten
00:06:13.760 your name again.
00:06:14.300 Frances.
00:06:14.840 Frances's position, I only know the statement he's written on that board, but I know a lot
00:06:19.600 of people who I have discussed abortion with who say that they support abortion think
00:06:24.100 they're doing it to help women.
00:06:25.600 And I work very closely with post-abortive organizations, and I realize that actually
00:06:30.540 abortion doesn't really help women.
00:06:33.400 It doesn't solve the initial problems that they've got.
00:06:36.320 It can actually create much larger problems.
00:06:38.720 To take the life of your child doesn't solve your problems.
00:06:42.060 If somebody is standing on that line and they are unwilling to move, does it strike you that
00:06:49.200 the person, now we don't forget about Frances, does it strike you that the person is an extremist?
00:06:54.380 Er, you'd have to quantify what exactly...
00:07:01.120 Unwilling to change one's mind, independent of the evidence.
00:07:06.340 Er, I don't know that he's unwilling to change it, dependent on evidence.
00:07:09.980 Are you willing to change your mind on the basis of evidence to move to the agree?
00:07:14.860 Yeah, there's always.
00:07:16.360 If there is something new that has been presented which I have not thought about, or I have not
00:07:20.160 considered, then absolutely.
00:07:21.740 So you just don't know what the evidence is?
00:07:24.180 Yeah, so maybe, but I have never heard a good reason why a 14-year-old who is a victim of
00:07:29.660 rape and has a child should be forced to carry that child, should be forced to give birth
00:07:34.980 to that child.
00:07:36.800 Are you willing to change your mind about abortion in the first trimester if a new piece of evidence
00:07:43.880 comes along that challenges a fundamental assumption you have?
00:07:46.980 That would be the equivalent of asking me whether I would change my mind about killing another
00:07:52.100 human, innocent human being, because that's essentially what it is.
00:07:55.100 Nothing can make me think killing human beings is a way to solve our problems.
00:07:58.800 What if someone could show you that life didn't begin at conception?
00:08:02.080 Would you change your mind?
00:08:03.340 But I know it does.
00:08:04.540 I've looked into it long and hard, and I quite definitely know it does.
00:08:07.800 When we did it in the Israeli-Palestinian conflict, people couldn't guess the best reason that
00:08:16.420 they couldn't guess any reason that the people had for it.
00:08:19.660 So there's such a disconnect.
00:08:22.720 It's not even that it's incommensurable.
00:08:24.800 It's that people don't even understand why someone would hold the belief.
00:08:28.800 And what do you think this tells us about society more broadly?
00:08:33.260 That the educational system has failed people.
00:08:36.700 That it's complicated because we know that there are simple mechanisms that you can use
00:08:45.160 to help people think more clearly and critically.
00:08:47.500 But we're not using those methods.
00:08:49.000 And they're free.
00:08:49.780 I mean, anybody can access them.
00:08:51.060 They're totally free, and we're not using them.
00:08:52.700 So I think that some of it is, at least some of it is, that the institutions have become
00:08:57.640 ideology mills to replicate the dominant moral ideology.
00:09:01.180 And do you not think as well that it might be that people simply don't think about these
00:09:05.980 things in any great depth or detail because they've got lives, et cetera, et cetera?
00:09:10.440 If you don't think about something in any detail, then you should stand on the neutral line.
00:09:14.980 But they don't.
00:09:15.920 They go to one extreme.
00:09:17.300 They'll go and strongly agree, or even the agree.
00:09:19.540 So they haven't appropriately calibrated the confidence in their belief to the evidence.
00:09:25.840 So they've extended the confidence in the belief beyond the warrant of the evidence.
00:09:29.920 Do you think as well that the more contentious the issue, the more that you are pulled to
00:09:35.700 the strongly agree or the strongly disagree?
00:09:38.440 Because the more contentious an issue, the more likely it is that you are seen to be immoral
00:09:45.060 if you take the strongly agree or strongly disagree position.
00:09:48.140 No, I don't think so.
00:09:50.440 But what I do think is that there's intrinsic selection bias.
00:09:53.500 So only certain people who hold certain beliefs will come to the line in the first place.
00:09:58.680 What I would love, if I had a magic wand and could wave it, is I would love to be situated
00:10:03.680 such that I could have all the people come to the lines who would normally never go.
00:10:08.440 You know, we thought about giving people Starbucks cards or something, but that would...
00:10:12.640 That also creates perverse incentives.
00:10:14.520 Yeah, and you don't want that.
00:10:17.480 But there is no way that I know of, maybe, I think it's probably impossible to get people
00:10:22.060 to play the game, the exercise, who would not play it.
00:10:26.220 That's the real problem, is that some people won't engage with ideas.
00:10:30.940 And have you found any particular differences?
00:10:33.480 Because you talk about the education system failing.
00:10:36.120 Yeah.
00:10:36.340 And it can fail in different ways, can't it?
00:10:39.660 One, I think what you initially were talking about is that it fails to teach people critical
00:10:44.660 thinking.
00:10:45.520 Is that kind of what you meant?
00:10:46.640 Or did I misread that?
00:10:47.560 Yeah, it fails to...
00:10:49.020 So let's disambiguate that.
00:10:50.420 Let's break that down.
00:10:51.200 It fails to do a few things.
00:10:53.500 It's kind of okay at teaching them skill sets, inference, evaluation, explanation.
00:11:00.820 It's an abject failure at teaching them dispositions.
00:11:05.000 Tell us what all those things mean, because many people are not.
00:11:07.620 So the famous...
00:11:11.140 There's an American Philosophical Association, Delphi Report.
00:11:14.100 It's kind of like a...
00:11:16.140 It's not without its problems.
00:11:17.540 It was from 1990, 91.
00:11:18.880 It's kind of like the definition of the ideal critical thinker.
00:11:21.900 And yeah, it's kind of written the way a committee would write it.
00:11:24.160 But it's the best we have today.
00:11:26.500 It's not perfect.
00:11:27.380 It's probably revisable at some point.
00:11:29.640 But it breaks critical thinking down into a skill set, into an attitudinal disposition.
00:11:35.000 The skill set, it's fairly easy to teach people basic skills in about 20 hours.
00:11:39.880 If you really cram it through, you could probably teach it in 15, maybe, maybe 10, depending
00:11:45.040 on the people, the students.
00:11:46.140 But the dispositions are the hard things to cultivate.
00:11:49.980 For example, being trustful of reason, being willing to revise your beliefs.
00:11:54.820 I think that's the most important one, personally.
00:11:56.680 Are you willing to revise your beliefs?
00:11:59.700 And the problem is that if you have...
00:12:03.040 It's easy to test.
00:12:04.300 One of the reasons we teach the skill set in school is because it's easy to test.
00:12:08.200 You know, you can...
00:12:09.700 It lends itself to multiple choice.
00:12:11.620 But you can't really test a disposition.
00:12:13.380 You can test if someone can identify a flaw in reasoning or a fallacy.
00:12:17.640 But the dispositions are difficult.
00:12:19.800 But here is the thing that I find absolutely fascinating.
00:12:23.500 If you have the skill set but don't have the disposition, you're actually going to make
00:12:29.140 your epistemic situation or your knowledge situation worse.
00:12:32.180 Yes.
00:12:32.420 So the most important thing is the disposition.
00:12:37.080 And there are things like the California critical thinking disposition inventory where we can
00:12:42.200 kind of test that.
00:12:43.420 But people can cheat it and game it and lie about it.
00:12:45.920 But you can't really test a...
00:12:48.360 You can't really give someone a disposition.
00:12:50.160 You can tell them why they should be willing to revise their belief.
00:12:52.780 One of the ways that I always try to get around that is if someone says...
00:12:58.360 If I ask them a question and they say, I don't know, like we did today.
00:13:00.840 I say, that's a great answer.
00:13:02.000 It's always a good answer.
00:13:03.180 I don't know is a great answer.
00:13:04.380 And when I'm asked a question, if I don't know, I will both change my mat or say, I don't know.
00:13:10.560 So I'll try to model those behaviors for other people.
00:13:12.840 But the dispositional aspect of critical thinking is absolutely indispensable to everything,
00:13:19.500 to participation in life in a civil civic society, to having a reflective inner life.
00:13:25.780 There's literally no domain in which that does not improve one's situation.
00:13:29.140 And how do you teach people critical thinking?
00:13:31.360 You said in 15 hours.
00:13:32.740 Is there some key concepts?
00:13:34.880 It's pretty much in every textbook you can think of how to be less wrong more often.
00:13:41.140 So how to identify a fallacy.
00:13:44.000 So look, we have two things that we want to do.
00:13:46.360 We want to believe more true things and believe fewer false things.
00:13:50.400 But often those are in conflict with one another.
00:13:53.760 So what we do is we...
00:13:56.280 One of the things you can do is teach this is a fallacy.
00:13:58.940 And there are names of fallacies, ad hominem, reductio ad absurdum, etc.
00:14:02.940 There are names of fallacies, but people will forget the names.
00:14:06.400 But what I always try to do when I was teaching is give them a concept of, like, can you explain
00:14:12.280 in plain English what is the problem?
00:14:14.260 Like, what's the problem here?
00:14:15.860 And they'll remember that sometimes for a lifetime if that's bundled with the idea of
00:14:23.060 an appeal to self-interest.
00:14:24.700 Like, your life will be better.
00:14:26.040 Your human flourishing, your community, your relationships, everything will be better if
00:14:30.060 you can be less wrong more often.
00:14:31.680 Coming back to the education question that I asked you, what I was trying to get at is
00:14:36.420 there's a kind of negative failure and then there's a passive failure and active failure.
00:14:41.520 So passive failure, we fail to teach kids critical thinking.
00:14:45.380 Active failure is when you teach them stuff that is actually wrong and makes things worse.
00:14:49.980 Well, that's the situation we're in now.
00:14:52.100 That's why I was asking.
00:14:53.180 Yeah.
00:14:53.460 So we're in a situation in which we have wide-scale organizational capture that's in
00:14:57.520 service to a moral orthodoxy, a dominant ideology.
00:15:00.640 It goes by many names.
00:15:03.800 Maajid Nawaz, regressive leftism, successor ideology, Helen Pluckrose from the island
00:15:08.400 here, critical social justice.
00:15:10.560 But the idea is that there's a suite of propositions to which one must assent to be educated.
00:15:19.360 And the goal of the educator is to help students develop what the Brazilian educator,
00:15:23.400 Paulo Freire, calls a critical consciousness.
00:15:25.180 So you want to develop the tools by which you can find oppression everywhere.
00:15:31.380 Racial oppression, gender oppression, sexual oppression.
00:15:35.020 It's swayed, moved off somewhat from economic or maybe considerably.
00:15:41.260 It's bartered really identity politics for economic variables.
00:15:44.580 And how does that show up when you do these experiments with college students?
00:15:50.080 Oh, it's completely conspicuous.
00:15:51.840 So here's one of the rubs.
00:15:58.820 When you're, particularly an educational institution, is held hostage to an ideology, almost invariably,
00:16:08.180 though not invariably, Christian apologetics is the exception to this.
00:16:12.360 You don't learn the other side of the issue.
00:16:15.360 And so when you don't learn the other side of the issue, you have an artificial confidence
00:16:19.580 in the things that you believe.
00:16:20.760 You inflate your confidence well beyond the warrant of the evidence.
00:16:24.880 And that's coterminous with the idea that you have to make up that slack somehow.
00:16:32.120 You have to be offended.
00:16:33.380 You have to have a microaggression.
00:16:35.120 You have to complain to an authority figure.
00:16:37.180 But it's a catastrophic failure for what we're doing as educators to children.
00:16:44.900 We're teaching them, we're kind of giving them this critical consciousness so that they
00:16:49.480 can identify and remediate oppression.
00:16:51.980 But what we're not doing is helping them value what's true.
00:16:56.620 We're not telling them, well, here are other alternatives or here are other points of view.
00:17:02.360 And you really have to have that component so you can, your epistemology should always
00:17:08.800 precede everything else.
00:17:09.840 Like why you're doing something, how you know what you know is always the first question.
00:17:14.400 And then once you figure that out, then you can go on to think about how you should,
00:17:18.920 Socrates' question, how should we live our lives?
00:17:20.960 What's a moral life?
00:17:21.860 Are some types of lives better than others?
00:17:23.620 Can a man be unjust towards himself?
00:17:25.360 That's Kant's question.
00:17:26.260 But once you figure out how you know what you know, other things within that epistemological
00:17:33.540 framework will follow your metaphysics.
00:17:36.200 In other words, what you think is this the natural realm all there is?
00:17:39.480 Or is there a supernatural realm?
00:17:41.600 Or what happens to me after I die?
00:17:43.340 Or will I go to hell?
00:17:44.680 Or how should I be kind?
00:17:46.480 Or what role should compassion play in my life?
00:17:49.500 So Pete, how do these kids engage with the exercise?
00:17:53.540 You know, the ones who come from these captured institutions, as you put it.
00:18:00.180 How do they, my first reaction is somewhat unreflectively.
00:18:04.480 Again, I do want to stress that there is a selection bias for people who come to the line.
00:18:08.880 So like today, those people who are at the trans rally.
00:18:11.560 Well, tell people.
00:18:12.080 Tell people in more detail.
00:18:13.080 Okay, so we...
00:18:15.640 We might include a clip, but we might not be able to.
00:18:17.660 Okay, so our friend Luke went out and tried to find people.
00:18:21.780 And there was a trans rally around the corner.
00:18:24.660 And people were in these full masks, if memory serves me correctly.
00:18:28.740 And I invited them to do the exercise with us.
00:18:32.560 And I said, you will get more people, correct me if I'm wrong on this.
00:18:36.200 You'll get more people in 20 or 30 minutes doing this than you would if you stood on that
00:18:41.920 street corner literally for a decade, eight hours a day.
00:18:44.580 It's YouTube.
00:18:45.260 It's just, it's a vehicle.
00:18:46.560 Now we, from our combined platforms, we do that.
00:18:49.160 And they were having absolutely none of it.
00:18:51.320 They were having none of it.
00:18:52.440 And one of the reasons, they were having none of it for multiple reasons.
00:18:55.420 But one of the reasons they were having none of it is because they have not been Socratically
00:19:01.820 trained.
00:19:02.560 They have not been trained to defend their ideas.
00:19:07.220 They've only been, it's kind of like a Catholic catechism or Marxist ideological training.
00:19:12.160 They've been taught that certain propositions are true.
00:19:15.740 And they haven't really even been taught how to defend those.
00:19:19.260 So they don't know the other side of a position.
00:19:22.240 And if you don't know the other side of a position, you can't argue, you can't rebut the
00:19:27.960 arguments that you know, right?
00:19:30.020 It's not, it's literally not possible.
00:19:31.520 That, coupled with the idea that they believe that intrinsic in dialogue itself is some kind
00:19:37.300 of hierarchical power relations which support white dominance or white supremacy or something
00:19:45.240 that's completely insane.
00:19:47.220 And do they say this when they're actually doing the exercise, when they come to their
00:19:53.560 justifications or they try to rebut a particular argument?
00:19:57.140 Is it as avert as that?
00:19:58.360 Yeah, some people will either be explicit about it or some people will beat around the
00:20:03.200 bush about it.
00:20:04.560 But there's no question at all that most of these people, I would say the vast majority,
00:20:11.820 and I want to specify this because this is important, of US college age kids, most of
00:20:19.000 whom are on college campuses, do not know the other side of the position.
00:20:23.020 They have never even heard it.
00:20:24.420 So even in an educational institution as prestigious as Harvard or?
00:20:29.840 No, especially in an educational institution as prestigious as Harvard.
00:20:34.580 So they have never been told the other side of the argument.
00:20:38.180 They've never been shown it.
00:20:39.340 They've never had it explained to them.
00:20:40.760 Correct.
00:20:41.260 And in fact, if you look at Greg Lukianoff's FIRE, that Harvard has among the worst rankings
00:20:49.720 for college free speech.
00:20:50.860 I just wrote, I wrote the foreword about a year ago to Rajiv Malhotra's book.
00:20:54.640 He's an Indian public intellectual.
00:20:56.420 And he really, really explains that in that book, the, if you will, to borrow a term for
00:21:05.820 hegemonic, the mono thinking, the mono culture that's created.
00:21:10.680 And what, I mean, to me, that is awful because what you're essentially saying is these institutions
00:21:17.060 aren't fit for purpose because the purpose of this institution is to educate, but they're
00:21:21.920 not doing that.
00:21:22.780 They're indoctrinating.
00:21:23.820 Well, okay.
00:21:24.220 So fit for purpose, they're fit for the purpose of the ideologues who run the institutions.
00:21:28.960 So for their purpose, they're discharging exactly what their mission should be.
00:21:32.920 But the problem with it is that once you veer from the truth, once you stop valuing what's
00:21:39.460 true, you're like the horse that rides off furiously in all directions.
00:21:43.240 Any conclusion that you'd want to forward is itself arbitrary.
00:21:47.020 It's the result of either some kind of capricious external force or some kind of, like my buddy
Faisal Al Mutar was telling me that many countries, Qatar and other countries in the Gulf are funneling
00:22:01.520 in BLM propaganda and funding it and pushing it into the United States.
00:22:05.440 So you're, you become held hostage to exogenous or external forces.
00:22:11.080 And what happens when people like that do participate in these experiments?
00:22:16.780 Because I, you're very good at being neutral, but you're also very good at pursuing the logical
00:22:22.060 conclusions of what people say and presenting them with challenges and so on.
00:22:26.200 And I imagine that quite quickly that worldview starts to unravel when it's challenged by someone
00:22:31.680 like you.
00:22:32.400 How do they react once they're forced to confront the side of the argument that they've never
00:22:37.560 been exposed to?
00:22:38.460 Well, let's take a step back from that and say how ought a sane person to react?
00:22:45.280 A sane, rational person who wants, who's self-interested, that's the other thing, self-interested,
00:22:51.200 would want to align their beliefs with reality.
00:22:53.780 They would want to tether the things that they believe to, they'd want to have some kind
00:22:58.080 of linguistic hook that hooks to something that's real.
00:23:00.900 That's not true though.
00:23:01.800 No.
00:23:02.120 That's not true at all.
00:23:03.080 Because if you're self-interested, you're interested in the opinion of your tribe because
00:23:06.900 your self-interest is linked to how your tribe sees you, right?
00:23:10.460 Yeah.
00:23:11.160 So that's, okay, so let's, okay, so we can, let's go down that.
00:23:15.140 This is a rabbit hole, but let's go down it.
00:23:16.760 As I wrote in my last book, How to Have Impossible Conversations, the only thing people want more
00:23:22.300 than to know what's true is to belong.
00:23:24.140 Yes.
00:23:24.420 So when you start questioning someone's belonging, in other words, their sense of tribe, to use
00:23:30.720 that word, one of the consequences of that is that you could sever their relationship
00:23:35.720 with their families, their friends.
00:23:37.240 You know, Jehovah's Witnesses call it de-fellowshipping.
00:23:40.640 You know, Scientologists call it, you know, they call them squirrels, people who don't
00:23:45.640 agree.
00:23:46.340 Actually, there's even something, Klingons have something from structure.
00:23:50.420 But anyway, but the idea is that belonging is a hook.
00:23:54.780 So they could do these street epistemological exercises and then go back into their communities
00:24:00.460 and ask their, quote unquote, you know, church leaders, whatever leaders, whatever particular
00:24:05.040 flavor or sect they happen to belong.
00:24:08.860 What is the answer to this?
00:24:10.300 Like, what should I say?
00:24:11.520 My argument in that case would be even then they'd be better off than that they didn't
00:24:15.500 participate because they would know that there's some alternative.
00:24:18.120 They would know that there's something out there that may give them pause or may give them
00:24:23.380 the gift of doubt.
00:24:24.560 See, I would push back again on that.
00:24:26.220 I think they actually might know that.
00:24:28.560 But they are so inculcated within this particular type of thinking and this group.
00:24:34.560 And particularly, Pete, when you're young, your friends are everything.
00:24:39.820 Right.
00:24:40.060 So you're going to be, what, a 23-year-old kid who's going to stand up against his mates
00:24:44.500 and go, what, I disagree with all of you and the way you see the world and the way we've
00:24:48.760 brought up to believe that this particular side is evil and white supremacy.
00:24:52.780 That is a very rare type of person who's going to do that.
00:24:54.720 And I will answer that as well, Pete, that we've had people on the show where we've asked
00:24:59.220 them a question and I can literally see them in their head going to where the question
00:25:07.020 is obviously going to take them and get uncomfortable and shut down the conversation.
00:25:13.000 This is, I'm so happy you guys said this is so important.
00:25:16.600 If we want to bring about the kind of society that leads to our flourishing, the older I
00:25:23.840 get, the more completely convinced I become that the way to do that is to not just tell
00:25:31.320 people that, for example, revising your belief, when they hear it, it shuts down.
00:25:36.000 Not tell them that, you know, you'd be better off.
00:25:39.140 But convince people that changing their mind on the basis of evidence, reason and evidence,
00:25:45.100 is a good thing to do.
00:25:46.580 It is a moral thing to do.
00:25:47.960 It is morally responsible to change your mind when presented with something.
00:25:53.860 And so there's a famous thing from Mormons about doubting your doubt.
00:25:58.660 This idea that you should doubt your doubt because it makes you a better person.
00:26:03.880 So if you want to promulgate or promote certain values in the society, the best way to do that
00:26:10.040 is through a moral means.
00:26:11.240 Other than, of course, kind of create a fascist architecture or some kind of...
00:26:15.100 Overarching institutionalization of principles.
00:26:18.800 But if you want those to be based upon something, ideally, that has staying power and is
00:26:26.440 sustainable, then you would link those to something moral.
00:26:30.880 A moral...
00:26:31.760 Something that has a moral valence to it.
00:26:33.480 It's a good idea to change your mind.
00:26:34.580 It's a good idea to be trustful of reason.
00:26:36.560 It's a good idea to listen to people.
00:26:38.180 And there's also the other element of it.
00:26:40.220 And I'm going to be honest with my experience.
00:26:42.220 Yeah.
00:26:42.320 I loved it.
00:26:43.380 It was great.
00:26:44.760 It was uncomfortable.
00:26:46.420 Oh, when we did the exercise.
00:26:47.900 Yeah.
00:26:47.960 Yeah, it is uncomfortable.
00:26:49.060 You know, all of a sudden you've got a mic in your hand or you're mic'd up and you've been
00:26:52.500 asked to justify your opinion.
00:26:53.960 You suddenly realize, I'm not actually sure about this.
00:26:57.640 And then you're being asked to check.
00:26:59.700 And then you've also got someone who's challenging your opinion.
00:27:03.120 And we have brought up our kids, and you know this better than anyone, and we've taught
00:27:09.720 them that discomfort is bad and any form of discomfort should be avoided.
00:27:13.820 Correct.
00:27:14.600 So having your opinion challenged, being involved in a debate is challenging.
00:27:19.560 It's uncomfortable.
00:27:20.520 Yeah.
00:27:21.040 Being asked to change your opinion is even more uncomfortable.
00:27:24.140 Yeah.
00:27:24.420 And I'm going to use everything you just said to forward the point that I just made.
00:27:29.180 If we can help people understand that being uncomfortable isn't necessarily an intrinsic
00:27:35.780 good, but it's a byproduct of what happens when you examine your ideas and when you live
00:27:41.760 an examined life.
00:27:43.260 Being uncomfortable, it's okay to be uncomfortable.
00:27:47.600 Right?
00:27:47.760 And there are different degrees.
00:27:49.300 I mean, if you wanted to demarcate a reasonable limit, some line on making people uncomfortable, it would be this.
00:27:58.460 If somebody says something that attacks an immutable property of you, your height, my hair color,
00:28:10.220 that would be off limits.
00:28:13.800 But ideas, people deserve dignity.
00:28:15.980 Ideas don't deserve dignity.
00:28:17.220 So people, we need to create these cultures in which we value dialogue, we value discourse,
00:28:24.080 and we let people know it's okay to be uncomfortable if you're engaging and wrestling with ideas.
00:28:30.600 Rogan said something the last time he had us on his show, actually, about this.
00:28:34.420 He said, the reason that I am willing to listen to people and debate in good faith and discuss
00:28:39.360 things in good faith is I don't conflate my ideas with my identity.
00:28:43.340 Correct.
00:28:43.640 And that, to me, is a distinction that it feels like has been eroded in my lifetime a hell of a lot,
00:28:51.180 where people now are so attached to things that they believe, it's part of their identity.
00:28:56.980 And then, of course, they can't change their opinion because then it's not ideas that are under threat.
00:29:03.560 It's you.
00:29:04.240 Yeah, and I'll take that a step further.
00:29:08.840 We've institutionalized that.
00:29:10.680 It says, oh, you're uncomfortable, you go to the diversity office.
00:29:13.760 Or you're uncomfortable, you call the bias response team and file an anonymous complaint.
00:29:17.600 Or you're uncomfortable, and you bypass the traditional structures.
00:29:22.540 We don't go to the teacher, we go to the diversity board.
00:29:25.460 We go to the dean.
00:29:26.280 So we've taught people that instead of having self-reliance and resilience,
00:29:33.560 they can, I don't know, go to someone else, and there is no exculpation from this whole thing.
00:29:40.000 There's no kind of removal.
00:29:41.480 Somebody has to be held guilty for the offense.
00:29:44.060 And I'm glad you said we've taught people because I think it's tempting sometimes for people like us
00:29:50.700 to be like, oh, look at these stupid college kids.
00:29:53.280 Oh, they're so dumb.
00:29:54.280 Oh, they're so this.
00:29:55.100 Oh, they're so that.
00:29:56.620 But they weren't born that way.
00:29:58.980 No, they're not dumb at all.
00:29:59.940 They've just been indoctrinated.
00:30:01.780 By our generation and yours.
00:30:04.520 I'm going to be more specific than that.
00:30:06.840 And yours, as if our generation is different.
00:30:09.040 We're 20 years apart.
00:30:09.980 I'm going to take that back.
00:30:13.160 Okay, all right.
00:30:14.740 The other thing that almost nobody talks about, which we should talk about,
00:30:20.220 and I really wish we talked about it more, is how that, what's the genesis of that?
00:30:24.720 Like, how does that metastasize in society?
00:30:27.840 And one of the ways, again, nobody is talking about that, virtually nobody, is through colleges
00:30:32.420 of education.
00:30:33.040 So you can't just teach in a high school or a school.
00:30:38.180 You have to go through, you have to get a certificate, and these certification programs are basically indoctrination
00:30:45.080 mills.
00:30:45.440 They're Paulo Freire's indoctrination mills.
00:30:47.420 They're ways that we've modified, piggybacking off of what we said before, the
00:30:52.840 purpose of education, we've actually almost done more than modified it, from a truth-
00:31:01.260 centered education to the alleviation of oppression.
00:31:04.700 And we've taught, you don't need to teach, it's kind of like the Amway of it.
00:31:09.340 Do you get the reference, do you have Amway?
00:31:11.280 Okay, forget about that.
00:31:12.160 But it's kind of, it's like a multi-level marketing scheme.
00:31:15.460 Okay.
00:31:15.940 Okay.
00:31:16.360 So if you can teach teachers, if you can indoctrinate teachers or teach teachers in a way to think,
00:31:23.820 particularly, and we can go into this if you want, a pedagogical practice, which is not
00:31:28.760 Socratic, it's not challenging, it's not questioning, it's not helping people to develop their ideas
00:31:35.400 in a clear, clean, and meaningful way.
00:31:39.100 But if you can teach teachers this, then you can literally indoctrinate generations of people.
00:31:43.460 And then once those kids who have been indoctrinated get to college, well, they've already been
00:31:47.940 indoctrinated.
00:31:48.840 Right.
00:31:49.200 So if you indoctrinate the teachers, you don't need to indoctrinate the kids because the teachers
00:31:52.600 will take care of it for you.
00:31:53.840 Correct.
00:31:54.060 And that's why one of the things that we need to do is we need to change the way that we
00:31:58.100 certify teachers, fundamentally change.
00:31:59.940 But we can't change the way we certify teachers because there's educational rot in our
00:32:05.400 institutions.
00:32:07.560 So we're in a very difficult position.
00:32:09.920 We're left with either attempting to reform the institutions, which is virtually impossible
00:32:13.920 because people have tenure, they have jobs for life, or we have to build new institutions.
00:32:18.160 I personally am a fan of the latter, but I'm not going to be Pollyanna about it.
00:32:21.680 I understand that this is not going to come overnight.
00:32:24.360 And I also understand that to maintain any kind of economic competitiveness, it's probably
00:32:30.740 not the best idea to rip down your institutions, your legacy academic institutions.
00:32:35.140 Your media, maybe, might be a good idea.
00:32:36.660 But even then, the road is dark and sinewy and fraught
00:32:45.920 with danger.
00:32:46.480 I mean, this is not a process that you want
00:32:51.380 to undertake capriciously.
00:32:52.200 May I just, on this point, just to finish.
00:32:54.480 This is where I'm going to start to sound very libertarian in a way that broadly I'm not, but
00:33:02.160 it seems to me like this is where the lack of competition in education is a massive problem
00:33:07.520 because if you had a competitive educational sector, then parents would be able to choose,
00:33:15.500 number one, in a way that is more difficult now because you really have to cough up a lot
00:33:20.080 of money to be able to send your kids to a different type of school.
00:33:23.320 But also it would, the results would be borne out in the outcomes for the children.
00:33:29.800 So if you go to a school where someone like you is teaching, you're going to have a very
00:33:33.780 different set of life outcomes.
00:33:35.360 Correct.
00:33:35.940 But we don't have that, certainly in this country, because it's like either you
00:33:39.820 pay 50 grand a year or, there are some exceptions,
00:33:44.860 but broadly speaking, your child is going to go to a school where they're going to get
00:33:47.780 this shit.
00:33:48.200 Correct.
00:33:48.560 So we have consigned a generation of people, a generation of students, the generation of
00:33:55.740 the next leaders of our countries.
00:33:58.040 We've consigned them to not be reflective, to not think clearly and critically, to not develop
00:34:06.340 the dispositions necessary for economic prosperity, to understand why our institutions are the way
00:34:15.260 they are. You know Ronald Reagan's famous quotation, freedom is only one generation away
00:34:20.280 from being extinguished.
00:34:21.600 So we have created a situation in which the core pillars and values of the Western tradition
00:34:28.880 are under a sustained and prolonged attack.
00:34:33.420 As Florence Read from UnHerd said, we are living through a time that is uniquely stupid.
00:34:39.660 This is a uniquely stupid time in human history.
00:34:42.200 I mean, the apodictic pronouncements, or even if you don't think about it that
00:34:47.260 way, the core bedrock beliefs of this ideology, there's no other way to say it.
00:34:55.920 They're so fucking stupid.
00:34:57.440 They're so fucking idiotic.
00:34:59.540 You know, every disparity in outcome is due to the system.
00:35:03.040 That's just demonstrably false.
00:35:05.260 Every racial disparity, for example, is due to the system.
00:35:10.100 And as Helen Pluckrose says, and I just love this, I think it's amazing, you can have a
00:35:14.080 conspiracy without any conspirators.
00:35:16.580 Nobody can be racist, but the whole system somehow is.
00:35:19.420 But yet when you parse it out, when you look at it in a more granular way, and you look at,
00:35:24.900 for example, success rates of people who have historically been discriminated against,
00:35:29.160 like, I don't know, Indians from India, East Asians, cold climate Asians, et cetera, those
00:35:36.280 disparities don't pan out.
00:35:38.020 So then you have to do mental gymnastics to try to make the ideology work.
00:35:41.560 And the reason that the ideology works is, as someone who used to work in the system,
00:35:47.800 is because we don't train kids, like you say, to critically think.
00:35:51.120 What we do is we feed them information, they process information, they go to an exam, they
00:35:58.380 regurgitate information.
00:36:00.220 So actually, you're not, and the way I describe it is like this.
00:36:04.000 So the way that we used to, I always thought I wasn't particularly good at maths.
00:36:08.260 Yeah.
00:36:08.800 But the reason was, I was never taught maths.
00:36:11.020 I was just taught a process.
00:36:12.960 And when all you do, and all you're taught is a process, then you don't actually understand
00:36:18.180 maths because you're just doing a pattern.
00:36:20.040 Okay.
00:36:20.300 It's only when you were taught mathematics do you actually understand it, and then you
00:36:26.040 are then able to apply it properly.
00:36:28.220 Okay.
00:36:28.520 So we have two additional components or features that are necessary to understand this.
00:36:33.180 The first one is that the reason kids are taught the way they're taught now, without
00:36:38.980 going into details, is because of a moral component, right?
00:36:43.480 So we teach the kids this about oppression, about systemic discrimination, about disparities,
00:36:48.080 because this is a, it is a moral good.
00:36:51.620 This is what we need to be a just, equitable, I haven't used that word yet, equitable society.
00:36:57.340 So that's one key component of it.
00:37:00.880 And I can't remember where I was going.
00:37:02.680 But there's also, and whilst you think about this, there's another component of it, which
00:37:06.520 is what I saw in my experience, which is that schools are no longer about education.
00:37:13.600 They're glorified data factories.
00:37:15.360 Kids are then shunted through, and then they do these exams.
00:37:21.960 And then the government is able to justify their position and say, look, what a great
00:37:26.660 job we're doing.
00:37:27.700 70 or 92% or whatever it is percentage-wise of kids are good at grammar.
00:37:34.260 Right.
00:37:34.360 But they're taught to pass a test, which is just learning and then regurgitation.
00:37:40.460 They can't write better.
00:37:41.620 They're not more articulate.
00:37:42.600 They can't construct a sentence.
00:37:44.720 Yeah.
00:37:44.960 So, how do we fix it? The problem has been analyzed to death.
00:37:48.900 We all know it's a problem.
00:37:50.300 Yeah.
00:37:50.460 We've kind of grafted these moral values onto the systems that we
00:37:55.780 have.
00:37:56.260 And then the question is, what do you do about it?
00:37:59.060 Right?
00:37:59.340 Like, we know that this is a catastrophe.
00:38:01.740 We know that our institutions are under a sustained attack.
00:38:05.460 We're pretty familiar with the data on the people who hold the ideology.
00:38:13.740 Again, to mention Greg Lukianoff and Rikki Schlott's book about cancellation, I just finished it.
00:38:20.920 We know that there are manifestations of this problem.
00:38:24.280 What do we do about it is the question.
00:38:26.100 Well, I mean, one of the things we've just discussed, which is you have to have alternative
00:38:29.620 structures for education.
00:38:31.760 We were just in Austin, and everyone you talk to in Austin has just moved to Austin.
00:38:37.700 And one of the things they're all saying is, well, look, the American education
00:38:42.320 system is fucked.
00:38:43.480 Totally.
00:38:44.940 But at least in Austin, I feel like there's 20 of us and we can pull together and do
00:38:50.540 homeschooling for all our kids together.
00:38:52.860 Or, you know, there are charter schools in America.
00:38:56.280 There are grammar schools here in the UK.
00:38:58.980 There are other kinds of options.
00:39:01.360 Katharine Birbalsingh, who's a frequent guest on our show, her school isn't teaching kids
00:39:05.400 any of this crap.
00:39:06.120 Right.
00:39:06.360 So part of it has got to be alternative education.
00:39:10.380 I would say.
00:39:11.280 Yeah.
00:39:11.980 Part of it is you have to build new things.
00:39:14.640 I see no other way around it.
00:39:16.440 And in the media space, actually, I think it's one of the areas where the pushback has
00:39:20.780 been the most successful.
00:39:21.680 Um, and it's maybe that's because it's the easiest place to do it.
00:39:25.900 You need, you know, four cameras, a few mics and a room and you, you can actually start
00:39:31.460 to change the media conversation.
00:39:33.700 Education, obviously much bigger project.
00:39:37.180 There are a few, you know, there's Ralston College, Stephen
00:39:42.420 Blackwood.
00:39:42.880 Yeah.
00:39:43.720 And the University of Austin, in Austin.
00:39:46.540 Yeah.
00:39:47.060 Um, is that it?
00:39:49.240 Well, there are others that are emerging right now, but those are the two main competitors
00:39:52.860 to the system.
00:39:53.840 So slight, slight progress in education, a bit more progress in media.
00:39:58.220 Yeah.
00:39:58.440 And we would really need to break that down.
00:40:00.860 There's an architecture in place, like Substack, for alternative media.
00:40:06.220 There's much less of one for educational systems and the various educational arenas.
00:40:12.420 But we would also need to take a look at like, what should they teach?
00:40:16.820 What should their mission be?
00:40:17.960 What pedagogy should they use?
00:40:20.380 What forms of engagement of students?
00:40:23.980 I don't think they're particularly difficult questions, but when the whole discourse
00:40:30.360 has been hijacked for so long now, for a decade, it just puts the conversation
00:40:36.000 in a hard mode.
00:40:36.700 They become more difficult to answer.
00:40:38.060 And look, this is going to make me sound very left wing, but I think there's a case for government intervention.
00:40:42.320 There should be inspectors, and they should go and inspect.
00:40:45.800 But the inspectors themselves participate in the ideology.
00:40:48.460 The whole system is corrupt.
00:40:50.000 The inspectors are there to make sure they're being told this nonsense.
00:40:52.520 That's the problem.
00:40:53.560 Okay.
00:40:54.400 So there are a lot of teachers who would agree with you on that fact.
00:41:01.440 But then you introduce something, maybe a new type of inspector or something along
00:41:07.580 those lines.
00:41:07.920 But the people introducing the inspectors are also part of this worldview.
00:41:11.100 The institution is set up to replicate the dominant ideology.
00:41:15.080 It is itself the problem.
00:41:17.180 Yeah.
00:41:17.300 But there are people who don't believe in this ideology, who fundamentally disagree with it.
00:41:23.340 And that's what you need to set up in order that they, and maybe it starts out as an investigation,
00:41:29.460 just a government investigation where they go out.
00:41:31.200 Forget the government.
00:41:32.120 So think about it like this.
00:41:33.240 No, let me just interject here.
00:41:35.960 What you're saying is all true, provided we ever elected a government that had any interest
00:41:41.180 in doing that and seeing it through.
00:41:43.460 And what we've seen repeatedly, by the way, certainly in this country, is there are governments
00:41:47.680 that were elected and had ministers for education.
00:41:50.120 Michael Gove, for example, he wanted to take on the educational blob.
00:41:53.760 I don't know if he did it well.
00:41:54.980 I don't know if he did it badly.
00:41:56.280 But what happened is he tried and then he got sacked or resigned or whatever.
00:42:00.840 Yeah.
00:42:01.060 Right.
00:42:01.540 So there's also this other thing, which is, this is a bigger step back, but we're in a position
00:42:08.820 where actually the democracy isn't delivering the results that the people are voting for.
00:42:16.220 In this country, we've had a Conservative government for 13 years now.
00:42:20.020 They are not doing these things.
00:42:22.420 It's amazing.
00:42:23.340 Like a government minister will write an article in the Telegraph complaining about wokeness.
00:42:28.220 And I'm like, why don't you fucking fix it?
00:42:30.340 You're elected.
00:42:31.700 You are the government.
00:42:32.720 But it's not working, right?
00:42:35.740 So this is the conundrum, which is why we're all doing death by analysis, because we're
00:42:39.700 like, we can vote for people, but they don't do anything.
00:42:42.940 And so that's why I think it's like homeschooling or bust, because no institution is answerable
00:42:47.860 to the people anymore.
00:42:48.800 And martial arts.
00:42:50.340 I've actually thought about this a lot, because, you know, you can have all these stupid ideas,
00:42:55.960 but when you're doing a martial art or you're doing a contact sport, that goes out the window.
00:43:00.580 No, because if you start going into your own head, you're going to get a punch on the
00:43:04.500 nose, you're going to get rolled over.
00:43:06.020 There's a reason for that, that I think is very profound, in that in certain martial arts,
00:43:12.920 there's a built-in corrective mechanism.
00:43:15.540 And when you remove the corrective mechanism, crazy starts to come in.
00:43:21.760 There's no way around it: you have to have a corrective mechanism.
00:43:26.800 Actually, I published a paper on this. You have to have a corrective mechanism in
00:43:31.000 what you do to make sure that your ideas align with reality.
00:43:34.720 And that's the problem, right?
00:43:36.080 We've said before, the ideas don't align with reality.
00:43:38.780 So there needs to be a constant reinforcement with these kids and young people that, look,
00:43:44.380 you can have all the intellectual masturbation you want.
00:43:47.320 The reality is the reality.
00:43:49.340 Well, that brings us back to street epistemology.
00:43:51.400 Street epistemology is a way to help people make their ideas clear.
00:43:56.680 And all of these resources are free.
00:43:59.200 They're for anybody who can use them.
00:44:01.440 And I'd highly encourage educators to use them exactly.
00:44:04.660 You don't need mats like we have.
00:44:06.320 You can use tape or chalk.
00:44:08.180 And then you can just ask kids to start on the neutral line, give them a claim, and ask, how confident
00:44:13.480 are you in that?
00:44:13.940 And then very gently question them and encourage them to move.
00:44:20.100 You could go on the line first.
00:44:21.520 And then over time, maybe two or three sessions, you stand on the line and you have kids do
00:44:27.380 that for you.
00:44:28.280 But let's come back to what we were talking about, which I think your criticism is very
00:44:31.760 fair.
00:44:32.160 And it's one I've been making of our team, whatever our team is, which is we keep talking
00:44:37.500 about what's not working.
00:44:39.420 What are we for, right?
00:44:40.600 So you asked the question, how do we fix this?
00:44:45.060 And then we shouted at you for five minutes with our ideas.
00:44:48.840 What do you think we should do?
00:44:50.540 Well, first, we have to build new things.
00:44:53.420 OK.
00:44:54.640 Second of all, we have to move the Overton window.
00:44:57.180 We have to change the way that people think about what it means to be uncomfortable, what
00:45:04.680 it means to change your mind, what it means to engage in civil dialogue.
00:45:08.480 We have to start listening to people more.
00:45:11.060 We have to understand why someone believes what they believe.
00:45:15.460 It's one of, quote unquote, Rapoport's rules.
00:45:18.860 Before you offer any criticism, make sure that you absolutely understand.
00:45:22.960 So a very quick technique that people can use for this is once you have listened to
00:45:28.000 someone and you think you know what they mean, say, putting the burden
00:45:33.120 on yourself:
00:45:33.460 let me see if I understand this correctly.
00:45:35.340 Is this correct?
00:45:36.720 And you repeat it back to them.
00:45:38.060 That one little technique would avoid so much confusion.
00:45:42.560 And it's a way to help people to understand each other.
00:45:47.200 Oh, and that's the other thing: to seek not to persuade.
00:45:51.280 This is Jürgen Habermas, the German philosopher's idea.
00:45:53.600 He wrote what's called the theory of communicative action. To seek to
00:46:01.240 understand.
00:46:02.400 We want to seek.
00:46:03.140 So I want to try to understand why you believe this.
00:46:05.980 And if I understand why you believe this, the likelihood that we could come to a solution
00:46:11.380 that's agreed upon, maybe consensus wouldn't be exactly the right word,
00:46:17.680 but some kind of agreement about how to proceed, is greatly enhanced.
00:46:23.400 Except what we saw today is that the ideology we are concerned about is explicitly designed
00:46:31.660 to prevent that from happening.
00:46:33.540 100% correct.
00:46:34.820 It's explicitly circular and exclusionary of precisely the thing you're saying, which
00:46:40.900 is the opportunity for discussion.
00:46:42.440 Right, so it's not just that somebody doesn't adhere to the
00:46:48.680 rules of reason and rationality, it's that they've opted out of the system altogether.
00:46:52.780 Yes.
00:46:53.180 And the way that they engage is to disrupt and dismantle, quote unquote.
00:46:56.820 The way that they engage is to bring bullhorns into an auditorium.
00:47:00.260 The way that they engage is to say that you're a rapist or a Nazi or get all their friends
00:47:05.000 to gang up on you on Twitter.
00:47:06.580 There are multiple things I said there, and there are many ways
00:47:13.620 to engage it, but first and foremost, we must not cede to lunatics.
00:47:18.800 There's no reason that we should be held hostage to the demented vagaries of people
00:47:26.920 who have nothing of substance to contribute to the conversation.
00:47:33.100 If you want to enter the conversation and say something, great; if not, there's
00:47:37.800 no polite or politic way to say this.
00:47:40.000 We're just going to relegate you to the kiddie table.
00:47:42.320 You can go in there and you can do your thing.
00:47:45.220 Balaji has some interesting stuff on this about gray cities, blue cities, and we saw, we listened
00:47:52.120 to the Peter Thiel lecture the other night, and he linked the idea of real estate, structural
00:47:58.360 inequalities, et cetera, to the replication of the ideology.
00:48:01.760 To simplify that, just for people listening, the idea being the housing crisis causes wokeness,
00:48:07.180 basically, because young people are locked out of a future, they're locked out of having
00:48:11.520 a family, and therefore they are nihilistic, and they don't see a future, and they do see
00:48:16.400 and experience a lot of inequality, and therefore they're angry and resentful and bitter and
00:48:20.640 unconstructive about life.
00:48:21.600 Correct, and that was at the Roger Scruton lecture, which I'm sure will be released.
00:48:25.980 So we have to stop ceding moral ground.
00:48:29.260 We cannot listen to people who scream and shout.
00:48:33.160 Like, there's just, if you want to present your argument, present it.
00:48:36.440 If not, you can go off in the corner.
00:48:39.080 Like, we just cannot cede to, we cannot continue to be held hostage to the most irrational people.
00:48:48.260 But this goes back to my point about discomfort, which is something I wanted to delve into.
00:48:53.500 We, as a society, and I am just as guilty of this as anything, as anyone else, we are
00:48:58.500 looking to avoid discomfort.
00:49:00.540 And that being the case, you're going to avoid the confrontation.
00:49:03.960 And then, when things happen, when you get pylons, you capitulate.
00:49:08.000 You couldn't achieve anything in your life without discomfort.
00:49:11.520 Exactly.
00:49:12.220 You couldn't build a business without discomfort.
00:49:14.400 If people are interested in social media, you couldn't build a social
00:49:17.440 media following, you couldn't be good at jiu-jitsu, you couldn't be good at skiing, you couldn't
00:49:22.320 do anything in which there was a corrective mechanism.
00:49:25.140 You could, for example, play air guitar, right?
00:49:28.300 Because there's no lawful relationship between you moving your fingers and a sound coming
00:49:32.840 out.
00:49:32.980 You could play air basketball.
00:49:34.880 Like, you could do something in which there's no corrective mechanism.
00:49:37.960 The moment you attempt to tell people that it's a good thing to remove discomfort,
00:49:43.300 not to go down this rabbit hole, but we attempted to do this with
00:49:47.240 pain. We have an almost entire generation that's susceptible to opioid addiction because we've
00:49:55.500 treated pain like a fifth vital sign, because we've attempted to remove all
00:50:00.600 sources of discomfort.
00:50:01.940 Look, that doesn't mean when you go to the dentist, you shouldn't get Novocaine.
00:50:06.180 But I think we need to, I think you're right, Francis.
00:50:08.160 I think we need to rethink the idea of discomfort and help people understand that it is a certain
00:50:16.380 kind of virtue to endure a degree of discomfort in order to achieve something.
00:50:22.720 And what kind of life do you want to lead?
00:50:25.680 Do you want to lead a life in which you're not striving, in which you're not achieving,
00:50:29.200 in which you're not trying, in which you find discomfort so dyspeptic,
00:50:36.220 in which you're so averse to discomfort, that it is a barrier to any possibility that you
00:50:43.320 could achieve anything.
00:50:44.500 You simply cannot achieve anything that's real if you spend your life eschewing discomfort.
00:50:50.080 And it's also, and this is the other problem linked to discomfort, which is going to link
00:50:55.000 back to what we just spoke about a second ago, we are uncomfortable about saying to people,
00:51:01.420 you are behaving like a child.
00:51:03.280 If you do not want to take part in the discussion, in your own words, if you want to scream and
00:51:07.900 shout, then you're going to be on the kiddie table.
00:51:09.680 Right.
00:51:09.980 Because we experience that as confrontation.
00:51:12.880 Okay.
00:51:13.100 So I'm not, I'm not uncomfortable with that at all.
00:51:16.020 And I'm going to say, I'm going to suggest something to you.
00:51:18.540 If you speak bluntly and honestly to people and you tell them exactly what you just said,
00:51:23.540 they will respect you more, not less.
00:51:26.200 They will respect you more and not less, rather than sniveling: oh, I'm sorry, or forgive
00:51:31.020 me, or, you know, you go over there and do your own thing,
00:51:34.780 wave your flag, or wear your bandana or what have you.
00:51:38.260 So the very idea that people don't engage other people or don't call people out
00:51:44.800 on bullshit because they think that they're going to, I don't know, be mad at them or hurt
00:51:48.480 their feelings.
00:51:49.280 It's almost exactly the opposite.
00:51:51.400 They will respect you more.
00:51:52.580 They respect people who are forthright in their speech.
00:51:55.380 They respect people who are blunt and honest.
00:51:57.880 That doesn't mean you have to be rude, but just be forthright in your speech.
00:52:02.200 Pete, let me ask you this, because I think this is at the core of all of this.
00:52:06.240 Can this be corrected without a catastrophe?
00:52:10.480 Because this is why I've been concerned about all of this from day one, because ideologies
00:52:17.880 that are out of touch with reality cause people to do things that are impractical or dangerous
00:52:23.480 and damaging, which is where I feel we are, particularly in the West.
00:52:26.620 We've undermined ourselves domestically.
00:52:29.980 Correct.
00:52:30.160 Foreign policy has been a disaster in terms of signaling weakness, which is why, in my opinion,
00:52:34.560 we're seeing many of the things that we're now seeing happening in the world.
00:52:38.160 However, people who are delusional in this way don't generally tend to self-correct unless
00:52:46.580 there is something even more painful than the discomfort of having to correct their delusions.
00:52:51.980 So my question to you is, do you actually think this can be corrected without a giant war
00:53:00.160 or some kind of, you know, something of that nature?
00:53:02.620 I guess it depends on how you define catastrophe.
00:53:05.800 We've already had a catastrophe, right?
00:53:07.760 We're already in a catastrophe of the legitimation crisis, Habermas' 1973 piece.
00:53:14.260 We're already in a crisis in which people don't trust their institutions.
00:53:18.260 They don't trust their media.
00:53:19.360 They don't trust their...
00:53:20.980 The only people...
00:53:21.640 But that's bearable.
00:53:23.000 The ordinary person, you go out there in the street and you ask people, you know,
00:53:26.020 the legitimacy crisis, the breakdown of institutions, they don't know what the fuck you're talking
00:53:29.720 about.
00:53:29.840 Yeah, but they...
00:53:30.520 That's not hitting them in the pocket, it's not hitting them in the butt.
00:53:33.640 True, but if you ask them, I can't speak to this context, but do you trust Congress in
00:53:39.100 the United States is an all-time...
00:53:40.160 The only people, if you look at survey after survey, the only people individuals trust are their
00:53:45.040 own physicians.
00:53:46.140 They don't trust the medical system.
00:53:47.320 My point is something else.
00:53:47.820 My point is it's not painful enough for people to care enough to actually...
00:53:50.900 Well, it makes you wonder, doesn't it?
00:53:52.960 What would happen if something real actually happened to people?
00:53:54.800 That's what I'm saying.
00:53:55.680 What would happen...
00:53:56.680 I mean, we have inculcated a sense of fragility in people.
00:54:01.560 And we've told them it's actually a good thing to be fragile.
00:54:04.740 Yes.
00:54:05.040 So can that get fixed...
00:54:07.540 Of course it can get fixed.
00:54:08.740 ...without an actual catastrophe, as in people dying en masse, starving en masse?
00:54:15.180 Yeah, I mean, there's no question it can, but people need to want to fix it, right?
00:54:18.520 People need to identify it as a problem.
00:54:20.480 But you're dodging my point.
00:54:21.600 My point is exactly this.
00:54:23.200 Can people experience the need to fix it without suffering terrible consequences first?
00:54:30.060 No, I wasn't dodging the point.
00:54:31.460 I wasn't dodging the question.
00:54:32.000 Sorry, I didn't mean to be rude.
00:54:33.320 No, no, you can be as rude as you want.
00:54:35.080 You can say anything you want to me.
00:54:37.740 No, it gets back to the point I made before about what they value.
00:54:41.620 If they value not doing that, they can, because it's like leading the dog.
00:54:48.040 It's like the leash, right?
00:54:49.420 If they place their values ahead of...
00:54:52.840 If they value being resilient, if they value tolerating a certain amount of discomfort for
00:54:59.680 achievement, if they value the right things, you can change it.
00:55:04.380 The problem is, and I don't mean this to beg the question, to be honest with you, how
00:55:08.780 do we help people value the right things?
00:55:10.760 Well, so, okay, let me rephrase, because I think we're talking about the same thing, but
00:55:14.900 talking past each other.
00:55:15.900 That's exactly what I'm saying is, is it possible...
00:55:20.380 Oh, do we need that?
00:55:21.420 I don't...
00:55:21.860 Is it possible for people to, I hate to use this term, but to wake up from the delusions
00:55:28.100 into which they've been educated without something so uncomfortable that it overrides the discomfort
00:55:34.200 of having to let go of that worldview?
00:55:35.980 It's totally possible.
00:55:37.840 Part of the problem with waiting or expecting or having some catastrophe befall people,
00:55:45.900 is that among the people who are already susceptible to this, they would say, look, we didn't
00:55:52.400 adhere to the ideology deeply enough.
00:55:54.320 I mean, they wouldn't explicitly say that, but we always knew we were victims.
00:55:58.300 Look, now the system did this to us.
00:55:59.960 The system got us in this war.
00:56:02.060 The system caused Hamas to...
00:56:03.520 Whatever it is.
00:56:05.400 So I don't think that the solution to the problem is to either hope or wait for or expect or anything
00:56:13.040 some kind of catastrophe, because that might actually worsen the situation.
00:56:17.940 I'm not hoping for a catastrophe.
00:56:19.500 No, I know you're not.
00:56:19.980 I'm saying...
00:56:20.520 I'm fearful that that's the only thing that will shake people up.
00:56:23.400 I mean...
00:56:23.520 It might not shake people up.
00:56:24.800 That's the point.
00:56:25.300 I see what you're saying.
00:56:26.060 I mean, the Hamas attack on Israel, I think, woke up a lot of people on the center-left
00:56:30.920 in particular.
00:56:34.780 Are you asking?
00:56:36.120 No, I'm making a statement.
00:56:37.380 Oh, okay.
00:56:37.660 Okay.
00:56:38.120 I mean, I wrote an article for the Free Press.
00:56:40.980 I read that.
00:56:41.540 It was excellent.
00:56:42.700 Which is essentially my assertion: there are a lot of people for whom that shifted their
00:56:47.780 perspective.
00:56:48.300 Right.
00:56:48.500 Because they saw things about their own side that were so expressed.
00:56:52.380 By the way, and they were the same people screaming about microaggressions and someone
00:56:56.420 asking, where are you from?
00:56:57.780 Yeah.
00:56:58.280 You know, you've seen video after video of people openly shouting at Jews who have
00:57:03.720 nothing to do with Israel.
00:57:05.140 Nothing.
00:57:05.600 Zero.
00:57:06.280 Yeah.
00:57:06.680 The same people.
00:57:07.820 And now, fortunately, people who have been traditional philanthropists are removing their
00:57:12.420 money.
00:57:12.820 And you should not be donating.
00:57:14.220 You want to know what you can do about this?
00:57:15.600 One of the things you do is you stop donating to your alma mater.
00:57:17.860 They're simply not the same schools.
00:57:19.600 You can stop donating to a process that furthers this madness.
00:57:25.840 And Pete, before the interview, you made a statement about America being in the hospice
00:57:31.980 or on its way to the hospice.
00:57:32.940 Marching toward hospice.
00:57:34.040 Marching towards hospice.
00:57:36.320 Is this what you're talking about or is there something else?
00:57:40.040 I think this is a piece of it.
00:57:42.300 I'm...
00:57:45.540 Look, I don't like this.
00:57:46.780 I'm not...
00:57:47.660 I get no joy from sitting here and telling you that the United States is
00:57:51.820 marching toward hospice.
00:57:53.440 I don't see things...
00:57:55.200 I'm not particularly rosy, might be one way to put it.
00:57:59.180 We have $33 trillion in debt.
00:58:02.340 We're printing money like it's going out of style.
00:58:05.680 Evidently, the best people the system can produce are Donald Trump and Joe Biden, which
00:58:10.400 I think will be...
00:58:11.280 Nobody I know wants that.
00:58:13.660 To be, again, extremely blunt with you, I'm embarrassed as an American.
00:58:18.380 I'm embarrassed as a U.S. citizen that those are the two people we've produced.
00:58:23.720 Independent of what you think of the particular policy positions of any one individual.
00:58:29.020 One-third of the taxes collected last year went to pay the interest on the debt.
00:58:35.060 People aren't engaging substantive questions in any sustained way.
00:58:41.460 Our institutions have suffered from wholesale ideological capture, particularly our educational
00:58:45.680 institutions.
00:58:46.400 The meritocracy has been undermined.
00:58:50.400 Again, you could look at it if you normed out...
00:58:53.800 If you looked at SAT scores, for example, at Ivy League schools, over 51% of those students
00:58:59.920 should be Asians, but they're not.
00:59:02.460 You want to know if there's systemic discrimination?
00:59:04.160 Yeah, there's actually systemic discrimination, and we know exactly who it's against.
00:59:07.280 It's against Asians.
00:59:08.320 The same people screaming about microaggressions have no problem with calling for the heads of
00:59:11.700 Jews.
00:59:11.920 I mean, it's just...
00:59:13.940 And these people are educational administrators, they're professors, they're teachers.
00:59:17.560 Again, I say this as a kind of nucleation point for a larger nuclear catastrophe, if you
00:59:22.840 will.
00:59:23.100 We know that people don't trust their institutions.
00:59:26.960 I just don't think that the current system as we have it is sustainable.
00:59:30.940 There are too many geopolitical instabilities, hot spots in the world right now.
00:59:35.720 I don't think we're having an appetite for any...
00:59:38.400 I'm not saying we ought to have an appetite for foreign conflict, but I think that we need
00:59:41.920 to have a serious conversation about China and Taiwan, about Ukraine.
00:59:46.060 I think we need to have a serious conversation about a lot of things that we're simply...
00:59:50.760 We have removed the tools, and we know what those tools are, for how to have
00:59:57.520 better conversations, for how to speak to each other.
01:00:00.160 We're not doing that.
01:00:02.060 You cannot solve a problem unless you're willing to be honest and forthright and have an honest,
01:00:07.600 open dialogue and discourse about it.
01:00:10.740 And if you value that, and we've lost that.
01:00:14.060 Now, the question is, will we get it back?
01:00:17.100 I think we will.
01:00:18.560 What damage will that have done to our institutions, our judiciary?
01:00:22.840 What damage will that have done to trust our confidence in the institutions?
01:00:26.540 Again, it's just so epically stupid.
01:00:30.800 I mean, what are they offering?
01:00:32.080 They're offering CHAZ?
01:00:33.660 What is the...
01:00:35.180 Right?
01:00:35.420 I mean, it's just...
01:00:36.140 It's so...
01:00:37.440 You know, Hamas knows what a woman is.
01:00:39.120 Everybody knows what a woman is.
01:00:40.520 I mean, it's just so...
01:00:42.340 We went from pretending to know things we don't know to pretending to not know things that
01:00:46.320 everybody knows.
01:00:47.340 Yeah.
01:00:47.480 I mean, there's a whole lot of pretending going on.
01:00:49.540 But that's not a uniquely American phenomenon, nor is it merely in the Anglosphere.
01:00:54.300 It's a neocolonial export that's spread through the world.
01:00:58.340 And so we're looking at a pretty precarious, perilous geopolitical situation.
01:01:04.580 We're looking at a country in which our citizens don't know or fundamentally question the American
01:01:13.820 values.
01:01:14.440 They don't think that the values of the country are worth fighting for.
01:01:18.280 Enlightenment principles, freedom of speech, freedom of press, freedom of assembly.
01:01:22.360 We have demeaned the very principles that have made our country not just great, but a shining
01:01:31.820 city upon a hill.
01:01:33.400 And that is where we stand right now.
01:01:35.600 We stand with deficits that are crippling us.
01:01:40.020 We're standing with nobody talking about issues, people being consumed by wokeness or anti-wokeness
01:01:45.000 when we have substantive problems.
01:01:46.560 We have substantive homeless problems.
01:01:48.320 My friend Michael Shellenberger, a good friend of mine, he ran for governor.
01:01:51.880 And one of the things that he was going to do, even a simple thing, like most people in
01:01:56.020 West Coast, in East Coast cities will acknowledge that homelessness is a problem.
01:02:00.800 They will acknowledge that it's...
01:02:01.940 In fact, when you look at the data in California in particular, people rank that as one of the
01:02:07.040 top five problems.
01:02:08.500 But we're not having an honest conversation.
01:02:10.620 In San Francisco, Michael writes about "Shelter First, Housing Earned."
01:02:15.220 Shelter First, Housing Earned.
01:02:17.320 We've had Michael on a couple of times.
01:02:18.380 Yeah, okay.
01:02:18.900 I know you've had him on.
01:02:19.620 When you say that people aren't having an honest conversation, we should also make clear
01:02:23.400 it's not just the left, it's the right as well.
01:02:25.280 Oh, 100%.
01:02:26.420 The right does this moronic thing where they're like, yeah, we've got money for Ukraine, but
01:02:30.400 we haven't got it for the homeless.
01:02:31.760 It's not a money problem.
01:02:33.040 It's an ideology problem.
01:02:34.260 Yeah.
01:02:35.320 So if you want to talk about a criticism of the right, because I wasn't meaning any of
01:02:41.020 this as a criticism.
01:02:41.640 I think often this is framed as a criticism from the right or from the left, like the trans
01:02:46.520 issue, whereas it should be just what's the evidence.
01:02:48.880 And if people on one side happen to disagree, then they need to marshal their evidence and
01:02:53.960 show it.
01:02:54.380 Because I don't think it's, I think casting it that way is a mistake because people will
01:02:59.360 be, to borrow your word, people will be tribal and they won't actually look at what the evidence
01:03:05.320 for a position is.
01:03:06.900 Well, look, I don't want to finish on that low note.
01:03:10.760 I think one of the ways that this can be influenced, whether it can be changed is a different
01:03:16.080 conversation, like I say, I think in the media space, we actually have been extraordinarily
01:03:22.240 successful in this country.
01:03:23.400 Well, you guys in particular.
01:03:24.480 But I don't just mean us, there are people way bigger than us.
01:03:26.720 I mean, look, our politics are different to the Daily Wire's politics, but when I see
01:03:30.560 a media organization of that size emerging, which challenges some of these narratives,
01:03:34.840 I think that's very powerful.
01:03:36.140 When I see Joe Rogan as the biggest show in the world, I think that's very powerful.
01:03:39.540 Yeah, look how they come for him too, right?
01:03:42.240 But the thing is, they can't get him anymore.
01:03:45.400 That's the thing that's reassuring to me.
01:03:47.680 Like this cancel culture stuff, it works at the lower levels, but there are certain levels
01:03:52.680 at which it no longer works.
01:03:53.740 I agree.
01:03:54.240 And Rogan has spearheaded that and there are other people who've spearheaded that.
01:03:58.660 And the fact that, you know, two idiot comedians can build this up to the point that it is
01:04:02.520 now, that to me is also a reassuring sign.
01:04:04.960 And the number of people who now listen to all of those people in that ecosystem, it's
01:04:10.140 quite significant and growing daily as well.
01:04:13.420 The one thing I'm worried about is the young generation, the Gen Z-ers: the boys are
01:04:19.840 based, as the kids now say, while the girls are going even more uber-woke.
01:04:25.320 And that disparity...
01:04:26.780 We know that Jonathan Haidt's research has talked about particularly the dangers of TikTok
01:04:31.280 and social media.
01:04:32.140 So that disparity is going to be troubling.
01:04:33.880 But I do think Breitbart was right that politics is downstream of culture.
01:04:40.560 This is culture.
01:04:42.160 And if more people are listening to these kind of conversations with amazing people like
01:04:45.660 you who are offering a critical thinking about these issues, we're going to persuade
01:04:50.880 some people.
01:04:52.140 And that may be how we start to unwind some of these things over time and offer a better
01:04:57.420 alternative.
01:04:58.080 So I hope that's true.
01:04:59.900 And if it's not, we'll have World War III and I'll die.
01:05:02.000 So, Pete, it's been great having you back.
01:05:05.860 I thought we were going to end on a positive note.
01:05:07.520 It is positive note.
01:05:08.920 Either everything is great or we're all dead.
01:05:11.820 Isn't that...
01:05:12.620 That's very Russian.
01:05:13.720 Yes.
01:05:14.020 We're all dead.
01:05:14.640 Drink vodka.
01:05:15.340 But you know what?
01:05:16.260 Yeah.
01:05:16.800 A death in a nuclear Armageddon in which we all die...
01:05:19.280 Be quick.
01:05:19.800 ...is much better than a slow, painful, you know, and loads of civil war and all of that
01:05:25.440 kind of stuff.
01:05:26.160 Everybody's dead.
01:05:27.060 Everybody's happy.
01:05:28.580 Anyway.
01:05:29.220 Mm-hmm.
01:05:29.980 On that positive note, Pete, as you know, before we go to locals where we have questions
01:05:34.500 for you from our supporters, the question we always ask at the end is...
01:05:38.660 What's your favorite type of nuclear holocaust?
01:05:41.720 A quick one.
01:05:42.980 There we go.
01:05:43.400 The question is...
01:05:44.780 That wasn't as funny as you thought.
01:05:45.800 No, no, it wasn't.
01:05:46.580 Just depressing.
01:05:47.480 Yeah, it was.
01:05:47.840 Just depressing.
01:05:48.260 What should we be talking about tonight?
01:05:50.040 Yeah, baby.
01:05:50.600 I think, because I listen to your podcast religiously, if you will, I think it's less what we should
01:05:57.960 be talking about and more how we should be talking about it.
01:06:01.600 Yeah.
01:06:01.920 And I think we need to shift the conversation instead of from a particular issue.
01:06:06.660 Don't get me wrong.
01:06:07.180 It's important.
01:06:07.840 But how do we engage people with whom we disagree?
01:06:10.700 How do we listen better?
01:06:12.340 How do we ask better questions so that if you actually do want to persuade somebody,
01:06:18.680 you'll be more successful than screaming or not listening or not engaging?
01:06:22.920 And so what we should be talking about is how to engage people and what it actually really
01:06:30.420 means to understand somebody's position.
01:06:33.380 And I would say that it's maybe not something that we should be talking about and it's more
01:06:38.280 something we should be modeling.
01:06:39.620 And that's actually what we try to do.
01:06:41.080 The way we have conversations, including recently loads of people on the show with whom we
01:06:46.340 fundamentally disagree, but we still give them the time.
01:06:50.360 We treat them with respect.
01:06:51.500 We try to get to the core of what they're saying.
01:06:54.680 That's always been our approach.
01:06:56.720 And I think that's why every time we get like new exposure in terms of like a big media event
01:07:03.220 or something happens where a lot of people come to our channel or come to our podcast,
01:07:07.840 the first thing they all say is the one thing that distinguishes you from most other media
01:07:13.360 is that you listen to your guests and you let them talk.
01:07:16.580 And that is the first step, actually, for what you're talking about.
01:07:21.840 So maybe it's less about talking about it and much more about showing, modeling, and trying
01:07:28.100 to operate in that way.
01:07:30.920 Pete, awesome to have you back.
01:07:32.640 Thanks, man.
01:07:33.140 Everybody should be checking out your street epistemology videos because they're just amazing.
01:07:37.220 And it also teaches you how to think.
01:07:38.920 Like Francis said, you know, we are obviously in the space.
01:07:41.400 We think a lot.
01:07:42.580 We think for a living.
01:07:43.800 Yeah.
01:07:43.980 And yet, you know, we took something away from that.
01:07:46.320 So it's really great.
01:07:47.140 And I wish you all the best success with that.
01:07:49.760 Thank you.
01:07:50.180 Head on over to Locals where we ask Pete your questions.
01:07:54.560 What is so appealing about woke that allowed it to take over academia?
01:07:58.300 Is there a certain personality type that is susceptible?
01:08:01.680 What trait about yourself conferred immunity to this way of thinking and behaving?
01:08:05.720 That's a great.