TRIGGERnometry - August 18, 2022


Sam Harris: Trump, Religion, Wokeness


Episode Stats

Length

1 hour and 33 minutes

Words per Minute

162.04

Word Count

15,197

Sentence Count

685

Misogynist Sentences

3

Hate Speech Sentences

10


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Sam Harris is a neuroscientist, philosopher, and one of America's and the world's most prominent public intellectuals. In this episode, Francis and Konstantin talk with him about how he became who he is, how he got to where he is now, and what it means to be an intellectual.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 Your capacity to be offended is not something that anyone need or should respect in you.
00:00:11.540 You were calling for Twitter to shut down Trump's account and were happy that it happened.
00:00:16.520 Yeah.
00:00:16.820 That's a very different position to pretty much everybody else.
00:00:20.000 Why did you take that position?
00:00:21.400 Trump University as a story is worse than anything that could be in Hunter Biden's laptop, in my view.
00:00:27.880 That's just a conspiracy, that's a left-wing conspiracy to deny the presidency to Donald Trump.
00:00:34.240 Absolutely it was. Absolutely. But I think it was warranted.
00:00:37.600 But Sam, you can't do that, Sam. You've got to be fair.
00:00:42.960 Most people in our society, even if they're nominally religious, really are struggling to find meaning in their day-to-day.
00:00:52.220 But when you look at just the hour-by-hour increments in which life is doled out to us,
00:00:57.400 you're cast out of deep sleep or, you know, the phantasmagoria of dreams, you know,
00:01:04.560 when the alarm goes off in the morning, and how do you feel about your life?
00:01:10.540 And what is going to give you moral urgency and meaning?
00:01:14.340 Millions and millions of people found it at specific moments in our, you know, recent history.
00:01:20.500 Like, the George Floyd killing was certainly one of those moments where it's like, okay, enough is enough.
00:01:26.700 This is my religion.
00:01:27.640 Did you know that you can ask guests your questions?
00:01:41.260 That's right.
00:01:42.100 When you join our locals community, not only will you know who we're about to interview,
00:01:47.140 you have the opportunity to ask them your questions.
00:01:50.320 You have the chance to ask...
00:02:20.320 Carl Benjamin, and so many more.
00:02:23.020 Plus, we're about to interview some of the biggest guests in the world.
00:02:28.740 We can't name them just yet, but trust me, they're huge.
00:02:33.740 Metaphorically speaking, not just because they're American.
00:02:37.100 Our Locals gives you access to a great community of like-minded people,
00:02:41.200 where you can share memes and make new and problematic friends.
00:02:44.740 You also get early access to live shows, and we're about to release details of our tour,
00:02:50.120 so you'll want to know about that as well.
00:02:52.520 On the higher tiers, you get monthly supporter calls and the opportunity to have a meal or a call with us.
00:02:58.480 Click the link below or go to triggernometry.locals.com and join the community.
00:03:05.480 That's triggernometry.locals.com.
00:03:08.420 We'll see you there.
00:03:10.180 Hello and welcome to a very special episode of TRIGGERnometry on the Road from the USA.
00:03:18.140 I'm Francis Foster.
00:03:19.700 I'm Konstantin Kisin.
00:03:20.760 And this is a show for you if you want honest conversations with fascinating people.
00:03:26.640 Delighted to say our brilliant guest today is a neuroscientist, philosopher,
00:03:30.560 and one of America's and the world's most prominent public intellectuals, Sam Harris.
00:03:34.840 Welcome to TRIGGERnometry.
00:03:35.960 Thanks, guys.
00:03:36.720 Yeah, great to be here.
00:03:37.500 It is great to have you on the show.
00:03:39.360 We mentioned that usually when we start the show, we ask our guests to introduce themselves.
00:03:43.880 You're well known enough that you don't need to do that.
00:03:46.160 But what we did want to talk to you about, which is what we've been asking a lot of our guests on this trip in the U.S.,
00:03:51.040 is how are you who you are?
00:03:53.260 Because you've done things that most people wouldn't do or wouldn't want to do,
00:03:57.260 would be scared to do, calling out some ideologies that people are afraid to call out.
00:04:01.660 That takes courage, but it also takes determination.
00:04:04.700 It takes something.
00:04:06.300 Why do you have that something?
00:04:07.860 How did you get it?
00:04:10.240 It's a hard question to answer.
00:04:11.420 I mean, I think there's one algorithm I'm running more than most, which is, you know,
00:04:20.800 what I would call intellectual honesty, right?
00:04:23.020 And so the burden is not to be who you were yesterday.
00:04:27.800 The burden isn't to join some tribe who, you know, you'll get social reinforcement from for, you know, conforming to.
00:04:36.660 So it's insofar as I'm continually just trying to figure out what's true
00:04:44.820 and what's consistent with what I claimed was true five minutes ago or five years ago,
00:04:50.260 that causes me to just bump up against taboos and blasphemies and ideologies that are more rigid than that, right?
00:05:02.320 I mean, if you're, I mean, really it's, I mean, even having an identity itself is too much, you know?
00:05:09.960 It's like not only can you not really conform to a tribe, you can't really even conform to who you were yesterday
00:05:17.540 if your master value is to be honest and rigorous and available to new data and new arguments and new insights.
00:05:28.560 That's a very good answer, but it doesn't answer my question, which is how did you become that way?
00:05:32.880 Why are you that way?
00:05:33.500 Yeah, I have no idea.
00:05:34.320 I just, like, that was my, you know, factory setting.
00:05:37.460 And so from a very early age, you know, I guess I showed up as a skeptic on many fronts.
00:05:47.000 I mean, I was certainly an argumentative teenager, you know?
00:05:51.340 I like to say that.
00:05:52.620 So, yeah, I mean, that was really some kind of default, and that part really hasn't changed.
00:06:01.700 I think I pick my battles better now than I used to.
00:06:04.580 I mean, I could use, you know, I could turn any dinner party into a, you know,
00:06:08.560 a knock-down, drag-out debate about the most fundamental issues anyone could summon.
00:06:13.540 And, you know, I think I, as I get older, I pick my battles more just because I, like,
00:06:23.080 the hassle factor of touching certain topics in certain ways has become more salient for me.
00:06:29.100 I mean, Twitter was the real teaching tool for me.
00:06:32.280 Like, I just, you know, I got on the platform, as everyone did,
00:06:36.160 not really aware that I was enrolling myself into a psychological experiment
00:06:40.480 to which no one had consented, and the outcome is as yet unforeseen.
00:06:46.120 And, yeah, I just kind of let loose, you know, on various topics.
00:06:53.340 And I would, you know, when I would see some malefactor there who was treating me or other people badly,
00:06:58.600 well, then I had to deal with that right then and there.
00:07:00.720 And, yeah, it's just the hassle factor of dealing with the toxicity of it.
00:07:10.620 And also just the, I'm convinced now that at bottom it's a misrepresentation of humanity, right?
00:07:18.740 It doesn't seem like, I mean, you're actually dealing with what someone wrote there.
00:07:22.280 I mean, it's not in error.
00:07:23.820 But you're not getting the whole person.
00:07:26.840 You're getting a part of them that has been amplified by the frame into which the conversation has been put.
00:07:36.100 And the frame has certain features that are not helpful to good conversation.
00:07:41.160 I mean, it's got anonymity.
00:07:42.200 It's got, you know, it's got facelessness.
00:07:44.440 It's got the performative aspect.
00:07:47.160 You're doing it in front of your crowd or some possible crowd.
00:07:50.100 So it just, you know, it's disastrous for intellectual honesty and compassion and, you know, theory of mind to actually understand,
00:08:02.800 taking the extra moment to understand where the other person is coming from.
00:08:07.040 And so it's, you know, there's no principle of charity.
00:08:09.460 So it's just, it's really a disastrous machine for manufacturing discord.
00:08:15.100 We had a guest on the show called Richard Grannon who made the point that what it does is reduces everyone to an avatar.
00:08:22.860 So if everyone's an avatar, what does it matter if you attack them, if you, you know, dehumanize them, if you misrepresent them?
00:08:30.320 Because the game is to win, right?
00:08:32.180 It's not to actually have a discussion.
00:08:34.040 It's to destroy and ultimately to win.
00:08:35.980 Yeah, well, especially if you're not, if you're a public person dealing with people who are not public people, right?
00:08:43.340 Because then there's really, then it's just a single shot, you know, lottery.
00:08:48.120 It's not a, there's no ongoing future of collaboration or cooperation that is being maintained.
00:08:55.420 I mean, it's even bad when you have two public people who, you know, are, who know, both know they're going to meet each other in real life at some point.
00:09:05.160 Still the wheels come off rather often to a surprising degree.
00:09:09.580 But yeah, it's just, in the end, I think it's bad technology, which is still somewhat inscrutable because it seems like it should be good.
00:09:21.460 I mean, and in some ways it is good because you're seeing, you know, you're seeing a lot of smart people tell you what the most interests them and most worries them on a daily basis and, you know, sending you articles and videos that you, and that's, that's why I'm on, that's why I can't break my connection to it because I'm following so many smart people who are curating for me an information diet that I still appreciate.
00:09:44.440 And then I, you know, occasionally put my own stuff out there just so it's like a kind of a marketing channel, but I'm doing much less in the weeds back and forth with even public people who I notice, you know, you know, poke me on, on a given issue or whatever.
00:10:00.140 So Sam, does it not go, do you not have a little thought in your head when you go to tackle these very contentious subjects, you know, and you know that you're going to get pushback, you know, that you're going to get flak, you know, that you're going to get misrepresented.
00:10:15.180 Do you not think I really shouldn't be doing this or what goes through your mind before you go out and you make your point?
00:10:22.380 I love shopping for new jackets and boots this season.
00:10:25.480 And when I do, I always make sure I get cash back with Rakuten.
00:10:29.360 And it's not just fashion.
00:10:30.780 You can earn cash back on electronics, beauty, travel, and more at stores like Sephora, Old Navy, and Expedia.
00:10:37.120 It's so easy to save that I always shop through Rakuten.
00:10:40.160 Join for free at rakuten.ca and get your cash back by Interac e-Transfer, PayPal, or check.
00:10:45.740 Download the Rakuten app or sign up at rakuten.ca.
00:10:48.900 That's R-A-K-U-T-E-N dot C-A.
00:10:52.380 Well, again, I think about it more than I used to now.
00:11:00.020 I mean, I used to do it very—basically, there was no friction in the system.
00:11:05.240 I mean, I just was like, okay, this is like that cartoon meme, you know, somebody on the Internet is wrong about something.
00:11:11.780 I was that guy on Twitter, and so I'm not that guy anymore.
00:11:19.900 And I really do pick my moments, and there's a cost to that because there's, you know, that you decide to sit certain moments, you know, cultural moments.
00:11:30.740 You sit on the sidelines.
00:11:31.680 But if I guess I could distill it to a lesson here, it's like you're not—you don't always need to have an opinion about everything, right?
00:11:46.940 You certainly don't always need to have a strong opinion about everything.
00:11:49.280 And even if you do have a strong opinion, you don't always have to be the person expressing that opinion because very likely someone else will, right?
00:11:58.620 And, you know, given those adjustments to the machine, you can just decide, you know, is this—do I really want to spend the next 24 hours dealing with the aftermath of this thing that I'm tempted to tweet, right, or to say in some other format?
00:12:16.420 And in particular, it relates to the likelihood that certain personalities are going to go berserk.
00:12:24.320 And then with certain people, it's, you know, it's guaranteed that they're going to go berserk.
00:12:28.000 So, like, do I—yeah, here's this odious opinion expressed by a, you know, a semi-odious person who really deserves to hear what I think right now.
00:12:41.180 Is it worth, you know, whacking that hornet's nest and then just dealing with—dealing with and being seen to deal with or to not deal with and to maybe look like you can't deal with the aftermath, right?
00:12:53.920 It's like—
00:12:54.920 Sam, you talked about tribalism at the beginning, and that's something that Francis and I both feel very strongly is contributing to much of the divisiveness and the way things are going.
00:13:05.260 And, look, let's be clear, you know, the show is called TRIGGERnometry.
00:13:07.980 We want to explore difficult subjects.
00:13:10.160 There's no question that neither him or I are woke in the—opposing that ideology has been a big part of what we do.
00:13:17.060 But the tribalism is a different thing.
00:13:19.900 We don't want to be in the anti-woke tribe or the woke tribe, right?
00:13:23.020 Yeah, yeah.
00:13:23.920 And there was a tribe around 2015, 2016, this very small tribe of very smart people, which was referred to as the intellectual dark web, which I remember at that time, we weren't doing this, we were just two comedians.
00:13:36.720 I remember watching you guys have those conversations and being inspired by people.
00:13:41.160 I don't think you guys had the answers, but you had the right questions.
00:13:44.160 You did have the right questions.
00:13:45.420 And then, over time, we watched that loose tribe of very bright people, as loose tribes of very bright people always do, crumble, disintegrate, fall out.
00:13:55.920 Yeah.
00:13:56.300 What happened?
00:13:56.940 Well, the first thing that happened is that it was actually, for some of us more than others, a tongue-in-cheek label for a tribe that really wasn't—I mean, none of us are tribal people.
00:14:09.140 You know, it really is a herding cats sort of situation.
00:14:13.620 And when I launched—so, you know, it was an Eric Weinstein coinage, which I launched on a podcast we did.
00:14:20.620 And I think in that context, I telegraphed that I thought it was tongue-in-cheek.
00:14:26.820 I mean, I think he probably thought it was more in earnest than I did, or he was at least more attached to the label.
00:14:32.320 And then, very quickly, there were people who sort of joined this collection or who were said to be in it, some of whom I had never heard of at that point, who, you know, upon just a little bit of analysis, revealed themselves to be people who I, you know, I really don't agree with.
00:14:54.320 Not just on the actual substance of specific opinions, just their methodology by which they would generate opinions or their lack of methodology.
00:15:04.300 So, you know, I'm not inclined to name names, but there are people who, like, it's just wrong to think, you know, they were ever moving in the same lane I was in at that point when we were all called IDW people, right?
00:15:18.900 But I think the biggest force of fragmentation was Trump and what certain people did or didn't do with that phenomenon, you know?
00:15:31.440 And this is what I was going to ask you. I'd say there were two things that fractured it from looking in from the outside.
00:15:36.300 I mean, COVID later, but Trump was the big one.
00:15:38.600 So let's start with Trump, because I want to talk about COVID as well.
00:15:41.080 But if we start with Trump, you took a different view to almost everybody, I would say, in what was described as the IDW, in the sense that you were, I think you were calling for Twitter to shut down Trump's account and were happy that it happened.
00:15:56.740 Yeah.
00:15:57.020 That's a very different position to pretty much everybody else.
00:16:00.240 Why did you take that position?
00:16:01.500 Well, for two reasons.
00:16:03.260 One, so the non, the generic reason is, and this is something I've never gotten a clear answer on from any of the people who took the different side of this.
00:16:17.460 And many, so many of these people are ostensibly libertarians, or at least, you know, quasi-libertarians.
00:16:22.340 And they want something like a minimum of state coercion and control.
00:16:28.120 They don't want just a proliferation of laws, you know, just to make our lives more difficult.
00:16:35.520 And that's an orientation, you know, though I consider myself liberal and have always voted as a Democrat.
00:16:43.000 I mean, until we dealt with this woke apocalypse, you know, I would have certainly called myself a Democrat without much self-consciousness.
00:16:50.280 But, you know, I've always had this libertarian kind of underpinning to my politics, which is, you know, if the private sector can handle it, it's probably best done there, right?
00:17:03.280 I mean, just given the level of inefficiency and poorly aligned incentives you get in a government bureaucracy.
00:17:09.480 And peaceful, honest people should have the right to be left alone, you know, so it's like unless somebody is harming people or, you know, guilty of fraud, you know, i.e. theft, you know, stealing from people, we don't need the government involved.
00:17:30.200 And so, you know, that's my general framework.
00:17:35.040 And many people ostensibly in this group ostensibly agreed with that.
00:17:39.360 So when I look at Twitter, you know, Twitter is a company that can decide to, you know, as someone who has started, you know, information-based companies at this point, I'm just thinking about what's the scenario under which I would want the government to force me to have Alex Jones on my podcast or to have Donald Trump on my podcast?
00:18:03.860 Shouldn't I be able to have anyone I want on my podcast?
00:18:05.600 Is it conceivable that my podcast could grow so big or that my, you know, that any other platform, you know, I've considered creating a social media platform, right?
00:18:15.520 If that could grow so big that suddenly the government would have an interest in forcing me to have people on it who, for whatever reason, I object to having on it.
00:18:26.040 I mean, so this is a way in which I'm more extreme than most people on the left.
00:18:32.120 Like, I do think at this point in history, you should be able to have a social media platform and exclude any specific group you want and just say that's the way we do it, right?
00:18:45.360 And if you don't like it, boycott us, right?
00:18:47.780 So, like, I wouldn't have said this in 1964 when we have to pass the Civil Rights Act.
00:18:52.700 But at this point, I think you should have the right to be an asshole who destroys your reputation and suffers the penalties in, you know, in the marketplace of ideas, right?
00:19:05.160 So, I think if you want to just have a social media network for beautiful people, right, or people who are, you know, guys who are over six feet two and blonde hair and blue eyes, right, you know, I can't get on.
00:19:15.460 You should feel free to, you know, raise money for that enterprise, launch it, and I'll be, you know, I'll laugh when it fails, right?
00:19:22.820 So, like, that's – now, under some control, that kind of thing, you know, is or should be illegal, you know, if you're just a normal person on the left.
00:19:34.880 But I don't think – I think at this moment in history it shouldn't be.
00:19:39.260 But in any case, I just – when I look at Twitter, I see a company that has a term – has terms of service, which people like Alex Jones and Trump clearly violated.
00:19:49.780 I mean, whether they in fact violated the terms of service as written, I think they violated any coherent terms of service that Twitter should have had, right?
00:20:00.200 Like, you shouldn't knowingly be able to turn your mob on a private citizen and ruin their lives through doxing, right, which is what Jones and Trump were doing just again and again and again to people.
00:20:14.280 Every time – I mean, Jones was doing it with the Sandy Hook parents, right?
00:20:18.380 You literally have –
00:20:19.700 But, Sam, you're conflating two very different people.
00:20:21.800 I mean, Alex Jones does not belong in this conversation.
00:20:23.880 I'm not interested in Alex Jones right now.
00:20:25.460 Well, but I would dispute that.
00:20:26.840 I think Trump is essentially – we got Alex Jones as president of the United States.
00:20:31.220 I don't think they're very different people.
00:20:32.760 I think it's the same phenomenon in my world.
00:20:35.260 Because just the level of misinformation, disinformation, lying, the charlatanism, the conscious fraudulence of everything at scale and the targeting of individuals with known consequences, right?
00:20:51.920 Like, Trump – every time Trump singles out a specific citizen and says, look at this jackass who's trying to – whatever the claim would be, that is a human sacrifice.
00:21:06.740 We know that person's life is just never the same again because he's turned tens of millions of morons on that person and, you know, vicious morons on that.
00:21:19.280 I mean, like that's – the core of the Trump phenomenon is now and has been for many years.
00:21:26.840 I mean, really, since the beginning, since he – certainly since he became the frontrunner and certainly since he became elected in 2016, it's a personality cult.
00:21:37.840 I mean, it has all the dynamics of a personality cult.
00:21:39.880 These are not reasoning – yes, there are some – there are a few calculated people like Peter Thiel on the margins who have some story as to why they would back him, right?
00:21:49.440 But the core of the cult, you know, which is all, you know, nested with QAnon and conspiracy thinking and the big lie and, you know, it's like Trump can do no wrong, right?
00:22:02.240 He's – that is so – it's – I mean, as a Venn diagram, it's just – it overlaps 80 percent with the Alex Jones phenomenon.
00:22:11.980 So I just – I see them as the same problem.
00:22:14.340 I see – these are – these are, you know, if they're not actually clinically, you know, diagnosable as psychopaths, they're the next best thing.
00:22:24.260 These are people who are so malignantly selfish and so careless with respect to the consequences of their actions in the lives of others that if you are – if you own a platform or you're, you know,
00:22:38.960 you're overseeing a public company that owns a platform, why should the government force you to keep these people on, right?
00:22:51.280 Like, you should be free to say, sorry, you're not on my watch.
00:22:54.860 Are you going to be having these consequences?
00:22:57.580 And with Trump, it was – after January 6th, there was just – I mean, that's when it happened.
00:23:01.560 I thought it happened a year too late.
00:23:03.980 But, I mean, January 6th finally convinced Dorsey he should kick Trump off.
00:23:08.180 And that – I mean, if that's not going to convince you that, you know, we have – we had a – at that point we had a sitting president who for months and months and months, I mean, you know, at least six or eight months, you know, certainly months prior to the November election, would not commit to a peaceful transfer of power.
00:23:25.980 And then he did, you know, certainly something, whether it was everything in his power or just a lot, he managed to see that we did not have a peaceful transfer of power, right?
00:23:39.120 And then, you know, so what's the mob going to do on January 7th and 8th and 9th, you know, if you just leave Trump on the platform?
00:23:48.520 I mean, I just thought it was a very simple decision to kick him off and totally analogous to the Alex Jones decision.
00:23:57.160 I mean, Alex Jones is just less consequential, but, I mean, there are Sandy Hook parents who have had to move 10 times since their kids were murdered.
00:24:08.220 That's all on Alex Jones, right?
00:24:10.660 And it's all conscious.
00:24:12.200 It's all – he could see the consequences of his actions in real time.
00:24:15.900 It's not like he woke up after, you know, five years and thought, oh, my God, I can't believe that, you know, it was totally inadvertent.
00:24:23.500 I released a podcast and, you know, then it had this totally unforeseeable consequence in the lives of these grieving parents.
00:24:31.320 No, no, he monetized their misery with just a blizzard of lies.
00:24:39.040 Alex Jones is, for me, a different case, but I hear what you're saying.
00:24:41.920 I think Trump – I mean, Trump is just –
00:24:43.540 I hear what you say.
00:24:44.260 In your mind, they're similar.
00:24:46.480 He got the reputation washing of having successfully become president.
00:24:50.760 You know, he's Alex Jones.
00:24:52.600 Okay.
00:24:53.300 Francis, before you move us into COVID, let me try from a different angle, Sam, because I want to explore this intellectual point, right?
00:25:00.340 Do you really want to live in a country where you have a digital public square, which, in my opinion, Twitter is.
00:25:06.360 We can disagree about that if you want, but that's my opinion.
00:25:08.540 It's a digital public square, and you have a company that has clearly one-sided enforcement.
00:25:14.820 I hear what you're saying about delegitimizing the electoral process that Trump did, and I was concerned about that.
00:25:22.060 I think you can't question the system in that way.
00:25:24.280 But when you see that he gets banned, and then a story about Hunter Biden gets banned, that under the guise of it being Russian disinformation, we later learn it wasn't Russian disinformation.
00:25:36.800 That, to a lot of people, seems like, you know, I said it when we were talking to Joe Rogan, it's putting your hand on the scales in favor of one side.
00:25:45.520 In the digital public square, you add that to the banning of Trump and lots of other people being banned from one side, predominantly.
00:25:54.160 Is that the world you want to live in, where one team gets to just ban people it disagrees with off the platform?
00:26:01.860 It gets to pretend that things that are true are not true.
00:26:05.080 It gets to shut down the sharing of information with people who want to make their own democratic choice?
00:26:09.880 Well, it's a hard question, and there are pieces of the question that are individually hard.
00:26:15.960 It's like the Hunter Biden laptop story is something that I still don't have a full opinion about.
00:26:25.560 I actually don't know what we should have done about that.
00:26:28.320 I mean, so I see the reason.
00:26:29.780 I see both sides of it.
00:26:31.240 I can argue either side of it.
00:26:33.460 So let's leave that piece aside.
00:26:35.660 The bias on the platform, so either Twitter is a company that can do what it wants, right?
00:26:42.800 It can have its own terms of service.
00:26:44.580 It can change its policies.
00:26:46.220 It can have a point of view or not, right?
00:26:52.220 Or we have to seize it as some kind of crucial piece of public infrastructure that has to function by different terms.
00:27:02.060 Do you not think that it is a crucial piece of public infrastructure?
00:27:04.400 I think people who are addicted to Twitter feel it is, right?
00:27:09.800 Most, you know, and I think it's, you know, I don't think it should be.
00:27:15.380 And it's odd to say that it's just so, first of all, it's just, I mean, Facebook is much, much, much bigger, right?
00:27:21.880 It's just that we have a lot of smart people, journalists, brands, political people concentrated on Twitter.
00:27:29.540 So Twitter moves the conversation more than Facebook does.
00:27:33.580 But the scale of it is much smaller.
00:27:35.880 I don't know.
00:27:39.760 I just feel like people can start their own companies, which they have, right?
00:27:45.320 So they can start competitors at Twitter.
00:27:46.760 There are many people who, you know, Twitter is not, it's still a failing business, right?
00:27:50.560 It's like it's not, it doesn't work, really.
00:27:53.780 I mean, Facebook is a much better business.
00:27:55.740 There's nothing stopping Facebook from becoming stickier for intellectuals and journalists and attracting more of the conversation over there.
00:28:10.820 I don't know.
00:28:11.880 It's just, it's an extreme move to say you can't, you can't be biased, right?
00:28:21.220 Like who's, who's going to say that, but behind, behind the saying of that is a law in the end and there, and therefore it's a gun.
00:28:29.200 Therefore it's, it's, it's jail time for the person who wants to keep breaking the law, right?
00:28:33.640 So like, just imagine, imagine if Twitter, the Twitter board, it's like what you, everyone gets what they want.
00:28:38.780 You know, everyone who's, who's of this opinion gets what they want.
00:28:41.160 You just, we're going to, we're going to come in and, and, and, and enforce something like, um, uh, a zero bias state in Twitter, insofar as that's possible.
00:28:52.440 And if the, if the employees and the board just say, you know, sorry, we, we have a point of view.
00:28:57.140 We want, we want to have, we, we don't like these people and we like these people.
00:29:01.980 Um, what does now you just break up the company?
00:29:05.540 You just say, you know, I mean, I, I thought what I thought it should have happened with Twitter.
00:29:09.800 I thought Jack Dorsey should have deleted it.
00:29:12.960 I mean, I literally thought he should, would have got the Nobel Peace Prize and he just at a certain point deleted it.
00:29:18.820 Right.
00:29:19.460 Um, but, uh, yeah, I don't, so in any case, what should there, should they be forced to be impartial?
00:29:28.700 I'm very skeptical of that.
00:29:31.220 Should they be cajoled by unhappy people like yourselves or like, you know, the, uh, um,
00:29:38.540 you know, the Trump fans to, um, to behave better?
00:29:42.560 I'm just putting the counter argument to you, Sam.
00:29:44.400 I mean, I think, so yeah, yes, I think
00:29:46.080 if they were going to be, the first thing to admit is it may be impossible to do this impeccably, right?
00:29:54.460 It's like, it's like the, until we have, you know, perfect artificial intelligence, it's just going to be impossible to be truly consistent with your terms of service because you're always going to be able to find the example of the thing that was not appropriately moderated.
00:30:10.280 Yes, but if, we all know that if that laptop was Donald Trump Jr.'s, this would be treated, that's, that's all I'm asking.
00:30:17.480 Oh yeah, but that's, so let's take that piece.
00:30:21.060 Um, I think it was totally appropriate to view Trump in a, to be existing in a, in a domain that was orthogonal to partisan politics.
00:30:36.560 I, my criticism of Trump is totally nonpartisan, right?
00:30:40.980 There is absolutely, there's literally nothing I say about Trump that I could say about any other Republican, right?
00:30:47.820 And I think Liz Cheney is a total hero, right?
00:30:50.320 And I don't agree with her politics at all, right?
00:30:53.320 Like, Liz Cheney is a religious maniac by my lights, right?
00:30:57.360 And in that sense, kind of a terrifying political figure. Like, the old me, who was just worried about Christian theocracy in the United States, would have revolted at everything she would attempt to implement as a politician.
00:31:16.840 But at this moment, she has no bigger fan than me, because of how she's dealing with the Trump phenomenon.
00:31:27.240 The Trump phenomenon is not a matter of political partisanship.
00:31:30.880 He is just a sui generis phenomenon.
00:31:35.420 And again, it's analogous to having elected Alex Jones president of the United States.
00:31:41.280 It's not a matter of his policies. I probably agree with half of his policies, or more than half.
00:31:49.380 It's not a matter of policy.
00:31:50.820 It's a matter of having someone who's totally unfit to have power, be given more power than any person in a generation.
00:31:58.880 And he's unfit in every possible way.
00:32:04.380 It's not that he's just got a few screws loose.
00:32:08.200 Like, every screw is loose.
00:32:09.600 Every screw that you would want totally cranked down is loose or non-existent in him.
00:32:15.720 And so, yeah, that's my argument.
00:32:19.600 My argument is that it was appropriate for Twitter, the heads of big tech, and the heads of journalistic organizations to feel that they were in the presence of something like a once-in-a-lifetime moral emergency, right?
00:32:36.760 Whereas this is not the same thing as not liking George Bush, you know, or not liking John McCain or not liking Mitt Romney for their politics.
00:32:46.460 This was: here's a guy who is capable of anything, right?
00:32:53.900 He's not ideological, but he is, again, a black hole of selfishness, right?
00:32:59.500 And so there's no telling what he's going to do.
00:33:04.060 And we cannot afford to have four more years with this guy, right?
00:33:09.680 And so what should well-intentioned people do who have a lot of power in these various ways?
00:33:19.640 You know, you're running the New York Times, you're running CNN, you're running Twitter.
00:33:23.480 What should they conspire to do?
00:33:26.640 Admit that it's their fault.
00:33:27.460 Under those conditions.
00:33:28.340 What was that?
00:33:28.960 Admit that Trump is their fault.
00:33:30.480 And look, I'm someone from the left, Sam.
00:33:32.020 Absolutely.
00:33:32.340 Well, no, that's the perverse thing.
00:33:34.460 It's totally their fault.
00:33:35.540 I mean, CNN gave us Trump, right?
00:33:38.740 Well, no, before CNN gave us Trump, Mark Burnett gave us Trump.
00:33:42.500 I mean, if there's one person who could have not done what he did, and could have closed the door to this whole phenomenon, it was Mark Burnett.
00:33:52.320 But, yeah, by giving him the attention, you know. He was great ratings for a year, for the whole run-up to the 2016 election.
00:34:04.500 And, yeah, no one has clean hands here.
00:34:08.720 But at the 11th hour, when who knows how this election is going to go, who knows what the capacity for disinformation at the last minute to tip the balance is.
00:34:22.960 And then what do you do with the Hunter Biden laptop story, when we already know how this played out in 2016 with the Hillary Clinton email press conference, where Comey, in an abundance of scrupulosity, felt like he had to come before the cameras, I think 10 days out from the election,
00:34:44.140 and say, you know, we're going to open up this investigation again, because we've got Anthony Weiner's laptop.
00:34:52.520 I mean, again, her failure to become president was overdetermined.
00:34:56.860 She was an appallingly bad candidate.
00:34:58.820 But in terms of just tracking the poll numbers, that was the killing blow to her candidacy, right?
00:35:07.940 That final moment.
00:35:09.940 And this was a highly analogous situation.
00:35:13.060 This was: we're going to open up this laptop from hell, and the news cycle, for who knows how long, is going to be conceivably a nuclear bomb of an October surprise.
00:35:29.200 And we're going to get four more years of Trump if we actually give this a fair hearing.
00:35:34.060 But Sam.
00:35:34.580 But you can't do that, Sam, surely.
00:35:36.680 You've got to realize that you've got to be fair and number, the thing that I want to.
00:35:41.240 We're all equal before the law.
00:35:42.420 Yeah.
00:35:42.960 Aren't we?
00:35:43.480 And the other thing.
00:35:44.020 This isn't the law.
00:35:44.540 I know it's not the law.
00:35:45.720 But if you accept my supposition that this is the public square, then it is the law.
00:35:51.660 If it is the public square, then it is law.
00:35:54.420 Now, you're arguing it's not the public square, which is fair enough.
00:35:56.940 Yeah.
00:35:57.420 Right?
00:35:57.680 That's fine.
00:35:58.280 But why don't we move on?
00:35:59.280 Because I think we've done enough.
00:36:00.500 Yeah, true.
00:36:00.940 He's sucked up a lot of it.
00:36:02.640 He's got to have that.
00:36:04.240 He's doing that.
00:36:04.860 No, but I'll just say, just finally, it's like a coin toss for me, the Hunter Biden laptop thing.
00:36:11.900 Because I do understand how corrosive it is for an institution like the New York Times to show obvious bias and inconsistency and dishonesty in how they... it's like they couldn't even frame it honestly.
00:36:30.200 The way I would frame it is: listen, I don't care what's in Hunter Biden's laptop.
00:36:38.940 I mean, at that point, Hunter Biden literally could have had the corpses of children in his basement.
00:36:45.480 I would not have cared.
00:36:47.060 Right?
00:36:47.240 It's like, there's nothing.
00:36:48.960 First of all, it's Hunter Biden, right?
00:36:50.420 It's not Joe Biden. But even whatever the scope of Joe Biden's corruption is, even if we could go down that rabbit hole endlessly and understand that he's getting kickbacks from Hunter Biden's deals in Ukraine or wherever else, right?
00:37:07.200 Or China.
00:37:07.720 It is infinitesimal compared to the corruption we know Trump is involved in.
00:37:16.220 It's like a firefly to the sun, right?
00:37:18.920 I mean, it doesn't even stack up against Trump University, right?
00:37:24.680 Trump University as a story is worse than anything that could be in Hunter Biden's laptop, in my view, right?
00:37:31.500 Now, that doesn't answer the people who say it's still completely unfair to not have looked at the laptop in a timely way and to have shut down the New York Post's Twitter account.
00:37:43.660 Like, that's just a conspiracy, a left-wing conspiracy, to deny the presidency to Donald Trump.
00:37:50.560 Absolutely it was.
00:37:51.760 Absolutely.
00:37:52.640 Right?
00:37:52.900 But I think it was warranted.
00:37:54.380 Right?
00:37:54.680 And again, it's a coin toss as to whether or not that particular piece is.
00:37:58.480 I'm really sorry.
00:37:59.000 I was the one that said we should move on, but you've just said something I really struggled with there, which is you support...
00:38:05.080 The kids in the basement?
00:38:06.740 No, no.
00:38:07.540 Fuck the kids in the basement.
00:38:09.240 I'm interested in democracy.
00:38:11.080 You're saying you are content with a left-wing conspiracy to prevent somebody being democratically re-elected as president.
00:38:19.220 Well, no, I'm content.
00:38:20.460 Well, the thing is, it's just not left-wing, right?
00:38:22.640 So Liz Cheney is not left-wing, right?
00:38:24.920 Liz Cheney is doing everything in her power.
00:38:27.020 Conspiracy to prevent somebody being democratically re-elected.
00:38:29.160 No, but it was a conspiracy out in the open.
00:38:33.660 It does, but it doesn't matter what part's a conspiracy and what part's out in the open.
00:38:38.620 I mean, I think it's like, if people get together and talk about what we should do about this phenomenon, you know,
00:38:44.740 it's like, if there was an asteroid hurtling toward Earth, and we got in a room together with all of our friends and had a conversation about what we could do to deflect its course, right?
00:38:56.980 Is that a conspiracy?
00:38:58.240 You know, like some of that conversation would be in public, some of it would be in private.
00:39:01.700 We have a massive problem.
00:39:03.460 We have an existential threat, right?
00:39:05.300 Politically speaking, I consider Trump an existential threat to our democracy, right?
00:39:10.440 Now, he's not going to destroy the world, very likely.
00:39:13.140 If you destroy democracy in the process of protecting democracy.
00:39:16.540 But that doesn't destroy... no. What I'm not suggesting, at no point was I suggesting, is that we should stuff ballots.
00:39:26.260 Or actually break the machinery of democracy. But political opinion is already being completely inundated with misinformation, biased takes, half-truths, and outright lies, right?
00:39:44.400 Or just the amplification of bad or misleading information based on, you know, the algorithm, right?
00:39:50.220 So it's already just an abattoir of opinion, right?
00:39:59.400 And now the question is, you know, what can you do, given your own biases, to get the outcome you think is actually better, not just for yourself personally, but for the world, right?
00:40:13.260 So, like, I'm completely unconflicted in the claim that a first Trump term was bad and a second Trump term would be bad.
00:40:27.260 And it literally doesn't matter what else was on the menu.
00:40:33.280 Like, literally, pick a random American: better than Trump in the Oval Office.
00:40:39.200 Like, the likelihood that you're going to get someone who's worse than Trump, given what I consider bad about Trump, is, I mean, on the order of one in a million, right?
00:40:51.800 Like, you're just not going to get worse than Trump if you pick at random.
00:40:56.360 And, you know, Hillary Clinton, for all of her flaws, was not worse than Trump.
00:41:00.240 Joe Biden, we could have known Joe Biden was going to just be comatose in office: not worse than Trump, right?
00:41:07.720 Kamala Harris, not worse. And again, it's not just a marginal call.
00:41:13.480 These are normal politicians, who are so much more constrained by predictable machinery, right?
00:41:24.560 Like, there's so much less of an opportunity there to destroy institutions that we have to rely on, right?
00:41:35.120 With any of those people in charge, including a random person in charge, a random person who's going to be terrified at the responsibility of the office and default to expert opinion across the board.
00:41:46.680 No, again, Trump is an Alex Jones-level figure for me.
00:41:53.240 And so, you know, a smaller problem would be for some billionaire to buy the New York Times and give it to Alex Jones to run, right?
00:42:02.040 That would be an enormous, catastrophic loss and mistake.
00:42:05.520 But that's a smaller problem than getting Trump reelected.
00:42:09.780 The last question I'm going to ask, which actually isn't really about Trump, is: could you agree that the reason Trump was created is that he is a symptom of a system in which ordinary people feel that their voices aren't being heard?
00:42:26.280 Yeah.
00:42:27.080 They realize that, you know, Washington is a machine that doesn't particularly care about them.
00:42:32.760 They were betrayed time after time, many times by the Democrats, who said that they were representing ordinary working people, like the Labour Party in my country.
00:42:42.780 And they felt that these politicians didn't care.
00:42:45.860 So why not vote for Trump?
00:42:47.240 What else have you got to lose?
00:42:48.480 Yeah.
00:42:48.860 Oh, yeah.
00:42:49.080 No, I think that explains most of his support and certainly his success.
00:42:55.760 Yeah.
00:42:55.900 But I think we should be honest about how both uninformed and nihilistic, by turns, that attitude is, right?
00:43:11.240 I mean, that is like the clearest eruption of Thanatos, you know, in our lifetime, right?
00:43:23.000 It's just: let's burn it all down on some level.
00:43:26.980 Like, this guy's our wrecking ball.
00:43:29.160 We hate the elites.
00:43:30.680 We hate the so-called experts.
00:43:32.620 Go fuck yourselves.
00:43:34.160 We're just going to enjoy watching this thing, you know, swing through everything you care about.
00:43:40.780 And, you know, the sounds of explosions are just going to give us pleasure, right?
00:43:48.060 Like, that's where we are with tens of millions of people in this country.
00:43:53.240 That's, you know, a very scary basis from which to try to cooperate at scale and produce political outcomes that are actually going to be good, right?
00:44:09.920 And, again, the extremes amplify each other, right?
00:44:12.300 So you've got Trumpism.
00:44:13.920 I mean, there was no greater goad to wokeism than Trumpism, right?
00:44:18.520 And so, you know, I put myself in second place to nobody, although I probably spent a little bit less time on it than some people we could name,
00:44:29.860 in the revulsion I feel to the extreme-left activism, right?
00:44:39.380 I mean, it's just as dishonest as it can possibly be.
00:44:44.160 And its dishonesty is harder to parse for smart people.
00:44:48.840 Smart ethical people find what's happening on the left much more confusing than what's happening on the right.
00:44:54.440 So people ask me about this, and I spend much more time focused on the left than I do on Trump or on the right, because...
00:45:02.500 Not in this interview, Sam.
00:45:03.840 No, no, no, you, you go to me.
00:45:06.280 But look, revulsion.
00:45:06.820 You've got the full dose of my Trump.
00:45:08.420 Revulsion is a strong word.
00:45:10.660 Why do, and I feel exactly the same thing.
00:45:13.120 And you know why.
00:45:13.700 We talked about my book before.
00:45:14.980 I come from a society that's seen some of these ideas being implemented.
00:45:18.600 Why do you feel revulsion, a very, very strong emotion about this ideology?
00:45:24.440 Well, because, I mean, one, it is destroying institutions that I actually care about, right?
00:45:36.440 It's like, you know, white supremacy and far-right lunacy is not affecting institutions that matter, by my lights, right?
00:45:47.980 You know, you could argue it affected the White House and the U.S. government to some degree, at the margins.
00:45:53.900 I mean, I think allegations of Trump's racism or his alignment with the far right and white supremacy, I think that's been massively exaggerated by the left.
00:46:06.580 And, you know, I actually have no doubt that he is racist, but most of the public claims about his racism, I think, are obviously false and, you know, inconsistent.
00:46:17.620 And so, I mean, I think you have to be intellectually honest, even as you deride these dangerous people and extremes.
00:46:26.460 So the left, as I'm sure you've pointed out many times on your show, has captured institutions.
00:46:40.420 It has captured academia.
00:46:41.800 It's captured journalism.
00:46:43.000 It's captured science to an amazing degree.
00:46:44.840 It's captured Hollywood.
00:46:46.280 And for reasons that are understandable, because, you know, it is hard to figure out what's wrong with Black Lives Matter as a movement.
00:46:57.200 You look at it, and it's almost perfectly engineered to get past the blood-brain barrier and attach to all the right ethical receptor sites, right?
00:47:14.040 It's just: of course I care about this, of course racism is disgusting.
00:47:23.120 I would, the last thing I would want to be is a racist.
00:47:25.640 Of course, I acknowledge the legacy of slavery and just how hard fought all of our civil rights gains have been in the United States.
00:47:33.500 Of course, I don't want, you know, members of minority groups feeling victimized, you know, much less being victimized.
00:47:39.960 You know, I want fair hiring practices; I check all the boxes of a good liberal conscience, right?
00:47:49.400 If you're that sort of person, and you're confronted by Black Lives Matter as a social phenomenon, and the protests over George Floyd and all of that, it is very hard to see that you're in the presence of a completely dishonest moral panic, right?
00:48:09.900 Because there's so many points of contact with real grievance or potential points of contact with real grievance.
00:48:16.420 And so, yeah, it's harder to parse, therefore more interesting.
00:48:23.040 And it's also more consequential in my world, because it's vitiating the New York Times and Princeton University and, you know, Science magazine.
00:48:32.520 And so it's just a full-on moral panic out there.
00:48:39.460 And what's more, you have this layer of smart people who think all of that's being exaggerated, right?
00:48:44.500 It's not really happening.
00:48:45.360 It's just a few college campuses.
00:48:47.320 It's a few kids, you know, on a few college campuses.
00:48:50.240 It's just, you know, 18 people at Yale lambasting Nicholas Christakis, and everyone else is really just a bystander to this.
00:49:01.760 And it's all being exaggerated.
00:49:05.980 The kernel of truth there is that it really is still a minority of people who actually believe this stuff.
00:49:11.140 But, you know, you only need something like 5% or 8%, a really energized activist minority, to completely co-opt a conversation.
00:49:21.180 And that's what has been accomplished.
00:49:23.000 But it's not just that they're a minority.
00:49:25.480 They're an exceptionally powerful minority, Sam.
00:49:27.680 Oh, yeah.
00:49:28.120 You know, they're the ones who dictate culture.
00:49:30.140 They're the ones who set the tone.
00:49:32.200 They're the ones who, you know, who edit and create newspapers.
00:49:36.180 100%.
00:49:36.580 So that's the real problem, isn't it?
00:49:38.560 But the question that I want to ask you is, where do you think this is going to go?
00:49:43.640 Where do you think this is going to end up?
00:49:44.840 Because he's more positive about it.
00:49:46.480 And I'm a rabid pessimist.
00:49:48.320 Right.
00:49:48.740 Where do you think this is going to go?
00:49:49.780 Well, I think, if I had to bet, the vapors of wokeism will magically dissipate at a certain point.
00:50:02.340 I think it's just a question of whether we're going to have one example of hypocrisy, or, you know, one own goal, that is so spectacular that everyone will all of a sudden pretend that they were never woke.
00:50:18.280 You know, whether it's going to be a salient moment you can point to in your timeline, or just this magical dissipation where people start making much more sense on these topics.
00:50:30.540 If I had to bet, I would think that's going to happen.
00:50:35.800 And I think it's going to happen in some short order.
00:50:37.700 I don't think we're going to be having this conversation in five years.
00:50:42.720 I would be very surprised if we're having this conversation five years from now, so count me, I guess, as an optimist on that front.
00:50:50.160 And I certainly could be wrong, but I would be surprised.
00:50:52.720 I mean, the one caveat I would put there is if we get four more years of Trump, then that goes completely out the window.
00:51:03.700 I mean, I think if we get four more years of Trump or a Trump-like phenomenon that's just as provocative to the left, then that calculation changes.
00:51:12.720 But if we've got a normal presidency in 2024, you know, Democrat or Republican, I think the woke thing has just become so unpragmatic.
00:51:29.980 And, yeah, I just don't see how people don't begin aging out of it in some short order.
00:51:40.760 I mean, it's a much bigger phenomenon, but it's somewhat analogous to the child-sexual-abuse, satanic-panic thing we had.
00:51:51.040 And I don't know if you guys had it in England.
00:51:53.600 No, we just had the Catholic Church.
00:51:55.040 Yeah, which is the true, yeah, which is the true version of many of these concerns, yeah.
00:52:02.580 But, yeah, I mean, in the States, I don't know if you know the story that the journalist Lawrence Wright told on my podcast, but he wrote a book on this.
00:52:13.300 And when he was doing the New Yorker article that became a book, he was just researching the whole satanic panic phenomenon.
00:52:20.180 And this is in the 80s in the States.
00:52:21.900 And so, for those who are too young to remember this, the allegation was that satanic cults had infiltrated preschools.
00:52:31.360 And in a very, you know, conspiratorial way, they had decided to get access to kids so that they could perform human sacrifices and ritual abuse.
00:52:43.680 And this was now happening at scale in American society.
00:52:47.540 And, you know, we had this massive problem.
00:52:49.620 And, you know, who knows what was truly at the bottom of it.
00:52:54.740 You know, whether certain rock lyrics were getting into the heads of teenagers and spawning a generation of devil worshipers.
00:53:03.320 You know, who could tell?
00:53:04.560 But we clearly have a problem on our hands.
00:53:06.380 And so Lawrence Wright, in kind of getting onboarded to this phenomenon, went to a seminar run by law enforcement.
00:53:18.240 I'm not sure, but I think it might have been in Texas, where he's lived for many years.
00:53:24.020 So this is, you know, a seminar for journalists run by law enforcement.
00:53:28.380 And he remembers the moment where the sheriff, or some LEO, said to the group: last year, 50,000 children were murdered in ritual sacrifices by satanic cults in this country.
00:53:51.240 Right?
00:53:51.360 This is a cop saying this.
00:53:53.100 And it took Lawrence, you know, five seconds to understand that there's been no year in American history where there have been 50,000 murders of any kind.
00:54:03.420 Right?
00:54:03.700 And yet here we have a cop saying that 50,000 kids have been killed.
00:54:09.100 There's 50,000 missing and murdered kids.
00:54:11.480 Right?
00:54:14.140 So what explains that level of confusion and derangement?
00:54:18.600 Right?
00:54:18.740 Like, so we're in a moment like that.
00:54:20.540 And here's the question we were going to ask you about that.
00:54:22.880 And I'm really glad you phrased it in that way, Sam, because I was a big fan of the New Atheist Movement.
00:54:30.020 Francis and I, none of the three of us are religious.
00:54:34.160 I was, and still am, a big admirer of yours and of Richard Dawkins.
00:54:38.300 I read many of his books.
00:54:39.740 However, is it possible, just is it possible, that people like us who think the way we do have forgotten that thing?
00:54:52.140 I think it was Chesterton who said that when you stop believing in God, you don't believe in nothing, you believe in anything.
00:54:57.380 Right.
00:54:57.580 Is it possible that this new religion, and I certainly see wokeness as a religion, is a product of a society that has let go of the religion that it used to follow?
00:55:05.540 Well, I mean, the short answer is probably not, because I think many of the woke are, you know, religious by my lights.
00:55:18.620 I mean, they would certainly claim to be religious.
00:55:20.120 It's not like you have a – I don't know if polling research exists on this.
00:55:24.300 It would be interesting to run these polls, but yes, loss of faith has been ramping up in America, you know, really in all secular democracies.
00:55:40.500 But still, you don't have a majority of people identified as atheists, right?
00:55:46.640 And the minority that identify as atheists is still in the single digits, because atheism as a concept just has bad PR associated with it.
00:55:56.640 You have something like 20 to 25 percent who are the so-called "nones": again, these are not people who identify as atheists, but people who would say they're not identified with any specific church.
00:56:08.060 But you still have most people who are at least nominally Christian and pretend to care about being Christian, you know, in the U.S. at this moment.
00:56:17.340 And you have something like fully half who will check many, most, or all of the boxes to attest to their belief.
00:56:29.160 And again, a lot of these people are on the Christian right,
00:56:36.520 but many of them are woke or woke-adjacent, you know.
00:56:41.500 It's like I just was on Van Jones' podcast, right?
00:56:44.900 Now, I mean, he's much woker than any of us.
00:56:52.400 I think he's probably said some rational, pragmatic things.
00:56:57.420 We didn't actually talk about this topic.
00:56:58.680 He has. I remember seeing some of that.
00:56:59.680 He's kind of taken the Obama line of saying, listen, kids, there are bigger problems than pronouns or whatever.
00:57:08.560 I don't know how he touched it.
00:57:09.720 But still, he's someone whose coverage of Black Lives Matter I would have, you know, many critical things to say about.
00:57:21.060 And again, the topic didn't come up, but he's, you know, someone who, if you asked him, do you think Jesus will be returning to earth to raise the living and the dead, I am pretty sure would say yes, right?
00:57:38.100 And you'd be surprised at the percentage of sober, non-Bible-thumping people who would say yes to that question.
00:57:48.300 I mean, I've been amazed at the people who I would have bet a lot of money would be skeptical of that piece.
00:57:58.820 They might be Christian.
00:58:01.100 They might be like, listen, I love the Bible.
00:58:03.280 It gives me a great moral framework.
00:58:05.620 It gives my kids a great moral framework.
00:58:08.760 This is the tradition I'm identified with.
00:58:11.200 This is all super important to me.
00:58:14.860 But that's kind of as far as it goes, right?
00:58:17.180 Like, I'm not going to make magical claims about flying saviors who are literally going to come down from... where is heaven, exactly, given that we have multiple telescopes up there beaming back billions of years' worth of information?
00:58:37.380 I'm amazed at the number of people who will bite the bullet on the core doctrine and say, yeah, I think Jesus is going to come back and raise the dead, right?
00:58:50.400 But, Sam, surely you have to agree in a society which is becoming ever more atomized –
00:58:54.660 Let me just close the loop on it.
00:58:55.900 Yeah, sure, go for it.
00:58:56.740 Many of these people are woke, right?
00:58:58.720 So the punchline can't be: well, they lost their religion, and now they have an ethical and existential vacuum that they're filling with wokeness.
00:59:10.940 Now, I would grant you... don't lose your point.
00:59:15.040 I would grant you that it's drawing a lot of quasi-spiritual, quasi-religious energy from the fact that most people in our society, even if they're nominally religious, really are struggling to find meaning in their day-to-day.
00:59:38.280 But when you look at just the hour-by-hour increments at which life is doled out to us: you get up, you're cast out of deep sleep or, you know, the phantasmagoria of dreams
00:59:53.920 when the alarm goes off in the morning, and how do you feel about your life, and what is going to give you moral urgency and meaning?
01:00:05.120 And millions and millions of people found it at specific moments in our, you know, recent history.
01:00:14.940 And, you know, the George Floyd killing was certainly one of those moments, where it's like, okay, enough is enough.
01:00:23.440 Like, this is my religion, right?
01:00:25.460 And that's understandable, and yes, it does have a religious dynamic.
01:00:36.420 I mean, to call it religious is actually an invidious statement about religion.
01:00:45.060 It's basically all the things I don't like about religion: it's tribalism, it's dogmatism, it's immunity to good arguments and good evidence, right?
01:00:53.740 The fact that it can't be reasoned with, really, because it just chucked reason out the door initially.
01:00:59.960 And what it's brought back in the name of reason functions under the sort of new physics of casuistry.
01:01:08.520 Like, we already know that God exists, and we know that the Bible is perfect, and we know the Quran is perfect.
01:01:13.220 And within that frame, now we're going to get really reasonable, like, you know, St. Thomas Aquinas or St. Augustine.
01:01:20.900 And that's all the stuff about religion that I find so obviously wrong, and that's so easy to see once you're not indoctrinated into that religion.
01:01:33.960 A lot of that explains what is happening politically on the far left and the far right at the moment.
01:01:42.100 Or, you know, the far right being Trumpistan.
01:01:44.540 So, Sam, I think all of us have got to admit that in a society where we're becoming ever more atomized, where people are becoming more isolated, organized religion was a bond.
01:01:59.480 It was a community.
01:02:00.500 People could go.
01:02:01.520 They could meet other people.
01:02:03.080 They could feel connected.
01:02:04.320 And so when people are disconnected, they're going to look for ways to connect with someone else.
01:02:09.920 And what better way to do that than with, you know, "I support this political movement, BLM."
01:02:14.680 Or, you know, you share the same immutable characteristics as me.
01:02:18.940 You know, I'm gay or I'm black or et cetera, et cetera.
01:02:22.200 Because we're so desperate, because we're literally programmed to form communities, we're going to have this ideology which is going to enable us to create a community.
01:02:31.920 Yeah, and on the woke side, it has this – I mean, it has a precursor in Christianity, but it's somehow in a purer form now.
01:02:44.860 It has inverted the value structure such that, you know, the lower status you are, the higher status you come out, you know, once the calculation has been done.
01:02:57.220 It's like, you know, Dungeons and Dragons with sort of the new dice, where the fewer power points you have, you know, the more you find yourself winning.
01:03:07.060 And so the victimology of it is, you know, and the meek shall inherit the earth.
01:03:13.660 I mean, it's really – it is that ethos implemented in a very weird way and sort of gamified somehow in all of the intersectionality details of it.
01:03:24.720 So, yeah, I mean, it's – there's no question people draw a tremendous amount of energy and, you know, I hesitate to say meaning.
01:03:37.300 I mean, there's a meaning in scare quotes from this.
01:03:41.540 And I guess – I mean, to steelman all of it briefly – again, especially on the left, it's genuinely confusing, right?
01:03:53.380 Like the mad work that tiny pieces of misinformation or just fraudulent assumptions are doing – it's really impossible to exaggerate.
01:04:08.340 I mean, if you ask most people who saw the George Floyd moment – again, we've yet to totally understand what happened there because, like, who knows?
01:04:24.900 Just let's bracket that because I don't – we still don't know who Derek Chauvin really is and why he did what he did, right?
01:04:30.220 So, like, either it was a racist murder or it was a – you know, his brain malfunctioned or, like, I just – I don't – honestly, I look at that video.
01:04:38.040 I don't know what I'm looking at there.
01:04:39.460 It's just – apart from the horrible killing of a person who certainly did not need to be killed in that situation.
01:04:47.200 But you ask most – most people who saw that, the vast majority of people who saw that, you know, certainly left of center, would bet their lives, bet the lives of their children that what they saw there was a racist lynching, right?
01:05:08.180 Like, what we have is a white man killing a black man because of racism – because, like, that wouldn't have happened to a white man.
01:05:17.180 It wouldn't have been perpetrated by a black man.
01:05:19.580 Race is 100 percent of the explanatory variable there.
01:05:23.060 And not only was that as unambiguously evil and sadistic and racist as it seemed, that happens thousands of times a year in America.
01:05:40.900 Like, you ask people to estimate – how many black people do you think get murdered by white racist cops in America every year?
01:05:47.440 They imagine – we're talking thousands, right?
01:05:50.200 So if you believe that, right, then what would you do?
01:05:55.620 You know, what would you – wouldn't you two take to the streets when everyone says we're going to – you know, we're protesting on Tuesday?
01:06:04.400 Of course, right?
01:06:05.600 So it's like – so you don't have to add too many pieces of, you know, distorting, you know, pseudo facts to get people who I otherwise totally understand to mouth, you know, all the predictable pieties on this topic.
01:06:25.740 But the truth is all of that is wrong, right?
01:06:29.480 Like, you know, you can count on two hands the number of unarmed black men who get killed every year by cops.
01:06:38.400 And more white people than that get killed every year by cops, right, under identical circumstances.
01:06:43.920 Again, I've talked about this on my podcast, so we need not go there.
01:06:48.140 Those who are interested could look at the episode, Can We Pull Back from the Brink?
01:06:53.320 It was beautifully done.
01:06:54.180 Yeah, like two hours I talk about this, right?
01:06:55.500 It was exactly what needed to be said in that moment, so I really congratulate you on that.
01:06:59.340 I invite people to go and find that.
01:07:00.960 Really very good.
01:07:01.880 Yeah, so we can leave that aside, but the misinformation or the faulty assumptions occur at the highest level, right?
01:07:11.540 It's like – I mean, I guess there are some people who actually know what is real here and are just cynically manipulating the politics.
01:07:20.520 But I mean it's hard for me to believe that someone like Kamala Harris doesn't actually know the numbers, right?
01:07:26.740 But it's not in her political interests or in what she conceives of as her political interests to act like she knows the numbers, right?
01:07:36.080 But anyway – I mean, the charitable view is there are very few people who are consciously lying or seeking to do things that they know are wrong.
01:07:59.960 I mean just conscious evil is a rare thing.
01:08:03.040 Yes.
01:08:03.300 Mostly useful idiots is what I talk about in my book.
01:08:06.480 Yeah, yeah.
01:08:06.940 I agree with you.
01:08:08.480 Sam, listen, you've been very generous with your time.
01:08:10.560 There's about 50 other questions we want to ask you.
01:08:12.920 We're not going to get a chance to.
01:08:14.560 So we'll pick out from our – we do a couple of questions for our supporters, so I'll make sure to pick out a COVID question because that came up a lot.
01:08:21.640 Okay.
01:08:22.580 Before we ask you the final question, if you don't mind a few more minutes.
01:08:26.440 Go for it.
01:08:27.120 I wanted – well, Francis and I both wanted actually to ask you.
01:08:30.060 We've talked about these very divisive things and people will have a different opinion about Trump and COVID and Brexit and all of this stuff, whatever you want to – but one thing that strikes me is you're one of the few people that we've met who is content, who's happy.
01:08:45.140 I can tell.
01:08:46.500 Right.
01:08:45.140 How does – if people are watching this and they would like to be happy in spite of all the terrible things that they think about, that they see happening on Twitter –
01:08:56.440 It strikes me – Francis uses your app every morning.
01:09:01.780 How do you – how does one in the modern world get closer to that point where whatever is happening out, whatever storms are out there, you are calm and peaceful inside?
01:09:16.640 Well, first, let me say I'm not – I'm certainly not always calm and peaceful, but the half-life –
01:09:21.340 Your voice is, though.
01:09:22.220 Your voice is great.
01:09:24.160 Yeah, yeah.
01:09:26.500 But the truth is – so the back story here is that, you know, in my early 20s, I got really into meditation.
01:09:34.120 And I mean, so first, psychedelics just showed me that it was possible to have a very different experience of the world.
01:09:41.440 And that there was a landscape of mind that could be explored based on just how you paid attention to experience, right?
01:09:48.300 So prior to psychedelics, I would have really just been kind of waiting for the third-person brain-based discussion to deliver, you know, all the right answers about, you know, what the human mind is.
01:09:59.980 And it was pretty well-established and still is, you know, thought to be well-established in Western science, you know, psychological science, cognitive science, and even Western philosophy that introspection was a dead end.
01:10:14.440 I mean, they tried to get it off the ground somewhere around 120 years ago, and it just – you know, you come up short almost immediately.
01:10:22.980 I mean, the truth is you close your eyes and you look inside, and you can't even tell that you have a brain, right, much less that the brain is doing all of these complex things that is actually delivering your experience of the world.
01:10:33.440 So – and, I mean, this is just one curious asymmetry of cultural wisdom.
01:10:42.800 In the East, you know, for all the failings of what didn't happen civilizationally in Eastern culture – and there's a lot to be said about that – they didn't lose this strand of wisdom,
01:10:58.360 which is there actually is something to be discovered in a first-person way about the nature of your own mind that is liberating, right?
01:11:06.200 Like, you suffer by a certain machinery, a certain dynamics, which could be either completely inscrutable to you or can become more and more transparent.
01:11:20.180 And in its transparency, less and less operative on a moment-by-moment basis.
01:11:27.240 And so, I mean, take any of the topics we've talked about.
01:11:30.240 So we've talked about me getting on Twitter and getting really spun up over, you know, somebody saying something about me or about something else that I care about.
01:11:38.580 But, you know, I've talked about, you know, anyone who thinks I have Trump derangement syndrome is going to look at me and say,
01:11:46.020 well, why are you talking to this guy about meditation?
01:11:48.420 This guy is so worked up over Trump – it's like, you know, it's a performative contradiction, right?
01:11:55.040 That's actually to misunderstand my, you know, emotional relationship to the phenomenon of Trump, right?
01:12:02.680 Like, I can say everything I say and think about Trump without spending much time feeling contracted around Trump.
01:12:14.880 That's not to say no time, but it's just much less time than I otherwise would spend if I didn't know how to, you know, quote, meditate, right?
01:12:24.340 Now, the word meditation can mean many different things to people, but what I think it should mean is just a simple recognition of what consciousness is like prior to entanglement with thought, right?
01:12:43.780 So the three of us are sitting here and we're having an experience of the world that's happening in, you know, five sensory channels, but there's this other mode, this other aspect to our condition, which is our thinking about what we directly experience through our senses, right?
01:13:05.800 And for most people, most of the time, the thoughts are incessant and uninspected, right?
01:13:13.640 And their arising is unnoticed, right?
01:13:16.020 So you're just – it just feels like you, right?
01:13:18.680 It's like so you'll say something that I disagree with and there's a voice in me which says, what's he talking about?
01:13:27.020 Or like, what does that mean?
01:13:27.880 Or like, what – there's just that voice that, you know, in nearly 100% of cases, just feels like a self, just feels like I, right?
01:13:43.520 That feels like it's me.
01:13:45.200 And then you're told something about the project of – well, again, you could have an experience, haphazardly or on psychedelics, where that identification gets interrupted, right?
01:13:58.740 Where all of a sudden, there's just – the mind is suddenly much more vast than that, right?
01:14:05.720 It doesn't feel like there's a subject in the head looking out through your eyes at a world that's not you and, you know, forever implicated by the glances of other people and the opinions of other people.
01:14:18.260 And it's just me in here, this sort of embattled ego trying to navigate a world that is fundamentally or at least potentially hostile to my interests, right?
01:14:28.680 Like, that subject-object dichotomy where it's just – like, I'm the man in the boat trying to steer it, you know, to some safe place and not go over the falls emotionally, that suddenly relaxes.
01:14:44.200 Again, it matters – now, I guess maybe I'm talking about psychedelics because it's more replicable for people.
01:14:52.200 Depending on what drug you've taken, that can relax in one or another way.
01:14:56.200 I mean, MDMA is really just the relaxing of the emotional tone of all that without the pyrotechnics of changing your perceptions.
01:15:06.560 If it's LSD or psilocybin, you can have a much more fundamental transformation of how you perceive the world.
01:15:13.800 But whatever is the case, it just so happens that our nervous systems are perturbable pharmacologically or just by happenstance, right?
01:15:22.340 This could happen to you just because it happens to you, right?
01:15:25.100 And people have those stories.
01:15:27.940 But there's vast testimony on this topic that you can experience your mind as a much vaster place than you tend to experience it as.
01:15:40.660 And then when you come back from one of those experiences, you might become interested in what is it that trims it down so reliably to this experience of confinement where you feel like it's just me here feeling uptight again, right?
01:15:58.500 You know, like what's that about?
01:16:00.260 You know, virtually 100% of that is just what it's like to be you identified with thought.
01:16:10.940 And then if you're identified with thought habitually, you are at the mercy of whatever you happen to think about, right?
01:16:17.960 I mean, the analogy I've drawn somewhere is: it's like the most boring person in the world comes through the front door of your house and takes you hostage, right?
01:16:34.080 He follows you from room to room, telling you the same stories over and over again.
01:16:39.300 You can't shut him up and you can't get away from him, and that's your life, right?
01:16:46.760 And you're thinking about the past, about what you could have said or should have said, or almost said; you're thinking about the future.
01:16:51.900 What's this, you know, how's this going to go?
01:16:54.340 And most of the futures you visualize never happen the way you've obsessed about them in the first place.
01:17:00.000 So like 99% of your self-talk is, at best, neutral with respect to its emotional tone.
01:17:09.900 I mean, I'm convinced some people have a fairly happy self-talk, and it's sort of hard to get through to them because they really don't think of themselves as ever suffering much psychologically, right?
01:17:24.080 They're very confident. They love the people in their lives. They get a lot of love back. They're not really in conflict. They don't have regrets and disappointments that they're trailing.
01:17:33.360 They're not worried about anything, and they just want to get up and do it again tomorrow because they're having so much fun. There are people like that.
01:17:41.000 Psychopaths.
01:17:41.480 Yeah, but most people are not like that, right? Most people are sensitive to this criticism of the default, which is: most of what you're saying to yourself isn't making you happy, and worse, it's predicated on a fundamental illusion of selfhood – of identification with this subset of your mental experience, which is, again, this discursive thought.
01:18:09.320 And when you break that identification, there's just much more space there. I mean, it is in thought – in identification with thought – that the past and the future exert their weight on the present, right?
01:18:28.380 So it's because we're processing everything we experience in the present through this scrim of discursive thought that we never actually make satisfying contact with the present, or we rarely do.
01:18:42.880 And in those moments where we do – those peak-experience moments – what has made it a peak is breaking the spell of thought for long enough to let in some of the breeze of, you know, awareness. I mean, it's always there, but we're blocking it continually.
01:19:03.320 We just haven't opened the door or the window.
01:19:07.640 And so meditation really is – again, there are many different techniques, and many different ways to describe it and frame it.
01:19:18.060 In the end, it's actually not even a practice you're doing.
01:19:22.280 In the end, it is something you're ceasing to do.
01:19:25.420 I mean, it's just non-distraction.
01:19:28.440 You're ceasing to be distracted by thought.
01:19:30.760 You're starting to notice thoughts themselves as appearances in consciousness, and noticed as appearances, they don't have force.
01:19:42.180 They certainly don't have emotional force.
01:19:44.280 It's not like you suddenly become an idiot and you can't figure out what you want to eat for dinner or, you know, how to find your car – I mean, you can think and you can plan.
01:19:52.800 But the moment you begin to suffer, your new default is to become interested in it – it's like a mindfulness alarm, you know, starts sounding.
01:20:08.900 And then you relax your identification with it – with just the physiology of suffering.
01:20:18.680 I mean, so to bring it back to what we were just talking about.
01:20:21.080 So, yeah, there's a moment where I notice something that I find, you know, either like personally annoying or, or the appropriate target of moral outrage.
01:20:34.780 I mean, I'm not envisioning psychological health as being synonymous with never being angry ever again or never being fearful ever again.
01:20:44.320 I mean, you know, negative emotions, from an enlightened point of view in my book, are still salience cues, right?
01:20:55.380 Like if I walk outside this house on the way to my car and someone physically attacks me on the sidewalk, like I don't want to be just a puddle of goo, you know, just beaming love at the person.
01:21:06.380 Now, that's not to say that's not a possible state of consciousness.
01:21:11.480 It certainly is.
01:21:12.380 And there are definitely scenarios where that, quote, works, right?
01:21:19.180 Like just being the guy who's, you know, beaming unconditional love as your only response to anything, right?
01:21:27.340 It's possible to get out of some physical altercation because it's just so surprising, right?
01:21:32.100 Someone comes to mug you and you're just, you know, you're, you're on MDMA and you just say, listen, man, I love you, right?
01:21:39.100 Like that could turn out well, but practically speaking, it strikes me as totally appropriate to feel these kinds of punctate, classically negative emotions.
01:21:54.740 The real question is: how long do they last, and what are they good for?
01:22:00.280 Like, when do you want to cease being angry so that you can actually function intelligently?
01:22:07.560 And in my book, it happens very, very soon after the arising of anger.
01:22:13.580 I mean, you don't want to stay angry, right?
01:22:15.600 But the initial jolt of anger, in many cases, is totally appropriate.
01:22:22.920 And it is the orienting response that you actually need to respond intelligently to, you know, whatever the emergency or quasi-emergency is.
01:22:33.280 But once you know how to meditate, you do notice that the half-life of negative emotions is really, really brief.
01:22:41.640 I mean, it's actually impossible to stay angry or embarrassed or, you know, whatever it is –
01:22:48.420 pick your negative emotion – for longer than, you know, some tens of seconds, unless you're taken in by thought again about why you should be angry or why you should be embarrassed.
01:23:01.720 And, um, yeah, your life becomes completely different when you can get off the ride.
01:23:08.500 You know, I mean, the difference between being angry for 10 seconds and being angry for 10 minutes, much less 10 hours or 10 days –
01:23:17.400 it's enormous, right?
01:23:19.180 I mean, you just think of how life-deranging those periods are where you're just helplessly motivated by anger, right?
01:23:27.760 I mean, 10 minutes is enough to completely fuck up your life, right?
01:23:31.880 I mean, to say the thing to your spouse that you can't unsay, to ring the bell you can't unring, you know?
01:23:38.020 And you just see how people's lives run off the rails because their minds are out of control.
01:23:45.720 And literally everything we see out there that is producing massive human suffering and, you know, existential risk even, you know, like literally everything beyond naturally occurring disasters, right?
01:24:01.600 Is a matter of people's minds being out of control, right?
01:24:05.720 I mean, we're just running terrible legacy code, you know, in a condition of increasingly destabilizing power amplified by technology.
01:24:21.780 I mean, it's getting increasingly easy for one person to screw it up for the rest of us.
01:24:27.720 I mean, the topic of existential risk is its own thing, which, you know, I'm focusing on more and more.
01:24:32.980 I think it's, you know, neglected to a scary degree.
01:24:38.180 I mean, there are just not enough people thinking about how we can shore up our civilization against existential risk, you know, man-made and otherwise.
01:24:47.620 But, I mean, so much of the daily evidence of conflict and needless human misery is just born of people being captured by their thoughts and not knowing that there's any alternative, right?
01:25:03.800 They're just talking to themselves, right?
01:25:05.780 And they're just claiming to know things that they don't know and being persuaded by those inner proclamations, right?
01:25:13.280 I mean, just like, what does it feel like to have a very strong opinion that is going to dictate everything you do next?
01:25:22.920 And how often is that just an automaticity that's totally uninspected – one that could be completely deflated with just a moment's pause, if you only knew how to take the other side in it?
01:25:44.320 Forget about meditation for a moment, just the ability to be skeptical about one's own opinions.
01:25:51.100 Like, talk about an untrained skill.
01:25:53.840 I mean, that's just something that almost nobody has, right?
01:25:56.960 Nobody even has it as a possible norm that you could endorse, even in the abstract, right?
01:26:02.940 Like, why would you want to be skeptical about your own opinions?
01:26:05.380 Like, this is just what I think.
01:26:07.080 That's why, like you, I'm always starting debates around the dinner table, because I'm always testing what I think against what other people think.
01:26:14.920 And because I'm aware that it's just thought, and it needs refining.
01:26:19.120 But anyway – Sam, first of all, thank you so much for coming on the show.
01:26:23.260 I'm happy to do it.
01:26:23.440 What a pleasure to see you and speak with you and get a little bit of your opinion and wisdom.
01:26:27.820 Can I just say, if people are listening to this, I use your app.
01:26:31.240 It's actually brilliant, and it has changed my life.
01:26:33.580 The ability to just sit and meditate for 10 minutes every morning is one of the best ways, if not the best way, to deal with intrusive and obsessional thoughts.
01:26:42.280 Every time he comes into the studio and he's all over the place, they go, have you meditated?
01:26:47.300 The answer's always no.
01:26:48.480 Yeah, so, thank you.
01:26:50.000 No, it's brilliant.
01:26:50.860 And the podcast, Making Sense with Sam Harris.
01:26:53.300 I'm a big fan of, like we talked about, during the BLM situation, you covered it, I thought, exactly the way that it needed to be covered.
01:27:00.880 And you have important conversations on there.
01:27:02.520 And we'll do a couple of questions for our locals.
01:27:05.380 I will ask you about COVID, because we promised people we'd do that.
01:27:08.260 But before we do, our final question is always the same.
01:27:10.880 What's the one thing we're not talking about that we really should be?
01:27:14.620 In this conversation, or just as a society?
01:27:17.120 As a society.
01:27:17.820 Well, we've touched on pieces of it.
01:27:25.100 I mean, I do think that, again, at the generic level, I mean, the problem is always failures of cooperation for us at this point.
01:27:40.880 I mean, like virtually anything that's going to just happen to us, you know, coughed up by the hand of nature, we can figure out how to solve at this point.
01:27:51.260 I mean, including an asteroid hurtling towards us.
01:27:54.520 I mean, at this point we have enough tech – and I'm not so sure we have enough people watching, but close – that we'd have tens of years, right?
01:28:05.080 And we'd have some decades to deal with that specific problem.
01:28:10.980 But so all of our problems on some level are of our own making.
01:28:15.560 I mean, if nothing else, it's the opportunity costs born of all of the needless bullshit we get entangled with, based on our own, you know, incapacity to cooperate.
01:28:27.440 So that's the first order of business – and the next is to figure out how we can have successful conversations on some level, right?
01:28:40.580 Because, again, all we have is a capacity to persuade one another so as to engineer, you know, forward-looking cooperation, or we have violence, right?
01:28:53.040 And, like, in the end, we just have to force people to do stuff if we can't persuade them to do stuff, or if they can't come to the epiphanies on their own.
01:29:02.700 And that's where politics comes in.
01:29:04.900 But more and more, I think we're in a situation where, because of technology, it's strangely getting harder and harder to get our cognitive horizons to fuse, right?
01:29:24.600 We've got 8 billion strangers more or less trying to figure out how to cooperate and persuasion is the only good tool.
01:29:35.440 Again, I mean, we're going to have to use force in certain circumstances, and, you know, I don't think pacifism is a plan there.
01:29:46.100 I mean, I think we actually do need to have our force game together for the situations where we need it, you know, individually and collectively – at the level of nation states as well.
01:29:58.920 But, yeah, I'm increasingly worried about our incapacity to converge on just a dispassionate, fact-based discussion on things that are just so easy to assess.
01:30:14.480 I mean, we've touched several topics here, but just: how many people of any identity get killed by cops every year in America?
01:30:23.000 And how does that relate to the levels of crime, you know, perpetrated by people of various identities?
01:30:30.740 And like what situations are cops actually getting into and what are their reasonable expectations of people in a society where there are 400 million guns?
01:30:38.720 You know, like why is it different when an American suddenly turns around and reaches into the cab of his pickup truck while getting arrested than when that happens in Japan, right, like where there are no guns, right?
01:30:49.820 But this is such an – this should be such an easy conversation to have, right?
01:30:55.400 I mean, it's hard to think of a simpler one where the facts are easier to get.
01:31:08.960 I mean, so much of the stuff is on videotape.
01:31:13.000 It's so easy to parse.
01:31:14.080 It's repeatable year after year.
01:31:19.440 It's in no one's interest that the problem be thought bigger than it is, or smaller than it is.
01:31:26.320 Like we want – we all want to solve this problem, right?
01:31:31.300 And we find it impossible to talk about.
01:31:33.980 See, your problem is, Sam, you meditate too much and you're too smart.
01:31:38.080 And you think other people are like you, but they're not, you know.
01:31:41.100 Well, no, but I know – like there's – none of this is foreign to me.
01:31:44.080 I know what it's like to get emotionally hijacked by something, right?
01:31:48.420 Whether it's something happening right in front of you or your ideas about it.
01:31:51.940 You care about the truth.
01:31:53.080 And so you will get emotionally hijacked and then you will go and look at the facts.
01:31:56.440 Yeah.
01:31:56.800 Most people just get emotionally hijacked.
01:31:58.660 Yeah.
01:31:59.160 Yeah.
01:31:59.460 I mean, that's certainly a software flaw in our operating system.
01:32:07.500 It's not a feature. And one way I've summarized this in the past for people is: your capacity to be offended is not something that anyone need or should respect in you.
01:32:23.980 It's certainly not an argument, but it's not even a basis for respect.
01:32:29.640 Like, table stakes for any ethical conversation is more than just your capacity to be offended, right?
01:32:38.800 And until you understand that, you just can't play the game we need to play in order to ensure an open-ended circumstance of cooperation.
01:32:53.680 What a great note to finish on.
01:32:55.220 Sam Harris, thank you so much for coming on.
01:32:57.120 We really recommend you check out Sam's podcast and his brilliant app.
01:33:01.380 We're going to ask him a couple of questions from you, for you.
01:33:04.080 But for now, Sam, thank you so much for joining us.
01:33:06.800 Really great to meet you and great to chat.
01:33:08.820 We'll see you very soon with another brilliant episode.
01:33:10.680 I mean, it won't be quite like this, but it will be a brilliant episode as well or also.
01:33:14.860 All of them go out at 7 p.m. UK time.
01:33:16.600 And for those of you who like your TRIGGERnometry on the go, it's also available as a podcast.
01:33:21.260 Take care and see you soon, guys.
01:33:22.880 Some of our listeners would be disappointed in how you handled yourself during COVID.
01:33:30.800 How do you reflect on the way you thought and spoke about COVID?
01:33:36.000 Are you happy with it?
01:33:37.220 Would you have done it differently?
01:33:38.360 Well...
01:33:38.700 We'll see you soon.
01:33:46.620 Have a great day.